“Nothing like a herd of nerds with a beer in one hand and a smart phone in the other.” – @charliecurve
Communication between humans is both complex and primal. While we tend to focus on verbal communication as the primary means of contact, blind and deaf people will attest that there are other ways to communicate. Disabilities aside, we don’t communicate with our words alone, and our ability to interpret those other channels varies with knowledge. In the tech-heavy world we now reside in, physical communication has changed. Body language has become something very different from what it used to be. Whether we can call this adaptation or evolution remains to be determined, but in the grander sense of things, modern portable tech has had a very noticeable effect on body language.
Psychology Today has a pretty comprehensive guide to interpreting body language, covering both how people read yours and how to read other people’s. Regardless, body language is more open to interpretation than speech. In speech, we have indicators that are easily interpreted: a change in tone or cadence can signal sarcasm or annoyance. With body language, these indicators are not only harder to interpret accurately, they can complement speech both positively and negatively. And while speech is generally under conscious control, body language is something few people consciously control. This is what creates a psychological market for interpretation.
Now, in the 21st century, reading someone’s body language has become more difficult, because their attention is no longer focused on you. In the days when you spoke to someone without a cell phone in their hands, their attention was on you. Their body language – a shrug, a sideways glance – was directed at you, and you knew you could interpret it in the context of the conversation. With a digital distraction, however, their brain is splitting its time, so body language becomes something you risk being terribly wrong about: they could be shrugging at a tweet they just read, not at you. This complication in human relations will have less-than-desirable effects down the road.
Body language, and the interpretation of it, starts with the eyes. There are many ways to determine someone’s actual feelings about an interaction from their eyes. A variation in focus, depending on the subject matter, can indicate a lie, distrust, and so on. Knowing when and how to use this information can be valuable in social and business interactions – it might be why I’ve become so jaded in business – yet it is at risk. Enter devices like Google Glass. Unlike GUNNARS, Google Glass will force users to vary their focal points. Reading someone who is wearing this device will become nearly impossible.
Yet, as we tend to do, we will adapt. So will the psychology around interpreting body language. As Google Glass slowly works its way into the tech world, we’ll figure out how someone’s interactions change when they have a digital device on their face. So there is hope. While traditional body language is being muddled, digital device use in social interactions is creating a new subset of body language, one that is slowly replacing the traditional vocabulary. The indicators will change, but the subconscious responses that prompt them will not.
In conversation, when someone is displeased with what you are saying, they might roll their eyes or tilt their head, both indicators that they don’t believe a word you are saying. This is helpful to know so you can alter your communication or back it up with more evidence. Unless you are lying. Regardless, with a digital device in play their attention is not entirely on you, so you have to look for other indicators: a glance up from the device where there would previously have been an eye roll, or more verbalization, since their body is responding to the device rather than the conversation.
With Google Glass though, assuming it does catch on, we lose the most important thing when it comes to body language – the eyes. Says Joel Hladecek of the Interactvist in one of his messages from the future: “In the same way that the introduction of cellphone headsets made a previous generation of users on the street sound like that crazy guy who pees on himself as he rants to no one, Google Glass pushed its users past that, occupying all their attention, their body in space be damned – mentally disconnecting them from their physical reality. With Glass, not even their eyes were trustworthy.”
The reason Google Glass and future devices like it will have such a huge impact on body language is that they take away one of the most easily interpreted and most often communicated non-verbal phrases: “go away”.
“Imagine Clint Eastwood trying to stare someone down *while* wearing Google Glass. Death stare revoked.” – @hanelly
Outside of homeless panhandlers, most people quickly interpret a downward glance or lack of eye contact as “go away”. Digital devices have actually made this easier to convey and receive, as people tend not to look up from their phones, or to look down at their phones, when they do not want to continue an interaction. This is one of those subset indicators mentioned earlier. How can you tell if someone wants you to piss off if they are staring right at you but not responding? Needless to say, Google Glass is going to cause a lot of frustration in social interactions at first. The wearer is going to get frustrated with others mistaking the wearer’s focus, while others are going to get frustrated with the wearer not paying attention to the physical world.
Current studies on the effect of portable technology on body language are few and far between. Future studies on the effects of wearable technology such as Google Glass on body language might not even be a consideration. We’re still at the early-adoption point of this technology and how it affects our social and psychological interactions; we take it for granted.
It’s not just Google Glass that will be altering our body language. The return of calculator watches – the Apple iWatch – will occupy the arms as well as the eyes, further hampering the expression and interpretation of body language. Checking your watch might no longer signal impatience, because instead of a mindless check of the time, there was a notification – a new subset. The question now is: will we be able to adapt to this new functionality in human interactions?
Sadly, I don’t think we’re going to see any progress there. We are already heading down a path of terrible human interaction. Society is breaking down; we are not communicating as we should. We’re afraid to speak our minds for fear of being labeled, afraid of confrontation and discussion. Wearable tech will make this easier for us as a whole, because we’ll be able to pull back further into our little shells. None of this is a good thing. It’s going to create a society of selfish, inward-looking people with a loose grasp on the physical world. Is this an argument against wearable technology? Hell no.
Regardless of the psychological, sociological, and physiological impacts, wearable technology is the next step in our digital evolution. While we can’t stop it or slow it down, we need to make sure that psychological science is at least aware of the changes taking place. Companies developing these technologies would be wise to conduct their own sociological studies so that, when the time comes, the findings are available and the public can be made aware. Perhaps they already are; perhaps they never will be. In the meantime, I’ll be over in the corner, incessantly checking my pager.
h/t to @hanelly for prompting the idea.