Cars can hear us, but they cannot listen. They can understand the words, but they do not “get” the music, the emotions – how humans feel, what they mean and what they really care about. They miss out on humans’ energy, they are clueless about their desperation, blind to humans’ most basic emotions and cravings. So, how will humans interact with the car of the future?
This should be a major concern, since on average people spend 4.3 whole years in a car, mostly alone and much of it stuck in traffic. Would anyone – in their right mind – ever spend more than four years with someone who had absolutely zero Emotional Intelligence (also known as EQ)?
Most would say “of course not” (usually accompanied by an exclamation mark, sometimes two), and yet when it comes to people’s most emotional means of transportation, they choose to ignore the fact that their passion for their vehicles is a strictly one-sided affair.
Technology: Interaction between human and the car
Like the always-on features gaining popularity in some cutting-edge personal devices, cars are perfectly capable of receiving an unbelievable amount of conversation. Many think this only concerns the simple, short voice commands people use to activate certain features in their cars, but the fact is that most of the on-board conversation has nothing to do with simple voice commands.
In an extensive study released by the UN World Health Organization in 2011, observers witnessed 11 percent of Americans using cellphones in their cars at any given time. The numbers have only grown since; the emotions humans convey in those conversations, however, remain the same. The same is true of the conversations people hold with their passengers and occasionally with other fellow motorists (more so if they happen to be driving in a Mediterranean country).
Cars are simply surrounded by the human voice – voice full of passion, anger, happiness, loneliness and desire – and at the same time they are as handicapped as the virtual assistants on smartphones. Many experts say that first we need those personal assistants to understand complex words and syntax. That is just not true. We first need to make them listen!
The voice is there – in on-board phone chats and peer conversations. Only once cars (and virtual personal assistants) understand our existing emotions can they start to understand us and offer – in return – a meaningful conversation. A conversation that goes beyond simple commands.
How would you talk to a car that truly understands you? Share your opinions in the comment section.
Please note that this article expresses the opinions of the author and does not reflect the views of Move Forward.