If I ever get involved in the development of any artificial intelligence (AI) that could recognize human emotions, I will name it “Shakespeare.” That is a tall order even for the most sophisticated AI today. Shakespeare’s body of work spanned the range of human emotions – from blender-sized bursts of confusion and glee to rage and ecstasy on the scale of the Large Hadron Collider. I think any AI that claims to detect human emotions should aim to approximate the range of emotions the Bard explored.
Sensors of all kinds have been around for decades, and they have grown increasingly able to detect all sorts of signals from our bodies and brains, especially to diagnose disease or to measure our fitness progress. We don electrode caps and wires, and go through scanning machines, so that whatever is happening inside us can be projected outside us and inform the world and ourselves of what’s going on. These are generally invasive methods.
One of the most elusive and difficult things for computers to read without being invasive – if not the most – is human emotion. We know this because companies have been trying for a while now, and so far they have only been able to make use of the same sensors and devices largely used by the health industry that I mentioned above.
Microsoft has been releasing application programming interfaces (APIs) that independent developers can use to build their own apps. For recognizing emotions, Microsoft launched its Emotion API last year, enabling apps to recognize anger, contempt, disgust, fear, happiness, neutrality, sadness, or surprise. It does this based on facial expressions. This relies on visual recognition – akin to how your phone can recognize faces in your stacks of photos. But what if some technology could really detect the inner you?
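As a rough illustration of how an app might consume such an API, here is a sketch in Python. The shape of the score dictionary is an assumption modeled on the eight emotions listed above, not a guaranteed match for Microsoft’s actual response payload:

```python
# Hypothetical per-face confidence scores, modeled on the eight emotions
# the Emotion API reports. Field names and values are assumptions made
# for illustration only.
scores = {
    "anger": 0.01, "contempt": 0.00, "disgust": 0.00, "fear": 0.02,
    "happiness": 0.91, "neutral": 0.04, "sadness": 0.01, "surprise": 0.01,
}

def dominant_emotion(scores):
    """Pick the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(scores))  # happiness
```

An app would typically send a photo to the service, receive a set of scores like these for each detected face, and act on the dominant one.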
Enter a new invention by a group of MIT scientists who recently figured out a way to bounce radio signals off of you so that they come back “carrying” your breathing and heart rate – which are known indicators of emotion. Radio frequency (RF) waves are emitted toward you, and they bounce back subtly changed by the motion of your breathing and heartbeat. It is the very small variations in these rates over time that can differentiate one emotion from another. These variations are sorted out by computer code (algorithms) the group developed. In their small test group, the system decoded emotions from RF with 87% accuracy.
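The idea of mapping physiological rates to emotion labels can be sketched with a toy classifier. To be clear, the numbers and the nearest-centroid method below are invented for illustration; the MIT system extracts far richer per-heartbeat timing features and uses its own trained classifier:

```python
import math

# Toy training data: (mean heart rate in bpm, breaths per minute) -> emotion.
# These figures are made up for illustration and are not the MIT team's data.
TRAINING = {
    "sadness":    [(62, 11), (64, 12), (61, 10)],
    "happiness":  [(74, 14), (76, 15), (73, 14)],
    "anger":      [(88, 18), (90, 19), (86, 17)],
    "excitement": [(95, 20), (97, 21), (93, 20)],
}

def centroid(points):
    """Average of a list of (heart rate, breathing rate) pairs."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(hr, br):
    """Return the emotion whose centroid is nearest to (hr, br)."""
    return min(CENTROIDS, key=lambda lab: math.dist((hr, br), CENTROIDS[lab]))

print(classify(63, 11))  # sadness
```

The real difficulty, of course, is the step this sketch skips entirely: recovering clean heart and breathing rates from reflected radio waves in the first place.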
The new technology is called EQ Radio and it can recognize 4 basic emotions: happiness, excitement, anger, and sadness. It does away with sensors that the user has to wear, which is a big deal because it readily lends itself to widespread use by anything you can connect online through wi-fi: stores, online sites, games, films, and the internet of things. In other words, you would not necessarily know that your emotions are being read.
Imagine how much more emotionally powerful advertising could be if advertisers knew which buttons to push because they could read your feelings. For the arts, you could be watching a stage play whose story’s trajectory is steered by the predominant emotion of the audience. A film you are watching online could likewise be shaped by your emotions, so that no two viewings are ever told the same way. Or imagine if your house, through your wi-fi, could sense your heartbreak: as you enter the door, your connected lighting dims and Charlie Chaplin’s “Smile” (“Smile, though your heart is aching”) plays softly to console you.
For online learning, though, EQ Radio has to go beyond recognizing only these 4 emotions to include the subtler but equally deep emotions that have to do with learning. You can only genuinely learn something if you feel emotionally connected to it. Studies point to specific emotions that enable us to learn. These are called knowledge emotions: surprise, interest, confusion, and awe. Content would have to evoke these emotions to enable genuine learning. I don’t know what these look like in terms of breathing and heart rates, or whether those rates are enough to detect the knowledge emotions at all.
It would probably be more reliable if the signals came from which part of the brain is activated or muted, but so far we know that the brain’s electro-chemical signals do not leak out to be picked up by radio receivers. Otherwise, you would hear the singing in your head interfering with the signal of your radio station. Although I think it would be really cool if that could happen.
But like any technology, it could be used for dark purposes. It enables devices to read what you feel, and once your feelings are out there, they too can be hacked. Imagine if this technology were used in an app for mental health. As much as it could help you, in the wrong hands it could also mess with your feelings. Merchants could potentially hijack your feelings and use them to feed your addiction to that shoe, bag, cookie, or whatever gives you that rewarding feeling.
But a predominantly dark scenario for this technology assumes that we will turn into machines as machines try to creep into our humanness. I would like to think that the story of our complexity as thinking, feeling humans has neither ceased nor ever will. Even if AI becomes Shakespeare Plus, it may never catch up and fully figure out the real thing. – Rappler.com