UPDATED 12:21 EDT / NOVEMBER 28 2013

Machines + human emotions: The path to AI

Imagine a world where machines can distinguish your emotions and react accordingly. For example, if you are stressed, your smart bathroom automatically runs a warm, relaxing bath filled with your favorite scents to relieve the stress. Or if your kitchen senses that you are a little sluggish when you wake up in the morning, it makes you a strong pot of coffee.

Those are just some of the ways emotion-sensitive machines could make our lives better, or at least more convenient. As the world becomes more connected, machines are communicating with each other, sometimes on our behalf, at an increasing rate. Just as humans have learned to communicate most efficiently through emotions (think facial expressions and tone of voice), machines are becoming more efficient in their independence and inter-machine data sharing. Because emotions are often the most telling form of communication, these abbreviated but concise messages may be the best way for machines to get smarter.

In the human world, researchers are applying various methods to assess people's emotional states and using that information to improve products, especially connected ones. Here are a few initiatives that could change the way machines interact with each other, and with humans.

Posture


They say you can tell a lot about a person just by looking at how he or she sits. If a person is leaning forward, that indicates the person is interested in what is being said. If the person slouches, that could indicate boredom or disinterest. But that’s not always the case, as some people lean forward when they are sitting just so they can get comfortable, or to prevent themselves from falling asleep.

MIT researchers Rosalind W. Picard and Selene Mota used a commercially available set of 42×48 pressure sensors, placed on the back and seat of a chair, to build a new algorithm that could not only recognize nine static postures but also analyze how those postures change over time. Though the algorithm does not read the person's emotions directly, it analyzes the collected data so the machine can respond with more meaningful actions.
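
As a rough illustration of the idea (not the researchers' actual algorithm), here is a minimal Python sketch that flattens the seat and back pressure maps into a feature vector and matches it against stored posture templates. The posture labels, the template-matching approach and the assumption that templates are built from labeled examples are all illustrative:

```python
import numpy as np

POSTURES = ["upright", "leaning forward", "slouching"]  # a subset of the nine

def features(seat, back):
    """Flatten and normalize the seat and back pressure maps (42x48 each)."""
    v = np.concatenate([seat.ravel(), back.ravel()]).astype(float)
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v

def classify(sample, templates):
    """Return the posture whose stored template vector is nearest to the sample."""
    return min(templates, key=lambda p: np.linalg.norm(sample - templates[p]))

def posture_stream(frames, templates):
    """Label each (seat, back) frame pair, yielding the posture over time."""
    for seat, back in frames:
        yield classify(features(seat, back), templates)
```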

Facial recognition


This may be the most-used parameter when assessing a person’s emotional disposition. A slight eyebrow raise could indicate that a person is disagreeing with what he or she is hearing or seeing, or a slight widening of the eyes could indicate that the person is surprised.

Researchers are focusing on the areas around the eyes and the mouth to recognize patterns, as slight changes there can indicate a shift in a person's emotion. But the problem with facial recognition is that the subject needs to be still for it to be accurate. To address this problem and extend their studies to moving subjects, some researchers are using IBM's Blue Eyes camera. The camera tracks the eyes using the "red-eye effect" principle: the pupils are very sensitive to light, constricting or dilating depending on lighting conditions, and emotions also affect this process.

The Blue Eyes camera is surrounded by a ring of infrared LEDs that blinks on and off. At the same time, two adjacent strips of LEDs blink off and on; their purpose is to illuminate the face without causing the red-eye effect. The images taken with the red-eye effect and those without it are then compared to get an initial sense of the changes. This technology is a good first step toward real-time facial recognition.
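
A toy sketch of that differential-imaging step, assuming alternating bright-pupil ("red-eye") and dark-pupil frames arrive as NumPy arrays; the threshold value is a made-up placeholder:

```python
import numpy as np

def find_pupils(bright_frame, dark_frame, threshold=50.0):
    """Subtract the dark-pupil frame from the bright-pupil ("red-eye")
    frame; the pupils are the only regions that glow in the bright
    frame, so they dominate the difference image."""
    diff = bright_frame.astype(float) - dark_frame.astype(float)
    return diff > threshold  # boolean mask of likely pupil pixels

def pupil_centroid(mask):
    """Centroid of the pupil mask, or None if nothing was detected."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```

Running find_pupils on each alternating frame pair and tracking the centroid over time gives a crude real-time eye tracker.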

Physiological changes


These days, fitness-tracking apps on smart devices are quite popular. But researchers are now seeing the device itself as something that could help tie emotions to machines. These devices can now track heart rate, temperature, oxygen saturation and other parameters. The data gathered can be used to assess when a person experiences these changes and from that, allow the app or bracelet to make decisions for the wearer.

For example, if you have reached your goal but your heart is beating abnormally fast, the device can suggest that you take a rest, or even call a paramedic. Or, if you are not very in tune with your emotions, a device can tell you what you are feeling, such as falling in love with someone.

Picard and her team have developed the first physiology-based system for recognizing an individual's emotions over time. Using four physiological signals, the system learns an individual's patterns over the course of the testing period and has achieved 81 percent accuracy in classifying which of eight emotional states (anger, joy, sadness, hatred, platonic love, romantic love, reverence and neutral) the person is manifesting.

In order to conduct the tests, the subject is placed in a quiet space and asked to focus on each of the eight emotions, one at a time. The researchers then take note of the physiological changes happening as the subject focuses on each of the emotions.
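
The published system's features and classifier are considerably more sophisticated than anything shown here, but a nearest-centroid toy version in Python conveys the shape of the pipeline: extract simple statistics from each of the four signals, average them per emotion during training, and classify new sessions by distance. The function names and features below are illustrative assumptions:

```python
import numpy as np

EMOTIONS = ["anger", "joy", "sadness", "hatred",
            "platonic love", "romantic love", "reverence", "neutral"]

def session_features(signals):
    """Mean and standard deviation of each of the four physiological
    signals; the real system used a much richer feature set."""
    return np.concatenate([[s.mean(), s.std()] for s in signals])

def train_centroids(labeled_sessions):
    """labeled_sessions: iterable of (emotion, [four signal arrays]) pairs."""
    grouped = {}
    for emotion, signals in labeled_sessions:
        grouped.setdefault(emotion, []).append(session_features(signals))
    return {e: np.mean(fs, axis=0) for e, fs in grouped.items()}

def recognize(signals, centroids):
    """Classify a new session as the emotion with the nearest centroid."""
    x = session_features(signals)
    return min(centroids, key=lambda e: np.linalg.norm(x - centroids[e]))
```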

There are still a lot of kinks to be ironed out, since many emotions can't be detected by simply monitoring physiological changes. In those cases, researchers would have to monitor subjects at the biochemical level.

The future of emotion-sensing machines


The following are just some examples of how today’s technologies can be improved with emotion sensors:


Pressure-sensitive keyboard

If you work in an office, at one point or another you may have found yourself ready to toss the keyboard and everything that goes with it. The keyboard, acting as the interface between humans and computers, takes the brunt of the abuse when PC-related frustration ensues. But keyboards can now be equipped with sensors that measure the amount of pressure you apply to the keys. If the keyboard senses that you are stressed or angry, your computer could automatically play your favorite soothing music.
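
A hypothetical Python sketch of that trigger; the pressure scale, threshold and play_soothing_music helper are all invented for illustration, since no shipping keyboard exposes such an API:

```python
from collections import deque

WINDOW = 50             # keystrokes to average over
STRESS_THRESHOLD = 0.8  # normalized pressure, 0.0 (light) to 1.0 (hard)

recent = deque(maxlen=WINDOW)

def play_soothing_music():
    print("Cueing up your favorite calming playlist...")  # stand-in action

def on_keypress(pressure):
    """Called with the normalized pressure of each keystroke."""
    recent.append(pressure)
    if len(recent) == WINDOW and sum(recent) / WINDOW > STRESS_THRESHOLD:
        play_soothing_music()
        recent.clear()  # avoid re-triggering on every subsequent key
```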

Right now, Microsoft's Surface 2 tablets can be equipped with pressure-sensitive keyboards. Though such a keyboard can tell when you are simply resting your fingers on the keys or have accidentally brushed one, it doesn't really measure how much pressure you are applying to each key. That means it is far from being able to tell whether something is bothering you just from the way you type, but it is something we might see in the near future.


A voice-analyzing assistant

Imagine if Siri could read your emotions just by noting your intonation or how loudly you are speaking, and then recommend things to do depending on your mood. For example, if you sound sad, Siri could set up a meeting with your closest friend to cheer you up, or have your favorite fast food delivered to you.
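
As a rough sketch of what such mood detection might look like, the Python below estimates loudness from RMS energy and intonation from pitch variation via autocorrelation; the mood rules and thresholds are entirely invented:

```python
import numpy as np

def rms(frame):
    """Root-mean-square energy of an audio frame (samples in [-1, 1])."""
    return float(np.sqrt(np.mean(frame.astype(float) ** 2)))

def pitch_hz(frame, sample_rate=16000):
    """Crude pitch estimate: the autocorrelation peak within the typical
    75-400 Hz range of the human voice."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = sample_rate // 400, sample_rate // 75
    return sample_rate / (lo + int(np.argmax(corr[lo:hi])))

def guess_mood(frames, sample_rate=16000):
    loudness = np.mean([rms(f) for f in frames])
    pitch_spread = np.std([pitch_hz(f, sample_rate) for f in frames])
    if loudness > 0.2:                         # unusually loud: agitated
        return "angry"
    if loudness < 0.02 and pitch_spread < 10:  # quiet and monotone
        return "sad"
    return "neutral"
```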


A smarter fitness tracker

Most fitness trackers these days are smart enough to chart your progress and motivate you by letting you set goals. But what if you no longer feel like doing your routine? A better tracker would sense that your motivation is waning before you stop exercising altogether, and suggest activities to keep you going.
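
One naive way to detect waning motivation, sketched in Python with an invented heuristic: flag a steady week-over-week decline in workout counts rather than a single off week:

```python
def motivation_waning(weekly_workouts, lookback=4):
    """True if workout counts fell every week for the last `lookback`
    weeks -- a steady decline rather than a single off week."""
    recent = weekly_workouts[-lookback:]
    return len(recent) == lookback and all(
        a > b for a, b in zip(recent, recent[1:]))

if motivation_waning([5, 5, 4, 3, 2, 1]):
    print("Noticed you're slowing down -- how about a shorter trail run?")
```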

Source: Toward Machines with Emotional Intelligence

