UPDATED 16:00 EDT / JULY 26 2017

EMERGING TECH

Artificial and emotional intelligence meet in new autonomous driving tech

Face-reading artificial intelligence technology has shown up in identification and security use cases. What applications would result if such an AI could also tell when people are psyched, depressed or confused?

“There are only six or seven universal emotions, plus neutral,” said Modar Alaoui (pictured), founder and chief executive officer at Eyeris Technologies Inc., developers of emotion recognition software.

These emotions are akin to primary colors; all variations of both colors and emotions derive from a basic set, Alaoui stated at this year’s When IoT Met AI: The Intelligence of Things conference in San Jose, California. They are hardwired into humans’ brains and show on their faces through microexpressions, he told Jeff Frick (@JeffFrick), host of theCUBE, SiliconANGLE Media’s mobile livestreaming studio. (* Disclosure below.)

Microexpressions often reveal what people can’t or won’t say outright. “They can generally give up a lot of information as to whether a person has suppressed a certain emotion or not. Or whether they are thinking about something negatively before they can respond positively, etc.,” Alaoui said.

They can also help autonomous vehicles drive safely and cater to occupants’ likes and dislikes, according to Alaoui.

Backseat driver

Eyeris’ software fuses its facial analysis with data from other in-vehicle sensors to improve people’s experiences, Alaoui explained. It can tell, for instance, if a driver is growing dangerously tired or careless.
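To make the idea concrete, here is a minimal, hypothetical sketch of how per-frame face-analysis output might be fused with other cabin signals to flag a drowsy or distracted driver. It does not represent Eyeris’ actual software or API; the field names, thresholds and alert logic are illustrative assumptions only.

```python
# Hypothetical sketch (not Eyeris' API): fuse per-frame face-analysis output
# with a short rolling window of readings to flag fatigue or distraction.
from collections import deque
from dataclasses import dataclass
from typing import Optional


@dataclass
class FrameReading:
    emotion: str        # one of a small set of universal emotions, plus "neutral"
    eye_closure: float  # 0.0 (eyes open) to 1.0 (eyes closed); assumed sensor output
    gaze_on_road: bool  # assumed head-pose / gaze estimate


class DriverMonitor:
    """Keeps a short history of readings and raises an alert when sustained
    eye closure or off-road gaze suggests fatigue or distraction."""

    def __init__(self, window: int = 30, closure_threshold: float = 0.6):
        self.readings = deque(maxlen=window)
        self.closure_threshold = closure_threshold

    def update(self, reading: FrameReading) -> Optional[str]:
        self.readings.append(reading)
        if len(self.readings) < self.readings.maxlen:
            return None  # not enough history yet to judge
        avg_closure = sum(r.eye_closure for r in self.readings) / len(self.readings)
        off_road = sum(not r.gaze_on_road for r in self.readings) / len(self.readings)
        if avg_closure > self.closure_threshold:
            return "fatigue_alert"
        if off_road > 0.5:
            return "distraction_alert"
        return None


# Example: feed a few simulated frames of heavy eye closure into the monitor.
monitor = DriverMonitor(window=5)
alert = None
for _ in range(5):
    alert = monitor.update(
        FrameReading(emotion="neutral", eye_closure=0.8, gaze_on_road=True)
    )
print(alert)  # "fatigue_alert"
```

The rolling window is the key design choice in this sketch: a single frame of closed eyes is just a blink, while sustained closure across many frames is a meaningful fatigue signal.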

Fully autonomous, driverless vehicles are still at least 10 years off, Alaoui predicted. But when they arrive, the focus of the software and services in the car will shift to the occupants. “All of these services will revolve around who is inside the vehicle by age, gender, emotion, activity, etc.,” he said.

Eyeris software will be hitting the road by early 2018. “We made some announcements earlier this year at CES [Consumer Electronics Show] with Toyota and Honda,” he concluded.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of When IoT Met AI: The Intelligence of Things. (* Disclosure: TheCUBE is a paid media partner for When IoT Met AI. Neither Western Digital Corp., the event sponsor, nor other sponsors have editorial influence on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
