As Emotion Recognition, one of our 100 Things to Watch in 2013, gets more sophisticated, technology is moving beyond reading facial expressions to other advanced metrics for understanding emotion. Heart rate trackers, pressure sensors and perspiration monitors are among the ways to intuit how people are feeling; the next step is then to create hyper-personalized experiences or messages based on those emotions. “What if [our devices] could pick up the tics and tells of our brewing anger—or, for that matter, any other emotion—and respond accordingly?” asks The New York Times, spotlighting how this technology is developing.
Stanford engineers have equipped an Xbox controller with sensors that monitor a player’s autonomic nervous system. This feedback could change gameplay in real time, offering an experience customized to the user’s emotional state. Another example is AutoEmotive, a concept car designed by the MIT Media Lab, which uses pressure and perspiration sensors in door handles and steering wheels to monitor drivers’ stress levels. This information might be used to tailor the messaging the driver receives or the navigation routes offered, or even to warn other drivers of a particularly angry motorist nearby.
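The logic behind such systems can be imagined as a simple loop: normalize the sensor readings, combine them into a stress estimate, and adapt the experience accordingly. The sketch below is purely illustrative; the function names, weights, and thresholds are assumptions, not details of the Stanford or MIT Media Lab projects.

```python
# Hypothetical sketch of emotion-driven adaptation: combine biometric
# readings into a 0-1 stress score, then adjust game difficulty.
# All weights and thresholds here are illustrative assumptions.

def stress_score(heart_rate_bpm: float, grip_pressure: float,
                 skin_conductance: float) -> float:
    """Blend normalized sensor readings into a 0-1 stress estimate."""
    # Map a 60-120 bpm resting-to-stressed range onto 0-1.
    hr = min(max((heart_rate_bpm - 60) / 60, 0.0), 1.0)
    grip = min(max(grip_pressure, 0.0), 1.0)       # assumed pre-normalized
    sweat = min(max(skin_conductance, 0.0), 1.0)   # assumed pre-normalized
    return 0.5 * hr + 0.25 * grip + 0.25 * sweat

def adjust_difficulty(current_level: int, score: float) -> int:
    """Ease off when the player seems stressed; ramp up when calm."""
    if score > 0.7:
        return max(current_level - 1, 1)
    if score < 0.3:
        return current_level + 1
    return current_level
```

A car could reuse the same pattern with different outputs, swapping difficulty changes for calmer route suggestions or softened notifications.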
In the more immediate future, we’ll see greater use of emotion recognition in retail and finance. As we explained last year, retailers such as Russian cosmetics chain Ulybka Radugi are exploring systems that integrate emotion recognition into the checkout process to provide more personalized experiences and offers. In another example, New Zealand’s BNZ bank created a website with face-scanning capabilities that lets visitors gauge their emotional reactions to financial topics such as retirement or cash flow, helping them understand how they feel about money.