Amazon.com is developing a voice-activated wearable device that can recognise human emotions.
The wrist-worn gadget is described as a health and wellness product.
Eventually the technology could be able to advise the wearer on how to interact more effectively with others.
The project is a collaboration between Lab126, the hardware development group behind Amazon’s Fire phone and Echo smart speaker, and the Alexa voice software team.
It’s unclear how far along the project is, or whether it will ever become a commercial device. Amazon gives teams wide latitude to experiment with products, some of which will never come to market.
Amazon declined to comment.
Amid advances in machine learning and voice and image recognition, the concept of building machines that can understand human emotions has recently marched toward reality.
Companies such as Microsoft, Alphabet’s Google and IBM are developing technologies designed to derive emotional states from images, audio data and other inputs. Amazon has publicly discussed its desire to build a more lifelike voice assistant.
The technology could help the company gain insight for potential health products, or be used to better target advertising and product recommendations. The concept will probably add fuel to the debate about the amount and type of personal data scooped up by technology giants, which already collect reams of information about their customers.
Earlier this year, Bloomberg reported that Amazon has a team listening to and annotating audio clips captured by the company’s Echo line of voice-activated speakers.
A US patent, filed in 2017, describes a system in which voice software uses analysis of vocal patterns to determine how a user is feeling, discerning among “joy, anger, sorrow, sadness, fear, disgust, boredom, stress, or other emotional states”.
The patent, made public last year, suggests Amazon could use knowledge of a user’s emotions to recommend products or otherwise tailor responses.