The wrist-worn gadget is described as a health and wellness product in internal documents reviewed by Bloomberg. It's a collaboration between Lab126, the hardware development group behind Amazon's Fire phone and Echo smart speaker, and the Alexa voice software team.
Designed to work with a smartphone app, the device has microphones paired with software that can discern the wearer's emotional state from the sound of his or her voice, according to the documents and a person familiar with the program. Eventually, the technology could advise the wearer on how to interact more effectively with others, the documents show.
It's unclear how far along the project is, or if it will ever become a commercial device. Amazon gives teams wide latitude to experiment with products, some of which will never come to market. Work on the project, code-named Dylan, was ongoing recently, according to the documents and the person, who requested anonymity to discuss an internal matter. A beta testing program is underway, this person said, though it's unclear whether the trial includes prototype hardware, the emotion-detecting software or both.
Amazon declined to comment.
The notion of building machines that can understand human emotions has long been a staple of science fiction, from the stories of Isaac Asimov to Star Trek's android Data. Amid advances in machine learning and voice and image recognition, the concept has recently marched toward reality. Companies including Microsoft Corp., Alphabet Inc.'s Google and IBM Corp. are developing technologies designed to derive emotional states from images, audio data and other inputs. Amazon has publicly discussed its desire to build a more lifelike voice assistant.