According to a Bloomberg report, Amazon is in the process of developing a device that will be able to detect human emotions.
Yep, that’s where we’re at now.
The project, named Dylan, will work through a wrist-mounted device paired with an app. Details are hazy and speculation is rife, but practical uses for the device may reach beyond health awareness (“you seem sad, Steve, here are some mindfulness exercises to help”) and into the realms of manipulation.
Theoretically, the device could be used by salespeople to detect the moods of customers, enabling them to ‘play’ to those emotions. Sales training could then focus on using the tool to target certain emotions, and therefore customers in certain frames of mind. This kind of behaviour could lead people to buy things they don’t want, for the wrong reasons: much like the phenomenon of sad people shopping online to make themselves feel better, but with even less freedom of thought.
This, of course, echoes through to other forms of human relationships – romantic entanglements, business discussions, friendships and anything else that holds even a semblance of a ‘transactional’ element. For example, a person pursuing a romantic partner could use the device to gauge their interest and emotional response to comments and questions. On detecting a change, they could adapt their approach – something we do automatically as humans – but without authenticity, and with bad decisions a likely result.
Granted, the device is in the very early stages of development, but it is clearly something we will need to reckon with at some point. The ethics of tech is becoming an ever more complicated subject.