
Emotion AI Excites Some Businesses, But the Legal Landscape Feels Fickle
https://www.activistpost.com, by Chris Burt

This is the view advanced by Lena Kempe of LK Lawfirm in an article in the American Bar Association's Business Law Today.
Emotion AI comes with some of the same concerns as biometric technologies, such as risks to data privacy and bias. It also introduces the possibility that individuals whose emotions are read by automated systems could be manipulated by them.
Kempe suggests that the market is growing. The article cites a forecast from market analyst Valuates, which puts revenues in the field at $1.8 billion in 2022 and predicts rapid growth to $13.8 billion by 2032 as businesses attempt to improve online user experiences and organizations address mental health and wellbeing.
Kempe also notes that as of 2019, Affectiva was performing advertising research for a quarter of Fortune 500 companies. A year and a half later, the company said that figure had reached 28 percent, and today it stands at 26 percent, somewhat undercutting the claim of rapid growth.
Emotion AI uses data such as the text and emojis contained in social media posts, facial expressions, body language and eye movements captured by cameras, and the tone, pitch and pace of voices captured by microphones and shared over the internet. Biometric data such as heart rate can also be used to detect and identify emotions, as can behavioral data like gestures.
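To make the text-and-emoji channel concrete, here is a minimal, purely illustrative Python sketch of the kind of cue extraction such systems begin with. The word and emoji lexicons and the scoring are invented for this example; real emotion AI products use trained models over far richer signals (audio, video, biometrics), not keyword lookups.

```python
from collections import Counter

# Toy lexicons -- invented for illustration only; commercial emotion AI
# relies on trained classifiers, not hand-written keyword tables.
WORD_EMOTIONS = {
    "love": "joy", "great": "joy", "happy": "joy",
    "hate": "anger", "furious": "anger",
    "sad": "sadness", "miss": "sadness",
    "scared": "fear", "worried": "fear",
}
EMOJI_EMOTIONS = {
    "😀": "joy", "😂": "joy", "😢": "sadness",
    "😡": "anger", "😱": "fear",
}

def score_post(text: str) -> dict[str, int]:
    """Count emotion cues from words and emojis in a social media post."""
    counts: Counter[str] = Counter()
    for token in text.lower().split():
        stripped = token.strip(".,!?")
        if stripped in WORD_EMOTIONS:
            counts[WORD_EMOTIONS[stripped]] += 1
    for char in text:
        if char in EMOJI_EMOTIONS:
            counts[EMOJI_EMOTIONS[char]] += 1
    return dict(counts)

if __name__ == "__main__":
    post = "I love this product 😂 but I'm worried about my data 😱"
    print(score_post(post))  # {'joy': 2, 'fear': 2}
```

Even this toy version makes the privacy point visible: the input is an ordinary social media post, and the output is an inference about the poster's inner state.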
If this data or its output can directly identify a person, or if it can be reasonably linked to an individual, it falls under the category of personal information. This, in turn, brings it into the scope of the European Union's General Data Protection Regulation and a raft of diverse U.S. state data privacy laws. In some cases outlined by Kempe, the information can qualify as sensitive personal data, triggering further restrictions under GDPR and state law.
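The categorization Kempe describes can be read as a simple decision rule. The sketch below encodes that logic in Python purely to illustrate the article's reasoning; the field names are invented for the example, and none of this is legal advice.

```python
from dataclasses import dataclass

@dataclass
class EmotionRecord:
    """Illustrative record of emotion AI output; fields are hypothetical."""
    directly_identifies: bool       # e.g. tied to a named user account
    reasonably_linkable: bool       # e.g. joinable to a person via device ID
    reveals_sensitive_traits: bool  # e.g. health or biometric inferences

def classify(record: EmotionRecord) -> str:
    """Apply the categorization described in the article: identifiable data
    is personal information in scope of GDPR and U.S. state privacy laws;
    sensitive traits trigger further restrictions."""
    if not (record.directly_identifies or record.reasonably_linkable):
        return "not personal information"
    if record.reveals_sensitive_traits:
        return "sensitive personal data (heightened GDPR/state-law rules)"
    return "personal information (GDPR/state privacy laws apply)"

print(classify(EmotionRecord(True, False, True)))
```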