Shirina Mulani

The Use of Emotional AI In Today’s Digital Age



Recently, the Straits Times published an article about the unique use of AI at True Light College, a secondary school for girls in Kowloon, Hong Kong. Students have been attending classes from home for the majority of the past year, which allowed the school to adopt 4 Little Trees, an artificial intelligence program that claims to read the children's emotions as they learn. The program's goal is to help teachers make distance learning more interactive and personalised by responding to an individual student's reactions in real time. According to its founder, the software can read the children's feelings with 85% accuracy, and its popularity has exploded during the global pandemic: the original 34 schools using it have grown to a total of 83 over the past year.


According to market research, the emotion detection industry is projected to almost double from $19.5bn in 2020 to $37.1bn by 2026. As impressive as this programme may seem, should we be utilising technology for this purpose at all?


As corporations and governments roll out this technology for widespread use, critics have pointed out a major flaw: there is little evidence to show that it works accurately. While the algorithms may be able to detect and decode facial expressions, those expressions do not necessarily reveal what a person is actually feeling or thinking.


Risks of Emotional AI


Researchers have found that emotions are expressed in a huge variety of ways, which makes it hard to reliably infer how someone feels from a simple set of facial movements. In other words, a smile or a frown does not necessarily mean that a person is happy or angry. Companies therefore have to go further in proving the link between expression and behaviour, as simply analysing a face does not guarantee an accurate interpretation of a person's emotions in that instant.


Because of the subjective nature of emotions, emotional AI is especially prone to bias. For example, one study found that emotion-analysis technology assigns more negative emotions to people of certain ethnicities than to others. In the context of the workplace, AI could hinder an individual's career progression simply because of how it interprets their emotions. AI is also often not sophisticated enough to understand cultural differences in expressing and reading emotions, making it harder to draw accurate conclusions. For instance, the act of slurping noodles, while considered rude in most places, is a socially acceptable practice in Japan. When such cultural differences are not accounted for, AI can wrongly label customers and, if left unaddressed, perpetuate stereotypes.



Potential Benefits of Emotional AI

On the other hand, emotional AI can be a powerful marketing tool with enormous potential to optimise customer relations, as it makes humans less inscrutable and easier to predict at scale. For instance, businesses can use the technology to analyse consumer reactions to a range of advertisements and gain insight into what customers do and don't like. This allows companies to carry out extensive testing before broadcasting only the advertisements shown to be effective for their target audience.
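
To make the idea concrete, here is a minimal sketch of how such ad testing might be scored, assuming the emotion model already returns per-viewer scores such as happiness and interest. The field names, the simple averaged "engagement" metric and the rankAdsByEngagement function are illustrative assumptions, not any particular vendor's API.

```typescript
// Emotion readings collected from viewers of one advertisement variant.
interface AdReading {
  adId: string;
  happiness: number; // 0..1, as reported by the emotion-analysis model
  interest: number;  // 0..1
}

// Average a simple "engagement" score per ad variant so variants can be compared.
function rankAdsByEngagement(readings: AdReading[]): { adId: string; score: number }[] {
  const totals = new Map<string, { sum: number; count: number }>();

  for (const r of readings) {
    const engagement = (r.happiness + r.interest) / 2;
    const entry = totals.get(r.adId) ?? { sum: 0, count: 0 };
    entry.sum += engagement;
    entry.count += 1;
    totals.set(r.adId, entry);
  }

  return [...totals.entries()]
    .map(([adId, { sum, count }]) => ({ adId, score: sum / count }))
    .sort((a, b) => b.score - a.score);
}

// Example: variant B elicits stronger positive reactions, so it would be
// the one chosen for the wider campaign.
console.log(rankAdsByEngagement([
  { adId: "A", happiness: 0.4, interest: 0.5 },
  { adId: "B", happiness: 0.7, interest: 0.8 },
]));
```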


This method can also be highly effective in online retail. For instance, if the algorithm detects facial expressions indicating that a shopper is in distress, the website could display a pop-up window helping them contact customer service. Similarly, if a shopper hovers their cursor over the order button without buying, the algorithm could send them a coupon code to incentivise the purchase and convert them into a paying customer.
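
As an illustration only, the sketch below shows the kind of front-end logic such a shop might use. The detectEmotions stub, the thresholds and the coupon code are hypothetical placeholders standing in for a real emotion-detection service and real business rules.

```typescript
// Hypothetical emotion scores (0..1) for a single camera frame.
interface EmotionScores {
  distress: number;
  happiness: number;
}

// Stand-in for a real emotion-detection model; a production system would
// call a computer-vision service here instead of returning fixed values.
async function detectEmotions(frame: ImageBitmap): Promise<EmotionScores> {
  return { distress: 0.1, happiness: 0.6 };
}

const DISTRESS_THRESHOLD = 0.7; // tuned per application
const HOVER_DELAY_MS = 3000;    // how long a hover counts as hesitation

function showSupportPopup(): void {
  // In a real shop this would open a chat widget or contact form.
  console.log("Looks like you might need help - chat with customer service?");
}

function offerCoupon(code: string): void {
  console.log(`Here is a discount code to complete your order: ${code}`);
}

// 1) Distress detection: if the model reports high distress, surface help.
async function onCameraFrame(frame: ImageBitmap): Promise<void> {
  const scores = await detectEmotions(frame);
  if (scores.distress > DISTRESS_THRESHOLD) {
    showSupportPopup();
  }
}

// 2) Hesitation detection: a long hover over the order button triggers a coupon.
function watchOrderButton(button: HTMLElement): void {
  let hoverTimer: number | undefined;

  button.addEventListener("mouseenter", () => {
    hoverTimer = window.setTimeout(() => offerCoupon("WELCOME10"), HOVER_DELAY_MS);
  });

  button.addEventListener("mouseleave", () => {
    if (hoverTimer !== undefined) {
      window.clearTimeout(hoverTimer);
    }
  });
}
```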


The Future of Emotional AI


Despite the risks and concerns about the current accuracy and biases of emotional AI, many scientists are confident that the technology will improve as the data used to train the algorithms become better suited to specific applications, and as companies begin to design country-specific solutions.





Regardless, critics remain skeptical about whether the benefits of this technology truly outweigh the costs. Emotions are what make us human, and putting the power of emotional connection in the hands of machines may give people an excuse to disconnect. For example, if an AI tool could check on a loved one and send a report saying everything is fine, a user might decide that is enough information and not bother to confirm it is true. Ultimately, it is up to businesses and governments to set boundaries for the appropriate use of emotional AI in society today, and to shape its potential for the world of tomorrow.





