The Use of Emotional AI in Today’s Digital Age



Recently, the Straits Times published an article about the unique use of AI at True Light College, a secondary school for girls in Kowloon, Hong Kong. With students attending classes from home for the majority of the past year, the school has adopted 4 Little Trees, an artificial intelligence program that claims to read the children’s emotions as they learn. The program’s goal is to help teachers make distance learning more interactive and personalised by responding to an individual student’s reactions in real time. According to its founder, the software can read the children’s feelings with 85% accuracy, and its popularity has exploded during the global pandemic: use of the software has spread from the original 34 schools to a total of 83 over the past year.


According to market research, the emotion detection industry is projected to almost double from $19.5bn in 2020 to $37.1bn by 2026. As impressive as this program may seem, should we be using technology for this purpose at all?


As corporations and governments roll out this technology for widespread use, critics have pointed to a major flaw: there is little evidence that it works accurately. While an algorithm may be able to detect and decode facial expressions, those expressions do not necessarily reveal what a person is actually feeling or thinking.


Risks of Emotional AI

Researchers have found that emotions are expressed in a huge variety of ways, which makes it hard to reliably infer how someone feels from a simple set of facial movements. A smile or a frown does not necessarily mean that a person is happy or angry. Companies therefore need to go further in proving the link between expression and behaviour, as analysing a face alone does not guarantee an accurate reading of a person’s emotions in that moment.


Because of the subjective nature of emotions, emotional AI is especially prone to bias. For example, one study found that emotional analysis technology assigns more negative emotions to people