AI Dealing with Human Emotions
By: Palak Kotwani
From: Nagpur, Maharashtra, India
When I was a kid, I used to watch numerous science-fiction movies featuring a robot as one of the main characters. The robot spoke perfect English, but its speech sounded monotone and dull – devoid of inflection and variation. Recent movies like Tomorrowland, however, depict robots with a sensitive, warm voice that is indistinguishable from a human's. In the movie, Athena (the robot) interacts with Frank on a very personal level, and they develop an emotionally intimate bond – so much so that he eventually falls in love with Athena, who is nothing more than a computerized robot.
Human emotions have served a long evolutionary purpose in our survival as a species. They are either a reaction to an external stimulus or a spontaneous expression of an internal thought process. Emotions like fear are often a reaction to an external stimulus: when we cross a busy road, the fear of being run over triggers our evolutionary survival mechanism. These are external causes that trigger emotions inside our brain. However, emotions can also be invoked by an internal thought process. For example, if I managed to solve a complicated mathematical differential equation, the feeling of personal satisfaction could make me happy. It may be a purely introspective act with no external cause, but solving the equation still triggers emotion.
In the same way, AI designers could simulate emotion from a machine's internal logic – for example, the joy of solving a differential equation. Furthermore, emotions triggered by external stimuli – joy, sadness, surprise, disappointment, fear, and anger – could be invoked through interactions via written language, sensors, and so on. Computational methods would then be required to process and express the emotions that arise from human interaction.
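The two triggers described above – internal events and external stimuli – can be illustrated with a minimal, hypothetical sketch. All of the event names and emotion mappings below are invented for illustration; a real system would learn or engineer these far more carefully.

```python
class EmotionSimulator:
    """Toy model: named events update a simulated emotional state."""

    # Hypothetical mappings, invented for this sketch.
    INTERNAL_EVENTS = {"solved_equation": "joy", "failed_proof": "frustration"}
    EXTERNAL_STIMULI = {"loud_noise": "surprise", "obstacle_ahead": "fear"}

    def __init__(self):
        self.state = "neutral"

    def internal_event(self, event):
        # An introspective trigger, e.g. solving a differential equation.
        self.state = self.INTERNAL_EVENTS.get(event, self.state)
        return self.state

    def external_stimulus(self, stimulus):
        # A trigger from the environment, e.g. a sensor reading.
        self.state = self.EXTERNAL_STIMULI.get(stimulus, self.state)
        return self.state


bot = EmotionSimulator()
print(bot.internal_event("solved_equation"))    # joy
print(bot.external_stimulus("obstacle_ahead"))  # fear
```

The point of the sketch is only that both kinds of trigger feed the same simulated state – the machine does not feel anything, it merely maps events to labels it can act on.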
Indeed, without emotions we would not have survived as a species, and our intelligence has improved as a result of them. Furthermore, we cannot detach our emotions from the way in which we apply our intelligence. For example, a medical clinician may decide on medical grounds that the best treatment option for a very elderly hospital patient would be a surgical procedure. However, the clinician's emotional empathy with the patient might override this view. Taking the patient's age into account, he or she may decide that the emotional stress the operation is likely to cause the patient is not worth the risk – and therefore rule it out. Emotional intelligence, as well as technical knowledge, is used to decide on treatment options. Of course, machines could never feel emotions as we humans do. Nevertheless, they could simulate emotions that enable them to interact with humans in more appropriate ways.
However, it does not always make sense to try to replicate in a machine everything a human being feels. For example, physiological feelings like hunger and tiredness alert us to the state of our body and are normally triggered by hormones and our digestive system. A distinction should also be made between a mobile robot and a disembodied computer. The latter would have a far more limited range of emotions, as it would not be able to physically interact with its environment the way a robot can. The more sensory feedback a machine could receive, the wider the range of feelings and emotions it would be able to experience.
Understanding Human Emotions using AI
In recent years, AI has improved significantly at detecting human emotions through voice, body language, facial expressions, and so on. For example, voice-recognition AI systems are learning to detect human emotions through speech intonation, speech pauses, and the like, in much the same way that we detect changes in the emotional moods of our loved ones, friends, or work colleagues. Recently, researchers claimed to have developed a deep-learning program that can tell whether a person is a criminal just by looking at their facial features, with a reported accuracy of 90%. In 2016, Apple bought Emotient, a start-up that created software capable of reading facial expressions. This could be used to help AI assistants like Siri and Alexa understand the moods of their owners. Another application of such software could be in retailing: with support from in-store CCTV cameras, retailers could infer a customer's interest from their body language. For example, a customer returning to the same item, or studying it intently, might indicate strong interest, triggering an approach from store assistants.
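One of the signals mentioned above, written language, admits a very simple illustration. The sketch below is a hypothetical keyword-lexicon classifier, invented for this article; real emotion-detection systems train statistical models over voice, facial, and body-language features rather than matching words.

```python
# Hypothetical emotion lexicon; every keyword here is an
# illustrative assumption, not taken from any real system.
LEXICON = {
    "joy": {"love", "great", "happy", "wonderful"},
    "anger": {"terrible", "hate", "awful", "angry"},
    "surprise": {"unexpected", "wow", "sudden"},
}

def detect_emotion(text):
    """Return the emotion whose keywords best match the text."""
    words = set(text.lower().split())
    scores = {emotion: len(words & keywords)
              for emotion, keywords in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"


print(detect_emotion("I love this, it is wonderful"))  # joy
print(detect_emotion("The weather is cloudy"))         # neutral
```

Even this toy version shows the shape of the task: map raw human signals to a small set of emotion labels that downstream software (a shop assistant alert, a voice assistant's reply) can act on.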
The Future of AI Emotions
There are several potential benefits of using AI programs to detect human emotions. They don't get paid or tired, and they can operate 24 hours a day while making consistent decisions. Furthermore, the view that human emotions are off limits to machines that work in logic no longer carries weight. As Yuval Noah Harari argues, humans are essentially a collection of biological algorithms shaped by millions of years of evolution. If that is so, non-organic algorithms could in principle replicate, and even surpass, everything that organic algorithms do in human beings.
Author: Palak Kotwani
Palak is a senior in high school and is passionate about mathematics, driverless cars, and data science. In her free time, she loves to play badminton and solve sudoku.
References
Minsky, M. L.: The Society of Mind. Simon and Schuster, New York (1986)
Kurzweil, R.: How to Create a Mind. Viking Penguin, New York (2012)
Harari, Y. N.: Homo Deus: A Brief History of Tomorrow. Harvill Secker, London (2015)