
An Intro to Affective Computing

Not a lot of us have heard of affective computing; most people I've spoken to about it didn't know anything about the field. So I thought I'd write an intro explaining what I've understood about the discipline, and hopefully I'll get to learn more from the comments. So let's get started.


Affective computing is all about understanding human emotions in a human-machine interface and responding based on those emotions. Consider this: you walk into an ATM vestibule to withdraw some cash, but you're tense about being late for your date, who is already waiting for you at the restaurant. Anybody who sees you in this state at the ATM would easily understand that you're stressed about something. Now imagine the ATM itself understanding your emotion, sensing that you're stressed, and maybe asking you to relax or playing some soothing music. Wouldn't that be great? That, put (really) simply, is affective computing. It does sound crazy that a machine could understand your emotion and alter its behaviour accordingly. Today, we'll try to understand how it works.


Affective computing, as a discipline, can be considered a mix of computer science, psychology, and cognitive science. It is the study and development of systems that can recognise, understand, and even simulate human emotions. The 'affective' in affective computing refers to affect, a synonym for emotion. The ultimate goal of the field is to build or simulate empathy in machines, so that they can understand the emotion of the human interacting with them, adapt to that emotion, and alter their behaviour accordingly. That's pretty interesting, don't you think?

There have been numerous efforts towards realising this goal, with varying degrees of success. One such recent effort was even granted actual citizenship of Saudi Arabia. That's something! If you're wondering, I'm definitely talking about the amazing humanoid robot, Sophia. If you've never heard of her, below is a compilation of some of her interviews; it's a must-watch, at least for the fun of it.

[Video: A compilation of Sophia's interviews]


Recognising human emotions

By now, you might be asking: how the heck could a computer understand human emotions? Well, there are many options here, depending on the type of system you're building. A few common ones are video cameras, microphones, touch- and pressure-sensitive hardware, and so on. Let's look at a few examples.

To begin with, let's take the example of Sophia herself. You can see that she has eyes and ears. She's obviously able to hear and process everything you say to her; that's how she's able to hold a conversation with you. Depending on the speaker's tone, she can recognise whether the speaker is calm, agitated, angry, or happy, adapt to that tone, and change the way she responds. I'm not sure whether she also uses the two cameras in her eyes to pick up the facial expressions of the person speaking with her and infer their emotional state from those, but that's a possibility too.
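To get a feel for the audio side of this, here's a minimal sketch, assuming the librosa and NumPy libraries, of how a voice clip could be turned into tone features for emotion recognition. The file name greeting.wav is a hypothetical placeholder, and this is just one plausible feature set, not how Sophia actually works.

```python
# A minimal sketch of tone-based features from speech, assuming the
# librosa library is installed. The audio file is a hypothetical
# placeholder, not data from any real system.
import librosa
import numpy as np

def extract_tone_features(wav_path):
    # Load the recording at a fixed sample rate
    y, sr = librosa.load(wav_path, sr=16000)
    # MFCCs summarise the timbre/tone of the voice
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    # Loudness (RMS energy) is a common cue for agitation vs. calm
    rms = librosa.feature.rms(y=y)
    # Average over time to get one fixed-length vector per clip
    return np.concatenate([mfcc.mean(axis=1), rms.mean(axis=1)])

features = extract_tone_features("greeting.wav")  # hypothetical file
print(features.shape)  # (14,)
```

A vector like this, computed per utterance, is what a downstream emotion classifier would typically consume.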

Next, suppose you're chatting with a bot, maybe a support chatbot on some e-commerce website. You ordered something a couple of weeks back and still haven't received the product, so you're understandably a bit frustrated. Based on the tone of the text messages you exchange with it, the chatbot may be able to recognise your emotion and adjust its own tone accordingly.
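To make the chat example concrete, here's a minimal sketch using NLTK's VADER sentiment analyser as a stand-in for a real emotion model. The message and the frustration threshold below are invented for illustration; this isn't how any particular chatbot actually works.

```python
# A minimal sketch of detecting frustration in a chat message using
# NLTK's VADER sentiment analyser. The message and the threshold are
# illustrative assumptions, not a production rule.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

message = "I ordered this two weeks ago and it still hasn't arrived!"
scores = sia.polarity_scores(message)  # {'neg': ..., 'pos': ..., 'compound': ...}

# Treat strongly negative messages as a frustrated customer
if scores["compound"] < -0.4:  # threshold chosen arbitrarily here
    print("Customer seems frustrated; switch to an apologetic tone.")
else:
    print("Customer seems calm; continue as usual.")
```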

Similarly, if you're chatting with a local chatbot, or a bot embedded in a hardware product, the system could measure the pressure with which you're hitting the keys on the keyboard, provided the required hardware is available in the setup. These are all examples of how a computer can try to recognise and understand the emotion of the human interacting with it.
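Since there's no standard API for pressure-sensitive keyboards, the sketch below is purely hypothetical: the readings, their layout, and the thresholds are all invented to show what such a signal might look like once the hardware reports it.

```python
# A purely hypothetical sketch of keystroke-based emotion cues.
# Real pressure-sensitive keyboards would need vendor-specific
# drivers; the readings below are invented sample data.
from statistics import mean

# Each tuple: (seconds since previous keypress, pressure in 0.0-1.0)
keystrokes = [(0.12, 0.90), (0.08, 0.95), (0.09, 0.85), (0.07, 0.92)]

avg_gap = mean(gap for gap, _ in keystrokes)
avg_pressure = mean(p for _, p in keystrokes)

# Fast, hard typing is one plausible (assumed) signal of agitation
if avg_gap < 0.1 and avg_pressure > 0.8:
    print("Typing pattern suggests the user may be agitated.")
else:
    print("Typing pattern looks relaxed.")
```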

As you can imagine, the data collected from all these different sensors on the machine (cameras, microphones, and so on) is used to train a machine learning model, which then predicts the emotion of the human in real time. Machine learning is thus an integral part of affective computing.
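To show what that train-then-predict loop might look like, here's a minimal scikit-learn sketch. The feature vectors and emotion labels are random placeholders standing in for real sensor-derived features, so the reported accuracy will be chance-level.

```python
# A minimal sketch of the train-then-predict loop described above,
# using scikit-learn. Features and labels are random placeholders
# standing in for real camera/microphone-derived data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 14))                  # e.g. fused sensor features
y = rng.choice(["calm", "stressed"], size=200)  # hypothetical labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(model.score(X_test, y_test))  # chance-level here, since data is random

# At run time, a fresh feature vector from the sensors is classified
new_reading = rng.normal(size=(1, 14))
print(model.predict(new_reading))  # e.g. ['stressed']
```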


Applications

Right now, affective computing is still largely in the research phase. There are projects such as Sophia that are fairly mature and making public appearances, but not many such systems have been deployed to the public yet. This, however, doesn't mean there aren't future applications for the technology.

There are already a few restaurants and hotels in parts of the world where robot receptionists are deployed. We have a restaurant here in Bangalore where orders are taken and food is served only by robots. It's amazing to see in action.

[Image: A robot restaurant in Bangalore]

Most future applications are planned in similar services and industries. You might see a robot pet that can do things similar to today's service dogs, or a robot that receives you at the airport after you land, handles your immigration, and offers other similar services. I'm pretty sure the smart brains behind all this will come up with many more applications, but we'll have to wait a bit longer to find out.

[Image: A robot receptionist]

For now, if you're interested in doing something of your own in the data science realm, check out my other posts on data science. Or, if you want to play around with text classification, you can go through my fastText series. fastText is a text classification library from Facebook, and it's pretty interesting.
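If you want a quick taste before diving into the series, here's a minimal supervised-classification sketch using fastText's Python API. The training file reviews.train and its contents are hypothetical; fastText expects one example per line, prefixed with a __label__ tag.

```python
# A minimal fastText text-classification sketch. "reviews.train" is a
# hypothetical path; fastText expects training lines like:
#   __label__positive great product, arrived on time
#   __label__negative still waiting for my order
import fasttext

model = fasttext.train_supervised(input="reviews.train")
labels, probs = model.predict("where is my package, this is ridiculous")
print(labels, probs)  # e.g. (('__label__negative',), array([0.97]))
```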

