Robots Got Synthetic Emotions

For robots to perform highly specialized tasks, it is sometimes necessary to embed affective behaviour that has not traditionally been associated with intelligence. Indeed, emotions play a critical role in the way humans reason and make decisions. In other words, emotions have a critical impact on intelligence.

Robots should decide autonomously, with little or no human intervention, how to respond to changes in their contexts and environments. While some self-adaptive systems are able to run without any external intervention, others are guided by high-level goals. Because self-adaptive systems are complex systems with a high degree of autonomy, it is harder to ensure that they behave as desired and avoid wrong behaviour.
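As a rough illustration of what such goal-driven autonomy can look like in code (the loop structure and every name below are my own invention for this sketch, not taken from any particular system), a self-adaptive robot can be organised around a simple monitor-analyse-plan-execute cycle driven by high-level goals:

```python
# Minimal sketch of a self-adaptive control loop (hypothetical names throughout).
from dataclasses import dataclass


@dataclass
class Goal:
    """A high-level goal, e.g. keep the battery above a threshold."""
    name: str
    is_satisfied: callable  # maps an observation dict to True/False
    repair_action: str      # action to plan when the goal is violated


def monitor(sensors):
    """Monitor: collect raw observations from the environment."""
    return {name: read() for name, read in sensors.items()}


def analyse(observation, goals):
    """Analyse: find goals that the current observation violates."""
    return [g for g in goals if not g.is_satisfied(observation)]


def plan(violated_goals):
    """Plan: choose repair actions for the violated goals (here, trivially)."""
    return [g.repair_action for g in violated_goals]


def execute(actions, actuators):
    """Execute: dispatch the planned actions to actuators."""
    for action in actions:
        actuators[action]()


# Example wiring with fake sensors and actuators.
goals = [Goal("battery_ok", lambda obs: obs["battery"] > 0.2, "go_charge")]
sensors = {"battery": lambda: 0.15}
actuators = {"go_charge": lambda: print("Heading to charging station")}

execute(plan(analyse(monitor(sensors), goals)), actuators)
```

The difficulty alluded to above is precisely that, as goals and sensors multiply, guaranteeing that such a loop never chooses a wrong action becomes much harder.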


Accordingly, humans cannot make decisions effectively if their emotional subsystem is not working well. For that reason, how to apply the intelligent function of emotions to robots and how to make robots intelligent are strongly interrelated questions. Artificial emotion is an emerging research subject whose aim is to endow machines with artificial emotions.

Affective computing research has shown that people make both random and systematic errors when anticipating their own future emotional states. Given this divergence between anticipated and experienced reactions, it is worth examining computational methods that avoid those issues.

I proposed Artificial Intelligence techniques as a worthwhile tool for generating artificial emotions in robots immersed in complex environments with high uncertainty. I designed a robotic model that can empathize with the emotions of people waiting in a queue and act accordingly in order to take care of them. The research is published in the Applied Soft Computing journal.
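The published model is more involved; as a hedged, minimal sketch of the general idea (the names, thresholds and frustration formula below are illustrative assumptions, not the method from the paper), a robot could estimate how frustrated each person in the queue is likely to be from their waiting time and then attend to the most frustrated one:

```python
# Illustrative sketch only: estimate queue frustration and decide who to attend.
# The formula and thresholds are assumptions for the example, not the published model.
from dataclasses import dataclass


@dataclass
class WaitingPerson:
    person_id: str
    minutes_waiting: float


def estimated_frustration(person: WaitingPerson, tolerance_minutes: float = 10.0) -> float:
    """Map waiting time to a frustration score in [0, 1] (assumed saturating curve)."""
    return min(1.0, person.minutes_waiting / tolerance_minutes)


def choose_action(queue: list[WaitingPerson]) -> str:
    """Attend the most frustrated person if anyone is above a comfort threshold."""
    if not queue:
        return "idle"
    most_frustrated = max(queue, key=estimated_frustration)
    if estimated_frustration(most_frustrated) > 0.7:
        return f"approach and reassure {most_frustrated.person_id}"
    return "continue serving the head of the queue"


queue = [WaitingPerson("A", 3.0), WaitingPerson("B", 9.0)]
print(choose_action(queue))  # -> approach and reassure B
```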

Affective computing aims to narrow the gap between computers and affective humans. It tries to give systems the human-like capabilities of observing, interpreting and generating emotions. Emotions have a critical impact on humans' physical states, actions, beliefs, motivations, decisions and desires. An appropriate balance of emotions gives human beings flexibility and creativity in solving problems. In the same sense, if we want robots to have real intelligence, to adapt to the environment in which humans live and to communicate with human beings naturally, then robots need to understand and express emotions to a certain degree.

Affective computing is an emerging but promising research field dealing with the issues that connect emotions and computers. Over recent years, emotion research has become a multidisciplinary field attracting growing interest. Indeed, it plays a critical role in human-machine interaction. Automatic recognition of emotional states aims to improve the interaction between humans and machines; furthermore, it could be used to make systems act according to the user's current emotions, as sketched below.
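As a toy, hedged example of that last idea (the emotion labels and responses are assumptions for illustration, not from any particular system), a recognised emotional state can simply be mapped to an adaptation of the system's behaviour:

```python
# Toy sketch: adapt system behaviour to a recognised emotional state.
# Labels and responses are illustrative assumptions.

RESPONSES = {
    "frustrated": "simplify the dialogue and offer to hand over to a human operator",
    "stressed": "slow down prompts and reduce the amount of information shown",
    "neutral": "keep the default interaction style",
    "happy": "suggest optional extra features",
}


def adapt_interaction(recognised_emotion: str) -> str:
    """Return the interaction strategy for a recognised emotion (fallback: neutral)."""
    return RESPONSES.get(recognised_emotion, RESPONSES["neutral"])


print(adapt_interaction("frustrated"))
```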

It could be valuable in many real-life applications, such as fear-type emotion recognition for audio-based surveillance systems, real-life emotion detection in a medical emergency call centre, military applications, stress detection for pilots and drivers, semi-automatic diagnosis of psychiatric diseases, and so on.
