Human-Robot Interaction
Emotions, behavior and motion sensors

Our laboratory owns two Nao H25 humanoid robots. These robots help us explore social interaction between humans and robots. Enabling social interactions in robots allows us to put into use and test many of the methods for analysis of brain signals developed in our laboratory.

We explore a number of paradigms, among them brain-computer interfaces, social interaction, and the detection and communication of emotions. It is known that emotions enhance synchronization of brain activity between subjects and that strong emotions make the brain activities of different individuals "literally" synchronous [Nummenmaa et al. 2012, Hasson et al. 2012].
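One simple way to quantify such inter-subject synchrony is the correlation between EEG signals recorded from two subjects. The sketch below (with toy signals, not our actual recordings) computes a Pearson correlation coefficient between two traces:

```python
# Sketch: Pearson correlation as a minimal measure of synchrony between
# two subjects' EEG traces. The signals here are illustrative sine waves.
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Two toy "EEG" traces: scaled copies of one oscillation are perfectly correlated.
t = [i / 100.0 for i in range(200)]
subj_a = [math.sin(2 * math.pi * 10 * ti) for ti in t]        # 10 Hz, alpha-like
subj_b = [0.8 * math.sin(2 * math.pi * 10 * ti) for ti in t]  # scaled copy
print(round(pearson(subj_a, subj_b), 3))  # → 1.0
```

In practice the correlation would be computed on band-pass-filtered signals over sliding windows; this sketch only shows the core measure.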

Subjects exchange emotions either through visualization and sonification of their EEG, or through their robotic avatars.
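A sonification can be as simple as mapping a normalized EEG feature onto a pitch range. The mapping below is a hypothetical sketch (the feature name and frequency range are illustrative, not our actual parameters):

```python
# Sketch of a sonification mapping (hypothetical parameters): linearly map a
# normalized EEG feature, e.g. relative alpha-band power in [0, 1], onto a
# pitch range, so that changes in the feature become audible as pitch changes.
def feature_to_pitch(power, lo_hz=220.0, hi_hz=880.0):
    """Map a normalized band-power value in [0, 1] to a frequency in Hz."""
    power = max(0.0, min(1.0, power))  # clamp out-of-range values
    return lo_hz + power * (hi_hz - lo_hz)

print(feature_to_pitch(0.0))  # → 220.0
print(feature_to_pitch(0.5))  # → 550.0
print(feature_to_pitch(1.0))  # → 880.0
```

The resulting frequency would then drive a tone generator; any monotone mapping (linear, logarithmic) can be substituted here.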

As a first step in our human-robot interaction project we had to teach our robots new behaviors and movements, in order to make their behavior more human-like: familiarity and friendliness are essential for interaction with humans. The Nao H25 robots are very flexible; they can perform fairly complex behaviors such as dancing or singing, individually or in synchrony.

A rich repertoire of behaviors of a single robot is useful for a basic display of emotions. In social interactions, however, signals from the environment have to be incorporated into the robots' behavior.

Examples of emotions and behaviors displayed by interacting robots.

The emotions expressed by a robot can differ depending on its current pose and affective state. A robot getting angry after stumbling can be seen in the next video:

Robots can act together or independently. Below you can find some more examples of two robots expressing similar emotions, gestures and reactions. Each robot acts independently upon a signal about the affective state or context of communication; the behavior shown is chosen randomly from a set of pre-recorded and categorized behaviors.
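The selection logic just described can be sketched in a few lines. The behavior names and categories below are placeholders, not the actual files installed on our robots:

```python
# Sketch: each robot independently picks a pre-recorded behavior that matches
# the signalled affective state. Names and categories are illustrative.
import random

BEHAVIORS = {
    "happy": ["wave_arms", "small_dance", "nod_head"],
    "angry": ["stomp_foot", "shake_head", "cross_arms"],
    "sad":   ["hang_head", "slow_sway"],
}

def choose_behavior(affective_state, rng=random):
    """Return a randomly chosen behavior name for the given affective state."""
    return rng.choice(BEHAVIORS[affective_state])

print(choose_behavior("angry"))  # e.g. "shake_head"
```

Because each robot draws its own random choice, two robots receiving the same affective signal display similar but not identical behaviors, as in the videos above.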

Our robots learn new behavior and motion patterns with the help of the Microsoft Kinect sensor. To create new behaviors quickly, we are developing a real-time interface that translates motion captured by the sensor directly to the Nao. It is, however, always safer to test a new action in the simulator first, as shown in the video below:
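The core of such a retargeting step is computing joint angles from the Kinect skeleton points. The sketch below shows the geometry under assumed conventions (it is not our actual interface): the elbow angle is the angle between the upper-arm and forearm vectors.

```python
# Sketch (assumed geometry): derive a joint angle for the robot from three
# 3-D Kinect skeleton points, here the angle at the elbow.
import math

def joint_angle(shoulder, elbow, wrist):
    """Angle in radians at the elbow, given three 3-D points."""
    u = [s - e for s, e in zip(shoulder, elbow)]  # elbow -> shoulder
    v = [w - e for w, e in zip(wrist, elbow)]     # elbow -> wrist
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.acos(dot / (nu * nv))

# A fully extended arm along the x-axis gives an elbow angle of pi (180 deg).
print(round(joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0)), 4))  # → 3.1416
```

In the real-time interface, angles computed this way would be clamped to the Nao's joint limits and streamed to the motion controller; the simulator run mentioned above catches retargeting errors before they reach the hardware.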

We use a number of open-source libraries, such as OpenCV, together with our own methods to allow the robots to perform some of the basic tasks required for interaction.
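One such basic task is face tracking: turning a detected face bounding box (for example from an OpenCV Haar-cascade detector) into head yaw/pitch offsets that would center the face in the robot's camera image. The sketch below covers only the geometric step; the field-of-view values are approximate and the sign conventions are assumptions, not the Nao's actual motion API:

```python
# Sketch (hypothetical parameters): convert a face bounding box in the camera
# image into angular head offsets that would center the face.
import math

H_FOV = math.radians(60.9)  # approximate horizontal field of view of the camera
V_FOV = math.radians(47.6)  # approximate vertical field of view

def face_to_head_offsets(face, img_w, img_h):
    """Return (yaw, pitch) offsets in radians for a face box (x, y, w, h)."""
    x, y, w, h = face
    cx, cy = x + w / 2.0, y + h / 2.0
    yaw = (cx / img_w - 0.5) * H_FOV    # positive: face right of image center
    pitch = (cy / img_h - 0.5) * V_FOV  # positive: face below image center
    return yaw, pitch

# A face exactly in the center of a 640x480 image needs no head movement.
print(face_to_head_offsets((300, 220, 40, 40), 640, 480))  # → (0.0, 0.0)
```

These offsets would then be sent to the robot's head joints, so the robot keeps its interlocutor in view during an interaction.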