
Epigenetic Robotics 2009 under the auspices of FEELIX GROWING

The 9th International Conference on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems (EPIROB'09) is hosted in Venice, Italy, from the 12th to the 14th of November 2009 and is organized under the auspices of FEELIX GROWING this year. More information can be found at: www.epigenetic-robotics.org.

Robot Playground at the Science Museum

How will humans interact with the robots of the future?

The Feelix Growing Nursery features robots developed by UK labs to help scientists answer this increasingly important question; they were on show in the Science Museum's Antenna Gallery from 17 to 19 February 2009.

The Robot Playground displayed a group of robots from a 'robot nursery' developed by Dr Lola Cañamero as part of the European Feelix Growing Project, led by the University of Hertfordshire. The nursery contained different types of baby robots: an expressive robotic head, Aibo dog robots and humanoid Nao robots. Like any good human nursery, these robots were given a play mat and some toys to explore, while being watched over by human caregivers.

The expressive robot head responds to the emotions of the person sitting in front of it. Visitors had the opportunity to see it express its own emotions and to play peek-a-boo with it. The Aibo dog robots invited people to become their caregivers, using visual and tactile contact to relieve distress. Toys were available to calm these playful canine robots. Researchers from the University of Hertfordshire were on hand to answer questions and to gather feedback from visitors to the event.

 


Murray, JC., Cañamero, L. (2008) 'Towards a Hormone-Modulated Model for Emotion Expression in a Socially Interactive Robot Head'. In: The Role of Emotion in Adaptive Behavior and Cognitive Robotics at SAB 2008, Osaka, Japan. July 7-12, 2008.
Abstract:
In this paper we present ERWIN, a robot head capable of human-robot interaction, endowed with interactive mechanisms that allow the emotional state and expression of the robot to be directly influenced by the social interaction process. Allowing the interaction process to influence the robot head's expression can in turn influence the way the user interacts with the robot, in addition to helping the user better understand the robot's intentions during this process. We discuss some of the interactions that are possible with ERWIN and how these can affect the response of the system. We show an example scenario in which the interaction process makes the robot go through several different emotions.
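
The abstract describes the hormone-modulated mechanism in prose only. Below is a minimal Python sketch of how a model of this general kind can be structured: slowly decaying internal variables are raised by social stimuli and mapped to a displayed expression. The class names, decay constants, stimuli and thresholds are illustrative assumptions, not the authors' implementation.

# Minimal sketch of a hormone-modulated emotion model (illustrative only;
# names, parameters and dynamics are assumptions, not the ERWIN code).
class Hormone:
    """A slowly decaying internal variable raised by social stimuli."""
    def __init__(self, decay=0.9):
        self.level = 0.0
        self.decay = decay

    def update(self, stimulus):
        # Decay toward baseline, then add the effect of the current stimulus.
        self.level = self.decay * self.level + stimulus
        return self.level

class EmotionModel:
    """Maps hypothetical hormone levels to a displayed expression."""
    def __init__(self):
        self.pleasure = Hormone(decay=0.9)    # raised by friendly contact
        self.distress = Hormone(decay=0.97)   # raised by sudden events

    def step(self, face_detected, sudden_motion):
        p = self.pleasure.update(1.0 if face_detected else 0.0)
        d = self.distress.update(1.0 if sudden_motion else 0.0)
        if d > 1.0:
            return "startled"
        return "happy" if p > 0.5 else "neutral"

model = EmotionModel()
for t in range(10):
    # The interaction (a face appearing, a sudden motion) drives the expression.
    print(t, model.step(face_detected=(t > 3), sudden_motion=(t == 2)))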


Gaussier, Ph., Andry, P., Boucenna, S. (2008) 'Dynamic fields and Interactive Systems'. In: Proceedings of Conference on Dynamics & Applications. Braga, Portugal. September 2008
Abstract:
The dynamical system approach is an interesting framework for analysing and designing complex control architectures [7, 6]. Focusing on the dynamics makes it possible to move beyond some limitations of functional approaches and to highlight possible emergent properties. For instance, in previous work, exploiting perceptual ambiguity, we have shown that a simple visuo-motor homeostat can be used to trigger low-level imitation capabilities [5, 4]. Moreover, dynamical neural fields make it easy to combine different control strategies in a single system: motor commands obtained from different neural networks working at different frequencies can easily be merged in a single neural field controlling several degrees of freedom. Yet in these systems, performance depends directly on the human's ability to maintain the interaction. To allow turn-taking, or simply long-term interaction, the robot must not be a purely reactive system but must be endowed with some "will" to interact. In recent work, we have shown that a simple internal oscillator can be used to maintain low-level interactions. To go one step further, we try to address the question of predicting the stable states of a system interacting with its environment [2, 3]. As a toy problem, we have analysed how an expressive robot head could learn to associate the facial expression of a human or another robot with its own internal emotional state. We have shown, in the case of a simple reactive architecture, that a solution to obtain a stable state of interaction is [...]
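
A dynamic neural field of the kind referred to above can be sketched as a discretised Amari-style field: commands proposed by different controllers are summed as inputs to the same field, lateral excitation and inhibition let a single peak stabilise, and the peak position is read out as the motor command. The kernel shape, parameters and read-out below are illustrative assumptions, not the architecture used in the paper.

# Minimal 1D dynamic neural field sketch (Amari-style); parameters are illustrative.
import numpy as np

N = 100                                  # field positions (e.g. motor angles)
x = np.linspace(-np.pi, np.pi, N)
u = np.zeros(N)                          # field activation
h, tau, dx = -0.5, 10.0, x[1] - x[0]     # resting level, time constant, grid step

# Difference-of-Gaussians lateral kernel: local excitation, broader inhibition.
def kernel(d):
    return 1.5 * np.exp(-d**2 / 0.1) - 0.6 * np.exp(-d**2 / 1.0)

W = kernel(x[:, None] - x[None, :])

def f(u):
    return 1.0 / (1.0 + np.exp(-10.0 * u))   # sigmoid output nonlinearity

def step(u, inputs):
    # 'inputs' sums the bumps proposed by different controllers on the same field.
    return u + (-u + h + W.dot(f(u)) * dx + inputs) / tau

# Two controllers propose nearby targets; the field merges them into one decision.
target_a = np.exp(-(x - 0.3)**2 / 0.05)
target_b = np.exp(-(x - 0.6)**2 / 0.05)
for _ in range(300):
    u = step(u, target_a + target_b)

print("selected angle:", x[np.argmax(u)])    # peak position as motor command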


Lagarde, M., Andry, P., Gaussier, Ph. (2008) 'Distributed Real-Time Neural Networks In Interactive Complex Systems'. In: Proceedings of the IEEE International Conference on Soft Computing as Transdisciplinary Science and Technology (CSTST 08). Paris, France. October 2008
Abstract:
In this paper, we present two graphical software tools that support the modelling and simulation of real-time, distributed neural networks (NNs). Used in the framework of developing control architectures for autonomous robots, these tools allow real-time control and on-line learning of different behaviours. As an illustration, we present two control architectures: the first allows a mobile robot to navigate using on-line learning of visual places, and the second allows a robotic head to learn to express a given set of emotions during imitation games.
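
The first architecture mentioned in the abstract, on-line learning of visual places, can be illustrated with a minimal nearest-prototype sketch: a view that resembles no stored place creates a new place category on-line, while a familiar view is recognised. The feature representation, the similarity threshold and the class name are hypothetical, not the neural networks described in the paper.

# Minimal sketch of on-line visual place learning by nearest-prototype recognition
# (illustrative assumptions only, not the paper's neural architecture).
import numpy as np

class PlaceLearner:
    def __init__(self, threshold=0.6):
        self.prototypes = []          # one stored feature vector per learned place
        self.threshold = threshold    # similarity needed to reuse an existing place

    def observe(self, features):
        """Return the index of the recognised place, learning a new one on-line
        if no stored prototype is similar enough."""
        features = features / (np.linalg.norm(features) + 1e-9)
        if self.prototypes:
            sims = [float(p @ features) for p in self.prototypes]
            best = int(np.argmax(sims))
            if sims[best] > self.threshold:
                return best                      # recognised a known place
        self.prototypes.append(features)
        return len(self.prototypes) - 1          # created a new place on-line

learner = PlaceLearner()
rng = np.random.default_rng(0)
for _ in range(5):
    view = rng.random(32)                        # stand-in for a visual feature vector
    print("place id:", learner.observe(view))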
