Our research in autonomous robotics is organized around the problems posed by robotic assistants: partially autonomous robot systems that interact with human operators with whom they share a natural environment. Robotic assistants need an array of sensor systems and powerful perceptual algorithms to acquire enough information about the scene to interpret user commands and to autonomously perform actions such as orienting toward objects, retrieving them, possibly manipulating them, and handing them over to the human operator. Based on analogies with how nervous systems generate motor behavior and simple forms of cognition, we use attractor dynamics and their instabilities at three levels: to generate movement trajectories, to generate goal-directed sequences of behaviors, and to derive task-relevant perceptual representations that support goal-directed behavior.
Interested in autonomous robotics?
If you are a student interested in our work, have a look at the lecture Autonomous Robotics: Action, Perception, and Cognition, or our lab course in autonomous robotics.
We also offer group study projects, as well as Bachelor, Master, and Diploma projects for students of various fields. Check the offered projects, or simply contact us and we will discuss possible options.
PhD students are encouraged to also attend our yearly summer school on the topic.
If you would like to visit the lab, meet some of the people and have a look at our robots, just send us an email.
CoRA is a non-mobile anthropomorphic robotic assistant. It consists of an anthropomorphic arm with seven degrees of freedom and a stereo camera head. Additionally, it features haptic and force/torque sensors for direct physical interaction, as well as a further camera system to detect the human operator's gaze direction.
New robotic assistant
We are currently working on replacing CoRA with a completely new arm and stereo camera system. The seven-degrees-of-freedom arm by KUKA is in place and working, with a three-finger hand with pressure sensors already mounted at its end. The system will soon be completed by a stereo vision head, similar in design to the CoRA system but with faster, higher-resolution cameras. Everything will additionally be mounted on a new, custom-made table, which will also serve as the robot's workspace during experiments.
NAO is a medium-sized humanoid robot developed by the French company Aldebaran. The robot has 25 degrees of freedom, two HD (1280x720 pixel) color cameras, loudspeakers and microphones, as well as infrared, ultrasound, and pressure sensors. We have used it in study projects, a Master's thesis, and in ongoing research on the organization of behaviors.
Mobile robots (e-puck and Khepera II)
Easy to program and transport, this class of small mobile robots is used extensively in our lab sessions for students, for visiting groups from schools, and at our yearly summer school.
The e-puck robot is only 7 cm wide, has a differential wheel drive (one wheel on either side), and is equipped with a camera and infrared sensors to perceive its environment. We now use it almost exclusively for our lab sessions, although we occasionally fall back on the older Khepera II models.
cedar is the core C++ library developed in the Autonomous Robotics group. It bundles our knowledge of and approach to autonomous robotics, with an emphasis on cognition, embodiment, and dynamics.
cedar is the result of an ongoing effort to rewrite and integrate a collection of powerful, yet incoherent and fragmented code that had been developed over many years. We put considerable effort into keeping the different parts of cedar compatible with each other. Linking up a robot's perception and motor control, and putting cognitive processing in between, should not be constrained by programming issues but should be a question of applied concepts only. cedar is a tool aimed at making it easy to connect modules into robotic architectures: you can focus on what to connect, while keeping the effort of how to connect it as low as possible.