
Autonomous Robotics

Our research in autonomous robotics is organized around the problems posed by robotic assistants, that is, partially autonomous robot systems that interact with human operators with whom they share a natural environment. Robotic assistants need an array of sensor systems and powerful perceptual algorithms so that they may acquire enough information about the scene to interpret user commands and autonomously perform actions such as orienting toward objects, retrieving them, possibly manipulating them, and handing them over to the human operator. Based on analogies with how nervous systems generate motor behavior and simple forms of cognition, we use attractor dynamics and their instabilities at three levels: to generate movement trajectories, to generate goal-directed sequences of behaviors, and to derive task-relevant perceptual representations that support goal-directed behavior.
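
As a minimal illustration of the first of these levels, the following sketch (illustrative only, not our production code; all parameter values are assumed for the example) integrates an attractor dynamics for a single heading direction phi. The rate of change vanishes at the target direction psi_tar, so the heading relaxes toward it, and a smooth movement trajectory emerges from integrating the dynamics over time.

    // Sketch: attractor dynamics for a heading direction phi.
    // The fixed point at phi = psi_tar is an attractor because the rate of
    // change is zero there with negative slope. All values are assumed.
    #include <cmath>
    #include <cstdio>

    int main() {
        double phi = 0.0;             // current heading (rad)
        const double psi_tar = 1.2;   // target direction (rad), assumed
        const double lambda = 2.0;    // relaxation rate (1/s), assumed
        const double dt = 0.01;       // Euler time step (s)

        for (int step = 0; step < 500; ++step) {
            // dphi/dt = -lambda * sin(phi - psi_tar)
            phi += dt * (-lambda * std::sin(phi - psi_tar));
        }
        std::printf("heading after 5 s: %.3f rad (target %.3f rad)\n",
                    phi, psi_tar);
        return 0;
    }

The sine makes the dynamics periodic in the heading, as is appropriate for a directional variable; instabilities come into play when several such contributions, for instance repellors for obstacle directions, are superposed.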

Interested in autonomous robotics?

If you are a student interested in our work, have a look at the lecture Autonomous Robotics: Action, Perception, and Cognition, or our lab course in autonomous robotics.

We also offer group study projects, as well as Bachelor, Master, and Diploma projects for students of various fields. Check the projects on offer, or simply contact us with your interests and we will discuss what is possible.

PhD students are also encouraged to attend our yearly summer school on the topic.

If you would like to visit the lab, meet some of the people and have a look at our robots, just send us an email.

Robotic platforms

CoRA

CoRA is a non-mobile anthropomorphic robotic assistant. It consists of an anthropomorphic arm with seven degrees of freedom and a stereo camera head. It also features haptic and force/torque sensors for direct physical interaction, as well as a separate camera system that detects the human operator's gaze direction.

New robotic assistant

We are currently working on replacing CoRA with a completely new arm and stereo camera system. The seven-degrees-of-freedom arm by KUKA is in place and working, with a three-finger hand with pressure sensors already mounted at its tip. The system will soon be completed by a stereo vision head, similar in design to the CoRA system but with faster, higher-resolution cameras. Everything will be mounted on a new, custom-made table, which will also serve as the robot's workspace during experiments.

NAO

NAO is a medium-sized humanoid robot developed by the French company Aldebaran. The robot has 25 degrees of freedom, two HD (1280x720 pixel) color cameras, loudspeakers and microphones, as well as infrared, ultrasound, and pressure sensors. We have used it in study projects, a Master's thesis, and in ongoing research on the organization of behaviors.

Mobile robots (e-puck and Khepera II)

Easy to program and transport, these small mobile robots are used extensively in our lab sessions for students, in visits by school groups, and at our yearly summer school.

The e-puck robot is only 7 cm wide, has a differential wheel drive (one wheel on either side), and is equipped with a camera and infrared sensors to perceive its environment. We now use it almost exclusively for our lab sessions, although we occasionally fall back on the older Khepera II models.
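
To make "differential wheel drive" concrete, here is a small sketch of the standard kinematics (this is not e-puck driver code, and the wheel-base value is only an assumption): the mean of the two wheel speeds gives the forward speed, and their difference divided by the wheel base gives the turning rate.

    // Sketch: differential-drive kinematics and pose integration.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double wheel_base = 0.053;       // wheel separation (m), assumed
        const double v_left = 0.02;            // left wheel speed (m/s)
        const double v_right = 0.04;           // right wheel speed (m/s)

        const double v = 0.5 * (v_left + v_right);            // forward speed
        const double omega = (v_right - v_left) / wheel_base; // turning rate

        // integrate the pose (x, y, theta) with forward Euler
        double x = 0.0, y = 0.0, theta = 0.0;
        const double dt = 0.05;
        for (int i = 0; i < 100; ++i) {
            x += dt * v * std::cos(theta);
            y += dt * v * std::sin(theta);
            theta += dt * omega;
        }
        std::printf("pose after 5 s: x=%.3f m, y=%.3f m, theta=%.3f rad\n",
                    x, y, theta);
        return 0;
    }

Driving both wheels at the same speed thus moves the robot straight ahead; any speed difference makes it turn along an arc, or on the spot if the wheels counter-rotate.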

cedar

cedar is the core C++ library developed in the Autonomous Robotics group. It bundles our knowledge of and approach to autonomous robotics, with an emphasis on cognition, embodiment, and dynamics.

cedar is the result of an ongoing effort to rewrite and integrate a collection of powerful yet incoherent and fragmented code that had been developed over many years. We put a lot of effort into keeping the different parts of cedar compatible with each other. Linking up a robot's perception and motor control, and putting cognitive processing in between, should not be constrained by programming issues; it should only be a question of the concepts applied. cedar is a tool that facilitates connecting modules into robotic architectures: you can focus on "what" to connect while keeping the effort of "how" to connect it as low as possible.
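
To illustrate the idea only, here is a deliberately simplified sketch; it is not cedar's actual API, and every name in it is invented for this example. An architecture is built by declaring which module feeds which, so the "how" of passing data along is handled generically:

    // Invented mini-example of the module-connection idea (NOT cedar's API).
    #include <cstdio>
    #include <functional>
    #include <vector>

    struct Module {
        std::function<double(double)> compute;   // this module's processing step
        std::vector<Module*> listeners;          // downstream modules

        void connectTo(Module& target) { listeners.push_back(&target); }

        void feed(double input) {
            const double output = compute(input);
            for (Module* m : listeners) {
                m->feed(output);                 // propagate along connections
            }
        }
    };

    int main() {
        Module perception{[](double x) { return 0.5 * x; }};  // toy stage
        Module cognition{[](double x) { return x + 1.0; }};   // toy stage
        Module motor{[](double x) {
            std::printf("motor command: %f\n", x);
            return x;
        }};

        perception.connectTo(cognition);   // declare "what" connects to what...
        cognition.connectTo(motor);        // ...instead of hand-wiring "how"
        perception.feed(2.0);              // drive the chain with a sensor value
        return 0;
    }

The point of the sketch is the separation of concerns: each module knows only its own computation, while the connections are declared at the architecture level.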

People

Group leader:
Prof. Dr. Gregor Schöner

Researchers:
Jean-Stephane Jokeit
Oliver Lomp
Farid Oubbati
Dr. Hendrik Reimann
Mathis Richter
Dr. Yulia Sandamirskaya
Stephan Zibner

How to find us

The robotics lab is situated in building NB, level 02, room 77. On most weekdays, at least one of us can be found there between 10 am and 5 pm.