Research
Our research focuses on autonomous mobile robots, i.e. physical systems that can move autonomously in their environment and accomplish various tasks. The key aspects of our research are:
Multi-robot navigation and coordination
In this field, we investigate methods that enable groups of robots to navigate their environment efficiently without colliding or interfering with each other. For instance, strategies for the coordinated traversal of narrow passages have to be devised. Furthermore, we develop robust methods for path planning. The aim is to generate plans that minimize the risk of navigation conflicts and at the same time are robust against problems such as unanticipated delays of individual robots.
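To give an impression of how coordinated navigation can be approached, the following is a minimal sketch of prioritized multi-robot path planning on a grid, not our actual system: each robot plans with space-time A* and reserves the cells it will occupy, so that lower-priority robots plan around them. The grid layout, start/goal positions, and planning horizon are illustrative assumptions, and edge (swap) conflicts as well as delay handling are omitted for brevity.

```python
# Sketch of prioritized planning: space-time A* with a shared reservation
# table. Illustrative only; map, robots and horizon are made-up values.
import heapq

FREE = '.'
GRID = ["........",
        "..####..",
        "........"]
MOVES = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]  # wait or step N/S/E/W


def passable(cell):
    r, c = cell
    return 0 <= r < len(GRID) and 0 <= c < len(GRID[0]) and GRID[r][c] == FREE


def plan(start, goal, reserved, horizon=50):
    """Space-time A*: states are (cell, time); 'reserved' holds (cell, time)
    pairs already claimed by higher-priority robots."""
    h = lambda cell: abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]
    seen = set()
    while open_set:
        _, t, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        if (cell, t) in seen or t >= horizon:
            continue
        seen.add((cell, t))
        for dr, dc in MOVES:
            nxt = (cell[0] + dr, cell[1] + dc)
            if passable(nxt) and (nxt, t + 1) not in reserved:
                heapq.heappush(open_set,
                               (t + 1 + h(nxt), t + 1, nxt, path + [nxt]))
    return None  # no conflict-free path found within the horizon


def plan_team(tasks):
    """Plan robots one after another in a fixed priority order."""
    reserved, paths = set(), []
    for start, goal in tasks:
        path = plan(start, goal, reserved)
        paths.append(path)
        for t, cell in enumerate(path or []):
            reserved.add((cell, t))  # claim every visited cell at its time
    return paths


if __name__ == "__main__":
    # two robots crossing the environment in opposite directions
    for p in plan_team([((0, 0), (2, 7)), ((2, 7), (0, 0))]):
        print(p)
```

The fixed priority order keeps the sketch simple; approaches that reorder priorities or replan online are one way to address the unanticipated delays mentioned above.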
Modeling of environments and state estimation
The aim of this project is to develop models and techniques for the interpretation of sensor data. The central problem is to build a consistent model of the environment with one or multiple robots. Important subtasks include exploration, i.e. the efficient investigation of unknown environments, and localization, i.e. the estimation of the robot's position in the environment based on sensor information.
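As an illustration of sensor-based position estimation, the following is a minimal sketch of Monte Carlo localization in a one-dimensional corridor, not our actual implementation: particles represent position hypotheses, are weighted by how well a binary door-sensor reading matches the known door positions in the map, and are then resampled. The map, noise levels, and sensor model are illustrative assumptions.

```python
# Sketch of Monte Carlo localization (particle filter) in a 1-D corridor.
# Corridor length, door positions and noise parameters are made-up values.
import random

CORRIDOR_LENGTH = 20.0
DOORS = [3.0, 9.0, 14.0]        # assumed map: positions of doors
N_PARTICLES = 500


def sees_door(x, tolerance=0.5):
    """Simple binary sensor: 'door detected' when close to a mapped door."""
    return any(abs(x - d) < tolerance for d in DOORS)


def mcl_step(particles, motion, measurement):
    """One predict-weight-resample cycle of the particle filter."""
    # Prediction: move every particle, adding motion noise.
    moved = [(x + motion + random.gauss(0.0, 0.1)) % CORRIDOR_LENGTH
             for x in particles]
    # Weighting: particles consistent with the measurement get high weight.
    weights = [0.9 if sees_door(x) == measurement else 0.1 for x in moved]
    # Resampling: draw a new particle set proportional to the weights.
    return random.choices(moved, weights=weights, k=len(moved))


if __name__ == "__main__":
    particles = [random.uniform(0, CORRIDOR_LENGTH) for _ in range(N_PARTICLES)]
    true_pos = 2.0
    for _ in range(12):
        true_pos = (true_pos + 1.0) % CORRIDOR_LENGTH   # robot drives forward
        particles = mcl_step(particles, 1.0, sees_door(true_pos))
    estimate = sum(particles) / len(particles)
    print(f"true position {true_pos:.1f}, estimate {estimate:.1f}")
```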
Interaction
The objective of this project is to make dealing with robots simpler, so that humans can easily interact and communicate with a mobile system. To this end, we have to consider robot perception, i.e. the question of how the robot perceives and interprets the actions of its users. Additionally, a robot must be able to communicate its intentions to the user.
Applications
Here, we are interested in applications for mobile robots in natural environments populated by humans. Possible applications are mobile information assistants operating in shopping malls, exhibitions, or museums. The mobile robots RHINO and MINERVA, for instance, were employed as interactive museum tour guides in 1997 and 1998. Furthermore, we want to develop service robots that can accomplish tasks such as surveillance and clean-up at home.