Robots Make and Deliver Pancakes: A Cooperative Effort By a PR2 (TUM's James) and TUM-Rosie

PR2 and Rosie Combine Efforts to Make and Deliver Pancakes

Dejan Pangercic of the Intelligent Autonomous Systems Group at TUM (Technische Universität München) wrote in to show us a cool dual-robot demonstration in which a PR2 robot (TUM's James) and TUM-Rosie combine their efforts to prepare and deliver pancakes -- Yum!  The demonstration system is quite impressive, featuring door and drawer opening, object recognition, grasping and manipulation, navigation, multi-robot cooperation, and more.  The demo seems to use a fair bit of stock ROS functionality, as well as some new functionality and CRAM integration (Cognitive Robot Abstract Machine, a reasoning framework from TUM).  I'm eager to learn more about the system: assumptions, limitations, and methods.  Hopefully more technical details are forthcoming.  Check out the video below.
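Neither the video nor the writeup below spells out which ROS components are involved, but for the curious, a "stock ROS" navigation request typically looks something like this Python sketch, which sends a goal to move_base via actionlib.  The frame name and coordinates are placeholders, not values from the actual demo:

# Minimal sketch (not the demo's actual code) of a "stock ROS" navigation
# request: send a goal to move_base via actionlib.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('deliver_pancake_mix')

client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'    # assumed fixed frame
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 1.0      # placeholder coordinates
goal.target_pose.pose.position.y = 2.0
goal.target_pose.pose.orientation.w = 1.0   # identity orientation

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo("Navigation finished with state %d", client.get_state())

Presumably the real system layers CRAM plans on top of primitives like this one.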

Here is the "Robotic Roommates Making Pancakes" video:

[Video: Robotic Roommates Making Pancakes]

Here is what Dejan Pangercic had to say about the system:

Cognitive robots baking pancakes

 

Three robots from the Munich-based cluster of excellence CoTeSys (Cognition for Technical Systems) starred as chefs, serving self-made pancakes to the international observers.

James, a PR2 robot, opened and closed cupboards and drawers, took the prepared pancake mix from the refrigerator, and delivered it to his square-shouldered companion Rosie. Using her two human-like arms and hands, she baked and flipped one pancake after another and had James serve them to the guests. In the meantime, the iCub, a small humanoid robot roughly a meter tall, sat in the corner of the kitchen playing with Lego blocks that James delivered from time to time.

What may look like a mechanical puppet show is in fact a demonstration that cognitive capabilities like learning, probabilistic inference, and action planning enable service robots to perform simple everyday tasks, says Michael Beetz, project leader and vice chairman of the CoTeSys cluster. These cognitive capabilities allow the robots to cope with mechanical inaccuracies, shifted furniture, obstacles, and execution errors. Failures in task execution are handled using learned knowledge about the processes involved, the tools, and their usage. If the spatula has not been grasped correctly, for example, the robot will notice this and adjust its grasp. And if the robot does not know what to do, it downloads the required information from the Internet -- like humans do, just a bit faster. To retrieve the correct bottle of pancake mix from the refrigerator, it looks up a picture on the web. The instructions for baking pancakes were also retrieved from the web, from wikihow.com.
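The spatula regrasp is a nice, concrete example of this failure handling.  Below is a minimal Python sketch of what such a monitor-and-recover loop might look like; the helpers (attempt_grasp, grasp_is_stable, adjust_grasp) are hypothetical stand-ins for the robot's real perception and control calls, not actual CRAM or TUM APIs:

# Hypothetical sketch of a failure-aware grasp loop in the spirit of the
# spatula example. These helpers are stand-ins, not real CRAM/TUM APIs.
import random

def attempt_grasp(tool):
    # Stub: command the arm to grasp the tool; pretend it sometimes fails.
    return random.random() > 0.3

def grasp_is_stable(tool):
    # Stub: check force/tactile feedback for a firm grip.
    return random.random() > 0.2

def adjust_grasp(tool):
    # Stub: reposition the fingers using learned knowledge about the tool.
    print("Adjusting grasp on the", tool)

def grasp_with_recovery(tool, max_attempts=3):
    for attempt in range(max_attempts):
        if attempt_grasp(tool) and grasp_is_stable(tool):
            print("Grasped the", tool, "on attempt", attempt + 1)
            return True
        adjust_grasp(tool)  # recover and retry instead of giving up
    return False

grasp_with_recovery("spatula")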

It is not primarily the individual cognitive mechanisms implemented on the robots that are new, says Michael Beetz, but rather the way in which they are grounded in sensor data and embedded into the robots' control programs to improve performance. This allows the robots to perform complex manipulation actions involving tool use and object interactions, as well as the coordination of multiple hands and even multiple robots. Cameras and laser scanners are the robots' "eyes", sensors in the arms and hands measure forces, and knowledge of their own capabilities and optimization strategies helps the robots perform their tasks. After the demonstrations, the robots were even able to explain what they did and why.
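That last claim -- robots explaining what they did and why -- implies some sort of execution trace.  As a toy illustration (my own construction, not the actual CRAM logging machinery), recording (action, reason, outcome) records during execution is already enough to answer such questions afterwards:

# Toy illustration of post-demo explanation via an execution trace.
# Not the CRAM logging machinery, just the underlying idea: record what
# was done and why, then replay the trace on request.
trace = []

def log_action(action, reason, outcome):
    trace.append({"action": action, "reason": reason, "outcome": outcome})

log_action("open the refrigerator", "the pancake mix is stored inside", "succeeded")
log_action("grasp the pancake mix", "it is a required ingredient", "succeeded")
log_action("hand the mix to Rosie", "Rosie does the actual cooking", "succeeded")

def explain():
    for step in trace:
        print("I %s because %s (%s)." % (step["action"], step["reason"], step["outcome"]))

explain()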

"This globally unique demonstration of cognitive capabilities of three autonomous robots", says Michael Beetz, "helps us to learn a lot about the challenges in performing human everyday activities. This will allow us to develop significantly more versatile, flexible, reliable and learning robot control programs."

A large part of the software used in this demonstration will soon be made available to robot researchers the world over, as part of the open-source software library ROS. In this way, the developments within the CoTeSys cluster will have a lasting impact on the development of future cognitive systems.  

 

The iCub demo Dejan alluded to is embedded here:

[Video: iCub demo]

For those keeping score, it appears TUM-Rosie has a new look.  The arms on the old TUM-Rosie (left) have been rearranged into a new angled configuration (right).  The new configuration kind of reminds me of the arm configuration on Intel / CMU's Herb robot.

[Images: TUM Rosie robot -- old arm configuration (left) and new angled configuration (right)]

 

Comments

This is Hawt!
—Anonymous
