This new humanoid robot, named "Cody", comes from Georgia Tech's Healthcare Robotics Lab (to which I belong). Cody is composed of a Segway RMP 50 Omni mobile base, a 1-DoF vertical linear actuator, and a pair of 7-DoF Meka arms with series elastic actuators (the same arms used by Simon). This mobile manipulator has shown some pretty impressive capabilities. It can open doors, drawers, and cabinets using equilibrium point controllers developed by Advait Jain and Prof. Charlie Kemp. It also has a nice direct physical interface (a touch-based interface) for repositioning the robot, developed by Tiffany Chen and Prof. Charlie Kemp. Much of the code controlling this robot is open-source and has ROS (Robot Operating System) interfaces. Be sure to check out the videos and photos below.
This robot is described in detail in a few recent papers:
An early photo (below left) shows some of the robot's main components, while another photo (below right) shows Cody with its pan-tilt stereo head and aesthetic case.
My labmate (Advait Jain) and advisor (Prof. Charlie Kemp) have built some nice "equilibrium point controllers" (EPCs) to control the low mechanical impedance arms and perform door, drawer, and cabinet opening. In general, an EPC simulates a torsional viscoelastic spring with constant stiffness and damping and a variable equilibrium angle. These controllers seem to vastly simplify robot behavior creation, and they work at very low refresh rates (around 10 Hz, commanded from Python). You can learn more in the papers describing this work (here and here), or on the project's homepage.
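To give a feel for the idea, here is a minimal sketch of one EPC step for a single joint. The gains, names, and trajectory are my own illustrative assumptions, not code from the lab's actual controllers:

```python
import numpy as np

def epc_torque(q, q_dot, q_eq, k=30.0, b=1.0):
    """Torque for one joint from a simulated torsional viscoelastic spring.

    q     -- current joint angle (rad)
    q_dot -- current joint velocity (rad/s)
    q_eq  -- commanded equilibrium angle (rad)
    k, b  -- constant stiffness (N*m/rad) and damping (N*m*s/rad); illustrative values
    """
    return k * (q_eq - q) - b * q_dot

# A behavior (e.g. pulling open a door) only needs to stream slowly
# varying equilibrium angles at ~10 Hz; the spring law handles contact.
q_eq_trajectory = np.linspace(0.0, 0.8, 50)  # rad; illustrative sweep
```

The appeal is that the behavior code never reasons about contact forces directly: it just moves the equilibrium angle, and the compliant spring-like law mediates interaction with the door or drawer.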
The photos below show examples of the robot opening a number of mechanisms using a special "hook" end effector.
You can also watch videos of it in action.
Another labmate (Tiffany Chen) and advisor (Prof. Charlie Kemp) have built a number of interfaces for repositioning the robot -- both the omni-directional (mecanum) base and the arms. One interface is a typical gamepad (PlayStation-style) controller, while the other is a novel "direct physical interface" (DPI).
The direct physical (touching) interface is rather intuitive. You grab hold of the robot and reposition it as you would a person, guiding them by the hand or arm. Tiffany and Charlie recently reported the results of a user study wherein nurses repositioned the robot using these two interfaces. Their results (paper here) were presented at the Human-Robot Interaction (HRI2010) conference in Japan. Here is a video of Tiffany demonstrating the direct physical interface:
Force sensor readings and end effector positions are measured and used to compute the forward/backward and angular velocities of the robot's base, as well as velocities along the vertical linear actuator. The user can also grab the robot's highly compliant arms and push them toward or pull them away from the robot's body; the resulting changes in the shoulder joint angles are measured and used to compute side-to-side velocities of the omni-directional base. The user can control all 4 DoF simultaneously. In a user study of 18 nurses from the Atlanta, GA area, we showed that this DPI is superior to a comparable gamepad interface according to several objective and subjective measures.
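The mapping described above can be sketched as a simple admittance-style control law. Everything here is a hypothetical illustration of the general scheme: the gain values, function name, and choice of lever-arm term are my assumptions, not the actual DPI implementation:

```python
def dpi_base_command(f_x, lever_y, d_shoulder_left, d_shoulder_right,
                     k_lin=0.002, k_ang=0.01, k_lat=0.5):
    """Map user interaction at the arms to base velocities (illustrative gains).

    f_x              -- sensed force along the robot's forward axis (N)
    lever_y          -- lateral end-effector offset acting as a lever arm (m)
    d_shoulder_left  -- left shoulder-angle deflection from pushing/pulling (rad)
    d_shoulder_right -- right shoulder-angle deflection (rad)
    Returns (forward velocity, angular velocity, lateral velocity).
    """
    v_forward = k_lin * f_x                       # push/pull -> drive fwd/back
    w_turn = k_ang * f_x * lever_y                # off-center force -> rotation
    v_lateral = k_lat * (d_shoulder_left - d_shoulder_right)  # arm deflection -> strafe
    return v_forward, w_turn, v_lateral
```

A vertical-axis velocity would be computed analogously from forces and end effector positions along the linear actuator, giving the user simultaneous control of all 4 DoF.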
Here are a few images as well.