Friend and colleague Richard Roberts has offered to give us a glimpse of his recent work on learning autonomous robot behaviors/controllers from human operators:
While working on the DARPA LAGR project, we found it exceedingly difficult to tune our reactive behaviors to work well in cluttered and patchy environments. Either obstacle avoidance was too sensitive and the robot would not drive through gaps, or it was too aggressive and the robot would collide with obstacles. Of course, we could have made the behaviors more and more complicated, introducing more parameters to tune, but we wanted an easy way to have the robot “just do what I say!” Thus, we developed a system for interactive, on-line training of behaviors with a remote control. The user flips a switch to training mode, and drives the robot how they would like it to drive, then flips the switch back to autonomous to test the behavior.
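The train/test switching described above can be sketched as a bare-bones learning-from-demonstration loop. This is a hypothetical illustration (the class and feature names are mine, not the LAGR code): record (observation, command) pairs while the operator drives, then replay the nearest recorded command when autonomous.

```python
import math

class DemoBehavior:
    """Minimal learning-from-demonstration sketch: record (observation, command)
    pairs while the operator drives in training mode, then replay the nearest
    stored command when the switch is flipped back to autonomous."""

    def __init__(self):
        self.examples = []  # list of (observation, command) pairs

    def record(self, observation, command):
        # Called while the remote-control switch is in training mode.
        self.examples.append((observation, command))

    def act(self, observation):
        # Autonomous mode: 1-nearest-neighbor lookup over recorded examples.
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        _, command = min(self.examples, key=lambda ex: dist(ex[0], observation))
        return command

# Toy example: two clearance "features" (left, right) mapped to steering.
b = DemoBehavior()
b.record((2.0, 0.5), "steer_left")   # obstacle close on the right
b.record((0.5, 2.0), "steer_right")  # obstacle close on the left
print(b.act((1.8, 0.4)))  # → steer_left
```

A real system would use richer features and some generalization beyond nearest-neighbor, but the operator-facing workflow is exactly this: demonstrate, flip the switch, test.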
Back on December 15th, we got a look at the internals of a SICK Laser Rangefinder (LIDAR), a $6k device that employs a single laser diode to produce ~6000 points per second (~600 points per scan at ~10Hz) over a 180° field-of-view. Now, we can compare that to the Rolls Royce of Laser Rangefinders -- the Velodyne Lidar, a $75k device employing 64 laser diodes to produce 1.3 million data points per second with a 360° horizontal field-of-view and a 26.8° vertical field-of-view. Below is a video of Bruce Hall, President of Velodyne LIDAR, demonstrating the HDL-64E in operation and taking a look at its internals. It may not be a complete disassembly (it does cost $75,000 after all!), but it does provide some interesting insights into the Velodyne's internals.
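To put those spec-sheet numbers side by side, here's the quick arithmetic (using only the figures quoted above):

```python
# Rough point-rate comparison from the figures quoted above.
sick_points_per_scan = 600
sick_scan_rate_hz = 10
sick_rate = sick_points_per_scan * sick_scan_rate_hz  # ≈ 6,000 pts/s

velodyne_rate = 1_300_000  # pts/s, from the HDL-64E spec
velodyne_lasers = 64

print(sick_rate)                        # 6000
print(velodyne_rate / sick_rate)        # ≈ 216x the SICK's data rate
print(velodyne_rate / velodyne_lasers)  # ≈ 20,312 pts/s per laser diode
```

So for 12.5x the price you get roughly 216x the data, plus the extra vertical field-of-view.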
The Situational Awareness Mast (SAM, also known as a Zipper Mast) from Geosystems Inc. is a telescoping linear actuator with a unique property -- its stroke length is an order of magnitude greater than its nominal height! For example, the SAM8 is a 10 lb device with a stroke length (8ft) that is 24 times its nominal height (4 inches)! This can be used to vertically translate a robot's sensor suite for better visibility while still allowing for a low profile. Read on for information on the different Zipper Mast variants, the patent describing the system, and an exclusive video of a Zipper Mast on an iRobot Packbot!
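The 24x figure checks out once everything is in the same units:

```python
# Sanity-checking the SAM8 stroke-to-height ratio quoted above.
stroke_in = 8 * 12   # 8 ft stroke, converted to inches
height_in = 4        # 4 in stowed (nominal) height
print(stroke_in / height_in)  # → 24.0
```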
Back in May 2008 it was announced that CMU professors Sara Kiesler and Jodi Forlizzi (from the HCI Institute) and Paul Rybski (from the Robotics Institute) were awarded $500k in Microsoft's Human-Robot Interaction funding to develop a social, snack-selling robot to traverse Newell-Simon and Wean halls (press release). After seeing a prototype appear on Flickr in July, we've all been waiting patiently to see pictures of the final version. Well, the wait is over -- photos of the new CMU snackbot, conceptual designs, and construction photos are contained below! It appears that the CMU team is progressing nicely.
This is great! Honda is celebrating its 50th year in the US by creating a 49-foot tall Asimo float that will lead off the Rose Parade on January 1st, 2009. To quote the Honda press release: "Honda's Rose Parade float, a 49-foot replica of Honda's ASIMO humanoid robot, and the parade's first-ever hydrogen-powered fuel cell pace car, the Honda FCX Clarity, will lead the 120th Rose Parade as well as kick off Honda's 50th anniversary of U.S. operations." I'm always a fan of robots being displayed (and appreciated) by the general public; thanks to DVICE for pointing this out.
I've always wanted to pull apart a SICK laser rangefinder (LIDAR). However, the $6k price tag (and the prospect of advisor repercussions) has always been a sufficient deterrent. Well, Kyle Vogt of MIT has disassembled what looks to be a SICK LMS-210 -- perhaps his was already broken? Anyway, the internal design is surprisingly simple. It's interesting to look at the internals of such an iconic piece of robotics hardware. Read on for more images.
The folks at Dr. Matsumaru's Bio-Robotics & Human-Mechatronics Laboratory have worked on some very interesting human-robot interaction projects. I'm particularly interested in their video-projector interfaces. In one scenario, the video projector shows the robot's intended motion trajectory. In another scenario, dubbed the "Step-On Interface" or SOI, users step on projected "buttons" to control the robot. According to videos (below), Dr. Matsumaru is targeting home-based service robots. Read on for videos and more information about the video projector robot interfaces, as well as some others (using visible lasers, LCDs, and Persistence of Vision or POV displays).
You may recall Justin, the humanoid robot sporting two DLR-III lightweight arms and two DLR-II hands. Well, Justin has recently acquired a 4-wheel mobile base dubbed "Rollin' Justin". The base utilizes a "powered-caster" design similar to the Willow Garage PR2, except that the torso-caster linkage contains a spring-loaded lift mechanism that gives the base a variable footprint. I'm sure this will prove useful when trying to squeeze through doors, adapting to uneven terrain, or providing a larger support polygon. While we currently do not have any video of the system in action, there are a number of great pictures and design documents below.
Several colleagues and I are anxiously following the creation of Willow Garage's PR2 mobile manipulation robot. Judging by the progress on WG's blog, it appears they're well on their way to functioning units by early next year; they already have some bases, spines, heads, and even an arm up and running -- read on to see more images from the PR2 "alpha" prototypes. One interesting aspect of Willow Garage is that their "Robot Operating System" (ROS), being developed by the Player/Stage founder Brian Gerkey, is entirely open source and runs on (among others) Ubuntu Linux! You may also recall that Keenan Wyrobek and Eric Berger (formerly at Stanford, now both at Willow Garage) had a hand in the PR1 robot, with impressive videos of the robot cleaning up rooms, fetching beer, and unloading a dishwasher (see videos below).
There has been a lot of press in the last six months revolving around El-E, the autonomous mobile manipulation platform for the motor impaired out of Georgia Tech's Healthcare Robotics Lab (to which I belong). There was a report in the NY Times on El-E's laser-pointer interface, and now a report in MIT Tech Review on El-E behaving like a service dog. Recently, the lab's director (and my advisor), Dr. Charlie Kemp, gave an impressive talk at Carnegie Mellon's Robotics Institute (CMU-RI) in which he adeptly tied together these research initiatives and made a compelling case for more autonomous mobile manipulators for the motor impaired. Read on for the CMU-RI video and some choice images and themes from the talk.
I think it is great that the Rotundus GroundBot (a spherical robot) made the Popular Science "Best of What's New in 2008"; however, I'm a bit perplexed... New Scientist featured the spherical robot all the way back in early 2005; how is it "new" now in 2008? Either way, this serves as a convenient time to re-examine this novel robot -- one that brings back memories of the old solar-powered, spherical BEAM robots from Solarbotics (it was called a "miniball," and is now discontinued). Read on for some compelling images of the Rotundus GroundBot, the spherical robot.
OK, I know what everyone is thinking... "What is this craziness? Inter-Galactic Love?" Well, let's just attribute it to a poor Japanese-English translation -- the title should have been left at just "Hinokio," which is a play on words on the old, classic film title "Pinocchio." In my opinion, this is the second-best robot movie of all time in terms of robot realism and "cool" humanoid robots (second to I, Robot), though it does possess some of those cheesy Japanese memes. The movie is about a Japanese boy who is unable to walk and thus uses a telepresence humanoid robot to experience life; everything the robot sees, hears, and feels, the boy experiences too. The film has amazing graphics and cinematography, and the human-robot interaction techniques are very well thought out. I'd recommend everyone grab a copy and watch it; it's definitely worth the time. Read further for more detailed information and some very cool images from the film.
Back on October 10th, John Leonard gave a Georgia Tech Robotics Institute talk about MIT's DARPA Urban Grand Challenge experience. The MIT entry, a Land Rover LR3 named Talos, came in fourth place overall (out of 6 finishers and 11 qualifiers). I thought the most interesting aspect of the design was that it was originally intended to be a "low cost" solution (meaning many $6k SICK lidars, low-cost cameras, and radars), but that ultimately the success of the design hinged on the use of the $75k Velodyne lidar and an equally (or more) expensive Applanix GPS plus Inertial Measurement Unit (IMU) combo. Regardless, it was an impressive piece of engineering, and they have released much of their code and driving datasets to the public. Be sure to check out the rest of the post below to see some cool point-cloud visualizations made possible by those phenomenal Velodyne lidars!
The 2009 Robotics: Science and Systems conference has announced its call for papers (and workshops). RSS is one of the premier robotics conferences, and it is being held from June 28th through July 1st at the University of Washington in Seattle. The paper submission deadline is January 15th, and the deadline for workshop topic submissions is December 15th.
Troody is a 16 DOF autonomously powered and controlled biped robot built to resemble a Troodon, a small carnivorous dinosaur that lived in the Cretaceous. Troody remains one of my favorite robots of all time; when I was younger, its bio-inspired design (based on actual fossil aspect ratios) and its lifelike movements were inspirational. Unfortunately, Troody may have been a bit ahead of its time -- there was little hope of commercializing such a complex robot for aspiring youngsters like myself to play with. Meanwhile, Troody's homepage has gone extinct, Troody is now in a traveling Star Wars exhibit hangin' out with Darth and Yoda, and Peter Dilworth has moved on to WowWee (the creators of another prehistoric dinosaur robot, the Roboraptor). We will miss you, Troody...
One of the newest offerings from Segway is the RMP 50 Omni, a trimmed-down version of the RMP 400 Omni. This platform has mecanum wheels, which give the base the ability to drive forward, backward, left, or right and to turn, all independently. It is a capable mobile base in a sleek and low-profile package, but this product doesn't come with all the features one would expect from a $21,000 platform.
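That independent translate-and-turn ability comes from the mecanum wheels' angled rollers. Here's the generic textbook inverse-kinematics sketch (the wheel radius and base dimensions are illustrative placeholders, not the RMP 50 Omni's actual geometry or controller):

```python
def mecanum_wheel_speeds(vx, vy, wz, r=0.1, lx=0.2, ly=0.2):
    """Inverse kinematics for a 4-wheel mecanum base (generic textbook form).
    vx: forward velocity (m/s), vy: leftward velocity (m/s), wz: yaw rate (rad/s).
    r: wheel radius, lx/ly: half the wheelbase/track width (all illustrative).
    Returns wheel angular speeds (rad/s): front-left, front-right,
    rear-left, rear-right."""
    k = lx + ly
    fl = (vx - vy - k * wz) / r
    fr = (vx + vy + k * wz) / r
    rl = (vx + vy - k * wz) / r
    rr = (vx - vy + k * wz) / r
    return fl, fr, rl, rr

print(mecanum_wheel_speeds(1.0, 0.0, 0.0))  # drive forward: all wheels equal
print(mecanum_wheel_speeds(0.0, 1.0, 0.0))  # strafe left: diagonal pairs oppose
```

Because vx, vy, and wz each map to an independent combination of wheel speeds, the base can translate in any direction while simultaneously rotating.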
Researchers at Georgia Tech (labmates of this author) have developed a robot that can robustly open closed doors. The target application for the robot, named El-E ("Ellie"), is assistive tasks related to healthcare in the homes of the disabled. This application demonstrates a set of behaviors that enable a mobile manipulator to reliably open a variety of doors and traverse doorways using force-sensing fingers and a laser range finder.
During the Spring 2007 semester, several friends (and labmates) took a course at Georgia Tech on mobile manipulation. This was no ordinary class... the final exam's assignment was to use a Segway base with a KUKA arm to fetch a cup of coffee! There are a ton of reasons this is interesting, spanning mobility, navigation, perception, and manipulation. However, the most impressive thing is that each group used different software to complete the task: one team used MS Robotics Studio, another used Player/Stage on Linux, and another used a functional language called OCaml on Mac.
Festo is known as a top-notch automation hardware manufacturer, but apparently their research division is capable of making very artistic, bio-inspired robots as well. This post specifically examines their robotic dirigible and submersible manta rays, both of which harbor a life-like gracefulness. I encourage you to check out the videos below; the technical specifications are provided for good measure.
Back in November of 2007, I saw a presentation by Professor Siciliano from the University of Naples, in which he briefly mentioned (and showed a video of) a very cool humanoid robot named Justin. I've seen a lot more of the DLR-III lightweight arms now that DLR and Kuka are working together to push them out into industry; though I must admit that I like Justin's blue arms compared to the characteristic Kuka orange. Perhaps the most impressive aspect of these arms is that each has a payload-to-weight ratio greater than unity -- each arm can lift more than its own weight. This, combined with some very capable DLR-II Hands, makes Justin an impressive bi-manual research platform.
Kuka unveiled two prototype products at IROS 2008 in September, both ultimately targeting educational use. The first product was a very sleek holonomic (omnidirectional) base employing mecanum wheels. The second product was a cute little 5 DOF (plus 1 DOF gripper) arm. While the Kuka representatives mentioned possible price-points of $3,500 for the base and $4,000 for the arm, there was no mention of a timetable. See below for additional discussion and videos! Updated July 26th 2010: The initial price-points were (admittedly) entirely too optimistic for such a high-quality arm and base, especially since each uses real-time EtherCAT controllers. Kuka recently began offering an official product (named YouBot) that is available for around $24,000 -- still a great value when compared to other mobile manipulation platforms.
There has been a lot of discussion recently by Intel's CTO (Justin Rattner) about some really compelling future technologies: wireless power and programmable matter (made of catoms). Of course, the programmable matter (catoms) he is discussing is basically robots operating as a swarm. Wouldn't it be neat to see the swarms actually powered wirelessly? While Intel has thus far worked on the two technologies disjointly, work I presented at ICRA 2008 addresses the intersection -- wirelessly powering a swarm of robots (publication here).