I would like to mark this momentous occasion by sharing it with you -- that, and it is just plain cool (and artistic)! The folks at RadiologyArt.com have been building Computed Tomography (CT) scans of various objects (dolls, electronics, vacuum tubes, McDonalds hamburgers, etc). The addition of a remote-controlled dog and a wind-up drumming bunny represents (to my knowledge) the first examples of CT scans of robots, albeit rudimentary ones. Read on for pictures and amazingly detailed videos.
There was a paper just released in Science (Materials) about "Giant-Stroke, Superelastic Carbon Nanotube Aerogel Muscles." This is a rare case where I believe the research material far exceeds the buzzword hype! The new material responds to applied voltages by expanding 220% in a few milliseconds, operating at temperatures as low as that of liquid nitrogen and as high as the melting point of iron. It has the strength and stiffness of steel (by weight) in one direction, yet is as compliant as rubber in the other two directions. It has extremely low density due to its airy (aerogel) structure, and it is both conductive and transparent. This materials innovation has the potential to rejuvenate research on artificial muscles, which has generally been focused on shape memory alloys (i.e. nickel-titanium or Nitinol), piezoelectrics (such as PZT), or electroactive polymers (EAPs). Read on for a discussion about these alternative technologies, their drawbacks, and why this new material may be a game-changer!
Describing science as "beautiful" makes perfect sense to me; I believe the physics experiments described in The Prism and the Pendulum are on par with the greatest paintings and sculptures ever conceived! However, I'm having difficulty classifying the $30,000 robot, Keepon: Is it a research robot, an art-robot, or both? On one hand, there is evidence supporting its role in important robotics research. On the other hand, there are the numerous (many more?) whimsical videos of Keepon dancing to music or traveling the world, such as the "Keepon Goes Seoul-Searching" video to be shown on Friday at the Human-Robot Interaction (HRI) 2009 conference (we show this video below). Having seen Keepon in person, I can attest to its "cuteness" factor and quality design... but my questions are: "Where is the line between art and research drawn?" "Does such a line necessarily exist?" and "How can HRI researchers and peer-reviewers objectively evaluate important robotics research that also possesses strong artistic components?" I'd love to hear your thoughts.
While Hizook covered the Rollin' Justin robot over three months ago, the rest of the world (including Engadget) had to wait until CeBIT, where Rollin' Justin "debuted" today. Lots of great pictures and videos were taken, including a video where Rollin' Justin is led around by the hand (I assume using the force/torque sensing capabilities of the DLR-III lightweight arm or the DLR-II hand). However, the "serious" coverage at CeBIT left out one of Justin's most hilarious commands: "dance like in pulp fiction." We show this video (to be shown at the upcoming ICRA 2009 conference) below.
I've been meaning to mention this for some time now... SICK has released a "new" laser rangefinder, the LMS 100. This laser rangefinder seems to be a departure from the classic "coffee-pot" look of yore (i.e. SICK LMS 291). In fact, its form factor and specifications are quite similar to the Hokuyo UTM-30LX's; it seems like the LMS 100 might be SICK's strategic response to budget LIDAR manufacturer Hokuyo's burgeoning popularity among indoor roboticists. Priced at $5000 USD ($1000 less with academic discounts), I'm curious how it actually compares in performance (in the field) to the $5600 Hokuyo UTM -- can anyone weigh in? Read on for a comparison of specifications.
Hizook reader Yue Khing pointed us to another disassembled laser rangefinder (LIDAR); this time, it is an Omron STI OptoShield OS3100. This LIDAR seems similar in form (apparently referred to as "coffee pots" in industry) and specification to the SICK LMS laser rangefinders. Honestly, I wasn't even aware that Omron made laser rangefinders, so I'm not sure what these units cost, or how common they are; however, it is still interesting to compare their internal design to the SICK LMS series and Velodyne laser rangefinders we've already seen disassembled. Read on for pictures and videos.
As of January 2009, the iBOT powered-wheelchair will be discontinued. This is unfortunate for the disabled community -- Dean Kamen and the others at DEKA (the same people responsible for the Segway and Luke Arm) developed an amazing robotic wheelchair that was (somewhat) unique in its ability to transition from a statically-stable, 4-wheel configuration to a dynamically-stable, 2-wheel configuration to give occupants added height. Further, by pivoting pairs of wheels, the wheelchair and occupant were able to dynamically balance while traversing stairs, not to mention the wheelchair's basic ability to traverse (relatively) poor terrain, such as sand and gravel! All of this was possible due to careful controllers and internal gyros (not entirely dissimilar to a Segway). Read further for discussion -- specifically about why this loss for the disabled community could be an opportunity in disguise for the robotics community and a big win for Kamen and company.
I think this is both brilliant and hilarious... University of Delaware researchers, James Galloway and Sunil Agrawal, were awarded a two-year, $325k NSF grant to explore robot-enabled mobility for special needs children, with the goal of spurring cognitive development -- this is brilliant. However, why focus solely on special needs children? I think it is hilarious to imagine "regular" children using "smart wheelchairs" to putter around before they learn to crawl / walk -- it would certainly make for some entertaining rounds of baby-bumper-cars! Adding to the hilarity, their initial prototypes are Pioneer robots pulling a plywood trailer, supported by casters, with a small chair atop (images below)! But who am I to judge... we can all relate to "ugly prototype syndrome."
Robots have found utility in warfare dating back to World War II (and arguably earlier), with the invention of simple electrical servo-mechanisms for fire control and targeting. While fire control has become extremely advanced, its "human in the loop" nature kept us (relatively) oblivious to the ethical implications of robots in warfare. However, increased autonomy and point-and-click capabilities are forcing us to reevaluate those implications. Enter a new book by P.W. Singer, entitled Wired for War: The Robotics Revolution and Conflict in the 21st Century. Singer was recently interviewed by NPR (and on The Daily Show by Jon Stewart), where he talked about a number of interesting issues. Links and discussion follow.
Friend and colleague, Richard Roberts, has offered to give us a glimpse of his recent work on learning autonomous robot behaviors/controllers from human operators:
While working on the DARPA LAGR project, we found it exceedingly difficult to tune our reactive behaviors to work well in cluttered and patchy environments. Either obstacle avoidance was too sensitive and the robot would not drive through gaps, or it was too aggressive and the robot would collide with obstacles. Of course, we could have made the behaviors more and more complicated, introducing more parameters to tune, but we wanted an easy way to have the robot “just do what I say!” Thus, we developed a system for interactive, on-line training of behaviors with a remote control. The user flips a switch to training mode, and drives the robot how they would like it to drive, then flips the switch back to autonomous to test the behavior.
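The LAGR system itself isn't public, but the train-then-test loop Richard describes can be sketched with a toy behavior-cloning policy (all class and variable names here are hypothetical; a nearest-neighbor lookup stands in for whatever learner the team actually used):

```python
import math

class DemonstrationPolicy:
    """Toy behavior-cloning policy learned from human demonstrations.

    In training mode, record() logs (observation, command) pairs while the
    operator drives; in autonomous mode, act() replays the command whose
    logged observation is nearest to the current one (a simple 1-NN policy).
    """

    def __init__(self):
        self.observations = []  # e.g. binned range readings [left, center, right]
        self.commands = []      # (linear_vel, angular_vel) tuples

    def record(self, observation, command):
        self.observations.append(list(observation))
        self.commands.append(command)

    def act(self, observation):
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        best = min(range(len(self.observations)),
                   key=lambda i: dist(observation, self.observations[i]))
        return self.commands[best]

# Training mode: the operator demonstrates driving through a gap.
policy = DemonstrationPolicy()
policy.record([2.0, 2.0, 2.0], (0.5, 0.0))   # clear ahead -> drive straight
policy.record([0.3, 2.0, 2.0], (0.3, -0.4))  # obstacle to the left -> veer right

# Autonomous mode: the robot queries the learned behavior.
print(policy.act([1.9, 1.8, 2.0]))  # -> (0.5, 0.0)
```

The appeal of this style of system is exactly the "just do what I say!" property: instead of hand-tuning obstacle-avoidance gains, the operator supplies a few demonstrations and flips the switch.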
Back on December 15th, we got a look at the internals of a SICK Laser Rangefinder (LIDAR), a $6k device that employs a single laser diode to produce ~6000 points per second (~600 points per scan at ~10Hz) over a 180° field-of-view. Now, we can compare that to the Rolls-Royce of Laser Rangefinders -- the Velodyne Lidar, a $75k device employing 64 laser diodes to produce 1.3 million data points per second with a 360° horizontal field-of-view and a 26.8° vertical field-of-view. Below is a video of Bruce Hall, President of Velodyne LIDAR, demonstrating the HDL-64E in operation and taking a look at its internals. It may not be a complete disassembly (it does cost $75,000 after all!), but it does provide some interesting insights into the Velodyne's internals.
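For a sense of scale, the quoted throughput figures work out as follows (simple arithmetic on the numbers above; the per-laser split is my own back-of-the-envelope division, not a Velodyne spec):

```python
# SICK LMS-2xx: one laser diode, ~600 points per 180° scan at ~10 Hz.
sick_points_per_sec = 600 * 10            # ~6000 points/sec
sick_angular_res_deg = 180 / 600          # ~0.3° between returns

# Velodyne HDL-64E: 64 laser diodes, ~1.3 million points/sec total.
velodyne_points_per_sec = 1_300_000
velodyne_points_per_laser = velodyne_points_per_sec / 64  # ~20k points/sec each

print(sick_points_per_sec)                                   # 6000
print(round(velodyne_points_per_sec / sick_points_per_sec))  # ~217x the data rate
```

So you pay roughly 12x the price for roughly 200x the data (plus the full 360° sweep and vertical field-of-view) -- which goes a long way toward explaining the Velodyne's popularity among the DARPA Urban Challenge teams.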
The Situational Awareness Mast (SAM, also known as a Zipper Mast) from Geosystems Inc. is a telescoping linear actuator that has a unique property -- its stroke length is an order of magnitude greater than its nominal height! For example, the SAM8 is a 10 lb device with a stroke length (8ft) that is 24 times its nominal height (4 inches)! This can be used to vertically translate a robot's sensor suite for better visibility while still allowing for a low profile. Read on for information on the different Zipper Mast variants, the patent describing the system, and an exclusive video of a Zipper Mast on an iRobot Packbot!
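The quoted 24x figure is easy to verify from the SAM8 numbers above (a one-line unit-conversion sanity check):

```python
stroke_in = 8 * 12   # SAM8 stroke: 8 feet, converted to inches
nominal_in = 4       # SAM8 collapsed (nominal) height: 4 inches
print(stroke_in / nominal_in)  # -> 24.0, matching the quoted 24x ratio
```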
Back in May 2008 it was announced that CMU professors Sara Kiesler and Jodi Forlizzi (from the HCI Institute) and Paul Rybski (from the Robotics Institute) were awarded $500k in Microsoft's Human-Robot Interaction funding to develop a social, snack-selling robot to traverse Newell-Simon and Wean halls (press release). After seeing a prototype appear on Flickr in July, we've all been waiting patiently to see pictures of the final version. Well, the wait is over -- photos of the new CMU snackbot, conceptual designs, and construction photos are contained below! It appears that the CMU team is progressing nicely.
This is great! Honda is celebrating its 50th year in the US by creating a 49-foot tall Asimo float that will lead off the Rose Parade on January 1st, 2009. To quote the Honda press release: "Honda's Rose Parade float, a 49-foot replica of Honda's ASIMO humanoid robot, and the parade's first-ever hydrogen-powered fuel cell pace car, the Honda FCX Clarity, will lead the 120th Rose Parade as well as kick off Honda's 50th anniversary of U.S. operations." I'm always a fan of robots being displayed (and appreciated) by the general public; thanks to DVICE for pointing this out.
I've always wanted to pull apart a SICK laser rangefinder (LIDAR). However, the $6k price-tag (and advisor repercussions) have always been a sufficient deterrent. Well, Kyle Vogt of MIT has disassembled what looks to be a SICK LMS-210 -- perhaps his was already broken? Anyway, the internal design is surprisingly simple. It's interesting to look at the internals of such an iconic piece of robotics hardware. Read on for more images.
The folks at Dr. Matsumaru's Bio-Robotics & Human-Mechatronics Laboratory have worked on some very interesting human-robot interaction projects. I'm particularly interested in their video-projector interfaces. In one scenario, the video projector shows the robot's intended motion trajectory. In another scenario, dubbed the "Step-On Interface" or SOI, users step on projected "buttons" to control the robot. According to videos (below), Dr. Matsumaru is targeting home-based service robots. Read on for videos and more information about the video projector robot interfaces, as well as some others (using visible lasers, LCDs, and Persistence of Vision or POV displays).
You may recall Justin, the humanoid robot sporting two DLR-III lightweight arms and two DLR-II hands. Well, Justin has recently acquired a 4-wheel mobile base dubbed "Rollin' Justin". The base utilizes a "powered-caster" design similar to the Willow Garage PR2, except that the torso-caster linkage contains a spring-loaded lift mechanism that gives the base a variable footprint. I'm sure this will prove useful when trying to squeeze through doors, adapting to uneven terrain, or providing a larger support polygon. While we currently do not have any video of the system in action, there are a number of great pictures and design documents below.
Several colleagues and I are anxiously following the creation of Willow Garage's PR2 mobile manipulation robot. By looking at the progress on WG's blog, it appears they're well on their way to functioning units by early next year; they already have some bases, spines, heads, and even an arm up and running -- read on to see more images from the PR2 "alpha" prototypes. One interesting aspect of Willow Garage is that their "Robot Operating System" (ROS), being developed by the Player-Stage founder Brian Gerkey, is entirely open source and runs on (among others) Ubuntu Linux! You may also recall that Keenan Wyrobek and Eric Berger (formerly at Stanford, now both at Willow Garage) had a hand in the PR1 robot, with impressive videos of the robot cleaning up rooms, fetching beer, and unloading a dishwasher (see videos below).
There has been a lot of press in the last six months revolving around El-E, the autonomous mobile manipulation platform for the motor impaired out of Georgia Tech's Healthcare Robotics Lab (to which I belong). There was a report in the NY Times on El-E's laser-pointer interface, and now a report in MIT Tech Review on El-E behaving like a service dog. Recently, the lab's director (and my advisor), Dr. Charlie Kemp, gave an impressive talk at Carnegie Mellon's Robotics Institute (CMU-RI) where he adeptly tied together these research initiatives and made a compelling case for more autonomous mobile manipulators for the motor impaired. Read on for the CMU-RI video and some choice images and themes from the talk.
I think it is great that the Rotundus GroundBot (a spherical robot) made the Popular Science "Best of What's New in 2008"; however, I'm a bit perplexed... New Scientist featured the spherical robot all the way back in early 2005; how is it "new" now in 2008? Either way, this serves as a convenient time to re-examine this novel robot -- one that brings back memories of the old solar-powered, spherical BEAM robots from Solarbotics (it was called a "miniball," and is now discontinued). Read on for some compelling pictures of the Rotundus GroundBot, the spherical robot.
OK, I know what everyone is thinking... "What is this craziness? Inter-Galactic Love?" Well, let's just attribute it to a poor Japanese-English translation -- the title should have been left at just "Hinokio," which is a play on words from the old, classic film title "Pinocchio." In my opinion, this is the second-best robot movie of all time in terms of robot realism and "cool" humanoid robots (second to I, Robot), though it does possess some of those cheesy Japanese memes. The movie is about a Japanese boy who is unable to walk and thus uses a telepresence humanoid robot to experience life; everything the robot sees, hears, and feels, the boy does too. The film has amazing graphics and cinematography, and the human-robot interaction techniques are very well thought-out. I'd recommend everyone grab a copy and watch it; it's definitely worth the time. Read further for more detailed information and some very cool images from the film.
Back on October 10th, John Leonard gave a Georgia Tech Robotics Institute talk about MIT's DARPA Urban Grand Challenge experience. The MIT entry, a Land Rover LR3 named Talos, came in fourth place overall (out of 6 finishers and 11 qualifiers). I thought the most interesting aspect of the design was that it was originally intended to be a "low cost" solution (meaning many $6k SICK lidars, low-cost cameras, and radars), but that ultimately the success of the design hinged on the use of the $75k Velodyne lidar and an equally (or more) expensive Applanix GPS plus Inertial Measurement Unit (IMU) combo. Regardless, it was an impressive piece of engineering, and they have released much of their code and driving datasets to the public. Be sure to check out the rest of the post below to see some cool point-cloud visualizations made possible by those phenomenal Velodyne lidars!
The 2009 Robotics: Science and Systems conference has announced its call for papers (and workshops). RSS is one of the premier robotics conferences, and it is being held from June 28th through July 1st at the University of Washington in Seattle. The paper submission deadline is January 15th, and the deadline for workshop topic submissions is December 15th.
Troody is a 16 DOF autonomously powered and controlled biped robot built to resemble a Troodon, a small carnivorous dinosaur that lived in the Cretaceous. Troody remains one of my favorite robots of all time; when I was younger, its bio-inspired design (based on actual fossil aspect ratios) and its lifelike movements were inspirational. Unfortunately, Troody may have been a bit ahead of its time -- there was little hope of commercializing such a complex robot for aspiring youngsters like myself to play with. Meanwhile, Troody's homepage has gone extinct, Troody is now in a traveling Star Wars exhibit hangin' out with Darth and Yoda, and Peter Dilworth has moved on to WowWee (the creators of another prehistoric dinosaur robot, the Roboraptor). We will miss you Troody...