At ICRA 2009, the Rollin' Justin humanoid robot (the lovable robot that "Danced Like in Pulp Fiction") demonstrated some impressive teleoperation capabilities. The man-machine interface (MMI) consists of two components. The first comprises two DLR-III lightweight arms (the same type employed by the robot itself), each terminated with a force-torque sensing load cell, used to command the omnidirectional base or the arms / hands. The second is a fully-immersive head-mounted display with Vicon (optical) head tracking: it constantly streams images from the robot's cameras to the display while panning and tilting the robot's head in concert with the user's head movements. All in all, this is a very impressively engineered system. Be sure to check out the pictures and video below.
Like almost all roboticists, I'm a huge fan of robot movies. My favorites include I, Robot, Blade Runner, Iron Man, Short Circuit, A.I., Wall-E, Hinokio, and so on. Well, there is a new sci-fi movie called "District 9" coming out this weekend that (based on previews) sports some impressive robotic systems -- particularly exoskeletons. The writer / director of this new movie is Neill Blomkamp, who has also produced numerous short films featuring robots (a few of which are shown below). In a pseudo-tradition, we're having a lab outing to a matinee showing of "District 9" this weekend. I'll be sure to let you know how it goes in the comments, but in the meantime check out the pictures and trailers below.
I saw a press release by Robosoft (a French company that creates "advanced robotics solutions") with attractive CAD drawings of a robotic walker meant to assist the elderly. I thought this was a good opportunity to examine some of the other robotic solutions in this space, from the more complex Care-O-Bot II from Fraunhofer to the simplest passively-braking walkers that prevent stumbling and excessive acceleration. Read further for more information, and if you know of any examples of robotic walkers to assist the elderly, please chime in!
By now, most roboticists are familiar with the myriad gecko-type robots that employ Van der Waals forces (created by microscopic synthetic setae) to cling to walls. Less well-known is the work on an electrically-controllable alternative, called "electroadhesion," developed by researchers at SRI International (formerly called Stanford Research Institute). Impressively, the electroadhesive can support 0.2 to 1.4 N per square centimeter while requiring a mere 20 microwatts per newton. This means that a square meter of electroadhesive could hold at least 200 kg (440 lbs) while consuming only 40 milliwatts, and it can be turned on and off at the flick of a switch! Read on for pictures, videos, and discussion.
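Those figures are easy to sanity-check. A quick back-of-the-envelope calculation (using the conservative 0.2 N/cm² end of SRI's quoted range) confirms the square-meter numbers:

```python
# Sanity check of the electroadhesion figures quoted above.
FORCE_PER_CM2 = 0.2       # N/cm^2 -- conservative end of the 0.2-1.4 N/cm^2 range
POWER_PER_NEWTON = 20e-6  # W/N (20 microwatts per newton)
CM2_PER_M2 = 100 * 100    # 10,000 cm^2 in a square meter
G = 9.81                  # m/s^2

force_per_m2 = FORCE_PER_CM2 * CM2_PER_M2       # clamping force over 1 m^2
mass_supported = force_per_m2 / G               # equivalent supported mass
power = force_per_m2 * POWER_PER_NEWTON         # power draw at that force

print(f"{force_per_m2:.0f} N -> ~{mass_supported:.0f} kg at {power * 1000:.0f} mW")
# -> 2000 N -> ~204 kg at 40 mW
```

So the "at least 200 kg for 40 milliwatts" claim checks out at the low end of the range; at 1.4 N/cm² the numbers would be seven times larger.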
A few blogs are passing around videos of the Ishikawa Komuro Lab's high-speed robot hand performing impressive acts of dexterity and skillful manipulation. However, the video being passed around is light on details. Meanwhile, their video presentation at ICRA 2009 (which took place in May in Kobe, Japan) has an informative narration and demonstrates additional capabilities. I have included this video below; it shows the manipulator dribbling a ping-pong ball, spinning a pen, throwing a ball, tying knots, grasping a grain of rice with tweezers, and tossing / re-grasping a cellphone!
The iconic Pixar animated lamp, Luxo Jr., unofficially debuted in animatronic form at Disney's Hollywood Studios in late June (videos below). Both the animated and animatronic Luxo Jrs. have remarkable anthropomorphic emotive capabilities in spite of their simple, non-human form. This reminds me of conversations in Dr. Andrea Thomaz's human-robot interaction course about applying animation techniques to design more effective social robots -- clearly Disney's Imagineers have perfected this art.
I came upon this new commercial (video below) entitled "The Runner -- Exploit Yourself" created by Big Lazy Robot (a design / visual effects studio) for Nike. The humanoid robot performs impressive feats of urban acrobatics, strongly resembling a more agile version of the movie-star robot, Hinokio. It is always interesting to compare robot fact with fiction. Hopefully the future lives up to (nay, exceeds) our expectations.
There was a very interesting plenary talk at ICRA 2009 about "Computational Cameras" given by Prof. Shree Nayar of Columbia University. A video of the plenary is included below, as well as a discussion of some of its contents -- from assorted pixel techniques for high dynamic range to flexible depth of field photography -- all very cool stuff! These developments are particularly relevant to robotics, as cameras are probably the most ubiquitous sensors in the field. This video was made available via the ICRA 2009 podcasts. While there is a large push for open-access journals and conferences, freely-available recordings of conference talks are even rarer. As I find these more entertaining than television, I really hope this becomes a common trend (perhaps the RSS committee members are watching...?).
While most (semi)autonomous mobile manipulators employ expensive articulated arms with grippers (6 or more DOF), the Healthcare Robotics Lab at Georgia Tech, the same folks who made EL-E, are also examining low-complexity end effectors modeled on dustpans and kitchen turners for non-prehensile grasping of isolated objects from the floor. When mounted on an iRobot Create (Roomba), the system's performance was impressive: it successfully grasped the 34 test objects in roughly 95% of trials across numerous orientations / configurations and four different surfaces -- an impressive feat of robustness given that the end effector is a single under-actuated "sweeper" (1 DOF) working in tandem with a planar wedge, the whole system operates via open-loop control, and the objects were quite varied (from small individual pills to large containers, and from deformable textiles to rigid bottles). This system is slated to appear at ICRA 2009 in Kobe, Japan in the next few days and is documented in a paper entitled "1000 Trials: An Empirically Validated End Effector that Robustly Grasps Objects from the Floor" (of which I am a coauthor). Read further for videos and additional discussion.
I would like to mark this momentous occasion by sharing it with you -- that, and it is just plain cool (and artistic)! The folks at RadiologyArt.com have been building Computed Tomography (CT) scans of various objects (dolls, electronics, vacuum tubes, McDonalds hamburgers, etc). The addition of a remote-control dog and a wind-up drumming bunny provides (to my knowledge) the first examples of CT scans of robots, albeit rudimentary robots. Read on for pictures and amazingly detailed videos.
There was a paper just released in Science (Materials) about "Giant-Stroke, Superelastic Carbon Nanotube Aerogel Muscles." This is a rare case where I believe the research material far exceeds the buzzword hype! The new material responds to applied voltages by expanding 220% in a few milliseconds, operating at temperatures as low as that of liquid nitrogen and as high as the melting point of iron. It has the strength and stiffness of steel (by weight) in one direction and yet is as compliant as rubber in the other two. It has extremely low density due to its airy (aerogel) structure, is electrically conductive, and is transparent. This materials innovation has the potential to rejuvenate research on artificial muscles, which has generally focused on shape memory alloys (e.g. nickel-titanium, or Nitinol), piezoelectrics (such as PZT), and electroactive polymers (EAPs). Read on for a discussion about these alternative technologies, their drawbacks, and why this new material may be a game-changer!
Describing science as "beautiful" makes perfect sense to me; I believe the physics experiments described in The Prism and the Pendulum are on par with the greatest paintings and sculptures ever conceived! However, I'm having difficulties classifying the $30,000 robot, Keepon: Is it a research robot, an art-robot, or both? On one hand, there is evidence supporting its role in important robotics research. On the other hand, there are the numerous (many more?) whimsical videos of Keepon dancing to music or traveling the world, such as the "Keepon Goes Seoul-Searching" video to be shown on Friday at the Human-Robot Interaction (HRI) 2009 conference (we show this video below). Having seen Keepon in person, I can attest to its "cuteness" factor and quality design... but my questions are: "Where is the line between art and research drawn?" "Does such a line, necessarily, exist?" and "How can HRI researchers and peer-reviewers objectively evaluate important robotics research that also possesses strong artistic components?" I'd love to hear your thoughts.
While Hizook covered the Rollin' Justin robot over three months ago, the rest of the world (including Engadget) had to wait until CeBIT, where Rollin' Justin "debuted" today. Lots of great pictures and videos were taken, including a video where Rollin' Justin is led around by the hand (I assume using the force/torque sensing capabilities of the DLR-III lightweight arm or the DLR-II hand). However, the "serious" coverage at CeBIT left out one of Justin's most hilarious commands: "dance like in pulp fiction." We show this video (to be shown at the upcoming ICRA 2009 conference) below.
I've been meaning to mention this for some time now... SICK has released a "new" laser rangefinder, the LMS 100. This laser rangefinder seems to be a departure from the classic "coffee-pot" look of yore (i.e. SICK LMS 291). In fact, its form factor and specifications are quite similar to those of the Hokuyo UTM-30LX; it seems like the LMS 100 might be SICK's strategic response to the burgeoning popularity of the "budget" LIDAR manufacturer (Hokuyo) among indoor roboticists. Priced at $5000 USD ($1000 less with academic discounts), I'm curious how it actually compares in performance (in the field) to the $5600 Hokuyo UTM -- can anyone weigh in? Read on for a comparison of specifications.
Hizook reader Yue Khing pointed us to another disassembled laser rangefinder (LIDAR); this time, it is an Omron STI OptoShield OS3100. This LIDAR seems similar in form (apparently referred to as "coffee pots" in industry) and specification to the SICK LMS laser rangefinders. Honestly, I wasn't even aware that Omron made laser rangefinders, so I'm not sure what these units cost, or how common they are; however, it is still interesting to compare their internal design to the SICK LMS series and Velodyne laser rangefinders we've already seen disassembled. Read on for pictures and videos.
As of January 2009, the iBOT powered-wheelchair will be discontinued. This is unfortunate for the disabled community -- Dean Kamen and the others at DEKA (the same people responsible for the Segway and Luke Arm) developed an amazing robotic wheelchair that was (somewhat) unique in its ability to transition from a statically-stable, 4-wheel configuration to a dynamically-stable, 2-wheel configuration to give occupants added height. Further, by pivoting pairs of wheels, the wheelchair and occupant were able to dynamically balance while traversing stairs, not to mention the wheelchair's basic ability to traverse (relatively) poor terrain, such as sand and gravel! All of this was possible thanks to carefully designed controllers and internal gyros (not entirely dissimilar to a Segway). Read further for discussion -- specifically about why this loss for the disabled community could be an opportunity in disguise for the robotics community and a big win for Kamen and company.
I think this is both brilliant and hilarious... University of Delaware researchers, James Galloway and Sunil Agrawal, were awarded a two-year, $325k NSF grant to explore robot-enabled mobility for special needs children, with the goal of spurring cognitive development -- this is brilliant. However, why focus solely on special needs children? I think it is hilarious to imagine "regular" children using "smart wheelchairs" to putter around before they learn to crawl / walk -- it would certainly make for some entertaining rounds of baby-bumper-cars! Adding to the hilarity, their initial prototypes are Pioneer robots pulling a plywood trailer, supported by casters, with a small chair atop (images below)! But who am I to judge... we can all relate to "ugly prototype syndrome."
Robots have found utility in warfare dating back to World War II (and arguably earlier), beginning with the invention of simple electrical servo-mechanisms for fire control and targeting. While fire control has become extremely advanced, its "human in the loop" nature kept us (relatively) oblivious to the ethical implications of robots in warfare. However, increased autonomy and point-and-click capabilities are forcing us to reevaluate those implications. Enter a new book by P.W. Singer, entitled Wired for War: The Robotics Revolution and Conflict in the 21st Century. Singer was recently interviewed by NPR (and on The Daily Show by Jon Stewart), where he talked about a number of interesting issues. Links and discussion follow.
Friend and colleague Richard Roberts has offered to give us a glimpse of his recent work on learning autonomous robot behaviors/controllers from human operators:
While working on the DARPA LAGR project, we found it exceedingly difficult to tune our reactive behaviors to work well in cluttered and patchy environments. Either obstacle avoidance was too sensitive and the robot would not drive through gaps, or it was too aggressive and the robot would collide with obstacles. Of course, we could have made the behaviors more and more complicated, introducing more parameters to tune, but we wanted an easy way to have the robot “just do what I say!” Thus, we developed a system for interactive, on-line training of behaviors with a remote control. The user flips a switch to training mode, and drives the robot how they would like it to drive, then flips the switch back to autonomous to test the behavior.
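To make the idea concrete, here is a minimal behavior-cloning sketch of that training-mode / autonomous-mode loop. This is my own illustration, not the actual LAGR code: in training mode, sensor features and operator commands are recorded as pairs; in autonomous mode, the robot simply drives the way the operator drove in the most similar recorded situation.

```python
import math

class TrainedBehavior:
    """Learn a mapping from sensor features to drive commands by demonstration."""

    def __init__(self):
        self.examples = []  # list of (feature_vector, command) pairs

    def record(self, features, command):
        """Training mode: store what the operator did in this situation."""
        self.examples.append((features, command))

    def predict(self, features):
        """Autonomous mode: return the command the operator gave in the
        most similar recorded situation (1-nearest-neighbor lookup)."""
        _, command = min(self.examples,
                         key=lambda ex: math.dist(ex[0], features))
        return command

behavior = TrainedBehavior()
# Operator flips the switch to training mode and demonstrates:
behavior.record((0.1, 0.9), (0.5, -0.2))  # obstacle on the left  -> veer right
behavior.record((0.9, 0.1), (0.5, 0.2))   # obstacle on the right -> veer left
# Flip back to autonomous mode:
print(behavior.predict((0.2, 0.8)))       # closest to the first demonstration
```

A real system would replace the nearest-neighbor lookup with a learned regressor and richer features, but the switch-record-replay structure is the same.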
Back on December 15th, we got a look at the internals of a SICK Laser Rangefinder (LIDAR), a $6k device that employs a single laser diode to produce ~6000 points per second (~600 points per scan at ~10Hz) over a 180° field-of-view. Now, we can compare that to the Rolls Royce of Laser Rangefinders -- the Velodyne Lidar, a $75k device employing 64 laser diodes to produce 1.3 million data points per second with a 360° horizontal field-of-view and a 26.8° vertical field-of-view. Below is a video of Bruce Hall, President of Velodyne LIDAR, demonstrating the HDL-64E in operation and taking a look at its internals. It may not be a complete disassembly (it does cost $75,000 afterall!), but it does provide some interesting insights into the Velodyne's internals.
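For scale, the data-rate gap between the two devices works out as follows (simple arithmetic on the figures quoted above):

```python
# Data-rate comparison using the figures quoted above.
sick_points_per_sec = 600 * 10        # ~600 points/scan at ~10 Hz
velodyne_points_per_sec = 1_300_000   # 64 lasers, HDL-64E spec
ratio = velodyne_points_per_sec / sick_points_per_sec
print(f"The Velodyne produces ~{ratio:.0f}x the points of the SICK unit")
```

So for roughly 12.5 times the price, you get more than 200 times the data (plus the full 360° horizontal field-of-view).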
The Situational Awareness Mast (SAM, also known as a Zipper Mast) from Geosystems Inc. is a telescoping linear actuator with a unique property: its stroke length is more than an order of magnitude greater than its nominal height! For example, the SAM8 is a 10 lb device with a stroke length (8 ft) that is 24 times its nominal height (4 inches)! This can be used to vertically translate a robot's sensor suite for better visibility while still allowing for a low profile. Read on for information on the different Zipper Mast variants, the patent describing the system, and an exclusive video of a Zipper Mast on an iRobot Packbot!
Back in May 2008 it was announced that CMU professors Sara Kiesler and Jodi Forlizzi (from the HCI Institute) and Paul Rybski (from the Robotics Institute) were awarded $500k in Microsoft's Human-Robot Interaction funding to develop a social, snack-selling robot to traverse Newell-Simon and Wean halls (press release). After seeing a prototype appear on Flickr in July, we've all been waiting patiently to see pictures of the final version. Well, the wait is over -- photos of the new CMU snackbot, conceptual designs, and construction photos are contained below! It appears that the CMU team is progressing nicely.
This is great! Honda is celebrating its 50th year in the US by creating a 49-foot tall Asimo float that will lead off the Rose Parade on January 1st, 2009. To quote the Honda press release: "Honda's Rose Parade float, a 49-foot replica of Honda's ASIMO humanoid robot, and the parade's first-ever hydrogen-powered fuel cell pace car, the Honda FCX Clarity, will lead the 120th Rose Parade as well as kick off Honda's 50th anniversary of U.S. operations." I'm always a fan of robots being displayed (and appreciated) by the general public; thanks to DVICE for pointing this out.
I've always wanted to pull apart a SICK laser rangefinder (LIDAR). However, the $6k price-tag (and advisor repercussions) have always been a sufficient deterrent. Well, Kyle Vogt of MIT has disassembled what looks to be a SICK LMS-210 -- perhaps his was already broken? Anyway, the internal design is surprisingly simple. It's interesting to look at the internals of such an iconic piece of robotics hardware. Read on for more images.