I'm really intrigued by the recent announcement (Aug. 21st, 2009) that Jeff Bezos, founder of internet giant Amazon, has participated in a substantial $7 million round of funding via his personal investment firm (Bezos Expeditions) into Heartland Robotics, a stealthy startup co-founded by Rod Brooks -- who left iRobot (which he also co-founded) to launch this new venture. Brooks is certainly a leading figure in robotics, having effectively "invented" behavior-based robotics (not to mention being my academic grandfather -- the advisor of my advisor). But his Heartland Robotics remains an enigma, wrapped in a riddle, surrounded by mystery... Below I discuss some of my musings, but your additional speculation and insight would be greatly appreciated. Most significantly, how will this new relationship affect Kiva Systems -- Amazon recently acquired one of Kiva's largest clients (Zappos) for an impressive $928M!
It appears the I-SWARM robot project has produced some fully-integrated and apparently functional micro robots -- almost four years after the initial conceptual videos appeared online. What makes these robots so impressive is the level of integration: they pack a micro-step locomotion mechanism, a solar cell, custom IR communication modules, and an ASIC (custom silicon circuitry) into a very compact package. I'm quite impressed by the pictures and videos (embedded below). Since I-SWARM stands for "Intelligent Small-World Autonomous Robots for Micro-manipulation", I'm a bit perplexed by the lack of manipulation capabilities. They do have a small piezoelectric-driven cantilever arm in the front, but it currently doesn't seem as capable as the AFM tips employed by the MiCRoN project's micro robots. Perhaps, as the PhysOrg article notes, they just need additional funding -- appropriate for such quality engineering and top-notch research.
The good folks at the AI and Robotics Blog have posted an entertaining video of a Kuka Light-Weight Robot (LWR) arm climbing a ladder -- an impressive feat for such a large arm. Those who read Hizook will recognize these arms as descendants of the DLR-III lightweight arms (featured here frequently) -- likewise boasting a 1:1 mass-to-payload ratio -- that have been employed on the Justin research platform. According to the AI and Robotics Blog posting, the arms are now available for purchase, though they still do not appear to be listed as an "available product" on Kuka's website. However, Kuka is certainly moving forward with these arms -- they have appeared in numerous demonstrations at recent trade shows; for example, I saw them featured at IROS 2008, and they recently appeared atop Kuka's new mobile manipulator, the OmniRob robot. Check out the video below.
We've seen robots controlled with projector interfaces and laser-pointer interfaces, and now we can add tabletop interfaces to the list. My labmate, Hai Nguyen, pointed out the CRISTAL project from the Media Interaction Lab at the Upper Austria University of Applied Sciences. CRISTAL is an interesting "smart home" technology that uses a tabletop interface (similar to Microsoft's Surface) and a ceiling-mounted camera to display and control household electronics such as lights, TVs, digital picture frames, and robots! To command the robot, the user "draws" the desired robot path on the tabletop computer using their finger. The robot then follows the route via optical tracking through the ceiling-mounted camera. Interesting interaction, and it's always good to see robots become sufficiently ubiquitous that they're classified (and controlled) in the same manner as other home electronics. Check out a video of the interaction below.
Dr. Andrea Thomaz of Georgia Tech's Socially Intelligent Machines Lab was recently named to MIT Technology Review's prestigious 2009 list of "Young Innovators Under 35", an honor shared with last year's robotics recipient, Andrew Ng. Alongside this fantastic news, Andrea's lab unveiled an amazing new robot named Simon (see photos and videos below). Simon features an articulated torso, dual 7-DOF arms, and anthropomorphic hands from Meka Robotics, along with an expressive head designed at Georgia Tech. Simon is designed to study human-robot interaction from a social learning vantage, such as learning by demonstration and human-robot collaboration. I'm thrilled for Andrea, and I'm proud to have taken her graduate research course on human-robot interaction while at Georgia Tech.
While perusing Kuka's 2008 Annual Report, it became evident that the robotics giant is making a serious foray into mobile manipulation with its OmniRob concept robot (photos and videos below). This new robot sports an omnidirectional mobile platform based on mecanum wheels, a Kuka lightweight arm, and what appear to be dual SICK LMS100 laser rangefinders to provide 360° lidar coverage. Between Kuka's "toy" educational platform (covered by Hizook in October) and this more advanced offering, it is clear that Kuka is heavily invested in the future of mobile manipulation. With its classic expertise in robot arms, combined with competence in omnidirectional systems from its OmniMove industrial line, Kuka will surely be a significant force in the exciting field of mobile manipulation.
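For readers unfamiliar with how mecanum wheels yield omnidirectional motion: each wheel's rollers are mounted at 45°, so combining the four wheel speeds produces any mix of forward, sideways, and rotational motion. Below is a minimal sketch of the standard inverse kinematics for an X-configured mecanum platform; the geometry values (wheel radius, half-wheelbase, half-track) are placeholder assumptions, not OmniRob's actual dimensions, and sign conventions vary between platforms.

```python
def mecanum_wheel_speeds(vx, vy, wz, r=0.125, lx=0.28, ly=0.23):
    """Map a desired body twist (vx forward m/s, vy left m/s, wz yaw rad/s)
    to the four wheel angular velocities (rad/s) of an X-configured
    mecanum platform. r = wheel radius; lx, ly = half wheelbase / track."""
    k = lx + ly
    w_fl = (vx - vy - k * wz) / r  # front-left
    w_fr = (vx + vy + k * wz) / r  # front-right
    w_rl = (vx + vy - k * wz) / r  # rear-left
    w_rr = (vx - vy + k * wz) / r  # rear-right
    return w_fl, w_fr, w_rl, w_rr
```

Driving straight ahead spins all four wheels at the same speed, while pure sideways translation spins diagonal wheel pairs in opposite directions -- the hallmark of mecanum platforms.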
At ICRA 2009, the Rollin' Justin humanoid robot (the lovable robot that "Danced Like in Pulp Fiction") demonstrated some impressive teleoperation capabilities. The man-machine interface (MMI) consists of two components. The first comprises two DLR-III lightweight arms, the same type employed by the robot, terminated with force-torque sensing load cells and used to command the omnidirectional base or the arms / hands. The second is a fully-immersive heads-up display with Vicon (optical) head tracking: robot-mounted camera images stream constantly to the display while the robot's head pans and tilts in concert with the user's head movements. All in all, this is a very impressively engineered system. Be sure to check out the pictures and video below.
Like almost all roboticists, I'm a huge fan of robot movies. My favorites include: I, Robot; Blade Runner; Iron Man; Short Circuit; A.I.; Wall-E; Hinokio; and so on. Well, there is a new sci-fi movie called "District 9" coming out this weekend that (based on previews) sports some impressive robotic systems -- particularly exoskeletons. The writer / director of this new movie is Neill Blomkamp, who has also produced numerous short films featuring robots (a few of which are shown below). In a pseudo-tradition, we're having a lab outing to a matinee showing of "District 9" this weekend. I'll be sure to let you know how it goes in the comments, but in the meantime check out the pictures and trailers below.
I saw a press release by Robosoft (a French company that creates "advanced robotics solutions") with attractive CAD drawings of a robotic walker meant to assist the elderly. I thought this was a good opportunity to examine some of the other robotic solutions in this space, from the more complex Care-O-Bot II from Fraunhofer to the simplest passively-braking walkers that prevent stumbles and excessive acceleration. Read further for more information, and if you know of any other examples of robotic walkers that assist the elderly, please chime in!
By now, most roboticists are familiar with the myriad gecko-type robots that employ van der Waals forces (created by microscopic synthetic setae) to cling to walls. Less well-known is an electrically-controllable alternative developed by researchers at SRI International (formerly the Stanford Research Institute) called "electroadhesion". Impressively, the electroadhesive can support 0.2 to 1.4 N per square centimeter while requiring a mere 20 microwatts per newton. This means that a square meter of electroadhesive could hold at least 200 kg (440 lbs) while consuming only 40 milliwatts, and it can be switched on and off at the flick of a switch! Read on for pictures, videos, and discussion.
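The back-of-envelope arithmetic behind those numbers is worth making explicit. Taking SRI's conservative figure of 0.2 N/cm² and 20 µW/N, a square meter (10,000 cm²) gives 2000 N of clamping force, about 204 kg against gravity, at 2000 N × 20 µW/N = 40 mW. A tiny helper (the function name and defaults are mine, for illustration) reproduces the calculation:

```python
def electroadhesion_budget(area_cm2, pressure_n_per_cm2=0.2, power_uw_per_n=20.0):
    """Back-of-envelope electroadhesion budget using the SRI figures
    quoted above: returns holding force (N), supportable mass (kg),
    and electrical power draw (W) for a given pad area."""
    force_n = area_cm2 * pressure_n_per_cm2   # conservative 0.2 N/cm^2
    mass_kg = force_n / 9.81                  # force held against gravity
    power_w = force_n * power_uw_per_n * 1e-6 # 20 microwatts per newton
    return force_n, mass_kg, power_w
```

Running it for one square meter confirms the post's claim: roughly 2000 N (just over 200 kg) for 0.04 W.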
A few blogs are passing around videos of the Ishikawa Komuro Lab's high-speed robot hand performing impressive acts of dexterity and skillful manipulation. However, the video being passed around is light on details. Meanwhile, their video presentation at ICRA 2009 (which took place in May in Kobe, Japan) has an informative narration and demonstrates additional capabilities. I have included this video below, which shows the manipulator dribbling a ping-pong ball, spinning a pen, throwing a ball, tying knots, grasping a grain of rice with tweezers, and tossing / re-grasping a cellphone!
The iconic Pixar animated lamp, Luxo Jr., unofficially debuted in animatronic form at Disney's Hollywood Studios in late June (videos below). Both the animated and animatronic Luxo Jrs. have remarkable anthropomorphic emotive capabilities in spite of their simple, non-human form. This reminds me of conversations in Dr. Andrea Thomaz's human-robot interaction course about applying animation techniques to design more effective social robots -- clearly Disney's Imagineers have perfected this art.
I came upon this new commercial (video below) entitled "The Runner -- Exploit Yourself" created by Big Lazy Robot (a design / visual effects studio) for Nike. The humanoid robot performs impressive feats of urban acrobatics, strongly resembling a more agile version of the movie-star robot, Hinokio. It is always interesting to compare robot fact with fiction. Hopefully the future lives up to (nay, exceeds) our expectations.
There was a very interesting plenary talk at ICRA 2009 about "Computational Cameras" given by Prof. Shree Nayar of Columbia University. A video of the plenary is included below, as well as a discussion of some of its contents -- from assorted pixel techniques for high dynamic range to flexible depth-of-field photography -- all very cool stuff! These developments are particularly relevant to robotics, as cameras are probably the most ubiquitous sensors encountered. This video was made available in the ICRA 2009 podcasts. While there is a large push for open-access journals and conferences, freely-available recordings of conference talks remain even scarcer. As I find these more entertaining than television, I really hope this becomes a common trend (perhaps the RSS committee members are watching...?).
While most (semi)autonomous mobile manipulators employ expensive articulated arms with grippers (6 or more DOF), the Healthcare Robotics Lab at Georgia Tech -- the same folks who made EL-E -- is also examining low-complexity end effectors, modeled on dustpans and kitchen turners, for non-prehensile grasping of isolated objects from the floor. When mounted on an iRobot Create (Roomba), the system's performance was impressive: it successfully grasped ~95% of the 34 test objects across numerous orientations / configurations and four different surfaces. This is an impressive feat of robustness given that the end effector is a single under-actuated "sweeper" (1 DOF) working in tandem with a planar wedge, that the whole system operates via open-loop control, and that the objects were quite varied (from small individual pills to large containers, and from deformable textiles to rigid bottles). The system is slated to appear at ICRA 2009 in Kobe, Japan in the next few days and is documented in a paper entitled "1000 Trials: An Empirically Validated End Effector that Robustly Grasps Objects from the Floor" (of which I am a coauthor). Read further for videos and additional discussion.
I would like to mark this momentous occasion by sharing it with you -- that, and it is just plain cool (and artistic)! The folks at RadiologyArt.com have been building Computed Tomography (CT) scans of various objects (dolls, electronics, vacuum tubes, McDonalds hamburgers, etc). The addition of a remote control dog and wind-up drumming bunny represent (to my knowledge) the first examples of CT scans of robots, albeit rudimentary robots. Read on for pictures and amazingly detailed videos.
There was a paper just released in Science about "Giant-Stroke, Superelastic Carbon Nanotube Aerogel Muscles." This is a rare case where I believe the research material far exceeds the buzzword hype! The new material responds to applied voltages by expanding 220% in a few milliseconds, operating at temperatures as low as that of liquid nitrogen and as high as the melting point of iron. It has the strength and stiffness of steel (by weight) in one direction, yet is as compliant as rubber in the other two. It has extremely low density due to its airy (aerogel) structure, is electrically conductive, and is transparent. This materials innovation has the potential to rejuvenate research on artificial muscles, which has generally focused on shape memory alloys (i.e. nickel-titanium, or Nitinol), piezoelectrics (such as PZT), or electroactive polymers (EAPs). Read on for a discussion of these alternative technologies, their drawbacks, and why this new material may be a game-changer!
Describing science as "beautiful" makes perfect sense to me; I believe the physics experiments described in The Prism and the Pendulum are on par with the greatest paintings and sculptures ever conceived! However, I'm having difficulties classifying the $30,000 robot, Keepon: Is it a research robot, an art-robot, or both? On one hand, there is evidence supporting its role in important robotics research. On the other hand, there are the numerous (many more?) whimsical videos of Keepon dancing to music or traveling the world, such as the "Keepon Goes Seoul-Searching" video to be shown on Friday at the Human-Robot Interaction (HRI) 2009 conference (we show this video below). Having seen Keepon in person, I can attest to its "cuteness" factor and quality design... but my questions are: "Where is the line between art and research drawn?" "Does such a line, necessarily, exist?" and "How can HRI researchers and peer-reviewers objectively evaluate important robotics research that also possesses strong artistic components?" I'd love to hear your thoughts.
While Hizook covered the Rollin' Justin robot over three months ago, the rest of the world (including Engadget) had to wait until CeBIT, where Rollin' Justin "debuted" today. Lots of great pictures and videos were taken, including a video where Rollin' Justin is led around by the hand (I assume using the force/torque sensing capabilities of the DLR-III lightweight arm or the DLR-II hand). However, the "serious" coverage at CeBIT left out one of Justin's most hilarious commands: "dance like in pulp fiction." We show this video (to be shown at the upcoming ICRA 2009 conference) below.
I've been meaning to mention this for some time now... SICK has released a "new" laser rangefinder, the LMS 100. It seems to be a departure from the classic "coffee-pot" look of yore (i.e. the SICK LMS 291). In fact, its form factor and specifications are quite similar to the Hokuyo UTM-30LX's; the LMS 100 might be SICK's strategic response to the "budget" LIDAR manufacturer's (Hokuyo's) burgeoning popularity among indoor roboticists. Priced at $5000 USD ($1000 less with academic discounts), I'm curious how it actually compares in performance (in the field) to the $5600 Hokuyo UTM -- can anyone weigh in? Read on for a comparison of specifications.
Hizook reader Yue Khing pointed us to another disassembled laser rangefinder (LIDAR); this time, it is an Omron STI OptoShield OS3100. This LIDAR seems similar in form (apparently referred to as "coffee pots" in industry) and specification to the SICK LMS laser rangefinders. Honestly, I wasn't even aware that Omron made laser rangefinders, so I'm not sure what these units cost, or how common they are; however, it is still interesting to compare their internal design to the SICK LMS series and Velodyne laser rangefinders we've already seen disassembled. Read on for pictures and videos.
As of January 2009, the iBOT powered-wheelchair will be discontinued. This is unfortunate for the disabled community -- Dean Kamen and the others at DEKA (the same people responsible for the Segway and Luke Arm) developed an amazing robotic wheelchair that was (somewhat) unique in its ability to transition from a statically-stable, 4-wheel configuration to a dynamically-stable, 2-wheel configuration to give occupants added height. Further, by pivoting pairs of wheels, the wheelchair and occupant were able to dynamically balance while traversing stairs, not to mention the wheelchair's basic ability to traverse (relatively) poor terrain, such as sand and gravel! All of this was possible due to carefully designed controllers and internal gyros (not entirely dissimilar to a Segway). Read further for discussion -- specifically about why this loss for the disabled community could be an opportunity in disguise for the robotics community and a big win for Kamen and company.
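The gyro-plus-controller balancing described above is, at its core, the classic inverted-pendulum problem: measure the tilt angle and rate, and command wheel torque to drive both to zero. Here is a toy simulation of that idea; the gains, pendulum length, and simplified dynamics are illustrative assumptions of mine, not anything from DEKA's (proprietary) design.

```python
import math

def simulate_balance(theta0=0.1, kp=80.0, kd=12.0, dt=0.005, steps=2000,
                     g=9.81, length=0.8):
    """Toy inverted-pendulum balance loop: PD feedback on tilt angle,
    integrated with semi-implicit Euler. Returns the final tilt (rad),
    which a stabilizing controller should drive to ~0."""
    theta, omega = theta0, 0.0          # tilt angle (rad), tilt rate (rad/s)
    for _ in range(steps):
        u = -kp * theta - kd * omega    # PD control, acting like wheel torque
        alpha = (g / length) * math.sin(theta) + u  # gravity topple + control
        omega += alpha * dt
        theta += omega * dt
    return theta
```

Starting 0.1 rad (~6°) off vertical, the loop damps the tilt back to essentially zero within the 10-second simulation, which is the basic behavior that lets a two-wheeled configuration stand.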
I think this is both brilliant and hilarious... University of Delaware researchers, James Galloway and Sunil Agrawal, were awarded a two-year, $325k NSF grant to explore robot-enabled mobility for special needs children, with the goal of spurring cognitive development -- this is brilliant. However, why focus solely on special needs children? I think it is hilarious to imagine "regular" children using "smart wheelchairs" to putter around before they learn to crawl / walk -- it would certainly make for some entertaining rounds of baby-bumper-cars! Adding to the hilarity, their initial prototypes are Pioneer robots pulling a plywood trailer, supported by casters, with a small chair atop (images below)! But who am I to judge... we can all relate to "ugly prototype syndrome."
Robots have found utility in warfare since World War II (and arguably earlier), with the invention of simple electrical servo-mechanisms for fire control and targeting. While fire control has become extremely advanced, its "human in the loop" nature kept us (relatively) oblivious to the ethical implications of robots in warfare. However, increased autonomy and point-and-click capabilities are forcing us to reevaluate those implications. Enter a new book by P.W. Singer, entitled Wired for War: The Robotics Revolution and Conflict in the 21st Century. Singer was recently interviewed by NPR (and on The Daily Show by Jon Stewart), where he talked about a number of interesting issues. Links and discussion follow.
Friend and colleague, Richard Roberts, has offered to give us a glimpse of his recent work on learning autonomous robot behaviors / controllers from human operators:
While working on the DARPA LAGR project, we found it exceedingly difficult to tune our reactive behaviors to work well in cluttered and patchy environments. Either obstacle avoidance was too sensitive and the robot would not drive through gaps, or it was too aggressive and the robot would collide with obstacles. Of course, we could have made the behaviors more and more complicated, introducing more parameters to tune, but we wanted an easy way to have the robot “just do what I say!” Thus, we developed a system for interactive, on-line training of behaviors with a remote control. The user flips a switch to training mode, and drives the robot how they would like it to drive, then flips the switch back to autonomous to test the behavior.