At IROS 2009, iRobot demonstrated an interesting form of locomotion dubbed "particle jamming skin" (to create what became known as the "blob bot"). The robot was creepy, but the concept was intriguing. In a recently available TEDMED 2009 talk (embedded below), iRobot CEO Colin Angle describes a unique particle jamming end effector (robot hand) for manipulation. By selectively inflating or deflating, the particle jamming end effector can change from a liquid-like state to ooze around a target object and then harden into a solid-like state to grasp or pick up the object. Colin shows a video of a PackBot with a particle jamming end effector picking up medication, keys, and a (dummy) patient's arm. He also does a live demonstration using a hand-held particle jamming system. Be sure to check out the video and stills below -- they will help you understand this bizarre (but compelling) robot hand.
Back in October 2009, Colin Angle spoke at TEDMED 2009. It was a big announcement: iRobot was launching a new healthcare robotics business unit to be led by Tod Loofbourrow. Their ambitious goal: add 1 million years to users' lifetimes through robotic assistance. Some good synopses of the talk were posted, but videos of the event were elusive... until now. Below you can find the full video of Colin's talk and some points that I think are particularly salient.
Today Willow Garage announced that eleven (rather than the original ten anticipated) PR2 Beta robots, with a total value of over $4.4M, will be loaned out to academic and research institutions worldwide to develop a slew of impressive capabilities over the next two years. The recipients include 7 US-based institutions, 3 European, and 1 Asian. The final list is a panoply of robotics specialists: University of Freiburg (Germany), Bosch, Georgia Tech, KU Leuven (Belgium), MIT, Stanford, TU Munich (Germany), UC Berkeley, U Penn, USC, and University of Tokyo (Japan) -- full details can be found in the Willow Garage press release. It is difficult to overstate the importance of this event in the grand history of robotics... Let me try to explain.
Phillip Torrone (senior editor of Make Magazine) and Limor Fried (aka Lady Ada), both of AdaFruit Industries, gave a talk at O'Reilly's Foo Camp East 2010 that unveiled the financials of two robotics-related open hardware projects. First, DIYDrones -- founded by Wired editor-in-chief Chris Anderson and maker of open hardware UAV components like autopilots and IMUs -- is approaching $1M in revenue (est. 2010). Second, MakerBot -- maker of an open hardware 3D printer and purveyor of an online 3D design repository called Thingiverse -- has surpassed $1M in revenue. Looks like open hardware is really starting to gain momentum.
Robotiq is a new Canadian startup spun out of the Laval University Robotics Lab and founded by Samuel Bouchard, Vincent Duchaine and Jean-Philippe Jobin. Their first product is a very cool looking three-fingered robot hand called the "Adaptive Gripper." It comprises three under-actuated fingers, two of which can change their position and orientation to support a variety of grasp configurations -- very similar in principle to the Barrett Hand and Schunk SDH Hand. The Adaptive Gripper's prominent finger linkages lead to a rather beautiful mechanical motion, as seen in the grasping videos (below). I would imagine the mechanical linkages also offer additional robustness compared to under-actuated cable-driven competitors and cost advantages over fully-actuated competitors. Unfortunately, its price is still an unknown -- perhaps someone attending ICRA 2010 in Alaska can stop by their booth and inquire...?
iRobot posted amazing first-quarter revenue, driving the stock price up 30% in one day to a new 52-week high. The new stock price is hovering right at $20 per share, up 96% from its late-July low of $10.21 per share. I'm glad to see such a positive turnaround, though it is somewhat bittersweet -- I had been considering a purchase of IRBT shares ever since a prescient analysis by "Robot Stock News" in early December, but lacked funds to make the plunge. I guess meager graduate student salaries are not conducive to investing.
This new robot blimp, powered by electroactive polymers (EAPs), comes from the Swiss Federal Labs for Materials Testing and Research (EMPA). It reminds me of the Festo Air Ray, and definitely ranks up there with other cool EAP robots like the Artificial Muscle EPAM variants previously discussed on Hizook. Be sure to check out the video.
[We received the following note from Sonia Chernova @ MIT Media Lab] The Personal Robots group at the MIT Media Lab has released an online game designed to make robots smarter! Mars Escape is a two-player online game in which each player can take on the role of an astronaut or a robot on Mars. The players must work together to complete their mission before oxygen supplies run out.
Depth cameras go by many names: ranging camera, flash lidar, time-of-flight (ToF) camera, and RGB-D camera. The underlying sensing mechanisms are equally varied: range-gated ToF, RF-modulated ToF, pulsed-light ToF, and projected-light stereo. The commonality is that all provide traditional (sometimes color) images and depth information for each pixel (depth images) at framerate. Existing commercial offerings, such as the Swiss Ranger SR4000 and PMD Tech products, currently cost ~$10,000. Thus, I'm extremely excited by Dieter Fox's recent statement about a sub-$100 depth camera that could hit stores later this year! Dr. Fox has already leveraged a similar (this?) sensor to build cool 3D SLAM maps akin to Google Street View indoors -- see videos below. Is Dr. Fox's employer (Intel) building depth cameras? Is this a new PrimeSense offering? Or could it hail from fellow Seattle powerhouse, Microsoft, who not long ago purchased 3DV Systems (purveyor of ToF cameras) and who plans to release Project Natal (rumored to be projected stereo) later this year for the XBox 360? I'd love details, but am intrigued regardless! Updated March 31st 2010: Big news -- PrimeSense is supplying the 3D sensing technology to Project Natal for the XBox 360! Now I'm almost certain this is the sensor referred to by Dieter Fox.
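The common output of all these sensors -- a depth value per pixel -- is what makes them so useful for robotics: with the camera intrinsics, each pixel back-projects to a 3D point. A minimal sketch of that back-projection under a standard pinhole model (the intrinsics `fx, fy, cx, cy` here are illustrative placeholders; real values come from calibration):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) to an Nx3 point cloud using a
    pinhole camera model. (fx, fy) are focal lengths in pixels and
    (cx, cy) is the principal point -- all from camera calibration."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.dstack((x, y, z)).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels
```

Registering clouds like these from successive frames (e.g. with ICP) is essentially what makes the 3D SLAM maps in Dr. Fox's videos possible.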
This new humanoid robot named "Cody" comes from Georgia Tech's Healthcare Robotics Lab (to which I belong). Cody is composed of a Segway RMP 50 Omni mobile base, 1-DoF vertical linear actuator, and a pair of 7-DoF Meka Arms with series elastic actuators (the same as Simon). This mobile manipulator has shown some pretty impressive capabilities. It can open doors, drawers, and cabinets using equilibrium point controllers developed by Advait Jain and Prof. Charlie Kemp. It also has a nice direct physical interface (touching interface) to reposition the robot that was developed by Tiffany Chen and Prof. Charlie Kemp. Much of the code controlling this robot is open-source and has ROS (Robot Operating System) interfaces. Be sure to check out the videos and photos below.
Professional and hobbyist roboticists alike are snapping up Robotis Dynamixel Servos. These "smart" servos serve an important niche between $30 hobby servos and super-expensive harmonic drive servos. They sport torques ranging from 12 kg·cm to 106 kg·cm, and even more when doubled-up. Most of my experience is with the RX-28 and RX-64 variants, which have 300° swing, 10-bit position sensing resolution, (roughly) 8-bit position control, force/torque sensing, available compliance mode, and can daisy-chain more than 250 servos. At Georgia Tech's Healthcare Robotics Lab, we use dozens of these servos. I recently invested a decent amount of time overhauling our open-source (Python) control software, adding (among other things) thread-safe operation and ROS (Robot Operating System) compatibility. In this post, I'll do a brief overview of the Robotis Dynamixel offerings, look at a number of impressive applications where they are utilized, share pictures of a servo's disassembly, and give a brief tutorial using the new (awesome) open-source software libraries.
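Under the hood, daisy-chained Dynamixels all share one serial bus and are addressed by ID using Robotis' protocol-1.0 packet format: two 0xFF header bytes, the servo ID, a length byte, an instruction, parameters, and a ones'-complement checksum. As a sketch (register address 0x1E is the goal-position register on RX-series servos per the Robotis documentation; this is not our library's API, just the wire format it ultimately speaks):

```python
def dynamixel_write_packet(servo_id, address, values):
    """Build a Robotis Dynamixel protocol-1.0 WRITE_DATA packet:
    [0xFF 0xFF ID LENGTH INSTR ADDR PARAM... CHECKSUM]."""
    WRITE_DATA = 0x03
    params = [address] + list(values)
    length = len(params) + 2               # instruction byte + checksum byte
    body = [servo_id, length, WRITE_DATA] + params
    checksum = (~sum(body)) & 0xFF         # ones' complement of the byte sum
    return bytes([0xFF, 0xFF] + body + [checksum])

# Example: command servo ID 1 to goal position 512 (mid-swing).
# The 10-bit position is sent as a little-endian 16-bit value.
goal = 512
pkt = dynamixel_write_packet(1, 0x1E, [goal & 0xFF, goal >> 8])
```

In practice you'd write `pkt` to the RS-485 adapter's serial port and read back the servo's status packet; our open-source library wraps all of this (plus thread-safe bus arbitration) so you never touch raw bytes.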
Colleague and labmate Tiffany Chen pointed out an interesting new robot named "MeBot" from MIT's Personal Robotics Group. Later this week, MeBot will be presented at the conference on Human-Robot Interaction (HRI 2010) in Osaka, Japan. The associated paper, "MeBot: A Robotic Platform for Socially Embodied Presence," has been nominated for best paper. In a nutshell, MeBot is a semi-autonomous robotic avatar that provides rich, remote interaction by conveying non-verbal channels of social communication in addition to video, something that is not provided by existing phone and video conferencing. The expressiveness of MeBot is impressive. It reminds me of the (now well-known) CrabFu Swashbot, but ups the ante by including video capabilities. Be sure to check out the videos and photos below to see what I mean.
This article is an illustrated summary of a recent paper we presented at CVPR 2009. We leverage some of the linear properties of optical flow fields to develop a method that automatically learns the relationship between camera motion and optical flow from data. The method can handle arbitrary imaging systems including very severe distortion, curved mirrors, and multiple cameras. Using this method, a robot can estimate its motion in real time from video while detecting "motion anomalies" such as nearby or moving objects.
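The core idea of a linear flow/motion relationship can be illustrated with a toy example (this is a simplification, not the paper's actual algorithm): if the stacked flow field is approximately linear in the 6-DoF camera motion, flow ≈ J·m, then J can be learned from (motion, flow) training pairs by least squares, and egomotion can then be recovered from a new flow field by solving the inverse problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ground truth: 100 flow vectors (u, v interleaved -> 200 values)
# that depend linearly on a 6-DoF camera motion.
J_true = rng.normal(size=(200, 6))
motions = rng.normal(size=(500, 6))                     # training motions
flows = motions @ J_true.T + 0.01 * rng.normal(size=(500, 200))  # noisy flows

# Learn the flow/motion relationship from data via least squares.
J_est = np.linalg.lstsq(motions, flows, rcond=None)[0].T  # shape (200, 6)

# Estimate egomotion from a new flow field by solving J_est @ m = flow.
m_true = np.array([0.1, 0.0, 0.5, 0.02, -0.01, 0.0])
flow = J_true @ m_true
m_est = np.linalg.lstsq(J_est, flow, rcond=None)[0]
```

Pixels whose observed flow disagrees strongly with `J_est @ m_est` are exactly the "motion anomalies" mentioned above -- nearby or independently moving objects violate the learned model.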
Back in 2007 and 2008, funding agencies had a pretty hefty interest in robots with amoeba-like locomotion, also known as whole-skin locomotion (WSL), blob 'bots, or Chembots. NSF awarded $400k to Dr. Dennis Hong of Virginia Tech's RoMeLa Lab and DARPA awarded $3.3M to iRobot to develop such robots. Now, most people are familiar with iRobot's jamming skin robot announced at IROS 2009 (photos / videos below). However, I would like to share with you the equally-clever and interesting work of Dr. Hong, including a new whole-skin locomotion robot called ChIMERA: "Chemically Induced Motion Everting Robotic Amoeba" that was unveiled at a recent TEDxNASA event. Dr. Hong's robots resemble those slippery water-snake toys that are incredibly difficult to grasp, with silicone skin (flexible but rugged exterior) and water or gel inside (soft interior). Read on to learn more!
Unmanned aerial vehicles (UAVs) are no longer relegated to military and police forces. Amateurs and hobbyists, working in close-knit online communities, are fusing old RC airplane concepts with modern technology to create UAVs that rival commercial offerings. Recent efforts suggest that an amateur UAV, complete with on-board cameras, wireless video downlinks, operator heads-up display, autonomous waypoint navigation / autopilot control, and ground tracking stations can all be had for less than $2,000 (read on for details)! Unfortunately, the FAA (aviation regulatory body in the United States) already treats commercial UAVs as regular planes, requiring aircraft registration and 60-day pre-flight plans. While the regulations for hobbyists seem to be more lax, I personally believe the FAA should embrace amateur UAV builders in the same way that the FCC embraced ham radio operators of yesteryear.
Well, it's official. Willow Garage CEO Steve Cousins just announced to the Robotics-Worldwide mailing list that Willow intends to give away 10 PR2 robots. These are some amazingly impressive robots, costing several hundred thousand dollars each. Willow's PR2 robots and open-source Robot Operating System (ROS) have been widely acclaimed by news organizations such as the New York Times, Popular Science, Hizook, and pretty much everyone else. This should be an interesting year for Willow Garage. The full Robotics-Worldwide announcement is below, and the Willow Garage Call for Proposals (CFP) can be found here. Updated Jan 21st 2010: Included some new (professional) photos of the finished PR2.
I would like to share a piece of work that I think is awesome on so many levels. First, it involves the weakly electric knifefish: a curious creature that maneuvers via ribbon-finned propulsion (a marvel of fluid dynamics) and possesses an uncommon sensing modality in the form of electric field sensing (essentially electrostatic / capacitive sensing). Second, the work models the fish as a dynamic system through its measured frequency response expressed in Bode plots, a process familiar to pretty much any type of engineer. You read that right, they made Bode plots of a fish -- how cool is that!? Be sure to check out the videos and photos below.
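For anyone fuzzy on how one "makes a Bode plot" of an animal: drive the system (here, a hypothetical first-order lag standing in for the fish's refuge-tracking dynamics -- NOT the paper's model) with a sinusoid at one frequency, then extract gain and phase at that frequency by complex demodulation. Repeat across frequencies and you have a Bode plot.

```python
import numpy as np

def bode_point(t, u, y, freq):
    """One Bode-plot point: gain and phase (deg) of the response y
    to stimulus u at the given frequency, via complex demodulation."""
    ref = np.exp(-2j * np.pi * freq * t)
    H = np.mean(y * ref) / np.mean(u * ref)
    return np.abs(H), np.degrees(np.angle(H))

# Stand-in dynamics: first-order lag with time constant tau, so the
# answer is known analytically: |H| = 1/sqrt(1+(w*tau)^2), phase = -atan(w*tau).
tau, f = 0.3, 0.5
w = 2 * np.pi * f
t = np.arange(0, 10, 0.002)            # exactly 5 periods of the stimulus
u = np.sin(w * t)
y = np.sin(w * t - np.arctan(w * tau)) / np.sqrt(1 + (w * tau) ** 2)
gain, phase = bode_point(t, u, y, f)
```

In the actual experiments, `u` would be the moving refuge's position and `y` the fish's tracked position -- the same arithmetic, with a live animal in the loop.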
Autonomously seeking out power for battery recharging is a pretty crucial capability for advanced mobile robots. While Roomba-like docking stations are a quick fix, "plugging in" to existing infrastructures is preferable. Not long ago, the robotics world was abuzz with the Willow Garage Milestone 2, where (among other things) a PR-2 robot plugged itself into 9 different wall outlets. My curiosity on this subject was further piqued when I saw Intel's Marvin robot use electric fields emanating from an outlet's internal wiring to finely localize an outlet/plug and adeptly plug itself in, all sans camera. I'd like to share some photos and videos of recent efforts (by both the Willow and Intel folks), as well as examine the history of robots plugging themselves into wall outlets.
Having previously written about various artificial muscle technologies, I'd like to examine the electroactive polymer (EAP) variant in more detail. I'll briefly discuss how EAPs function, then move on to myriad examples of EAPs used in robotics applications, including: biomimetic robot eyes, children's toys, and flapping-wing ornithopters. I'll also look at electroactive polymer artificial muscles (EPAM) that were invented at SRI International and subsequently spun off to startup Artificial Muscle, Inc. In my favorite example, a hexapod walker was constructed at SRI whose muscles provide both structural support and actuation. Now if they could also function as energy storage devices, they'd be the ultimate biological analog.
A commercially-available ultra low-cost laser rangefinder is finally set to hit department store shelves in February! I'm speaking of the laser rangefinder presented at ICRA 2008 that costs $30 to build (commented on here at Hizook almost one year ago) that sits atop the recently announced Neato Robotics XV-11 vacuum cleaner. Others have thoroughly discussed the XV-11's competitiveness with iRobot products, the possible patent infringement of iRobot's square-front design, and its ability to perform SLAM (Simultaneous Localization and Mapping). But everyone has glossed over the coolest part: Forget about Neato's $400 robot, $60 batteries, $30 wheels (etc.)... if made available, sub-$100 laser rangefinders would revolutionize hobby robotics! Read on for a description of this compelling (future?) component.
This is the third installment in what could be billed the "building series." The first two articles focused on rather involved fabrication techniques for larger robots; this time, I'd like to look at two more-accessible techniques for building miniature robots that I learned about at IROS 2009. The first technique, by Jessica Rajkowski and advisor Sarah Bergbreiter (et al.) from University of Maryland, is a relatively new method employing multi-step photolithography via inkjet printed masks to build small polymer robots such as inchworms and grippers that are actuated by shape memory alloys (SMAs). The second technique examined is a bit more mature. Called "Smart Composite Microstructures" (SCM) and hailing from UC Berkeley's Biomimetic Millisystems Lab, this technique is used to build inexpensive, resilient, folded composite (cardboard, carbon fiber, fiberglass) prototypes with polymer hinges. Read on for details and videos.
Following up on last week's article about building robot hands with compliant under-actuated fingers, I'd like to examine a technique to build aesthetic shells for robot heads using a combination of 3D-printed master forms, silicone molds, and quick-setting plastic final products. The technique examined was used by MIT alum Cory Kidd to build 18 prototypes of the Autom weight-loss coach for his PhD dissertation, a product that is being continuously refined at Cory's new startup, Intuitive Automata. This technique seems a bit involved; I probably would have just outsourced a full 3D printed ABS version of the head, especially since there were 18 of them. However, I always find these advanced robot fabrication techniques enlightening.
I'm a huge fan of so-called micro robots -- those with cm length scales, and thus volumes on the order of 1 cm³ (10⁻⁶ m³). I've posted about numerous micro robots before, including the amazing Alice micro robot swarms from EPFL, and I am a long-time micro and nano autonomous sumo robot advocate (see RoboGames). Perhaps that is why I'm so excited about the SwarmRobot.org open hardware micro-robot swarm, developed by the University of Stuttgart and the University of Karlsruhe. All of the hardware and software is open (in the GPL sense), including parts lists, circuit board and chassis designs, and software. With a stated goal to produce sub-€100 robots, I'd really like to see this take off. Combined with a wireless power surface, a micro-robot in perpetual motion would make a great desk ornament!
Meka Robotics is a San Francisco robotics startup founded by MIT roboticists Aaron Edsinger and Jeff Weber, of Domo fame. They have produced some pretty amazing products in the last few years, including the humanoid robot Simon that was recently featured on Hizook. As I'm somewhat familiar with these arms and hands, I'd like to share some more detailed information, including new videos of the torso and a more detailed look at the anthropomorphic hands. In particular, it is worth noting that all motors on the 7-DOF arms and 4-DOF hands employ series-elastic actuators (SEAs), a technology that offers natural compliance and provides torque measurements at each joint -- two very useful qualities for robots interacting directly with people. Be sure to read on for videos and many pictures. Updated Oct. 19th 2009: exclusive photos, product data sheets, and new videos added.
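The reason SEAs give you torque sensing "for free" is worth spelling out: a spring sits in series between the motor and the joint, so joint torque is just the spring constant times the measured spring deflection, and commanding a torque reduces to commanding a deflection. A minimal sketch of that principle (the spring constant here is an arbitrary illustrative value, not a Meka spec):

```python
def sea_torque(theta_motor, theta_joint, k_spring):
    """Series-elastic actuation: joint torque (N·m) inferred from the
    deflection of the series spring, tau = k * (theta_motor - theta_joint)."""
    return k_spring * (theta_motor - theta_joint)

def sea_motor_setpoint(tau_desired, theta_joint, k_spring):
    """Motor angle that produces tau_desired at the current joint angle --
    the core of a simple SEA torque controller."""
    return theta_joint + tau_desired / k_spring
```

The same spring also makes the joint mechanically compliant: an unexpected contact deflects the spring (a bounded torque) rather than fighting a stiff gearbox, which is exactly why SEAs are attractive for robots working around people.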
With micro / pico projectors being sold for under $250, and robot toy maker Wowwee getting in the game, it was only a matter of time before projectors would be found on robots -- especially since the general concept dates back at least three decades to R2D2's holographic projections in the original Star Wars trilogy. In fact, Hizook previously examined a number of robots with projectors used to communicate intention. Following the development of a laser pointer interface by the Healthcare Robotics Lab (to which I belong), numerous labmates and I ruminated about marrying these two technologies. It seemed a natural extension of the "Clickable World" -- wherein the world is composed of virtual buttons or icons selected via a laser pointer analogous to a PC mouse -- to include visual feedback via an on-robot projector. It seems ideas rarely stand in isolation; I'm now aware of two robotic systems that use both video projectors and laser pointer interfaces. The first is a very preliminary "late breaking results" submission to HRI 2009, while the other is a fully-realized system developed in JST's ERATO program. The latter research happens to have a compelling video, embedded below.