Depth cameras go by many names: ranging camera, flash lidar, time-of-flight (ToF) camera, and RGB-D camera. The underlying sensing mechanisms are equally varied: range-gated ToF, RF-modulated ToF, pulsed-light ToF, and projected-light stereo. The commonality is that all provide traditional (sometimes color) images and depth information for each pixel (depth images) at framerate. Existing commercial offerings, such as the Swiss Ranger SR4000 and PMD Tech products, currently cost ~$10,000. Thus, I'm extremely excited by Dieter Fox's recent statement about a sub-$100 depth camera that could hit stores later this year! Dr. Fox has already leveraged a similar (this?) sensor to build cool 3D SLAM maps akin to Google Street View indoors -- see videos below. Is Dr. Fox's employer (Intel) building depth cameras? Is this a new PrimeSense offering? Or could it hail from fellow Seattle powerhouse, Microsoft, who not long ago purchased 3DV Systems (purveyor of ToF cameras) and who plans to release Project Natal (rumored to be projected stereo) later this year for the XBox 360? I'd love details, but am intrigued regardless! Updated March 31st 2010: Big news -- PrimeSense is supplying the 3D sensing technology to Project Natal for the XBox 360! Now I'm almost certain this is the sensor referred to by Dieter Fox.
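For readers curious how the ToF variants actually recover depth, the arithmetic is simple: pulsed-light sensors time a light pulse's round trip, while RF-modulated sensors measure the phase shift of a modulated carrier. A minimal sketch of both (the values used are purely illustrative, not any particular product's specs):

```python
import math

# Sketch of the two ToF depth calculations mentioned above. Nothing here is
# specific to the SR4000, PMD, or any other product.

C = 299_792_458.0  # speed of light, m/s

def pulsed_tof_depth(round_trip_s):
    """Pulsed-light ToF: depth from the measured round-trip time of a pulse
    (the light travels out and back, hence the divide-by-two)."""
    return C * round_trip_s / 2.0

def modulated_tof_depth(phase_rad, mod_freq_hz):
    """RF-modulated ToF: depth from the phase shift of the modulated carrier.
    Unambiguous only out to C / (2 * mod_freq_hz)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)
```

Note the timing precision required: resolving 1 cm of depth means resolving about 67 picoseconds of round-trip time, which is why these sensors integrate over many pulses or modulation cycles per frame.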
This new humanoid robot named "Cody" comes from Georgia Tech's Healthcare Robotics Lab (to which I belong). Cody is composed of a Segway RMP 50 Omni mobile base, 1-DoF vertical linear actuator, and a pair of 7-DoF Meka Arms with series elastic actuators (the same as Simon). This mobile manipulator has shown some pretty impressive capabilities. It can open doors, drawers, and cabinets using equilibrium point controllers developed by Advait Jain and Prof. Charlie Kemp. It also has a nice direct physical interface (touching interface) to reposition the robot that was developed by Tiffany Chen and Prof. Charlie Kemp. Much of the code controlling this robot is open-source and has ROS (Robot Operating System) interfaces. Be sure to check out the videos and photos below.
Professional and hobbyist roboticists alike are snapping up Robotis Dynamixel Servos. These "smart" servos serve an important niche between $30 hobby servos and super-expensive harmonic drive servos. They sport torques ranging from 12 kg·cm to 106 kg·cm, and even more when doubled-up. Most of my experience is with the RX-28 and RX-64 variants, which have 300° swing, 10-bit position sensing resolution, (roughly) 8-bit position control, force/torque sensing, available compliance mode, and can daisy-chain more than 250 servos. At Georgia Tech's Healthcare Robotics Lab, we use dozens of these servos. I recently invested a decent amount of time overhauling our open-source (Python) control software, adding (among other things) thread-safe operation and ROS (Robot Operating System) compatibility. In this post, I'll do a brief overview of the Robotis Dynamixel offerings, look at a number of impressive applications where they are utilized, share pictures of a servo's disassembly, and give a brief tutorial using the new (awesome) open-source software libraries.
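To give a flavor of what the control software does at the byte level, here is a sketch of the Dynamixel "Protocol 1.0" instruction packet used to command a goal position. The register address (0x1E) and WRITE_DATA opcode match the RX-series documentation as I understand it, but treat this as an illustration, not a substitute for the actual library:

```python
# Sketch of a Dynamixel "Protocol 1.0" instruction packet. Every servo on the
# daisy-chained bus sees every packet, but only the one matching the ID responds.

WRITE_DATA = 0x03       # instruction opcode for writing to a register
GOAL_POSITION = 0x1E    # register address of the 2-byte goal position

def build_packet(servo_id, instruction, params):
    """Frame: 0xFF 0xFF, ID, LENGTH, INSTRUCTION, PARAMS..., CHECKSUM."""
    length = len(params) + 2                 # instruction byte + checksum byte
    body = bytes([servo_id, length, instruction]) + params
    checksum = (~sum(body)) & 0xFF           # ones' complement of the byte sum
    return b"\xff\xff" + body + bytes([checksum])

def set_goal_position(servo_id, position):
    """Command a 10-bit goal position (0-1023), sent low byte first."""
    params = bytes([GOAL_POSITION, position & 0xFF, (position >> 8) & 0xFF])
    return build_packet(servo_id, WRITE_DATA, params)
```

For example, `set_goal_position(1, 512)` centers servo ID 1 in its 300° swing; the real library layers thread-safe serial I/O and ROS interfaces on top of packets like this.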
Colleague and labmate, Tiffany Chen, pointed out an interesting new robot named "MeBot" from MIT's Personal Robotics Group. Later this week, MeBot will be presented at the conference on Human-Robot Interaction (HRI 2010) in Osaka, Japan. The associated paper, "MeBot: A Robotic Platform for Socially Embodied Presence," has been nominated for best paper. In a nutshell, MeBot is a semi-autonomous robotic avatar that provides rich, remote interaction by conveying non-verbal channels of social communication in addition to video, something that is not provided by existing phone and video conferencing. The expressiveness of MeBot is impressive. It reminds me of the (now well-known) CrabFu Swashbot, but ups the ante by including video capabilities. Be sure to check out the videos and photos below to see what I mean.
This article is an illustrated summary of a recent paper we presented at CVPR 2009. We leverage some of the linear properties of optical flow fields to develop a method that automatically learns the relationship between camera motion and optical flow from data. The method can handle arbitrary imaging systems including very severe distortion, curved mirrors, and multiple cameras. Using this method, a robot can estimate its motion in real time from video while detecting "motion anomalies" such as nearby or moving objects.
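The core idea can be sketched in a few lines: treat the stacked optical-flow vector as a linear function of the camera's motion parameters, fit the matrix by least squares from training pairs, and score anomalies by the residual between observed and predicted flow. This toy version uses made-up dimensions and noise levels, not the paper's actual pipeline:

```python
import numpy as np

# Toy sketch: learn a linear map A from camera motion (e.g. translation and
# rotation rates) to the stacked optical-flow vector, then flag frames whose
# observed flow deviates from the prediction. Dimensions are illustrative.

rng = np.random.default_rng(0)
n_samples, motion_dim, flow_dim = 200, 3, 8

A_true = rng.normal(size=(flow_dim, motion_dim))            # "ground truth" map
motions = rng.normal(size=(n_samples, motion_dim))          # known ego-motion
flows = motions @ A_true.T + 0.01 * rng.normal(size=(n_samples, flow_dim))

# Least-squares fit of A such that flows ~= motions @ A.T
X, *_ = np.linalg.lstsq(motions, flows, rcond=None)
A_learned = X.T

# Anomaly score: residual between observed and predicted flow
predicted = motions @ A_learned.T
residuals = np.linalg.norm(flows - predicted, axis=1)
```

Because the map is linear, no camera model is ever needed; severe lens distortion just changes the entries of A, which the data fit absorbs automatically.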
Back in 2007 and 2008, funding agencies had a pretty hefty interest in robots with amoeba-like locomotion, also known as whole-skin locomotion (WSL), blob 'bots, or Chembots. NSF awarded $400k to Dr. Dennis Hong of Virginia Tech's RoMeLa Lab and DARPA awarded $3.3M to iRobot to develop such robots. Now, most people are familiar with iRobot's jamming skin robot announced at IROS 2009 (photos / videos below). However, I would like to share with you the equally-clever and interesting work of Dr. Hong, including a new whole-skin locomotion robot called ChIMERA: "Chemically Induced Motion Everting Robotic Amoeba" that was unveiled at a recent TEDxNASA event. Dr. Hong's robots resemble those slippery water-snake toys that are incredibly difficult to grasp, with silicone skin (flexible but rugged exterior) and water or gel inside (soft interior). Read on to learn more!
Unmanned aerial vehicles (UAVs) are no longer relegated to military and police forces. Amateurs and hobbyists, working in close-knit online communities, are fusing old RC airplane concepts with modern technology to create UAVs that rival commercial offerings. Recent efforts suggest that an amateur UAV, complete with on-board cameras, wireless video downlinks, operator heads-up display, autonomous waypoint navigation / autopilot control, and ground tracking stations can all be had for less than $2,000 (read on for details)! Unfortunately, the FAA (the United States' aviation regulatory body) already treats commercial UAVs as regular planes, requiring aircraft registration and 60-day pre-flight plans. While the regulations for hobbyists seem to be more lax, I personally believe the FAA should embrace amateur UAV builders in the same way that the FCC embraced ham radio operators of yesteryear.
Well, it's official. Willow Garage CEO Steve Cousins just announced to the Robotics-Worldwide mailing list that Willow intends to give away 10 PR2 robots. These are some amazingly impressive robots, costing several hundred thousand dollars each. Willow's PR2 robots and open-source Robot Operating System (ROS) have been widely acclaimed by news organizations such as the New York Times, Popular Science, Hizook, and pretty much everyone else. This should be an interesting year for Willow Garage. The full Robotics-Worldwide announcement is below, and the Willow Garage Call for Proposals (CFP) can be found here. Updated Jan 21st 2010: Included some new (professional) photos of the finished PR2.
I would like to share a piece of work that I think is awesome on so many levels. First, it involves the weakly electric knifefish: a curious creature that maneuvers via ribbon-finned propulsion (a marvel of fluid dynamics) and possesses an uncommon sensing modality in the form of electric field sensing (essentially electrostatic / capacitive sensing). Second, the work models the fish as a dynamic system through its measured frequency response expressed in Bode plots, a process familiar to pretty much any type of engineer. You read that right, they made Bode plots of a fish -- how cool is that!? Be sure to check out the videos and photos below.
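For those who haven't made a Bode plot since controls class: you drive the system with sinusoids across a range of frequencies and record the gain and phase of its response at each one. A sketch using an analytic first-order plant as a stand-in (the fish's actual tracking dynamics are of course far richer than this):

```python
import math

# Sketch of the system-ID recipe behind a Bode plot, evaluated analytically
# for a first-order plant H(s) = 1 / (1 + s/wc). For the knifefish, the
# "plant" is the fish's swimming response to a moving refuge; here the
# first-order form is just an illustrative stand-in.

def bode_point(omega, omega_c):
    """Return (magnitude in dB, phase in degrees) of H(jw) = 1/(1 + jw/wc)."""
    ratio = omega / omega_c
    mag_db = -10.0 * math.log10(1.0 + ratio**2)   # 20*log10(1/sqrt(1+r^2))
    phase_deg = -math.degrees(math.atan(ratio))
    return mag_db, phase_deg

# Sweep a decade around the corner frequency, as one would when driving the
# refuge with sinusoids of increasing frequency.
sweep = [bode_point(w, 10.0) for w in (1.0, 10.0, 100.0)]
```

At the corner frequency the familiar fingerprints appear: gain down about 3 dB and phase lag of 45°, and the same fingerprints in the fish's measured response are what let the authors fit it as a dynamic system.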
Autonomously seeking out power for battery recharging is a pretty crucial capability for advanced mobile robots. While Roomba-like docking stations are a quick fix, "plugging in" to existing infrastructures is preferable. Not long ago, the robotics world was abuzz with the Willow Garage Milestone 2, where (among other things) a PR-2 robot plugged itself into 9 different wall outlets. My curiosity on this subject was further piqued when I saw Intel's Marvin robot use electric fields emanating from an outlet's internal wiring to finely localize an outlet/plug and adeptly plug itself in, all sans camera. I'd like to share some photos and videos of recent efforts (by both the Willow and Intel folks), as well as examine the history of robots plugging themselves into wall outlets.
Having previously written about various artificial muscle technologies, I'd like to examine the electroactive polymer (EAP) variant in more detail. I'll briefly discuss how EAPs function, then move on to myriad examples of EAPs used in robotics applications, including: biomimetic robot eyes, children's toys, and flapping-wing ornithopters. I'll also look at electroactive polymer artificial muscles (EPAM) that were invented at SRI International and subsequently spun off to startup Artificial Muscle, Inc. In my favorite example, a hexapod walker was constructed at SRI whose muscles provide structural support in addition to actuation. Now if they could also function as energy storage devices, they'd be the ultimate biological analog.
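For a sense of how dielectric-elastomer EAPs actuate: a common first-order model treats the electrostatic attraction between the compliant electrodes as an effective pressure p = εr·ε0·(V/t)² squeezing the film, which (being incompressible) expands in area. A quick sketch with illustrative numbers (the film thickness and voltage below are not from any SRI device):

```python
# First-order "Maxwell pressure" model for a dielectric elastomer actuator:
# two compliant electrodes sandwich a soft polymer film; applying kilovolts
# squeezes the film in thickness and expands it in area. Values illustrative.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def maxwell_pressure(rel_permittivity, voltage_v, thickness_m):
    """Effective electrostatic squeezing pressure (Pa) on the elastomer film."""
    field = voltage_v / thickness_m          # electric field, V/m
    return rel_permittivity * EPS0 * field**2

# Example: a 50-micron acrylic film (relative permittivity ~3) at 5 kV
pressure_pa = maxwell_pressure(3.0, 5000.0, 50e-6)
```

The example works out to roughly 0.27 MPa of actuation pressure, which hints at why these films need kilovolt drive electronics yet can still rival muscle in stress.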
A commercially-available ultra low-cost laser rangefinder is finally set to hit department store shelves in February! I'm speaking of the laser rangefinder presented at ICRA 2008 that costs $30 to build (commented on here at Hizook almost one year ago) that sits atop the recently announced Neato Robotics XV-11 vacuum cleaner. Others have thoroughly discussed the XV-11's competitiveness with iRobot products, the possible patent infringement of iRobot's square-front design, and its ability to perform SLAM (Simultaneous Localization and Mapping). But everyone has glossed over the coolest part: Forget about Neato's $400 robot, $60 batteries, $30 wheels (etc.)... if made available, sub-$100 laser rangefinders would revolutionize hobby robotics! Read on for a description of this compelling (future?) component.
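As I recall, the ICRA 2008 design achieves its low cost with camera-laser triangulation rather than time-of-flight: the laser spot's pixel offset in the camera image maps to range, so accuracy degrades quadratically with distance. A sketch with made-up optics (the focal length and baseline below are illustrative, not the actual sensor's):

```python
# Sketch of triangulation ranging: a laser and a small camera are separated
# by a fixed baseline, and the laser spot's offset in the image encodes range
# (just like stereo disparity). Optics parameters are illustrative.

def triangulated_range(focal_px, baseline_m, offset_px):
    """Range (m) from the laser spot's pixel offset in the camera image."""
    return focal_px * baseline_m / offset_px

def range_resolution(focal_px, baseline_m, range_m, centroid_res_px=0.1):
    """Range uncertainty (m) for a given sub-pixel centroiding resolution.
    Differentiating R = f*b/x gives dR = R^2 * dx / (f*b): error grows as R^2."""
    return range_m**2 * centroid_res_px / (focal_px * baseline_m)
```

The R² error growth is the key trade-off that makes triangulation cheap but short-range, which is fine for a vacuum cleaner, and for most hobby robots too.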
This is the third installment in what could be billed the "building series." The first two articles focused on rather involved fabrication techniques for larger robots; this time, I'd like to look at two more-accessible techniques for building miniature robots that I learned about at IROS 2009. The first technique, by Jessica Rajkowski and advisor Sarah Bergbreiter (et al.) from the University of Maryland, is a relatively new method employing multi-step photolithography via inkjet-printed masks to build small polymer robots, such as inchworms and grippers, that are actuated by shape memory alloys (SMAs). The second technique is a bit more mature. Called "Smart Composite Microstructures" (SCM) and hailing from UC Berkeley's Biomimetic Millisystems Lab, it is used to build inexpensive, resilient, folded composite (cardboard, carbon fiber, fiberglass) prototypes with polymer hinges. Read on for details and videos.
Following up on last week's article about building robot hands with compliant under-actuated fingers, I'd like to examine a technique to build aesthetic shells for robot heads using a combination of 3D-printed master forms, silicone molds, and quick-setting plastic final products. The technique examined was used by MIT alum Cory Kidd to build 18 prototypes of the Autom weight-loss coach for his PhD dissertation, a product that is being continuously refined at Cory's new startup, Intuitive Automata. This technique seems a bit involved; I probably would have just outsourced a full 3D printed ABS version of the head, especially since there were 18 of them. However, I always find these advanced robot fabrication techniques enlightening.
I'm a huge fan of so-called micro robots -- those with cm length scales, and thus roughly cm³ volumes. I've posted about numerous micro robots before, including the amazing Alice micro robot swarms from EPFL, and I am a long-time micro and nano autonomous sumo robot advocate (see RoboGames). Perhaps that is why I'm so excited about the SwarmRobot.org open hardware micro-robot swarm, developed by the University of Stuttgart and the University of Karlsruhe. All of the hardware and software is open (in the GPL sense), including parts lists, circuit board and chassis designs, and software. With a stated goal to produce sub-€100 robots, I'd really like to see this take off. Combined with a wireless power surface, a micro-robot in perpetual motion would make a great desk ornament!
Meka Robotics is a San Francisco robotics startup founded by MIT roboticists Aaron Edsinger and Jeff Weber, of Domo fame. They have produced some pretty amazing products in the last few years, including the humanoid robot Simon that was recently featured on Hizook. As I'm somewhat familiar with these arms and hands, I'd like to share some more detailed information, including new videos of the torso and a more detailed look at the anthropomorphic hands. In particular, it is worth noting that all motors on the 7-DOF arms and 4-DOF hands employ series-elastic actuators (SEAs), a technology that offers natural compliance and provides torque measurements at each joint -- two very useful qualities for robots interacting directly with people. Be sure to read on for videos and many pictures. Updated Oct. 19th 2009: exclusive photos, product data sheets, and new videos added.
With micro / pico projectors being sold for under $250, and robot toy maker Wowwee getting in the game, it was only a matter of time before projectors would be found on robots -- especially since the general concept dates back at least three decades to R2D2's holographic projections in the original Star Wars trilogy. In fact, Hizook previously examined a number of robots with projectors used to communicate intention. Following the development of a laser pointer interface by the Healthcare Robotics Lab (to which I belong), numerous labmates and I ruminated about marrying these two technologies -- adding visual feedback via an on-robot projector seemed a natural extension of the "Clickable World," wherein the world is composed of virtual buttons or icons selected via a laser pointer, analogous to a PC mouse. It seems ideas rarely stand in isolation; I'm now aware of two robotic systems that use both video projectors and laser pointer interfaces. The first is a very preliminary "late breaking results" submission to HRI 2009, while the other is a fully-realized system developed in JST's ERATO program. The latter research happens to have a compelling video, embedded below.
I'm really intrigued by the recent announcement (Aug. 21st, 2009) that Jeff Bezos, founder of internet giant Amazon, has participated in a substantial $7 million round of funding via his personal investment firm (Bezos Expeditions) into Heartland Robotics, a stealthy startup co-founded by Rod Brooks -- who left iRobot (which he also co-founded) to launch this new venture. Brooks is certainly a figurehead in robotics, having effectively "invented" behavior-based robotics (not to mention being my academic grandfather -- the advisor of my advisor). But his Heartland Robotics remains an enigma, wrapped in a riddle, surrounded by mystery... Below I discuss some of my musings, but your additional speculation and insight would be greatly appreciated. Most significantly, how will this new relationship affect Kiva Systems -- Amazon recently acquired one of Kiva's largest clients (Zappos) for an impressive $928M!
It appears the I-Swarm robot project has produced some fully-integrated and apparently functional micro robots -- almost four years after we saw the initial conceptual videos appear online. What makes these robots so impressive is the level of integration; they possess a micro-step locomotion mechanism, a solar cell, custom IR communication modules, and an ASIC (custom silicon circuitry) all in a very compact package. I'm quite impressed by the pictures and videos (embedded below). Since I-SWARM stands for "Intelligent Small-World Autonomous Robots for Micro-manipulation", I'm a bit perplexed by the lack of manipulation capabilities. They do have a small piezoelectric-driven cantilever arm in the front, but it currently doesn't seem as capable as the AFM tips employed by the MiCRoN project's micro robots. Perhaps, as the PhysOrg article notes, they just need additional funding -- appropriate for such quality engineering and top-notch research.
The good folks at the AI and Robotics Blog have posted an entertaining video of a Kuka Light-Weight Robot (LWR) arm climbing a ladder -- an impressive feat for such a large arm. Those who read Hizook will recognize these arms as descendants of the DLR-III lightweight arms (featured frequently), also with a 1:1 mass-payload ratio, that have been employed on the Justin research platform. According to the AI and Robotics Blog posting, the arms are now available for purchase, though they still do not appear to be listed as an "available product" on Kuka's website. However, Kuka is certainly moving forward with these arms -- they have been in numerous demonstrations at recent trade shows; for example, I saw them featured at IROS 2008, and they recently appeared atop Kuka's new mobile manipulator, the OmniRob robot. Check out the video below.
We've seen robots controlled with projector interfaces and laser-pointer interfaces, and now we can add tabletop interfaces to the list. My labmate, Hai Nguyen, pointed out the CRISTAL project from the Media Interaction Lab at the Upper Austria University of Applied Sciences. CRISTAL is an interesting "smart home" technology that uses a tabletop interface (similar to Microsoft's Surface) and a ceiling-mounted camera to display and control household electronics such as lights, TVs, digital picture frames, and robots! To command the robot, the user "draws" the desired robot path on the tabletop computer using their finger. The robot then follows the route via optical tracking through the ceiling-mounted camera. Interesting interaction, and it's always good to see robots become sufficiently ubiquitous that they're classified (and controlled) in the same manner as other home electronics. Check out a video of the interaction below.
Dr. Andrea Thomaz of Georgia Tech's Socially Intelligent Machines Lab was recently awarded the prestigious "MIT Tech Review 2009 Young Innovators Under 35", an honor shared with last year's robotics recipient, Andrew Ng. Alongside this fantastic news, Andrea's lab unveiled an amazing new robot named Simon (see photos and videos below). Simon features an articulated torso, dual 7-DOF arms, and anthropomorphic hands from Meka Robotics along with an expressive head designed at Georgia Tech. Simon is designed to study human-robot interaction from a social learning vantage, such as learning by demonstration and human-robot collaboration. I'm thrilled for Andrea, and I'm proud to have taken her graduate research course on human-robot interaction while at Georgia Tech.
While perusing Kuka's 2008 Annual Report, it became evident that the robotics giant is making a serious foray into mobile manipulation with its OmniRob concept robot (photos and videos below). This new robot sports an omnidirectional mobile platform based on mecanum wheels, a Kuka lightweight arm, and what appear to be dual SICK LMS100 laser range finders to provide 360° lidar coverage. Between Kuka's "toy" educational platform (covered by Hizook in October) and this more advanced offering, it is clear that Kuka is highly invested in the future of mobile manipulation. With Kuka's classic expertise in robot arms, combined with competence in omnidirectional systems via their OmniMove industrial application line, Kuka will surely be a significant force in the exciting field of mobile manipulation.
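Mecanum wheels earn their omnidirectionality from rollers mounted at 45° around each rim; combining the four wheel speeds yields arbitrary planar motion. A sketch of the standard inverse kinematics (the wheel radius and footprint dimensions below are illustrative, not OmniRob's):

```python
# Standard inverse kinematics for a four-wheel mecanum platform: given body
# velocities, compute each wheel's angular velocity. Geometry is illustrative.

def mecanum_wheel_speeds(vx, vy, wz, lx=0.25, ly=0.20, r=0.05):
    """Wheel angular velocities (rad/s) in order front-left, front-right,
    rear-left, rear-right. vx is forward (m/s), vy is left (m/s), wz is
    yaw rate (rad/s); lx, ly are half the wheelbase / track; r is wheel radius."""
    k = lx + ly
    return (
        (vx - vy - k * wz) / r,  # front-left
        (vx + vy + k * wz) / r,  # front-right
        (vx + vy - k * wz) / r,  # rear-left
        (vx - vy + k * wz) / r,  # rear-right
    )
```

Pure forward motion spins all four wheels equally, while pure sideways motion spins diagonal pairs in opposition -- the roller angles cancel the fore-aft components and leave only lateral thrust.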
At ICRA 2009, the Rollin' Justin humanoid robot (the lovable robot that "Danced Like in Pulp Fiction") demonstrated some impressive teleoperation capabilities. The man-machine interface (MMI) consists of two components. The first is a pair of DLR-III lightweight arms, the same type employed by the robot, terminated with force-torque sensing load cells; these command the omnidirectional base or the arms / hands. The second is a fully-immersive heads-up display with Vicon (optical) head tracking that constantly streams robot-mounted camera images to the user while panning and tilting the robot's head in concert with the user's head movements. All-in-all, this is a very impressively engineered system. Be sure to check out the pictures and video below.
Like almost all roboticists, I'm a huge fan of robot movies. My common favorites include: I, Robot, Blade Runner, Iron Man, Short Circuit, AI, Wall-E, Hinokio, and so on. Well, there is a new Sci-Fi movie called "District 9" coming out this weekend that (based on previews) sports some impressive robotic systems -- particularly exoskeletons. The writer / director of this new movie is Neill Blomkamp, who has also produced numerous short films featuring robots (a few of which are shown below). In a pseudo-tradition, we're having a lab outing to a matinee showing of "District 9" this weekend. I'll be sure to let you know how it goes in the comments, but in the meantime check out the pictures and trailers below.