This is a guest essay by Fred Nikgohar. Fred is the CEO of RoboDynamics, makers of the TiLR robot, which was recently featured in a New York Times overview of telepresence robots. Fred argues that we've reached a watershed moment in robotics, facilitated by cheap 3D sensors like Microsoft's Kinect (i.e. a PrimeSense RGB-D camera) -- that the Kinect provides a roadmap where "the best solution to complex, low-cost sensing (or actuation for that matter) is to take advantage of affordable, mass-produced components, complementing them with the innovative use of software solutions that benefit from constantly declining prices of computation."
I would like to introduce you to iRobot's latest prototype: a new telepresence robot named AVA that was unveiled this week at the Consumer Electronics Show (CES 2011) -- see the video below. Through AVA, iRobot intends to explicitly leverage the proliferation of tablets and smartphones and their associated app stores. They intend to furnish the actuation (mobile base, pan-tilt unit, telescoping linear-actuator spine, etc.), a sensor suite (including sonar, laser, and a depth camera like Microsoft's Kinect), and basic robot software (e.g. obstacle avoidance, mapping, and direct physical interfaces). Meanwhile, you provide the brains in the form of a tablet (e.g. an iPad or Android device). In theory, this should open up mobile robot application development to a much broader audience, creating the oft-discussed robot app store. When combined with the recent announcement of the Scooba 230 floor-cleaning robot (which I will certainly purchase), I would say iRobot is still innovating!
Many folks visiting Hizook today are looking for "robot coffee machines" -- specifically, the Tassimo BrewBot by Bosch. In actuality, BrewBot is not a robot at all! Bosch is using a cute robot coffee machine to sell decidedly non-robot coffee makers, and it makes me sad... I want the actual robot! Either way, kudos to the clever marketers at Bosch for making such a great commercial (embedded below). In the meantime, if you're interested in seeing a robot that actually makes and delivers coffee, go check out the Nestlé Nespresso Nesbot. Dang it, now I've got a hankering for an espresso...
Take a moment and envision an electromagnet: a simple coiled wire driven by a hefty electrical current gives a fully-programmable magnetic field strength (on, off, and everything in between). Electromagnets are ubiquitous, but it turns out there is a little-known device with similar functionality yet zero static power consumption -- the electropermanent magnet, which has been around and in use since the 1960s! A 2010 PhD thesis by the MIT Media Lab's Ara Knaian examines the physics, scaling, trade-offs, and several new actuator designs (e.g. stepper motors) built on these little-known wonders. Recently, electropermanent magnets facilitated an innovation in "programmable matter," where they were instrumental in creating the world's smallest self-contained modular robots to date (12 mm per side). Read on for details about this fascinating technology, along with discussions of existing and possible robotic applications.
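The "fully programmable" half of that comparison is easy to quantify: an ideal solenoid's field scales linearly with drive current, which is exactly why an electromagnet must burn power continuously to hold a field. A back-of-the-envelope sketch, with illustrative numbers not taken from Knaian's thesis:

```python
# Field inside an ideal (long) solenoid: B = mu_0 * n * I.
# Illustrative numbers only -- not from the thesis.
MU_0 = 4e-7 * 3.141592653589793  # vacuum permeability (T*m/A)

def solenoid_field_tesla(turns_per_meter, current_amps):
    """Field strength inside an ideal solenoid, in teslas (linear in current)."""
    return MU_0 * turns_per_meter * current_amps

# 10,000 turns/m at 1 A -> roughly 12.6 mT; halve the current, halve the field.
b = solenoid_field_tesla(10_000, 1.0)
```

An electropermanent magnet sidesteps the linear-in-current power cost: a brief current pulse flips the magnetization of a low-coercivity magnet, and the field then persists with zero holding current.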
Heartland Robotics, the stealthy robotics startup founded by iRobot co-founder and robotics legend Rod Brooks, was in the news again last week after closing a $20M financing round. Little is known about the company beyond broad superlatives from executives about building robots to "increase productivity and revitalize manufacturing." Now, successful fundraising by a robotics startup is great news, but alone it was insufficient to draw my laser-focus away from thesis work. However, a Boston.com article this weekend provided a tantalizing new nugget of information that I absolutely must share -- Heartland is working on a new mobile manipulator with a $5,000 projected price point, complete with one or two arms, grippers, a sensor head, and a mobile base. If coupled with a depth camera (e.g. Kinect) and a decent computer, this could be a really compelling robot platform! If this price point is real, perhaps those superlatives aren't so inflated after all...
After many years of searching for the perfect telescoping linear actuator, I would like to share my discovery of the I-Lock Spiralift 75 (ILS75) prototype by Paco Spiralift. The ILS75 has a compact form factor (10x15x15 cm) that can telescope out to 1.6 meters while lifting a 175 kg load (350+ lbs). It relies on a system of interlocking horizontal and vertical metal bands that "unroll" to lift a load, a process best illustrated in the videos embedded below. The ILS75 is just one Spiralift offering; others range all the way up to a goliath version (Spiralift ND18) that can lift an 11,000 kg load 12 meters (25,000 lbs to 40 ft). To date, Spiralift mechanisms have been applied to theater stage-lift systems and automotive lifts. At Hizook, we believe robotics is a compelling application, especially as a robot's vertical spine (e.g. on EL-E, Cody, and PR2) -- increasing the robot's effective workspace to include the floor and tables, and also allowing compact and easy transportation. As such, we're working with Paco Spiralift to gauge roboticists' interest and vet the technical specs of the ILS75 -- tell us what you think in the comments.
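For a sense of why the form factor is notable, here's a quick sanity check on the quoted ILS75 specs (my own illustrative arithmetic, not Paco's numbers):

```python
# Sanity-check arithmetic on the ILS75 specs quoted above (illustrative only).
retracted_height_m = 0.15   # height of the compact package
extended_height_m = 1.6     # full telescoping stroke
load_kg = 175.0             # rated load

extension_ratio = extended_height_m / retracted_height_m   # ~10.7x its own height
lift_work_joules = load_kg * 9.81 * extended_height_m      # ~2.7 kJ per full lift
```

An extension ratio over 10x from a rigid-output mechanism is what sets the band-stacking approach apart from ordinary lead screws or scissor lifts.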
A few weeks ago, my labmates from Georgia Tech's Healthcare Robotics Lab presented a paper at IROS 2010 entitled, "Towards an Assistive Robot that Autonomously Performs Bed Baths for Patient Hygiene." Their work used Cody, a robot with compliant arms, and a specialized "bath mitt" end effector to perform wiping motions that could clean selected areas of an actual person's body, including the upper arm, forearm, thigh, and leg. In this robotic cleaning task, the robot initiated and actively made contact with a human. The psychological impact of such robot-initiated contact is an interesting question -- one I believe will be important for future healthcare and human-robot interaction (HRI) tasks. Read on for a video and discussion by the authors.
KUKA has developed an impressive array of omnidirectional robot platforms: OmniMove, OmniRob, and youBot. A new video on the youBot Store shows how an OmniMove holonomic base (containing eight mecanum wheels) can be transformed into a seriously heavy-lifting mobile manipulator through the addition of a huge Titan robot arm, which has been called the "world's strongest robot arm" and is capable of lifting 1000 kg. The video (embedded below) shows this latest platform towering over the smaller youBot platform. I wonder if this new platform would qualify for BattleBots...? It would make for a fun exposition match!
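As an aside on how mecanum wheels give holonomic motion: each wheel's angled rollers let it exert force at 45°, so combining the four (or eight) wheel speeds produces arbitrary planar velocities. A minimal sketch of the standard four-wheel mecanum inverse kinematics -- the OmniMove uses eight wheels and KUKA's actual geometry isn't public, so all dimensions here are made up for illustration:

```python
def mecanum_wheel_speeds(vx, vy, wz, r=0.1, lx=0.3, ly=0.25):
    """Inverse kinematics of a textbook four-mecanum-wheel base.

    vx, vy: body-frame linear velocity (m/s); wz: yaw rate (rad/s);
    r: wheel radius (m); lx, ly: half wheelbase / half track width (m).
    Returns wheel angular velocities (rad/s): (FL, FR, RL, RR).
    """
    k = lx + ly
    fl = (vx - vy - k * wz) / r
    fr = (vx + vy + k * wz) / r
    rl = (vx + vy - k * wz) / r
    rr = (vx - vy + k * wz) / r
    return fl, fr, rl, rr

# Pure sideways translation: the left and right wheel pairs counter-rotate.
speeds = mecanum_wheel_speeds(0.0, 1.0, 0.0)
```

The same math runs in reverse for odometry, which is one reason mecanum bases are so pleasant to control.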
Apparently novel robot end-effectors are popular this week (see the particle jamming robot grippers), as we've spotted another: a previously-unseen robot gripper from SRI International that uses an electrically controlled, reversible adhesion technology called electroadhesion. We've looked at SRI's electroadhesive wall-climbing robots before, where electrostatic forces support extreme loads with relatively little power consumption. Several friends and I ruminated about the possibility of embedding the electrodes in a robot's gripper to ease manipulation, but it seems SRI beat us to the punch. It also looks like they're developing general-purpose, highly-compliant electroadhesive pads for a variety of applications; according to the specifications, I should be able to walk up a wooden wall using a pad smaller than 16x16 inches while consuming less than 18 milliwatts -- cool stuff! Few details are currently available, so I will post updates in the comments as we learn more. In the meantime... pictures!
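Out of curiosity, here's a rough feasibility check on that wall-walking claim (my own illustrative arithmetic, not SRI's specs): supporting a person's weight in shear over a 16x16 inch pad works out to only a few kilopascals of adhesion pressure.

```python
# Rough shear-pressure estimate for the wall-walking claim (illustrative only).
body_mass_kg = 80.0               # assumed climber mass
pad_side_m = 16 * 0.0254          # 16 inches in meters
pad_area_m2 = pad_side_m ** 2     # ~0.165 m^2

# Shear pressure the pad must sustain to hold the climber's weight.
shear_pressure_pa = body_mass_kg * 9.81 / pad_area_m2   # ~4.8 kPa
```

A few kilopascals is well within the electroadhesive pressures SRI has reported for its wall-climbing robots, which makes the claim plausible on its face.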
Remember that compliant "jamming" end effector unveiled by Colin Angle (iRobot CEO) at TEDMED 2009? Even then, it was demonstrated picking up medication bottles, keys, and water bottles (a hand-held version was also demonstrated). Well, it just got a whole lot more official with the publication of "Universal robotic gripper based on the jamming of granular material" in the Proceedings of the National Academy of Sciences (PNAS). The cool thing about this method of grasping is its relative simplicity: a rubber sack (balloon) filled with coffee grounds is pressed onto an object so that it conforms to the object's natural contours, and then the air is pumped out (a volume change of less than 0.5%) to form a stable grasp -- no complex grasp planning required. Be sure to check out the new video and photos!
Dejan Pangercic of the Intelligent Autonomous Systems Group at TUM (Technische Universität München) wrote in to show us a cool dual-robot demonstration where a PR2 robot (TUM's James) and TUM-Rosie combine their efforts to prepare and deliver pancakes -- Yum! The demonstration system is quite impressive, featuring: door and drawer opening, object recognition, grasping and manipulation, navigation, multi-robot cooperation, etc. The demo seems to use a fair bit of stock ROS functionality, as well as some new functionality and CRAM integration (Cognitive Robot Abstract Machine, a reasoning framework from TUM). I'm anxious to learn more about the system: assumptions, limitations, and methods. Hopefully more advanced details are forthcoming. Check out the video below.
Yesterday Georgia Tech's PR2 robot made a LIVE appearance on CNN. The event was accompanied by interviews with Dr. Charlie Kemp (director of Georgia Tech's Healthcare Robotics Lab and my advisor) and Keenan Wyrobek (Willow Garage figurehead). Travis Deyle (yours truly) was also present and responsible for the robot demonstration. While some of the PR2's movements (some driving, waving to the audience, etc.) were scripted or teleoperated via joystick, the actual medication-delivery demonstration was fully autonomous and used UHF RFID sensing (a major component of my PhD research), the base laser rangefinder, and a slightly-modified TrajectoryPlannerROS. The demo went off without a hitch, and as Keenan mentioned on the PR2-Users mailing list, "Their demo is a milestone (albeit a gutsy one) for PR2. The first nationally televised, LIVE, sensor-based demo with a PR2." Check out the video (embedded below), as well as some behind-the-scenes pictures of the PR2 inside CNN's studio.
Tonight's episode of The Big Bang Theory, a sitcom about Caltech scientists and engineers, prominently featured a telepresence robot as "Shel-Bot." Specifically, the show featured Willow Garage's telepresence robot, Texai, which was covered in the NY Times just a few weeks ago. Apparently this is the second robot appearance in The Big Bang Theory's new season (one in each of the two new episodes) -- though I have yet to watch last week's episode with a "gratifying" robot manipulator. It's so nice to see real robots on TV, though I'm sure my wife could live without my excited banter and shutter clicks as I (literally) take screenshots with our DSLR camera (see shots below).
There is an interesting article in the Seattle Times about former Microsoft robotics evangelist Tandy Trower launching a new startup named Hoaloha Robotics. His goal is to create a $5k-10k personal robot (aka mobile manipulator) in the next five-to-ten years that can address the needs of older adults, such as telepresence activities and other healthcare tasks. Hoping to leverage cheap 3D sensing (like depth cameras a la Microsoft's Kinect) and inexpensive computing, this one-man (so far) company is another entrant in a new, budding market. Having been personally involved with the design, construction, programming, and brief home deployment of a mobile manipulator (EL-E), I can confidently say that Tandy & co. have their work cut out for them -- I wish them luck and success.
I would like to point out two news items involving telepresence robots that are definitely worth reading. First, a "manifesto" reprinted from the June 1980 issue of Omni magazine, in which artificial intelligence pioneer Marvin Minsky shares his views on telepresence (a term he originally coined). His essay includes a prediction of remote avatars (i.e. Surrogates), operation in hazardous or remote environments, and even a discussion of how little development had occurred since the 1950s (remember, his essay is from 1980; did you know that full-body exoskeletons were produced back in the 1950s?!). Second, a NYTimes article by John Markoff that discusses five top American contenders in the space: Vgo (Vgo Communications), Tilr (RoboDynamics), Texai (Willow Garage), RP-7i (InTouch Health), and QB (Anybots). The article captures the societal aspects and gives a good high-level overview but lacks meaty technology details (though the side-by-side photo montage is useful for direct comparison).
By now, you're probably familiar with the Nao humanoid robot from Aldebaran Robotics -- the robot that supplanted the Sony Aibo as the robot du jour for RoboCup's Standard Platform League (the international robot soccer competition) back in 2007 and retains that prestigious title to this day. Recently, Aldebaran announced a new Educational Partnership Program that aims to expose students in higher education to the joys of programming advanced robots. At the same time, Aldebaran announced a set of four product derivatives to match varied academic budgets, ranging from full humanoids, to upper-body manipulation rigs, to 2-DoF robot heads for audio-visual experimentation (see details below). Crucially, this new initiative provides a stable hardware platform with a comprehensive software suite (or, alternatively, extensive open-source ROS drivers) to match your educational, research, or just-plain-whimsical robot needs.
Apparently my hunch about the recent humanoid being the standard platform for the DARPA Autonomous Robot Manipulation Software (ARM-S) program was spot-on! A new blog post on ROS.org confirms that this is the DARPA "ARM Robot" and that there is a public contest to name the robot. The blog post gives a few hardware details: "The 'ARM Robot' has two Barrett WAM arms, BarrettHands, 6-axis force torque sensors at the wrist, and pan-tilt head. For sensors, it has a color camera, SwissRanger depth camera, stereo camera, and microphone." The program winners are also enumerated: Carnegie Mellon University, HRL Laboratories, iRobot, NASA-Jet Propulsion Laboratory, SRI International and University of Southern California. Be sure to check out the video of the (now confirmed) unnamed DARPA ARM-S robot platform embedded below. Updated Sept. 1st, 2010: This robot was integrated / developed by RE2, a Carnegie Mellon spin-off located in Pittsburgh, PA that specializes in agile defense robotics with an emphasis on intelligent mobile manipulation platforms.
Dr. Motilal Agrawal from the Artificial Intelligence Center at SRI International just sent an email to the robotics worldwide mailing list seeking qualified PhD or Masters job candidates (or interns) with experience in ROS, C++ / Python, and grasping / manipulation. In the email, Dr. Agrawal points to a movie that shows off a new humanoid robot being used at SRI that sports dual Barrett WAM arms, each with a Barrett three-fingered hand -- see the movie embedded below. I can't wait to see what SRI plans to do with its new robot; they always seem to do such thorough work. You'll notice that this design is becoming increasingly common, from Intel / CMU's HERB robot to Alexander Stoytchev's robot at Iowa State. Updated Aug. 31st, 2010: My hunch was correct; we just received confirmation that this robot is indeed the "standard" hardware platform for the DARPA ARM-S program and was developed / integrated by RE2.
I just stumbled across an amazing new video (embedded below) from Howie Choset's Biorobotics Laboratory at CMU of a teleoperated snake robot climbing a tree. While I have seen a lot of snake robots built over the years, including some amphibious versions that can swim, this is the first time I have seen one climbing a tree -- a task that some biological species do amazingly well! This is clearly a case of personal ignorance; other snake robots from the Biorobotics lab have been performing similar feats for years, as evidenced by videos from 2008 (also embedded below). However, I was sufficiently captivated by the new and old videos to share them with you.
Dr. Aaron Dollar of Yale's GRAB Lab was recently awarded the prestigious "MIT Tech Review 2010 Young Innovators Under 35" award, better known as TR35, for his work on building flexible robot hands through shape deposition manufacturing (SDM). The SDM process allows multiple materials to be integrated into a single mechanism, including soft finger pads, compliant joints, rigid members, sensors, and even tubes to run wires and cables. In fact, this is the same (or a similar) process by which the Meka Robotics H2 Hand (e.g. on Simon) is constructed. Anyway, this is a promising trend for robotics research; TR35 seems to consistently recognize the contributions of top roboticists, such as Andrea Thomaz (2009), Andrew Ng (2008), Robert Wood (2008), Josh Bongard (2007), etc. Congratulations Aaron!
Today Velodyne Lidar introduced the HDL-32E, a new laser rangefinder with 32 simultaneously-operating laser beams that cumulatively output up to 800,000 points per second. The new laser rangefinder provides full 360° scans at up to 20 Hz, with ranges from 5 cm to 100 meters and a vertical field of view from +10° to -30° (datasheet). The entire device is very compact at just 8.5 cm in diameter and 15 cm tall -- not much larger than a soda can! The HDL-32E has a list price of $29,900 and is expected to ship in the next few weeks, apparently to meet a pretty hefty initial demand. It is the successor to the Velodyne HDL-64E, a wildly successful device that was pivotal for many DARPA Grand Challenge autonomous cars and even saw applications in cutting-edge music videos. I have high hopes for this new LIDAR.
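A quick back-of-the-envelope from the quoted specs: at the maximum rates, each of the 32 lasers samples 1,250 points per revolution, i.e. roughly 0.29° of azimuth resolution (the device's actual firing pattern may differ; this is just arithmetic on the published numbers):

```python
# Azimuth resolution implied by the HDL-32E's published throughput figures.
points_per_second = 800_000
scan_rate_hz = 20       # revolutions per second at the maximum spin rate
num_lasers = 32

points_per_rev = points_per_second / scan_rate_hz        # 40,000 points/rev
points_per_laser_per_rev = points_per_rev / num_lasers   # 1,250 per laser
azimuth_resolution_deg = 360 / points_per_laser_per_rev  # 0.288 degrees
```

Spinning slower than 20 Hz trades frame rate for proportionally finer azimuth resolution, since the point throughput stays fixed.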
While Willow Garage made an important announcement about the forthcoming commercial availability of PR2 robots earlier this week, I want to focus your attention on something a bit more whimsical. At the PR2 launch party, Willow Garage founder Scott Hassan was throwing around the idea of a PR2 video competition for PR2 Beta Program recipients, complete with a substantial cash prize. True to his word, Scott set up a rules and video-submission site; in a nutshell: the competition deadline was Aug. 17th, there was $10k in aggregate prize money, and entries were to be judged by Scott, his wife, and his children. Today the results were announced on the pr2-users mailing list. You can find (all?) the submitted videos, including the winners, embedded below -- check 'em out and let us know which is your favorite in the comments!
Two weeks ago, Engadget / CrunchGear posted videos of RAPUDA (Robotic Arm for Persons with Upper limb DisAbilities) from AIST's Intelligent Systems Research Institute -- a wheelchair-mounted, light-weight robot arm with a prominent telescoping link that was demonstrated grasping a cup from a table, lifting the cup for drinking, and grasping an object from the floor via teleoperation (video embedded below). Given my proclivity for clever mechanisms, I wanted details about the telescoping link, specifically to determine how it compares to the Geosystems Situational Awareness Mast (aka Zippermast). Well, I found what I was looking for: a Japanese patent application for "Linearly Moving Extendable Mechanism and Robot Arm Equipped with Linearly Moving Extendable Mechanism." Basically, the telescoping segment consists of a series of small interlocking modules that are expelled (or reeled-in) through the "shoulder" link. Check out the pictures -- cool stuff!
Electrotactile arrays are a lesser-known form of human-machine interface (HMI) that apply electric current to skin-contacting surface electrodes to excite cutaneous nerves and give the illusion of texture, pressure, or pinpricks (depending on current strength and electrode resolution), all without mechanical vibration. The technique has been around for many years in applications such as non-visual fighter-pilot status displays, tongue interfaces, surgery guides, and forehead-mounted camera displays for the blind. Enough background... The exciting news is a recent product developed by Senseg and Toshiba Information Systems called "E-Sense" that successfully embeds an electrotactile display into a touchpad, LCD, or other curved surface (e.g. wrapped all over a cellphone), thereby providing programmable, high-resolution texture feedback to the user -- see the video embedded below. I would wager that this feedback could greatly enhance shared haptic awareness in teleoperation / telemanipulation systems.
PlasticPals just pointed out DLR's 10-month effort to build a biped robot -- an effort that yielded a 1-meter, 50 kg walking robot (video below). Mechanically, each leg has six degrees of freedom: a DLR / KUKA Light-Weight Robot (LWR) arm segment serves as the upper leg, and a custom lower-leg segment connects to the foot through a six-axis force-torque sensor. Realtime control algorithms and dynamic simulations are performed using OpenHRP3 and Simpack. DLR claims that this is the "first electromechanically actuated bipedal robot with torque-controlled joints," through which they intend to research compliant impedance control for biped locomotion. I share PlasticPals' musings: could these legs ultimately transform Justin into a bipedal walking humanoid?
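For readers unfamiliar with torque-controlled joints: a joint-level impedance law makes each motor emulate a programmable spring-damper, which is what lets a leg "give" on unexpected contact instead of fighting it. A minimal single-joint sketch (the function name and gains are my own illustrations, not DLR's controller):

```python
def impedance_torque(q, qd, q_des, qd_des=0.0, k=100.0, d=10.0, tau_ff=0.0):
    """Joint impedance law: tau = K*(q_des - q) + D*(qd_des - qd) + tau_ff.

    The motor emulates a spring-damper about q_des; lowering K makes the
    joint more compliant. Gains here are illustrative placeholders.
    """
    return k * (q_des - q) + d * (qd_des - qd) + tau_ff

# 0.05 rad of position error with k=100 yields a gentle ~5 N*m restoring torque.
tau = impedance_torque(q=0.45, qd=0.0, q_des=0.5)
```

On hardware like the LWR, the measured joint torques close an inner loop so the output actually tracks this commanded spring-damper behavior despite gearbox friction.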