Dejan Pangercic of the Intelligent Autonomous Systems Group at TUM (Technische Universität München) wrote in to show us a cool dual-robot demonstration where a PR2 robot (TUM's James) and TUM-Rosie combine their efforts to prepare and deliver pancakes -- Yum! The demonstration system is quite impressive, featuring: door and drawer opening, object recognition, grasping and manipulation, navigation, multi-robot cooperation, etc. The demo seems to use a fair bit of stock ROS functionality, as well as some new functionality and CRAM integration (Cognitive Robot Abstract Machine, a reasoning framework from TUM). I'm anxious to learn more about the system: assumptions, limitations, and methods. Hopefully more advanced details are forthcoming. Check out the video below.
Yesterday Georgia Tech's PR2 robot made a LIVE appearance on CNN. The event was accompanied by interviews of Dr. Charlie Kemp (director of Georgia Tech's Healthcare Robotics Lab and my advisor) and Keenan Wyrobek (Willow Garage figurehead). Travis Deyle (yours truly) was also present and responsible for the robot demonstration. While some of the PR2's movements (some driving, waving to the audience, etc) were scripted or teleoperated via joystick, the actual medication delivery demonstration was fully autonomous and used UHF RFID sensing (a major component of my PhD research), the base laser rangefinder, and a slightly-modified TrajectoryPlannerROS. The demo went off without a hitch, and as Keenan mentioned on the PR2-Users mailing list, "Their demo is a milestone (albeit a gutsy one) for PR2. The first nationally televised, LIVE, sensor-based demo with a PR2." Check out the video (embedded below), as well as some behind-the-scenes pictures of the PR2 inside CNN's studio.
Tonight's episode of Big Bang Theory, a sitcom about Caltech scientists and engineers, prominently featured a telepresence robot as "Shel-Bot." Specifically, the show featured Willow Garage's telepresence robot, Texai, which was covered in the NY Times just a few weeks ago. Apparently this is the second robot appearance in Big Bang Theory's new season (one in each of the two new episodes) -- though I have yet to watch last week's episode with its "gratifying" robot manipulator. It's so nice to see real robots on TV, though I'm sure my wife could live without my excited banter and shutter clicks as I (literally) take screenshots with our DSLR camera (see shots below).
There is an interesting article in the Seattle Times about former Microsoft robotics evangelist, Tandy Trower, launching a new startup named Hoaloha Robotics. His goal is to create a $5k-10k personal robot (aka mobile manipulator) in the next five-to-ten years that can address the needs of older adults through telepresence activities and other healthcare tasks. Hoping to leverage cheap 3D sensing (like depth cameras a la Microsoft's Kinect) and inexpensive computing, this one-man (so far) company is another entrant in a new, budding market. Having been personally involved with the design, construction, programming, and brief home-deployment of a mobile manipulator (EL-E), I can confidently say that Tandy & co. have their work cut out for them -- I wish them luck and success.
I would like to point out two news items involving telepresence robots that are definitely worth reading. First, a "manifesto" reprinted from the June 1980 issue of Omni magazine in which artificial intelligence pioneer Marvin Minsky shares his views on telepresence (a term he originally coined). His essay includes a prediction of remote avatars (i.e., Surrogates), operation in hazardous / remote environments, and even a discussion of how little development had occurred since the 1950s (remember, his essay is from 1980; did you know that full-body exoskeletons were produced back in the 1950s?!). Second, a NYTimes article by John Markoff that discusses five top American contenders in the space: Vgo (Vgo Communications), Tilr (RoboDynamics), Texai (Willow Garage), RP-7i (InTouch Health), and QB (Anybots). The article captures the societal aspects and gives a high-level overview but lacks meaty technology details (though the side-by-side photo montage is useful for direct comparison).
By now, you're probably familiar with the Nao humanoid robot from Aldebaran Robotics -- the robot that supplanted the Sony Aibo as the robot du jour for Robocup's Standard Platform League (international robot soccer competition) back in 2007 and holds that prestigious title to this day. Recently, Aldebaran announced a new Educational Partnership Program that aims to expose students in higher education to the joys of programming advanced robots. Contemporaneously, Aldebaran announced a set of four product derivatives to match varied academic budgets, ranging from full humanoids to upper-body manipulation rigs to 2-DoF robot heads for audio-visual experimentation (see details below). Crucially, this new initiative provides a stable hardware platform with a comprehensive software suite (alternatively, extensive open-source ROS drivers) to match your educational, research, or just whimsical robot needs.
Apparently my hunch about the recent humanoid being the standard platform for the DARPA Autonomous Robot Manipulation Software (ARM-S) program was spot-on! A new blog post on ROS.org confirms that this is the DARPA "ARM Robot" and that there is a public contest to name the robot. The blog post gives a few hardware details: "The 'ARM Robot' has two Barrett WAM arms, BarrettHands, 6-axis force torque sensors at the wrist, and pan-tilt head. For sensors, it has a color camera, SwissRanger depth camera, stereo camera, and microphone." The program winners are also enumerated: Carnegie Mellon University, HRL Laboratories, iRobot, NASA-Jet Propulsion Laboratory, SRI International and University of Southern California. Be sure to check out the video of the (now confirmed) unnamed DARPA ARM-S robot platform embedded below. Updated Sept. 1st, 2010: This robot was integrated / developed by RE2, a Carnegie Mellon spin-off located in Pittsburgh, PA that specializes in agile defense robotics with an emphasis on intelligent mobile manipulation platforms.
Dr. Motilal Agrawal from the Artificial Intelligence Center at SRI International just sent an email to the robotics worldwide mailing list seeking qualified PhD or Master's job candidates (or interns) with experience in ROS, C++ / Python, and grasping / manipulation. In the email, Dr. Agrawal points to a movie that shows off a new humanoid robot being used at SRI that sports dual Barrett WAM arms, each with a Barrett three-fingered hand -- see the movie embedded below. I can't wait to see what SRI plans to do with its new robot; they always seem to do such thorough work. You'll notice that this design is becoming increasingly common, from Intel / CMU's HERB robot to Alexander Stoytchev's robot at Iowa State. Updated Aug. 31st, 2010: My hunch was correct; we just received confirmation that this robot is indeed the "standard" hardware platform for the DARPA ARM-S program and was developed / integrated by RE2.
I just stumbled across an amazing new video (embedded below) from Howie Choset's Biorobotics Laboratory at CMU of a teleoperated snake robot climbing a tree. While I have seen a lot of snake robots built over the years, including some amphibious versions that can swim, this is the first time I have seen one climbing a tree -- a task that some biological species do amazingly well! This is clearly a case of personal ignorance; other snake robots from the Biorobotics lab have been performing similar feats for years, as evidenced by videos from 2008 (also embedded below). However, I was sufficiently captivated by the new and old videos to share them with you.
Dr. Aaron Dollar of Yale's GRAB Lab was recently awarded the prestigious "MIT Tech Review 2010 Young Innovators Under 35" award, better known as TR35, for his work on building flexible robot hands through shape deposition manufacturing (SDM). The SDM process allows multiple materials to be integrated into a single mechanism, including soft finger pads, compliant joints, rigid members, sensors, and even tubes to run wires and cables. In fact, this is the same (or a similar) process by which the Meka Robotics H2 Hand (eg. on Simon) is constructed. Anyway, this is a promising trend for robotics research; TR35 seems to consistently recognize the contributions of top roboticists, such as Andrea Thomaz (2009), Andrew Ng (2008), Robert Wood (2008), Josh Bongard (2007), etc. Congratulations Aaron!
Today Velodyne Lidar introduced the HDL-32E, a new laser rangefinder with 32 simultaneously-operating laser beams that cumulatively output up to 800,000 points per second. The new laser rangefinder provides full 360° scans at up to 20 Hz with ranges from 5 cm to 100 meters and a vertical field of view from +10° to -30° (datasheet). The entire device is very compact at just 8.5 cm in diameter and 15 cm tall -- not much larger than a soda can! The HDL-32E has a list price of $29,900 and is expected to ship in the next few weeks, apparently to meet a pretty hefty initial demand. The new laser rangefinder is the successor to the Velodyne HDL-64E, a hugely successful device that was pivotal for many DARPA Grand Challenge autonomous cars and even saw applications in cutting-edge music videos. I have high hopes for this new LIDAR.
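The published specs let you sanity-check the angular resolution with some back-of-the-envelope arithmetic -- a quick sketch, assuming the 800,000 points/sec figure is the aggregate across all 32 lasers firing at a uniform rate:

```python
# Back-of-the-envelope HDL-32E resolution check (assumes the 800k pts/sec
# spec is aggregate across all 32 lasers, uniformly distributed).
POINTS_PER_SEC = 800_000
NUM_LASERS = 32
SPIN_RATE_HZ = 20        # fastest advertised rotation rate

points_per_laser = POINTS_PER_SEC / NUM_LASERS     # points/sec from one laser
points_per_rev = points_per_laser / SPIN_RATE_HZ   # points per 360° sweep
azimuth_res_deg = 360.0 / points_per_rev           # horizontal angular spacing

print(points_per_laser, points_per_rev, azimuth_res_deg)
```

Under those assumptions, each laser contributes 1,250 points per revolution at 20 Hz, for roughly 0.29° of horizontal resolution -- and spinning slower would proportionally tighten that spacing.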
While Willow Garage made an important announcement about the forthcoming commercial availability of PR2 robots earlier this week, I want to focus your attention on something a bit more whimsical. At the PR2 launch party, Willow Garage founder (Scott Hassan) was throwing around the idea of a PR2 video competition for PR2 Beta Program recipients, complete with a substantial cash prize. True to his word, Scott set up a rules / video submission site; in a nutshell: the competition deadline was Aug. 17th, had $10k in aggregate prize money, and was to be judged by Scott, his wife, and his children. Today the results were announced on the pr2-users mailing list. You can find (all?) the submitted videos, including the winners, embedded below -- check 'em out and let us know which is your favorite in the comments!
Two weeks ago, Engadget / CrunchGear posted videos of RAPUDA (Robotic Arm for Persons with Upper limb DisAbilities) from AIST's Intelligent Systems Research Institute -- a wheelchair-mounted, lightweight robot arm with a prominent telescoping link that was demonstrated grasping a cup from a table, lifting the cup for drinking, and grasping an object from the floor via teleoperation (video embedded below). Given my proclivity for clever mechanisms, I wanted details about the telescoping link, specifically to determine how it compares to the Geosystems Situational Awareness Mast (aka Zippermast). Well, I found what I was looking for: a Japanese patent application for "Linearly Moving Extendable Mechanism and Robot Arm Equipped with Linearly Moving Extendable Mechanism." Basically, the telescoping segment consists of a series of small interlocking modules that are expelled (or reeled-in) through the "shoulder" link. Check out the pictures -- cool stuff!
Electrotactile arrays are a lesser-known form of human-machine interface (HMI) that apply electric current to skin-contacting surface electrodes to excite cutaneous nerves and give the illusion of texture, pressure, or pinpricks (depending on current strength and electrode resolution), all without mechanical vibration. The technique has been around for many years, appearing in non-visual fighter pilot status displays, tongue interfaces, surgery guides, and forehead-mounted camera displays for the blind. Enough background... The exciting news is a recent product developed by Senseg and Toshiba Information Systems called "E-Sense" that successfully embeds an electrotactile display into a touchpad, LCD, or other curved surface (eg. all over a cellphone), thereby providing programmable high-resolution texture feedback to a user -- see the video embedded below. I would wager that this feedback could greatly enhance haptic shared awareness in teleoperation / telemanipulation systems.
PlasticPals just pointed out DLR's 10-month effort to build a biped robot -- an effort that yielded a 1-meter, 50 kg walking robot (video below). Mechanically, each leg has six degrees of freedom. A DLR / Kuka Light-Weight Robot (LWR) arm segment forms the upper leg, and a custom lower-leg segment connects to the foot through a six-axis force-torque sensor. Realtime control algorithms and dynamic simulations are performed using OpenHRP3 and Simpack. DLR claims that this is the "first electromechanically actuated bipedal robot with torque controlled joints," through which they intend to research compliant impedance control for biped locomotion. I share PlasticPals' musings: could these legs ultimately transform Justin into a bipedal walking humanoid?
I recently learned that the holonomic mobile base developed at the University of Bonn's Autonomous Intelligent Systems Lab (NimbRo@Home) for the Dynamaid robot has become commercially available. It is officially called the VolksBot Omni and is being sold in Fraunhofer's VolksBot line for 9000 EUR (~$11,700 USD). Fundamentally, the VolksBot Omni is a powered-caster omnidirectional robot base (similar to the PR2 or Justin robots), except that its actuators are exclusively Robotis Dynamixel servos -- four modules, each with two EX-106 servos for drive torque and one RX-64 for module steering. The base is fairly lightweight (around 5 kg), but sports a 40 x 60 cm chassis that supports a 20 kg payload. It has a top speed of 50 cm/sec, is controlled via USB, and has ROS / Player drivers.
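To get a feel for how a powered-caster base like this is commanded, here is a minimal kinematics sketch -- not the VolksBot driver API; the function name and module layout are illustrative -- that maps a desired chassis twist to a steering angle and drive speed for each caster module:

```python
import math

def module_commands(vx, vy, wz, module_positions):
    """Map a desired chassis twist (vx, vy in m/s, wz in rad/s) to a
    (steering angle [rad], drive speed [m/s]) pair for each caster module,
    using the rigid-body velocity evaluated at each module's mount point."""
    cmds = []
    for (x, y) in module_positions:
        mvx = vx - wz * y    # x-velocity of the mount point
        mvy = vy + wz * x    # y-velocity of the mount point
        cmds.append((math.atan2(mvy, mvx), math.hypot(mvx, mvy)))
    return cmds

# Hypothetical module layout: four corners of a 40 x 60 cm chassis (meters)
corners = [(0.3, 0.2), (0.3, -0.2), (-0.3, 0.2), (-0.3, -0.2)]

# Pure forward translation: every module steers straight ahead at 0.5 m/s
print(module_commands(0.5, 0.0, 0.0, corners))
```

A real driver would additionally handle steering-angle wrap-around (flipping a module 180° and reversing its drive direction rather than swinging the long way), but the velocity mapping above is the core of holonomic control.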
In our first request for assistance, we asked you to contribute / submit articles related to your own projects, ideas, and research. Here we'll ask you to consider other ways of lending Hizook a hand: provide a modicum of financial support by using Hizook's affiliates (eg. Amazon and Trossen Robotics) when making robotics purchases; sign up for a Hizook user account; or just become an active commenter. Again, our goal is to foster a community for academics and professionals that promotes informal, yet educated, robotics discussions outside of rigid peer-reviewed conference and journal settings. We cannot hope to do this alone; your assistance is crucial!
I'm consistently surprised by the outpouring of support and enthusiasm from you, our Hizook readers, about this site and its content -- it's abundantly clear that academic and professional roboticists would welcome a commons for informal, yet educated, robotics discussions outside rigid peer-reviewed conference and journal settings. Hizook was founded to fill that role. However, keeping up with worldwide robotics news would be a full-time endeavor -- a role to which I alone could not possibly do justice, especially accounting for my own research aspirations. Thus, Hizook is asking for your help... There are lots of ways to assist! In this first installment, we'll look at the most pressing manner in which Hizook requests your assistance: contribute / submit articles related to your own projects, ideas, and research.
NASA's $2.3B Mars Science Laboratory (MSL) robot known as Curiosity took its first test drive on Friday inside a NASA cleanroom, moving about 1 meter. As the successor to two wildly successful Mars rovers (Spirit and Opportunity), NASA has high hopes for Curiosity, which weighs as much as a small SUV, has a six-wheel rocker-bogie suspension about waist height, and is nuclear powered via a radioisotope thermoelectric generator (RTG). Curiosity is an amazing piece of technology in its own right, and even more impressive when considering its marvelous sensor payload. Personally, I'm proud to see my tax dollars being used for such impressive scientific pursuits. Check out Curiosity's first test drive in the video below.
Perching is one of the most common aerobatic maneuvers executed by birds and is representative of a large and important class of aggressive aerial maneuvers that take advantage of unsteady aerodynamics. During a perching maneuver, birds often exceed 90 degrees in angle-of-attack, exploiting both viscous and pressure drag for rapid deceleration. Russ Tedrake and Rick Cory at MIT's Robot Locomotion Group have drawn inspiration from these insane maneuvers by developing a gliding UAV that can perform perching -- eventually (presumably) allowing a UAV to perch and recharge on powerlines. This is an impressive feat on many levels: the physics (semi-turbulent flow, visualized in their photos), the controls (high-speed maneuvers, non-linear dynamics, and real-time constraints), and the application (the eventual integration of powerline recharging). Be sure to check out the photos and videos!
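To see why high-angle-of-attack flight enables such rapid deceleration, consider the standard drag equation F = ½ρv²C_dA: near 90° angle-of-attack the wing behaves like a flat plate with a very large drag coefficient. Here is an illustrative estimate with hypothetical numbers (not the actual MIT glider's parameters) for a small foam glider presenting its wing broadside to the flow:

```python
# Illustrative drag-deceleration estimate for a perching glider.
# All parameters are hypothetical, not the MIT vehicle's actual values.
RHO = 1.225      # air density at sea level, kg/m^3
v = 8.0          # airspeed entering the perch maneuver, m/s
cd = 1.9         # drag coefficient of a flat plate near 90 deg angle-of-attack
area = 0.10      # wing area presented to the flow, m^2
mass = 0.08      # vehicle mass, kg

drag_force = 0.5 * RHO * v**2 * cd * area   # drag equation, in Newtons
decel = drag_force / mass                   # resulting deceleration, m/s^2
print(f"{drag_force:.2f} N -> {decel:.0f} m/s^2 ({decel / 9.81:.1f} g)")
```

With these numbers the vehicle sheds speed at roughly 9-10 g, which is why the maneuver is over in a fraction of a second and why the controller must cope with such fast, non-linear dynamics.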
CNET's Road Trip 2010 series dropped by iRobot HQ, where "Cool Stuff" is aptly emblazoned on the doors. They snapped a number of interesting photos of lesser-known iRobot history / robots, including: underwater gliders, previously unseen chembot prototypes, and Landroids (mini-Packbots). But perhaps the most interesting nugget was the Roomba testing lab, where Roombas scuttle around for 1500-2000 hours of durability testing across various floor types with different levels of dirtiness -- check out the video below. The multi-floor testing is uncannily similar to that which we performed on the Roomba Dustpan robot.
I recently became aware of an effort by ISO (International Organization for Standardization) to define a standard for domestic service robots -- more specifically, ISO-13482 "Safety requirements for non-medical personal care robots." I must confess having mixed feelings about this development. On one hand, it is exciting that the personal robotics revolution is near-enough at hand to warrant the definition of a standard -- there are many standards for industrial robots (eg. ISO-10218 and ISO-9409), but none for domestic personal robots. On the other hand, I'm a bit concerned that a somewhat-binding international standard is being developed prematurely and in a rather closed-door fashion -- issues upon which I will elaborate below. Thankfully, there will be plenty of discussion at IROS 2010 (Taipei, Taiwan in mid-October) at the "Workshop on Standardization for Service Robots." Lack of resources will likely preclude my attendance, so perhaps someone can fill us in after the fact...?
FastCompany spotted a new version of HERB (Home Exploring Robot Butler) at the CMU Quality of Life Technology Center. HERB is a joint effort between Intel Research's Personal Robotics Program and Carnegie Mellon University. The new version sports two Barrett WAM arms on a Segway RMP mobile base and has a very distinctive rotating (instead of tilting) planar laser rangefinder. The new HERB certainly has a unique design -- be sure to check out the photos and video below where HERB grasps objects from a table.
Hizook previously covered a number of DARPA Chembot projects, including Dr. Hong's Whole-Skin Locomotion (aka amoeba robot) and the iRobot Jamming Skin robot (aka "blob bot"). The original blob bot was rather creepy, but researchers from iRobot, MIT, and Harvard have ameliorated the situation by creating a decidedly non-creepy successor: a Chembot with soft (silicone?) selectively-inflatable body segments for locomotion. Hopefully a fully-integrated version (power, actuation, and control) is near at hand. Read on for photos and a video of the new prototype.
Back in late March, Hizook provided an overview of various depth cameras (aka range cameras, 3D cameras, time-of-flight cameras, RGB-D cameras), including the PrimeSense solution now known to be the basis of Microsoft's Kinect (formerly Project Natal). In the last 3 months, the depth camera space has seen numerous updates, such as additional commercial offerings and product updates / availability. Perhaps the most exciting news (as we speculated in March) is that low-cost offerings will indeed be hitting the market later this year: Microsoft recently confirmed that Kinect (formerly Project Natal) will start shipping in early November and is already available for pre-order on Amazon.com for $150 USD! More questions than answers remain -- here is what we know, help us fill in the gaps...