After many years of searching for the perfect telescoping linear actuator, I would like to share my discovery of the I-Lock Spiralift 75 (ILS75) prototype by Paco Spiralift. The ILS75 has a compact form factor (10x15x15 cm) that can telescope out to 1.6 meters while lifting a 175 kg load (350+ lbs). It relies on a system of interlocking horizontal and vertical metal bands that "unroll" to lift a load, a process best illustrated in the videos embedded below. The ILS75 is just one Spiralift offering; others range all the way up to a goliath version (Spiralift ND18) that can lift an 11,000 kg load 12 meters (25,000 lbs to 40 ft). To date, Spiralift mechanisms have been applied to theater stage-lift systems and automotive lifts. At Hizook, we believe robotics is a compelling application, especially as a robot's vertical spine (eg. on EL-E, Cody, and PR2) -- increasing the robot's effective workspace to include the floor and tables, and also allowing for compact and easy transportation. As such, we're working with Paco Spiralift to gauge roboticists' interest and vet the technical specs of the ILS75 -- tell us what you think in the comments.
A few weeks ago, my labmates from Georgia Tech's Healthcare Robotics Lab presented a paper at IROS 2010 entitled, "Towards an Assistive Robot that Autonomously Performs Bed Baths for Patient Hygiene." Their work used Cody, a robot with compliant arms, and a specialized "bath mitt" end effector to perform wiping motions that could clean selected areas of an actual person's body, including the upper arm, forearm, thigh, and leg. In this robotic cleaning task, the robot initiated and actively made contact with a human. The psychological impact of such robot-initiated contact is an interesting question -- one I believe will be important for future healthcare and human-robot interaction (HRI) tasks. Read on for a video and discussion by the authors.
KUKA has developed an impressive array of omnidirectional robot platforms: OmniMove, OmniRob, and youBot. A new video on the youBot Store shows how an OmniMove holonomic base (containing eight mecanum wheels) can be transformed into a seriously heavy-lifting mobile manipulator through the addition of a huge Titan robot arm, which has been called the "world's strongest robot arm" and is capable of lifting 1000 kg. The video (embedded below) shows this latest platform towering over the smaller youBot platform. I wonder if this new platform would qualify for BattleBots...? It would make for a fun exposition match!
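Mecanum wheels achieve holonomy because each wheel's 45-degree rollers contribute force along a diagonal; mixing the individual wheel speeds lets the base translate and rotate independently. As a rough illustration (not KUKA's actual geometry -- the OmniMove uses eight wheels, and the radius and base dimensions below are placeholders), here is the textbook four-wheel inverse kinematics:

```python
def mecanum_wheel_speeds(vx, vy, wz, r=0.1, lx=0.25, ly=0.2):
    """Inverse kinematics for a standard four-mecanum-wheel base.

    vx, vy: base velocity (m/s); wz: yaw rate (rad/s);
    r: wheel radius (m); lx, ly: half wheelbase / half track (m).
    Returns wheel angular velocities [fl, fr, rl, rr] in rad/s.
    """
    k = lx + ly
    return [(vx - vy - k * wz) / r,  # front-left
            (vx + vy + k * wz) / r,  # front-right
            (vx + vy - k * wz) / r,  # rear-left
            (vx - vy + k * wz) / r]  # rear-right

# Driving straight ahead spins all four wheels equally...
print(mecanum_wheel_speeds(1.0, 0.0, 0.0))
# ...while strafing sideways reverses the front-left / rear-right pair.
print(mecanum_wheel_speeds(0.0, 1.0, 0.0))
```

Sign conventions vary by vendor, but the structure -- every base twist maps linearly to four wheel speeds -- is what makes holonomic motion possible.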
Apparently novel robot end-effectors are popular this week (see the particle jamming robot grippers), as we've spotted another: a previously-unseen robot gripper from SRI International that uses an electrically controlled, reversible adhesion technology called electroadhesion. We've looked at SRI's electroadhesive wall-climbing robots before, where electrostatic forces are able to support extreme loads with relatively little power consumption. Several friends and I ruminated about the possibility of embedding the electrodes in a robot's gripper to ease manipulation, but it seems SRI beat us to the punch. It also looks like they're developing general-purpose, highly-compliant electroadhesive pads for a variety of applications; according to the specifications, I should be able to walk up a wood wall using a pad smaller than 16x16 inches while consuming less than 18 milliwatts -- cool stuff! Few details are currently available, so I will post updates in the comments as we learn more. In the meantime... pictures!
Remember that compliant "jamming" end effector unveiled by Colin Angle (iRobot CEO) at TEDMED 2009? Even then, it was demonstrated picking up medication bottles, keys, and water bottles (a hand-held version was also demonstrated). Well, it just got a whole lot more official with the publication of "Universal robotic gripper based on the jamming of granular material" in the Proceedings of the National Academy of Sciences (PNAS). The cool thing about this method of grasping is its relative simplicity: a rubber sack (balloon) filled with coffee grounds is pressed onto an object, it conforms to the object's natural contours, and the air is pumped out (a volume change of less than 0.5%) to form a stable grasp -- no complex grasp planning required. Be sure to check out the new video and photos!
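Much of the appeal is that the press-conform-evacuate sequence needs almost no software sophistication. Here is a minimal sketch of that control sequence against a simulated stand-in -- the SimGripper class and its methods are entirely hypothetical, not iRobot's or the authors' actual interface:

```python
class SimGripper:
    """Toy stand-in for a jamming-gripper hardware interface (hypothetical)."""
    def __init__(self):
        self.z = 0.1          # membrane height above the object (m)
        self.jammed = False

    def read_force(self):     # crude contact model: force appears at contact
        return 0.0 if self.z > 1e-6 else 5.0

    def press(self, dz):      # move the membrane toward the object
        self.z += dz

    def vacuum_on(self):      # evacuate air (<0.5% volume change per the paper)
        self.jammed = True    # grains jam into a rigid mold of the object

    def lift(self, dz):
        self.z += dz


def jamming_grasp(g, contact_force=2.0):
    # Phase 1: press the grain-filled membrane until it conforms to the object
    while g.read_force() < contact_force:
        g.press(-0.001)       # descend 1 mm per step
    # Phase 2: pump out the air so the granular material jams solid
    g.vacuum_on()
    # Phase 3: lift; the jammed grains hold the object -- no grasp planning
    g.lift(0.05)
    return g.jammed


g = SimGripper()
print(jamming_grasp(g))  # True once the grasp sequence completes
```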
Dejan Pangercic of the Intelligent Autonomous Systems Group at TUM (Technische Universität München) wrote in to show us a cool dual-robot demonstration where a PR2 robot (TUM's James) and TUM-Rosie combine their efforts to prepare and deliver pancakes -- Yum! The demonstration system is quite impressive, featuring: door and drawer opening, object recognition, grasping and manipulation, navigation, multi-robot cooperation, etc. The demo seems to use a fair bit of stock ROS functionality, as well as some new functionality and CRAM integration (Cognitive Robot Abstract Machine, a reasoning framework from TUM). I'm anxious to learn more about the system: assumptions, limitations, and methods. Hopefully more advanced details are forthcoming. Check out the video below.
Yesterday Georgia Tech's PR2 robot made a LIVE appearance on CNN. The event was accompanied by interviews of Dr. Charlie Kemp (director of Georgia Tech's Healthcare Robotics Lab and my advisor) and Keenan Wyrobek (Willow Garage figurehead). Travis Deyle (yours truly) was also present and responsible for the robot demonstration. While some of the PR2's movements (some driving, waving to the audience, etc) were scripted or teleoperated via joystick, the actual medication delivery demonstration was fully autonomous and used UHF RFID sensing (a major component of my PhD research), the base laser rangefinder, and a slightly-modified TrajectoryPlannerROS. The demo went off without a hitch, and as Keenan mentioned on the PR2-Users mailing list, "Their demo is a milestone (albeit a gutsy one) for PR2. The first nationally televised, LIVE, sensor-based demo with a PR2." Check out the video (embedded below), as well as some behind-the-scenes pictures of the PR2 inside CNN's studio.
Tonight's episode of Big Bang Theory, a sitcom about Caltech scientists and engineers, prominently featured a telepresence robot as "Shel-Bot." Specifically, the show featured Willow Garage's telepresence robot called Texai that was covered in the NY Times just a few weeks ago. Apparently this is the second robot appearance in Big Bang Theory's new season (one in each of two new episodes) -- though I have yet to watch last week's episode with a "gratifying" robot manipulator. It's so nice to see real robots on TV, though I'm sure my wife could live without my excited banter and shutter clicks as I (literally) take screenshots with our DSLR camera (see shots below).
There is an interesting article in the Seattle Times about former Microsoft robotics evangelist, Tandy Trower, launching a new startup named Hoaloha Robotics. His goal is to create a $5k-10k personal robot (aka mobile manipulator) in the next five-to-ten years that can address the needs of older adults, such as telepresence activities and other healthcare tasks. Hoping to leverage cheap 3D sensing (like depth cameras a la Microsoft's Kinect) and inexpensive computing, this one-man (so far) company is another entrant in a new, budding market. Having been personally involved with the design, construction, programming, and brief home-deployment of a mobile manipulator (EL-E), I can confidently say that Tandy & co. have their work cut out for them -- I wish them luck and success.
I would like to point out two news items involving telepresence robots that are definitely worth reading. First, a "manifesto" reprinted from the June 1980 issue of Omni magazine where artificial intelligence pioneer, Marvin Minsky, shares his views on telepresence (a term he originally coined). His essay includes a prediction of remote avatars (ie. Surrogates), operation in hazardous / remote environments, and even a discussion of how little development has occurred since the 1950's (remember, his essay is from 1980; did you know that full-body exoskeletons were produced back in the 1950s?!). Second, a NYTimes article by John Markoff that discusses five top American contenders in the space: Vgo (Vgo Communications), Tilr (RoboDynamics), Texai (Willow Garage), RP-7i (InTouch Health), and QB (Anybots). The article captures the societal aspect and high-level overview but lacks meaty technology details (though the side-by-side photo montage is useful for direct comparison).
By now, you're probably familiar with the Nao humanoid robot from Aldebaran Robotics -- the robot that supplanted the Sony Aibo as the robot du jour for Robocup's Standard Platform League (international robot soccer competition) back in 2007 and still retains that prestigious title today. Recently, Aldebaran announced a new Educational Partnership Program that aims to expose students of higher education to the joys of programming advanced robots. Contemporaneously, Aldebaran announced a set of four product derivatives to match varied academic budgets, ranging from full humanoids, to upper-body manipulation rigs, and 2-DoF robot heads for audio-visual experimentation (see details below). Crucially, this new initiative provides a stable hardware platform with a comprehensive software suite (alternatively, extensive open-source ROS drivers) to match your educational, research, or just whimsical robot needs.
Apparently my hunch about the recent humanoid being the standard platform for the DARPA Autonomous Robot Manipulation Software (ARM-S) program was spot-on! A new blog post on ROS.org confirms that this is the DARPA "ARM Robot" and that there is a public contest to name the robot. The blog post gives a few hardware details: "The 'ARM Robot' has two Barrett WAM arms, BarrettHands, 6-axis force torque sensors at the wrist, and pan-tilt head. For sensors, it has a color camera, SwissRanger depth camera, stereo camera, and microphone." The program winners are also enumerated: Carnegie Mellon University, HRL Laboratories, iRobot, NASA-Jet Propulsion Laboratory, SRI International and University of Southern California. Be sure to check out the video of the (now confirmed) unnamed DARPA ARM-S robot platform embedded below. Updated Sept. 1st, 2010: This robot was integrated / developed by RE2, a Carnegie Mellon spin-off located in Pittsburgh, PA that specializes in agile defense robotics with an emphasis on intelligent mobile manipulation platforms.
Dr. Motilal Agrawal from the Artificial Intelligence Center at SRI International just sent an email to the robotics worldwide mailing list seeking qualified PhD or Masters job candidates (or interns) with experience in ROS, C++ / Python, and grasping / manipulation. In the email, Dr. Agrawal points to a movie that shows off a new humanoid robot being used at SRI that sports dual Barrett WAM arms, each with a Barrett three-fingered hand -- see the movie embedded below. I can't wait to see what SRI plans to do with its new robot; they always seem to do such thorough work. You'll notice that this design is becoming increasingly common, from Intel / CMU's HERB robot to Alexander Stoytchev's robot at Iowa State. Updated Aug. 31st, 2010: My hunch was correct; we just received confirmation that this robot is indeed the "standard" hardware platform for the DARPA ARM-S program and was developed / integrated by RE2.
I just stumbled across an amazing new video (embedded below) from Howie Choset's Biorobotics Laboratory at CMU of a teleoperated snake robot climbing a tree. While I have seen a lot of snake robots built over the years, including some amphibious versions that can swim, this is the first time I have seen one climbing a tree -- a task that some biological species do amazingly well! This is clearly a case of personal ignorance; other snake robots from the Biorobotics lab have been performing similar feats for years, as evidenced by videos from 2008 (also embedded below). However, I was sufficiently captivated by the new and old videos to share them with you.
Dr. Aaron Dollar of Yale's GRAB Lab was recently awarded the prestigious "MIT Tech Review 2010 Young Innovators Under 35" award, better known as TR35, for his work on building flexible robot hands through shape deposition manufacturing (SDM). The SDM process allows multiple materials to be integrated into a single mechanism, including soft finger pads, compliant joints, rigid members, sensors, and even tubes to run wires and cables. In fact, this is the same / similar process by which the Meka Robotics H2 Hand (eg. on Simon) is constructed. Anyway, this is a promising trend for robotics research; TR35 seems to consistently recognize the contributions of top roboticists, such as Andrea Thomaz (2009), Andrew Ng (2008), Robert Wood (2008), Josh Bongard (2007), etc. Congratulations Aaron!
Today Velodyne Lidar introduced the HDL-32E, a new laser rangefinder with 32 simultaneously-operating laser beams that cumulatively output up to 800,000 points per second. The new laser rangefinder provides full 360° scans at up to 20 Hz with ranges from 5 cm to 100 meters and a vertical field of view from +10° to -30° (datasheet). The entire device is very compact at just 8.5 cm in diameter and 15 cm tall -- not much larger than a soda can! The HDL-32E has a list price of $29,900 and is expected to ship in the next few weeks, apparently to meet a pretty hefty initial demand. The new laser rangefinder is the successor to the Velodyne HDL-64E, a wildly successful device that was pivotal for many DARPA Grand Challenge autonomous cars and even saw applications in cutting-edge music videos. I have high hopes for this new LIDAR.
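From the stated specs, turning one raw HDL-32E measurement (laser index, spin azimuth, range) into a Cartesian point is a small trigonometry exercise. The sketch below assumes the 32 lasers are spaced evenly across the stated +10° to -30° vertical field of view; a real driver would use the per-laser calibration angles from the datasheet instead:

```python
import math

def hdl32_point(laser_id, azimuth_deg, range_m):
    """Convert one (laser, azimuth, range) return to sensor-frame x, y, z.

    Assumes 32 lasers spread uniformly from +10 deg (laser 0) to -30 deg.
    """
    omega = math.radians(10.0 - laser_id * (40.0 / 31.0))  # elevation angle
    alpha = math.radians(azimuth_deg)                      # spin angle
    x = range_m * math.cos(omega) * math.sin(alpha)
    y = range_m * math.cos(omega) * math.cos(alpha)
    z = range_m * math.sin(omega)
    return x, y, z

# The topmost laser looking straight ahead at 10 m lands ~1.74 m up:
print(hdl32_point(0, 0.0, 10.0))
```

At 800,000 points per second this conversion runs for every return, which is why production drivers precompute the per-laser sine/cosine tables.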
While Willow Garage made an important announcement about the forthcoming commercial availability of PR2 robots earlier this week, I want to focus your attention on something a bit more whimsical. At the PR2 launch party, Willow Garage founder (Scott Hassan) was throwing around the idea of a PR2 video competition for PR2 Beta Program recipients, complete with a substantial cash prize. True to his word, Scott set up a rules / video submission site; in a nutshell: the competition deadline was Aug. 17th, had $10k in aggregate prize money, and was to be judged by Scott, his wife, and his children. Today the results were announced on the pr2-users mailing list. You can find (all?) the submitted videos, including the winners, embedded below -- check 'em out and let us know which is your favorite in the comments!
Two weeks ago, Engadget / CrunchGear posted videos of RAPUDA (Robotic Arm for Persons with Upper limb DisAbilities) from AIST's Intelligent Systems Research Institute -- a wheelchair-mounted, light-weight robot arm with a prominent telescoping link that was demonstrated grasping a cup from a table, lifting the cup for drinking, and grasping an object from the floor via teleoperation (video embedded below). Given my proclivity for clever mechanisms, I wanted details about the telescoping link, specifically to determine how it compares to the Geosystems Situational Awareness Mast (aka Zippermast). Well, I found what I was looking for: a Japanese patent application for "Linearly Moving Extendable Mechanism and Robot Arm Equipped with Linearly Moving Extendable Mechanism." Basically, the telescoping segment consists of a series of small interlocking modules that are expelled (or reeled-in) through the "shoulder" link. Check out the pictures -- cool stuff!
Electrotactile arrays are a lesser-known form of human-machine interface (HMI) that apply electric current to skin-contacting surface electrodes to excite cutaneous nerves and give the illusion of texture, pressure, or pinpricks (depending on current strength and electrode resolution), all without mechanical vibration. This technique has been explored for many years in applications such as non-visual fighter-pilot status displays, tongue interfaces, surgery guides, and forehead-mounted camera displays for the blind. Enough background... The exciting news is a recent product developed by Senseg and Toshiba Information Systems called "E-Sense" that successfully embeds an electrotactile display into a touchpad, LCD, or other curved surface (eg. all over a cellphone), thereby providing programmable high-resolution texture feedback to a user -- see the video embedded below. I would wager that this feedback could greatly enhance haptic shared awareness in teleoperation / telemanipulation systems.
PlasticPals just pointed out DLR's 10-month effort to build a biped robot -- an effort that yielded a 1-meter, 50kg walking robot (video below). Mechanically, each leg has six degrees of freedom. A DLR / Kuka Light-Weight Robot (LWR) arm segment forms the upper leg, and a custom lower-leg segment connects to the foot through a six-axis force-torque sensor. Realtime control algorithms and dynamic simulations are performed using OpenHRP3 and Simpack. DLR claims that this is the "first electromechanically actuated bipedal robot with torque controlled joints," through which they intend to research compliant impedance control for biped locomotion. I share PlasticPals' musings: could these legs ultimately transform Justin into a bipedal walking humanoid?
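For readers unfamiliar with the approach: with torque-controlled joints, compliance can be rendered in software by commanding each joint torque from a virtual spring-damper rather than stiffly tracking positions. A minimal joint-space sketch of the textbook law (the gains, dimensions, and gravity terms below are placeholders, not DLR's actual controller):

```python
import numpy as np

def impedance_torque(q, qdot, q_des, K, D, tau_gravity):
    """Joint-space impedance law: tau = K (q_des - q) - D qdot + g(q).

    K, D: stiffness / damping matrices; tau_gravity: gravity compensation.
    Low K yields a soft, compliant leg; high K approaches stiff position control.
    """
    return K @ (q_des - q) - D @ qdot + tau_gravity

# Example: a 6-DoF leg, 0.1 rad from its setpoint, at rest, gravity ignored.
tau = impedance_torque(q=np.zeros(6), qdot=np.zeros(6),
                       q_des=0.1 * np.ones(6),
                       K=100.0 * np.eye(6), D=5.0 * np.eye(6),
                       tau_gravity=np.zeros(6))
print(tau)  # 10 N*m of restoring torque at every joint
```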
I recently learned that the holonomic mobile base developed at the University of Bonn's Autonomous Intelligent Systems Lab (NimbRo@Home) for the Dynamaid robot has become commercially available. It is officially called the VolksBot Omni and is being sold in Fraunhofer's VolksBot line for 9000 EUR (~$11,700 USD). Fundamentally, the VolksBot Omni is a powered-caster omnidirectional robot base (similar to the PR2 or Justin robots), except that its actuators are exclusively Robotis Dynamixel servos -- four modules, each with two EX-106 servos for drive torque and one RX-64 for module steering. The base is fairly lightweight (around 5kg), but sports a 40x60cm chassis that supports a 20kg payload. It has a top speed of 50 cm/sec, is controlled via USB, and has ROS / Player drivers.
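Powered-caster (steer-and-drive) kinematics are straightforward: for a commanded base twist (vx, vy, ω), each module steers its wheel to align with the velocity of its mount point and drives at the matching speed. A rough sketch with placeholder geometry (not the VolksBot Omni's actual module positions, wheel radius, or Dynamixel interface):

```python
import math

def caster_module_command(vx, vy, wz, px, py, wheel_radius=0.05):
    """One steer/drive command for a caster module mounted at (px, py).

    Returns (steering angle in rad, wheel angular velocity in rad/s).
    """
    # Rigid-body motion: velocity of the module's mount point on the chassis
    mvx = vx - wz * py
    mvy = vy + wz * px
    steer = math.atan2(mvy, mvx)                  # eg. the RX-64's job
    speed = math.hypot(mvx, mvy) / wheel_radius   # eg. the EX-106 pair's job
    return steer, speed

# Pure rotation: a module 0.3 m ahead of center steers 90 deg and drives.
print(caster_module_command(0.0, 0.0, 1.0, px=0.3, py=0.0))
```

A real controller also has to handle steering-angle wraparound (steering ±180° with reversed drive direction), which is one reason powered-caster bases like the PR2's need careful low-level software.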
In our first request for assistance, we asked you to contribute / submit articles related to your own projects, ideas, and research. Here we'll ask you to consider these other methods of lending Hizook a hand: Provide a modicum of financial support by using Hizook's affiliates (eg. Amazon and Trossen Robotics) when making robotics purchases; Enroll for a Hizook user account; Or just become an active commenter. Again, our goal is to foster a community for academics and professionals that promotes informal, yet educated, robotics discussions outside of rigid peer-reviewed conference and journal settings. We cannot hope to do this alone; your assistance is crucial!
I'm consistently surprised by the outpouring of support and enthusiasm from you, our Hizook readers, about this site and its content -- it's abundantly clear that academic and professional roboticists would welcome a commons for informal, yet educated, robotics discussions outside rigid peer-reviewed conference and journal settings. Hizook was founded to fill that role. However, keeping up with world-wide robotics news would be a full-time endeavor -- a role to which I alone could not possibly do justice, especially accounting for my own research aspirations. Thus, Hizook is asking for your help... There are lots of ways to assist! In this first installment, we'll look at the most pressing manner in which Hizook requests your assistance: contribute / submit articles related to your own projects, ideas, and research.
NASA's $2.3B Mars Science Laboratory (MSL) robot known as Curiosity took its first test drive on Friday inside a NASA cleanroom, moving about 1 meter. As the successor to two wildly successful Mars rovers (Spirit and Opportunity), NASA has high hopes for Curiosity, which weighs as much as a small SUV, stands about waist height on its six-wheel rocker-bogie suspension, and is nuclear powered via a radioisotope thermoelectric generator (RTG). Curiosity is an amazing piece of technology in its own right, and even more impressive when considering its marvelous sensor payload. Personally, I'm proud to see my tax dollars being used for such impressive scientific pursuits. Check out Curiosity's first test drive in the video below.
Perching is one of the most common aerobatic maneuvers executed by birds and is representative of a large and important class of aggressive aerial maneuvers that take advantage of unsteady aerodynamics. During a perching maneuver, birds often exceed 90 degrees in angle-of-attack, exploiting both viscous and pressure drag for rapid deceleration. Russ Tedrake and Rick Cory at MIT's Robot Locomotion Group have drawn inspiration from these insane maneuvers by developing a gliding UAV that can perform perching -- eventually (presumably) allowing a UAV to perch and recharge on powerlines. This is an impressive feat on many levels: the physics (semi-turbulent flow, visualized in their photos), a controls perspective (dealing with high-speed maneuvers, non-linear dynamics, and real-time constraints), and an application perspective (the eventual integration of powerline recharging). Be sure to check out the photos and videos!