With micro / pico projectors selling for under $250, and robot toy maker WowWee getting in the game, it was only a matter of time before projectors started showing up on robots -- especially since the general concept dates back at least three decades to R2-D2's holographic projections in the original Star Wars trilogy.  In fact, Hizook has previously examined a number of robots that use projectors to communicate intention.   Following the development of a laser pointer interface by the Healthcare Robotics Lab (to which I belong), my labmates and I ruminated about marrying these two technologies -- it seemed a natural extension of the "Clickable World", wherein the world is composed of virtual buttons or icons selected via a laser pointer (analogous to a PC mouse), to include visual feedback via an on-robot projector.  It seems ideas rarely stand in isolation; I'm now aware of two robotic systems that use both video projectors and laser pointer interfaces.  The first is a very preliminary "late breaking results" submission to HRI 2009, while the other is a fully realized system developed in JST's ERATO program.  The latter has a compelling video, embedded below.

First, the cool video from the Japan Science and Technology Agency (JST) Exploratory Research for Advanced Technology (ERATO) program.

To quote the recent Interact 2009 publication, entitled "Designing a Laser Gesture Interface for Robot Control":

A laser pointer can be a powerful tool for robot control. It can accurately point to real world locations. However, previous studies only used it for target designation and did not fully explore its potential as a versatile input device. This study proposes a user interface using a laser pointer for giving various instructions to a robot. The basic idea is to apply stroke gesture recognition to laser trajectory. The user can specify the type of task and the target position, as well as target objects. We conducted user studies for first-time users using a prototype implementation and refined it. The participants were able to get accustomed to using the final design of our interface in ten minutes. 
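
The paper's core idea -- recognizing stroke gestures traced out by the laser dot -- can be illustrated with a toy classifier. The sketch below is my own simplification (not the authors' implementation), and the thresholds are made-up values: it distinguishes a closed "lasso" around an object from an open stroke toward a target location, using the trajectory's total turning angle and the gap between its endpoints.

```python
# A minimal sketch (not the paper's implementation) of classifying a laser
# trajectory as either a closed "lasso" around an object or an open stroke
# toward a target location. Thresholds are illustrative, not tuned values.
import math

def classify_laser_stroke(points, closed_thresh=0.15, turn_thresh=1.5 * math.pi):
    """points: list of (x, y) laser-dot positions in image coordinates."""
    if len(points) < 3:
        return "tap"  # too short to be a gesture; treat as a simple click

    # Path length and the gap between the first and last samples.
    path_len = sum(math.dist(points[i], points[i + 1])
                   for i in range(len(points) - 1))
    end_gap = math.dist(points[0], points[-1])

    # Total (absolute) turning angle accumulated along the trajectory.
    total_turn = 0.0
    for a, b, c in zip(points, points[1:], points[2:]):
        h1 = math.atan2(b[1] - a[1], b[0] - a[0])
        h2 = math.atan2(c[1] - b[1], c[0] - b[0])
        total_turn += abs((h2 - h1 + math.pi) % (2 * math.pi) - math.pi)

    if end_gap < closed_thresh * path_len and total_turn > turn_thresh:
        return "lasso"   # closed loop: select the encircled object
    return "stroke"      # open stroke: designate a target location

# Example: a rough circle vs. a straight sweep.
circle = [(math.cos(t) * 50 + 100, math.sin(t) * 50 + 100)
          for t in [i * 0.2 for i in range(32)]]
line = [(i * 10, i * 5) for i in range(20)]
print(classify_laser_stroke(circle))  # -> "lasso"
print(classify_laser_stroke(line))    # -> "stroke"
```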

As I mentioned, this idea is not completely "out of the blue".  Rather, it is an impressive combination of two previously disjoint technologies.  The first involves video projectors.  Those of us who watch sci-fi movies will probably recognize this scene from Star Wars. 

R2-D2 holographic projector on robot

That's R2-D2 projecting a holographic movie.  Like I said, robots with projectors are becoming more commonplace; for example, Dr. Matsumaru's robots -- shown below and discussed previously on Hizook.

Video Projector Robot  Video Projector Robot

Then, there is the "Laser-Pointer Interface," wherein a user illuminates an object on a table or on the floor, and the robot autonomously retrieves the object.  This interaction can be seen in the images below.

Laser-Pointer Interface for El-E robot    Laser-Pointer Interface for El-E robot
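
For readers curious about the mechanics, spotting the laser dot in a camera frame is conceptually simple, even though El-E's actual detector (its cameras, optics, and algorithm) is considerably more sophisticated than this. Here is a rough, hypothetical sketch using OpenCV brightness thresholding; the threshold value and device index are assumptions for illustration only.

```python
# A minimal sketch of laser-spot detection in a single camera frame using
# brightness thresholding with OpenCV. Illustrative only -- this is not the
# Healthcare Robotics Lab's detector.
import cv2
import numpy as np

def find_laser_spot(frame_bgr, min_brightness=220):
    """Return the (x, y) pixel of the brightest saturated blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # A laser dot typically saturates the sensor: very high "value" channel.
    _, mask = cv2.threshold(hsv[:, :, 2], min_brightness, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, np.ones((3, 3), np.uint8))  # merge nearby pixels

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Take the largest bright blob and return its centroid.
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

# Usage with a webcam (hypothetical device index 0):
# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# if ok:
#     print(find_laser_spot(frame))
```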

The laser-pointer interface was expanded upon by the so-called "Clickable World", whereby 3D points (such as those provided by the laser-pointer interface) are used as virtual buttons that, when "clicked", direct the robot to perform particular tasks -- very similar to the interaction between a PC mouse and the icons on one's desktop.  (Note yours truly in the image.)

El-E Robot and the Clickable World    El-E Robot and the Clickable World
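
To make the "virtual button" analogy concrete, here is a small hypothetical sketch (my own illustration, not the Clickable World code): a clicked 3D point that lands inside a registered button's activation radius triggers that button's task, and anything else falls back to a plain fetch. The button names, radii, and coordinates are invented for the example.

```python
# A minimal sketch of dispatching a "clicked" 3D point to a behavior by
# checking which registered virtual button it falls inside. All names and
# numbers are illustrative assumptions, not the Clickable World system.
from dataclasses import dataclass
import math

@dataclass
class VirtualButton:
    name: str
    center: tuple      # (x, y, z) in the robot's map frame, meters
    radius: float      # activation radius, meters
    action: callable   # behavior to trigger when "clicked"

def default_fetch(p):
    print(f"Navigating to {p} and grasping the illuminated object")

def handle_click(point_3d, buttons):
    """Trigger the nearest button containing the clicked 3D point."""
    hits = [b for b in buttons if math.dist(point_3d, b.center) <= b.radius]
    if not hits:
        return default_fetch(point_3d)   # no button hit: treat as a fetch request
    best = min(hits, key=lambda b: math.dist(point_3d, b.center))
    return best.action()

buttons = [
    VirtualButton("trash_can", (2.0, 1.5, 0.4), 0.3,
                  lambda: print("Delivering held object to the trash can")),
    VirtualButton("person",    (0.5, 3.0, 1.0), 0.5,
                  lambda: print("Delivering held object to the person")),
]

handle_click((2.1, 1.4, 0.45), buttons)   # lands on the trash-can button
handle_click((4.0, 4.0, 0.0), buttons)    # no button: plain fetch
```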

Like I mentioned, my labmates (particularly Hai Nguyen) and I ruminated about combining these two interactions -- effectively replacing the hand-drawn markings in the above images with physical light projected from the robot.  We thought this would be an effective way of communicating the robot's intentions, cues, state, and perceptual status.  A number of these aspects are clearly illustrated in the video above, though the laser pointer interface and projector are located in the ceiling rather than on the robot.  I suppose this distinction makes the JST system quite similar to work previously described on Hizook involving a Smart Home robot controlled by a multi-touch / touchscreen interface.
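
The projector side of that combination amounts to treating the projector as an inverse camera: take the 3D point selected via the laser interface, project it through the projector's calibrated intrinsics and extrinsics, and draw a highlight at the resulting pixel so the robot visibly confirms the selection. The sketch below uses placeholder calibration values and resolution purely for illustration; it is not any of the systems described above.

```python
# A minimal sketch of the "projector as inverse camera" idea: project a
# selected 3D point into the projector's image plane and render a highlight
# there. Calibration values (K, rvec, tvec) are placeholder assumptions.
import numpy as np
import cv2

K = np.array([[1400.0,    0.0, 640.0],     # hypothetical projector intrinsics
              [   0.0, 1400.0, 400.0],
              [   0.0,    0.0,   1.0]])
rvec = np.zeros(3)   # rotation (Rodrigues vector); projector frame == world here
tvec = np.zeros(3)   # translation
dist = np.zeros(5)   # assume no lens distortion

def highlight_point(point_3d, width=1280, height=800):
    """Render a projector frame with a circle over the selected 3D point."""
    pts, _ = cv2.projectPoints(np.float32([point_3d]), rvec, tvec, K, dist)
    u, v = pts[0, 0]
    frame = np.zeros((height, width, 3), np.uint8)
    if 0 <= u < width and 0 <= v < height:
        cv2.circle(frame, (int(u), int(v)), 40, (0, 255, 0), 4)
    return frame   # this image would be sent to the projector output

frame = highlight_point((0.2, -0.1, 1.5))   # a point ~1.5 m in front of the lens
```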

I think it is fairly clear how these two technologies were married to create this hybrid style of interaction. 

Robot with video projector and laser pointer interface
Robot with video projector and laser pointer interface
Robot with video projector and laser pointer interface

The other implementation I alluded to comes from a "late breaking results" submission to the Human-Robot Interaction (HRI) 2009 conference, entitled "Robots with projectors: an alternative to anthropomorphic HRI".   This work out of the Digital Experience Lab is more similar in spirit to Dr. Matsumaru's "Step-On" projector interface, but it also resembles the Healthcare Robotics Lab's laser pointer interface in that the robot carries the laser pointer detector and projector on-board.  Curiously, their existing papers appear unaware of the aforementioned related work.

Robot with video projector and laser pointer interface


I will be excited to see more systems that pair laser pointer interaction with video projectors, but I also think it's worth recognizing the raw utility of having a mobile projector at one's disposal -- you can watch TV / video, surf the net, or give presentations anywhere your personal robot can accompany you!


Comments (3)

Laser Pointer April 24, 2010 at 07:05 AM

Great article, and thanks for sharing.

We are a laser manufacturer that can produce the laser pointers in this article.

Here is our website link: http://www.perfectlasers.net

Thanks again.


Homa

Travis Deyle April 24, 2010 at 11:16 AM

@Homa,

I checked out your website; you guys appear to sell some high-quality laser pointers (including yellow!).  However, I should note that the work discussed in this article used off-the-shelf laser pointers that were deemed "safe" to operate without eye protection.

Kyle January 26, 2011 at 01:09 PM

Do you know what kind of pico projectors were used for this project??