Valve's "Lighthouse" Tracking System May Be Big News for Robotics

Valve Lighthouse Tracking Technology

While walking around the Bay Area Maker Faire this weekend, I stumbled across an amazing piece of technology: Valve's "Lighthouse" tracking system. Valve's demos were (supposedly) a major contributor to Oculus' fundraising efforts and ultimate sale to Facebook, and this device may have been a key piece of those demos. I'd heard rumors about this system for many months. It was designed for augmented and virtual reality -- namely, for head and controller tracking, where it needs to hit some insane specifications for positional accuracy, angular accuracy, and especially (!!!) latency. Honestly, the rumors didn't do it justice: it's really elegant! The solution is exceedingly simple, low-cost, lightweight, and performant. It's much (much!) better than the image processing techniques I've seen to date. Most importantly: I think this technology could be a "big deal" for robotics too. I had a chance to speak with Valve's Alan Yates about how the Lighthouse system works; I didn't get all the specifications, but I did get some interesting information -- so read on!

 

Valve's "Lighthouse" Optical Tracking System:

 

I captured a few photos (with permission) at Maker Faire. The transmitter (highres here) has a few key components: a bank of infrared LEDs, and two spinning IR lasers -- one that sweeps across the X axis, and one that sweeps across the Y axis. These lasers are different from normal laser rangefinders' "point" lasers, in that each of the Lighthouse lasers is a line laser. The two lasers are mechanically timed so that they're swept roughly 180 degrees out of phase.

Transmitter for Valve's "Lighthouse" tracking system

 

Unlike a laser rangefinder, the transmitter does not estimate the distance or pose of the tracked points -- it just emits light.  Each "receiver" is a simple photodiode, which can be integrated (with other photodiodes) into a rigid unit. Here's Valve's prototype for inclusion into a game controller (highres here). You can see a number of photodiodes (little black squares); IIRC, Alan said it had 17 individual photodiodes so that several would be unobstructed regardless of orientation.

Receiver for Valve's "Lighthouse" tracking system

 

I made an animated GIF to explain how the system works; let's look at the light as seen by one of the photodiodes:

Animated Explanation for Valve's "Lighthouse" tracking system

 

The IR LEDs provide the start of a timing sequence. A microcontroller (attached to the photodiode) starts a counter (with fine-grained time resolution) when it receives that initial sync signal, and then waits for the X and Y line lasers to illuminate the diode. Because each laser sweeps at a known angular rate, the microcontroller can map the elapsed time directly to X and Y angular measurements. Using multiple photodiodes, and knowing the rigid-body relationship between them, the microcontroller can calculate the entire 6-DoF pose of the receiver.
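To make the timing math concrete, here's a minimal sketch in Python. The sweep rate and counter frequency below are illustrative assumptions (Valve hadn't published timing specs at the time of writing); the key idea is simply that the elapsed count between the sync flash and the laser hit is proportional to the sweep angle:

```python
# Minimal sketch: mapping sweep timing to angles.
# SWEEP_HZ and TICK_HZ are assumptions for illustration, not Valve's specs.

SWEEP_HZ = 60.0    # assumed rotor frequency (one full revolution per cycle)
TICK_HZ = 48e6     # assumed microcontroller counter frequency

def ticks_to_angle(ticks_since_sync):
    """Convert elapsed counter ticks (sync flash -> laser hit) to a sweep angle."""
    seconds = ticks_since_sync / TICK_HZ
    # The rotor turns 360 degrees every 1/SWEEP_HZ seconds.
    return 360.0 * SWEEP_HZ * seconds  # degrees

# Example: the photodiode sees each sweep some number of ticks after the sync flash.
x_angle = ticks_to_angle(66_700)  # ~30 degrees into the X sweep
y_angle = ticks_to_angle(51_100)  # ~23 degrees into the Y sweep
print(f"bearing from base station: ({x_angle:.1f}, {y_angle:.1f}) deg")
```

With several photodiodes at known offsets on a rigid body, each (X, Y) angle pair defines a bearing ray from the base station, and the full 6-DoF pose falls out of a perspective-n-point style solve.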

This solution is elegant for a few reasons:

  • The computation overhead is minimal, especially compared to image processing.
  • It's super-low latency. Unlike image tracking techniques, this system doesn't need to wait while data-intensive images are transmitted, processed, and analyzed. The microcontroller count can be quickly and accurately mapped to angles in (basically) one instruction cycle. This is super-critical for VR and AR, where angular errors or latency delays can create artifacts that make the experience unusable.
  • It relies on high(ish) time resolution at the receiver to determine angles. This has major benefits over systems (like Raskar's, which I'll discuss later) that use a similar technique but are limited by spatial resolution rather than time resolution.
  • The receiver hardware is stupidly cheap: e.g., a $0.01 photodiode. It's also very small and lightweight, so it could be included in almost any object.

 

One thing to note: I simplified this discussion a little bit.  Like many IR systems, the LEDs and lasers are actually modulated (Alan said, "on the order of MHz").  This is useful for a few reasons: (1) to distinguish the desired light signals from other IR interferers such as the sun; and (2) to permit multiple transmitters with different modulation frequencies. This is a pretty obvious enhancement, but it muddles the layperson description.
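Purely for intuition -- this is not Valve's actual scheme, which wasn't disclosed beyond "on the order of MHz" -- here's a toy sketch of how a receiver might tell two modulated transmitters apart by their carrier frequencies:

```python
# Toy sketch of carrier discrimination; the sample rate and carrier
# frequencies are made-up values, not Valve's.
import numpy as np

FS = 20e6  # assumed ADC sample rate (20 MSPS)
CARRIERS = {"station_A": 1.8e6, "station_B": 2.3e6}  # hypothetical carriers

def detect_station(samples):
    """Return the station whose carrier dominates a burst of photodiode samples."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / FS)
    power_at = lambda f: spectrum[np.argmin(np.abs(freqs - f))]
    return max(CARRIERS, key=lambda name: power_at(CARRIERS[name]))

# Synthetic test: a burst modulated at station B's carrier, plus noise.
t = np.arange(2048) / FS
burst = np.sin(2 * np.pi * CARRIERS["station_B"] * t) + 0.3 * np.random.randn(t.size)
print(detect_station(burst))  # -> "station_B"
```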

I was kinda strapped for time, so I didn't pry too much about detailed specs. So, no information on sensing rates or pose accuracy (I think I heard 1mm accuracy over a 5m range, but my memory is hazy). When I asked about availability, they said they're keen to get these in the hands of Makers to explore new applications. I didn't really inquire about licensing or IP (hands over ears... "lalalala..."), but Alan did say they're more interested in ensuring standards compliance than in extracting revenue.

They had a few other prototypes sitting around too:

Other prototypes of Valve's "Lighthouse" tracking system  Other prototypes of Valve's "Lighthouse" tracking system

 

According to recent news reports (here and here), this tracking system will be a critical component of the HTC Vive VR system:

Valve's "Lighthouse" tracking system as part of HTC Vive  Valve's "Lighthouse" tracking system as part of HTC Vive

 

Unfortunately the Maker Faire is over,** so you probably won't get a chance to see the units in person. 

** Sorry. I meant to get this post online in time for people attending on Sunday to swing by and check it out. But I'm busy working all weekend to finish some deliverables for my day job at Google[x] -- the same sorry reason that I haven't been active on Hizook for months!

 

Here's what the poster said:

- Lighthouse Optical Position Tracking - 

A simple optical position tracking & navigation technique for maker projects.  Want your robot or quadcopter to know where it is and where it is pointing in free space?  Lighthouse is a scalable way to implement 3 or 6-DoF navigation for your project.

Maker Faire Poster for Valve's "Lighthouse" tracking system

 

 

Robotics Applications of the Lighthouse Tracking System

 

There are all kinds of robot localization systems that use 2-part tracking systems: GPS (TX on satellites, RX on robot), fiducials (e.g., colored blob tracking, QR codes, ARToolkit, AprilTags, etc.), my work on long-range RFID, etc. A tracking system like Lighthouse has the potential to replace or augment all of those. So, for example:

  • Tracking your quadrotor indoors so that you can pull off those crazy quadrotor demos at home without a $50k Vicon system (see the triangulation sketch after this list)!
  • Tracking an inflatable robot's kinematics (or just end effector) -- an otherwise difficult task.
  • Replacing the fiducial tracking for systems like Kiva Systems, First Robotics, and RoboCup.
  • Integrating range measurements in the base station: You could trivially use the line lasers plus a camera to do ranging (like Morgan Quigley's borg scanner) or add a ToF camera. Then the base station would get range information as well as pose information from the "tags."
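As a concrete sketch of that first bullet: with two base stations at known poses, each tag sighting yields a bearing ray, and the tag's position drops out of a least-squares ray intersection. Everything below (station positions, angle convention) is made up for illustration:

```python
# Toy triangulation of a tag from two base stations' sweep angles.
# Station poses and the angle convention are illustrative assumptions.
import numpy as np

def bearing_to_ray(origin, azimuth_deg, elevation_deg):
    """Unit-direction ray from a base station, given its X/Y sweep angles."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    d = np.array([np.sin(az) * np.cos(el), np.sin(el), np.cos(az) * np.cos(el)])
    return origin, d / np.linalg.norm(d)

def triangulate(rays):
    """Least-squares point closest to all rays; each ray = (origin, unit direction)."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for o, d in rays:
        P = np.eye(3) - np.outer(d, d)  # projector onto the plane normal to d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two hypothetical base stations in opposite room corners, both sighting the tag.
rays = [bearing_to_ray(np.array([0.0, 2.5, 0.0]), 30.0, -20.0),
        bearing_to_ray(np.array([4.0, 2.5, 5.0]), -35.0, -25.0)]
print(triangulate(rays))  # estimated 3D tag position (meters)
```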

 

There are a lot of interesting possibilities and permutations....

 

 

A Quick Note on Related Work

 

If you're interested in Valve's Lighthouse system, you might also be interested in an analogous method by Ramesh Raskar (MIT Professor), Johnny Lee (of Project Tango fame), Jay Summet (a friend of mine), et al. from sometime around 2005-2007. Instead of using line lasers, they use projectors (LED/lasers with masks, DLP light projectors, or any number of pico projectors) to project a series of 2D IR images onto a scene. A single photodiode can then determine its angular location based on its observed light-dark signals. Combine multiple photodiodes for a full 6-DoF pose.
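To give a flavor of the decoding, here's a toy sketch assuming Gray-coded binary patterns, one bit per projected frame (the papers describe several multiplexing schemes; this is just the simplest):

```python
# Toy sketch: a photodiode recovers its projector column from the bright/dark
# sequence it observes across Gray-coded frames. The pattern scheme is assumed.

def gray_to_binary(gray_bits):
    """Convert a Gray-code bit list (MSB first) to an integer position."""
    value = gray_bits[0]
    result = value
    for bit in gray_bits[1:]:
        value ^= bit  # each binary bit = previous binary bit XOR this Gray bit
        result = (result << 1) | value
    return result

# Example: 1 = photodiode lit during that frame, 0 = dark.
observed = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
column = gray_to_binary(observed)
print(f"photodiode sits in projector column {column} of {2 ** len(observed)}")
```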

I don't want to spend the time to go over the details... which can be found in: "Lighting-Aware Motion Capture Using Photosensing Markers and Multiplexed Illumination" (website, PDF) and "Moveable Interactive Projected Displays Using Projector Based Tracking" (PDF). But here is an image that describes the basic idea -- its relevance to Lighthouse should be obvious.

Raskar's Projector Localization

 

 

Maker Faire Impressions

 

This is the third time I've attended Maker Faire. I attended the inaugural Maker Faire in 2006, and it was awesome. It seemed like the audience and Makers were mostly hardcore hackers, and there was a large variety of projects. After we moved back to California, we attended in 2013... and it sucked. The entire event was dominated by commercial companies peddling their wares and by entirely too many 3D printers (yes, I get it... you can assemble a MakerBot). We attended again this year on a whim, and I was impressed. The quality, quantity, and variety of hobbyist Makers was vastly improved. The only gripe I might make: it was bloody crowded! Next year, I'll probably attend as a Maker so that I can get those Friday VIP passes to avoid the crowds. Maybe I'll demo my wirelessly powered robot swarm -- or perhaps something entirely new...

 

Comments

I did a few basic searches for information on Lighthouse that pre-dates the Maker Faire. I found a few links, but I don't have time to read through them right now.  Just in case I find some time later, let's leave 'em in a comment:

  • Examining the Valve/HTC Vive Ecosystem: Basic Sensors and Processing (link)
  • It seems Oculus knew about the Vive laser tracking (link)
  • Tested: Vive review and interview (link)

 

—Travis Deyle

"Valve's demos were (supposedly) a major contributor to Oculus' fundraising efforts and ultimate sale to Facebook, and this device may have been a key piece of those demos."

It wasn't. :)

The Facebook sale preceded the first Lighthouse-enabled headset by a couple of months.

—KT

What temporal resolution are they supporting?  I know their VR headset supports 90 hertz.  Are these running faster than that?  Can they support faster than that?

—Bruce Wright

@Bruce,

I don't know for certain... But according to some of the other links I posted, it looks like they're operating on a 40-50ms cycle (so 20-25Hz). Practically speaking, you can use an IMU to provide high-rate intermediate pose estimates (e.g., via a Kalman filter), with Lighthouse providing periodic "ground truth." This is pretty common in the pose estimation literature (e.g., GPS navigation).
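Here's a toy 1-D sketch of that kind of fusion; the rates, noise values, and constant-velocity model are illustrative, not Lighthouse specs:

```python
# Toy 1-D Kalman filter: fast IMU prediction, slow Lighthouse correction.
import numpy as np

dt = 1.0 / 1000.0                # assumed IMU rate: 1 kHz
x = np.zeros(2)                  # state: [position, velocity]
P = np.eye(2)                    # state covariance
F = np.array([[1, dt], [0, 1]])  # constant-velocity transition
Q = 1e-5 * np.eye(2)             # process noise
H = np.array([[1.0, 0.0]])       # Lighthouse measures position only
R = np.array([[1e-6]])           # ~1 mm^2 measurement noise (assumed)

def predict(accel):
    """Propagate the state with one IMU acceleration sample."""
    global x, P
    x = F @ x + np.array([0.5 * dt**2, dt]) * accel
    P = F @ P @ F.T + Q

def correct(z):
    """Snap the estimate toward a Lighthouse position fix."""
    global x, P
    y = z - H @ x                   # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

# 40 IMU steps per Lighthouse fix (~25 Hz, matching the cycle time above).
for step in range(200):
    predict(accel=0.0)
    if step % 40 == 0:
        correct(z=np.array([0.001 * step]))
```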

—Travis Deyle

So it is Duck Hunt, and the user is the TV.

—Anonymous

Where can we find this kind of spinning laser?

What about the impact of ambient light on accuracy?

This system doesn't seem hard to reproduce with an Arduino.

What about the accuracy and positioning of the laser in the box? I mean, does the laser have to sweep perfectly straight?

—Anonymous

What's your opinion of the new "Puck" lidar from Velodyne?

And the new product from Quanergy costs only $250 and is the size of a credit card.

And the RPLIDAR, a 5 Hz 360-degree 2D laser scanner, is just $399.

Hi Anthony,

As I'm sure you can appreciate, laser rangefinders are a recurring interest for me.

 

There's been a lot of development in the lidar space, especially for units targeting the autonomous car market (e.g., Velodyne and Quanergy). As you also pointed out, there are a number of hobbyist-grade lidars such as the RPLidar and the TeraRanger -- both of which share similarities with the lidars used by mass-market cleaning robots (i.e., the Neato Robotics lidar with a $10 BOM cost).

Very little about these "new" systems is actually new.  Most of the improvements are incremental and/or cost optimizations as they move to production at scale. I'm afraid I haven't used any of these new lidars, and I do not actively track their development.  But I am glad to learn about them -- so thanks for commenting!  Please let me know if you learn of others!

 

Since you've commented on the Valve Lighthouse post, I should probably also point out: for a lidar to produce pose estimates, you must perform computations akin to those required by cameras (or depth cameras). So it's unlikely that a lidar-based system will be able to match the latency specifications of Lighthouse that are so critical to VR/AR. The latency specification may (or may not) be relevant to robotics applications too... but cameras, depth cameras, and lidars produce "raw" data without requiring a receiver -- so they're also a bit more flexible. Ah, tradeoffs. :)

—Travis Deyle

You didn't mention Lidar-Lite from http://pulsedlight3d.com/ which has been out since early this year. The existing version has a 100 Hz (-ish) update rate and they just announced a new version with a 500 Hz update rate. I'm not sure what the latency is, but given that refresh rate it may be faster than the Lighthouse system.

—JBeale

@JBeale,

From what I can tell, the PulsedLight3D is a planar laser rangefinder.  It has nothing to do with localizing a "target" with high precision in 6DoF.  Rather, it's useful for a robot to do planar SLAM.

Am I missing something...?

—Travis Deyle

Nothing is ever new under the Sun, I guess. The two mentioned methods (linear scan / binary search) are exactly the same methods used way back when dinosaurs still roamed the earth to programmatically find the position of a light pen. The _proper_ way was to measure the delay from vertical sync, of course, but that information was not always accessible, and it's gone for good with the death of the scanned CRT display...

—Max


Last month there was an article about a new VR Tracking system called MEMS Tracking System (MTS) developed by Jack McCauley (former Oculus VP of Engineering) in his Livermore, California lab.  The technical details are a bit scant, but here's a basic description and video:

For a proof of concept, MTS is impressive. When McCauley first demonstrated the system, a small spherical marker attached to a Gear VR headset was sitting about 5 feet away. He turned on the MTS basestation and I watched as the laser slowly indexed the scene looking for its target. It started, quite logically, at the top left, and ran horizontally until reaching its right-most limit, then returned to the left, dropped down slightly, and continued from there, line-by-line. When it reached the marker on the headset, it stopped. The initial scan took 4 or 5 seconds. So at this point I figured MTS was a neat concept, but there was still much work to be done to reduce the 5-second cycle down to mere milliseconds so that the tracking would be fast enough for practical use.

 

If I were to wager a guess... this looks like a simple DLP or galvo-based laser system that does a full FoV sweep, followed by precision tracking thereafter. Not a bad approach, but it will have some major drawbacks compared to other indoor tracking systems (like Lighthouse); namely, it can probably only track one target at a time. Here are a few photos of the base station:

MTS VR Tracking system by Jack McCauley MTS VR Tracking system by Jack McCauley

 

And here are a few screenshots during scanning (i.e., Lissajous curves, indicative of a galvo system) and during tracking.

MTS VR Tracking system by Jack McCauley MTS VR Tracking system by Jack McCauley

 

—Travis Deyle

It seems like PreNav might use a similar technique to MTS / Lighthouse, in that they use a basestation to do the localization:

Instead of tracker cameras placed in the corners of a room [like Vicon], the PreNav system uses two components: their own drone platform and a tripod-mounted ground station. The station does an initial laser scan of a structure that the pilot wants to fly around, then creates a 3D point cloud and prepares the drone’s flight path around it. After a manual review, the drone is launched and controlled in space by the ground station’s tracking and positioning tools, communicating at 100 times per second — much faster than GPS — to ensure accurate positioning at the centimeter level they’re boasting.

PreNav Drone Localization System

—Travis Deyle

Another approach, described by this DIYdrones user, is to do a "flipped" version of the StarGazer indoor robot localization system for outdoor drone localization:

Inverted StarGazer Localization system for Drones

Inverted StarGazer Localization system for Drones

—Travis Deyle

It is exciting to see the performance and affordability of the Valve Lighthouse.

A couple of thoughts:

  • I saw that Quanergy was mentioned in the comments above. They are making an affordable solid-state LIDAR that could be used as the laser transmitter in future Lighthouse models.
  • Also wanted to give a shout out to the folks who created the Nikon Metrology Indoor GPS system, which has a pretty similar architecture to Lighthouse. That system is orders of magnitude more accurate and works over much larger working volumes (factory scale). Of course it is much more expensive as well. Here is a link.

