Low-Cost Depth Camera Update: Microsoft Kinect in November, Others to Follow Shortly?

Microsoft Kinect Depth Camera

Back in late March, Hizook provided an overview of various depth cameras (aka range cameras, 3D cameras, time-of-flight cameras, RGB-D cameras), including the PrimeSense solution now known to be the basis of Microsoft's Kinect (formerly Project Natal).  In the last 3 months, the depth camera space has seen numerous updates, such as additional commercial offerings and product updates / availability.   Perhaps the most exciting news (as we speculated in March) is that low-cost offerings will indeed be hitting the market later this year:  Microsoft recently confirmed that Kinect (formerly Project Natal) will start shipping in early November and is already available for pre-order on Amazon.com for $150 USD!   More questions than answers remain -- here is what we know, help us fill in the gaps...

Having suffered through slow tilting laser rangefinders, my recent experiences with real-time depth cameras (both the SR4000 and Willow Garage's projected stereo on a PR2) have made me a firm believer that depth cameras will be a boon to robotics.  The announcement of Kinect's availability and pricing has all but confirmed the start of the low-cost depth camera era.  However, I have numerous questions and concerns:

  • Does Microsoft's contract with PrimeSense permit general purpose use of the Kinect sensor, or will the data be encrypted and XBox-only?
  • Even if permitted, will Microsoft release an open API allowing the Kinect to stream data to a PC? If not, how long will it take hardware hackers to expose the data?
  • The underlying technology behind Kinect does not seem amenable to multi-sensor environments (the projected patterns will likely interfere).  Can this be addressed, or will an alternative solution (i.e. projected-light stereo) fill the void for robotics?
  • Will alternative products emerge at similar pricing?  Competition is always good, and there seem to be numerous contenders...
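For context on that third concern: both PrimeSense's approach and projected-light stereo ultimately recover depth by triangulation -- a projected dot's lateral shift (disparity) between projector and camera maps to depth via z = f·b/d.  A minimal sketch, with made-up calibration numbers (not actual Kinect values):

```python
# Depth from disparity for a projector-camera pair (structured light).
# z = f * b / d, where f = focal length (px), b = baseline (m), d = disparity (px).
# The focal length and baseline below are illustrative, not real Kinect calibration.

def depth_from_disparity(disparity_px, focal_px=580.0, baseline_m=0.075):
    """Triangulate depth (meters) from a dot's pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With these numbers, a dot shifted 30 px sits at ~1.45 m:
print(round(depth_from_disparity(30.0), 2))
```

This is also why overlapping projectors are a problem: if two devices cast dots into the same scene, matching a dot back to the correct projected pattern (the step before computing disparity) becomes ambiguous.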

 

Back in late March, Hizook provided an overview of various depth cameras:

  • PrimeSense  (now known to be the basis of Microsoft's Project Natal / Kinect)
  • Willow Garage projected-light stereo
  • Swiss Ranger SR4000
  • PMD Technologies CamCube
  • 3DV Systems' ZCam

 

To that list we can now add the following:

 
Optex Depth Camera Based on Canesta Solution
The device pictured at left is a ZC-1000 series depth camera offered by Optex (of Japan).  The design is actually based on a custom-silicon depth camera solution provided by Canesta.  Canesta, like PrimeSense, markets the underlying technology to OEMs for product integration.  Based on product flyers, it seems to be a pulsed IR time-of-flight variant.
Panasonic Depth Camera D-Imager
The Panasonic D-Imager is yet another offering.  You can learn more about it in this informational video.  It is supposedly being released some time in 2010-2011.  The D-Imager also seems to operate on pulsed IR time-of-flight, though it is unclear whether the underlying components can be attributed to Panasonic or someone else (i.e. Canesta).
Hobbyist Depth Camera
There is an open-hardware project by Kyle McDonald that uses a webcam and projector to perform real-time 3D capture at 60 fps.  This is similar to the Willow Garage projected stereo system, but seems to be a hobbyist effort.
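For a sense of scale on the pulsed time-of-flight approach used by several of these sensors: range is just half the round-trip travel time of light, which makes the time intervals to be resolved astonishingly small.  A minimal sketch (illustrative values only):

```python
# Pulsed time-of-flight ranging: light travels out and back,
# so range = c * t_round_trip / 2.

C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_s):
    """Range in meters from a measured round-trip time in seconds."""
    return C * round_trip_s / 2.0

# A target at 2 m implies a round trip of only ~13.3 ns -- the kind of
# interval a ToF sensor's electronics must resolve per pixel:
t = 2 * 2.0 / C
print(f"{t * 1e9:.1f} ns round trip -> {tof_range_m(t):.2f} m")
```

Commercial ToF cameras typically sidestep measuring such intervals directly by modulating the IR source and measuring phase shift, but the underlying distance relation is the same.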

 

Does anyone know of other depth camera examples (OEM or otherwise) or pricing / availability?  Also, if you hear about any Kinect hacks, be sure to let us know!


 

Comments

I mentioned this company in your previous post about the topic:

http://www.omekinteractive.com

http://techcrunch.com/2010/05/06/omek-interactive-project-natal/

and I will reiterate:

"Is this the death of Sick/Hokuyo, SwissRanger, the 'willow garage viedere rig', and other high end depth sensor manufacturers?"

you do mention "suffered with slow tilting laser rangefinders...confirmed the start of the low-cost depth camera era"

I'm curious what Willow Garage thinks about this, why they use visible vs. IR light, and about the expense of a Videre/Hokuyo depth-perception rig - what is your opinion?

—Anonymous

@ Anonymous,

Ah yes, I opted not to include Omek Interactive.  As far as I can tell, their product is middleware (software) that processes data from any depth camera; they do not have their own hardware product.  Is this accurate?

As for the Willow Garage projected stereo solution, it is my understanding that the visible light is just for easy debugging on their prototype(s).  I believe they have ongoing efforts to make this IR instead, but that's complete hearsay.

—Travis Deyle

One quick clarification for Anonymous: the Willow Garage rig is not a Videre rig. It uses global shutter ethernet cameras.

As for what WG thinks about all of this, I can't speak for the company, but who wouldn't be excited by cheaper depth sensors? The perception libraries that WG is building will work with a wide variety of sensors. Greater availability of good sensors only makes those libraries better and expands the community that can use and contribute to them.

—Ken

Just checked the updates on this post and wanted to thank Travis and Ken for their responses.

Re: Omek interactive hardware

I don't know - their site does not provide much detail

It is quite exciting to consider that wg is developing an open source sensor similar to the primesense/natal product

 

—Anonymous

I should have re-read their site again before posting - it does say middleware, no mention of providing hardware

I assume they are working with the primesense hardware (?) - hopefully Microsoft or Omek provides an open api too

—Anonymous

Can someone recommend references or links that explain the physics of these IR projector-camera pairs?  I am particularly interested in Primesense.  Time-of-flight (TOF) seems pretty straightforward--although the time differences must be very small.  Does PrimeSense work by TOF?  Thanks. 

—Alex

It looks like the folks at Adafruit are offering a $1,000 prize for an open Kinect API.  I'll update this article's comments when I receive notice of success.

 

Microsoft Kinect API

 

I must admit, about three months ago I had conversations with some folks about doing this exact same thing (funding an open source API effort for the Kinect).  I passed for two reasons: 

  1.  Insufficient Hizook-related funds and time (I am trying to finish my PhD in the next 12 months).
  2.  Fear of DMCA litigation from Microsoft. That, and I'm no Bunnie Huang (the original XBox hacker) -- he's hardcore!

That being said, I'm pumped that Adafruit is stepping up!

—Travis Deyle

Let the age of inexpensive framerate 3D sensing begin!  The world's first mass-market, low-cost depth camera (or RGB-D camera if you prefer) started shipping yesterday in the form of Microsoft's Kinect, powered by PrimeSense. 

Excitement surrounding the new device is quite high, and probably would have been even without Microsoft's $500M marketing campaign.  Adafruit doubled their "open Kinect API" bounty to $2,000 to spite Microsoft's protestations, and news of the effort is permeating the Internet.  Teardowns by iFixIt have already commenced: 

 

Kinect RGB-D Camera Teardown

 

And people have been exploring the projected infrared patterns used to estimate 3D: 

Kinect's Projected Infrared Pattern

 

Mind you, Kinect is probably just the first mass-market depth camera to hit the market -- others will surely follow.  If you're interested in some alternative technologies and how they work, be sure to check out the old Hizook article on depth cameras (aka RGB-D cameras, ranging cameras, or ToF cameras). 

—Travis Deyle

A better night-vision video, courtesy of my labmate Marc Killpack:

—Travis Deyle

Has anyone tried this sensor outdoors?  I know it probably won't work very well, but even if it only works poorly it might still be good enough for certain applications.

—Jason

Owing to Hizook's previous coverage of depth camera variants and my prediction that low-cost devices (e.g. Kinect) mark a pivotal moment in robotics, NewScientist asked for my input when writing their latest article, "Inside the race to hack the Kinect."  Unfortunately, they didn't use any of my quotes or acknowledge me in the article... :-(  However, they did follow up with some of the experts I mentioned (e.g. Dieter Fox), and included a number of quotes from acquaintance Ken Conley. 

Oh well, I guess I forgive them... at least they recognize Kinect's importance to robotics, and they exposed me to this new RGB-D video from Dieter Fox's group:


 

—Travis Deyle

This is a bit late, but here's a short vid explaining the IR portion of the depth sensor: http://www.youtube.com/watch?v=uq9SEJxZiUg

—Scott Driscoll
