Structured light 3D scanning of projector pixels (stage 2: proof of concept)

We’ve been working with structured light to create a 3D scan of where every pixel of a projector lands on a field of ‘stuff’. We are now applying this method to trees, to create a new type of projection mapping / 3D media technology.

By determining the 3D location where every pixel from the projector lands on a tree, each usable pixel becomes a ‘voxel’, i.e. a pixel with a 3D position (note: unlike many voxel systems, we do not expect our voxel arrangement to be homogeneous / regular).
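As a minimal sketch of this idea (the names here are ours, not MapTools-SL’s API), a voxel is just a projector pixel paired with its scanned 3D position, stored sparsely since only some pixels land on usable surface:

```cpp
#include <map>
#include <utility>

// Illustrative sketch: each usable projector pixel (px, py) maps to the
// 3D point where it lands on the tree. Struct and function names are
// assumptions for illustration, not taken from MapTools-SL.
struct Voxel {
    int px, py;     // projector pixel coordinates
    float x, y, z;  // scanned 3D position
};

// The voxel set is sparse and irregular: only projector pixels that
// actually hit a scannable surface get an entry.
using VoxelMap = std::map<std::pair<int, int>, Voxel>;

inline void addVoxel(VoxelMap& voxels, int px, int py,
                     float x, float y, float z) {
    voxels[{px, py}] = Voxel{px, py, x, y, z};
}
```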

We therefore create a set of voxels to display 3D images in the real world, i.e. a volumetric 3D display system.

Using our structured light scanning system MapTools-SL, built in openFrameworks (discussed here), and our low-cost scanning setup involving a pair of webcams and a hacked tripod from Argos:

We create the following scan:

[Embedded video]
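Scans like this are typically built by projecting a sequence of Gray-code patterns and decoding, at each camera pixel, which projector column and row lit it. This is a sketch of the standard Gray-to-binary decode step, written for illustration rather than lifted from MapTools-SL:

```cpp
#include <cstdint>

// Each camera pixel observes one bit per projected pattern; the
// accumulated Gray code is converted to a binary index identifying
// which projector column (or row) illuminated that pixel.
inline uint32_t grayToBinary(uint32_t gray) {
    uint32_t binary = gray;
    while (gray >>= 1)
        binary ^= gray;  // fold each higher bit back down
    return binary;
}
```

Doing this for both horizontal and vertical pattern sets gives a projector (column, row) per camera pixel, which the stereo pair then triangulates into a 3D position.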

We then feed these known 3D points into a shader patch we wrote in VVVV, which outputs the relevant colour data to each voxel, creating graphical shapes within the set of voxels.

In this video, we can see a sphere travelling through the tree. The sphere can move parallel to the beam of the projector, which indicates that the system is correctly resolving depth.
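The shading idea can be sketched on the CPU (the real system does this inside the VVVV shader patch; the names below are ours): a voxel lights up when its 3D position falls inside the moving sphere.

```cpp
#include <cmath>

// Illustrative CPU version of the per-voxel shading test.
struct Vec3 { float x, y, z; };

inline float distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Returns brightness (0 or 1) for a voxel, given a sphere that can be
// animated anywhere in the scanned volume - including along the
// projector's beam axis, which is what demonstrates depth resolution.
inline float shadeVoxel(const Vec3& voxelPos,
                        const Vec3& sphereCentre, float radius) {
    return distance(voxelPos, sphereCentre) <= radius ? 1.0f : 0.0f;
}
```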

[Embedded video]

This second video demonstrates the effect from different angles, and also previews the video data being sent to the projector:

[Embedded video]

This is a proof of concept that the mechanism actually works.

However, there is still much to do: the video documentation is not clear, relevant content needs to be imagined/developed/tested, there is a lot of noise in the scan set, and only a small percentage of voxels are currently working.
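One plausible way to cut scan noise, purely as an illustration and not the project’s actual method, is to reject scanned points whose depth deviates too far from the mean:

```cpp
#include <cmath>
#include <vector>

// Hypothetical outlier filter: keep only depths within k standard
// deviations of the mean depth. A real filter would likely work on
// full 3D positions and local neighbourhoods.
std::vector<float> filterDepthOutliers(const std::vector<float>& depths,
                                       float k) {
    if (depths.empty()) return {};
    double mean = 0;
    for (float d : depths) mean += d;
    mean /= depths.size();
    double var = 0;
    for (float d : depths) var += (d - mean) * (d - mean);
    double stddev = std::sqrt(var / depths.size());
    std::vector<float> kept;
    for (float d : depths)
        if (std::fabs(d - mean) <= k * stddev)
            kept.push_back(d);
    return kept;
}
```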

If you would like to see this project in person, you can visit an ‘in progress’ prototype at FutureEverything, 11th–14th May 2011, in Manchester.

 


5 Responses to “Structured light 3D scanning of projector pixels (stage 2: proof of concept)”

  1. .tel layar » Augmented Reality: Kimchi and Chips, structured-light 3D projection mapping Says:

    […] http://www.kimchiandchips.com/blog/?p=583 […]

  2. Marek Says:

    That is incredible! I’d love to contribute some content!? Please get in touch!

  3. Kimchi and Chips' blog » Blog Archive » ofxPolyfit Says:

    […] implementation which lets you filter out good data from poor datasets. This is to be used with our tree projection experiments to filter out bad […]

  4. Petros Says:

    Great work! A few links/directions to some papers or sources describing the maths behind it, would be really helpful!!

  5. elliot Says:

    Hi Petros
    I haven’t written any papers, but the source is open at http://code.kimchiandchips.com
    Elliot
