This is a quick and scruffy video demonstrating a new method for calibrating projection mapping that I’ve been working on:
- Load a mesh of the scene inside VVVV and on the iPad app
- Create correspondences between points in world space (XYZ) and points in projector space (XY)
- Use the iPad to select the world space points, and to control a cursor to input the projector space points
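The clip doesn’t show the maths behind the calibration, but given a set of world-space (XYZ) to projector-space (XY) correspondences like the ones described above, one standard way to recover a projection is the Direct Linear Transform (DLT), which fits a 3×4 projection matrix from six or more point pairs. A minimal sketch in Python with NumPy — the function names are mine, not from the VVVV patch or the iPad app:

```python
import numpy as np

def estimate_projection_matrix(world_pts, proj_pts):
    """Fit a 3x4 projection matrix P mapping world XYZ to projector XY
    via the Direct Linear Transform. Needs >= 6 correspondences."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, proj_pts):
        # Each correspondence contributes two linear equations in the
        # 12 entries of P (from u = P1.X / P3.X and v = P2.X / P3.X).
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The null-space vector of A (last right singular vector) is P flattened.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, world_pt):
    """Apply P to a world point and dehomogenise to projector XY."""
    x = P @ np.append(world_pt, 1.0)
    return x[:2] / x[2]
```

In practice you would then decompose P into intrinsics and extrinsics and refine with a nonlinear solver over all correspondences (roughly what OpenCV’s `calibrateCamera` does), but the linear fit above is the core idea of turning point pairs into a projector calibration.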
We’re going to put together a clearer video about this as soon as we can (involving Mimi’s communication skills!), so hold tight if you can’t quite figure it out from this phone camera clip. We’ll also be going into more detail about all of this at our workshop at the Node 10 festival in mid-November, presented by myself and Chris Plant. Code and documentation will be made available around that time. I’m not entirely certain how to release the iPad/iPhone app yet.
The end aim of all this is that you can calibrate a projector for projection mapping very accurately within 5 or 10 minutes.
This method can also be extended for use with structured light with light sensors (either embedded or external). More on that shortly!