Using Padé projection mapping to calibrate Kinect’s 3D world with a projector.
- Using the Kinect camera, we can scan a 3D scene in real time.
- Using a video projector, we can project onto a 3D scene in real time.
Combining these, we re-project images onto the scanned geometry, creating a new technique for augmented reality.
Previous videos (for process)
The pipeline is:
- Capture Depth at CameraXY (OpenNI)
- Convert to a WorldXYZ image (per-pixel world coordinates)
- Padé transformation to create a WorldXYZ map in ProjectorXY
- Calculate NormalXYZ map in ProjectorXY
- Gaussian blur X of NormalXYZ in ProjectorXY
- Gaussian blur Y of NormalXYZ in ProjectorXY
- Light calculations on NormalXYZ, WorldXYZ maps in ProjectorXY
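The Padé re-projection step above can be sketched as evaluating a rational polynomial (ratio of two polynomials) that maps a world-space point to a projector pixel coordinate. This is a minimal first-order sketch; the function name and the coefficient values below are hypothetical, and in practice the coefficients are fitted during calibration from corresponding (WorldXYZ, ProjectorXY) pairs.

```python
def pade_project(x, y, z, num, den):
    """Evaluate a first-order Pade approximant for one projector axis:
    (a0 + a1*x + a2*y + a3*z) / (b0 + b1*x + b2*y + b3*z)."""
    n = num[0] + num[1] * x + num[2] * y + num[3] * z
    d = den[0] + den[1] * x + den[2] * y + den[3] * z
    return n / d

# Hypothetical fitted coefficients for the projector u axis.
# With these values the mapping reduces to u = 800*x / z,
# i.e. a pinhole-like projection as a special case of the rational map.
u_num = (0.0, 800.0, 0.0, 0.0)  # numerator:   800*x
u_den = (0.0, 0.0, 0.0, 1.0)    # denominator: z

print(pade_project(0.5, 0.0, 2.0, u_num, u_den))  # -> 200.0
```

A rational map like this can absorb lens and perspective effects that a plain polynomial fits poorly, which is why it is used here instead of a direct linear homography.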