Past talks and workshops
February 24th, 2015
Please find a selection of our past talks online at:
http://www.kimchiandchips.com/talks
and workshops at:
A general problem with projecting onto objects (i.e. non-planar scenes) is the correct assignment of pixel brightness to every surface section on the object.
Here’s a short presentation on a technical idea we are experimenting with to manage this scenario automatically (we suggest pausing the video on the longer slides):
The BAM render pass finds the total available brightness for each surface element. This lets us normalise how much brightness is actually presented to each surface element, by modulating the brightness that each projector sends to that element.
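As a rough illustration of the idea (not the actual implementation — the function and variable names here are hypothetical), each projector reports how much brightness it can deliver to each surface element, and we then rescale each projector's contribution so that every element receives the same total:

```python
def normalise_brightness(contributions, target=1.0):
    """Sketch of the BAM normalisation idea.
    contributions[p][s] = brightness projector p can deliver to
    surface element s. Returns, per projector, the brightness to
    actually send to each element so that every element receives
    `target` in total (elements no projector reaches stay dark)."""
    n_elements = len(contributions[0])
    # Total available brightness per surface element (the BAM value)
    totals = [sum(proj[s] for proj in contributions) for s in range(n_elements)]
    return [
        [target * proj[s] / totals[s] if totals[s] > 0 else 0.0
         for s in range(n_elements)]
        for proj in contributions
    ]

# Two projectors overlapping on 3 surface elements
contributions = [
    [1.0, 0.5, 0.0],  # projector A
    [1.0, 0.5, 0.8],  # projector B (only one reaching element 2)
]
send = normalise_brightness(contributions)
```

With these numbers, the overlap region (elements 0 and 1) is shared 50/50 between the projectors, while element 2 is served entirely by projector B.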
Features include:
Elliot
Adopting a theory for your art isn’t inevitable.
However, it sometimes enables you to give an answer to the “why” question.
(It also gives you an opportunity for superficial pretence, though…)
This book, by Wassily Kandinsky, is known as the best introduction to abstract art and composition.
Personally, though, I find it more interesting as a window into an artist’s personal, emotional and non-scientific way of observing, as a route to understanding the basic rules of visual form.
Today we’re finally releasing the full results of ScreenLab 0x01, which I was involved in curating at The University of Salford, MediaCityUK campus. For full details, check the article on The Creators Project.

Take a quick 3D Scan of some stuff with ReconstructMe then projection map onto it with CalibrateProjector.
Requires:
At the Kimchi and Chips HQ we’ve been working on a long-term intensive project, quietly building up a backlog of useful code projects that need documenting. Here’s a list of some of what’s been released since we last talked:
Updated video:
A simple camera for openFrameworks I threw together in transit. Name suggestions welcome!
ofxGrabCam is a camera for browsing your 3D scene. It ‘picks’ the xyz position under the cursor (using the depth buffer to get Z).
Rotations and zoom (left and right mouse drag) are then performed with respect to that xyz position.
Inspired by Google SketchUp.
P.S. this is probably not much use for point clouds or other sparse datasets where there’s nothing to ‘grab’ onto.
Available on github at http://github.com/elliotwoods/ofxGrabCam
You should also check out https://github.com/Flightphase/ofxGameCamera by the ever obvious jim.
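The ‘picking’ step boils down to unprojecting the mouse position, plus the depth value sampled under it, back into eye space. Here’s a self-contained sketch of that math — assuming a standard GL perspective projection and a [0, 1] depth buffer; in openFrameworks itself this goes through the GL matrices rather than a hand-rolled helper like this:

```python
import math

def unproject(mx, my, depth, width, height, fovy_deg, near, far):
    """Recover the view-space xyz under the cursor from the depth
    buffer, for a symmetric GL-style perspective projection — the
    same trick ofxGrabCam uses to find its orbit centre."""
    # Window coordinates -> normalised device coordinates
    x_ndc = 2.0 * mx / width - 1.0
    y_ndc = 1.0 - 2.0 * my / height   # window y runs downwards
    z_ndc = 2.0 * depth - 1.0          # depth buffer stores [0, 1]
    # Invert the non-linear depth mapping to get view-space distance
    z_view = 2.0 * near * far / (far + near - z_ndc * (far - near))
    # Invert the perspective divide for x and y
    tan_half = math.tan(math.radians(fovy_deg) / 2.0)
    aspect = width / height
    x_view = x_ndc * z_view * tan_half * aspect
    y_view = y_ndc * z_view * tan_half
    return x_view, y_view, z_view
```

For example, a pixel at the centre of a 640*480 viewport whose depth value corresponds to a surface 10 units away unprojects to (0, 0, 10).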

I’ve had this sitting in my suitcase for a while; finally, here are the results of trying it out. (I hope to reword some of this later, having just got off a 12-hour flight.)
This adapter belongs in the same cupboard as a Matrox DualHead2Go (part of their GXM product line), but is made by Zotac, who are new to this type of device. The idea: take one video output socket on your computer, plug in one of these, and get two outputs (in our case, for two projectors).
When you have everything connected, the two outputs appear to the computer as one large output (e.g. if you have two XGA projectors attached to the Zotac, the computer will see a 2048*768 video head attached to its output). This way, you can send separate signals to the two projectors (the left half goes to projector 1, the right half to projector 2).
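From an application's point of view, driving the two projectors is then just a matter of rendering into the left and right halves of that one wide display. A minimal sketch (the helper name is hypothetical, assuming a horizontal side-by-side split):

```python
def split_canvas(total_width, total_height, n_outputs=2):
    """The dual-head trick in code: the OS sees one wide display,
    so the app renders each projector's image into its own
    horizontal slice. Returns (x, y, w, h) viewport rectangles,
    left to right."""
    w = total_width // n_outputs
    return [(i * w, 0, w, total_height) for i in range(n_outputs)]

# Two XGA projectors behind a Zotac / Matrox style adapter
viewports = split_canvas(2048, 768)
```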
At the moment I prefer to use HDMI because:
Here’s an image of the test setup:

When you plug everything together, nothing happens: it’s only when you switch the projectors on that the computer starts to recognise that a display is attached. In fact, if you turn on one projector you get an XGA output at the computer; only when both projectors are turned on does the 2*XGA output appear in the PC settings.
This is in contrast to the Matrox which offers you the relevant resolutions directly on connection of the Matrox to the computer (the connection state of the projectors isn’t generally reported to the computer’s graphics card). This is advantageous for reliability as the state of the system from the PC’s point of view remains constant.
The behaviour of the Zotac would generally require you to turn on the projectors before turning on the PC when running an installation which starts on boot.
The specification states that 2*HD (3840*1080) is supported; however, this mode was not offered to me even though the projectors support it (I’ve tested and used 1920*1080 on these projectors before over a 15-metre signal run). Only the native resolution (XGA) was offered to the PC for the dual-head modes; for single-head modes more resolutions were offered.
The Zotac should also be able to support XGA at 120Hz (XGA@120Hz is roughly equivalent to HD@60Hz in terms of bandwidth), but this was not supported / offered.
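A quick back-of-envelope check of that bandwidth claim, counting raw pixels per second and ignoring blanking intervals:

```python
def pixel_rate(width, height, hz):
    """Raw pixel throughput per second, ignoring blanking."""
    return width * height * hz

xga_120 = pixel_rate(1024, 768, 120)      # XGA at 120 Hz
dual_xga_60 = pixel_rate(2048, 768, 60)   # the adapter's 2*XGA mode
hd_60 = pixel_rate(1920, 1080, 60)        # 1080p at 60 Hz

# XGA@120Hz needs exactly the same throughput as the dual-XGA@60Hz
# mode the adapter already drives, and somewhat less than 1080p@60Hz.
```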
Selecting XGA@120Hz resulted in the signal being passed through to one projector only; this is the same behaviour as an ordinary Mini-DP to DVI adapter used with this projector.
In fact, the adapter works perfectly well as a single HDMI signal adapter. This somewhat explains the strange initialisation: it seems to switch personalities between a single- and dual-head adapter. Since it costs only a few pounds more than an active Mini-DP>DVI>HDMI adapter chain, this makes it very attractive.
I like it!

Advantages over Matrox:

The Kinect device inputs a realtime 3D scan of a world scene.
A projector outputs a realtime 2D projection onto a 3D world scene.
Using OpenCV’s CalibrateCamera function, we are able to calculate the intrinsics (focal length, lens offset) and extrinsics (projector position, rotation) of a projector relative to the 3D scan from the Kinect.
We project a 3D virtual world scene onto a 3D real world scene by presuming that they are geometrically consistent (thanks to the Kinect) and knowing the intrinsics and extrinsics of the projector.
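The pinhole model that CalibrateCamera fits can be sketched in a few lines: extrinsics (rotation R, translation t) map a world point into the projector's coordinate frame, then the intrinsics (focal lengths fx, fy and principal point cx, cy) map it onto a pixel. The numbers below are purely illustrative, and lens distortion is omitted:

```python
def project_point(p_world, R, t, fx, fy, cx, cy):
    """Project a 3D world point to projector pixel coordinates
    using the pinhole model (distortion terms omitted)."""
    # World -> projector frame: p_cam = R @ p_world + t
    p_cam = [sum(R[i][j] * p_world[j] for j in range(3)) + t[i]
             for i in range(3)]
    x, y, z = p_cam
    # Perspective divide, then focal length and principal point
    return fx * x / z + cx, fy * y / z + cy

# With identity extrinsics, a point 2 m straight ahead lands
# exactly on the principal point of a 1024*768 projector
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.0]
u, v = project_point([0.0, 0.0, 2.0], R, t, fx=1000, fy=1000, cx=512, cy=384)
```

Calibration is just the inverse problem: given many known 3D/2D correspondences (here supplied by the Kinect scan), solve for R, t, fx, fy, cx, cy.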
We can think of this as either:
Patches and plugins are available at http://www.kimchiandchips.com/files/workshops/artandcode/ArtAndCode-CalibrateProjector.zip
(the old link went to github, but there seems to be some bugs with their download system at the moment https://github.com/elliotwoods/artandcode.Camera-and-projector-calibration/downloads)
Inside is a plugin which wraps EmguCV and OpenNI (you’ll need to have a recent version of OpenNI installed).
Also there are 2 patches:
CalibrateCamera
CalibrateProjector (WARNING : Renderer will open fullscreen on 'second' screen to right of main screen e.g. projector)
Workshop notes are available here
openFrameworks code here (will be adding / amending / breaking / creating in that repo. You might want to checkout the artandcode-end tag).
Dates: from now until January 30th, 2012.
Kimchi and Chips is looking for an assistant to join an installation project opening next January.
We’re looking for a creative person who will think through and work out the whole process with us, right from the first step. To be a bit more specific…