"Light Leaks" is an immersive installation built from a pile of mirror balls and a few projectors, created for CLICK Festival 2013 in Elsinore, Denmark.
The app is updated for openFrameworks 0.9.0. It depends on the following addons:
- https://github.com/kylemcdonald/ofxCv (in addons)
- https://github.com/kylemcdonald/ofxControlPanel (in addons)
Before doing any calibration, it's essential to measure the room and produce a `model.dae` file that includes all the geometry you want to project on. We usually build this file in SketchUp, using a laser rangefinder for measurements, then save with "export two-sided faces" enabled, load the model into MeshLab, and save it again. MeshLab changes the order of the axes and saves the geometry in a way that makes it easier to load into openFrameworks.
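The axis reordering MeshLab performs amounts to converting between a Z-up convention (SketchUp) and a Y-up convention (openFrameworks, like OpenGL). A minimal sketch of what that conversion does to each vertex (the function name is ours, and the exact mapping depends on your export settings):

```cpp
#include <array>
#include <vector>

// One mesh vertex.
using Vec3 = std::array<float, 3>;

// SketchUp treats Z as up, while openFrameworks (like OpenGL) treats Y as up.
// A common Z-up to Y-up conversion maps (x, y, z) to (x, z, -y).
// This is a sketch of the reordering; verify against your actual export.
std::vector<Vec3> zUpToYUp(const std::vector<Vec3>& vertices) {
    std::vector<Vec3> out;
    out.reserve(vertices.size());
    for (const Vec3& v : vertices) {
        out.push_back({v[0], v[2], -v[1]});
    }
    return out;
}
```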
- Capture multiple structured light calibration patterns using `EdsdkOsc`. Make sure the projector size and OSC hosts match your configuration in `settings.xml`. If the camera has image stabilization, make sure to turn it off.
- Place the resulting data in a folder called …, then run `ProCamScan`; this will generate ….
- Place your `referenceImage.jpg` of the well-lit space in …, and run `camamok` on your reference image. Hit the ` (back tick) key to generate the normals, then press the `saveXyzMap` button to save them.
- Place the resulting output in … and run `BuildXyzMap`. This will produce ….
- Copy the results of … .
- Each projector should be focused on the mirror balls, outputting native pixels (no scaling or keystoning) and framing the entire collection of mirror balls.
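The structured light patterns used here are Gray codes: each projected frame shows one bit of the Gray-coded pixel coordinate, so adjacent columns differ in exactly one frame and decoding errors stay local. A minimal sketch of pattern generation (function names are ours, not from the capture apps):

```cpp
#include <vector>

// Standard binary-reflected Gray code: adjacent values differ in one bit.
unsigned toGray(unsigned n) { return n ^ (n >> 1); }

// Generate one on/off pattern per bit for a projector that is `width`
// pixels wide: frames[b][x] is true when bit b of toGray(x) is set.
std::vector<std::vector<bool>> grayCodePatterns(unsigned width, unsigned bits) {
    std::vector<std::vector<bool>> frames(bits, std::vector<bool>(width));
    for (unsigned b = 0; b < bits; b++) {
        for (unsigned x = 0; x < width; x++) {
            frames[b][x] = (toGray(x) >> b) & 1;
        }
    }
    return frames;
}
```

In practice each pattern is projected along with its inverse, which is what lets the decoder threshold each pixel without knowing the scene's base brightness.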
CLICK Festival (2013)
- Mac Mini
- 1024x768 (native resolution) with TH2G
La Gaîté Lyrique (2014)
- 1280x1024 with TH2G on projectiondesign F32 sx+ (native 1400x1050) inset on the sensor
- When calibrating, create a network from the calibration computer that shares its Ethernet connection and therefore provides DHCP.
- BlackMagic grabber and SDI camera for interaction
Scopitone Festival (2015)
- Mac Pro (the bin)
- Mitsubishi UD8350U 6500 lumens, 1920x1200
- 2 projectors running through a TripleHead2Go, the last driven directly from the computer.
- Run the CalibrationCapture app from a laptop that is on the same network as the computer that is connected to the projectors.
- On the computer connected to the projectors, run the Calibration app.
- Plug the camera into the laptop and position it where you can see at least N points. The app will tell you whether the image is over- or underexposed.
- Transfer the images to the Calibration app; it will decode all the images and report how accurately it could reconstruct the environment.
- In the Calibration app, select control points until the model lines up.
- Start the LightLeaks app.
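Conceptually, decoding works per camera pixel: each Gray code pattern is captured along with its inverse, the sign of the brightness difference gives the bit, and the magnitude gives a confidence value. A sketch of that idea (not the actual Calibration app code; the names and the minimum-difference confidence heuristic are our assumptions):

```cpp
#include <algorithm>
#include <climits>
#include <cstdlib>
#include <utility>
#include <vector>

// Decode one camera pixel from structured light captures. on[b] and off[b]
// are the pixel's brightness under pattern b and its inverse. Returns the
// decoded projector coordinate and a confidence value: the smallest
// brightness difference seen across all bits (low means unreliable).
std::pair<unsigned, int> decodePixel(const std::vector<int>& on,
                                     const std::vector<int>& off) {
    unsigned gray = 0;
    int confidence = INT_MAX;
    for (size_t b = 0; b < on.size(); b++) {
        int diff = on[b] - off[b];
        if (diff > 0) gray |= 1u << b;
        confidence = std::min(confidence, std::abs(diff));
    }
    // Convert the Gray code back to a binary coordinate.
    unsigned n = 0;
    for (unsigned g = gray; g; g >>= 1) n ^= g;
    return {n, confidence};
}
```

Pixels whose confidence falls below a threshold (mirror ball edges, shadowed geometry) can then be masked out before the result is used.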
- Auto-calibrate from video taken with an iPhone, using BMC to encode the Gray code signal.
- Auto-create a point mesh from the images so that a 3D model, and therefore `camamok`, is not needed.
- Change `ProCamScan` into a CLI that is automatically triggered by `ProCamSample`, for live feedback on progress (highlight/hide points that have good confidence).