For this year's StedSans I ended up creating a machine "x-raying" a building as the audience walked by, of course sporting a projector and a PC. (I know everyone else would use a Mac. And I don't care.) The building was about 50 meters long, so I had to rig the contraption onto a trolley, which I pushed manually down the sidewalk opposite the building.
As the trolley moved along, videos of bizarre situations would be displayed on the wall of the building.
I needed some way to sync the projected image with my movement so that all the action would appear to stand still on the wall while the viewport was moving. I was planning to connect a wheel to an optical fork hooked up to an Arduino, which would calculate the movement and send it to the PC through the USB serial port. It ended up A LOT simpler: a large Lego wheel connected straight to the scroll wheel of a mouse. Easy to make, easy to interface.
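The core of the trick is just converting accumulated scroll-wheel ticks into a position along the wall. Here's a minimal Python sketch of that conversion (my actual patch was in Processing); the wheel circumference and ticks-per-revolution values are made-up placeholders, not measurements from the real rig:

```python
WHEEL_CIRCUMFERENCE_M = 0.60   # hypothetical Lego wheel circumference, in metres
TICKS_PER_REVOLUTION = 24      # hypothetical scroll ticks per full wheel turn

def ticks_to_position(total_ticks):
    """Convert accumulated scroll-wheel ticks into trolley position in metres."""
    return total_ticks * WHEEL_CIRCUMFERENCE_M / TICKS_PER_REVOLUTION

# 480 ticks = 20 full turns of the wheel = 12 metres travelled
print(ticks_to_position(480))  # 12.0
```

Since the mouse just shows up as a normal input device, no custom driver or serial protocol was needed, which is why this beat the Arduino plan.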
To display the more than 15 videos in the right places, I made a patch in Processing able to start, stop and wall-sync up to 30 videos simultaneously. To keep frame rates up, I started and stopped the videos as they entered and left the viewport. Most of the videos had no sound, so suitable clips were played from Reason, controlled from the Processing patch through MIDI.
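The start/stop logic boils down to an overlap test: a clip should be playing only while its placement on the wall intersects the projected viewport. A small Python sketch of that idea (the real patch was Processing; the viewport width and clip placements here are invented for illustration):

```python
VIEWPORT_WIDTH_M = 5.0  # hypothetical width of the projected image on the wall

def active_clips(viewport_left, clips):
    """Return indices of clips whose wall placement overlaps the viewport.

    `clips` is a list of (wall_start_m, wall_end_m) placements along the wall.
    """
    viewport_right = viewport_left + VIEWPORT_WIDTH_M
    return [i for i, (start, end) in enumerate(clips)
            if start < viewport_right and end > viewport_left]

clips = [(0, 4), (3, 8), (20, 25)]   # three made-up clip placements
print(active_clips(2.0, clips))      # [0, 1] -- the third clip stays stopped
```

Each frame, clips entering this set get started and clips leaving it get stopped, so the machine never decodes more video than is actually visible.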
The plan worked out quite OK, but far from perfectly. The problem with the wheel sync is that it does not account for changes in projector angle or bumps in the sidewalk. My trolley also got far too heavy and was hard to control. I ended up making cue marks with chalk on the sidewalk; I stopped the trolley at each mark and pushed a key to resync the viewport position to the next cue. Hard to explain, really, but it did help a lot.
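The chalk-mark trick is just drift correction: wheel-derived position accumulates error, and a key press at a known mark discards it by snapping to the next cue in sequence. A minimal Python sketch, with the cue positions made up for illustration:

```python
class CueSync:
    """Track wall position from wheel movement; a key press snaps to the next chalk cue."""

    def __init__(self, cues):
        self.cues = cues      # cue-mark positions along the wall, in metres
        self.next_cue = 0     # index of the next chalk mark
        self.position = 0.0

    def move(self, delta_m):
        # Wheel-derived movement; drifts over bumps and angle changes.
        self.position += delta_m

    def key_pressed(self):
        # Trolley is stopped at the chalk mark, so the accumulated drift is discarded.
        self.position = self.cues[self.next_cue]
        self.next_cue = min(self.next_cue + 1, len(self.cues) - 1)

s = CueSync([0.0, 10.0, 20.0])
s.key_pressed()        # at the first mark: position = 0.0
s.move(10.4)           # wheel slightly overestimates the 10 m stretch
s.key_pressed()        # at the second mark: drift discarded
print(s.position)      # 10.0
```

Between cues the error can only grow as large as one stretch of sidewalk, which was good enough for the audience not to notice.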
Footage was taken of the whole performance, but I doubt I'll ever get around to mixing it down to a decent YouTube video to show around. Andy, I need you! Too many projects and too little time!
And... eh... I should write more details, but I'll just publish it now. Cheers!