We installed three video projectors facing the audience at a small venue, and made a patch in Processing simulating moving heads, strobes and other effects we came up with.
EDIT! I HAVE UPDATED THE SOURCE CODE TO BE INDEPENDENT OF THE DREADED DMX LIBRARY!
Click here for zip with the new source code (for Processing) tested on Mac and PC.
You will need to install Processing (http://processing.org/) and the RWMidi library (http://ruinwesen.com/blog?id=95).
Basic instructions are included in the header of lightTable.pde.
For this year's StedSans I ended up creating a machine “x-raying” a building as the audience walked by, of course sporting a projector and a PC. (I know everyone else would use a Mac. And I don't care.) The building was about 50 meters long, so I had to rig the contraption onto a trolley, which I pushed manually down the sidewalk opposite the building.
As the trolley moved along, videos of bizarre situations were to be displayed on the wall of the building.
I needed some way to sync the projected image with my movement, so all the action would appear to stand still on the wall while the viewport was moving. I was planning to connect a wheel to an optical fork hooked up to an Arduino, which would calculate the movement and send it to the PC through the USB serial port. It ended up A LOT simpler: a large Lego wheel connected straight to the scroll wheel of a mouse. Easy to make, easy to interface.
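The wheel-to-viewport mapping is simple proportionality between scroll ticks and distance travelled. A minimal sketch in plain Java, with hypothetical values for the Lego wheel circumference and ticks per revolution (a real rig would be calibrated against a measured distance):

```java
// Accumulates mouse-wheel ticks into physical trolley travel.
// Both constants below are hypothetical; calibrate them by pushing
// the trolley a known distance and counting the ticks.
public class WheelOdometer {
    static final double WHEEL_CIRCUMFERENCE_MM = 260.0; // assumed Lego wheel size
    static final int TICKS_PER_REVOLUTION = 24;         // assumed scroll encoder steps

    private long ticks = 0;

    // Called once per scroll event, e.g. from Processing's mouseWheel().
    public void onWheelTick(int count) {
        ticks += count;
    }

    // Distance the trolley has travelled, in millimeters.
    public double travelMm() {
        return ticks * WHEEL_CIRCUMFERENCE_MM / TICKS_PER_REVOLUTION;
    }

    public static void main(String[] args) {
        WheelOdometer odo = new WheelOdometer();
        for (int i = 0; i < 48; i++) odo.onWheelTick(1); // two full revolutions
        System.out.println(odo.travelMm()); // 520.0
    }
}
```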
Photo: Dag Jensen
To display the more than 15 videos in the right places I made a patch in Processing able to start, stop and wall-sync up to 30 videos simultaneously. To keep frame rates up I started and stopped the videos as they entered and left the viewport. Most of the videos had no sound, so suitable clips were played from Reason, controlled from the Processing patch through MIDI.
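The start/stop logic boils down to an overlap test between each clip's placement on the wall and the moving viewport. A sketch of that culling step in plain Java (the clip names and positions are made up; the real patch worked on Processing Movie objects):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Decides which wall-synced clips should be playing for a given
// viewport position, so clips outside the view can be stopped to
// keep frame rates up. All positions are hypothetical, in millimeters.
public class ViewportCuller {
    static class Clip {
        final String name;
        final double startMm, widthMm; // placement along the wall
        Clip(String name, double startMm, double widthMm) {
            this.name = name;
            this.startMm = startMm;
            this.widthMm = widthMm;
        }
    }

    // Returns the names of clips overlapping [viewLeft, viewLeft + viewWidth).
    static List<String> visibleClips(List<Clip> clips, double viewLeft, double viewWidth) {
        double viewRight = viewLeft + viewWidth;
        List<String> out = new ArrayList<>();
        for (Clip c : clips) {
            if (c.startMm < viewRight && c.startMm + c.widthMm > viewLeft) {
                out.add(c.name); // clip is (partly) inside the viewport: keep it playing
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<Clip> clips = Arrays.asList(
            new Clip("dancer", 0, 2000),
            new Clip("window", 3000, 2000),
            new Clip("cat", 10000, 1500));
        // Viewport 2 m wide, left edge 1.5 m along the wall.
        System.out.println(visibleClips(clips, 1500, 2000)); // [dancer, window]
    }
}
```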
The plan worked out quite OK, but far from perfect. The problem with the wheel sync is that it does not account for changes in projector angle or bumps in the sidewalk. My trolley also got far too heavy and was hard to control. I ended up making cue marks with chalk on the sidewalk: where I stopped the trolley I pushed a key to resync the viewport position to the next cue. Hard to explain, really, but it did help a lot.
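The chalk-mark trick amounts to stepping through a list of pre-measured positions: each key press snaps the drifting wheel estimate to the next cue. A sketch, with hypothetical cue positions:

```java
// Steps through pre-measured cue positions along the wall; each key
// press resyncs the (drifting) viewport estimate to the next chalk mark.
// The cue positions are hypothetical values in millimeters.
public class CueTrack {
    private final double[] cuesMm;
    private int next = 0;

    CueTrack(double[] cuesMm) { this.cuesMm = cuesMm; }

    // Called on each key press; clamps at the last cue.
    double resync() {
        double pos = cuesMm[Math.min(next, cuesMm.length - 1)];
        next++;
        return pos;
    }

    public static void main(String[] args) {
        CueTrack track = new CueTrack(new double[] {0, 5000, 10000});
        System.out.println(track.resync()); // 0.0
        System.out.println(track.resync()); // 5000.0
        System.out.println(track.resync()); // 10000.0
    }
}
```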
The whole performance was filmed, but I doubt I'll ever get around to mixing it down to a decent Youtube video to show around. Andy, I need you! Too many projects and too little time!
And… eh… I should write more details, but I'll just publish it now. Cheers!
After too many changes in the plot of the Stedsans Drammen scene, and after seeing the area with my own eyes, I figured the Lamp Forrest wouldn't thrive there. Instead we have agreed to do a virtual x-ray scan of one of the buildings, which has a nice flat, bright surface for projection.
A bit like this: http://www.chrisoshea.org/out-of-bounds but with the projector moving on a trolley to cover the whole building.
The idea is very much inspired by Luca, the Italian artist contributing to Stedsans Porsgrunn and other GF events.
The time is short and the road is long and winding. Godspeed myself.
(by the way, the new “white” layout is awful… I want Chuck back! )
My two years of infrared investigation were rendered obsolete with the arrival of the Kinect. And I think that’s ok, because this toy is so much more fun. The games I have tried are really lame, but hacking into its data streams opens up new ways to do interactivity.
So we are entering a stage of technological artistic development where the object can easily engage in a cybernetic relationship with the observer. This feedback process opens up a series of real possibilities, in contrast to the optical illusions from the history of painting.
REMASTERPIECES is an attempt to add something to painting in the 21st century. Using the Kinect, we set up a space where the spectator (usually a reflexive figure standing still) is turned into an actor. With a broad range of small programs we dissect the body of painting and recreate it as vulgar entertainment.
From scribbling on cave walls to aid our hunting, illustrating the lives of our saints, discovering the properties of space and perspective, creating tools that copy reality into images, to the breakdown of the exterior and the outer manifestations of inner life… the time has come to reanimate the figure of the distant bourgeois observer.
What would happen if we treated a painting as a game and decoded its message into a playful one?
Hacking a PS3 and Kinect together for the ultimate FrankenConsole™
Our loosely knit affinity group Demodrama has just launched its online platform, and during the research I found the video above. I predict this will happen again and again in the future: small low/no-budget groups reinventing and developing work that was done 10 years earlier by big institutions with a lot of money. As cameras and projectors become cheaper, computers more powerful and open source more widely used, this type of technology will be available to everyone. We still need a few more rounds on the code before it's ready for release, though.
What have we done? What do you need? First of all, a hacked PS3 Eye camera; they are cheap and fast. Then tracking software: we use Open CCV 1.3 (formerly tBeta, originally made for multi-touch screens), which communicates with Processing through TUIO. To handle the complex setup with scenes, backgrounds, masks and sounds we use the Eclipse integrated development environment.
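One detail worth knowing when wiring this up: TUIO reports cursor positions normalized to the range [0, 1], so they have to be scaled to the projection surface. A minimal sketch of that mapping in plain Java (the stage resolution below is a made-up example):

```java
// Maps TUIO's normalized [0, 1] cursor coordinates to pixels on the
// projection surface. The stage resolution is a hypothetical example.
public class TuioToStage {
    static final int STAGE_W = 1280, STAGE_H = 720; // assumed projector resolution

    static int[] toPixels(float nx, float ny) {
        return new int[] { Math.round(nx * STAGE_W), Math.round(ny * STAGE_H) };
    }

    public static void main(String[] args) {
        int[] p = toPixels(0.5f, 0.5f);
        System.out.println(p[0] + ", " + p[1]); // 640, 360
    }
}
```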
We are continuing with our digital masks. This is from the presentation we did in December. Next up is Encuentro AVLAB Helloworld!: Augmented Stage III.
Most of the work revolves around making the technical solutions performance-friendly, and along the way we stumble into problems: the resolution of the camera starts to affect calculations at long distances, we need clean code that keeps the response time fast enough for the movement, and other prototype stuff.
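The long-distance resolution problem can be estimated with basic trigonometry: the width of wall covered by one camera pixel grows linearly with distance. A rough sketch, assuming a PS3 Eye-like camera (640 pixels wide, roughly 75° horizontal field of view; treat both values as approximations):

```java
// Estimates how much real-world width one camera pixel covers at a
// given distance. The FOV and pixel count approximate a PS3 Eye at
// its wide lens setting; both are assumptions, not measured values.
public class PixelFootprint {
    static double mmPerPixel(double distanceMm, double hFovDeg, int pixelsWide) {
        double halfWidthMm = distanceMm * Math.tan(Math.toRadians(hFovDeg / 2.0));
        return 2.0 * halfWidthMm / pixelsWide;
    }

    public static void main(String[] args) {
        // At 5 m the footprint is roughly 12 mm per pixel, so small
        // movements start to vanish into single-pixel noise.
        System.out.printf("%.1f mm/px%n", mmPerPixel(5000, 75, 640));
    }
}
```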