My two years of infrared investigation were rendered obsolete with the arrival of the Kinect. And I think that’s ok, because this toy is so much more fun. The games I have tried are really lame, but hacking into its data streams opens up new ways to do interactivity.

So we are entering a stage of technological artistic development where the object can easily engage in a cybernetic relationship with the observer. This feedback process opens up a series of real possibilities, in contrast to the optical illusions of the history of painting.

REMASTERPIECES is an attempt to add something to painting in the 21st century. Using the Kinect, we set up a space where the spectator (usually a reflexive figure standing still) is turned into an actor. With a broad range of small programs we dissect the body of painting and recreate it as vulgar entertainment.

From scribbling on cave walls to aid our hunting, illustrating the lives of our saints, discovering the properties of space and perspective, creating tools that copy reality into images, to the breakdown of the exterior and the outer manifestations of inner life…the time has come to reanimate the figure of the distant bourgeois observer.

What would happen if we treated a painting as a game and decoded its message into a playful one?

Hacking a PS3 and Kinect together for the ultimate FrankenConsole™


There is an old tobacco factory in Lavapies. It should have been a new and shiny center for visual arts, but there is a crisis and the Cultural Department has no money. In the meantime the neighbors and artists have started moving in and organizing activities.

I have received another grant and this time I bought Max 5. This means I could start working on my first patch for the EEG machine. It will be a straight brain-wave-to-sound-wave conversion, simply multiplying the frequencies up into the audible range, much like a theremin. First of all, the IBVA is fast, very fast at low resolution, but with only 3 sensors it’s a bit limited in its sampling area. I guess that to really work with the P300 signals you need a full EEG helmet to capture all parts of the scalp. An Argentinian friend reminded me of the difference between this and Magnetic Resonance Imaging: MRI looks inside the brain, while EEG picks up signals coming out of the cranium, which can in a way be played like a theremin. I suspect that working with the IBVA live on a performer can quickly produce nothing but noise, with its cable running from the sensors to the Bluetooth sender. There are other systems, cheaper and more suited to a live setting; we hacked both the Jedi Trainer and the Neurosky Mindset a while back when I helped develop this project:

The Mexican Standoff removes the link between a person’s thoughts and actions; people are directly thinking what they are doing, creating a hyper-reality. Two people use their minds, via EEG headsets, to fire their guns in a Mexican standoff realized in an ultra-violent first-person shooter (FPS). To trigger the EEG interface the person needs to relax. When this is detected, the avatar begins to shoot at the other.
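The relaxation trigger can be sketched in a few lines. This is only an illustration, not the project's actual code: the headset is assumed to deliver a relaxation value between 0 and 100 (the Neurosky devices expose a "Meditation" reading in that range), and the threshold and debounce count here are hypothetical.

```python
RELAX_THRESHOLD = 70   # hypothetical cutoff on the headset's 0-100 relaxation value
HOLD_SAMPLES = 3       # require a few consecutive calm readings to avoid jitter

def fire_decision(readings, threshold=RELAX_THRESHOLD, hold=HOLD_SAMPLES):
    """Return True once `hold` consecutive readings stay at or above threshold."""
    run = 0
    for value in readings:
        run = run + 1 if value >= threshold else 0
        if run >= hold:
            return True
    return False

# Spiky readings never settle; a sustained calm streak pulls the trigger.
fire_decision([80, 40, 90, 50])   # False
fire_decision([60, 75, 82, 91])   # True
```

Requiring a short streak instead of a single reading is the design choice that makes the standoff playable: one stray calm sample should not fire the gun.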

So here I am, hooking myself up before the morning coffee. Looking, listening and feeling as it enters me. Often with a headache as the result; staring into my brain, doing nothing but mind flexing, is tiring. These are the first steps in a larger project, the old impress – express. A video will form a base upon which the patients’ head information is recorded. Once stored and analyzed, these data are later used in Max to control machines, sound and video in the performance space. When I discovered that the EEG laboratory at Gaustad was financed by the CIA’s MKULTRA, it was the last piece of the puzzle I needed to turn this into a performance project.
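The theremin-like idea of the patch, multiplying EEG frequencies straight into the audible range, can be sketched outside Max. This is a minimal illustration with an assumed multiplier of 50; the real patch, sample rate, and scaling would differ.

```python
import math

AUDIO_RATE = 44100   # output sample rate in Hz
MULTIPLIER = 50      # hypothetical factor lifting EEG bands into the audible range

def eeg_to_audible(eeg_freq_hz, multiplier=MULTIPLIER):
    """Map an EEG rhythm (roughly 1-40 Hz) to an audible pitch."""
    return eeg_freq_hz * multiplier

def sine_block(freq_hz, n_samples, rate=AUDIO_RATE):
    """Render one block of sine samples at the given pitch, theremin-style."""
    return [math.sin(2 * math.pi * freq_hz * n / rate) for n in range(n_samples)]

# A 10 Hz alpha rhythm becomes a 500 Hz tone with a x50 multiplier.
pitch = eeg_to_audible(10)        # 500
samples = sine_block(pitch, 1024)
```

A fresh block would be rendered for every new frequency estimate coming off the sensors, so the pitch glides with the brain wave just as a theremin glides with the hand.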

HelloWorld! Demodrama – Faces


Again at Medialab-Prado for a workshop; last time we held a two-day build-your-own-light-harp session. This time I joined as a collaborator on Demodrama, one of the HelloWorld! projects. During their presentation, the siblings who started the project said that one of their original ideas was to use an EEG machine as a possible controller for the mask. A perfect project to test my machine on. I really enjoyed the talk that Zachary Lieberman gave about his search for jaw-dropping moments when making interactive pieces.

Demodrama uses the same infrared camera trick we used with MouseMan, but the software is tBeta. It comes with libraries that track, calibrate, and reproduce image and video, but it’s a bit too much C++ for me. The system of projecting green dots into the space, which you later mark for the camera, is the best part. Thinking back, I really don’t know how we managed to calibrate MouseMan that well without something like this.
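The dot-marking calibration boils down to fitting a mapping from camera pixels to projector coordinates. As a rough sketch (not tBeta's actual method, which handles full 2D warping), here is a least-squares linear fit for a single axis, with made-up dot positions:

```python
def fit_axis(observed, projected):
    """Least-squares fit of projected = a * observed + b along one axis."""
    n = len(observed)
    mean_o = sum(observed) / n
    mean_p = sum(projected) / n
    var = sum((o - mean_o) ** 2 for o in observed)
    cov = sum((o - mean_o) * (p - mean_p) for o, p in zip(observed, projected))
    a = cov / var
    b = mean_p - a * mean_o
    return a, b

# Hypothetical calibration: the camera sees the green dots at these x pixels,
# and the projector drew them at these x positions.
cam_x = [100, 300, 500]
proj_x = [0, 400, 800]
a, b = fit_axis(cam_x, proj_x)   # a = 2.0, b = -200.0

def camera_to_screen_x(x):
    """Convert a tracked camera x coordinate into projector space."""
    return a * x + b
```

Doing the same for the y axis gives a full (axis-aligned) mapping; marking more dots simply gives the fit more data, which is why the projected-dot routine beats calibrating by eye.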

It was originally conceived as a work with digital masks, but in my opinion it became more of a face-as-screen. No longer a tribal/animal mask with the head working as an extension of the body, but a free-floating white wall/black hole machine. My initial scepticism about the lack of physicality (it is only a surface loosely connected to the head) was put to shame when we saw the thing working. This will be magical.

The Player


A colleague of mine has received a grant from Centro Parraga in Murcia to do an interactive research project. Loosely based on two definitions of the word player, both a theatrical performer and a person who participates in a game, it stages the artist as the protagonist in a strange computer game. Spectators and the public would turn into users and players. Ideally.

It became clear very early that two art nerds can’t make a computer game, complete with an avatar, 3D surroundings and interaction between the two, in a two-month project. We settled for making it in Flash, hoping that we could control it from within Max. You can do this, of course, if you know ActionScript. And though I really like the name, I don’t have the energy to learn a new programming language right now.

So we stepped back and decided to use some old code. Old code is a drag, but it allowed us to play video loops by pressing keys on the keyboard. With Xpadder, a small program that converts gamepad signals into keystrokes, we get the desired interaction. To generate sound and samples I use Max. This process is backwARTs.
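The Xpadder trick is just a lookup table from gamepad buttons to keystrokes. A toy sketch of the idea (the button numbers and key bindings here are invented, not our actual configuration):

```python
# Hypothetical button-to-key table in the spirit of Xpadder:
# each gamepad button is bound to the keyboard key that plays one video loop.
BUTTON_TO_KEY = {
    0: "a",   # loop 1
    1: "s",   # loop 2
    2: "d",   # loop 3
    3: "f",   # loop 4
}

def translate(button_events, table=BUTTON_TO_KEY):
    """Turn a stream of pressed button ids into keystrokes, dropping unmapped ones."""
    return [table[b] for b in button_events if b in table]

translate([0, 3, 7, 1])   # ['a', 'f', 's']
```

Because the old code only listens for key presses, this translation layer is all that is needed to make a gamepad drive it; unmapped buttons (like 7 above) are simply ignored.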