There is an old tobacco factory in Lavapiés. It should have been a new and shiny centre for the visual arts, but there is a crisis and the Cultural Department has no money. In the meantime the neighbours and artists have started moving in and organizing activities.
I have received another grant, and this time I bought Max 5. This means I could start working on my first patch for the EEG machine. It will be a straight brain-wave-to-sound-wave mapping, simply multiplying the frequencies up into the audible range, much like a theremin. First of all, the IBVA is fast, very fast at low resolution, but with only three sensors it is a bit limited in its sampling area. I guess that to really work with P300 signals you need a full EEG helmet to capture all the parts. An Argentinian friend reminded me of the difference between this and magnetic resonance imaging: where MRI looks inside the brain, EEG picks up signals coming out of the cranium, which can in a way be played like a theremin. I suspect that working with the IBVA live on a performer can quickly generate nothing but noise, with its cable running from the sensors to the Bluetooth sender. There are other systems, cheaper and more suited to a live setting; we hacked both the Jedi Trainer and the NeuroSky MindSet a while back when I helped develop this project:
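The mapping idea can be sketched outside Max as well. This is a minimal illustration, not the actual patch: the function names and the scaling factor are my own assumptions, chosen only to show how a slow EEG frequency becomes an audible pitch by multiplication.

```python
import math

# Sketch of the "straight brain wave to sound wave" idea:
# multiply an EEG band frequency up into the audible range, theremin-style.
# SCALE is an assumed value, not taken from the actual Max patch.
SCALE = 40  # e.g. a 10 Hz alpha wave becomes a 400 Hz tone

def eeg_to_audible(eeg_freq_hz):
    """Map a raw EEG frequency (roughly 1-40 Hz) to an audible pitch."""
    return eeg_freq_hz * SCALE

def sine_samples(freq_hz, duration_s=0.01, sample_rate=44100):
    """Render a short sine tone at the given frequency."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate) for i in range(n)]

alpha = 10.0                   # a typical alpha-band frequency in Hz
tone = eeg_to_audible(alpha)   # 400.0 Hz, comfortably audible
samples = sine_samples(tone)
```

In the patch itself this would just be a multiply object feeding an oscillator; the point is that no spectral analysis is needed for this first version.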
The Mexican Standoff removes the link between a person's thoughts and actions; people are directly thinking what they are doing, creating a hyper-reality. Two people use their minds, via EEG headsets, to fire their guns in a Mexican standoff realized in an ultra-violent first-person shooter (FPS). To trigger the EEG interface a person needs to relax. When relaxation is detected, their avatar begins to shoot the other.
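The trigger logic described above can be sketched as follows. This is a hypothetical reconstruction, not the project's actual code: the threshold value, the window size, and the idea of requiring several consecutive readings (to keep noise spikes from firing the gun) are all my assumptions.

```python
# Hypothetical trigger logic: the headset reports a relaxation level
# between 0 and 1, and the avatar fires only once that level has stayed
# above a threshold for several consecutive readings.
THRESHOLD = 0.7   # assumed relaxation level needed to fire
WINDOW = 3        # assumed number of consecutive readings required

def fire_decision(readings, threshold=THRESHOLD, window=WINDOW):
    """Return True once `window` consecutive readings reach the threshold."""
    streak = 0
    for r in readings:
        streak = streak + 1 if r >= threshold else 0
        if streak >= window:
            return True
    return False

calm = [0.5, 0.72, 0.75, 0.8, 0.9]    # player manages to relax: fires
tense = [0.9, 0.3, 0.8, 0.2, 0.95]    # spikes, never sustained: no shot
```

Requiring a sustained streak rather than a single reading is one simple way to keep the raw, jittery headset values from triggering by accident.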
So here I am, hooking myself up before the morning coffee. Looking, listening and feeling as it enters me. Often a headache is the result; staring into my brain doing nothing but mind flexing is tiring. These are the first steps in a larger project; it is the old impress – express. A video will form a base upon which the patients have their head information recorded. Once stored and analyzed, these data are later used in Max to control machines, sound and video in the performance space. When I discovered that the EEG laboratory at Gaustad had been financed by the CIA's MKULTRA, it was the last piece of the puzzle I needed to make this into a performance project.
Again at Medialab-Prado for a workshop; last time we held a two-day build-your-own-light-harp session. This time I joined as a collaborator on Demodrama, one of the HelloWorld! projects. During their presentation, the siblings who started the project said that one of their original ideas was to use an EEG machine as a possible controller for the mask. A perfect project for testing my machine. I really enjoyed the talk Zachary Lieberman gave about his search for jaw-dropping moments when making interactive pieces.
Demodrama uses the same infrared camera trick as we used with MouseMan, but the software is tBeta. It comes with libraries that track, calibrate, and reproduce image and video, but it is a bit too much C++ for me. The system of projecting green dots in the space, which you later mark for the camera, is the best part. Thinking back, I really don't know how we managed to calibrate MouseMan that well without something like this.
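The green-dot idea boils down to fitting a map from camera coordinates to projection coordinates using the marked points. A minimal sketch, assuming a simple linear fit per axis from two corner dots; tBeta's real calibration uses a whole grid of dots and is more elaborate, and all the pixel values here are invented for illustration.

```python
# Sketch of the calibration idea: project known dots, mark where they
# appear in the camera image, then map any detected camera point into
# projection space. This assumes an axis-aligned linear fit from two
# corner dots; the coordinates below are made-up example values.

def make_axis_map(cam_a, cam_b, screen_a, screen_b):
    """Build a linear map for one axis from two calibration samples."""
    scale = (screen_b - screen_a) / (cam_b - cam_a)
    return lambda cam: screen_a + (cam - cam_a) * scale

# Two projected dots, top-left and bottom-right, as seen by the camera
map_x = make_axis_map(cam_a=80, cam_b=560, screen_a=0, screen_b=1024)
map_y = make_axis_map(cam_a=60, cam_b=420, screen_a=0, screen_b=768)

# A blob detected at camera pixel (320, 240) lands mid-screen
screen_point = (map_x(320), map_y(240))
```

With a grid of dots instead of two corners, the same idea extends to per-cell warps that also absorb lens distortion, which is presumably why the dot grid made MouseMan-style guesswork unnecessary.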
It was originally conceived as a work with digital masks, but in my opinion it became more of a face-as-screen. No longer a tribal/animal mask with the head working as an extension of the body, but a free-floating white wall/black hole machine. My initial scepticism about the lack of physicality (it is only a surface loosely connected to the head) was put to shame when we saw the thing working. This will be magical.
The time had come to try the EEG-machine for real. The recipe:
– raw brain data
– a new powerful Macbook straight out of the box
– a new IBVA machine
– installing the software for the first time
– setting up the electrodes and headband
This part was a one-hour rush. Then the problems started:
– the Bluetooth connection; as a first-time user this took an annoyingly long time
– understanding both the interface and the data while presenting them at the same time
We got a grant. We bought a new toy. But the story does not start there. It started in a flea market almost three years ago, with a vintage hair dryer that begged to be bought.
Later, during the research for our robot Maxi Wanzl, we stumbled upon the openEEG project. It offered EEG for the rest of us, but you would have to build it yourself. The technical department became sceptical: not another D.I.Y. prototype that required months of work.
So one year ago, when Cycling '74 published the news about an EEG machine that came with its own Max/MSP patch at a reasonable price, we rejoiced. Finally, the possibility of working with an interface device that goes both ways. With this we can make the brain control lights, music, video, etc. Or we can register and store the brain's reactions to these impulses.