I spent all of yesterday adapting these two sketches to run on a PC instead of a Mac. It turned out that certain libraries were not compatible with Windows 7, so I had to rewrite one of the sketches. That one had used an advanced video library (jmc video) instead of the default Processing 2 video library; since the default library behaves the same on both Mac and PC, I ended up switching back to it.
Additionally, I spent time installing drivers for the Kinect on my office PC to test the quality of the playback. Everything appeared to run fine, but I will have to run a longer test to see how it performs over an extended session.
And finally, the only real change these sketches need for my upcoming show is the output size, so that they play back full screen on a monitor or projector, depending on which I end up using.
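The resize itself is mostly a matter of scaling the sketch's frame to the display. As a minimal sketch of that logic (in plain Java rather than Processing, and assuming a hypothetical 640x480 source frame and 1920x1080 projector, neither of which comes from the post):

```java
// Scale-to-fit logic for full-screen playback. The source and display
// dimensions below are hypothetical examples, not the actual show setup.
public class FitToScreen {
    // Scale factor that fits a (srcW x srcH) frame inside a (dstW x dstH)
    // display while preserving aspect ratio (letterbox rather than crop).
    static double fitScale(int srcW, int srcH, int dstW, int dstH) {
        return Math.min((double) dstW / srcW, (double) dstH / srcH);
    }

    public static void main(String[] args) {
        double s = fitScale(640, 480, 1920, 1080);
        int w = (int) Math.round(640 * s);
        int h = (int) Math.round(480 * s);
        // A 640x480 frame fits a 1080p projector at 1440x1080, scale 2.25.
        System.out.println(w + "x" + h + " (scale " + s + ")");
    }
}
```

In Processing itself the same idea would just be a `scale()` call on the video frame after sizing the sketch to the display.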
Here are two videos of the sketches I’ve been working on.
In this video I am using the SimpleOpenNI skeleton-tracking code to track the position of my arms. When I raise them after initialization, the video plays from the beginning; when I lower my hands, the video scrubs through to the end. The act of raising and lowering my arms parallels the action in the video, where the victim is shot as he lowers his arms.
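The core of that interaction is a linear mapping from hand height to playback position. Here is a toy version of that mapping in plain Java, not the actual SimpleOpenNI sketch; the pixel coordinates and the 10-second clip duration are hypothetical:

```java
// Arm-height-to-scrub mapping. All coordinates and the clip duration are
// made-up examples; the real sketch would read hand positions from
// SimpleOpenNI and drive a Movie object's playback position.
public class ArmScrub {
    // Linear remap, like Processing's map(): [inLo, inHi] -> [outLo, outHi].
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * ((v - inLo) / (inHi - inLo));
    }

    // Hands raised (small y, near the top of the frame) -> start of the clip;
    // hands lowered (large y) -> end of the clip. Clamped to the clip length.
    static float scrubTime(float handY, float raisedY, float loweredY, float duration) {
        float t = map(handY, raisedY, loweredY, 0, duration);
        return Math.max(0, Math.min(duration, t));
    }

    public static void main(String[] args) {
        System.out.println(scrubTime(100, 100, 400, 10)); // raised  -> 0.0
        System.out.println(scrubTime(250, 100, 400, 10)); // halfway -> 5.0
        System.out.println(scrubTime(400, 100, 400, 10)); // lowered -> 10.0
    }
}
```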
This second Kinect sketch uses the Kinect to isolate users, then uses that information to mask out where people are. Where each person stands, you see an image of a holographic police barricade; behind them is the United States Capitol, where Congress meets. This is a response to Spain's holographic protest, which was staged in front of their parliament building.
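The masking step boils down to a per-pixel choice driven by the Kinect's user map. A toy version in plain Java, assuming SimpleOpenNI-style user-map data where a nonzero entry means "this pixel belongs to a tracked user" (the 4-pixel arrays stand in for the barricade and Capitol images):

```java
// Per-pixel user masking. The tiny arrays are hypothetical stand-ins for
// real image pixel buffers; a real sketch would use the Kinect's userMap
// and two PImage pixel arrays of the same size.
public class UserMask {
    // For each pixel: show the barricade where a user is detected,
    // and the background image everywhere else.
    static int[] composite(int[] userMap, int[] barricade, int[] background) {
        int[] out = new int[userMap.length];
        for (int i = 0; i < userMap.length; i++) {
            out[i] = (userMap[i] != 0) ? barricade[i] : background[i];
        }
        return out;
    }

    public static void main(String[] args) {
        int[] userMap    = {0, 1, 1, 0};       // a user occupies the middle pixels
        int[] barricade  = {10, 11, 12, 13};   // "barricade" image
        int[] background = {20, 21, 22, 23};   // "Capitol" image
        int[] out = composite(userMap, barricade, background);
        System.out.println(java.util.Arrays.toString(out)); // [20, 11, 12, 23]
    }
}
```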
Next I will begin working on merging two of my Max/MSP video "games." I will be taking the controllers from one and linking them up with the interface and videos of the other.