These last two months have been all about working on projection and trying to conceptualize my final project for the Hacktory show at the Crane Arts Building. Georgia Guthrie, Director of the Hacktory, asked for volunteers to learn a program called Madmapper, which is basically a projection mapping interface. I jumped at the opportunity! Bonus: artist and Hacktory user/board member/teacher Kim Brickley was also working on the project, so I got to do some collaborating.
The parameters were that we had to use a section of wall on the International House in Philadelphia (37th and Chestnut) along with locally gathered data, whether pedestrian traffic or weather, to create something with Madmapper to show during designPhiladelphia. Kim and I met several times to discuss the content and scope of the project. She tackled getting the Kinect to work in Madmapper while I worked on getting Processing to do fun things with the Kinect, mouse movement, a MakeyMakey, and motion tracking. We also used an image of the IHP building, taken at night, to create a color palette for the project. We quickly learned that Madmapper is an excellent program: it takes input from sources like a Kinect or Processing sketches and lets you build surface shapes and map visuals onto them for projection. From their website: “The MadMapper provides a simple and easy tool for video-mapping projections and LED mapping. It removes a lot of the confusion related to this medium, effectively demystifying the process, allowing you, the artist or designer to focus on creating your content, and making the experience of mapping textures to physical objects in real time, fun.” The demo of what we learned was supposed to take place in October, but weather and other scheduling pushed it back almost a month, and of course it landed on the first cold snap, so there wasn’t a big turnout! Luckily, we’re showing the piece again indoors in December and working on some other projects that use the software.
There are several hoops to jump through just to get Madmapper itself working. Most of my time in the first part of September was spent on research, tutorials, and how-tos, and on figuring out various issues with the different computers we were using. We were both jumping back and forth to find solutions that would get all the parts talking to each other. One of the features of Madmapper is that it uses Syphon – an open source plugin for sharing video and stills that lets you break out of a single-application workflow for editing and mixing visuals – and this adds extra downloads to the process of just getting visuals to project. It also uses Quartz Composer – “a node-based visual programming language provided as part of the Xcode development environment in Mac OS X for processing and rendering graphical data” – to run various ‘plug-ins’ for Madmapper. Getting all the different computers we use to a) recognize Syphon in Processing, b) recognize Processing or Quartz Composer in Madmapper, and c) recognize the Kinect in Madmapper was more complicated than we anticipated. The workflow looks a bit like this: audiovisual software (like Processing, Modul8, or a webcam) > Syphon > Madmapper. When you add in the Kinect it looks more like this: Kinect > Quartz Composer > Madmapper plugin > Syphon > Madmapper. We watched and read many tutorials, downloaded several Processing libraries and sketches, and installed several graphics plugins for Mac OS before we finally started to make things work, and we still had some issues getting the systems to communicate with each other.
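To give a sense of the first link in that chain, here is a minimal Processing sketch that publishes its frames over Syphon, assuming the Syphon library for Processing (codeanticode.syphon) is installed; the server name is arbitrary, and Madmapper should then list it as an available Syphon input:

```processing
import codeanticode.syphon.*;

SyphonServer server;

void setup() {
  // Syphon needs an OpenGL renderer, hence P3D
  size(640, 360, P3D);
  // "Processing Syphon" is just a label; it shows up
  // as the source name in Syphon clients like Madmapper
  server = new SyphonServer(this, "Processing Syphon");
}

void draw() {
  background(0);
  ellipse(mouseX, mouseY, 60, 60);
  // publish the current frame to any connected Syphon client
  server.sendScreen();
}
```

From Madmapper's side you then pick this Syphon source as an input and map it onto whatever surface shapes you've built.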
The great thing about this project is that it led me deeper into Processing than I expected to go. Our content includes two Processing sketches I started messing with – one that creates random circles that look like cells or dots sparkling up the screen and fading away, and the other a variation on the classic bouncing-balls sketch (I learned the code for these from various sources, including GitHub and several Processing forums). I also added sound to one of the sketches, which hooks up to a MakeyMakey device. So, we have one sketch that changes from color dots to white snowballs when keys are pressed (W, A, S, and D, specifically) and plays accordion music via the MakeyMakey, and one sketch with bouncing balls that rebound all over the screen and track mouse movement. Theoretically, the sketch that tracks mouse movement could be translated into one that tracks hand movement with the Kinect. We intend to keep expanding on this project to create an interactive projection that recognizes movement and translates it into Processing sketches. And now, because we have some extra time, I have been learning about getting the Kinect working in Madmapper via Quartz Composer.
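For anyone curious, the mouse-tracking and key-press ideas boil down to something like this. This is a simplified sketch, not our exact code – the steering constant, sizes, and colors are placeholders:

```processing
float x, y;            // ball position
float vx = 3, vy = 2;  // ball velocity
boolean snow = false;  // toggled by the W/A/S/D keys

void setup() {
  size(640, 360);
  x = width / 2;
  y = height / 2;
  noStroke();
}

void draw() {
  // semi-transparent background leaves fading trails
  fill(0, 30);
  rect(0, 0, width, height);

  // steer gently toward the mouse; a Kinect hand position
  // could stand in for mouseX/mouseY here
  vx += (mouseX - x) * 0.001;
  vy += (mouseY - y) * 0.001;
  x += vx;
  y += vy;

  // rebound off the edges of the screen
  if (x < 0 || x > width)  vx = -vx;
  if (y < 0 || y > height) vy = -vy;

  fill(snow ? color(255) : color(random(255), random(255), 255));
  ellipse(x, y, 24, 24);
}

void keyPressed() {
  // a MakeyMakey sends ordinary key events, so these presses
  // can come from touchpads or fruit instead of a keyboard
  if (key == 'w' || key == 'a' || key == 's' || key == 'd') {
    snow = !snow;
  }
}
```

Because the MakeyMakey just emulates a keyboard, the same `keyPressed()` handler works whether the trigger is a key or a homemade touch sensor.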
As a side project this month, my boyfriend Doug Goudy and I dressed up as Cosmic Creatures for Halloween thanks to the downloadable patterns from Wintercroft (thanks Steve, these were AMAZING additions to our costumes!), and I made an El-Pumpkino. Doug created the graphics for the masks, which looked like spirit animals of the future: I was an Interstellar Coyote complete with constellations and a coyote skull, and he was a Cosmic Bear with galaxies and a solar system. I added a switch, a LilyTiny, and LilyPad LEDs to my mask and explored the use of Adafruit’s new silicone wire. I love that silicone wire! So much easier to work with than conductive thread, and just as flexible. If we make these again, however, I want a drinking-straw hole so I can keep the mask on all night for Halloween libations.