Motion Tracking and Projection Mapping

I decided to work with motion capture and projection mapping for my research sprint with REAP. My idea was to capture human movement in real time using the Kinect and then project that movement live as part of a performance. The focus of my sprint was researching how projection can be used effectively in performance, along with the technological challenges of incorporating this technology into a live performance environment.


I began researching this topic by studying existing projects and products in this field. Most of these projects were built using the Processing coding language and Kinect for Windows. The most popular and most frequently referenced tutorial I could find in this area was by Amnon Owed, whose project used Processing and the Kinect to make an interactive display. Here is the link to his tutorial.

I first tried this tutorial on my personal MacBook Pro running OS X Yosemite. I downloaded Processing 3, which had only been released on September 30, 2015. Unfortunately, I ran into problems very early in the process because the libraries used by Amnon in the tutorial were no longer compatible with the latest version of Processing. To troubleshoot the issue, I downloaded the older version of Processing that was used for the tutorial.

My supervisor, Joy Smith, provided me with a Kinect V2 for Windows. Using the older version of Processing, I got a bit further in the research, but when I ran the code from the tutorial, I encountered a plain black screen. To verify whether this was an issue with running it on a Mac operating system, I tested the tutorial on a Windows laptop in the Felt Lab. Unfortunately, I encountered the same problem on that laptop as well. Looking for solutions online, I found that many others had run into the same issue with this tutorial, and Amnon had posted in a public forum saying the tutorial would likely no longer work because of compatibility problems between the external libraries required by Processing, the Kinect hardware, and newer operating systems.

Running into these problems early on was frustrating, but also very helpful. I quickly realized that stacking multiple third-party libraries on top of a constantly changing coding language is not a stable or marketable solution in the field of motion tracking and projection mapping, so I decided to take a step back and research alternative options.

In my search for an alternative to Processing, I found an app on the Microsoft Store called Kinect Evolution, which tracks the human figures in front of the camera and gives the user a live three-dimensional view of their body movement.
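A live three-dimensional view like this ultimately comes down to projecting each tracked 3D joint onto a 2D display surface. As a rough illustration of that step only (the joint names, coordinates, and camera parameters below are invented for the example, not taken from Kinect Evolution or the Kinect SDK), a minimal pinhole-projection sketch might look like:

```python
# Minimal sketch: perspective (pinhole) projection of 3D joint positions
# onto a 2D screen -- the basic math behind rendering a live 3D view.
# All joint names and numbers are illustrative, not real Kinect output.

def project_point(x, y, z, focal=500.0, cx=320.0, cy=240.0):
    """Project a 3D camera-space point (metres) to 2D pixel coordinates."""
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    u = cx + focal * x / z
    v = cy - focal * y / z  # screen y grows downward
    return u, v

# A made-up skeleton frame: joint name -> (x, y, z) in camera space
joints = {
    "head":       (0.0, 0.6, 2.0),
    "hand_left":  (-0.4, 0.1, 1.8),
    "hand_right": (0.4, 0.1, 1.8),
}

screen = {name: project_point(*p) for name, p in joints.items()}
for name, (u, v) in screen.items():
    print(f"{name}: ({u:.1f}, {v:.1f})")
```

Redrawing these projected points every frame is, in essence, what produces the live skeletal display.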


The app webpage for Kinect Evolution mentions that it is meant to be a tool to help developers build their own Kinect apps, and its source code is available for download. Using one of the Windows laptops at the lab and the source code as a guide, I started working on a new app called REAP Evolution. This time I got the program to run properly, and I tested it with some help from Joy: we both stood in front of the camera, and after a short period it began tracking our movements. With that, my initial exploration of this technology and of motion capture was essentially done. If one were to continue with these technologies in the future, a good starting point would be to look into ways of live projecting the movements captured by REAP Evolution.
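One concrete piece of that follow-up work would be aligning the camera's view with the projector's output, since the two devices see the stage from different positions. A common approach is to calibrate them with a planar homography computed from four corresponding points, such as the corners of the performance area as seen by each device. The sketch below is a hypothetical illustration under that assumption; all point values and function names are invented, not code from REAP Evolution:

```python
# Hypothetical sketch: mapping tracked camera pixel coordinates into
# projector pixel coordinates with a planar homography, a standard first
# step when projecting captured movement back onto a flat stage surface.

def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """3x3 homography from four (x, y) -> (u, v) point correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve_linear(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def warp(H, x, y):
    """Apply homography H to a point (x, y)."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Invented calibration: stage corners in camera pixels vs. projector pixels
camera_pts    = [(80, 60), (560, 70), (590, 420), (50, 410)]
projector_pts = [(0, 0), (1280, 0), (1280, 720), (0, 720)]
H = homography(camera_pts, projector_pts)
print(warp(H, 80, 60))  # the first camera corner maps near (0, 0)
```

With a mapping like this in place, every joint position the tracker reports could be warped into projector space before being drawn, so the projected imagery lines up with the performer.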

Now that my initial explorations are complete, I will be working on incorporating similar technologies into a live performance through the development of a theatre piece with the Samarian Woman 28 Collective.