Final Project Update 2

This week I got the musical interactivity working with the Kinect data. The performer can play the instrument by waving their hands in the air, and the notes will always be in key. By tracking the horizontal location of the hands, we map a grid across which the hands move, triggering a new note each time a hand enters a new zone of the grid. These triggers are fed into a Markov chain that selects the next note, so the performer can focus on phrasing and respond to the notes they hear played back without worrying about playing a “wrong” note.
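
To make the mapping concrete, here is a minimal Python sketch of the zone-trigger idea. The real version lives in the Max/MSP patch driven by Kinect data; the zone count, sensor range, and function names below are placeholders for illustration, not the project's actual values.

```python
NUM_ZONES = 8          # assumed number of columns in the trigger grid
SENSOR_WIDTH = 640.0   # assumed horizontal range of the Kinect hand data

def zone_for_x(x):
    """Map a horizontal hand position to a grid zone index."""
    zone = int(x / SENSOR_WIDTH * NUM_ZONES)
    return max(0, min(NUM_ZONES - 1, zone))

last_zone = None

def on_hand_moved(x):
    """Fire a trigger only when the hand crosses into a new zone."""
    global last_zone
    zone = zone_for_x(x)
    if zone != last_zone:
        last_zone = zone
        trigger_next_note(zone)   # hands the trigger off to the Markov chain

def trigger_next_note(zone):
    print(f"trigger from zone {zone}")
```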

ISSUES: At the end of the video, you can see that using both hands causes the instrument to overload with notes. After some troubleshooting, I think this is a result of the OSC buffer filling up and dumping its queue of notes all at once. I have also not heard back since reaching out again to my friend at the Clive Davis Institute about using the Leslie speaker, so we are planning contingencies, including virtual instruments and effects.
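
If the overload really is the OSC buffer filling up and dumping its queue, the basic shape of a guard would be to throttle note sends so messages never pile up faster than the synth consumes them. A rough Python sketch, assuming a python-osc client; the host, port, address, and interval are placeholders, not the project's actual setup:

```python
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)  # placeholder host/port for the synth patch
MIN_INTERVAL = 0.05                          # placeholder: at most ~20 notes per second
_last_send = 0.0

def send_note(pitch, velocity=100):
    """Drop notes that arrive faster than the receiver can absorb them."""
    global _last_send
    now = time.monotonic()
    if now - _last_send < MIN_INTERVAL:
        return  # skip rather than queue, so the buffer never backs up
    _last_send = now
    client.send_message("/note", [pitch, velocity])
```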

The Markov chain was an interesting experiment. In past projects, I locked the musical interaction to a scale that moved only linearly as the data increased or decreased. This was fun but got tiresome quickly because the interactivity was too predictable. To combat this, I needed a way to skip notes without requiring any additional interaction. There are ways to do this mathematically, say by querying random indices from a list of notes in a scale, but that felt too random. I wanted to retain some level of musicality, which is why the weighted selection a Markov chain provides felt more musical. I used the ml.markov object in MaxMSP.
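
For anyone unfamiliar with the idea, here is a small Python sketch of weighted next-note selection in the spirit of ml.markov. The transition table and scale are invented for illustration; the real weights come from the Max patch.

```python
import random

TRANSITIONS = {
    # current scale degree -> (candidate next degrees, weights)
    0: ([1, 2, 4], [0.5, 0.3, 0.2]),
    1: ([0, 2, 3], [0.3, 0.4, 0.3]),
    2: ([1, 4, 6], [0.4, 0.4, 0.2]),
    3: ([2, 4, 0], [0.5, 0.3, 0.2]),
    4: ([3, 5, 0], [0.4, 0.3, 0.3]),
    5: ([4, 6, 2], [0.5, 0.3, 0.2]),
    6: ([0, 5, 4], [0.6, 0.2, 0.2]),
}

C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # MIDI pitches for one octave

def next_degree(current):
    """Pick the next scale degree with weighted randomness instead of pure chance."""
    candidates, weights = TRANSITIONS[current]
    return random.choices(candidates, weights=weights, k=1)[0]

# Each grid trigger advances the chain by one step:
degree = 0
for _ in range(8):
    degree = next_degree(degree)
    print(C_MAJOR[degree])
```

Because every note is drawn from weighted transitions rather than a uniform pick, phrases tend to lean toward familiar intervals while still skipping around the scale.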

Below is a sample of the visual element of our project, which my project partner Morgan Mueller did a lot of work on. The scale of the shapes is mapped to the position of the hands, and the particles grow in size each time the performer triggers a new note.

visuals01.gif
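
As a rough illustration of those two mappings, here is a short Python sketch. The actual visuals are Morgan's work in a separate environment; every name, range, and the decay behavior here is an assumption made for the example.

```python
SENSOR_WIDTH = 640.0   # assumed horizontal range of the hand data

def shape_scale(hand_x, min_scale=0.5, max_scale=2.0):
    """Map hand position to the scale of the on-screen shapes."""
    t = max(0.0, min(1.0, hand_x / SENSOR_WIDTH))
    return min_scale + t * (max_scale - min_scale)

particle_size = 1.0  # base particle size

def on_note_triggered(growth=0.3):
    """Grow the particles whenever the performer triggers a new note."""
    global particle_size
    particle_size += growth

def each_frame(decay=0.95):
    """Assumed easing: let the size settle back toward its base value each frame."""
    global particle_size
    particle_size = 1.0 + (particle_size - 1.0) * decay
```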