Final Project Presentation

I am making a musical wand for my Music Interaction Design class. I would like to visualize the musical elements created by the wand using a particle system, as in this example and this example from Jason Sigal (@therewasaguy on Github). I have been wanting to investigate FFTs, isolating specific musical frequencies as characters and having them act as a flock, following the root note and oscillating in conjunction with the beat and each band’s amplitude.
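
As a first pass at that idea, here is a minimal p5.js sketch, assuming the p5.sound library and mic permission; the band-to-particle mapping and the random drift standing in for flocking are placeholders I would tune against the real wand input:

// Minimal sketch: one particle per FFT band, sized by that band's energy
let mic, fft;
let particles = [];

function setup() {
  createCanvas(400, 400);
  mic = new p5.AudioIn();
  mic.start();
  fft = new p5.FFT(0.8, 64); // smoothing, number of frequency bins
  fft.setInput(mic);
  for (let i = 0; i < 64; i++) {
    particles.push(createVector(random(width), random(height)));
  }
}

function draw() {
  background(0, 40); // low-opacity background leaves trails
  noStroke();
  fill(255);
  const spectrum = fft.analyze(); // 64 amplitude values, 0-255
  for (let i = 0; i < particles.length; i++) {
    particles[i].add(random(-1, 1), random(-1, 1)); // placeholder for flocking
    ellipse(particles[i].x, particles[i].y, map(spectrum[i], 0, 255, 2, 30));
  }
}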

Testing a MIDI Wand for Music Interaction Design class at ITP 2019.

There is also a machine learning example I toyed with from earlier in the semester, where I tried to isolate specific frequencies in a musical piece and animate them separately, with some success.

A music visualization of Erik Satie’s Gymnopédie No. 1 using the ml5 library with pitch detection. Source code taken from https://github.com/ml5js/ml5-examples/tree/master/p5js/PitchDetection/PitchDetection

I feel like the wand, or baton depending on the context, is a universal and immediate interaction that will be engaging for audiences of all ages and musical skill levels. My vision is to have this wand live as an exhibit with both artistic and educational aspects. I will be building a number of “scenes” to use with the wand and I want to incorporate a particle system as one. Here are some examples of work from earlier in the semester (here) and from ICM (here and here) which would also be scenes for the wand.

Assignment 6 - Neural Network / ML Design Exercise

# Scenario

Describe the scenario, what "job" will the neural network do in the context of a software application?

It's street cleaning day on my block. I've moved my car the evening before and would like to move it back as soon as possible while avoiding a ticket.

I need to know two things:

1. Has the parking attendant already passed by?

2. Has the street sweeper also passed by?

If I move it back too soon, I risk getting a ticket and blocking the street sweeper. But I also don't want to wait the full window, as most of the spots will be taken before then, and I have places to be...

# Problem

Is this a "classification" problem? (And what are the labels?) Or is this a "regression" problem?

I suspect this is a classification problem. We need a camera that can watch the street during the street-cleaning hours and discern between a time to "move" the car and a time to "park" it, the latter being after both the parking attendant's white car and the street sweeper have passed. There could also be a sound component for detecting the street sweeper, as it has a characteristic high-mid to high frequency sound.

# Learning

What approach will be used to train the model? Assuming supervised learning, what is the training dataset and how is it labeled?

The data set for "move" could be all the hours of the week during which street sweeping is scheduled but has not yet happened. You could also include all the other days of the week as "move" to give more data on this end, even though street cleaning is only relevant on the two days of the week pertaining to your side of the street. The data set for "park" would be the frames in which the attendant first passes by, usually in a white car with particular decals. Our street happens to be the first street on the route, as far as we can tell. Shortly afterwards, usually about 15-30 minutes later, the street sweeper passes. The user would simply review the footage for each day and train the model on the frames where the parking attendant's car and the sweeper passed by, so it learns these conditions.
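
A rough sketch of how that labeling step might look with ml5's feature extractor (a transfer-learning approach I am assuming here; the exact callback shapes vary across ml5 versions):

// Hedged sketch: label webcam frames, then retrain MobileNet features on them
let video, classifier;

function setup() {
  createCanvas(320, 240);
  video = createCapture(VIDEO);
  const featureExtractor = ml5.featureExtractor('MobileNet', () => console.log('model ready'));
  classifier = featureExtractor.classification(video);
}

function keyPressed() {
  // while reviewing footage, press a key to label the current frame
  if (key === 'm') classifier.addImage('move');
  if (key === 'a') classifier.addImage('attendant car');
  if (key === 's') classifier.addImage('street sweeper');
  if (key === 't') {
    classifier.train((loss) => {
      if (loss === null) console.log('training complete');
    });
  }
}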

Then...

// decision logic once the classifier has flagged both vehicles
// (sendEmail() is a hypothetical helper)
if (isStreetCleaningDay && isDuringCleaningWindow) {
  if (attendantCarPassed && streetSweeperPassed) {
    carState = "park"; // flip the label from "move" to "park"
    sendEmail("user@email.com", "safe to park");
  }
}

# Architecture

Describe, in informal terms, the architecture of the model. What are the inputs and what are the outputs?

The inputs would be the frames from the camera and the outputs would be one of three labels: "normal", "attendant car", or "street sweeper". If the model could recognize these three conditions, then we would be able to achieve our goal of knowing when we can safely move our car back.
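
On the inference side, the loop could flip two flags as the model recognizes each condition. A sketch, again assuming the ml5 classify() callback shape; in practice I would smooth over many frames rather than trust a single one:

// Watch the feed and record when each vehicle has been seen
let attendantCarPassed = false;
let streetSweeperPassed = false;

function classifyFrame() {
  classifier.classify(video, (err, results) => {
    if (err) {
      console.error(err);
      return;
    }
    const label = results[0].label; // top prediction for this frame
    if (label === 'attendant car') attendantCarPassed = true;
    if (label === 'street sweeper') streetSweeperPassed = true;
    classifyFrame(); // keep watching
  });
}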

Assignment 5 - Genetic Algorithms

Lillian Ritchie and I teamed up to tweak the Nature of Code Simple Smart Rockets example. We had the idea to add gasoline grades to the rocket fuel which would control mutation rate, population number, and speed (via frameRate since there was no easy-access speed parameter). Here are links to the sketch and the code on Github.

With 87 grade, there are more rockets because gas is cheap, but they do not run as well, with the mutation rate set to 0.5; they also run slower, at a frameRate of 30. For 89 grade, the rockets are fewer in number to reflect the rising cost, but they run a little more efficiently with a mutation rate of 0.3 and are faster with a frameRate of 48. At 93 grade, the rocket numbers are at their lowest, but they run quite well with a mutation rate of 0.01 and a frameRate of 60.

We had to use the resetSketch() function from Dan’s video so that we could make new populations with the characteristics we desired. Changing the numbers on the fly would crash the sketch, since it throws a wrench into the inheritance of each generation, splicing and adding to arrays in a messy way mid-cycle. We also thought of having three separate populations and using the radio buttons to just show and hide them, but that seemed like overkill.
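
Roughly, the radio buttons just map to a settings object and resetSketch() rebuilds everything from it. A sketch of that wiring (the population counts are my placeholders, and I am assuming a Population constructor that takes a mutation rate and a size, which varies between versions of the NOC example):

// Gas-grade presets; only the mutation rates and frame rates are the real values
const grades = {
  '87': { mutationRate: 0.5,  frameRate: 30, population: 100 },
  '89': { mutationRate: 0.3,  frameRate: 48, population: 50 },
  '93': { mutationRate: 0.01, frameRate: 60, population: 25 },
};

function resetSketch(grade) {
  const g = grades[grade];
  frameRate(g.frameRate);
  // rebuild rather than mutating mid-generation, which is what crashed
  population = new Population(g.mutationRate, g.population);
}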

Disappearing Attractor - Midterm Project

For my midterm project, I wanted to create a “disappearing attractor”. The idea is that one object attracts the other, and just as the attracted object gets close to its target, the target disappears. One inspiration is the book “Who Moved My Cheese?” by Dr. Spencer Johnson, which I’ve never read, but I dig the concept. Life is in fact always changing, and I can so often be solely focused on the end goal that I miss how the process is actually key to my evolution. So I’ve essentially created this scenario in a Processing sketch. Each time the blue dot approaches the red dot, the red dot disappears and re-spawns in a new location.
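
In p5.js terms, the core of that respawn behavior is just a distance check each frame; a minimal sketch (the speeds and radii are arbitrary):

// When the blue seeker gets close, the red target jumps to a new spot
let seeker, target;

function setup() {
  createCanvas(400, 400);
  seeker = createVector(width / 2, height / 2);
  target = createVector(random(width), random(height));
}

function draw() {
  background(255);
  const pull = p5.Vector.sub(target, seeker).setMag(2); // steady attraction
  seeker.add(pull);
  if (p5.Vector.dist(seeker, target) < 20) {
    target = createVector(random(width), random(height)); // who moved my cheese?
  }
  noStroke();
  fill(0, 0, 255);
  ellipse(seeker.x, seeker.y, 16);
  fill(255, 0, 0);
  ellipse(target.x, target.y, 16);
}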

The addition of the green dots came about as I contemplated the Qabalistic ideas of Partzufim Theory as explained by Rubaphilos Salfluere. In this system, the conscious mind is of a male gender and the unconscious mind female, with the uniting force being Love, or romantic love in its lower form. The unconscious here is collectively referred to as the container of everything that lies outside of consciousness (not to be confused with the subconscious, which is a special situation represented elsewhere in the system).

According to the famous esoteric axiom, “As above, so below”, the inner world is the outer world, and vice versa. This means that our external reality, everything that is “not us”, is the unconscious contents of our mind projected outward. So everything in our external reality is literally a reflection of our internal unconscious and is therefore created by that unconscious.

If this process were overlaid onto the romantic pursuit between a man and a woman, the green dots could well represent a woman’s emotions that a barbaric man would ignore in his pursuit. Since he is unable to avoid hurting her, she does not allow him to reach the goal. If this process were overlaid onto the esoteric pursuit of Illumination, the conscious mind would need to become aware of the contents of the unconscious (green dots) in order to court the representation of the unconscious (red dot) and wed himself to the unconscious.

If a Genetic Algorithm were applied to this, perhaps as a goal for the second half of the semester, the male in this simulation could learn how to approach the female while being careful not to disturb her emotions (the green dots). If he can find his way to the female without disturbing any of the green dots, he will reach his goal. But he only has 28 days to do this before the whole cycle starts all over again - ha!

Oscillation, Vectors and Forces

To get my sketches into my page this week, I used ITP resident Wenqi Li’s FlyBy post on Github about Squarespace and embedding iframes. It was very helpful, whereas Golan’s GIF-making process left me with dropped frames: it would only save as fast as I could click “Save” on the download dialog box. I also had some fun reading a blog about changing the “Loading…” message that p5 gives you while preload() is running. Check it out here.

Below is an example of some oscillation work I did for ICM last semester.

So with that, I wanted to revisit Vectors and Forces, as I wasn’t sure I had the chain nailed down from position to velocity to acceleration to force and back. I had an idea for balloon objects, so I first got the balloons to float upwards in the sketch below. Here is the code.
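
The chain I wanted to internalize is force → acceleration → velocity → position, cleared each frame. A stripped-down version of the balloon object, assuming unit mass:

// Forces accumulate into acceleration, which feeds velocity, which feeds position
class Balloon {
  constructor(x, y) {
    this.position = createVector(x, y);
    this.velocity = createVector(0, 0);
    this.acceleration = createVector(0, 0);
  }

  applyForce(force) {
    this.acceleration.add(force); // F = ma with m = 1
  }

  update() {
    this.velocity.add(this.acceleration);
    this.position.add(this.velocity);
    this.acceleration.mult(0); // clear accumulated forces each frame
  }
}

// each frame, buoyancy pushes the balloon up:
// balloon.applyForce(createVector(0, -0.05));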


Next, I tried to get the balloons to move realistically in the wind, as if fanned by someone erratically. If you click the lower sketch, a “wind” vector is applied to the balloons. But below is as far as I got: coding wind like that is beyond me at the moment, and getting the wind to go away so the balloons return to their initial movement was the missing key. Here is the code.
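
For what it’s worth, one way the wind could “go away” again, building on the Balloon class sketched above: apply gusts only while the mouse is pressed and let a mild drag bleed the extra velocity off afterward. This is a guess at a fix, not what my code currently does:

// Gusts only while pressed; drag settles the balloons back to their drift
function draw() {
  for (const balloon of balloons) {
    balloon.applyForce(createVector(0, -0.05)); // constant buoyancy
    if (mouseIsPressed) {
      balloon.applyForce(createVector(random(-0.3, 0.3), 0)); // erratic gusts
    }
    balloon.velocity.mult(0.99); // drag: excess speed decays once the wind stops
    balloon.update();
  }
}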

This functionality will marry with my Music Interaction Design project, in which I have a physical wand that can send accelerometer data to p5. This wand could control a wind vector, or control a pin object that would pop the balloons, triggering a unique musical note.

Vectors and Forces

This is a later example (and the code editor), adding mouse interaction to act as a force on the bubbles. Notice that they only bounce off the walls while the mouse is pressed down and are released otherwise (big thanks to Dan Shiffman for helping me implement this behavior).

Here is the first example using vectors and bubble objects in fullscreen and the code editor. I started by copying Dan’s NOC Bouncing Ball Vector Objects example and then making an array, pushing out a new bubble each time through draw and popping a bubble when they reach a count of 600 (I got a reminder about splicing and pushing with arrays here).
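
That push-and-remove pattern, roughly (splice(0, 1) drops the oldest bubble; pop() would drop the newest instead):

// Cap the array at 600 bubbles: add one per frame, remove from the front
function draw() {
  bubbles.push(new Bubble(random(width), random(height)));
  if (bubbles.length > 600) {
    bubbles.splice(0, 1);
  }
  for (const b of bubbles) {
    b.update();
    b.show();
  }
}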

I also referenced Gene Kogan’s Perlin noise code again for the oscillations in the purple color. And I referenced a technique I had used in a previous project (The Donut) to make the eye oscillate. I also used the processing / p5.js Github wiki to center my canvas when going fullscreen. Very helpful in displaying work, but I would like to know more about embedding directly onto a web page and running the javascript only when the sketch is being displayed. That way there are no external links, and no resolution is lost by exporting a sketch to a pic, video or gif.

Perlin Noise

I took some ideas from Gene Kogan’s Perlin noise examples. I love the way this line moves and tumbles back over itself. One key element and favorite trick was to decrease the background opacity. I still don’t fully grasp Perlin versus random: random() hands back a fresh, unrelated number on every call in draw(), while noise() returns the same value for the same input, so it only appears to move when you step the input offset each frame, even in my own example.
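
A minimal side-by-side of the two, using the standard p5.js random() and noise() functions:

// random() vs noise(): noise is deterministic per input, so step an offset each frame
let t = 0;

function setup() {
  createCanvas(400, 200);
  background(255);
}

function draw() {
  const jumpy = random(height);      // a new, unrelated value every call
  const smooth = noise(t) * height;  // the same t would always give the same value...
  t += 0.01;                         // ...so incrementing t is what makes it drift
  point(frameCount % width, jumpy);
  point(frameCount % width, smooth);
}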