IMA - Signal Processing Exploration

Last week, for my "stupid pet trick" assignment, I created a Max patch that advances a video based on amplitude. As you can see via the meter object toward the top of the video, the program measures the audio from the microphone and then, based on that measurement, advances the frames in the video; it rewards you for making noise, talking, or, my favorite, listening to Andrew Bird.
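Since the Max patch itself isn't text, here is a rough Python sketch of the core logic as I understand it: the louder the input, the further the video advances. The threshold and step values here are made up for illustration, not taken from the actual patch.

```python
def advance_frame(current_frame, amplitude, total_frames,
                  threshold=0.05, max_step=5):
    """Advance the video by a number of frames proportional to how
    far the measured amplitude (0..1) exceeds a noise threshold.
    Threshold and max_step are illustrative placeholders."""
    if amplitude <= threshold:
        return current_frame  # near-silence: the video stays put
    # Louder input advances more frames, capped at max_step.
    step = min(max_step, int((amplitude - threshold) * 100))
    return (current_frame + max(step, 1)) % total_frames

# A quiet room barely moves the video; a loud room jumps ahead.
frame = 0
for amp in [0.01, 0.06, 0.2, 0.9]:
    frame = advance_frame(frame, amp, total_frames=300)
```

In the patch this mapping happens continuously as the meter object reports new amplitude values.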

I am interested in refining this patch a little more, creating a more dynamic video, and perhaps installing it in a room with a projector. I can see it has the potential to relate to my current studio practice... Inputs and outputs with lots of possibility.


Here is some writing I did about the project while I was brainstorming the idea:
Brief Summary: I plan to use audio input and video output. A microphone will sense the amplitude of a room, and based on that measurement a projected image’s hue will gradually shift in tone, through value/contrast, in real time.
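That hue-shifting idea can be sketched as a simple mapping from amplitude to a rotation of the color's hue. This is a hedged illustration in Python using the standard `colorsys` module, not the actual Max implementation; the `max_shift` parameter is my own placeholder.

```python
import colorsys

def shift_hue(rgb, amplitude, max_shift=0.5):
    """Rotate the hue of an RGB color (components in 0..1) by an
    amount proportional to the room's amplitude (0..1).
    max_shift is an illustrative placeholder."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    h = (h + amplitude * max_shift) % 1.0  # hue wraps around the wheel
    return colorsys.hsv_to_rgb(h, s, v)

# A silent room leaves the color unchanged; a loud room rotates it.
base = (1.0, 0.0, 0.0)          # red
quiet = shift_hue(base, 0.0)    # unchanged
loud = shift_hue(base, 1.0)     # hue rotated halfway around
```

In an installation this would run per frame, so the projected image drifts through hues as the room gets louder or quieter.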
Statement: In my current body of work I have been exploring, through various media, the relationship between individuals and technology. Through this installation (stupid pet trick), I hope to encourage face-to-face interactions (conversations) by rewarding viewers with a dynamic visual that changes based on the level of noise in a space. I imagine this piece functioning like an ambient intelligent system, seamlessly integrated into the architecture of whatever space it is installed in. Through a sensor (microphone), the projected image in the environment will respond to human action and behavior: shuffling through the space, commenting, taking phone calls, vocally interacting with others, etc. The reward will reach its greatest potential when the space reaches its largest volume. I am still working through exactly what the visual will be.