Experiment in Apple Motion using an external file to generate some of the text.
As a follow-up to the previous post, here is a video clip showing the patch along with audio. Note how the pitch changes when light and dark objects are presented to the camera.
Here’s an experiment in Max/MSP/Jitter using the Chromakey and Kaleidoscope presets. There is also audio that responds to the RGB values of both the live camera feed and the fractal animation (not recorded in this example). Below is a screenshot of the Max patch.
My latest venture into interactive/new media and something I’ve been wanting to do since I left London many moons ago. Max/MSP/Jitter uses patches to construct audio/visual interfaces which respond to live and recorded data. I have been working with some of the tutorials and the ones I have found the most useful so far are the two shown above.
The first behaves very much like the old iTunes visualiser, drawing shapes in response to audio amplitude. The second behaves like a theremin, changing pitch according to the RGB values captured through the laptop’s camera.
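The RGB-to-pitch idea behind the second patch can be sketched in plain Python. This is a hypothetical illustration, not the actual Max patch: the function name, the frame representation, and the 110–880 Hz range are my own assumptions.

```python
# Hypothetical sketch of the theremin-like behaviour: map a camera
# frame's average RGB brightness to a pitch in hertz.
# A frame here is just a list of (r, g, b) tuples with values 0-255.

def brightness_to_frequency(frame, low_hz=110.0, high_hz=880.0):
    """Map the mean RGB brightness of a frame to a frequency.

    Darker frames give lower pitches, brighter frames higher ones,
    mimicking the theremin-like response described above.
    """
    if not frame:
        return low_hz
    # Average the R, G and B channels over every pixel, normalised to 0-1.
    mean = sum(r + g + b for r, g, b in frame) / (3 * 255 * len(frame))
    # Linearly interpolate between the low and high frequencies.
    return low_hz + mean * (high_hz - low_hz)

# A fully dark frame maps to the lowest pitch, a fully white one to the highest.
dark = [(0, 0, 0)] * 4
white = [(255, 255, 255)] * 4
print(brightness_to_frequency(dark))   # 110.0
print(brightness_to_frequency(white))  # 880.0
```

In the real patch this mapping would be driven per-frame by Jitter's camera input and sent to an oscillator; the sketch only shows the brightness-to-frequency step.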
It’s a start; I’m looking forward to seeing what else Max can do, and I hope to share some video examples too.
The ACM SIGGRAPH Digital Arts Community Altered Books: Digital Interventions exhibition is now online! This exhibition includes a couple of images from my Jabberwocky series.
The images will also be included in a showreel at the 2015 SIGGRAPH conference this August at the Los Angeles Convention Center, USA.