As a follow-on from the previous post, here is a video clip demonstrating the audio response to the movie clip and live camera feed. Note the change in pitch when light and dark objects are placed in front of the camera.
Here are some screenshots from a game I have been developing. Game elements include ray casting, day/night simulation, particle systems, first , game objectives, NPCs, melee weapons, enemies, a health system and a mini-map.
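The day/night simulation mentioned above is typically driven by a smooth cycle over in-game time. The game's own implementation isn't shown here, so this is just a minimal sketch of the idea in Python; the names `daylight` and `day_length` are hypothetical:

```python
import math

def daylight(t, day_length=120.0):
    """Return an ambient light level in 0..1 for elapsed time t (seconds).

    A simple day/night cycle: light follows a cosine wave over one
    in-game day, bottoming out at midnight (t = 0) and peaking at midday.
    """
    phase = (t % day_length) / day_length        # 0..1 through the day
    return 0.5 - 0.5 * math.cos(2 * math.pi * phase)
```

In an engine, the returned value would be fed into the scene's ambient light or sky tint each frame.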
Screen capture of the game in action:
Experiment in Apple Motion using an external file to generate some of the text.
Continuing my research into game development, I have managed to create an infinite scroller-type game, similar to Flappy Bird. This is part of my ongoing short course development. The game includes infinite background scrolling, colliders, a score counter and scene management, and utilises an object pooling system to optimise performance. The sprite objects were created in Adobe Photoshop.
Here is a screen capture of the game in action:
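The object pooling used for optimisation can be sketched like this. The game itself was built with engine tooling not shown here, so this is a language-agnostic illustration in Python and the `ObstaclePool` name is hypothetical; the pattern is simply to pre-create objects and reuse them rather than constantly creating and destroying them:

```python
class ObstaclePool:
    """A minimal object pool: reuse obstacles instead of allocating new ones."""

    def __init__(self, size):
        # Pre-create all obstacles up front; each starts inactive.
        self.pool = [{"active": False, "x": 0.0} for _ in range(size)]

    def spawn(self, x):
        """Activate a free obstacle, or return None if the pool is exhausted."""
        for obj in self.pool:
            if not obj["active"]:
                obj["active"] = True
                obj["x"] = x
                return obj
        return None

    def despawn(self, obj):
        """Return an obstacle to the pool once it scrolls off-screen."""
        obj["active"] = False


pool = ObstaclePool(3)
a = pool.spawn(100.0)
pool.despawn(a)          # obstacle leaves the screen
b = pool.spawn(300.0)    # the freed slot is reused, not reallocated
```

Because nothing is allocated or destroyed during play, there are no garbage-collection spikes mid-game, which is the main reason pooling helps performance in an endless scroller.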
As a follow-up from the previous post, here is a clip showing the patch along with audio. Note how the pitch changes when light and dark objects are presented to the camera.
Here’s an experiment in Max/MSP/Jitter using the Chromakey and Kaleidoscope presets. There is also audio that responds to the RGB values of both the live camera and the fractal animation (not recorded on this example). Below is a screenshot of the Max patch.
My latest venture into interactive/new media and something I’ve been wanting to do since I left London many moons ago. Max/MSP/Jitter uses patches to construct audio/visual interfaces which respond to live and recorded data. I have been working with some of the tutorials and the ones I have found the most useful so far are the two shown above.
The first behaves very much like the old iTunes visualiser, drawing shapes in response to audio amplitude. The second behaves like a theremin, changing pitch according to the RGB values captured through the laptop’s camera.
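The theremin-like behaviour boils down to mapping camera brightness onto pitch. The real patch does this with Jitter objects inside Max, but the core idea can be sketched in Python; the function name and the `low_hz`/`high_hz` range are my own assumptions, not taken from the patch:

```python
def brightness_to_pitch(r, g, b, low_hz=110.0, high_hz=880.0):
    """Map a pixel's RGB values (0-255) to a frequency, theremin-style.

    Brighter input gives a higher pitch and darker input a lower one,
    which is why holding light or dark objects up to the camera
    changes the tone.
    """
    brightness = (r + g + b) / (3 * 255)          # normalise to 0..1
    return low_hz + brightness * (high_hz - low_hz)

print(brightness_to_pitch(255, 255, 255))  # bright white -> 880.0 Hz
print(brightness_to_pitch(0, 0, 0))        # black -> 110.0 Hz
```

In the Max patch the equivalent value would be sent to an oscillator's frequency inlet each frame.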
It’s a start; I’m looking forward to seeing what else Max can do, and I hope to share some video examples too.
Here are a couple of test renders – animation to follow!
I am currently working on a collaborative piece based upon the French composer and organist Jehan Alain’s seminal piece – Litanies.
I first heard this piece in 2013 during a tour with Simon Brett and a choir at Exeter Cathedral, UK. I have heard it a few times since then and have subsequently felt inspired to work on an animation based upon this piece.
The piece, composed in 1939 in the atonal style, was influenced by a train journey and was later dedicated to one of Alain’s sisters, who died tragically in a climbing accident not long after the piece was written.
The self-similar repetition of the musical phrases lends itself to the reiteration of a visual fractal, hence my interest and keenness to work with this piece.
Simon Brett played Litanies as the finale to his organ recital at Dunblane Cathedral in July 2015 – the recording of this is the audio soundtrack to my project. My thanks to Simon and everyone up there for this.
I have been working in Mandelbulber and tonight found some amazing images: