I’ve been developing a patch within vvvv for the past few months to use in a live situation. Last weekend the first outing for the system was at Lakota in Bristol.
Short video by Lumen below:
The patch worked well and received lots of good feedback. Using it live has helped me realise what else it could benefit from. More development is on its way!
The majority of the motion tracking projects I have worked on in the past have involved a PS3 Eye camera, with its filter altered to transmit only infrared light. This, coupled with infrared lighting in some situations, has worked very well and been very reliable. However, with the release of the Microsoft Kinect (TM) and its potential for depth tracking, there was no question that it would revolutionise motion tracking.
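The IR camera approach boils down to a simple idea: with the filter swapped, only IR-lit subjects appear bright, so you threshold the frame and find the bright blob. A minimal sketch of that idea in plain NumPy (the threshold value and the synthetic frame are illustrative assumptions, not the actual settings from my patches):

```python
import numpy as np

def track_ir_blob(frame, threshold=200):
    """Return the centroid (row, col) of pixels brighter than `threshold`,
    or None if nothing passes - i.e. no IR-lit subject in view."""
    mask = frame > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic 8-bit "IR" frame: dark background, one bright 3x3 spot
frame = np.zeros((48, 64), dtype=np.uint8)
frame[9:12, 19:22] = 255

print(track_ir_blob(frame))  # centroid at (10.0, 20.0)
```

In a real setup the frame would come from the camera each tick, and the centroid would drive whatever parameters the visuals expose.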
About 9 months ago I started using vvvv for my live visual work. The main reasons were the level of customisation available in a graphical programming environment, and the increase in frame rate I hoped to gain over my previous software.
Thanks to “Heirro” and many others developing for vvvv, a Kinect node has been created, and my initial patches have been based around it. The video below shows the results of my first tests: a sound-reactive motion and depth tracking patch developed for a live music night run by Drawn Recordings. There are so many possibilities for the Kinect, some of which I hope to explore over the coming months.
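The sound-reactive depth patch is essentially: take the Kinect's depth image, keep only pixels inside a near/far band, and let the audio level modulate that band. A rough illustration of the idea outside vvvv, using NumPy and synthetic depth data (the band limits and the audio-to-depth mapping are my assumptions for the sketch, not values from the patch):

```python
import numpy as np

def depth_band_mask(depth_mm, near=800, far=2500):
    """Keep only pixels whose depth (millimetres) falls inside the band,
    discarding 0, which the Kinect reports when it has no reading."""
    return (depth_mm > 0) & (depth_mm >= near) & (depth_mm <= far)

def audio_reactive_far(level, base_far=1500, reach=2000):
    """Push the far plane out as the audio level (0..1) rises, so louder
    sound reveals more of the room."""
    return base_far + reach * float(np.clip(level, 0.0, 1.0))

# Synthetic depth frame: a "dancer" at 1.2 m in front of a wall at 3 m
depth = np.full((4, 4), 3000, dtype=np.uint16)
depth[1:3, 1:3] = 1200

quiet = depth_band_mask(depth, far=audio_reactive_far(0.0))  # far = 1500 mm
loud = depth_band_mask(depth, far=audio_reactive_far(1.0))   # far = 3500 mm

print(quiet.sum(), loud.sum())  # 4 pixels when quiet, all 16 when loud
```

The resulting mask is what the visuals react to: when quiet, only the figure nearest the sensor survives the band; as the level rises, the band widens and the background bleeds in.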
For the ‘Illuminate Bath’ festival, I will be collaborating with Lumen, Will Kendrick and Al Cotterell on a light and sound installation at the Roman Baths (in Bath).
The shot below was taken from the water’s edge. Viewing on the night will be from the upper balcony section. More info here.