Updated November 2012
There’s still a lot of interest in this project so I felt I should update this final Blog entry:
- Firstly, please note I developed this project for the Mac running Snow Leopard 10.6. I’ve heard that there may be issues installing Synapse with Lion/Mountain Lion. I would make sure you have installed all the library files I’ve listed in my Blog entries, MacPorts, etc., as without these it just will not work.
- Secondly, I also successfully installed the Kinect SDK for a Windows installation using Boot Camp, so this may also be an option if Lion/Mountain Lion no longer supports Synapse.
Full details of the Project Research and Development Timeline can be accessed using the links to the Blog entries shown at the bottom of this and every Blog post.
Performance Video – The Live Performance
Performance Video – Idea Development
One of the project brief options was to create a musical instrument. From this brief the group came up with the idea of filming a local choir, with individuals each singing one note, which we would record onto video. Then, using a form of interactive technology, we would play back these videos in real time, creating a live Performance Video.
A performer (call them the conductor or musician) would control the development of the song by starting and stopping the playback of the videos, each representing a musical tone (or sound).
This would be unlike previous group projects, as individual members of the group would be approaching it from different directions: some of us would concentrate on the visuals, while I personally would concentrate on the functionality, the interactivity and the technology involved in the project.
For this project I investigated a range of interactive methods that could be employed to manipulate and control video, enabling a performer, be that a single individual or a group of people, to control the video and sound by their movements alone using motion capture techniques.
I would design, in effect, a VJ’ing application, with the videos responding to the hand movements of the performer. I would also design a delivery method for a live performance/broadcast, which could encompass projection onto a screen or even the face of a building; for the purposes of this project and the Blog, however, I would just create a video of the working system.
Researching Interactivity – Motion Capture
At the early stage of the project I’d already decided that Microsoft’s Kinect should be central to it, even though I’d had no previous knowledge or experience of designing for interactivity. Basing the project around the Kinect would also ensure that I would be using the latest motion capture technology. I should qualify that statement by saying “The latest motion capture technology that is affordable and available for a student to have access to for this project.”
Test videos: Kinect sound control and Wiimote MIDI controller.
To enable the Kinect to successfully interface with a Mac I researched and identified a range of applications and Mac libraries which needed to be installed and compiled before the project could progress. I’ve detailed these in previous blog entries – see the links below.
I further researched other methods of interactivity, using a Wii Remote (Wiimote) for motion capture via its internal accelerometers and motion detectors. I also added a MIDI-enabled keyboard and a computer keyboard to control video playback in real time (VJ’ing) in my original Quartz Composer composition.
Using a MIDI keyboard to control Video Clips
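To give a feel for this MIDI route outside of Quartz Composer, here is a rough Python sketch using the mido library: each incoming note-on message toggles a video clip mapped to that note. The note numbers and clip file names are purely illustrative assumptions, not the mapping used in my composition.

```python
# Rough sketch: toggling video clips from MIDI note-on messages.
# Requires mido (plus a backend such as python-rtmidi).
# Note numbers and clip names below are illustrative only.
import mido

NOTE_TO_CLIP = {60: "clip_C.mov", 62: "clip_D.mov", 64: "clip_E.mov"}  # hypothetical mapping
playing = set()

with mido.open_input() as port:                # default MIDI input port
    for msg in port:                           # blocks, yielding messages as they arrive
        if msg.type == "note_on" and msg.velocity > 0:
            clip = NOTE_TO_CLIP.get(msg.note)
            if clip is None:
                continue
            if clip in playing:                # second press stops the clip
                playing.discard(clip)
                print("stop", clip)
            else:                              # first press starts it looping
                playing.add(clip)
                print("play", clip)
```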
Researching VJ’ing Applications/Interfacing Solutions
I researched and experimented with a number of Applications and application design tools before deciding upon Apple’s Quartz Composer for the development of the VJ’ing element of this project.
Others investigated included MAX MSP with Jitter, Modul8 and Ableton; in fact I tried each of these in turn, designing simple solutions with each to test the concept. However, even though all of them seemed suitable for the project, for a number of reasons I discarded them in favour of alternatives.
For example, I discarded the Modul8 application as I’d already used it for a project over the summer break. For details see my blog entries Ophelia Project and VJ’ing Modul8 – getting started. I decided that working with this application again would not add to my experience or skills, so I discarded it to explore new tools and applications.
See my previous Blog entries for details of these other applications – Blog post links below.
Quartz Composer Performance Video Composition
This has been a challenging project, involving a steep learning curve to get to grips with the Kinect and Quartz Composer, and a considerable amount of online research and experimentation to reach the stage of a working prototype. I really do mean prototype, because I’m sure that the final project could be refined and improved upon.
Referring to previous Blog entries, I had a Microsoft Kinect working through my MacBook Pro and driving a MAX MSP patch revealing the depth map image and generating X and Y coordinates.
MAX MSP licenses are limited to a small number of Macs within the University, and the 30-day demo copy I have was due to expire just before the final critique for the project, so I decided that I would take a different path and work with Quartz Composer, which is similar to MAX MSP but comes with Xcode.
Synapse for Kinect, which can be found here http://synapsekinect.tumblr.com/, is a standalone application for both Windows and Mac for interfacing Microsoft’s Kinect to MAX MSP and, most interesting for me, Quartz Composer.
There are a few programs and plugins to install before you can use Synapse with Quartz Composer; just follow the instructions on the website, creating folders as required. I also found it helpful to keep all the files in one folder, which I called Kinect.
One of the most important is qcOSC, an Open Sound Control (OSC) receiver; for our purposes this is what brings the Kinect data, that is the depth image and skeleton data, into Quartz Composer, from where you can link it to control other patches. Referring to Fig 1.1 you can see references to ‘left hand position’, ‘head’, ‘torso’, ‘right knee’ and so on; this ties in with the skeleton viewed in Synapse.
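To show roughly what qcOSC is receiving, here is a minimal Python sketch using the python-osc library that listens for Synapse’s joint messages. The OSC addresses and port are what I recall as Synapse’s defaults (e.g. /lefthand_pos_body on port 12345), so treat them as assumptions and check the Synapse documentation.

```python
# Minimal sketch of an OSC receiver for Synapse joint data, i.e. the job qcOSC
# does inside Quartz Composer. Addresses and port are assumed Synapse defaults.
from pythonosc.dispatcher import Dispatcher
from pythonosc import osc_server

def print_joint(address, *args):
    # Synapse sends each joint position as three floats: x, y, z
    x, y, z = args
    print(f"{address}: x={x:.1f} y={y:.1f} z={z:.1f}")

dispatcher = Dispatcher()
for joint in ("lefthand", "righthand", "head", "torso", "rightknee"):
    dispatcher.map(f"/{joint}_pos_body", print_joint)   # body-relative coordinates

# Note: Synapse normally also expects periodic "track joint" requests sent back
# to it on a separate port, otherwise it stops sending data; omitted for brevity.
server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
server.serve_forever()
```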
Fig 1.2 shows the Kinect depth image patch which I’ve linked to a Sprite which outputs the live Video from the Kinect to the viewer window.
What I have developed is an interactive Performance Video Application and a Musical Instrument using motion capture technology to control Video and Sound in real time.
At the centre of the project is Microsoft’s Kinect interfaced to a MacBook Pro running a custom Quartz Composer composition, which converts the motion capture data from the Kinect to control, in effect, a VJ’ing application.
The XY coordinates derived from the Kinect’s depth image/skeleton are processed and used to enable the starting and stopping of looping video clips based purely on the hand movements of the person standing in front of the Kinect.
Fig 2.0 Screenshot of Video clips playing when XY coordinates data captured from the hand movements match the location of the videos on screen.
The most troublesome part of the project proved to be using the XY coordinate data captured from the Kinect to enable the Billboard Sprites for each of the video clips as the hand moved across the screen. I tried a number of patches, including Boolean logic and setting ranges; in the end, after a meeting with Liam, we identified that the Conditional patch would resolve this issue using the following equations.
Conditional Patch Settings
IF X is equal to or greater than A THEN X = TRUE, AND IF Y is equal to or greater than B THEN Y = TRUE (where A and B are the on-screen coordinates of the video clip).
This is the equation I started with, but in the end I found that I had more accurate results by setting the calculation to a much simpler
(A − T) ≤ X ≤ (A + T) AND (B − T) ≤ Y ≤ (B + T)
where T is a tolerance range, so that the video clip would be enabled when the hand was within 15 points of the actual centre of the video clip.
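Written out in ordinary code, the same test makes the Conditional patch logic easier to follow. This is only an illustrative Python sketch; the clip centre coordinates and the tolerance of 15 are example values, not the exact figures from the composition.

```python
# Sketch of the tolerance-based hit test used to enable a video clip.
# Clip centres and the tolerance are example values for illustration.

TOLERANCE = 15  # enable a clip when the hand is within 15 points of its centre

# Hypothetical on-screen centres (x, y) for three of the video clip "notes"
CLIP_CENTRES = {
    "note_C": (100, 200),
    "note_D": (250, 200),
    "note_E": (400, 200),
}

def clip_enabled(hand_x, hand_y, centre_x, centre_y, tolerance=TOLERANCE):
    """True when the hand lies within +/- tolerance of the clip centre on both axes."""
    return (centre_x - tolerance <= hand_x <= centre_x + tolerance and
            centre_y - tolerance <= hand_y <= centre_y + tolerance)

# Example: which clips would play for a given hand position?
hand_x, hand_y = 255, 195
for name, (cx, cy) in CLIP_CENTRES.items():
    print(name, "playing" if clip_enabled(hand_x, hand_y, cx, cy) else "stopped")
```

This is essentially the job that the group of Conditional and Logic patches performs for each clip in the composition.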
Fig 3.0 Screenshot of the Quartz Composer composition for the completed project. As you can see it involves a large number of elements (patches) which, time permitting, I plan to reduce by creating Macros that combine patches together into a custom patch. For example (see Fig 4.0), a single macro to handle just the XY data capture controlling the video clips, effectively creating one patch to replace 24 individual patches.
Fig 5.0 Screenshot of the Data Capture Calculator (Fig 4.0) consisting of 4 Conditional Patches and 2 Logical Patches converted into just one Macro Custom Patch.
Creating the Video content
The original concept called for filming a choir as they sang a single note each; however, due to the choir’s performance dates and commitments we were unable to arrange a mutually convenient time for completing this task, so alternative arrangements were made.
In cooperation with the AUCB Performing Arts department we organised the hire of one of the acting studios and the services of an actor (thank you, Ben O’Shea). As a group we set up the studio for filming Ben against a green screen as he sang each note in a sequence across a number of octaves. We used a Canon EOS 5D MkII, with the sound captured directly into the camera using a Rode VideoMic. The lighting consisted of a Redhead kit plus fluorescent and LED lighting kits.
I personally assisted with the setup of the studio for filming, in particular the lighting, and filmed the setup and some of the performance for a making-of video to be edited by one of the group members.
Other members of the group edited the video clips, with only minor advice from me regarding After Effects. Each video clip has been given a run-time of 10 seconds, which can be extended if required, and saved in the .MOV file format. It should be noted that .MOV is the preferred format for Modul8, so the video clips could also be used in a live performance using a Modul8-equipped Mac.
Live Performance and Broadcast
The original concept and subsequent design would work well as a live performance, with the performer interacting with the video sequences and the sound to create an improvised music composition based upon the motion capture data. The performance location could be either internally or externally sited, with the video projected onto a convenient wall, a screen or, with a sufficiently powerful projector, onto the side of a building.
It would also be possible to stream the Performance Video over the Internet (broadcast) using a streaming service, for example Groovy Gecko, a private company offering live streaming services. It should be noted that video streaming simply means playing video content as it arrives over the network: the content is watched in real time (internet connection speed dependent) rather than downloaded and then played from the computer’s hard drive, although the video is actually held on the computer in a temporary file that is deleted automatically when playback has finished.
YouTube and Vimeo, indeed any video source where the video is watched over the internet, can be considered a video streaming source. Using a streaming service would be more efficient than using your own computer to stream the video, mainly due to bandwidth considerations.
For live streaming you would need to be able to encode, compress and upload to the internet in real time, which would require additional hardware and a powerful computer.
As I mentioned, this is a project that could be developed further, possibly into a full-blown VJ’ing application making use of the many video effects and filters that are included in the Quartz Composer patch library.
The design of the Quartz Composer composition could also be adapted to control sound by hand movement alone, varying pitch and/or volume; the Halo effect could be replaced by other video or image files. In fact, by hand movement alone any number of video transitions, effects or sounds can be controlled, just by adding the relevant patches.
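As a sketch of that sound-control idea, the snippet below maps a normalised hand position to MIDI pitch and volume messages using the mido library. The ranges, the note/volume mapping and the use of MIDI at all are my own assumptions for illustration; the existing composition does not contain this.

```python
# Sketch: driving pitch and volume from a normalised hand position (0.0-1.0).
# The mapping and ranges are illustrative assumptions, not part of the project.
import mido

def hand_to_midi(hand_x, hand_y):
    """Map hand position to a MIDI note number and a volume value."""
    note = 48 + int(hand_x * 24)      # left-to-right spans two octaves from C3
    volume = int(hand_y * 127)        # raised hand = louder (assumes y grows upwards)
    return note, volume

with mido.open_output() as port:      # default MIDI output port
    note, volume = hand_to_midi(hand_x=0.5, hand_y=0.8)
    port.send(mido.Message("control_change", control=7, value=volume))  # CC7 = channel volume
    port.send(mido.Message("note_on", note=note, velocity=100))
```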
Health & Safety
Live performance will most certainly involve an audience, and therefore some consideration should be given to the Health and Safety measures that need to be put in place for the safety of both the performers and the public.
- There should be some form of barrier between performer and the public to prevent collisions.
- The use of electricity necessitates that precautions are taken to ensure that performers and the public are isolated from the electrics; this should include securing trailing cables.
- Insurance – there is a legal obligation to have public liability insurance cover.
- The choice of venue may also have to be considered for example a live projection in a public place will have different Health and Safety requirements compared with a projection in an enclosed space.
Microsoft Kinect Specifications
Sensor
- Colour and depth-sensing lenses
- Voice microphone array (made up of 4 microphones)
- Tilt motor for sensor adjustment
Field of View
- Horizontal field of view: 57 degrees
- Vertical field of view: 43 degrees
- Physical tilt range: ±27 degrees
- Depth sensor range: 1.2m – 3.5m
Data Streams
- 320×240 16-bit depth @ 30 frames/sec
- 640×480 32-bit colour @ 30 frames/sec
- 16-bit audio @ 16 kHz
Skeletal Tracking System
- Tracks up to 6 people, including 2 active players
- Tracks 20 joints per active player
- Ability to map active players to LIVE Avatars
Project Blog Entry Links (Project Development Timeline)
- Performance Video – Conclusion
- Performance Video – Wiimote MAC
- Performance Video – VJ’ing using Quartz Composer
- Performance Video – Kinnect on MAC and PC
- Performance Video – MAX MSP Jitter
- Performance Video – Modul8
- Performance Video – The Human Orchestra
Word Count 2156