Performance Video – Conclusion


Updated November 2012

There’s still a lot of interest in this project so I felt I should update this final Blog entry:

  • Firstly, please note I developed this project for the Mac running Snow Leopard 10.6. I’ve heard that there may be issues installing Synapse with Lion/Mountain Lion. I would make sure you have installed all the library files I’ve listed in my blog entries, MacPorts, etc., as without these it just will not work.
  • Secondly, I also successfully installed the Kinect SDK for a Windows installation using Boot Camp, so this may also be an option if Lion/Mountain Lion no longer supports Synapse.
Full details of the Project Research and Development Timeline can be accessed using the links to the Blog entries shown at the bottom of this and every Blog post.

Performance Video – The Live Performance

Performance Video – Idea Development


Fig 1.0 Quartz Composer VJ’ing Composition

One of the project brief options is to create a musical instrument. From this brief the group came up with the idea of filming a local choir, with individuals each singing one note, which we would record onto video. Then, using a form of interactive technology, we would play back these videos in real time, creating a live Performance Video.

A performer, call them the conductor or musician, would control the development of the song by starting and stopping the playback of the videos, each representing a musical tone (or sound).

The Approach

This would be unlike previous group projects, as individual members of the group would be approaching it from different directions: some of us would concentrate on the visuals, while I personally would concentrate on the functionality, the interactivity and the technology involved in the project.

My Responsibilities

For this project I investigated a range of interactive methods that could be employed to manipulate and control video, the aim being to enable a performer, be that a single individual or a group of people, to control the video and sound by their movements alone using motion capture techniques.

I would design, in effect, a VJ’ing application, with the videos responding to the hand movements of the performer. I would also design a delivery method for a live performance/broadcast, which could encompass projection onto a screen or even the face of a building, but for the purposes of this project and the blog I would just create a video of the working system.

Researching Interactivity – Motion Capture

At an early stage of the project I’d already decided that Microsoft’s Kinect should be central to it, even though I had no previous knowledge or experience of designing for interactivity. Basing the project around the Kinect would also ensure that I would be using the latest motion capture technology. I should qualify that statement by saying "the latest motion capture technology that is affordable and available for a student to have access to for this project".

Test Videos: Kinect Sound Control and Wiimote MIDI Controller.

To enable the Kinect to successfully interface with a Mac I researched and identified a range of applications and Mac libraries which needed to be installed and compiled before the project could progress. I’ve detailed these in previous blog entries – see the links below.

I further researched other methods of interactivity, using a Wii Remote (Wiimote) for motion capture via its internal accelerometers and motion detectors. I also added a MIDI-enabled keyboard and a computer keyboard to control video playback in real time (VJ’ing) in my original Quartz Composer composition.

Using a MIDI keyboard to control Video Clips

Researching VJ’ing Applications/Interfacing Solutions

I researched and experimented with a number of Applications and application design tools before deciding upon Apple’s Quartz Composer for the development of the VJ’ing element of this project.

Others investigated included MAX MSP with Jitter, Modul8 and Ableton; in fact I tried each of these in turn, designing simple solutions with each to test the concept. However, even though all of them seemed suitable for the project, for a number of reasons I discarded them in favour of alternatives.

For example, I discarded the Modul8 application as I’d already used it for a project over the summer break (for details see my blog entries Ophelia Project and VJ’ing Modul8 – getting started). I decided that working with it again would not add to my experience or skills, so I set it aside to explore new tools and applications.

See my previous Blog entries for details of these other applications – Blog post links below.

Quartz Composer Performance Video Composition

This has been a challenging project, involving a steep learning curve to get to grips with the Kinect and Quartz Composer, and a considerable amount of online research and experimentation to reach the stage of a working prototype. I really do mean prototype, because I’m sure the final project could be refined and improved upon.

Referring to previous blog entries, I had a Microsoft Kinect working through my MacBook Pro and driving a MAX MSP patch, revealing the depth map image and generating X and Y coordinates.

MAX MSP licences are limited to a small number of Macs within the University, and the 30-day demo copy I have was due to expire just before the final critique for the project, so I decided to take a different path and work with Quartz Composer, which is similar to MAX MSP but comes with Xcode.


Fig 1.1 Quartz Composer – qcOSC an OSC receiver object

Synapse for Kinect, which can be found at http://synapsekinect.tumblr.com/, is a stand-alone application for both Windows and Mac for interfacing Microsoft’s Kinect to MAX MSP and, most interestingly for me, Quartz Composer.

There are a few programs and plugins to install before you can use Synapse with Quartz Composer; just follow the instructions on the website, creating folders as required. I also found it helpful to keep all the files in one folder, which I called Kinect.

One of the most important is qcOSC, an Open Sound Control (OSC) receiver; for our purposes this is what brings the Kinect data (the depth image and skeleton data) into Quartz Composer, from where you can link it to control other patches. Referring to Fig 1.1 you can see references to ‘left hand position’, ‘head’, ‘torso’, ‘right knee’ and so on, which tie in with the skeleton viewed in Synapse.
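
For anyone who wants to check what Synapse is actually sending before wiring up qcOSC, below is a minimal sketch using Python and the python-osc library to print the incoming joint data. The port number (12345) and the joint addresses are assumptions based on Synapse’s documentation, so verify them against your own installation.

```python
# A minimal sketch (separate from the Quartz Composer composition) for inspecting
# the joint data Synapse broadcasts over OSC.
# Assumptions: Synapse is sending to localhost on port 12345 and hand joints
# arrive as three floats on the addresses mapped below - check these against
# your own installation. Requires the python-osc package (pip install python-osc).
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_joint(address, x, y, z):
    # Print each joint position so we can see the values qcOSC would receive.
    print(f"{address}: x={x:.2f} y={y:.2f} z={z:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/righthand_pos_body", on_joint)  # assumed Synapse address
dispatcher.map("/lefthand_pos_body", on_joint)   # assumed Synapse address

server = BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
server.serve_forever()  # Ctrl-C to stop
```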


Fig 1.2 Quartz Composer – Kinect Patch

Fig 1.2 shows the Kinect depth image patch, which I’ve linked to a Sprite that outputs the live video from the Kinect to the viewer window.


Fig 1.3 Quartz Composer Lenticular Halo Patch

What I have developed is an interactive Performance Video application and a musical instrument, using motion capture technology to control video and sound in real time.


Fig 2.0 Quartz Composer Viewer

At the centre of the project is Microsoft’s Kinect, interfaced to a MacBook Pro running a custom Quartz Composer composition which converts the motion capture data from the Kinect into the controls of, in effect, a VJ’ing application.

The XY coordinates derived from the Kinect’s depth image/skeleton are processed and used to enable the starting and stopping of looping video clips, based purely on the hand movements of the person standing in front of the Kinect.

Fig 2.0 Screenshot of video clips playing when the XY coordinate data captured from the hand movements matches the location of the videos on screen.

The most troublesome part of the project proved to be using the XY coordinate data captured from the Kinect to enable the Billboard sprites for each of the video clips as the hand moved across the screen. I tried a number of patches, including Boolean logic and setting ranges; in the end, after a meeting with Liam, we identified that the Conditional patch would resolve this issue using the following equations.

Conditional Patch Settings

IF X is equal to or greater than A THEN X = TRUE, AND IF Y is equal to or greater than B THEN Y = TRUE

This is the condition I started with, but in the end I found I had more accurate results by simplifying the calculation to:

Xclip − T ≤ Xhand ≤ Xclip + T   AND   Yclip − T ≤ Yhand ≤ Yclip + T

where T is a tolerance range, so that the video clip would be enabled when the hand was within 15 points of the actual centre of the video clip.
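
As a sanity check of the logic, here is a small sketch of the same tolerance test outside Quartz Composer. The tolerance of 15 points comes from the composition described above; the clip centre values and all of the names are mine and purely illustrative.

```python
# A small sketch of the tolerance test performed by the Conditional patches.
# The tolerance of 15 points matches the composition; the clip centres below
# are hypothetical placeholders rather than the values from my composition.
TOLERANCE = 15

# (x, y) centre of each video clip's Billboard on screen (illustrative values).
CLIP_CENTRES = {
    "clip_1": (-300, 200),
    "clip_2": (300, 200),
    "clip_3": (-300, -200),
    "clip_4": (300, -200),
}

def clip_enabled(hand_x, hand_y, clip_x, clip_y, tolerance=TOLERANCE):
    # Enable the clip when the hand is within +/- tolerance of the clip centre
    # on both axes - the equivalent of two Conditional patches ANDed together.
    return abs(hand_x - clip_x) <= tolerance and abs(hand_y - clip_y) <= tolerance

# Example: a hand position reported via the Kinect skeleton data.
hand_x, hand_y = -310, 192
for name, (cx, cy) in CLIP_CENTRES.items():
    if clip_enabled(hand_x, hand_y, cx, cy):
        print(f"{name} enabled")
```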


Fig 3.0 Quartz Composer Performance Video Composition


Fig 4.0 Quartz Composer XY Data Capture Calculator

Fig 3.0 is a screenshot of the Quartz Composer composition for the completed project. As you can see it involves a large number of elements (patches) which, time permitting, I plan to reduce by creating Macros that combine patches into a single custom patch. For example (see Fig 4.0), handling the XY data capture controlling the video clips this way would effectively create one patch to replace 24 individual patches.

Fig 5.0 is a screenshot of the Data Capture Calculator (Fig 4.0), consisting of 4 Conditional patches and 2 Logic patches, converted into just one custom Macro patch.


Fig 5.0 Quartz Composer Custom Macro Patch

 

Creating the Video content

The original concept called for filming a choir as they sang a single note each; however, due to the choir’s performance dates and commitments we were unable to arrange a mutual time for completing this task, so alternative arrangements were made.

In cooperation with the AUCB Performing Arts department we organised the hire of one of the acting studios and the services of an actor (thank you, Ben O’Shea). As a group we set up the studio for filming Ben against a green screen as he sang each note in a sequence across a number of octaves. We used a Canon EOS 5D Mk II, with the sound captured directly into the camera using a Rode VideoMic. The lighting consisted of a Redhead kit plus fluorescent and LED lighting kits.

I personally assisted with the setup of the studio for filming, in particular the lighting, and filmed the setup and some of the performance for a making-of video to be edited by one of the group members.

Other members of the group edited the video clips, with only minor advice from me regarding After Effects. Each of the video clips has been set a run-time of 10 seconds, which can be extended if required, and saved in the .MOV file format. It should be noted that .MOV is the preferred format for Modul8, so the video clips could also be used in a live performance on a Modul8-equipped Mac.

Live Performance and Broadcast

The original concept and subsequent design would work well as a live performance, with the performer interacting with the video sequences and the sound to create an improvised music composition based upon the motion capture data. The performance location could be either internal or external, with the video projected onto a convenient wall, a screen or, with a sufficiently powerful projector, onto the side of a building.

It would also be possible to stream the Performance Video over the internet (broadcast) using a streaming service, for example Groovy Gecko, a private company offering live streaming services. It should be noted that video streaming simply means that the video content is watched in real time (internet connection speed permitting) rather than downloaded and then played from the computer’s hard drive, although the content is actually held on the computer in a temporary file that is deleted automatically when the video has finished.

YouTube and Vimeo, indeed any source where the video is watched over the internet, can be considered a video streaming source. Using a streaming service would be more efficient than streaming from your own computer, mainly due to bandwidth considerations.

For live streaming you would need to be able to encode, compress and upload to the internet in real time, which would require additional hardware and a powerful computer.

Next Steps

As I mentioned, this is a project that could be developed further, possibly into a full-blown VJ’ing application making use of the many video effects and filters that are included in the Quartz Composer patch library.

The design of the Quartz Composer composition could also be used to control sound by hand movement alone, varying pitch and/or volume; the Halo effect could be replaced by other video or image files; in fact, any number of video transitions, effects or sounds could be controlled by hand movement alone, just by adding the relevant patches.

Health & Safety

A live performance will almost certainly involve an audience, and therefore some consideration should be given to the health and safety requirements that should be put in place for the safety of both the performers and the public.

The Basics
  • There should be some form of barrier between the performer and the public to prevent collisions.
  • The use of electricity necessitates precautions to ensure that performers and the public are isolated from the electrics, which should include securing trailing cables.
  • Insurance – there is a legal obligation to have public liability insurance cover.
  • The choice of venue may also have to be considered; for example, a live projection in a public place will have different health and safety requirements compared with a projection in an enclosed space.

Kinect Specification

Sensor


Kinect USB Pinouts

Colour and depth-sensing lenses
Voice microphone array (made up of 4 microphones)
Tilt motor for sensor adjustment

Field of View

Horizontal field of view: 57 degrees
Vertical field of view: 43 degrees
Physical tilt range: ± 27 degrees
Depth sensor range: 1.2m – 3.5m

Data Streams

320×240 16-bit depth @ 30 frames/sec
640×480 32-bit colour @ 30 frames/sec
16-bit audio @ 16 kHz

Skeletal Tracking System

Tracks up to 6 people, including 2 active players
Tracks 20 joints per active player
Ability to map active players to LIVE Avatars

Useful Link http://liambean.hubpages.com/hub/How-to-Hack-the-Microsoft-Kinect-Overview

Project Blog Entry Links (Project Development Timeline)

  1. Performance Video – Conclusion
  2. Performance Video – Wiimote MAC
  3. Performance Video – VJ’ing using Quartz Composer
  4. Performance Video – Kinnect on MAC and PC
  5. Performance Video – MAX MSP Jitter
  6. Performance Video – Modul8
  7. Performance Video – The Human Orchestra

Word Count 2156


Performance Video – VJing using Quartz Composer


The project took a new turn when I came across Synapse for Kinect.

Synapse is an app for Mac and Windows that allows you to easily use your Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and any other application that can receive OSC events. It sends joint positions and hit events via OSC, and also sends the depth image into Quartz Composer. In a way, this allows you to use your whole body as an instrument.
http://synapsekinect.tumblr.com/
Synapse for Kinect. Site Accessed Dec 10th 2011

After downloading the source files I quickly had the depth image running on the Mac and from this point I decided to develop the entire project using Quartz Composer.

Quartz Composer is included with Xcode and is a visual programming tool similar in many ways to Max MSP, in that you drag patches into a composition and then connect them to build up a working application.

With my new-found interest in Quartz Composer I began researching methods of creating the video wall from the video clips we recorded. Having already created a number of Quartz Composer compositions from tutorials, I had a good idea that the patches I would need would include ‘Billboard’, ‘Keyboard’ and ‘MIDI’.


Fig 1.0 Quartz Composer Video and Billboard Patches

To start the new composition I added the video loops to a folder and then dragged each of them in turn into the composition, each video file creating its own patch (Fig 1.0). I then connected the image output of each of these patches to the image input of a Billboard patch, and repeated this for the 8 video loops which would make up my video wall.

You then need to set the X and Y coordinates for the video loops using the Patch Inspector on the Billboard patches. These I arranged around the centre of the screen, leaving the centre space free to display the image from the Kinect device, in effect the conductor of the video loops.
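
For reference, below is a quick sketch of one way the eight Billboard positions could be worked out, spacing the clips evenly in a ring so that the centre stays free. The radius and coordinate scale are assumptions for illustration; in the composition the values were simply set by hand in the Patch Inspector.

```python
# A quick sketch of one way to lay out eight video loops evenly around the
# centre of the screen, leaving the middle free for the Kinect image.
# The radius and coordinate scale are assumptions for illustration only.
import math

NUM_CLIPS = 8
RADIUS = 0.6  # hypothetical distance from the centre, in viewer units

for i in range(NUM_CLIPS):
    angle = 2 * math.pi * i / NUM_CLIPS  # spread the clips evenly in a circle
    x = round(RADIUS * math.cos(angle), 3)
    y = round(RADIUS * math.sin(angle), 3)
    print(f"Billboard {i + 1}: X = {x}, Y = {y}")
```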

Input Patches for Keyboard and Midi


Fig 2.0 Quartz Composer Input Patches

To test that the video wall functions as expected I dragged a ‘Keyboard’ patch onto the composition and linked its outputs to the enable inputs of each of the Billboard patches. The keyboard keys are configurable, so I changed them from the arrow-key defaults to the number keys 1 to 8 (see Fig 2.0).

Now, when I ran the composition and pressed the number keys, each of the video loops played in turn, corresponding to the keys set to enable them.
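
To make the mapping explicit, here is a tiny sketch of the same idea: each number key drives the enable state of one video loop. The key bindings (1 to 8) mirror the composition; whether a press toggles the clip or only enables it while held depends on how the enable input is driven, so I’ve shown a simple toggle for illustration, and all of the names are mine.

```python
# A tiny sketch of the keyboard-to-clip mapping used for testing the video wall:
# pressing a number key toggles the enable state of the corresponding video loop.
# The key assignments (1-8) match the composition; the toggle behaviour and the
# names are illustrative only.
clip_enabled = {str(n): False for n in range(1, 9)}  # keys "1".."8" -> enable flag

def handle_key(key: str) -> None:
    # Toggle the clip mapped to this key; ignore any other key press.
    if key in clip_enabled:
        clip_enabled[key] = not clip_enabled[key]
        state = "playing" if clip_enabled[key] else "stopped"
        print(f"Video loop {key}: {state}")

# Example: simulate pressing keys 1, 3 and then 1 again.
for key in ["1", "3", "1"]:
    handle_key(key)
```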

The next step was to test that MIDI would also be able to control the video loops, so I dragged a MIDI patch onto the composition, which I set for middle C and the 5th octave. The outputs are labelled as for a keyboard, so it was a simple matter to link the outputs of the keys to the enable inputs of the Billboard patches.

However, when I attached a MIDI keyboard to my laptop’s USB port nothing appeared to work. Going into Audio MIDI Setup I made sure it was configured for the M-Audio Oxygen keyboard I was using, and I also ran the test program, which confirmed the keyboard was talking to the laptop correctly. Unfortunately there was still no response from the Quartz Composer composition, so I had to go into the MIDI patch and use the Patch Inspector to reconfigure the settings. In the end the correct settings were: MIDI Channel 1, Middle C as 5 and MIDI Source as All, and then on the keyboard itself setting it to Octave 1. With these changes it was now possible to control the video loops using the keys of the piano keyboard.
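
For anyone debugging the same problem outside Quartz Composer, the minimal sketch below listens for incoming notes using the mido library, so you can confirm which note numbers and channel the keyboard is actually sending before touching the MIDI patch settings. The port handling is an assumption; the exact port name for an M-Audio Oxygen will differ from system to system.

```python
# A minimal sketch for checking what a MIDI keyboard is actually sending,
# useful when the Quartz Composer MIDI patch appears unresponsive.
# Requires the mido package plus a backend such as python-rtmidi.
import mido

# List the available input ports so you can pick the right one
# (the exact name of the M-Audio Oxygen will vary between systems).
print(mido.get_input_names())

with mido.open_input() as port:  # opens the default input port
    for msg in port:
        if msg.type in ("note_on", "note_off"):
            # mido numbers channels from 0, so channel 0 here is "MIDI Channel 1".
            print(f"{msg.type}: note={msg.note} channel={msg.channel + 1}")
```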


Fig 3.0 Quartz Composer the complete Video Wall Patches

To complete the design I added an Image patch to act as a background for the video wall; it was at this point that I noticed the layer order needed to be changed so that the background did not overlay the video loops. The completed composition looks complicated but it’s quite easy to understand (see Fig 3.0; click on the images to see bigger versions).

Finally, the completed Quartz Composer composition with video display can be seen in Fig 4.0, along with a short video showing the MIDI keyboard controlling the video loops. Please note these are not the final videos to be used in the project, just temporary videos to prove the operation.

[youtube.com/watch?v=05ra1f9BsjU]

Fig 4.0 Quartz Composer Video Wall Composition

Project Blog Entry Links

  1. Performance Video – Conclusion
  2. Performance Video – Wiimote MAC
  3. Performance Video – VJ’ing using Quartz Composer
  4. Performance Video – Kinnect on MAC and PC
  5. Performance Video – MAX MSP Jitter
  6. Performance Video – Modul8
  7. Performance Video – The Human Orchestra

 

Word Count 697
