Ian F. Hunt

Cinematographer and Filmmaker


Performance Video – Conclusion

Performance Video

Updated November 2012

There’s still a lot of interest in this project so I felt I should update this final Blog entry:

  • Firstly, please note I developed this project for the Mac running Snow Leopard 10.6. I’ve heard that there may be issues installing Synapse with Lion/Mountain Lion. I would make sure you have installed all the library files I’ve listed in my Blog entries, MacPorts, etc., as without these it just will not work.
  • Secondly, I also successfully installed the Kinect SDK under Windows using Boot Camp, so this may be an option if Lion/Mountain Lion no longer supports Synapse.
Full details of the Project Research and Development Timeline can be accessed using the links to the Blog entries shown at the bottom of this and every Blog post.

Performance Video – The Live Performance

Performance Video – Idea Development


Fig 1.0 Quartz Composer VJ’ing Composition

One of the project brief options is to create a musical instrument. From this brief the group came up with the idea of filming a local choir, with individuals each singing one note, which we would record onto video. Then, using a form of interactive technology, we would play back these videos in real time, creating a live Performance Video.

A performer, call them the conductor or musician, would control the development of the song by starting and stopping the playback of the videos, each representing a musical tone (or sound).

The Approach

This would be unlike previous group projects, as individual members of the group would be approaching it from different directions: some of us would concentrate on the visuals, while I personally would concentrate on the functionality, the interactivity and the technology involved in the project.

My Responsibilities

For this project I investigated a range of interactive methods that could be employed to manipulate and control video, with the aim of enabling a performer, be that a single individual or a group of people, to control the video and sound by their movements alone, using motion capture techniques.

I would design, in effect, a VJ’ing application, with the videos responding to the hand movements of the performer. I would also design a delivery method for a live performance/broadcast, which could encompass projection onto a screen or even the face of a building; for the purposes of this project and the Blog, however, I would just create a video of the working system.

Researching Interactivity – Motion Capture

At the early stage of the project I’d already decided that Microsoft’s Kinect should be central to the project, even though I had no previous knowledge or experience of designing for interactivity. Basing the project around the Kinect would also ensure that I would be using the latest motion capture technology. I should qualify that statement by saying “the latest motion capture technology that is affordable and available for a student to have access to for this project”.

Test videos: Kinect sound control and Wiimote MIDI controller.

To enable the Kinect to successfully interface with a Mac I researched and identified a range of applications and Mac libraries which needed to be installed and compiled before the project could progress. I’ve detailed these in previous blog entries – see the links below.

I further researched other methods of interactivity, using a Wii Remote (Wiimote) for motion capture via its internal accelerometers and motion detectors. I also added a MIDI-enabled keyboard and a computer keyboard to control video playback in real time (VJ’ing) in my original Quartz Composer composition.

Using a MIDI keyboard to control Video Clips

Researching VJ’ing Applications/Interfacing Solutions

I researched and experimented with a number of Applications and application design tools before deciding upon Apple’s Quartz Composer for the development of the VJ’ing element of this project.

Others investigated included MAX MSP with Jitter, Modul8 and Ableton; in fact I tried each of these in turn, designing simple solutions with each to test the concept. However, even though all of them seemed suitable for the project, for a number of reasons I discarded them for alternatives.

For example, I discarded the Modul8 application as I’d already used it for a project over the summer break. For details see my blog entries Ophelia Project and VJ’ing Modul8 – getting started. I decided that working with this application again would not add to my experience or skills, and chose instead to explore new tools and applications.

See my previous Blog entries for details of these other applications – Blog post links below.

Quartz Composer Performance Video Composition

This has been a challenging project, involving a steep learning curve to get to grips with the Kinect and Quartz Composer, and a considerable amount of online research and experimentation to reach a working prototype. I really do mean prototype, because I’m sure the final project could be refined and improved upon.

Referring to previous Blog entries, I had a Microsoft Kinect working through my MacBook Pro and driving a MAX MSP patch, revealing the depth map image and generating X and Y coordinates.

MAX MSP licences are limited to a small number of Macs within the University, and the 30-day demo copy I have was due to expire just before the final critique for the project, so I decided to take a different path and work with Quartz Composer, which is similar to MAX MSP but comes with Xcode.


Fig 1.1 Quartz Composer – qcOSC an OSC receiver object

Synapse for Kinect, which can be found at http://synapsekinect.tumblr.com/, is a standalone application for both Windows and Mac for interfacing Microsoft’s Kinect to MAX MSP and, most interestingly for me, Quartz Composer.

There are a few programs and plugins to install before you can use Synapse with Quartz Composer; just follow the instructions on the website, creating folders as required. I also found it helpful to keep all the files in one folder, which I called Kinect.

One of the most important is qcOSC, which is an Open Sound Control receiver; for our purposes this is what brings the Kinect data – that is, the depth image and skeleton data – into Quartz Composer, from which you can link to control patches. Referring to Fig 1.1 you can see references to ‘left hand position’, ‘head’, ‘torso’, ‘right knee’ and so on; this ties in with the skeleton viewed in Synapse.
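As a rough illustration of what qcOSC is doing (outside Quartz Composer), the sketch below listens for Synapse’s joint messages in Python using the python-osc library. The port number and joint addresses are assumptions based on Synapse’s conventions, so check them against the Synapse documentation before relying on them.

    # Minimal OSC listener sketch: print the joint data Synapse sends.
    # The port and address names are assumptions; confirm them against
    # the Synapse documentation.
    from pythonosc import dispatcher, osc_server

    def joint_handler(address, x, y, z):
        # Synapse is assumed to send three floats per joint message
        print(f"{address}: x={x:.1f} y={y:.1f} z={z:.1f}")

    disp = dispatcher.Dispatcher()
    disp.map("/righthand_pos_body", joint_handler)  # assumed address
    disp.map("/lefthand_pos_body", joint_handler)   # assumed address

    server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 12345), disp)  # assumed port
    server.serve_forever()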


Fig 1.2 Quartz Composer – Kinect Patch

Fig 1.2 shows the Kinect depth image patch which I’ve linked to a Sprite which outputs the live Video from the Kinect to the viewer window.


Fig 1.3 Quartz Composer Lenticular Halo Patch

What I have developed is an interactive Performance Video application and a musical instrument, using motion capture technology to control video and sound in real time.


Fig 2.0 Quartz Composer Viewer

At the centre of the project is Microsoft’s Kinect, interfaced to a MacBook Pro running a custom Quartz Composer composition which converts the motion capture data from the Kinect into the controls for, in effect, a VJ’ing application.

The XY coordinates derived from the Kinect’s depth image/skeleton are processed and used to start and stop looping video clips based purely on the hand movements of the person standing in front of the Kinect.

Fig 2.0 Screenshot of video clips playing when XY coordinate data captured from the hand movements matches the location of the videos on screen.

The most troublesome part of the project proved to be using the XY coordinate data captured from the Kinect to enable the Billboard sprites for each of the video clips as the hand moved across the screen. I tried a number of patches, including Boolean logic and setting ranges; in the end, after a meeting with Liam, we identified that the Conditional patch would resolve this issue using the following logic.

Conditional Patch Settings

IF X is greater than or equal to A, then X = TRUE, AND IF Y is greater than or equal to B, then Y = TRUE (where A and B are the on-screen coordinates of the video clip’s centre).

This is the test I started with, but in the end I found I had more accurate results from a much simpler tolerance check:

A − T ≤ X ≤ A + T   AND   B − T ≤ Y ≤ B + T

where T is a tolerance range, so that the video clip is enabled when the hand is within 15 points of the actual centre of the video clip.
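Expressed as code rather than patches, the whole Conditional patch calculation reduces to a bounds test. Here is a minimal Python sketch of that logic; the tolerance of 15 comes from the composition above, while the clip centre coordinates are made-up examples:

    # Sketch of the Conditional patch logic: a clip is enabled when the
    # hand's XY position is within tolerance T of the clip centre (A, B).
    T = 15  # tolerance in screen points, as used in the composition

    def clip_enabled(x, y, a, b, t=T):
        # equivalent to: A - T <= X <= A + T  AND  B - T <= Y <= B + T
        return abs(x - a) <= t and abs(y - b) <= t

    # hypothetical clip centres for part of the video wall
    clips = {"clip_1": (100, 200), "clip_2": (300, 200)}

    hand_x, hand_y = 305, 192  # example hand coordinates from the Kinect
    for name, (a, b) in clips.items():
        state = "enabled" if clip_enabled(hand_x, hand_y, a, b) else "disabled"
        print(name, state)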


Fig 3.0 Quartz Composer Performance Video Composition


Fig 4.0 Quartz Composer XY Data Capture Calculator

Fig 3.0 is a screenshot of the Quartz Composer composition for the completed project. As you can see, it involves a large number of elements (patches) which, time permitting, I plan to reduce by creating macros that combine patches into a custom patch. For example (see Fig 4.0), handling just the XY data capture controlling the video clips this way would create one patch to replace 24 individual patches.

Fig 5.0 shows the Data Capture Calculator (Fig 4.0), consisting of 4 Conditional patches and 2 Logical patches, converted into just one macro custom patch.


Fig 5.0 Quartz Composer Custom Macro Patch

 

Creating the Video content

The original concept called for filming a choir as they sang a single note each; however, due to the choir’s performance dates and commitments we were unable to arrange a mutual time for completing this task, so alternative arrangements were made.

In cooperation with the AUCB Performing Arts department we organised the hire of one of the acting studios and the services of an actor (thank you, Ben O’Shea). As a group we set up the studio for filming Ben against a green screen as he sang each note in a sequence across a number of octaves. We used a Canon EOS 5D Mk II, with the sound captured directly into the camera using a Rode VideoMic. The lighting consisted of a Redhead kit plus fluorescent and LED lighting kits.

I personally assisted with the setup of the studio for filming, in particular the lighting, and filmed the setup and some of the performance for a ‘making of’ video to be edited by one of the group members.

Other members of the group edited the video clips with only minor advice from me regarding After Effects. Each of the video clips has been set a run time of 10 seconds, which can be extended if required, and saved in the .MOV file format. It should be noted that .MOV is the preferred format for Modul8, so the video clips could also be used in a live performance on a Modul8-equipped Mac.

Live Performance and Broadcast

The original concept and subsequent design would work well as a live performance, with the performer interacting with the video sequences and the sound to create an improvised musical composition based upon the motion capture data. The performance location could be either internally or externally sited, with the video projected onto a convenient wall, a screen or, with a sufficiently powerful projector, the side of a building.

It would also be possible to stream the Performance Video over the Internet (broadcast) using a streaming service, for example Groovy Gecko, a private company offering live streaming services. It should be noted that video streaming just means playing video content on a computer as it arrives: the content is watched in real time (internet connection speed permitting) rather than being downloaded and then played from the computer’s hard drive, although the content is actually held in a temporary file that is deleted automatically when the video has finished.

YouTube and Vimeo, indeed any video source where the video is watched over the internet, can be considered a video streaming source. Using a streaming service would be more efficient than using your own computer to stream the video, mainly due to bandwidth considerations.

For live streaming you would need to be able to encode, compress and upload to the internet in real time, which would require additional hardware and a powerful computer.

Next Steps

As I mentioned, this is a project that could be developed further, possibly into a full-blown VJ’ing application making use of the many video effects and filters included in the Quartz Composer patch library.

The design of the Quartz Composer composition could be used to control sound by hand movement alone, varying pitch and/or volume; the Halo effect could be replaced by other video or image files; in fact, any number of video transitions, effects or sounds can be controlled by hand movement alone, just by adding the relevant patches.

Health & Safety

A live performance will almost certainly involve an audience, so some consideration should be given to the Health and Safety measures that need to be put in place for the safety of both the performers and the public.

The Basics
  • There should be some form of barrier between performer and public to prevent collisions.
  • The use of electricity necessitates precautions to ensure that performers and the public are isolated from the electrics, including securing trailing cables.
  • Insurance – there is a legal obligation to have public liability insurance cover.
  • The choice of venue may also have to be considered; for example, a live projection in a public place will have different Health and Safety requirements compared with a projection in an enclosed space.

Kinect Specification

Sensor


Kinect USB Pinouts

Colour and depth-sensing lenses
Voice microphone array (made up of 4 microphones)
Tilt motor for sensor adjustment

Field of View

Horizontal field of view: 57 degrees
Vertical field of view: 43 degrees
Physical tilt range: ± 27 degrees
Depth sensor range: 1.2m – 3.5m

Data Streams

320×240 16-bit depth @ 30 frames/sec
640×480 32-bit colour @ 30 frames/sec
16-bit audio @ 16 kHz

Skeletal Tracking System

Tracks up to 6 people, including 2 active players
Tracks 20 joints per active player
Ability to map active players to LIVE Avatars

Useful Link http://liambean.hubpages.com/hub/How-to-Hack-the-Microsoft-Kinect-Overview

Project Blog Entry Links (Project Development Timeline)

  1. Performance Video – Conclusion
  2. Performance Video – Wiimote Mac
  3. Performance Video – VJ’ing using Quartz Composer
  4. Performance Video – Kinect on Mac and PC
  5. Performance Video – MAX MSP Jitter
  6. Performance Video – Modul8
  7. Performance Video – The Human Orchestra

Word Count 2156


Performance Video – Wiimote Mac

Wiimote Mac

A recently acquired Wii Games Console offered the opportunity to investigate interfacing the Wii Remote to my Macbook Pro and in turn to Quartz Composer.

The Wiimote is a Bluetooth device, so the obvious thing to do was to connect it directly to the Mac using the built-in Bluetooth stack. Although this connection works, the Wiimote will be unresponsive, as there is no native application on the Mac that supports it.

Researching on the Internet, I came across a number of apps that do support interfacing the Wiimote to the Mac. These are:

  • OSCulator
  • DarwiinRemote
  • Wiinstrument

Fig 1.0 Connecting the Wiimote to OSCulator

OSCulator

Of these, OSCulator provides the most control and customisation, allowing you to dedicate MIDI notes and controls to individual keys on the Wiimote and, most importantly, a choice of port address for connecting to the Mac.

OSCulator is a free download from http://www.osculator.net/. This is a demo version and, although fully functional, you do get reminder screens with a built-in delay during which functionality is lost.

Wiimote Mac

The first task is to connect the Wiimote to OSCulator using Bluetooth. Press buttons 1 and 2 on the Wiimote simultaneously so that all 4 blue lights on the Wiimote are flashing, then click Start Discovery in the OSCulator app; the app should identify and link to the Wiimote – note the green tick and the device’s MAC address in the window.


Fig 2.0 Settings for the Wiimote and OSCulator

The next image, Fig 2.0, shows the Wiimote buttons and XY mapping options; for example, each of the buttons can be assigned to a MIDI note, and the XY values to control pitch, volume etc.

Note the box at the top left of the window which shows the port number; matching this setting to the input port of your Quartz Composer composition links the Wiimote to the input of the patch.

As a reminder, OSC stands for Open Sound Control.

Open Sound Control (OSC) is a protocol for communication among computers, sound synthesizers, and other multimedia devices that is optimized for modern networking technology. Bringing the benefits of modern networking technology to the world of electronic musical instruments, OSC’s advantages include interoperability, accuracy, flexibility, and enhanced organization and documentation.
Introduction to OSC http://opensoundcontrol.org/introduction-osc Site Accessed 02/01/2012
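If you want to sanity-check the port matching without the Wiimote, you can fire a test OSC message at the composition from a few lines of Python (using the python-osc library). The port number and address here are placeholders; use whatever OSCulator’s window actually shows:

    # Send a test OSC message to the port your Quartz Composer
    # composition listens on. Port and address are placeholders;
    # match them to the values shown in OSCulator.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 8000)  # assumed port
    client.send_message("/wii/1/button/A", 1.0)  # assumed address: press
    client.send_message("/wii/1/button/A", 0.0)  # release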

Wiinstrument


Fig 3.0 Wiinstrument Configuration Screen

This free application supports the Wiimote and the Nunchuck accelerometers; download it from http://screenfashion.org/releases/the_wiinstrument/. Although not as configurable as OSCulator, it does provide a useful MIDI connection. It has 2 modes: Keyboard and Drumsticks.

The first thing to do is to connect the Wiimote in the usual way by pressing keys 1 and 2 simultaneously; when it is successfully connected you’ll notice the graphical displays for the accelerometers respond to the movements of the Wiimote.


Fig 4.0 Wiinstrument Drumsticks Screen

There are few options, but you can change the keyboard scale and select which mode to use with the Home key.

In Keyboard mode the Wiimote keys are assigned to MIDI notes, which should be compatible with all MIDI applications, although I’ve only tested it with GarageBand, where it seemed to work well.

In Drumsticks mode you can use the movement of the Wiimote and Nunchuck to simulate hitting a virtual drum; again I’ve only tested this using GarageBand, but it works well.

DarwiinRemote


Fig 5.0 DarwiinRemote Configuration Screen

DarwiinRemote also connects to the Wiimote and simulates mouse movements or Apple Remote functions rather than MIDI. You can download this free program from the usual download sites or from the author’s site here: http://blog.hiroaki.jp/2006/12/000433.html


Fig 6.0 DarwiinRemote Nunchuck Configuration Screen

This is more of a general-purpose input interface, but it will find many uses where a mouse is the input device: for example, controlling a PowerPoint presentation or, via a Processing application, controlling video or video effects just by the movement of the Wiimote.

Personally, I found DarwiinRemote perfect for controlling Front Row (Apple’s media application), using the Wiimote keys to move through the menus playing music and video content.

Project Blog Entry Links

  1. Performance Video – Conclusion
  2. Performance Video – Wiimote Mac
  3. Performance Video – VJ’ing using Quartz Composer
  4. Performance Video – Kinect on Mac and PC
  5. Performance Video – MAX MSP Jitter
  6. Performance Video – Modul8
  7. Performance Video – The Human Orchestra

 

Word Count 724


Performance Video – VJing using Quartz Composer

VJing using Quartz Composer

The project took a new turn when I came across Synapse for Kinect.

Synapse is an app for Mac and Windows that allows you to easily use your Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and any other application that can receive OSC events. It sends joint positions and hit events via OSC, and also sends the depth image into Quartz Composer. In a way, this allows you to use your whole body as an instrument.
http://synapsekinect.tumblr.com/
Synapse for Kinect. Site Accessed Dec 10th 2011

After downloading the source files I quickly had the depth image running on the Mac and from this point I decided to develop the entire project using Quartz Composer.

Quartz Composer is included with Xcode and is a visual programming tool similar in many ways to Max MSP, in that you drag patches into a composition and then connect them to build an application without writing code.

With my new-found interest in Quartz Composer I began researching methods of creating the video wall from the video clips we recorded. Having already created a number of Quartz Composer compositions from tutorials, I had a good idea that the patches I would need would include ‘Billboard’, ‘Keyboard’ and ‘MIDI’.


Fig 1.0 Quartz Composer Video and Billboard Patches

To start the new composition I added the video loops to a folder and then dragged each of these in turn into the composition, each video file creating its own patch (Fig 1.0). I then connected the image output of each of these patches to the image input of a Billboard patch. This I repeated for the 8 video loops which would make up my video wall.

You then need to set the X and Y coordinates for the video loops using the Patch Inspector on the Billboard patches. These I arranged around the centre of the screen, leaving the centre space free to display the image from the Kinect device – in effect, the conductor of the video loops.
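For anyone repeating this, the arithmetic for spacing the clips evenly around a free centre is simple enough to sketch. The Python sketch below assumes Quartz Composer’s horizontal axis runs roughly from -1 to 1 (the vertical range depends on aspect ratio) and uses a made-up radius; in my composition the exact values were set by hand in the Patch Inspector:

    # Sketch: compute X/Y positions for 8 Billboard patches arranged in
    # a ring, leaving the centre free for the Kinect image. The radius
    # is an assumption to tune by eye.
    import math

    def ring_positions(n=8, radius=0.7):
        positions = []
        for i in range(n):
            angle = 2 * math.pi * i / n
            positions.append((radius * math.cos(angle), radius * math.sin(angle)))
        return positions

    for i, (x, y) in enumerate(ring_positions(), start=1):
        print(f"clip {i}: X={x:+.2f}  Y={y:+.2f}")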

Input Patches for Keyboard and MIDI


Fig 2.0 Quartz Composer Input Patches

To test that the video wall functioned as expected, I dragged a ‘Keyboard’ patch onto the composition and linked its outputs to the enable inputs of each of the Billboard patches. The Keyboard keys are configurable, so I changed these from the arrow-key defaults to the number keys 1 to 8 (see Fig 2.0).

Now when I ran the composition and pressed the number keys, each of the video loops played in turn, corresponding to the keys set to enable them.

The next step was to test that MIDI would also be able to control the video loops, so I dragged a MIDI patch onto the composition, which I set for middle C and the 5th octave. The outputs are labelled as for a keyboard, so it was a simple matter to link the outputs of the keys to the enable inputs of the Billboard patches.

However, when I attached a MIDI keyboard to the USB port of my laptop, nothing appeared to work. Going into Audio MIDI Setup, I made sure it was set up for the M-Audio Oxygen keyboard I was using, and I also ran the test program, which confirmed the keyboard was talking to the laptop correctly. Unfortunately there was still no response from the Quartz Composer composition, so I had to go into the MIDI patch and use the Patch Inspector to re-configure the settings. In the end the correct settings were: MIDI Channel 1, Middle C as 5 and MIDI Source as ALL, and then on the keyboard itself setting Octave 1. With these changes it was now possible to control the video loops using the keys on the piano keyboard.
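When MIDI appears dead like this, it helps to confirm what the keyboard is actually sending (channel, note numbers, octave) before blaming the patch. Outside Quartz Composer, a quick way is to dump the raw MIDI stream; here is a hedged Python sketch using the mido library (mido plus a backend such as python-rtmidi must be installed, and the port name will be whatever your keyboard reports):

    # Dump incoming MIDI messages to confirm the channel and note
    # numbers the keyboard sends before wiring it up in Quartz Composer.
    import mido

    print("Available inputs:", mido.get_input_names())
    with mido.open_input(mido.get_input_names()[0]) as port:  # first device
        for msg in port:
            if msg.type in ("note_on", "note_off"):
                print(msg)  # shows channel, note number and velocity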


Fig 3.0 Quartz Composer the complete Video Wall Patches

To complete the design I added an Image patch to act as a background for the video wall. It was at this point that I noticed the layer order needed changing so that the background did not overlay the video loops. The completed patch looks complicated, but it’s quite easy to understand (see Fig 3.0; click on images to see bigger versions).

Finally, the completed Quartz Composer composition with video display can be seen in Fig 4.0, along with a short video showing the MIDI keyboard controlling the video loops. Please note these are not the final videos to be used in the project, but just temporary videos to prove the operation.

[youtube.com/watch?v=05ra1f9BsjU]

Fig 4.0 Quartz Composer Video Wall Composition

Project Blog Entry Links

  1. Performance Video – Conclusion
  2. Performance Video – Wiimote Mac
  3. Performance Video – VJ’ing using Quartz Composer
  4. Performance Video – Kinect on Mac and PC
  5. Performance Video – MAX MSP Jitter
  6. Performance Video – Modul8
  7. Performance Video – The Human Orchestra

 

Word Count 697


Performance Video – Kinect on Mac & PC

Kinect on Mac

Interfacing the Kinect for Interactivity


Microsoft Kinect

The Performance Video Project has evolved over the weeks but the common requirement for our concept is to have some form of interactivity. This will allow the audience to have some control and to be able to interact with the Video.

Each member of the group is investigating a different input device for a MAX MSP patcher, which still needs to be designed, to control the video. I have taken on the task of researching the newest home-use technology, a Microsoft Kinect, which I plan to interface to my dual-boot MacBook Pro.

MAX MSP for the Kinect on MAC


jit.freenect.grab MAX MSP Patcher

First, though, I started by researching MAX MSP patchers specifically for the Kinect, theorising that it would make sense to work backwards; this way I would be able to identify the software requirements needed to get the patcher to work.

I now had a list of software requirements – applications and libraries to download and install before I could connect the Kinect. These are:

  • Xcode
  • Macports
  • Libfreenect

Xcode is a free download from the Apple App Store, but the current version 4.0 only runs on Lion, so I had to install the version from the disks that came with my MacBook – you’ll find Xcode as an optional install on the OS X disk.

MacPorts download and installation instructions can be found here: http://guide.macports.org/ – it’s very involved, but if you work through the instructions methodically the installation should be fairly straightforward.

I found that the libfreenect software was not required in the end, which is just as well, as this involves more than just downloading and running an installer – it needs compiling!

There’s also an alternative called Homebrew, which can be found here: http://mxcl.github.com/homebrew/ – an alternative to the MacPorts packaging system for installing software on the OS X platform. I didn’t use this as I’d already gone down the MacPorts route, but they say it is an easier way of installing what you need.

After I’d completed all this work I got slightly sidetracked after coming across a link to Microsoft’s official Kinect drivers for installation on a PC. These can be found at http://kinectforwindows.org/, where you can download the 32- or 64-bit versions – make sure you read the system requirements.

I had to install the following additional applications:

  • Visual Studio Express
  • DirectX 9
  • .NET Framework 4.0

With all the software installed I connected the Kinect for the first time and started one of the demo programs that came with the SDK download. Unfortunately the system reported that I was missing some .dll files, which I manually installed, but after some investigation it turned out that the 3rd-party firewall I was using was blocking these files from running; by temporarily disabling the firewall the errors stopped and the Kinect worked perfectly. Each of the demo programs functioned as expected, so I now had a working and fully interfaced Kinect, but only for the PC – time to get the Mac installation working.

Rebooting the Mac into OS X, I reconnected the Kinect and ran the MAX MSP patchers I had installed earlier, and surprisingly it worked first time, even though I had yet to install the libfreenect libraries – which is why I mentioned earlier that you may not need them.

Continuing with my research I found several more examples of Mac Kinect installs; one in particular was of interest, involving the use of Quartz Composer, which is either already installed or comes with the OS X system disk. Quartz Composer works in a similar way to MAX MSP in allowing the design of applications without writing any code.

Keen to try this alternative I set about loading the required libraries and programs to get the Kinect to work with Quartz Composer.

What Next?

The next stage is to design a MAX MSP patcher that will use the input from the Kinect to call and run video loops depending on the position of the person or persons in front of the Kinect.

Project Blog Entry Links

  1. Performance Video – Conclusion
  2. Performance Video – Wiimote Mac
  3. Performance Video – VJ’ing using Quartz Composer
  4. Performance Video – Kinect on Mac and PC
  5. Performance Video – MAX MSP Jitter
  6. Performance Video – Modul8
  7. Performance Video – The Human Orchestra

 

Word Count 728

 


Performance Video – MAX MSP Jitter

MAX MSP Jitter

Testing Modul8


Modul8 A - B Group Layers

Referring to the previous blog entry, the idea was to test Modul8 to see if it could be used for our project.

We had a very useful lecture on Modul8 on Monday, which included MIDI mapping, something I had some concerns about originally, but all looked promising for going ahead and using this application for our project. There was one proviso, though: a possible negative regarding just how many videos Modul8 could play back at any one time.

I created a new project and started to add small 10-second MP4 movie files to the Media Set until I had filled one set completely, that is 16 video files in all. I then programmed 1 video file to each of the layers in groups A and B. I disabled the A-B grouping and transition controls so that all 10 layers/videos would display simultaneously in the preview window.

However, when selecting a new group and adding video files to the next layer group, I realised that only 1 group could be live at a time, which meant it would be impossible to have the planned 50 videos available to run live at any time.


Modul8 Media Set

If the idea had been to have just 10 videos running at any time, the project could have worked using Modul8; as this was not the case, I decided we would have to look at alternatives such as MAX MSP and develop our own MIDI-controlled VJing application.

MAX MSP Jitter – First Look

I’ve looked at MAX MSP Jitter before, for a 1st-year project, and although I decided at the time not to use it, I did put together a few patches just to familiarise myself with how it functions.

A recent lecture from Liam re-introduced MAX MSP Jitter and its uses for non-programmers, and although it will take some work I do believe it will be possible to produce a working program.


MAX MSP - cv.jit.faces

I downloaded the MAX MSP demo version from http://cycling74.com/downloads/, which is free to use for 30 days, and at the same time downloaded some of the patches Liam demoed: Computer Vision for Jitter, or cv.jit. This can be downloaded from http://jmpelletier.com/cvjit/ and includes a number of video tracking and video manipulation patches. I installed these as directed and soon had an example patch up and running. The patch chosen tracked movement using either the Mac’s iSight webcam or a video I pre-loaded. The running patch produces coordinates relating to movement, identifying face and hand movements particularly well on both the live view and the video sequences I’d used.
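For comparison only, the same idea is easy to sketch outside Max. The snippet below is a rough Python/OpenCV analogue of the cv.jit face tracking patch, reporting the centre coordinates of detected faces from the webcam; it is not cv.jit itself, and the Haar cascade file it loads ships with OpenCV:

    # Rough Python/OpenCV analogue of cv.jit's face tracking: print XY
    # coordinates of faces seen by the webcam. A comparison sketch, not
    # the Max/Jitter patch itself.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cam = cv2.VideoCapture(0)  # 0 = built-in iSight/webcam

    while True:
        ok, frame = cam.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
            print(f"face at x={x + w // 2}, y={y + h // 2}")  # face centre
    cam.release()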

Although these patches were interesting and offered possibilities for pre-loading the video files for our Human Orchestra concept, I still needed to link this to a MIDI keyboard.

I then searched through the MAX objects looking for MIDI- and audio-related objects and came across ‘kslider’, which displays an on-screen keyboard with note and key-velocity information. Using this as a base, I looked through the list of MIDI objects and, with Lee’s assistance, we tested a number of them using a MIDI keyboard until we’d identified ‘notein’ as the best object to use. This picked up the key presses from the MIDI keyboard, reproduced them visually on the on-screen keyboard and played a note through the AU MIDI interface.

However, this solution wasn’t perfect, as the key mapping was slightly out of sequence and the tone was played both on key press and again as the key was released.
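The double trigger is the classic note-on/note-off pair: many keyboards signal key release with a note-off, or a note-on with velocity 0, and if both are treated as ‘play’ the tone fires twice. A minimal sketch of the filter, again in Python with mido rather than in Max:

    # Only fire a tone on the key press: ignore note_off messages and
    # note_on messages with velocity 0 (a common way to signal release).
    import mido

    def on_key_press(note):
        print(f"play tone for note {note}")  # placeholder action

    with mido.open_input() as port:  # default MIDI input
        for msg in port:
            if msg.type == "note_on" and msg.velocity > 0:
                on_key_press(msg.note)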


MAX MSP kslider

I deemed this a good start and have booked out a MIDI keyboard to continue working on this over the next few days at home, hopefully finding a method of linking the video and keyboard patches together to produce a working system.

Performance Video – Microsoft Kinect


Microsoft Kinect

We may, however, be modifying our concept to include the use of Microsoft’s Kinect tracking system. Following a quick introduction to the Kinect by Liam today, we’ve decided that we’d like to abandon the MIDI keyboard as the main input device for our concept and look at using the Kinect instead. This would make our concept a true performance-based project, with videos reacting to a person’s or persons’ movements.

The Performance Video

I’ve also had some ideas about the Performance Video itself, which I need to put before the group to seek a consensus on the best way forward for this project.

What I now visualise is a looping video of the entire choir moving slightly to and fro, with the videos that become active layered on top when an input is received, either from the MIDI device or the Kinect. Think of tiered rows of the choir: one of the choir moves independently and sings a tone; when a chord is played, or more than one movement is detected through the Kinect, more members of the choir move and sing a chord.

Project Blog Entry Links

  1. Performance Video – Conclusion
  2. Performance Video – Wiimote Mac
  3. Performance Video – VJ’ing using Quartz Composer
  4. Performance Video – Kinect on Mac and PC
  5. Performance Video – MAX MSP Jitter
  6. Performance Video – Modul8
  7. Performance Video – The Human Orchestra

 

Word Count 904


Performance Video – Modul8

Modul8

The Plan – using Modul8

The more I think about the Performance Video project, the more I think we should be concentrating on the performance aspect rather than the technology used to produce our concept. With this in mind I began looking more closely at the technology already available to us. I came to the conclusion that, by basing the entire project around some VJing software (Modul8) that can interface with a MIDI musical instrument, then apart from some programming, the technical part of the project is essentially complete.

The idea for this approach is based upon the Ophelia summer project I worked on with Samantha Else and Michael Moore – see the Ophelia Blog entries for details and the video below.

[youtube.com/watch?v=tP9P8mHqmlk]


This would allow us to concentrate on producing the videos, the sound quality and possibly identifying a suitable location to project our videos, as we all feel this project would work well projected externally on a building.

How will it work?

I’ve sketched out my idea below; essentially all we need is a MacBook Pro with Modul8 installed, a MIDI keyboard and a VGA projector – simple!


Fig 1.0 Performance Video - The Plan Modul8

The idea is that Modul8, which of course can interface to a projector and handle all the video output, will hopefully also be able to interpret the input signals from the MIDI keyboard, which will in turn be mapped to the individual videos. This is all theory at this stage, as I understand from my research that Modul8 does an initial scan of MIDI inputs at start-up and that there is no subsequent MIDI mapping functionality.

My previous experience of using Modul8 also makes me believe that this simple plan will be anything but simple. I fully expect there to be issues with mapping the keyboard notes to the corresponding videos. I’ve had problems with video files working one day but not the next due to CODEC choices, and I suspect that the performance of the overall concept may be limited by the performance of the technology. By this I mean: will the MacBook/Modul8 be able to handle all the multiple videos running simultaneously, especially when a chord is played?

The only solution is to carry out some testing as soon as possible to see if this works; if not, then another plan will be needed, most probably one using Max/MSP and Jitter.

 

Research Links

Modul8 is a revolutionary MacOS X application designed for real time video mixing and compositing. It has been designed for VJs and live performers.
Modul8. Site Accessed 20/11/2011. http://www.modul8.ch/

 

Project Blog Entry Links

  1. Performance Video – Conclusion
  2. Performance Video – Wiimote Mac
  3. Performance Video – VJ’ing using Quartz Composer
  4. Performance Video – Kinect on Mac and PC
  5. Performance Video – MAX MSP Jitter
  6. Performance Video – Modul8
  7. Performance Video – The Human Orchestra

Word Count = 501

 


Performance Video – The Human Orchestra

Performance Video

Performance Video – The Brief

The musical instrument

For the interface design unit your task is to create a ‘digital’ musical instrument and to use it as part of a performance or to get someone to perform with it. Your instrument can be constructed in numerous ways: it can be software, it can be a physical construction, it could be generative or use environmental factors. The performance should have an audience, and the place can be a street, the pub, a hilltop or a nightclub.

Performance Video – Idea Development

The group’s idea (from Lee’s Personal Portrait project) for the Performance Video project is to produce a sequence of videos of the members of a choir, each singing a single note to camera. From these video performances we hope to have enough videos/notes to reproduce up to 3 to 4 octaves.


Midi Keyboard

To control these videos we will design a custom MIDI interface to control the sequencing of the videos, so that any MIDI-compatible instrument will be able to effectively play the videos, producing a musical score.


Performance Video - Video Wall

To visualise this concept, think of a video wall with each individual video featuring a single member of the choir. As you play, for example, a MIDI keyboard, for each note depressed a video runs and you hear that note being sung. Pressing more than one key at a time to make a chord will play the corresponding videos, and so you would hear the chord being sung.
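As a sketch of the mapping behind this (in Python, with made-up file names), the instrument is essentially a dictionary from MIDI note numbers to singer clips, with chords handled by enabling several clips at once:

    # Sketch of the note-to-video mapping: each MIDI note number enables
    # one singer's clip; a chord enables several clips simultaneously.
    # File names are made-up placeholders.
    clips = {60: "choir_C4.mov", 62: "choir_D4.mov", 64: "choir_E4.mov"}

    def notes_to_clips(held_notes):
        # return the set of clips that should be playing right now
        return {clips[n] for n in held_notes if n in clips}

    print(notes_to_clips({60, 64}))  # a two-note chord: two clips play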

We are already considering a number of input devices, one of which is an oversized, floor-positioned piano keyboard, an example of which featured in the film ‘Big’.

[youtube.com/watch?v=AByIokt3X0E]

Another example of the Giant Piano

[youtube.com/watch?v=1HjG6TYMFfg]

My Responsibilities

For this project my primary responsibility will be to shoot and edit the video footage, almost certainly using my Canon EOS 60D. Due to the technical nature of this project I fully expect to have some involvement in the design and development of the MIDI interface, which may involve the use of Max/MSP and Jitter or similar technology; this will become apparent after researching appropriate website references and published material.

It may be that the control technology already exists, in which case we can concentrate on the performance aspect of the project.

As usual I will offer assistance/input to other members of the team as required.

Research Links


What is Max?

Make connections. Make things happen.

Max gives you the parts to create unique sounds, stunning visuals, and engaging interactive media. These parts are called ‘objects’ – visual boxes that contain tiny programs to do something specific. Each object does something different. Some make noises, some make video effects, others just do simple calculations or make decisions.

In Max you add objects to a visual canvas and connect them together with patchcords. You can use as many as you like. By combining objects, you create interactive and unique software without ever writing any code (you can do that too if you really want to). Just connect.
Website Accessed 18/11/2011. http://cycling74.com/whatismax/

MuSET

MuSET is a research group dedicated to the exploration of computer applications to music and sound. The research group was established in 2004 and is located in the School of Music at the University of British Columbia.
Website Accessed 18/11/2011. http://debussy.music.ubc.ca/muset/index.html

Project Blog Entry Links

  1. Performance Video – Conclusion
  2. Performance Video – Wiimote Mac
  3. Performance Video – VJ’ing using Quartz Composer
  4. Performance Video – Kinect on Mac and PC
  5. Performance Video – MAX MSP Jitter
  6. Performance Video – Modul8
  7. Performance Video – The Human Orchestra

Word Count 592

 
