Basic Motion Tracking


A simple motion-tracking setup tutorial. I am using Isadora version 1.3, which can be downloaded here as a beta.

Before you start, make sure a USB camera/FireWire camera (or similar) is plugged in BEFORE you open Isadora. It's not the end of the world if you plug one in whilst Isadora is open, but it's good practice to connect first, especially with FireWire cameras, as hot-plugging can blow the FireWire chip!

So, first of all, let's get a 'Video In Watcher' actor.

We won't see any video or anything happen yet (it's not quite that simple!), so… we need to tell Isadora which video input we want to see and tell it to start capturing.

Go to Input>Live Capture Settings

The shortcut is Apple+L.

You will then see this screen:

The Live Capture Settings

Please note: this controls input for both audio and video, not just video!

So… we must scan for all input devices. Click "Scan For Devices". Nothing will happen visually, but Isadora will have just scanned all incoming ports looking for video (and audio) connections.

Isadora can read up to four video inputs. Now, there is a lot of talk about maximum quality, the best capture cards and so on… I can't cover all these issues here, so you are best off logging on to the Isadora forum and asking the community. [Click here to do so]

You can see the four input channels in this section:

(Click on it to see all 4 options)

Make sure the radio button ‘Enable’ is ticked.

Underneath it you will see the Video options. Click on the 'Device' section and you will see a list of all the video inputs available to you. In my case I only have my iSight camera, so only this shows up. A lot of FireWire cameras will show up with irregular names such as NGV-XXXX/SNY-1334 or similar. This is fine.

And so I click on my iSight input:

Now, click 'Start Live Capture'.

You should now see something similar…

You can now close this window and return to the main patching window of Isadora. You may see a slight change…

The actor has now been activated, as we can see the X in the trigger output… this is because it is receiving the live video feed. Here is a comparison of the Video In Watcher actor active and not active:

Now for the creative part… drop the 'Eyes' actor into the window:

Then link the video out into the video in, like so:

You should now see all these numbers going a bit crazy if someone or something is moving in front of the camera…

A really handy little feature is the monitor within the Eyes actor. Turn this on either by clicking in the box and typing 'on', or by clicking in the little black box and dragging your mouse upwards. You should then see something similar to this:

This is a low-quality preview of what the camera is seeing, with a display of what the Eyes actor is seeing overlaid on top of the image. The yellow and red lines relate to OBJ CTR H/V and CTR OFFSET H/V. These four abbreviations stand for:

Object Center Horizontal
Object Center Vertical
and
Center Offset Horizontal
Center Offset Vertical

Yellow lines = Object Center Horizontal/Vertical
Red lines = Center Offset Horizontal/Vertical

So the Eyes actor sees objects (normally us, humans) as just blobs. It sees these objects and turns them into numerical data from 0 to 100, so the numbers you see on the right-hand side of the actor relate to this 0-100 range. If you stand dead center, the horizontal value should be close to 50.
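To make the idea concrete, here is a rough sketch (not Isadora's internal code, and the function name is my own) of how a blob in a frame can be reduced to 0-100 center values like the ones Eyes outputs:

```python
def object_center(mask):
    """mask: 2D list of 0/1 pixels, where 1 marks the tracked blob.
    Returns (h, v) as 0-100 percentages of the frame, or None if empty."""
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, p in enumerate(row) if p]
    if not pts:
        return None
    w, h = len(mask[0]), len(mask)
    # Average the blob's pixel positions to find its center of mass...
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    # ...then scale to 0-100, so dead center reads as roughly 50.
    return (round(cx / (w - 1) * 100, 1),
            round(cy / (h - 1) * 100, 1))
```

A single blob pixel in the middle of a 3x3 frame gives `(50.0, 50.0)`, which matches the "stand dead center, read about 50" behaviour described above.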

Now a few tips:

– Experiment with the ‘Threshold’:

Turning the value up will allow less of the video feed into the Eyes actor, which can get rid of background noise and objects that are not needed. It's difficult to get this right, and it will often need to be changed depending on lighting, brightness of clothing, and the like. I normally set it to 11/12 out of habit.
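Conceptually, a threshold just zeroes out anything dimmer than the cutoff so the tracker only sees the bright stuff. A minimal sketch (my own illustration, assuming a 0-100 brightness scale, not Isadora's actual processing):

```python
def apply_threshold(frame, threshold):
    """Keep only pixels brighter than the threshold (0-100 brightness);
    everything at or below it becomes background (0)."""
    return [[p if p > threshold else 0 for p in row] for row in frame]
```

With a threshold of 11, dim background pixels (say, brightness 5) vanish while a brightly lit performer (say, 80) survives, which is exactly why raising it cleans up noise at the cost of losing darker objects.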

– Turn on ‘Smoothing’:

This smooths out the values that Eyes creates, which stops jittery values and makes everything move a lot more nicely. Again, out of habit I set mine at 6… no real reason; this is subjective.
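Smoothing like this is commonly done with an exponential moving average, where a higher amount means each new reading nudges the output less. This is an illustrative sketch of the general technique, not Isadora's actual algorithm:

```python
class Smoother:
    """Exponential moving average; a higher 'amount' gives a
    slower, smoother output at the cost of a little lag."""
    def __init__(self, amount):
        self.alpha = 1.0 / (1 + amount)  # e.g. amount=6 -> alpha = 1/7
        self.value = None

    def update(self, raw):
        if self.value is None:
            self.value = raw  # first reading passes straight through
        else:
            # move only a fraction of the way toward the new reading
            self.value += self.alpha * (raw - self.value)
        return self.value
```

Feeding it a value jittering between 40 and 60 keeps the output hovering near 50 instead of bouncing, which is the "stops jittery values" effect described above.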


– Try ‘Inverse’ in tricky situations:

Turning this ON will sometimes give you better results, as it flips the image into a negative. I often use this if dancers or actors are wearing dark clothing and I am having trouble tracking them.
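Inverting a greyscale image just flips each pixel to its opposite brightness, so dark clothing against a bright background turns into a bright blob the tracker can latch onto. A one-line sketch of the idea (my own illustration, assuming 0-255 grey values):

```python
def inverse(frame):
    """Flip each grey pixel (0-255) to its negative, so dark
    objects become bright and vice versa."""
    return [[255 - p for p in row] for row in frame]
```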

Now what?

I can't answer this… shocked?

Well, it's up to you what you do with this data…

Try linking it up to the position of a movie player, the brightness of a projector, or the wet/dry of an effect. I use motion tracking data in a different way each time, and it's your artistic merit that will create something new.
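Whatever you connect it to, the underlying move is always the same: rescale the 0-100 tracking value onto the target parameter's range (Isadora's links do this for you; the sketch below, with a made-up helper name, just shows the arithmetic):

```python
def scale(value, out_min, out_max):
    """Map a 0-100 tracking value onto any parameter range,
    e.g. 0.0-1.0 for an effect's wet/dry mix."""
    return out_min + (value / 100.0) * (out_max - out_min)
```

So a performer standing dead center (value 50) sets a 0.0-1.0 wet/dry to 0.5, and walking to the far right (value 100) pushes a 20-220 position parameter to 220.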

Hope this helps and have fun.

Skulpture
