The problems with motion tracking

Motion tracking: simple in concept, difficult in reality.

I spend a lot of time teaching like-minded people the ‘art’ of motion tracking. Along the way I have noted a number of problems, questions and thoughts, which I will share with you in this post.

Motion tracking is still a relatively new concept. It has been done successfully since the 80s, but new technology has opened up new avenues for how we can track objects – the objects usually being us humans.

In no particular order here are some of my comments:

  • Think about where your camera(s) will be. Placing them front-on works very well on the horizontal X plane, but if one person walks behind another, the machine will not know they are just out of view. A camera mounted above the subjects does not have this problem.
  • Regarding the above, you can turn on the ‘lifespan’ feature on the Eyes++ actor to help with this.
  • There are many ways to motion track using Isadora. One only needs to search the forum and this blog to see a few methods. Even if you use my method [Click Here to View], it will not always be perfect for your desired purpose.
  • If you use the freeze & difference method from my point above, it is not always ideal for installation/theatre use, as you may need to take a number of freezes depending on the time of day, changes of set, or similar.
  • A ‘normal’ video camera via FireWire or composite output can be used for motion tracking. Sony cameras have NightShot, which can be handy in darker scenarios, but ideally an infrared camera and infrared lighting will eliminate many problems…
  • For example, if you have a camera pointing at a projection screen, it will create not only a video feedback loop, but Eyes/Eyes++ will also track its own output – hard to describe, but basically it picks up its own tracking and goes crazy!
  • If you use infrared lights and an infrared filter on the camera, the camera will ignore the projection light, as that light is blocked by the filter.
  • You can track up to 16 objects using Eyes++. You can double up on Eyes++ actors by splitting/cropping the video, but this is less than ideal.
  • If you are using motion tracking in a dance/theatre scenario, then consider lighting, props and material on stage, as the camera will detect the slightest change.
  • If you are not using infrared light, then try to black out the space you are using as best as possible. Even the slightest change of light will have an effect.
  • You can simulate motion tracking by feeding pre-recorded video of movement into the Eyes/Eyes++ actors – as far as they are concerned, there is no real difference from a live camera.
  • Adding a touch of Gaussian Blur to the incoming video feed can take the ‘edge’ off unwanted video noise and pixelation. You can also knock the camera out of focus for a similar effect.
  • No matter what, motion tracking is tricky, but with practice and forward thinking you will get the hang of it.
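To make the freeze & difference idea concrete, here is a minimal sketch in plain Python (this is not Isadora code – the Eyes actor does the equivalent internally, and the pixel values here are made up). You grab a reference “freeze” frame of the empty stage, then subtract it from each live frame and threshold the result to find what has changed:

```python
# Frames are represented as flat lists of grayscale values (0-255);
# a real camera feed would supply 2-D images instead.

THRESHOLD = 30  # pixels differing by more than this count as "motion"

def difference_mask(freeze, frame, threshold=THRESHOLD):
    """Return a binary mask: 1 where the live frame differs from the freeze."""
    return [1 if abs(f - p) > threshold else 0 for f, p in zip(freeze, frame)]

freeze = [10, 12, 11, 10, 13, 12]   # empty stage
live   = [10, 12, 90, 85, 13, 12]   # a performer now covers two pixels
mask = difference_mask(freeze, live)
print(mask)  # -> [0, 0, 1, 1, 0, 0]
```

This also shows why the method is fragile in a theatre: if the ambient light shifts, every pixel drifts away from the freeze frame and the mask fills with false “motion” – hence the need to re-take freezes.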
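And here is why a touch of blur helps with noise, again as a hedged pure-Python sketch (a 1-D box blur rather than the 2-D Gaussian that Isadora’s actor applies, with invented values): averaging each pixel with its neighbours shrinks single-pixel noise spikes below the tracking threshold, while larger real features survive.

```python
def box_blur(pixels, radius=1):
    """Average each value with its neighbours within `radius` (edges clamp)."""
    out = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - radius): i + radius + 1]
        out.append(sum(window) // len(window))
    return out

noisy = [10, 10, 200, 10, 10]   # one hot noise pixel
print(box_blur(noisy))          # -> [10, 73, 73, 73, 10]: the spike is tamed
```

Defocusing the camera does much the same thing optically, which is why it works as a free alternative.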



3 thoughts on “The problems with motion tracking”

  1. Hi Graham,

    I was reading this blog post again, and wondering if you had an idea about how to reduce the latency caused by the blur?

    We played 1minute69’s B0DYSC4PES performance live in Rome last week. You can see it here:

    The delay we encounter (because of this blur effect) is obvious, and even if it is sometimes interesting, we’d like to reduce it as much as possible so that we can choose if/when we want to “play” with this kind of echo…

    We use the blur exactly for the reason you explain in your post: in order to get a decent mask without noisy/pixelated edges, we try to smooth the video capture, but apparently this adds a lot to the video-processing load and thus slows down the framerate.

    If you have some advice about this, please let us know,
    thanks in advance,

    • Hi Em,

      Gaussian blur does use up a lot of processing due to its frame blending and rendering. Off the top of my head I can only think of a few workarounds…

      1) If you are using a CCTV or another camera with manual focus, knock the actual camera out of focus to blur the image – obviously this limits the camera’s other uses, but I have done it before.
      2) Try Box Blur or another blur actor (I have not experimented with this, so I’m not sure).
      3) If you have the Core version of Isadora, try using a Quartz Composer actor; this moves the processing onto the graphics hardware rather than Isadora’s own code.
      4) Reduce the quality of the incoming video… again, not always ideal, but if you are adding effects and/or only using it for tracking rather than as visual material, it doesn’t matter. I have run live video input at silly small sizes just to keep a patch going.
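      Point 4 is worth a quick illustration (a plain-Python sketch, nothing Isadora-specific, with made-up pixel values): keeping every n-th pixel in each dimension divides the per-frame workload by roughly n×n, which is usually fine when the feed only drives tracking and is never shown to the audience.

```python
def downsample(frame, n=2):
    """Keep every n-th row and column of a 2-D frame (nearest neighbour)."""
    return [row[::n] for row in frame[::n]]

frame = [[1,  2,  3,  4],
         [5,  6,  7,  8],
         [9, 10, 11, 12],
         [13, 14, 15, 16]]
small = downsample(frame)
print(small)  # -> [[1, 3], [9, 11]] : a quarter of the pixels to process
```

      Halving the resolution like this also acts as a crude blur in itself, since fine pixel noise simply disappears from the tracking input.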

      Hope this helps or triggers some ideas.
