Isadora + Processing + Kinect


[A]ura Video


A nice 3-minute video from Aura here; it’s a bit late – I’ve been rather busy!

[A]ura – WYSIWYG
What You See Is What You Get

“A performance-gallery with open-source-choreography
One day the internet will be as sacred as a giant ancient cathedral. And everybody will sacrifice their details to pay tribute to the digital gods.
In the interactive performance (A)ura, the user is transformed from spectator to creator: during the weeks before the premiere they can find a unique feature at ChoreoMixer, with which they can design and remix the choreography and contents of (A)ura from their browser.
Walter Benjamin argued that in modern art the aura of the work of art withers under mechanical reproduction, its inaccessibility lost because the work is brought too close to the observer.
At the same time, a completely different aura of amazement is created by new technologies. Every user can be a choreographer, an innovative remixer, and in that way create his own virtual memorial, which may then take unexpected paths of its own. Closeness and distance will join crosswise in a new way.
But what about the situation when the digital medium refuses to cooperate with the user?
What you see is what you get.
choreography/dance: Elke Pichler
music/video: Alexander Nantschev
stage design/costumes: Monika Biegler
programming: Stefan Lechner
interactive technical support: Graham Thorne
legal consulting: Clemens Lahner
PR: Stefan Eigenthaler
camera: Philipp Aichinger
Kindly supported by:
MA7, Kosmostheater and Ars Electronica Center”

Don’t let Apple kill OpenNI


http://petitions.moveon.org/sign/apple-let-apple-kill

Written by Mark Coniglio:

Artists, designers, makers, and tinkerers around the world have leveraged the power of 3D cameras like the Xbox Kinect and Xtion Live to create innumerable interactive installations, performances and other compelling works of art. The OpenNI SDK is a key technology behind many commercial and open-source software programs that have empowered this artistic expression. So, we were surprised and shocked when we saw the notification on the OpenNI site saying that the website would close on April 23rd.

Apple, you bought PrimeSense, and now you want to kill this critically important technology. We’re respectfully asking you: don’t do it.

Many of us in this community remember a time when Apple was our champion. You provided us with a truly innovative operating system that offered unforeseen ways to interactively control and present media. This in turn led to the development of thousands of applications that empowered tens of thousands of artists to inspire and entertain audiences around the globe. (And, by the way, helped you sell millions of computers.)

Mr. Cook, your action to take away OpenNI runs counter to that history. We know your priorities have changed. We know that it’s all about mobile devices these days. We understand that you likely have your own plans for 3D imaging technology.

But, empowering personal expression through technology was one of the core values that made Apple the great company it is. We are asking you to remember that part of your history today, and to support us by changing course. Keep OpenNI open and available to us.

PLEASE SIGN THE PETITION! WE NEED THIS TO STAY!

http://petitions.moveon.org/sign/apple-let-apple-kill

Inspiring Videos.


Every now and again I like to post some videos that I’ve seen. Here they are:

A music video/live performance all done using Z Vector – VJ Julius Tuomisto and his Z Vector software from Delicode. I love this type of live, real-time visuals.

Lots of people have been talking about this video. Chances are you have already seen it. But hey, if you haven’t, it’s amazing and well worth a watch! You can also read more about it here: http://thecreatorsproject.vice.com/blog/video-exclusive-bot–dollys-the-box-unpacks-a-radically-new-design-concept

Isadora user ‘Lanz’ posted this on the forum. A very fun, vibrant and interesting piece. Lovely digital scenery! [More info here]

“Taking inspiration from fairy tales, In A Deep Dark Wood is a fun and interactive show about a little girl who bravely ventures into a dark and mysterious wood. Encountering tempting trees to climb and beguiling creatures, the tale unfolds as the young audience help to create a magical world using shadow and light to guide the little girl through her bewitching adventure.”

Last but not least, check out ‘The Zero Hour’ by Imitating the Dog.

http://www.imitatingthedog.co.uk/projects/the-zero-hour/

“Taking as its starting point the final moments of the Second World War in Berlin, The Zero Hour follows the stories of three couples living through three very different versions of the same historical events.”

If you have any cool videos to share then add them to the comments below.

3D [Embodied] Uses NI-mate for tracking.


“3D [Embodied] is a mixed reality performance involving a virtual world as a platform to explore 3D immersive spatial virtual and physical displays. The performer’s body interacts with the three dimensional space by the input of computer vision, skeleton tracking, and the output of 3D video mapping projection. Both video and audio rendering are generated in real time. Choreography by Yacov Sharir, sound design by Bruce Pennycook and technical support by Yago de Quay. Austin, 2013”

Z Vector by Delicode.


The guys at Delicode have teased us with a brand-new piece of software called Z Vector.

It was code-named “The V” for a month or two, but now it has a fancy new website and a demo video from a band called Phantom – their official music video for Scars.

 

Delicode are the driving force behind NI mate and Kinect Stories. This third product is sure to blow the socks off any artist, VJ or individual who has a taste for live visuals and Kinect-style motion-sensing trickery. With an already heavy-hitting feature list, I am sure this will be lots of fun!

I was lucky enough to see a demo of this when I met up with Delicode CEO Julius Tuomisto in Helsinki for the LUX festival. Being able to single someone out in a crowd and then create visuals from them is awesome; podium dancers will be a brilliant addition to this software, let me tell you!

I think people are going to love playing around with the editable GLSL shader editor; I saw it very briefly in January and it was a pop-up window that looked a bit like Processing.

I imagine I’ll need to learn some GLSL coding. I also like the idea of Syphoning (is that even a word?) the results out into other software to add more layers, mix, map and play around with. Wow… can’t wait.
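I won’t pretend to know what Z Vector’s editor actually exposes, but to give a flavour of what “animatable GLSL shaders” means in practice, here is a tiny Processing sketch of my own (the file name wave.frag and the time uniform are just my example, nothing from Delicode): the sketch pushes a new time value into the shader every frame, and the shader uses it to pulse the image.

```java
// wave.frag, saved in the sketch's data folder, might look roughly like:
//
//   #define PROCESSING_TEXTURE_SHADER
//   uniform sampler2D texture;
//   uniform float time;
//   varying vec4 vertTexCoord;
//   void main() {
//     vec4 col = texture2D(texture, vertTexCoord.st);
//     gl_FragColor = vec4(col.rgb * (0.5 + 0.5 * sin(time)), 1.0);
//   }

PShader wave;

void setup() {
  size(640, 360, P2D);             // shaders need an OpenGL renderer (P2D/P3D)
  wave = loadShader("wave.frag");
}

void draw() {
  background(255);
  fill(0, 120, 255);
  ellipse(width / 2, height / 2, 250, 250);
  wave.set("time", millis() / 1000.0f);  // "animatable": push a new value each frame
  filter(wave);                          // apply the shader to the whole canvas
}
```

The same idea scales up to whatever uniforms you like – audio levels, MIDI, depth data – which is presumably where the fun starts.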

Features include:

  • real-time depth camera based 3D scanning and mixing
  • multiple virtual camera and tracking modes
  • multiple drawing modes (vectors, polygons, etc.)
  • full sound and MIDI clock synchronization
  • animatable GLSL shaders with editor
  • full support for Windows / Mac OS X
  • fully scalable output resolution
  • stereo rendering (side-by-side, anaglyphic)
  • GPU acceleration
  • MIDI/OSC control
  • Syphon support (Mac OS X)

‘Reveal’ Report Part 2 [LUX – Helsinki]


If you haven’t already, please read Part 1 [HERE]

So, after about a year of planning I flew out to Helsinki to join Dan and two of his third-year students from the Guildhall School of Drama (London). The first thing that hit me was the weather; it was so cold it took your breath away. I arrived at about 11:30 and, after a beer, went straight to bed… but the next day we got up and went straight to the cabin.

THE CABIN

We had a porta-cabin in which all the computer hardware was kept. (We were lucky, as some people had tents!) And this was pretty much my view for the next six days…

Lots of coffee cups lying about!

AND IT BEGINS…

The first two days we had to set up, test and check the system. Then LUX was officially open and the public could see all the pieces in full glory. Public viewing times were 4:00 PM until 10:00 PM.

We spent two days fine-tuning and making the Isadora patch perfect. The sun went down at about 3, so we only had a short while to preview the images properly… this was from day one…

Sorry for the image quality – I was shaking while holding my phone!

Day One

We attracted a lot of attention! So much so that people were blocking the pathway and filling up the stairs in the image above!

People going about their daily business, walking or cycling home, had to stop and ask people to move out of the way. It was great to see so early on.

It became clear after only a few hours that children understood it straight away, while some of the older people took a little longer to figure out the concept and interaction. We found that a lot of people didn’t want to interact with the piece and just wanted to watch, but the flow of people made it work anyway. As people walked directly up to the cameras they soon realised that they were blocking the cameras and revealing more of the wall. We had a few dogs sniff the cameras, and buggies being pushed past also looked good interacting with the wall images as they rolled by.

Some interesting observations for me included a guy swinging a white plastic bag around his head for about half an hour, a man and woman dancing like robots, and a small child physically touching the wall and the projected light, thinking that it was somehow physically interactive.

The first night went really quickly and some of the patches didn’t look as good as expected, but overall we were happy… we then went for a beer and didn’t get home until 4 in the morning! The guys at Sun Effects looked after us, and it was all networking (of course!).

TECHNICAL DIAGRAM

This is just a simple technical diagram to show the workflow.

It’s not the best diagram, but it gives you an idea of the layout.

Kinect for Windows… Could this be it?!


I have been slightly bewildered as to why the Microsoft Kinect has not really been able to work with Windows computers as easily as it has with Apple Macs, especially in terms of plug-ins. One only has to ‘do a Google’ to see that the majority of software is Mac only.

I guess it’s all down to the drivers and frameworks… (well over my head) but it still seems a bit strange.

The Xbox Kinect. Great for Mac… it’s been a long wait for Windows users!

Anyway, many of the Isadora PC/Windows users have been slightly jealous of us Apple lot, as we’ve been playing around with Kinect sensors for well over a year now via Quartz Composer, Processing, Syphon, etc. But now it seems that there is a Windows plugin!
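For anyone curious what that Mac workflow actually looks like, here is a bare-bones sketch of the Processing end as I understand it. It assumes the SimpleOpenNI and Syphon libraries for Processing are installed, and the server name “Kinect Depth” is just something I made up: the sketch grabs the Kinect depth image and publishes the canvas as a Syphon source that Isadora (or any other Syphon-aware app on the Mac) can pull in as video.

```java
import SimpleOpenNI.*;          // OpenNI-based Kinect access for Processing
import codeanticode.syphon.*;   // Syphon server for Processing (Mac only)

SimpleOpenNI kinect;
SyphonServer server;

void setup() {
  size(640, 480, P3D);                              // Syphon needs an OpenGL renderer
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();                             // turn on the depth stream
  server = new SyphonServer(this, "Kinect Depth");  // the name Isadora will see
}

void draw() {
  kinect.update();                    // fetch the latest Kinect frames
  image(kinect.depthImage(), 0, 0);   // draw the greyscale depth map
  server.sendScreen();                // publish the canvas over Syphon
}
```

The Windows FreeFrame plugins below are trying to give PC users a similarly direct route into Isadora.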

The all-important link for the Windows plugins is here:

http://vjfx.com/main/en/mmblog/mmblogintrinsicsrnd/item/64-freeframe-depthcam-plugins

I have not tested it and don’t actually have access to a Windows machine right now, but it seems it’s easy to install and should be fine.

Important note: download and use the 32-bit versions of all files, since the 64-bit versions do not work with the plugins – this also applies to 64-bit Windows 7!

Enjoy!