(A)ura Video


A nice three-minute video from (A)ura here. It's a bit late – I've been rather busy!

(A)ura – WYSIWYG
What You See Is What You Get

“A performance-gallery with open-source choreography
One day the internet will be as sacred as a giant ancient cathedral. And everybody will sacrifice their details to pay tribute to the digital gods.
In the interactive performance (A)ura, the user is transformed from spectator to creator: during the weeks before the premiere, they can find a unique feature at ChoreoMixer with which they can design and remix the choreography and contents of (A)ura using the browser.
Walter Benjamin says that in modern art the aura of the work of art withers because of mechanical reproduction: it loses its inaccessibility through too close a proximity to the observer.
At the same time, a completely different aura of amazement will be created by new technologies. Every user can be a choreographer, an innovative remixer, and in that way create their own virtual memorial, which may follow its own unexpected paths. Closeness and distance will intertwine in a new way.
But what about the situation when the digital medium refuses to cooperate with the user?
What you see is what you get.
choreography/dance: Elke Pichler
music/video: Alexander Nantschev
stage design/costumes: Monika Biegler
programming: Stefan Lechner
interactive technical support: Graham Thorne
legal consulting: Clemens Lahner
pr: Stefan Eigenthaler
camera: Philipp Aichinger
Kindly supported by:
MA7, Kosmostheater and Ars Electronica Center”


Leap Motion


I like my gadgets. I am admittedly a bit of a gadget freak… but I don't often actually buy them, usually because they are expensive and I know deep down I won't use them as much as my mind believes I will. But the Leap Motion is different.

The Leap Motion is a USB-powered hand/finger tracking device. Think of it as an Xbox Kinect for the hands, perhaps? It's made for both Mac and Windows, which is great.
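For the curious, here's roughly what reading tracking data from it looks like with the Leap SDK's Python bindings. This is a minimal sketch based on the v2-era SDK samples; class and property names may differ in whatever SDK version you have:

import sys
import Leap  # bundled with the Leap Motion SDK (the v2-era Python bindings)

class PalmListener(Leap.Listener):
    def on_connect(self, controller):
        print("Leap Motion connected")

    def on_frame(self, controller):
        frame = controller.frame()  # the most recent tracking frame
        for hand in frame.hands:
            pos = hand.palm_position  # millimetres relative to the device
            print("palm x=%.0f y=%.0f z=%.0f fingers=%d"
                  % (pos.x, pos.y, pos.z, len(hand.fingers)))

listener = PalmListener()
controller = Leap.Controller()
controller.add_listener(listener)

print("Press Enter to quit...")
sys.stdin.readline()
controller.remove_listener(listener)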

The intro video explains the basics very well…

I have bought one (second-hand off eBay for £40, I might add) to use at work and at home. I have a few ideas for it, and if nothing else it will be a handy (pun intended) gadget on my office desk – it's already plugged in, actually. It's small and sleek and sits flush against my MacBook Pro. It's a shame it's not wireless, as it takes up a USB port, but I have a powered hub on my desk.

How am I going to use it? To be honest, I have no concrete ideas. A few Isadora users have made OSC/MIDI-related actors to control Isadora scenes, which I am going to look at, and I am sure I can find a few uses in that area. I also like the idea of using it for generative/live painting in a VJ context. Will I look like an idiot waving my hand around at a gig to make stuff appear on the screen? Probably! Do I care?… Probably! But we shall see.
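To give a flavour of how that could hang together, here's a minimal sketch that pushes a palm position into Isadora as OSC using the python-osc library. The /leap/palm addresses and the normalisation range are my own inventions, and 1234 is just the OSC input port my copy of Isadora listens on – check your own Isadora preferences and pick the values up with OSC Listener actors:

from pythonosc.udp_client import SimpleUDPClient

# Isadora listens for incoming OSC on a UDP port set in its preferences
# (1234 on my machine; adjust to match yours).
client = SimpleUDPClient("127.0.0.1", 1234)

def send_palm(x, y, z):
    # Roughly normalise Leap's millimetre coordinates to 0..1 so they are
    # easier to scale inside Isadora. The +/-200 mm range is a guess at
    # the useful tracking volume, not anything official.
    client.send_message("/leap/palm/x", (x + 200.0) / 400.0)
    client.send_message("/leap/palm/y", min(y, 400.0) / 400.0)
    client.send_message("/leap/palm/z", (z + 200.0) / 400.0)

send_palm(12.5, 180.0, -30.0)  # example palm position in millimetres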

I also want to see if it can be used in a commercial sense. At the museum and art gallery where I work we have interactives for visitors, but people, for some reason, LOVE to hack, break, pull, snap and spill drinks on keyboards, mice, touch screens, etc. This can't be helped all the time, of course, but there is a tiny percentage who do it on purpose – I've seen them! Maybe a Leap Motion placed under some Perspex (military grade?) could replace the keyboard-and-mouse idea? The problem is the learning curve: how will people know how to move their hands, or even to put their hands above the device? Like the Kinect sensor, it's not a recognised Human Input Device (HID).

So – I will keep you posted. Minority Report… Yeah! Let's do it!

Do you own one? Let me know how you are using it. I wanna know… Let’s talk!

3D [Embodied] Uses NI-mate for Tracking


“3D [Embodied] is a mixed reality performance involving a virtual world as a platform to explore 3D immersive spatial virtual and physical displays. The performer’s body interacts with the three-dimensional space by the input of computer vision, skeleton tracking, and the output of 3D video mapping projection. Both video and audio rendering are generated in real time. Choreography by Yacov Sharir, sound design by Bruce Pennycook and technical support by Yago de Quay. Austin, 2013”
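If you fancy playing with the same idea, NI-mate turns Kinect skeleton data into OSC messages you can pick up in your own software. Here's a minimal receiving sketch using the python-osc library; the port (7000) and the one-message-per-joint address layout are assumptions based on NI-mate's usual defaults, so check the OSC settings in your own NI-mate profile:

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_joint(address, *args):
    # NI-mate typically sends one OSC message per tracked joint,
    # e.g. an address like /Head followed by x, y, z floats.
    print(address, args)

dispatcher = Dispatcher()
dispatcher.set_default_handler(on_joint)  # catch every joint address

# 7000 is a commonly used NI-mate output port; adjust to match your profile.
server = BlockingOSCUDPServer(("0.0.0.0", 7000), dispatcher)
server.serve_forever()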

Simple Colour Tracking in Isadora


Just a quick tutorial to get some of you used to colour tracking.

I have found colour tracking quite difficult to implement; that does not mean it's impossible – just awkward at times!

Here is a screenshot of a basic colour tracking patch:

A basic colour tracking patch.

Using the Chroma Key actor you can filter the colour that is seen by the Eyes actor. In the image above, only yellow is being allowed through the actor, which means only yellow objects will be tracked.

If you want to change the colour to be tracked, you can do so by changing the KEY HUE input. A bigger image of the Chroma Key actor can be seen below, in its default state of 'red'.

The output of the actor is then fed into the Eyes actor. You can adjust the threshold of the Eyes actor to filter out any unwanted colour/noise, and Eyes will work as normal (the same goes for Eyes++).
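If you want to try the same recipe outside Isadora, here's a rough OpenCV equivalent in Python – not what Isadora does internally, just the same two steps: a hue threshold standing in for the Chroma Key actor, then tracking the biggest blob the way Eyes does. The yellow hue range is an assumption; tweak it for your camera and lighting:

import cv2

cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # "Chroma key" step: keep only pixels near the target hue. Yellow
    # sits around hue 30 on OpenCV's 0-179 hue scale; the saturation and
    # value floors filter out dull, noisy pixels (the equivalent of
    # raising the Eyes actor's threshold).
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (20, 100, 100), (40, 255, 255))

    # "Eyes" step: find blobs in the mask and track the biggest one.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4
    if contours:
        biggest = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(biggest)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("colour tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()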

A good idea is to change the key hue and then snapshot the colours using the snapshot feature at the top of the Isadora window. You can then jump between different colours much more easily, with the values stored. Then get creative!

EDIT: A very quick and very basic working example:

Enjoy!

Skulpture