Isadora + Processing + Kinect


[A]ura Video


A nice 3-minute video from Aura here. It’s a bit late – I’ve been rather busy!

[A]ura – WYSIWYG
What You See Is What You Get

A performance gallery with open-source choreography
One day the internet will be as sacred as a giant ancient cathedral. And everybody will sacrifice their details to pay tribute to the digital gods.
In the interactive performance (A)ura, the user is transformed from spectator to creator: in the weeks before the premiere they can find a unique feature, the ChoreoMixer, with which they can design and remix the choreography and contents of (A)ura from their browser.
Walter Benjamin argued that in modern art the aura of the artwork withers under mechanical reproduction: its inaccessibility is lost because it is brought too close to the observer.
At the same time, a completely different aura of amazement will be created by new technologies. Every user can be a choreographer, an innovative remixer, and in that way create their own virtual memorial, which may take unexpected paths of its own. Closeness and distance will intertwine in a new way.
But what about the situation when the digital medium refuses to cooperate with the user?
What you see is what you get.
choreography/dance: Elke Pichler
music/video: Alexander Nantschev
stage design/costumes: Monika Biegler
programming: Stefan Lechner
interactive technical support: Graham Thorne
legal consulting: Clemens Lahner
PR: Stefan Eigenthaler
camera: Philipp Aichinger
Kindly supported by:
MA7, Kosmostheater and Ars Electronica Center

Isadora & The Edge Feedback Effect



I was asked how to create a feedback effect using just the edge of a person’s outline. This is how I’d go about it (a rough Processing equivalent of the whole chain is sketched after the steps).

1) Use either a Kinect mask (via NI mate or similar) or a green-screen video. I am using a free green-screen video I downloaded.

2) The Threshold actor then ‘fills’ the person with white.

3) The Edge actor then gives you an outline.

4) The Video Mixer and Spinner actors create the feedback loop. Setting the zoom somewhere between 90 and 120 gives good effects. You can also spin the feedback – as you can see me doing. The mix amount can vary; play around with this yourself. Start at 50 so you get a 50:50 mix, then alter it to your desired effect.

5) The Contrast Adjust actor brings the white back up again – this will vary depending on your video feed. I find setting the IN MAX to 10-12 works OK.

6) Add some motion blur. This is processor-hungry, so you can always skip it.

7) Send the output to your stage/projector.

8) I also created a second projector and layered it over the top, using a feed taken directly from the Edge actor. This sits on top of the other layer and makes it more obvious where the person is.

9) OPTIONAL: add more effects, such as a coloriser.
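If it helps to see the whole chain in one place, here is a rough Processing equivalent of the patch – not the Isadora actors themselves, just a sketch of the same idea. It assumes a webcam as the input (standing in for the green-screen clip or Kinect mask), makes a crude edge by subtracting an eroded copy of the silhouette from itself, and the zoom, spin and mix values simply echo the ranges suggested in the steps above.

import processing.video.*;

Capture cam;
PGraphics feedback;        // accumulation buffer = the Video Mixer/Spinner loop
float zoom = 1.05;         // roughly "zoom 90-120" (values > 1 push trails outward)
float spin = 0.01;         // a small rotation per frame, like the Spinner actor
float mix  = 0.5;          // roughly "start at 50 so you get 50:50"

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  feedback = createGraphics(width, height);
}

void draw() {
  if (cam.available()) cam.read();

  // Steps 2-3: threshold the body to a white silhouette, then pull out its
  // edge (silhouette minus a slightly eroded copy of itself).
  PImage silhouette = cam.get();
  silhouette.filter(THRESHOLD, 0.5);
  PImage eroded = silhouette.get();
  eroded.filter(ERODE);
  PImage edges = createImage(width, height, RGB);
  silhouette.loadPixels();
  eroded.loadPixels();
  edges.loadPixels();
  for (int i = 0; i < edges.pixels.length; i++) {
    float d = brightness(silhouette.pixels[i]) - brightness(eroded.pixels[i]);
    edges.pixels[i] = color(constrain(d, 0, 255));
  }
  edges.updatePixels();

  // Step 4: the feedback loop - redraw the previous output slightly zoomed,
  // spun and faded, then add the fresh edges on top of it.
  PImage previous = feedback.get();
  feedback.beginDraw();
  feedback.background(0);
  feedback.pushMatrix();
  feedback.translate(width/2, height/2);
  feedback.rotate(spin);
  feedback.scale(zoom);
  feedback.imageMode(CENTER);
  feedback.tint(255, mix * 255);   // the Video Mixer's mix amount
  feedback.image(previous, 0, 0);
  feedback.noTint();
  feedback.imageMode(CORNER);
  feedback.popMatrix();
  feedback.blend(edges, 0, 0, width, height, 0, 0, width, height, ADD);
  feedback.endDraw();

  // Steps 5-8: show the trail, then layer the raw edge on top so the person
  // stays readable (the "second projector"). Step 5's contrast lift and
  // step 6's motion blur are left out to keep the sketch short.
  image(feedback, 0, 0);
  blend(edges, 0, 0, width, height, 0, 0, width, height, ADD);
}

The important bit is the offscreen buffer: every frame the previous output is drawn back into it slightly scaled, rotated and faded, which is exactly the job the Video Mixer and Spinner loop is doing in the Isadora patch.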

Don’t let Apple kill OpenNI


http://petitions.moveon.org/sign/apple-let-apple-kill

Written by Mark Coniglio:

Artists, designers, makers, and tinkerers around the world have leveraged the power of 3D cameras like the XBox Kinect and XTion Live to create innumerable interactive installations, performances and other compelling works of art. The OpenNI SDK is a key technology behind many commercial and open-source software programs that have empowered this artistic expression. So, we were surprised and shocked when we saw the notification on the OpenNI site that said the web site would close on April 23rd.

Apple, you bought Prime Sense, and now you want to kill this critically important technology. We’re respectfully asking you: don’t do it.

Many of us in this community remember a time when Apple was our champion. You provided us with a truly innovative operating system that offered unforeseen ways to interactively control and present media. This in turn led to the development of thousands of applications that empowered tens of thousands of artists to inspire and entertain audiences around the globe. (And, by the way, helped you sell millions of computers.)

Mr. Cook, your action to take away OpenNI runs counter to that history. We know your priorities have changed. We know that it’s all about mobile devices these days. We understand that you likely have your own plans for 3D imaging technology.

But, empowering personal expression through technology was one of the core values that made Apple the great company it is. We are asking you to remember that part of your history today, and to support us by changing course. Keep OpenNI open and available to us.

PLEASE SIGN THE PETITION! WE NEED THIS TO STAY!

http://petitions.moveon.org/sign/apple-let-apple-kill

2D to 3D – Finally!


I have been pondering this idea for a long, long time! I have experimented with the Kinect camera to scan 3D objects and make virtual 3D copies, but I have always toyed with the idea of making realistic 3D objects from 2D images. This can be done to some extent in Photoshop and After Effects, but this method blew me away!

Source: http://createdigitalmotion.com/2013/09/from-a-single-2d-photo-3d-objects-created-easily/

Future TV?


I have been keeping an eye on a few things lately that I do not normally blog about.

It all started about three weeks ago, when I randomly found this video:

As I imagine it did for some of you, it blew me away. What an amazing project.

I then spoke to a friend called Gavin from the Digital Fun Fair who said he had developed something quite similar and we had a good talk about it.

Then I found this on Kickstarter…

http://www.kickstarter.com/projects/woodenshark/lightpack-ambient-backlight-for-your-displays

There are lots of demos on YouTube, but this seems to be the best I’ve found:

Now this is seriously cool stuff. Plug-and-play functionality, affordable, multi-platform, etc.

Then I found this….

http://www.bbc.co.uk/news/magazine-22315685

This idea has been looming in my head for a while now, ever since I saw this from Sony:

All this stuff is amazing and right up my street. There are obviously huge technological costs for this kind of thing, never mind the logistics – who has space for a projector behind the sofa? Not me.

I can imagine playing an Xbox game with practically every wall being projected onto. Combine that with an Xbox Kinect and the room would spin and rotate based on your head position… scary! A truly immersive, reactive and spatially aware experience could be created with this technology.
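To make that idea a bit more concrete, here is a toy Processing sketch of a head-coupled view – purely an illustration, not anything from the projects above. It assumes the Kinect skeleton is being streamed over OSC (for example from NI mate) as a "/Head" message carrying x, y, z floats on port 7000; the address and the port are assumptions and depend entirely on how the sender is configured.

import oscP5.*;

OscP5 osc;
float headX = 0, headY = 0;   // last received head position, roughly in metres

void setup() {
  size(800, 600, P3D);
  osc = new OscP5(this, 7000);            // hypothetical port - match it to the sender
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/Head")) {    // hypothetical joint address
    headX = msg.get(0).floatValue();
    headY = msg.get(1).floatValue();
  }
}

void draw() {
  background(0);
  // Move the virtual camera opposite to the head so the "room" appears to
  // hold still in real space while the view swings around the wireframe box.
  camera(width/2 - headX * 200, height/2 + headY * 200, 600,
         width/2, height/2, 0,
         0, 1, 0);
  translate(width/2, height/2, 0);
  stroke(255);
  noFill();
  box(500);   // stand-in for the projected room
}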

I think this stuff will happen in one way or another.

What are your thoughts?

Z Vector by Delicode.


The guys at Delicode have teased us with a brand new piece of software called Z Vector.

It was code-named “The V” for a month or two, but now it has a fancy new website and a demo video from the band Phantom – their official music video for Scars.

Delicode are the driving force behind NI mate and Kinect Stories. This third product is sure to blow the socks off any artist, VJ or individual with a taste for live visuals and Kinect-style motion-sensing trickery. With an already heavy-hitting feature list, I am sure this will be lots of fun!

I was lucky enough to see a demo of this when I met up with Delicode CEO Julius Tuomisto in Helsinki for the LUX festival. Being able to single someone out in a crowd and then create visuals from them is awesome; podium dancers will be a brilliant addition to this software, let me tell you!

I think people are going to love playing around with the editable GLSL shader editor; I only saw it very briefly in January, and it was a pop-up window that looked a bit like Processing.

I need to learn some GLSL coding, I imagine. I also like the idea of Syphoning (is that even a word?) the results out into another piece of software to add more layers, mix, map and play around with. Wow… can’t wait.
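That Syphon workflow is already easy to try from Processing on OS X, for what it’s worth. A minimal sketch along these lines publishes whatever it draws as a Syphon feed that another app can grab, layer and map; it assumes the Syphon library for Processing is installed, and the server name is just a placeholder.

import codeanticode.syphon.*;

SyphonServer server;

void setup() {
  size(640, 480, P3D);                            // Syphon needs an OpenGL renderer
  server = new SyphonServer(this, "My visuals");  // placeholder server name
}

void draw() {
  background(0);
  lights();
  translate(width/2, height/2);
  rotateY(frameCount * 0.01);
  box(150);                // placeholder visuals
  server.sendScreen();     // publish the current frame over Syphon
}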

Features include:

  • real-time depth camera based 3D scanning and mixing
  • multiple virtual camera and tracking modes
  • multiple drawing modes (vectors, polygons, etc.)
  • full sound and MIDI clock synchronization
  • animatable GLSL shaders with editor
  • full support for Windows / Mac OS X
  • fully scalable output resolution
  • stereo rendering (side-by-side, anaglyphic)
  • GPU acceleration
  • MIDI/OSC control
  • Syphon support (Mac OS X)

Can computers fill the role of choreographers?



A really nice article here. Lots of comments and thoughts from Mark Coniglio, Troika Tronix, Troika Ranch, NI Mate from Delicode and more… including a video:

http://www.washingtonpost.com/lifestyle/style/computers-and-dance-an-emerging-duo/2013/03/16/67071308-8e68-11e2-b63f-f53fb9f2fcb4_video.html