Salutations Interactive Installation: Making of

Since the post about the Salutations Interactive Installation in the Museum für Kommunikation in Berlin, many people have asked me for videos of the installation, so we, the Igalia Interactivity team, came up with something better: a making of.

The video shows an early test application that was created from the one shipped with Skeltrack, then some testing with colleagues at Igalia’s office and, finally, the installation in Berlin and the end result:


(link to original video in YouTube)

You can get the source code for the application from Igalia’s GitHub.

Skeltrack got an award

Last week, while I was busy in Berlin with an interactive installation, I received some good news: Skeltrack got an award in an innovation contest organized by the KNetworks project.

From its own website, KNetworks “[…] is an open network based in the Atlantic area with main interest in the fields of: e-government, innovation, knowledge transfer, technology, the Internet, collective intelligence, the future and the creation of knowledge.”
Members of KNetworks include several universities and government organizations from European countries in the Atlantic area.

Being the first Open Source library for skeleton tracking, Skeltrack opens up a number of possibilities; we developed it at Igalia simply because we wanted to use skeleton tracking and there were no open solutions available. So I am very happy with the recognition, in this case a joint 3rd place.

View of Oxford

Oxford, England


Since there was a ceremony in Oxford for the delivery of the awards, I bought a shirt in Berlin :) and flew to London instead of Coruña. I presented Skeltrack and also mentioned Igalia and the cool things that make us different.
At the ceremony, I also had the chance to meet the other contest winners and members of the organization. It was an interesting dinner where I spoke English, Spanish and Portuguese :)
After that we ended up in the Turf Tavern, the oldest pub in Oxford, where I discovered that I completely dislike ale.

I had never been to Oxford before so I stayed an extra night to visit the city. What a nice city it is! It is somewhat similar to Évora, where I studied, in that it has many university buildings spread across the city, though of course at a different scale.

On my way back to Heathrow, more awesome stuff: I found out that the old man sitting near me on the bus was in fact Donald Knuth! I introduced myself, chitchatted a bit and, feeling like a little boy who had just met Spider-Man, thanked him for everything.

I would like to thank the organization of the KNetworks contest for the award and congratulate the other contest winners.

The world’s 1st interactive installation to use Open Source skeleton tracking

Edu and I, proud members of the Igalia Interactivity team, spent the last week in Berlin for the culmination of an interesting project: an interactive installation in the Museum für Kommunikation.

The museum commissioned the Berlin interaction/design studio Kaiser Matthies to create an installation, so the studio came up with the concept and teamed up with us to develop the technical part.
The installation’s purpose is to show different forms of communication and the concept is very simple:
When a user is detected in the “action zone”, an actor shows up on a screen and performs a salutation; the user is supposed to perform the same salutation and receives positive feedback if it was performed well or negative feedback otherwise.
Examples of gestures are the Japanese bow or blowing a kiss.
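To give an idea of how such a check could work, here is a deliberately simple sketch. All names and numbers here are hypothetical, not taken from the installation’s code: a performed gesture, recorded as a sequence of normalized joint positions, is accepted when every sample stays close enough to a reference recording.

```python
def matches_salutation(performed, reference, tolerance=0.15):
    """Compare a performed gesture with a reference one.

    Both gestures are lists of (x, y) joint positions, normalized to
    [0, 1] and sampled at the same rate.  The gesture counts as well
    performed when every sample stays within `tolerance` of the
    corresponding reference sample.
    """
    if len(performed) != len(reference):
        return False
    for (px, py), (rx, ry) in zip(performed, reference):
        # Euclidean distance between the performed and reference samples.
        if ((px - rx) ** 2 + (py - ry) ** 2) ** 0.5 > tolerance:
            return False
    return True
```

A real matcher would need to cope with timing differences between users (e.g. by resampling or dynamic time warping), but the tolerance check conveys the basic idea.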

Pictures of the salutations installation in the Museum für Kommunikation, Berlin

Salutations installation in the Museum für Kommunikation, Berlin

The screen on the right side shows a live video of the users so they can compare their gestures with the expected ones from another person’s perspective.

To detect users and find where their skeletons’ joints are, we used Skeltrack. We also used OpenCV on top of it in order to track more complex salutations, such as the US East Coast hand sign.
As for the rest of the stack, we used a minimal Debian, Clutter and GStreamer, with several mechanisms to make it robust in case of failure, all running from a USB stick.
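One such robustness mechanism can be sketched as a small supervisor that relaunches the application whenever it exits abnormally. This is a hypothetical illustration, not the installation’s actual code:

```python
import subprocess


def supervise(argv, max_restarts=5, launch=subprocess.call):
    """Relaunch the command whenever it exits with a non-zero status.

    Returns the number of restarts performed.  `launch` is injectable
    so the loop can be exercised without spawning real processes.
    """
    restarts = 0
    while True:
        if launch(argv) == 0:
            return restarts      # clean exit: stop supervising
        if restarts == max_restarts:
            return restarts      # give up after too many crashes
        restarts += 1            # crash: relaunch the application
```

In an unattended museum setup the same effect is often achieved with an init system or a shell loop; the point is simply that a crash must never leave a blank screen.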

This means that the software used in this installation is completely Open Source and, more importantly, it is the world’s first interactive installation that uses Open Source skeleton tracking. We are also going to release the application’s source code as soon as we find the time.

We would like to thank Studio Kaiser Matthies for the opportunity to work on such an important project in one of the world’s art capitals. Be sure to visit the museum the next time you’re in Berlin and, if you want us to help you build awesome interactive installations using Open Source software, let me know.

Playing Angry Birds with a Kinect

Recently I had to use OpenCV in a project inside the Igalia Interactivity team and I took the chance to code a little demo I had had in mind for a while: playing Angry Birds with a Kinect, using only Free Software.

Here’s the result:


(direct link to video in Vimeo)

How it’s done

The demo uses Skeltrack (the only Free Software skeleton tracking library) to get the user’s hands’ positions. The chosen hand’s position is then used to move the mouse pointer; this part is the same as in the Skeltrack Desktop Control demo.
Once the hand’s position is known, I calculate an area around its point in the original depth image given by GFreenect and then use OpenCV to get the hand’s contours and their convexity defects. An open hand palm produces very distinct convexity defects and, by counting them, I can tell whether the user’s palm is open. After that, all that is left is to tie a detected closed palm to a mouse press and an open palm to a mouse release.
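To illustrate the idea in pure Python, here is a simplified stand-in for what OpenCV’s contour and convexity-defect functions do in the demo: the fingers of an open palm carve deep “valleys” between the contour and its convex hull, so counting valleys deeper than a threshold distinguishes an open palm from a fist. The thresholds and names are assumptions for this sketch only.

```python
from math import hypot


def _cross(o, a, b):
    """2D cross product of vectors OA and OB."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])


def _hull_indices(contour):
    """Indices of the contour points on the convex hull (monotone chain)."""
    order = sorted(range(len(contour)), key=lambda i: contour[i])

    def half(seq):
        out = []
        for i in seq:
            while len(out) >= 2 and _cross(contour[out[-2]],
                                           contour[out[-1]],
                                           contour[i]) <= 0:
                out.pop()
            out.append(i)
        return out[:-1]

    return half(order) + half(reversed(order))


def count_defects(contour, min_depth=2.0):
    """Count convexity defects deeper than `min_depth`.

    `contour` is an ordered list of (x, y) points.  For every pair of
    hull vertices that are consecutive along the contour, the deepest
    in-between point (distance to the hull line) is one candidate defect.
    """
    hull = sorted(_hull_indices(contour))
    n, defects = len(contour), 0
    for k, i in enumerate(hull):
        j = hull[(k + 1) % len(hull)]
        a, b = contour[i], contour[j]
        length = hypot(b[0] - a[0], b[1] - a[1]) or 1.0
        depth, m = 0.0, (i + 1) % n
        while m != j:
            depth = max(depth, abs(_cross(a, b, contour[m])) / length)
            m = (m + 1) % n
        if depth > min_depth:
            defects += 1
    return defects


def palm_is_open(contour, min_defects=2):
    """An open palm's finger valleys produce several deep defects."""
    return count_defects(contour) >= min_defects
```

In the actual demo this classification is done with OpenCV on the real hand contour, where an open palm typically yields one defect per gap between fingers.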

As the video shows, the demo is not polished yet, but it demonstrates one of the many possibilities that combining Skeltrack with other computer vision software gives us.

Skeltrack 0.1.10 is out

That’s right, a new version of the world’s first Free Software skeleton tracking library is out.
In every version we try to make Skeltrack more robust and this one is no exception.

Head&Shoulders

We have changed the way the shoulders are inferred. The heuristic now uses a circumference around the user’s head and searches for the shoulders along an arc of it.
Since we like to keep giving developers the ability to tweak the algorithm’s parameters, we had to change the properties related to the shoulders. We should probably improve the documentation with a visual explanation of how those properties work but meanwhile you can check the properties’ documentation.
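To visualize how such an arc search could work, here is a rough sketch. The parameter names and defaults are illustrative only; check the properties’ documentation for the real ones:

```python
from math import cos, sin, pi


def find_shoulder(head, radius, on_body,
                  arc_start=pi / 4, arc_length=pi / 4,
                  step=0.01, left=True):
    """Scan an arc of the circumference around the head for a shoulder.

    `head` is the (x, y) head position, `radius` the circumference
    radius and `on_body(point)` a predicate telling whether a point
    falls on the user's silhouette.  Angles are measured from straight
    down; the left shoulder is scanned with mirrored angles.
    Returns the first on-body point found, or None.
    """
    sign = -1.0 if left else 1.0
    angle = arc_start
    while angle <= arc_start + arc_length:
        x = head[0] + sign * radius * sin(angle)
        y = head[1] + radius * cos(angle)
        if on_body((x, y)):
            return (x, y)
        angle += step
    return None
```

The appeal of this approach is that the search space is a one-dimensional arc instead of a two-dimensional region, which keeps the heuristic cheap and its parameters easy to reason about.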

Centering Joints

Another issue we had was that the extremas we initially calculate correspond to e.g. the point at the tip of a finger (for a hand joint) or the top of the head. This was not a problem per se, but it could result in less stable joints. For example, the Kinect device in particular might produce blind spots in very bushy hair, which would make the head joint jitter more than usual.
To fix this, we calculate the average of the points around an extrema and assign that value to it. The radius of the sphere surrounding an extrema that is used to calculate this average can be controlled through the extrema-sphere-radius property. Thus, if this behavior is not desired, it can be turned off simply by setting this property to 0.
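The averaging step can be sketched like this (a simplified illustration; Skeltrack’s actual implementation works on the depth buffer in C):

```python
def smooth_extrema(extrema, points, radius):
    """Average the points within `radius` of an extrema.

    `extrema` and `points` are 3D coordinate tuples.  A radius of 0
    disables the feature and returns the raw extrema, mirroring the
    behavior of the extrema-sphere-radius property.
    """
    if radius == 0:
        return extrema
    r2 = radius * radius
    # Keep only the points inside the sphere around the extrema.
    near = [p for p in points
            if sum((a - b) ** 2 for a, b in zip(p, extrema)) <= r2]
    if not near:
        return extrema
    n = len(near)
    return tuple(sum(c) / n for c in zip(*near))
```

Averaging over a neighborhood trades a little positional accuracy at the very tip for much less frame-to-frame jitter, which is usually the better deal for joint tracking.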

Here are a couple of pictures illustrating this issue:

Picture of Skeltrack's test without averaged extremas

Without the averaged extremas (extrema-sphere-radius set to 0)

Picture of Skeltrack's test with averaged extremas

With the averaged extremas (extrema-sphere-radius set to 300)

Vertical Kinect

Due to a project that Igalia Interactivity has been working on, we had to use the Kinect in a vertical stance. In doing so, we discovered a small bug that prevented Skeltrack from being used with a vertical depth image. This is fixed in version 0.1.10. While fixing it, we found out that the other skeleton tracking alternatives apparently do not support the Kinect in a vertical stance either; this means that if you want to use skeleton tracking with the Kinect vertically, your only choice is either to use Skeltrack or to convince Microsoft or PrimeSense to fix their solutions for you :)
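For trackers without vertical support, one conceivable workaround (not what Skeltrack does internally, and purely a sketch) is to rotate the depth buffer back to the usual orientation before handing it to the tracker, and then rotate the resulting joint coordinates the other way:

```python
def rotate_depth_cw(frame, width, height):
    """Rotate a row-major depth buffer 90 degrees clockwise.

    Returns (rotated, new_width, new_height).  Pixel (x, y) of the
    input ends up at (height - 1 - y, x) in the rotated frame.
    """
    rotated = [0] * (width * height)
    new_width = height  # dimensions swap under a 90-degree rotation
    for y in range(height):
        for x in range(width):
            nx, ny = height - 1 - y, x
            rotated[ny * new_width + nx] = frame[y * width + x]
    return rotated, height, width
```

Rotating every frame costs extra CPU per buffer, which is one more reason native vertical support, as in Skeltrack 0.1.10, is preferable.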

Picture of Skeltrack's test example using a Kinect in a vertical stance

Skeltrack using a Kinect in a vertical stance

Last but not least, the function skeltrack_skeleton_new was returning a GObject instance by mistake. We have corrected that and it now returns a pointer to SkeltrackSkeleton as expected.

Special thanks to Iago, our intern at the Igalia Interactivity team, for coding most of these nifty features.

Be sure to clone Skeltrack at GitHub and read the docs; you are welcome to participate in its development.