Salutations Interactive Installation: Making of

Since the post about the Salutations Interactive Installation in the Museum für Kommunikation in Berlin, many people have asked me for videos of the installation, so we, the Igalia Interactivity team, came up with a better idea: a making-of.

The video shows an early test application derived from the one shipped with Skeltrack, then some testing with colleagues at Igalia’s office, and finally the installation in Berlin and the final result:


(link to the original video on YouTube)

You can get the source code for the application from Igalia’s GitHub.

Skeltrack got an award

Last week, while I was busy in Berlin with an interactive installation, I received some good news: Skeltrack got an award in an innovation contest organized by the KNetworks project.

From its own website, KNetworks “[…] is an open network based in the Atlantic area with main interest in the fields of: e-government, innovation, knowledge transfer, technology, the Internet, collective intelligence, the future and the creation of knowledge.”
Members of KNetworks include several universities and government organizations from European countries in the Atlantic area.

As the first Open Source library for skeleton tracking, Skeltrack opens up a number of possibilities; we developed it at Igalia simply because we wanted to use skeleton tracking and there were no open solutions available. So I am very happy with the recognition, in this case a joint 3rd place.

View of Oxford

Oxford, England


Since there was a ceremony in Oxford for the delivery of the awards, I bought a shirt in Berlin 🙂 and flew to London instead of back to Coruña. I presented Skeltrack and also mentioned Igalia and the cool things that make us different.
At the ceremony, I also had the chance to meet the other contest winners and members of the organization. It was an interesting dinner where I spoke English, Spanish and Portuguese 🙂
After that we ended up in the Turf Tavern, the oldest pub in Oxford, where I discovered that I completely dislike ale.

I had never been to Oxford before, so I stayed an extra night to visit the city. What a nice city it is! It is somewhat similar to Évora, where I studied, in that it has many university buildings spread across the city, though of course at a different scale.

On my way back to Heathrow, more awesome stuff: I found out that the old man sitting close to me on the bus was in fact Donald Knuth! I introduced myself, chitchatted a bit and, feeling like a little boy who had just met Spiderman, thanked him for everything.

I would like to thank the organization of the KNetworks contest for the award and congratulate the other contest winners.

The world’s 1st interactive installation to use Open Source skeleton tracking

Edu and I, proud members of the Igalia Interactivity team, spent the last week in Berlin for the culmination of an interesting project: an interactive installation in the Museum für Kommunikation.

The museum commissioned the Berlin interaction/design studio Kaiser Matthies to create an installation, so the studio created the concept and teamed up with us to develop the technical part.
The installation’s purpose is to show different forms of communication and the concept is very simple:
When a user is detected in the “action zone”, an actor shows up on a screen and performs a salutation; the user is supposed to repeat the salutation and receives positive feedback if it was performed well, or negative feedback otherwise.
Examples of gestures are the Japanese bow or blowing a kiss.
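To make the flow above concrete, here is a minimal sketch of the detect/perform/imitate/feedback cycle as a state machine. This is our own illustration, not the actual installation code; all type and function names are hypothetical.

```c
/* Hypothetical sketch of the installation's flow; these names are ours,
 * not taken from the real application. */
typedef enum {
  STATE_IDLE,        /* waiting for a user in the "action zone" */
  STATE_PERFORMING,  /* the actor performs the salutation on screen */
  STATE_WAITING,     /* the user tries to imitate the salutation */
  STATE_FEEDBACK     /* positive or negative feedback is shown */
} InstallationState;

typedef struct {
  InstallationState state;
  int positive;      /* 1 if the last salutation matched, 0 otherwise */
} Installation;

static void
installation_init (Installation *inst)
{
  inst->state = STATE_IDLE;
  inst->positive = 0;
}

/* Advance the state machine with the latest sensor observations. */
static void
installation_step (Installation *inst, int user_in_zone, int gesture_matched)
{
  switch (inst->state)
    {
    case STATE_IDLE:
      if (user_in_zone)
        inst->state = STATE_PERFORMING;
      break;
    case STATE_PERFORMING:
      inst->state = STATE_WAITING;
      break;
    case STATE_WAITING:
      inst->positive = gesture_matched;
      inst->state = STATE_FEEDBACK;
      break;
    case STATE_FEEDBACK:
      inst->state = STATE_IDLE;  /* back to waiting for the next user */
      break;
    }
}
```

The real application naturally does much more (video playback, gesture scoring, timeouts), but the cycle it loops through is essentially this one.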

Pictures of the salutations installation in the Museum für Kommunikation, Berlin

Salutations installation in the Museum für Kommunikation, Berlin

The screen on the right side shows a live video of the users so they can compare their gestures with the expected ones, as seen from another person’s perspective.

To detect users and locate their skeleton joints, we used Skeltrack. We also used OpenCV on top of it in order to track more complex salutations, such as the US East Coast hand sign.
As for the rest of the stack, we used a minimal Debian, Clutter and GStreamer, with many mechanisms to keep it robust in case of failure, all of it running from a USB stick.

This means that the software used in this installation is completely Open Source and, more importantly, that it is the world’s first interactive installation to use Open Source skeleton tracking. We are also going to release the application’s source code once we find the time to do so.

We would like to thank Studio Kaiser Matthies for the opportunity to work on such an important project in one of the world’s art capitals. Be sure to visit the museum the next time you’re in Berlin and, if you want us to help you build awesome interactive installations using Open Source software, let me know.

Skeltrack 0.1.10 is out

That’s right, a new version of the world’s first Free Software skeleton tracking library is out.
In every version we try to make Skeltrack more robust and this one is no exception.

Head&Shoulders

We have changed the way the shoulders are inferred. The heuristic now uses a circumference around the user’s head and an arc along which it searches for the shoulders.
Since we like to give developers the ability to tweak the algorithm’s parameters, we had to change the properties related to the shoulders. We should probably add a visual explanation of how those properties work to the documentation, but meanwhile you can check the properties’ documentation.

Centering Joints

Another issue we had was that the extremas we initially calculate correspond to, e.g., the point at the tip of a finger (for a hand joint) or the top of the head. This was not a problem per se, but it could result in more unstable joints. For example, the Kinect in particular can have blind spots in very bushy hair, which would make the head joint jitter more than usual.
To fix this, we calculate the average of the points around an extrema and assign it that value. The radius of the sphere around an extrema that is used to calculate this average can be controlled through the extrema-sphere-radius property. Thus, if this behavior is not desired, it can be turned off simply by setting this property to 0.
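Conceptually, the averaging looks like the sketch below: take all points within the sphere radius of the raw extrema and replace the extrema with their mean. This is our own simplified illustration, not the actual Skeltrack code, though a radius of 0 disables the smoothing just as the extrema-sphere-radius property does.

```c
/* Simplified sketch of the extrema-averaging idea; not Skeltrack's
 * actual implementation. */
typedef struct { double x, y, z; } Point3D;

static Point3D
smooth_extrema (Point3D extrema, const Point3D *points, int n_points,
                double radius)
{
  if (radius <= 0.0)
    return extrema;            /* feature disabled, keep the raw extrema */

  double sx = 0.0, sy = 0.0, sz = 0.0;
  int count = 0;

  for (int i = 0; i < n_points; i++)
    {
      double dx = points[i].x - extrema.x;
      double dy = points[i].y - extrema.y;
      double dz = points[i].z - extrema.z;
      /* Compare squared distances to avoid a sqrt per point. */
      if (dx * dx + dy * dy + dz * dz <= radius * radius)
        {
          sx += points[i].x;
          sy += points[i].y;
          sz += points[i].z;
          count++;
        }
    }

  if (count == 0)
    return extrema;            /* nothing nearby, keep the raw extrema */

  Point3D avg = { sx / count, sy / count, sz / count };
  return avg;
}
```

Averaging over a neighborhood makes the joint far less sensitive to a few noisy or missing depth pixels at the very tip of the extrema.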

Here are a couple of pictures illustrating this issue:

Picture of Skeltrack's test without averaged extremas

Without the averaged extremas (extrema-sphere-radius set to 0)

Picture of Skeltrack's test with averaged extremas

With the averaged extremas (extrema-sphere-radius set to 300)

Vertical Kinect

Due to a project that Igalia Interactivity has been working on, we had to use the Kinect in a vertical stance. By doing so we discovered a small bug that prevented Skeltrack from being used with a vertical depth image. This is corrected in version 0.1.10. While fixing it, we also found that the other skeleton tracking alternatives apparently do not support the Kinect in a vertical stance; this might mean that if you want to use skeleton tracking with the Kinect vertically, your only choice is either to use Skeltrack or to convince Microsoft or PrimeSense to fix their solutions for you 🙂

Picture of Skeltrack's test example using a Kinect in a vertical stance

Skeltrack using a Kinect in a vertical stance

Last but not least, the function skeltrack_skeleton_new was mistakenly returning a GObject instance. We have corrected that and it now returns a pointer to SkeltrackSkeleton, as expected.

Special thanks to Iago, our intern at the Igalia Interactivity team, for coding most of these nifty features.

Be sure to clone Skeltrack at GitHub and read the docs, you are welcome to participate in its development.

Skeltrack 0.1.8 released

Skeltrack, the Open Source library for skeleton tracking, keeps being improved here in Igalia and today we are releasing version 0.1.8.
Since July we have had the valuable extra help of Iago López who is doing an internship in Igalia’s Interactivity Team.

What’s new

Several bugs, both in the library and in the supplied example, were fixed (including in the introspection).
The threading model was simplified and the skeleton tracking implementation was split across several files for a better organized source code.

While the above is nice, the coolest thing about this release (kudos to Iago for this) is that it makes Skeltrack work better in scenes where the user is not completely alone. The issue was that another person or object in the scene (think chairs, tables, etc., for a real life example) would confuse the skeleton tracking. As of this version, while not perfect (objects/people cannot be touching the user), the algorithm will try to discard objects that are not the user.
But what about having two people in a scene: which one will it choose? To control this, we have introduced a new function:

skeltrack_skeleton_set_focus_point (SkeltrackSkeleton *skeleton,
                                    gint               x,
                                    gint               y,
                                    gint               z)

This function tells Skeltrack to focus on the user closest to the given point, which makes it possible to follow a user in real time by constantly updating this point to, for example, the user’s head position.
So, even though there is no multi-user support, the current API makes it easy to run additional instances of Skeltrack and pick users from other points in the scene.
It should also be easier to use Skeltrack for a typical installation where there is a user controlling something in a public space while other people are passing or standing by.
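To illustrate what the focus point buys you, here is a minimal sketch of the "closest user to a point" selection. This is our own illustration of the idea, not the library’s internals; the struct and function names are hypothetical.

```c
/* Hypothetical sketch; not Skeltrack's internal code. */
typedef struct { double x, y, z; } UserPosition;

/* Return the index of the user closest to `focus`, or -1 if there are
 * no users in the scene. */
static int
closest_user_index (const UserPosition *users, int n_users,
                    UserPosition focus)
{
  int best = -1;
  double best_dist = 0.0;

  for (int i = 0; i < n_users; i++)
    {
      double dx = users[i].x - focus.x;
      double dy = users[i].y - focus.y;
      double dz = users[i].z - focus.z;
      double dist = dx * dx + dy * dy + dz * dz;  /* squared distance */

      if (best < 0 || dist < best_dist)
        {
          best = i;
          best_dist = dist;
        }
    }
  return best;
}
```

For the typical public-space installation, feeding the previously tracked head position back in as the focus point is what keeps the tracker locked on the same person while others walk by.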

Contribute

We will keep betting on this great library.
If you want to help us, read the docs, check out Skeltrack’s GitHub and send us patches or open issues.