Recently I had to use OpenCV in a project for the Igalia Interactivity team, and I took the chance to code a little demo I had had in mind for a while: playing Angry Birds with a Kinect, using only Free Software.
Here’s the result:
(direct link to video in Vimeo)
How it’s done
The demo uses Skeltrack (the only Free Software skeleton tracking library) to get the positions of the user's hands. The tracked hand's position is then used to move the mouse pointer. This part works the same way as the Skeltrack Desktop Control demo.
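The pointer-moving step boils down to mapping the hand's coordinates in the depth image to screen coordinates. A minimal sketch of that mapping (the function name, resolutions, and clamping are my own assumptions, not taken from the demo's source):

```python
def hand_to_screen(hand_x, hand_y, depth_size=(640, 480), screen_size=(1920, 1080)):
    """Map a hand position from depth-image coordinates to screen coordinates.

    The Kinect depth frame (e.g. 640x480) is simply scaled up to the
    screen resolution; the result is clamped in case the tracked joint
    falls slightly outside the frame.
    """
    depth_w, depth_h = depth_size
    screen_w, screen_h = screen_size
    x = int(hand_x * screen_w / depth_w)
    y = int(hand_y * screen_h / depth_h)
    return (min(max(x, 0), screen_w - 1),
            min(max(y, 0), screen_h - 1))
```

In practice one would also smooth the position over a few frames (a moving average or a one-euro filter) so the pointer does not jitter with the depth noise.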
Once the hand's position is known, I take an area around that point in the original depth image provided by GFreenect, and then use OpenCV to extract the hand's contour and its convexity defects. An open palm produces very distinct convexity defects, so by counting them I can tell whether the user's palm is open. After that, all that is left is to tie a detected closed palm to a mouse press and an open palm to a mouse release.
As the video shows, the demo is not polished yet, but it illustrates one of the many possibilities that combining Skeltrack with other computer vision software opens up.
Can you kindly post step-by-step instructions to do this?
Maybe a list of terminal commands, and how to tie the closed/open palm to the mouse click/release action.
Hi Brian,
Unfortunately, I do not have time to write step-by-step instructions. When I have time, I will clean up this project's source code and put it online so everyone can take a look.
However, if you have a question about a specific part of it, I might be able to answer you.
Cheers,
Looks promising! We did something similar with Qt last spring: http://bergie.iki.fi/blog/qt-air-cursor/
We'll look at how easily we can replace OpenNI there with your library.