Friday, December 17, 2010

HOWTO: use the kinect as a mouse in linux

In an earlier post I explained how to get PrimeSense's NITE up and running and how to use the samples they provided. Now some people might be thinking "cool, but how can I use this?" I thought using NITE hand tracking to control the cursor would be a good and simple demonstration.

The Linux kernel provides a means to create userspace input drivers via a feature called uinput. If you compile your kernel with uinput enabled as a module, you can then simply run:
modprobe uinput
to load the uinput module. Once the module is loaded you can use the piece of code I've embedded below to convert the coordinates output by the NITE code into actual mouse/cursor movement. In short:

(1) download the code below
(2) save it as ~/kinect/NITE/Nite- (you might want to back up the original)
(3) cd ~/kinect/NITE/Nite- && make
(4) Note: do the following as root or using sudo


(5) Perform a focus gesture to start the hand tracking (check out my video above to see how to do that)

At this point you should be able to do what I do in the video above. You can also extend the code to generate mouse clicks, keystrokes, etc. Have fun.
At some point Dropbox ate the public link for the source code I was using before, and in the process of restoring from an old backup the formatting of the code below got a bit mangled... in any case, here's a gist of what I salvaged:

Friday, December 10, 2010

HOWTO: Kinect + OpenNI/NITE skeleton tracking and gesture recognition in gentoo

Thanks to the folks at PrimeSense, libraries are now available for skeleton tracking and gesture recognition.
UPDATE: Check here if you've gotten NITE working and want to try using the kinect as a Minority Report style mouse.
UPDATE: I've added a description of how to track multiple hands under the Sample-PointViewer description.

Here's how I got things working in gentoo:

(1) mkdir ~/kinect && cd ~/kinect
(2) git clone
(3) cd OpenNI/Platform/Linux-x86/Build
(4) make && sudo make install
(5) cd ~/kinect/
(6) git clone
(7) cd Sensor
(8) git checkout kinect
(9) cd Platform/Linux-x86/Build
(10) make && sudo make install
(11) go to this page at openNI to download the latest NITE release for your platform: NITE download page or for the impatient:
UPDATE: download links now point to openNI and should work again
(12) Save the NITE tarball to ~/kinect and untar it
(13) cd ~/kinect/NITE/Nite-
(14) Open Sample-User.xml and replace the existing License line with the line below:
NOTE: this is case sensitive!

<License vendor="PrimeSense" key="0KOIk2JeIBYClPWVnMoRKn5cdY4="/>

(15) Repeat step 14 for Sample-Scene.xml and Sample-Tracking.xml
(16) Open Sample-User.xml and replace the existing MapOutputMode line with the line below.
NOTE: this is case sensitive!

<MapOutputMode xRes="640" yRes="480" FPS="30"/>

(17) Repeat step 16 for Sample-Scene.xml and Sample-Tracking.xml
(18) niLicense PrimeSense 0KOIk2JeIBYClPWVnMoRKn5cdY4=
(19) cd ~/kinect/NITE/Nite-
(20) sudo ./install.bash
(21) make && sudo make install
(22) cd ~/kinect/NITE/Nite-

Now finally you should be sitting in a directory with all the sample binaries that you can play with. Here's what they should look like:

This app will track your hand and show its relative position on a grid. Run it and wave your hand; one of the squares on the grid should turn yellow to indicate your hand's location, as seen below:

you should also get some debug output in your console:

This app demonstrates the skeletal tracking. After starting it up, move around or wave until your body changes to blue (subsequent players will be other colors, e.g. player 2 is green, 3 yellow, etc.). At this point your viewer window should look vaguely like this:

and you should see something like this in your console:
Look for pose
Found pose "Psi" for user 1

Now, hold your arms out to your sides, bent 90 degrees at the elbows as shown below, until a skeleton is overlaid on the image of your body:

At this point something like this should have appeared in your console:

Calibration started
Calibration done [1] successfully
Writing 217.596 50 50 78.4388 64.6762
Matching for existing calibration
Read 217.596 50 50 78.4388 64.6762

This seems to do some sort of gesture recognition and dynamically adjusts the camera resolution, so it's probably zooming in on an area of interest. When it starts, it asks you to perform a focus gesture. The NITE documentation doesn't seem to define what that gesture is, but simply sticking one hand out in front of you seems to make it happy, and you'll see the following output:

This app does hand tracking. UPDATE: to allow multiple hands to be tracked you will need to edit /usr/etc/primesense/XnVHandGenerator/Nite.ini, uncommenting the two config parameters it contains. Basically, remove the semicolons at the start of each line so that Nite.ini looks like this:


To persistently track different hands in your code you can make use of the XnVHandPointContext.nID in your OnPointUpdate callback.

This example allows you to click one of three boxes; your hand motion is tracked by a slider, and depending on the context, up, left, and right gestures will be recognized.

Wave to make the border of the window turn green. Then, I think, you need to send a focus gesture; after that, if you trace out a circle in the air with your hand, the onscreen circle will follow it, as seen below. In other words, if you draw a clockwise circle in the air, the clock hand will also spin clockwise, and vice versa. For some reason this appears to be annoyingly inconsistent.

This seems to just do player detection without skeleton tracking: