Tuesday 22 February 2011

Using Kinect to control Jmol - Part II

Continuing from Part I...

Rather than write Java code that interfaces directly with the Kinect, we are going to make our lives easier by taking the information we want from the Kinect and making it available through a port. We can then read the information from that port using software written in any language, not just Java.

Here's one someone else made earlier: OSCeleton.

(1) Click on the Downloads button on the OSCeleton GitHub page (on the right-hand side) to get an .exe for Windows.

(2) Unzip it, and run the executable.

(3) Stand back from the Kinect and assume the calibration pose (the classic "hands up" posture, with elbows at 90 degrees; see this example) until you see the message "The calibration was finished successfully".

You are now being tracked, and your movements are being broadcast using the OSC (Open Sound Control) protocol to a port on your machine.
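
If you're curious what that broadcast looks like, here is a minimal sketch of a listener using pyliblo (installed in step 4 below). It assumes OSCeleton's defaults as I understand them: joint positions arrive on port 7110 as /joint messages carrying a joint name, a user id, and x, y, z coordinates. Check the OSCeleton README if your setup differs.

    import liblo

    # Assumed OSCeleton defaults: /joint messages on port 7110 with
    # typespec "sifff" (joint name, user id, x, y, z).
    PORT = 7110

    def on_joint(path, args, types, src):
        joint, user, x, y, z = args
        print "%s (user %d): %.3f %.3f %.3f" % (joint, user, x, y, z)

    server = liblo.Server(PORT)
    server.add_method("/joint", "sifff", on_joint)
    server.add_method(None, None, lambda *a: None)  # ignore other messages

    while True:
        server.recv(100)  # wait up to 100 ms for the next message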

Now we're going to use Python to tidy up the broadcast data, smooth it a bit, draw it on screen, and rebroadcast it for Jmol.

(4) Install pyliblo. The pyliblo website does not include a Windows .exe, so I used the one from touchpy. Note that this build requires Python 2.5.

(5) Install pygame and PyOpenGL (both of these can be installed with easy_install.exe).

(6) Clone my git repo. This is a fork of code by Ben O'Steen that adds support for Python 2.5 (see above) and smoothing (see the sketch after step 7 for the general idea).

(7) Run osc_hand_viz.py. You will see something similar to the image at the end of Ben's recent blog post on using the same code to control an AR.Drone.
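
To give a flavour of the smoothing-and-rebroadcast idea, here is a rough sketch (not the actual code from the repo): each incoming joint position is blended with the previous smoothed value using a simple exponential moving average, and the result is sent back out over OSC on a second port (7111 here is just a made-up choice) for the next program in the chain, ultimately Jmol, to pick up.

    import liblo

    IN_PORT = 7110    # assumed OSCeleton default
    OUT_PORT = 7111   # hypothetical port for the rebroadcast
    ALPHA = 0.5       # 1.0 = no smoothing; smaller = smoother but laggier

    smoothed = {}     # (user, joint) -> last smoothed (x, y, z)

    def on_joint(path, args, types, src):
        joint, user, x, y, z = args
        key = (user, joint)
        if key in smoothed:
            ox, oy, oz = smoothed[key]
            x = ALPHA * x + (1 - ALPHA) * ox
            y = ALPHA * y + (1 - ALPHA) * oy
            z = ALPHA * z + (1 - ALPHA) * oz
        smoothed[key] = (x, y, z)
        # Rebroadcast the smoothed position for downstream consumers.
        liblo.send(OUT_PORT, "/joint", joint, user, x, y, z)

    server = liblo.Server(IN_PORT)
    server.add_method("/joint", "sifff", on_joint)

    while True:
        server.recv(100)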

Image: Scott & Elaine van der Chijs
