EDIT: You can download the sample application sources at http://demos.conquex.com/kinect/CLNUIDeviceTest_20101202.zip


Update: OK, some real-world scenarios – browsing like a Jedi :)

Engadget Kinect browsing with the force from Stoyan Pedev on Vimeo.


This is from README1st.TXT

///// Microsoft Kinect, http://codelaboratories.com/nui/, http://www.aforgenet.com/, http://inputsimulator.codeplex.com/,
///// http://www.conquex.com
///// Stoyan Pedev
///// Desktop browsing sample application v.0.1

Please note that this application is an early alpha release, so some of its features may still be buggy.
I wanted to stay as close as possible to the original sample application from http://codelaboratories.com/nui/, so the
code was intentionally left unencapsulated.

Download the Microsoft Kinect drivers for Windows from http://codelaboratories.com/nui/ and install them.
Navigate to Program Files/Code La…s/Sample and back up the existing sample application directory (e.g. rename it to
CLNUIDeviceTest_bak), then copy the contents of the zip file into the Sample directory.

Running the application:
Navigate to CLNUIDeviceTest\bin\Debug and start the .exe file.
Make yourself comfortable :) – the control region is at a distance of ~2 feet (65 cm) from the Kinect camera.
Click the checkbox to start feed processing. When ready, click the checkbox to relay the events, then click on the
application you want to scroll (up and down only at the moment). Please note that the event-relaying code is somewhat
aggressive :) and feeds the events directly to the OS.
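The "feeds the events directly to the OS" part can be sketched roughly like this. This is a minimal illustration, not the sample's actual code (the sample builds on http://inputsimulator.codeplex.com/); here user32.dll is P/Invoked directly just to show the idea of injecting a scroll event system-wide:

```csharp
using System;
using System.Runtime.InteropServices;

// Minimal sketch of injecting a mouse-wheel event into the OS.
// The event lands in whichever window currently has focus, which is
// why you first click on the application you want to scroll.
static class ScrollRelay
{
    const uint MOUSEEVENTF_WHEEL = 0x0800;
    const int WHEEL_DELTA = 120; // one "notch" of the mouse wheel

    [DllImport("user32.dll")]
    static extern void mouse_event(uint dwFlags, uint dx, uint dy,
                                   uint dwData, UIntPtr dwExtraInfo);

    // direction > 0 scrolls up, direction < 0 scrolls down
    public static void Scroll(int direction)
    {
        mouse_event(MOUSEEVENTF_WHEEL, 0, 0,
                    unchecked((uint)(direction * WHEEL_DELTA)), UIntPtr.Zero);
    }
}
```

This is also why the relaying feels aggressive: there is no per-window targeting, the events go straight into the global input stream.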

You are welcome to make any improvements, fixes, etc. to this sample.
If you like it or have any suggestions, drop me a line at: pedev@conquex.com

Old post:
Well, it was about time. I'd spent the last two weeks on a home project involving some computer vision algorithms (extracting eye movements -> then moving the mouse) when the hacked Kinect driver came out, so I decided to give it a try.
driver: http://codelaboratories.com/nui/
Driver installation is easy and pretty straightforward – just click install and the setup does everything for you – and it comes with a sample project (VS 2010, WPF).
The driver delivers images as an InteropBitmap – http://msdn.microsoft.com/en-us/library/system.windows.interop.interopbitmap.aspx – so a little tweaking and manipulation was needed to access the feed (pretty /unsafe coding).
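For reference, the pixel access can be sketched as follows. This is a hedged illustration, not the code from the post: InteropBitmap derives from BitmapSource, so its CopyPixels method is the managed alternative to the /unsafe pointer route mentioned above.

```csharp
using System.Windows;
using System.Windows.Media.Imaging;

// Sketch: copying the raw pixels out of the BitmapSource/InteropBitmap
// that the driver delivers, so the frame can be processed further.
static byte[] GetPixels(BitmapSource feed)
{
    int bytesPerPixel = (feed.Format.BitsPerPixel + 7) / 8;
    int stride = feed.PixelWidth * bytesPerPixel;
    var pixels = new byte[stride * feed.PixelHeight];
    feed.CopyPixels(new Int32Rect(0, 0, feed.PixelWidth, feed.PixelHeight),
                    pixels, stride, 0);
    return pixels; // e.g. BGRA byte quadruplets for a 32-bit RGB feed
}
```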
Frankly, I was too lazy to rewrite the algorithm for the DeepRaw feed, so I decided just to extract some color from the RGB feed (AForge.NET color filtering ftw), then ran blob detection on the result image and voilà:

Microsoft Kinect with AForge.NET from Stoyan Pedev on Vimeo.
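The color-filter-then-blob step shown in the video might look roughly like this with AForge.NET. The IntRange thresholds and the blob size limits below are made up for illustration – the actual values depend on the marker color being tracked:

```csharp
using System.Drawing;
using AForge;                 // IntRange
using AForge.Imaging;         // BlobCounter, Blob
using AForge.Imaging.Filters; // ColorFiltering

// Sketch: keep only pixels in a chosen RGB range, then find the
// connected regions (blobs) that survive the filtering.
static Blob[] FindColorBlobs(Bitmap frame)
{
    var filter = new ColorFiltering
    {
        Red   = new IntRange(150, 255), // illustrative values –
        Green = new IntRange(0, 100),   // tune to the tracked color
        Blue  = new IntRange(0, 100),
    };
    using (Bitmap filtered = filter.Apply(frame))
    {
        var counter = new BlobCounter
        {
            FilterBlobs = true,
            MinWidth = 10,  // drop tiny noise blobs
            MinHeight = 10,
        };
        counter.ProcessImage(filtered);
        return counter.GetObjectsInformation();
    }
}
```

From there, the center of gravity of the largest blob can serve as the tracked point, and its vertical movement inside the control region can drive the scroll direction.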

Please note, this is just a teaser. I plan to include a lot more (z-index extraction straight from the camera feed, some optimizations for the processed feed, etc.).
Maybe if Karev finds his Logitech wheel controller, there will be some NFS 'remote' steering.
