Gesture Controlled Robot

For the UTS open day, we knocked together a gesture-controlled robot. Check it out!

The robot is controlled by a laptop, which tracks the user’s hands using a structured-light scanner. The position of the user’s hands is then mapped onto the speeds of the robot’s motors.
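As a rough sketch of what that mapping can look like (this is illustrative, not the actual open-day code): treat the hand position as a throttle/steering pair and mix it into left/right speeds for a differential-drive robot.

```cpp
#include <algorithm>

// Hypothetical hand-to-motor mapping for a differential-drive robot.
// Hand position is normalised to [-1, 1] on each axis:
//   z (depth)   -> throttle: push the hand forward to drive forward
//   x (lateral) -> turn:     move the hand sideways to steer
struct MotorSpeeds { double left; double right; };

static double clampUnit(double v) {
    return std::max(-1.0, std::min(1.0, v));
}

MotorSpeeds handToMotors(double x, double z) {
    double throttle = clampUnit(z);
    double turn     = clampUnit(x);
    // Standard differential mix: turning adds to one side, subtracts
    // from the other, then each side is clamped back into [-1, 1].
    return { clampUnit(throttle + turn), clampUnit(throttle - turn) };
}
```

With this mixing, a hand pushed straight forward drives both motors forward equally, while a sideways offset slows one side to turn on the move.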

The sensor we’re using is called the Xtion Pro Live, which implements the OpenNI motion-tracking framework. We’ve coded a small Processing app that sends serial data to an Arduino via XBee to control the robot.
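The Arduino end of a link like this usually just reassembles motor commands from the byte stream. Here’s a minimal sketch of one way to frame and parse them (an assumed wire format for illustration, not necessarily what we used): a start marker byte followed by one speed byte per motor.

```cpp
#include <cstdint>

// Hypothetical three-byte frame: START, left speed, right speed.
// Speeds are sent as 0..254 (127 = stopped), so the 0xFF start
// marker can never be mistaken for a speed byte.
const uint8_t START = 0xFF;

struct FrameParser {
    int state = 0;
    uint8_t left = 127, right = 127;

    // Feed one byte as it arrives over serial; returns true once a
    // complete frame has been decoded into left/right.
    bool feed(uint8_t b) {
        switch (state) {
            case 0: if (b == START) state = 1; return false;
            case 1: left = b; state = 2;       return false;
            default: right = b; state = 0;     return true;
        }
    }
};
```

On the Arduino this would sit inside `loop()`, calling `feed(Serial.read())` whenever a byte is available and writing the decoded speeds to the motor driver; resynchronising on the start marker means a dropped byte only corrupts one frame.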

http://en.wikipedia.org/wiki/Structured-light_3D_scanner

http://www.openni.org/

http://www.processing.org/

http://code.google.com/p/simple-openni/
