Our hand-tracking system tracks the position and orientation of a user's hands, as well as a limited set of poses for the ten fingers. Presently, we support several finger poses, such as pinching and pointing.
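To make the shape of this data concrete, here is a minimal sketch of the per-frame state such a tracker might report. All names and types below are illustrative assumptions, not the actual SDK classes; consult the API reference for the real interfaces.

```python
# Hypothetical per-frame hand state: palm position/orientation plus a
# coarse pose label per finger. Illustrative only, not the SDK's types.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Tuple

class FingerPose(Enum):
    RELAXED = 0
    PINCHING = 1
    POINTING = 2

@dataclass
class HandState:
    position: Tuple[float, float, float]            # palm position in camera space (mm)
    orientation: Tuple[float, float, float, float]  # palm orientation as a quaternion
    finger_poses: List[FingerPose] = field(
        default_factory=lambda: [FingerPose.RELAXED] * 5)

# Example: a left hand resting above the desk, thumb joining a pinch.
left = HandState(position=(0.0, 150.0, 300.0), orientation=(0.0, 0.0, 0.0, 1.0))
left.finger_poses[0] = FingerPose.PINCHING
```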
Our technology works best when the user is seated at a desk with the cameras looking down at the user's hands. This configuration lets the user rest their hands on the desk most of the time, and a desktop environment also facilitates interaction with virtual objects.
In this version of the hand-tracker, we've focused on exposing robust tracking of the hand, and of the pinching pose in particular. Pinching is a natural way for users to select or "grab" virtual objects or to drive a touchless GUI: it is analogous to "clicking" with a mouse, and in fact our API exposes the pinching gesture much like a mouse event.
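The mouse-event analogy can be sketched as follows: per-frame pinch state is turned into discrete press/release events, much as a mouse driver turns button state into clicks. The dispatcher and callback names here are hypothetical, not the SDK's actual event API.

```python
# Sketch: converting a per-frame "is pinching" flag into mouse-like
# press/release events. Names are illustrative, not the real SDK API.
from typing import Callable, List

class PinchDispatcher:
    """Fires press/release callbacks on pinch-state transitions."""
    def __init__(self) -> None:
        self._pinching = False
        self.on_press: List[Callable[[], None]] = []
        self.on_release: List[Callable[[], None]] = []

    def update(self, pinching: bool) -> None:
        # Edge-detect the pinch flag, like a mouse-button state machine.
        if pinching and not self._pinching:
            for cb in self.on_press:
                cb()
        elif not pinching and self._pinching:
            for cb in self.on_release:
                cb()
        self._pinching = pinching

events = []
d = PinchDispatcher()
d.on_press.append(lambda: events.append("press"))
d.on_release.append(lambda: events.append("release"))
for frame_pinching in [False, True, True, False]:  # simulated frames
    d.update(frame_pinching)
# events is now ["press", "release"]
```

Edge-detecting the flag rather than forwarding it raw means a sustained pinch generates one press, not one event per frame, which is what GUI-style code expects.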
Our system is not yet a general-purpose hand-tracking system. It relies on a database of poses to track a user's hands, and poses outside of the database may be tracked poorly. We are actively working on relaxing this constraint, and encourage you to try out a beta version of a general-purpose tracking system as described in the installation notes (part 2).
If you purchased a 3Gear Development Kit, installation is especially easy.
Otherwise, we also support some legacy camera and Kinect-based systems (although some of these cameras are harder to set up or less accurate).
Our setup guide describes how to calibrate the cameras and your hands, and how to run applications.
In this document, we describe how to begin using the API portion of the SDK to incorporate hand-tracking in your applications.
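A typical client application connects to the tracking service and then polls for hand data once per frame. The structure of that loop can be sketched as below; the tracker class and method names are placeholders (a stand-in object is used so the sketch runs on its own), so check the API reference for the actual calls.

```python
# Sketch of a client's main loop: poll the tracker each frame and react
# to the reported state. FakeHandTracker stands in for the real SDK
# client so this example is self-contained; its interface is assumed.
class FakeHandTracker:
    """Replays canned frames; a real client would receive live data."""
    def __init__(self) -> None:
        self._frames = iter([{"pinching": False},
                             {"pinching": True},
                             None])  # None signals shutdown

    def poll(self):
        return next(self._frames)

def run(tracker) -> int:
    """Process frames until the tracker shuts down; count pinch frames."""
    pinch_frames = 0
    while True:
        frame = tracker.poll()
        if frame is None:
            break
        if frame["pinching"]:
            pinch_frames += 1  # e.g. grab the virtual object under the hand
    return pinch_frames

pinch_frames = run(FakeHandTracker())
```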
If you run into issues along the way, we'd be happy to help you resolve them, but please check here first.