Vive Hand Tracking SDK detects hands in the camera frame and provides both classification and position results.
Hand classification results contain left/right detection and gesture detection. These classification results are always available on all devices.
Left/Right hand detection
Each detected hand is labelled as either left or right. In most plugins (except C/C++), we provide functions/variables to get the left and right hand directly.
Vive Hand Tracking SDK returns at most one left hand and one right hand, as we assume a first-person view scenario.
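Because at most one left and one right hand are reported per frame, per-frame results can be split into two optional slots. The following sketch uses a hypothetical `HandResult` struct; the field names are illustrative and are not the SDK's exact C API.

```cpp
#include <cassert>
#include <optional>
#include <vector>

// Hypothetical per-hand result; not the SDK's exact struct layout.
struct HandResult {
  bool isLeft;    // true for the left hand, false for the right
  float x, y, z;  // position (meaning depends on the selected mode)
};

// At most one left and one right hand exist per frame, so each
// side can be represented as an optional slot.
struct FrameHands {
  std::optional<HandResult> left;
  std::optional<HandResult> right;
};

FrameHands SplitHands(const std::vector<HandResult>& results) {
  FrameHands frame;
  for (const auto& hand : results) {
    if (hand.isLeft)
      frame.left = hand;   // at most one left hand is expected
    else
      frame.right = hand;  // at most one right hand is expected
  }
  return frame;
}
```

This mirrors what the non-C/C++ plugins expose as dedicated left/right accessors.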
Pre-defined gesture classification
There are 6 pre-defined gestures in Vive Hand Tracking SDK. More gestures may be included in the future.
|Gesture|Description|
|---|---|
|Point|Only index finger straight and upward|
|Fist|All fingers folded|
|OK|Thumb and index finger connect into a circle, other fingers straight|
|Like|Only thumb straight and upward|
|Five|All fingers straight|
|Victory|Only index and middle fingers straight and upward|
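In code, the six gestures are typically consumed as an enum. This is an illustrative sketch; the SDK's actual enum (in its C/C# headers) may use different names or values, so treat `GestureType` and `Describe` as assumptions.

```cpp
#include <cassert>
#include <string>

// Illustrative enum for the six pre-defined gestures; the real SDK
// enum may differ in naming and numbering.
enum class GestureType { Point, Fist, OK, Like, Five, Victory, Unknown };

// Map each gesture to its description from the table above.
std::string Describe(GestureType g) {
  switch (g) {
    case GestureType::Point:   return "Only index finger straight and upward";
    case GestureType::Fist:    return "All fingers folded";
    case GestureType::OK:      return "Thumb and index finger connect into a circle, other fingers straight";
    case GestureType::Like:    return "Only thumb straight and upward";
    case GestureType::Five:    return "All fingers straight";
    case GestureType::Victory: return "Only index and middle fingers straight and upward";
    default:                   return "Unknown gesture";
  }
}
```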
Hand Positions (Modes)
Due to OS/hardware restrictions, Vive Hand Tracking SDK provides different position result types (a.k.a. modes) for users to choose from. Modes with more detailed information normally use more computational resources, so some modes are not available on certain platforms.
For a speed benchmark of different modes on different platforms, see Detection speed.
2D point
Supported on all platforms. A 2D point (x and y) with a fake z is returned for each detected hand. The point always lies in the plane 25 cm in front of the camera.
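Since only x and y carry real information in this mode, the returned point is best treated as a direction from the camera toward the hand. A minimal sketch, assuming the fixed 25 cm plane described above (the `Vec3` type and helper name are hypothetical):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Convert a 2D-point-mode result into a unit ray direction from the
// camera origin, using the fixed 25 cm plane the point is reported on.
Vec3 RayFrom2DPoint(float x, float y) {
  const float kPlaneZ = 0.25f;  // the 25 cm plane in front of the camera
  const float len = std::sqrt(x * x + y * y + kPlaneZ * kPlaneZ);
  return {x / len, y / len, kPlaneZ / len};
}
```

A hand reported at the camera's optical center, for example, yields a ray pointing straight ahead.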
3D point
Supported on all dual-camera platforms. A 3D point is returned for each detected hand.
Skeleton
Supported on Windows and WaveVR. Returns 21 key points for each detected hand, which form a skeleton of the hand. The locations of the key points are illustrated in the image below:
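A common layout for such 21-point hand skeletons is the wrist followed by four joints per finger. The ordering below is an assumption for illustration; confirm the actual indices against the image in the official documentation.

```cpp
#include <cassert>

// Assumed key-point layout (verify against the official image):
// 0 = wrist, 1-4 = thumb, 5-8 = index, 9-12 = middle,
// 13-16 = ring, 17-20 = pinky; joint 3 of each finger is the tip.
constexpr int kWrist = 0;
constexpr int kJointsPerFinger = 4;
constexpr int kKeyPointCount = 21;

// Index of a joint in the 21-point array.
// finger: 0 = thumb .. 4 = pinky; joint: 0 = nearest the palm .. 3 = fingertip.
constexpr int JointIndex(int finger, int joint) {
  return 1 + finger * kJointsPerFinger + joint;
}

// Under this layout, the index fingertip is key point 8.
static_assert(JointIndex(1, 3) == 8, "index fingertip");
static_assert(JointIndex(4, 3) == kKeyPointCount - 1, "pinky fingertip is the last point");
```

Indexing helpers like this make skeleton consumers (e.g. drawing bones between consecutive joints) independent of magic numbers.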