Available Features

Vive Hand Tracking SDK detects hands in the camera frame and provides gesture, position, and other results.

Hand Classification

Hand classification results contain left/right detection and gesture detection. These classifications are always available on all devices.

Left/Right hand detection

Hands are labelled as either left or right. Most plugins (except C/C++) provide functions/variables to get the left and right hand directly.

Hint

Vive Hand Tracking SDK returns at most one left hand and one right hand, as we assume a first person view scenario.

Pre-defined gesture classification

There are 6 pre-defined gestures in Vive Hand Tracking SDK. More gestures may be added in the future.

Gesture   Name      Description
point     Point     Only index finger straight and upward
fist      Fist      All fingers folded
ok        OK        Thumb and index finger connect into a circle, other fingers straight
like      Like      Only thumb straight and upward
five      Five      All fingers straight
victory   Victory   Only index and middle fingers straight and upward
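When handling classification results, the gesture table above can be mirrored as an enum. This is an illustrative sketch only: the enum name, member names, and values below are assumptions, not the SDK's actual header definitions, which differ per plugin.

```cpp
#include <string>

// Hypothetical enum mirroring the 6 pre-defined gestures; the real SDK
// headers define their own enum with possibly different names/values.
enum class GestureType { Unknown, Point, Fist, OK, Like, Five, Victory };

// Map a gesture to a short description, e.g. for logging or a debug UI.
std::string DescribeGesture(GestureType g) {
    switch (g) {
        case GestureType::Point:   return "only index finger straight and upward";
        case GestureType::Fist:    return "all fingers folded";
        case GestureType::OK:      return "thumb and index finger form a circle";
        case GestureType::Like:    return "only thumb straight and upward";
        case GestureType::Five:    return "all fingers straight";
        case GestureType::Victory: return "index and middle fingers straight";
        default:                   return "unknown gesture";
    }
}
```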

Hint

Vive Hand Tracking Unity & Unreal SDK supports defining and using custom gestures along with the pre-defined gestures. This is supported in skeleton mode (see below).

Hand Positions (Modes)

Due to OS/hardware restrictions, Vive Hand Tracking SDK provides different position result types (a.k.a. modes) for users to choose from. Modes with more detailed information normally use more computation resources, so some modes are not available on certain platforms.

For a speed benchmark of different modes on different platforms, see Detection speed.

2D point:

Supported on all platforms. A 2D point (x and y) with a placeholder z is returned for each detected hand. The point always lies in the plane 25 cm in front of the camera.

3D point:

Supported on all dual-camera platforms. A 3D point is returned for each detected hand.

Skeleton:

Supported on Windows and WaveVR. Returns 21 key points for each detected hand, which form a skeleton of the hand. Custom gestures are supported in the Unity & Unreal plugins. The locations of the key points are illustrated in the image below:

[Image: locations of the 21 hand skeleton key points]
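A common way to index 21 hand key points is wrist first, then four joints per finger, with each finger's tip as its last joint. The helper below assumes that ordering; it is a hypothetical sketch, so verify the actual index layout against the SDK's skeleton illustration before relying on it.

```cpp
// Hypothetical indexing helper for a 21-key-point hand skeleton,
// ASSUMING the layout: index 0 = wrist, then four joints per finger
// in the order thumb, index, middle, ring, pinky. Check the SDK's
// skeleton image for the real ordering.
constexpr int kKeyPointCount = 21;

// finger: 0 = thumb .. 4 = pinky; returns the index of that finger's tip.
constexpr int TipIndex(int finger) {
    return 1 + finger * 4 + 3;  // skip the wrist, then 4 joints per finger
}
```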

Pinch Detection

A pinch is defined as the thumb and index fingertips touching; the pose of the other three fingers does not matter. Vive Hand Tracking SDK returns a pinch level for each hand, representing the likelihood of pinching, within the range [0, 1]. A larger value means the two fingertips are closer and the user is more likely pinching.

We recommend using a threshold of 0.7 to decide whether a pinch is happening.
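The recommended threshold can be applied per hand as a simple comparison. The struct below is a hypothetical stand-in for the SDK's own per-hand result type, which exposes the pinch level through its own fields in each plugin.

```cpp
// Hypothetical per-hand result; the real SDK exposes the pinch level
// through its own plugin-specific structs.
struct HandResult {
    float pinchLevel;  // in [0, 1], larger = more likely pinching
};

// Treat the hand as pinching once the reported level reaches the
// recommended 0.7 threshold.
bool IsPinching(const HandResult& hand, float threshold = 0.7f) {
    return hand.pinchLevel >= threshold;
}
```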

Hand Confidence

Each hand comes with a confidence value within [0, 1]. A lower value indicates the hand is currently hard to detect, likely because it is near the FOV border or occluded by other objects. This value can be used to guide users away from positions that might cause detection to fail.

The Unity and Unreal SDKs use the confidence value as alpha when displaying hand results. Developers can also display other UI/UX hints when confidence is low.
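The alpha mapping can be as simple as using the confidence directly, as the Unity/Unreal plugins do. The sketch below also adds a low-confidence cutoff for showing a UI hint; the 0.5 cutoff is an illustrative value chosen here, not one taken from the SDK.

```cpp
#include <algorithm>

// Use the per-hand confidence directly as the render alpha, clamped
// defensively to [0, 1].
float HandAlpha(float confidence) {
    return std::clamp(confidence, 0.0f, 1.0f);
}

// Illustrative cutoff (0.5 is an assumption, not an SDK recommendation)
// below which an app might show a "move your hand toward the center" hint.
bool ShouldShowLowConfidenceHint(float confidence, float cutoff = 0.5f) {
    return confidence < cutoff;
}
```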