Available Features

Vive Hand Tracking SDK detects hands in the camera frame and provides gesture, position, rotation, and other results.

Hand Classification

Hand classification results contain left/right detection and gesture detection. These classifications are always available on all devices.

Left/Right hand detection

Hands are labelled as either left or right. In most plugins (except C/C++), we provide functions/variables to get the left and right hand directly.


Vive Hand Tracking SDK returns at most one left hand and one right hand, since a first-person view scenario is assumed.
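As an illustration, a detection loop might split results into one slot per hand. The `Hand` class and `is_left` field below are hypothetical stand-ins, not the SDK's actual API; each plugin exposes handedness through its own types.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Hand:
    # Hypothetical stand-in for a per-hand detection result.
    is_left: bool

def split_hands(results: List[Hand]) -> Tuple[Optional[Hand], Optional[Hand]]:
    """Keep at most one left and one right hand, matching the SDK's guarantee."""
    left = next((h for h in results if h.is_left), None)
    right = next((h for h in results if not h.is_left), None)
    return left, right
```

Either slot may be `None` when the corresponding hand is not detected in the frame.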

Pre-defined gesture classification

There are 6 pre-defined gestures in Vive Hand Tracking SDK. More gestures may be included in the future.

Gesture   Name      Description
point     Point     Only the index finger is straight and upward
fist      Fist      All fingers folded
ok        OK        Thumb and index finger form a circle, other fingers straight
like      Like      Only the thumb is straight and upward
five      Five      All fingers straight
victory   Victory   Only the index and middle fingers are straight and upward


Vive Hand Tracking Unity & Unreal SDK supports defining and using custom gestures along with the pre-defined gestures. This is supported in skeleton mode (see below).
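For reference, the six pre-defined gestures can be modeled as an enumeration. The string values and comments below simply mirror the table above; check your plugin's headers for the actual constants and their numeric values.

```python
from enum import Enum

class GestureType(Enum):
    # Illustrative enumeration of the SDK's pre-defined gestures;
    # the real constants live in each plugin's own headers.
    POINT = "point"      # Only the index finger is straight and upward
    FIST = "fist"        # All fingers folded
    OK = "ok"            # Thumb and index finger form a circle, others straight
    LIKE = "like"        # Only the thumb is straight and upward
    FIVE = "five"        # All fingers straight
    VICTORY = "victory"  # Only the index and middle fingers are straight and upward
```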

Hand Positions & Rotations (Modes)

Vive Hand Tracking SDK provides different position/rotation result types (a.k.a. modes) for users to choose from. 2D & 3D point modes are deprecated and will be removed in the future.

For a speed benchmark of different modes on different platforms, see Detection speed.


Skeleton:

Supported on all platforms. Returns 21 key points for each detected hand, which form a skeleton of the hand. Supports custom gestures in the Unity & Unreal plugins. The locations of the key points are illustrated in the image below:
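The 21 key points can be indexed per finger. The sketch below assumes the common layout of one wrist point followed by four joints per finger (root to tip); verify the exact ordering against the key-point illustration before relying on it.

```python
from typing import List

# Assumed layout: index 0 is the wrist, then 4 joints per finger.
# Confirm against the SDK's key-point illustration.
WRIST_INDEX = 0
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def finger_joint_indices(finger: str) -> List[int]:
    """Return the 4 key-point indices for a finger, from root to tip."""
    start = 1 + 4 * FINGERS.index(finger)
    return list(range(start, start + 4))
```

Under this assumption, `finger_joint_indices("index")` yields the four index-finger joints, with the last entry being the fingertip.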

2D point:

(Deprecated) Supported on all platforms. A 2D point (x and y) with a fake z is returned for each detected hand. The point always lies in the plane 25 cm in front of the camera.

3D point:

(Deprecated) Supported on all dual-camera platforms. A 3D point is returned for each detected hand.

In all three modes, palm position and rotation are always available.

Pinch Detection

Pinch is defined as touching the thumb and index finger tips together. The pose of the other three fingers does not matter. Hand Tracking SDK returns a pinch level and a pinch ray for each hand, including:

Pinch level: Represents the likelihood of pinching, within the range [0, 1]. A larger value means the two finger tips are closer and the user is more likely pinching.
Pinch start: The start position of the pinch ray. This is always available regardless of pinch level.
Direction: The direction vector of the pinch ray. This is always available regardless of pinch level.


We recommend using a threshold of 0.7 to decide whether a pinch is happening.
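Putting these fields together, pinch handling might look like the following sketch. The `PinchInfo` container is hypothetical (the real per-plugin types differ); the 0.7 threshold is the one recommended above.

```python
from dataclasses import dataclass
from typing import Tuple

PINCH_THRESHOLD = 0.7  # recommended threshold from the docs

@dataclass
class PinchInfo:
    # Hypothetical container for the SDK's per-hand pinch results.
    level: float                       # likelihood of pinching, in [0, 1]
    start: Tuple[float, float, float]  # pinch ray start position
    direction: Tuple[float, float, float]  # pinch ray direction vector

def is_pinching(pinch: PinchInfo) -> bool:
    """True when the pinch level crosses the recommended threshold."""
    return pinch.level >= PINCH_THRESHOLD

def point_on_ray(pinch: PinchInfo, distance: float) -> Tuple[float, ...]:
    """A point `distance` units along the pinch ray, e.g. for cursor placement."""
    return tuple(s + distance * d for s, d in zip(pinch.start, pinch.direction))
```

Because the ray start and direction remain available at any pinch level, an application can show a hover cursor before the pinch actually triggers.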

Hand Confidence

Each hand comes with a confidence value within [0, 1]. A lower value indicates the hand is currently hard to detect, likely because it is near the FOV border or occluded by other objects. This value can be used to guide the user away from places that might cause detection to fail.

The Unity and Unreal SDKs use the confidence value as the alpha when displaying hand results. Developers can also apply other UI/UX hints when confidence is low.
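For example, a renderer might fade the hand model with confidence and surface a hint when it drops too low. The hint threshold and helper names below are illustrative choices, not part of the SDK.

```python
LOW_CONFIDENCE_HINT = 0.4  # illustrative threshold, not defined by the SDK

def hand_alpha(confidence: float) -> float:
    """Use confidence directly as alpha, clamped to [0, 1], mirroring
    what the Unity and Unreal SDKs do when drawing hand results."""
    return max(0.0, min(1.0, confidence))

def should_show_hint(confidence: float) -> bool:
    """Suggest prompting the user to move their hand away from the FOV
    border or occluding objects when confidence is low."""
    return confidence < LOW_CONFIDENCE_HINT
```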