Basic Usage

Please complete the steps in the Setup section before running your application.

The main interface is provided by the singleton class GestureProvider. Make sure exactly one instance of the script is attached in the scene. Detection results are stored in the GestureResult class.

Add Script to the Scene

To use the Vive Hand Tracking SDK in your Unity project, add the GestureProvider script to your VR render camera and set its options in the Inspector.

  • For most scenes, this is normally the camera object with the MainCamera tag.
  • If you are using other VR plugins, this is the head object in the camera prefab.
  • If you are using the Unity XR system, this is the game object with the TrackedPoseDriver script.
  • Alternatively, this can be a game object that has the same transform as the VR render camera (a helper-script sketch is shown after the note below).
../_images/provider.png

Note

GestureProvider script must be attached to your VR render camera, or to an object that has the same position and rotation as your VR render camera. Otherwise your hands may not be displayed in the correct position.

The transform of GestureProvider is used to make sure detected hand positions are in the correct global coordinate system. Most games add an extra position/rotation offset to the rendering camera; make sure the same offset is applied to the object the GestureProvider script is attached to.

See the scenes in the Samples section for examples of how to add the GestureProvider script.
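If GestureProvider is attached to a separate object instead of the render camera itself, you can keep that object aligned with the camera using a small helper script. Below is a minimal sketch; the script and field names are illustrative and not part of the SDK:

using UnityEngine;

// Keeps this object's transform identical to the VR render camera, so a
// GestureProvider attached to this object reports hands in the correct
// global coordinate system.
public class FollowRenderCamera : MonoBehaviour {
  public Transform renderCamera;  // assign your VR render camera here

  void LateUpdate() {
    // Copy position and rotation after the camera has been updated by tracking.
    transform.SetPositionAndRotation(renderCamera.position, renderCamera.rotation);
  }
}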

Start/Stop Detection

GestureProvider handles detection start and stop based on its lifecycle. Detection starts when the script is enabled and stops when the script is disabled or destroyed. The results are cleared when detection stops.

To manually control start/stop of the detection, simply enable/disable the script:

GestureProvider.Current.enabled = true;  // enable the script to start the detection
GestureProvider.Current.enabled = false; // disable the script to stop the detection
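For example, detection can be paused while the application is in the background. The following is a minimal sketch; it assumes a GestureProvider is already set up in the scene and that the SDK scripts live in the ViveHandTracking namespace:

using UnityEngine;
using ViveHandTracking;

public class DetectionPauser : MonoBehaviour {
  void OnApplicationPause(bool paused) {
    if (GestureProvider.Current == null)
      return;
    // Stop detection while in the background, resume when the app returns.
    GestureProvider.Current.enabled = !paused;
  }
}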

Android Camera Permission

On the Android/WaveVR platform, camera permission must be granted before starting the detection. The Vive Hand Tracking SDK handles camera permission automatically for WaveVR, ARCore and Daydream. However, on most Android phones, Unity does not support requesting runtime permissions before version 2018.3. In such cases, you need to add your own logic (or use another Unity plugin) to request camera permission: make sure GestureProvider is disabled at startup, request and obtain camera permission, then enable GestureProvider.
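The following is a minimal sketch of this flow for Unity 2018.3 or newer, using Unity's Android Permission API; on older versions, replace the permission calls with your own plugin. The script name and the provider field are illustrative:

using System.Collections;
using UnityEngine;
#if UNITY_ANDROID && !UNITY_EDITOR
using UnityEngine.Android;
#endif
using ViveHandTracking;

public class CameraPermissionStarter : MonoBehaviour {
  // Reference to a GestureProvider that is disabled at startup.
  public GestureProvider provider;

  IEnumerator Start() {
#if UNITY_ANDROID && !UNITY_EDITOR
    if (!Permission.HasUserAuthorizedPermission(Permission.Camera)) {
      Permission.RequestUserPermission(Permission.Camera);
      // Wait until the permission dialog is answered; handling of a denied
      // permission is omitted for brevity.
      while (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        yield return null;
    }
#endif
    // Start detection only after camera permission is granted.
    provider.enabled = true;
    yield break;
  }
}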

Getting Detection Results

When detection is running, detected left and right hand results are available as GestureProvider.LeftHand and GestureProvider.RightHand.

Each hand contains the following members; see the Samples section and the GestureResult class for details:

  • gesture: the detected pre-defined gesture.
  • points: positions of the hand joints; the meaning differs depending on the selected mode.
  • rotations: rotations of the hand joints; the meaning differs depending on the selected mode.
  • position: position of the palm center.
  • rotation: rotation of the palm center.
  • pinchLevel: pinch (thumb and index finger) information, including pinch level and pinch ray.
  • confidence: a hint about how difficult the hand is to detect.

To query the current detection mode and status, use GestureProvider.Mode and GestureProvider.Status.
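The following is a minimal sketch of reading these values each frame; it assumes the ViveHandTracking namespace and uses only the members listed above:

using UnityEngine;
using ViveHandTracking;

public class HandResultLogger : MonoBehaviour {
  void Update() {
    // LeftHand/RightHand are null when the corresponding hand is not detected.
    GestureResult left = GestureProvider.LeftHand;
    if (left != null)
      Debug.LogFormat("Left gesture: {0}, palm position: {1}, confidence: {2}",
                      left.gesture, left.position, left.confidence);

    GestureResult right = GestureProvider.RightHand;
    if (right != null)
      Debug.LogFormat("Right gesture: {0}, palm position: {1}", right.gesture, right.position);

    // Current detection mode and status can be queried at any time.
    Debug.LogFormat("Mode: {0}, Status: {1}", GestureProvider.Mode, GestureProvider.Status);
  }
}

Logging every frame is only useful for debugging; in a real application, read these values wherever the hand data is consumed.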

Draw Detected Hands As Skeletons

The plugin contains LeftHandRenderer.prefab and RightHandRenderer.prefab for rendering hands as skeletons (or as single points in 2D/3D modes). The prefabs are located in the Assets\ViveHandTracking\Prefab folder. These prefabs use the HandRenderer.cs script to render detected hands. The script allows customization of display colors and hand colliders.

../_images/renderer.png
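The prefabs are normally dragged into the scene in the Editor. If you prefer to spawn them from script, a minimal sketch is shown below; the field names are illustrative, and it assumes the renderer prefabs pick up detection results from GestureProvider on their own, so no extra wiring is needed:

using UnityEngine;

public class SpawnHandRenderers : MonoBehaviour {
  // Assign LeftHandRenderer.prefab and RightHandRenderer.prefab in the Inspector.
  public GameObject leftHandRendererPrefab;
  public GameObject rightHandRendererPrefab;

  void Start() {
    // The instantiated renderers display whichever hands are detected.
    Instantiate(leftHandRendererPrefab);
    Instantiate(rightHandRendererPrefab);
  }
}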

The displayed hands are shown in the images below. The left image has Show Gesture Color off, while the right image has it on.

../_images/hand1.png ../_images/hand2.png

Note

By default, the HandRenderer script uses confidence as the alpha value, to give the user a hint when the hand moves into areas that are hard to detect. This requires the material to be transparent (which prevents dynamic batching). If CPU usage is high (especially on Android), you can change the material to opaque so it can be dynamically batched, at the cost of losing the confidence hint.

Draw Detected Hands As 3D Model

Note

Displaying hands as 3D models is only supported in skeleton mode.

Two pre-made 3D hand models are included in the plugin, with prefabs located in the Assets\ViveHandTracking\Prefab folder:

  • LeftHandModel.prefab and RightHandModel.prefab: Cartoon hand.

    ../_images/cartoon_hand.png
  • LeftRobotHand.prefab and RightRobotHand.prefab: Robot hand.

    ../_images/robot_hand.png

Model hands are available in the sample scenes. Make the Like gesture with both hands to switch between skeleton and 3D model. These prefabs use the ModelRenderer.cs script to apply the skeleton transforms to a 3D model with skinned meshes. See the Customize 3D Hand Model section for using custom models.
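A minimal sketch of a similar switch in your own scene is shown below; it assumes the GestureType enum contains a Like value (check the enum in your SDK version), and the object references are illustrative:

using UnityEngine;
using ViveHandTracking;

public class HandDisplaySwitcher : MonoBehaviour {
  public GameObject skeletonHands;  // e.g. parent of LeftHandRenderer + RightHandRenderer
  public GameObject modelHands;     // e.g. parent of LeftHandModel + RightHandModel
  private bool wasLike = false;

  void Update() {
    var left = GestureProvider.LeftHand;
    var right = GestureProvider.RightHand;
    bool isLike = left != null && right != null &&
                  left.gesture == GestureType.Like && right.gesture == GestureType.Like;
    // Toggle only on the frame the gesture starts, not on every frame it is held.
    if (isLike && !wasLike) {
      bool useModel = !modelHands.activeSelf;
      modelHands.SetActive(useModel);
      skeletonHands.SetActive(!useModel);
    }
    wasLike = isLike;
  }
}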