Please complete the steps in the Setup section before running your application.
Add Script to the Scene¶
To use the Vive Hand Tracking SDK in your Unity project, add the
GestureProvider script to your VR render camera and set its options in the Inspector.
- For most scenes, this is normally the camera object with the MainCamera tag.
- If you are using other VR plugins, this may be the head object in the camera prefab.
- Alternatively, this can be a game object that always has the same transform as the VR render camera.
The GestureProvider script must be attached to your VR render camera,
or to an object that has the same position and rotation as your VR render camera.
Otherwise your hands may not be displayed in the correct position.
The transform of
GestureProvider is used to make sure detected hand positions are in the correct global coordinate system.
Most games add an extra position/rotation offset to the rendering camera; please make sure the same offset is applied to the object the
GestureProvider script is attached to.
GestureProvider handles detection start and stop based on its lifecycle.
Detection is started when the script is enabled and stopped when the script is disabled/destroyed.
The results are cleared when detection stops.
To manually control start/stop of the detection, simply enable/disable the script:
GestureProvider.Current.enabled = true;  // enable the script to start the detection
GestureProvider.Current.enabled = false; // disable the script to stop the detection
Android Camera Permission¶
On Android/WaveVR platform, camera permission must be granted before starting the detection.
The Vive Hand Tracking SDK automatically handles camera permission for WaveVR and Daydream.
However, on most Android phones, Unity does not support runtime permissions before 2018.3.
In such cases, you need to add your own logic (or use other Unity plugins) to grant camera permission.
Make sure GestureProvider is disabled at startup, grant camera permission, and then enable the script to start the detection.
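The flow above can be sketched with Unity's built-in runtime permission API (available since 2018.3). This is only a sketch: the `CameraPermissionGate` class name is hypothetical, and the `gestureProvider` field is assumed to reference an initially disabled GestureProvider component in your scene.

```csharp
using System.Collections;
using UnityEngine;
#if UNITY_ANDROID && !UNITY_EDITOR
using UnityEngine.Android;
#endif

// Hypothetical helper: keeps detection off until the camera permission
// is granted, then enables GestureProvider to start the detection.
public class CameraPermissionGate : MonoBehaviour {
  // Assign the (initially disabled) GestureProvider component in the Inspector.
  public MonoBehaviour gestureProvider;

  IEnumerator Start() {
#if UNITY_ANDROID && !UNITY_EDITOR
    if (!Permission.HasUserAuthorizedPermission(Permission.Camera)) {
      Permission.RequestUserPermission(Permission.Camera);
      // Simplified: wait until the user grants the permission.
      while (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        yield return null;
    }
#endif
    gestureProvider.enabled = true;  // start the detection
    yield break;
  }
}
```

A production version should also handle the case where the user denies the permission, instead of waiting forever.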
Getting Detection Result¶
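As a minimal sketch of reading per-frame results: the `ViveHandTracking` namespace and the static `GestureProvider.LeftHand`/`GestureProvider.RightHand` accessors (assumed to return the latest GestureResult, or null when that hand is not detected) are taken from the plugin; the `HandResultLogger` class itself is hypothetical.

```csharp
using UnityEngine;
using ViveHandTracking;  // plugin namespace (assumed)

// Hypothetical example: log the detection result for both hands each frame.
public class HandResultLogger : MonoBehaviour {
  void Update() {
    LogHand("Left", GestureProvider.LeftHand);
    LogHand("Right", GestureProvider.RightHand);
  }

  void LogHand(string name, GestureResult hand) {
    if (hand == null)
      return;  // this hand is not detected in the current frame
    Debug.Log(name + " hand gesture: " + hand.gesture +
              ", joint count: " + hand.points.Length);
  }
}
```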
Draw Detected Hands As Skeletons¶
The plugin contains the
RightHandRenderer.prefab for rendering hands as skeletons (or as single points in 2D/3D modes).
The prefabs are located in
These prefabs utilize the
HandRenderer.cs script for rendering detected hands.
The script allows customization of display colors and hand colliders.
The displayed hand is shown in the images below. The left image has Show Gesture Color off, while the right image has it on.
By default, the HandRenderer script shows confidence as alpha, to hint to the user that the hand is moving into an area that is hard to detect. This requires the material to be transparent (and therefore it cannot be dynamically batched). If CPU usage is high (especially on Android), you can change the material to opaque to enable dynamic batching, at the cost of losing the confidence hint.
Draw Detected Hands As 3D Model¶
Displaying hands as 3D models is only supported in skeleton mode.
Two pre-made 3D hand models are included in the plugin, with prefabs located in
RightHandModel.prefab: Cartoon hand.
RightRobotHand.prefab: Robot hand.
Model hands are available in the sample scenes. Make the Like gesture with both hands to switch between skeleton and 3D model.
These prefabs utilize the
ModelRenderer.cs script, which applies skeleton transforms to a 3D model with skinned meshes.
See the Customize 3D Hand Model section for using custom models.