Advanced Usage

The Vive Hand Tracking Unity plugin also provides several utility scripts for common functions.

Using Gestures as Events

The HandStateChecker.cs script lets you use hand gestures as triggers/events. The script is designed as a state machine with at most 3 states:

../_images/state.png
  • Reset state (state 0): the default state. If the reset condition is met, all other states switch to the reset state. This is useful for stopping an action when one or both hands are missing.
  • Prepare state (state 1): stays in this state only while the prepare condition is met.
  • Trigger state (state 2): stays in this state only while the trigger condition is met. By default, this state can only be entered from state 1, unless the Allow Skip Prepare checkbox is on.

Tip

To make the prepare and trigger state transitions more robust and avoid state jumps, the script can require a condition to match for several consecutive frames before entering a state, and to be missing for several consecutive frames before returning to the reset state.

The condition for each hand can be multi-selected from gestures, including pre-defined gestures, pinch, and custom gestures. A condition is met if the hand currently matches one of the selected gestures. The conditions of both hands must be met to enter a state.
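Conceptually, the state machine described above can be sketched as follows. This is an illustrative simplification, not the actual HandStateChecker.cs source (names are hypothetical), and it omits the multi-frame debouncing described in the tip:

```csharp
// Illustrative sketch of the 3-state logic (not the actual HandStateChecker.cs source).
// state: 0 = reset, 1 = prepare, 2 = trigger
int state = 0;

void UpdateState(bool resetMet, bool prepareMet, bool triggerMet, bool allowSkipPrepare) {
  if (resetMet) {
    // The reset condition switches all other states back to state 0,
    // e.g. when one or both hands are missing.
    state = 0;
    return;
  }
  switch (state) {
    case 0: // reset state
      if (prepareMet) state = 1;
      else if (allowSkipPrepare && triggerMet) state = 2;
      break;
    case 1: // prepare state: stays here only while the prepare condition is met
      if (triggerMet) state = 2;
      else if (!prepareMet) state = 0;
      break;
    case 2: // trigger state: stays here only while the trigger condition is met
      if (!triggerMet) state = 0;
      break;
  }
}
```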

Note

Due to a flag-bit limitation, HandStateChecker.cs supports at most 10 single-hand custom gestures and 5 dual-hand custom gestures. For custom gestures that exceed the limit, see the Add Custom Gestures in Skeleton Mode section below for manually checking gesture state.

HandStateChecker.cs is used by most actions in the Sample scene. For example, in the right-hand laser action, the reset state does nothing, the prepare state (right-hand fist gesture) displays a red light, and the trigger state (right-hand five gesture) displays the laser. The left-hand condition is set to no hand or unknown gesture to avoid conflicts with other samples.

Add Custom Gestures in Skeleton Mode

Besides pre-defined gestures, you can define custom gestures based on your VR app content. Custom gestures are defined in terms of finger close/open states, with optional distance requirements between finger tips.

  • The thumb state is either close or open.
  • Other finger states are close, open, or relax (half-open).
  • Finger states are multi-selectable.
  • Restrictions can be added to require that the distance between two finger tips is close or far.
  • Custom gesture assets can be reused in all scenes.

Please refer to the CustomGesture sample scene for example usage.

Add Custom Gesture Definition

Custom gesture definitions are saved as assets in the Unity project. They can be created using the Assets menu or the context menu in the Project window:

../_images/custom_create.png

There are two types of custom gestures supported in the SDK:

  • Single hand custom gesture: The custom gesture is recognized for each hand separately. This is similar to a pre-defined gesture.

    ../_images/custom_rock.png
  • Dual hand custom gesture: The custom gesture requires both hands. Finger tip restrictions can be defined within each hand as well as across hands.

    ../_images/custom_vive.png

The above custom gesture assets can be found in the Assets/ViveHandTracking/Sample/Gestures folder and are used in the CustomGesture scene.

Using Custom Gestures in a Scene

To use a custom gesture in a scene:

  1. Make sure GestureProvider.cs is set up correctly in the scene, and the mode is set to Skeleton.
  2. Add the CustomGestureProvider.cs script to the scene.
  3. Register single-hand and dual-hand custom gestures in the CustomGestureProvider.cs inspector.
../_images/custom_provider.png

Note

Custom gestures must be added to CustomGestureProvider.cs, or they cannot be detected in the scene.

For the first 10 single-hand custom gestures and 5 dual-hand custom gestures, it is recommended to use HandStateChecker.cs to check gesture status and trigger UI actions.

../_images/custom_state.png

You can also check gesture status manually using SingleHandCustomGestures and DualHandCustomGestures. For a single-hand custom gesture, use IsLeftMatch and IsRightMatch to check whether the gesture is triggered for the left/right hand. For a dual-hand custom gesture, use IsMatch to check whether the gesture is triggered.
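A manual check might look like the following sketch. It assumes the gesture assets can be referenced directly from the inspector and that the asset class names match the gesture types; both are assumptions, so adjust to the actual types in your SDK version:

```csharp
using UnityEngine;
using ViveHandTracking;

public class CustomGestureLogger : MonoBehaviour {
  // Hypothetical references to custom gesture assets, assigned in the inspector.
  // The exact asset class names may differ in your SDK version.
  public SingleHandCustomGesture RockGesture;
  public DualHandCustomGesture ViveGesture;

  void Update() {
    // Single-hand custom gesture: each hand is checked separately.
    if (RockGesture.IsLeftMatch)
      Debug.Log("Left hand matches the rock gesture");
    if (RockGesture.IsRightMatch)
      Debug.Log("Right hand matches the rock gesture");

    // Dual-hand custom gesture: a single flag covers both hands.
    if (ViveGesture.IsMatch)
      Debug.Log("Both hands match the vive gesture");
  }
}
```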

Using Hands with the Unity Event System

The plugin provides an input module implementation, HandInputModule.cs, that works with the Unity event system. Follow the steps below to set up your scene with the hand input module:

  1. Add prefab Assets\ViveHandTracking\Prefab\HandInputModuleDispay.prefab to your scene.

  2. Set up the event system in the scene. If you skip this step, it is done by the script during Awake.

    1. Add an event system to your scene if one does not exist.
    2. Attach HandInputModule.cs to the event system object.
    3. Set the Hand Input property of HandInputModule to the game object instantiated from HandInputModuleDispay.prefab.
  3. Set up every canvas in the scene that needs to receive UI events from the input module.

    ../_images/canvas.png
    1. Make sure the canvas render mode is World Space.
    2. The Event Camera of the canvas must be None.
    3. The canvas must have a GraphicRaycaster on the same game object.
  4. Customize hand behavior for UI interaction. The default is to use the first visible hand, with click triggered by pinch.

    ../_images/input_ray_pointer.png
    1. Change Input Type if you want to always use the left or right hand for UI interaction.
    2. Change the Prepare Condition of Left Hand State and Right Hand State if you want different gestures for the click action.

Important

HandInputModule.cs cannot work with other VR-based input modules, including but not limited to those in the SteamVR, GoogleVR, and WaveVR plugins. You must disable other input modules to avoid conflicts.

Customize Input Module Graphics

By default, the prefab Assets\ViveHandTracking\Prefab\HandInputModuleDispay.prefab uses HandInputRayPointer.cs to draw rays and pointers for UI interaction. You can customize how UI hints are shown by providing your own implementation that derives from BaseHandInput.cs.

  1. You need to override the BaseHandInput.SetHit function in your implementation:

    public abstract class BaseHandInput : MonoBehaviour {
      // Called every frame to update the displayed graphics.
      // The ray starts at transform.position and points along transform.forward.
      // Parameter: distance from the ray start point to the hit point, or null if there is no hit.
      public abstract void SetHit(float? distance);
    
      // implementation details...
    }
    
  2. Set the Hand Input property of HandInputModule to your component derived from BaseHandInput.cs.
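As an illustration, a minimal derived class might simply show a marker at the hit point. This is a sketch, not part of the SDK; the Dot marker object and its inspector wiring are assumptions:

```csharp
using UnityEngine;

// Minimal custom UI hint: shows a small marker object at the ray hit point.
public class SimpleDotInput : BaseHandInput {
  // Hypothetical marker object (e.g. a small sphere), assigned in the inspector.
  public Transform Dot;

  public override void SetHit(float? distance) {
    if (distance == null) {
      // No hit this frame: hide the marker.
      Dot.gameObject.SetActive(false);
      return;
    }
    Dot.gameObject.SetActive(true);
    // Place the marker at the hit point along the ray
    // (ray starts at transform.position, pointing along transform.forward).
    Dot.position = transform.position + transform.forward * distance.Value;
  }
}
```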

Switch Hand Display

In the sample scenes, there is a utility script to switch the hand display between sphere-links and hand models. The script calls SetActive on the given game objects to show only one kind of hand display at a time; the display can be changed using gestures. Make sure to check Is Model for 3D model renderer objects, so these objects are ignored when the current mode is not skeleton.

Note

Make sure there is at least one display object that supports non-skeleton mode.

../_images/switch_display.png

HandDisplaySwitch.cs should be used together with HandStateChecker.cs so that the hand switch event can be triggered. HandDisplaySwitch.OnStateChanged must be registered to the HandStateChecker.OnStateChanged event. The hand display switches to the next object whenever the prepare state is entered. The trigger condition is not used, so set the condition of both hands to Nothing.
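If you prefer to wire this up in code rather than in the inspector, the registration might look like the sketch below. It assumes OnStateChanged is a UnityEvent-style event that passes the new state, which may differ in your SDK version:

```csharp
using UnityEngine;

public class DisplaySwitchSetup : MonoBehaviour {
  public HandStateChecker Checker;   // trigger condition of both hands set to Nothing
  public HandDisplaySwitch Switcher; // points to the list of hand display objects

  void Awake() {
    // Forward state changes so the display switches when the prepare state is entered.
    // Assumes OnStateChanged is a UnityEvent taking the new state; adjust to the actual signature.
    Checker.OnStateChanged.AddListener(Switcher.OnStateChanged);
  }
}
```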

Hint

In the sample scenes, we switch the hand display for both hands together by pointing to a list of hand game objects. You can switch the display for each hand separately by using two HandDisplaySwitch.cs scripts, each pointing to a list of left/right hand game objects. Make sure to add two HandStateChecker.cs scripts as well.