Samples

The plugin provides five VR sample scenes and two AR sample scenes in the Assets\ViveHandTracking\Sample folder; these serve as tutorials on how to use the SDK. For details on using the Vive Hand Tracking SDK, refer to the scripts in each scene.

  1. The Sample scene demonstrates how to use hands to interact with game objects in the scene. Supports all platforms and modes. (A sketch of polling these gestures from script follows this list.)

    1. Switch hand display: make a Like gesture with both hands. See Switch Hand Display for details.
    2. Remote grab: make a Fist gesture with both hands to aim at an object, then a Five gesture with both hands to move the selected object.
    3. Laser: use a right-hand Fist to prepare and a right-hand Five to trigger a red laser.
    4. PinchLaser: use the left hand to cast a pinch laser; pinch with the left hand to bump the selected cube.
    5. Grab: use a left-hand OK gesture to pick up 3D objects that are within reach.
    6. Push: use the right hand to push 3D objects.
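
    The sketch below shows one hedged way to poll these gestures from a script. It assumes the GestureProvider.LeftHand/RightHand static properties (null when a hand is not detected) and the GestureType enum shipped with the plugin; confirm the exact names against the plugin scripts.

      using UnityEngine;
      using ViveHandTracking;

      // Hedged sketch: poll detected gestures once per frame.
      // Assumes LeftHand/RightHand are null when the hand is not seen,
      // and that GestureType includes Like, Fist and Five.
      public class GesturePollingExample : MonoBehaviour {
        void Update() {
          var left = GestureProvider.LeftHand;
          var right = GestureProvider.RightHand;
          if (left != null && right != null &&
              left.gesture == GestureType.Like &&
              right.gesture == GestureType.Like)
            Debug.Log("Both hands Like: switch hand display");
          if (right != null && right.gesture == GestureType.Fist)
            Debug.Log("Right-hand Fist: prepare laser");
        }
      }
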
  2. The UISample scene demonstrates how to use hands with the Unity event system. Supports all platforms and modes. Use pinch to interact with Unity UI; see Using Hand with Unity Event System for details, and the sketch below.
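
    Once the plugin's hand input module is set up as described in that guide, UI elements receive pinch input through the standard Unity event interfaces. A minimal sketch, using only plain Unity API:

      using UnityEngine;
      using UnityEngine.EventSystems;

      // Hedged sketch: attach to a UI Graphic (e.g. an Image) with
      // Raycast Target enabled. With the plugin's input module driving
      // the event system, a pinch arrives as an ordinary pointer click.
      public class PinchClickTarget : MonoBehaviour, IPointerClickHandler {
        public void OnPointerClick(PointerEventData eventData) {
          Debug.Log("UI element pinched: " + gameObject.name);
        }
      }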

  3. The CustomGesture scene demonstrates how to add custom gestures. Only supports skeleton mode. This scene detects the custom gestures Rock and Vive. See Add Custom Gestures in Skeleton Mode for details, and the sketch below.
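
    The scene defines its gestures with the plugin's custom gesture support; purely as an illustration, a hand-rolled check over the raw skeleton could look like the sketch below. It assumes GestureResult.points holds 21 joints ordered wrist (0), thumb (1-4), index (5-8), middle (9-12), ring (13-16) and pinky (17-20); verify the ordering against the plugin documentation.

      using UnityEngine;
      using ViveHandTracking;

      // Hedged sketch: a hand-rolled "Rock" check (index and pinky
      // extended, middle and ring curled) in skeleton mode. Joint
      // ordering and the 1.5f threshold are assumptions to tune.
      public class RockGestureExample : MonoBehaviour {
        void Update() {
          var hand = GestureProvider.RightHand;
          if (hand == null || hand.points == null || hand.points.Length < 21)
            return;
          Vector3 wrist = hand.points[0];
          bool indexOut = Extended(wrist, hand.points[5], hand.points[8]);
          bool pinkyOut = Extended(wrist, hand.points[17], hand.points[20]);
          bool middleIn = !Extended(wrist, hand.points[9], hand.points[12]);
          bool ringIn = !Extended(wrist, hand.points[13], hand.points[16]);
          if (indexOut && pinkyOut && middleIn && ringIn)
            Debug.Log("Rock gesture detected");
        }

        // A finger counts as extended when its tip is clearly farther
        // from the wrist than its base knuckle.
        static bool Extended(Vector3 wrist, Vector3 root, Vector3 tip) {
          return Vector3.Distance(wrist, tip) >
                 Vector3.Distance(wrist, root) * 1.5f;
        }
      }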

  4. The JointRotation scene visualizes joint rotations. Only supports skeleton mode. The 21 joints of each hand are rendered as cubes with three axes.
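
    A hedged sketch of such a visualization, assuming GestureResult exposes parallel points and rotations arrays of 21 entries in world space (field names may differ between plugin versions):

      using UnityEngine;
      using ViveHandTracking;

      // Hedged sketch: one cube per joint, oriented by the joint
      // rotation. Assumes world-space points/rotations arrays.
      public class JointCubesExample : MonoBehaviour {
        Transform[] cubes;

        void Start() {
          cubes = new Transform[21];
          for (int i = 0; i < 21; i++) {
            var cube = GameObject.CreatePrimitive(PrimitiveType.Cube).transform;
            cube.localScale = Vector3.one * 0.01f;  // 1 cm cubes
            cube.SetParent(transform, false);
            cubes[i] = cube;
          }
        }

        void Update() {
          var hand = GestureProvider.RightHand;
          bool visible = hand != null && hand.points != null &&
                         hand.points.Length == 21;
          for (int i = 0; i < 21; i++) {
            cubes[i].gameObject.SetActive(visible);
            if (!visible) continue;
            cubes[i].position = hand.points[i];
            if (hand.rotations != null && hand.rotations.Length == 21)
              cubes[i].rotation = hand.rotations[i];
          }
        }
      }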

  5. The Test_WaveVR scene demonstrates how to add the GestureProvider script to an existing WaveVR prefab. Only supported on Vive Focus/Focus Plus/Focus 3.
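
    The sample adds the component in the editor; as a hedged runtime alternative (with "head" as an assumed name for the WaveVR camera object in the prefab):

      using UnityEngine;
      using ViveHandTracking;

      // Hedged sketch: attach GestureProvider to the WaveVR camera
      // object at runtime if it is not already present. The object
      // name "head" is a guess about the rig; adjust to your prefab.
      public class AttachGestureProvider : MonoBehaviour {
        void Start() {
          var head = GameObject.Find("head");  // hypothetical object name
          if (head != null && head.GetComponent<GestureProvider>() == null)
            head.AddComponent<GestureProvider>();
        }
      }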

  6. The ARCore scene and the ARFoundation scene demonstrate how Google ARCore and ViveHandTracking work together. Only supported on ARCore-supported devices.

    Note

    Use the ARCore scene when working with the Google ARCore SDK directly, and the ARFoundation scene when using Unity ARFoundation + ARCore.

The sample scenes also show how to skip controller calibration in WaveVR; see DisableController.cs for details.