Samples
There are five VR sample scenes and two AR sample scenes in the plugin's Assets\ViveHandTracking\Sample
folder, which serve as tutorials on how to use the SDK.
For detailed usage of the Vive Hand Tracking SDK, please refer to the scripts in each scene.
Sample
scene demos how to use hands to interact with game objects in the scene. Supports all platforms and modes.

- Switch hand display: make the like gesture with both hands. See Switch Hand Display for details.
- Remote grab: make the fist gesture with both hands to aim at an object, then make the five gesture with both hands to move the selected object.
- Laser: make a fist with the right hand to prepare, then the five gesture with the right hand to trigger a red laser.
- PinchLaser: use the left hand to cast a pinch laser; pinch with the left hand to bump the selected cube.
- Grab: make the OK gesture with the left hand to pick up 3D objects within reach.
- Push: use the right hand to push 3D objects.
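The gesture-driven interactions above can be sketched by polling the detection result each frame. This is a minimal sketch, assuming the plugin's `GestureProvider` static accessors and `GestureType` enum (where a hand that is not detected yields `null`); `PrepareLaser` and `FireLaser` are hypothetical helpers, so verify the names against your SDK version:

```csharp
using UnityEngine;
using ViveHandTracking;

// Minimal sketch: poll the detected right-hand gesture each frame and
// react, as in the Laser interaction of the Sample scene.
public class LaserTrigger : MonoBehaviour {
  void Update() {
    // Assumption: GestureProvider.RightHand is null when the right hand
    // is not detected this frame.
    GestureResult right = GestureProvider.RightHand;
    if (right == null)
      return;

    if (right.gesture == GestureType.Fist)
      PrepareLaser();               // fist: aim/prepare the laser
    else if (right.gesture == GestureType.Five)
      FireLaser();                  // five: trigger the red laser
  }

  void PrepareLaser() { /* hypothetical: show an aiming ray */ }
  void FireLaser()    { /* hypothetical: shoot the red laser */ }
}
```

This is Unity component code and only runs inside a Unity scene with the plugin installed; attach it to any GameObject alongside a configured GestureProvider.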
UISample
scene demos how to use hands with Unity event systems. Supports all platforms and modes. Use pinch to interact with Unity UI. See Using Hand with Unity Event System for details.

CustomGesture
scene demos how to add custom gestures. Only supports skeleton mode. This scene detects the custom gestures Rock and Vive. See Add Custom Gestures in Skeleton Mode for details.

JointRotation
scene visualizes joint rotations. Only supports skeleton mode. The 21 joints are rendered as cubes with three axes.

Test_WaveVR
scene demos how to add the GestureProvider
script to an existing WaveVR prefab. Only supported on Focus/Focus Plus/Focus 3.

ARCore
scene and ARFoundation
scene demo how Google ARCore and ViveHandTracking work together. Only supported on ARCore-supported devices.

Note
Use the ARCore scene if using the Google ARCore SDK, and use the ARFoundation scene if using Unity ARFoundation + ARCore.
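As a rough illustration of what skeleton-mode custom gesture detection can look like, one can inspect the 21 joint positions exposed per hand. This is a sketch under stated assumptions, not the plugin's actual custom gesture API: it assumes `GestureResult.points` holds 21 joint positions with index 0 as the wrist and indices 8 and 12 as the index and middle finger tips (a common hand-skeleton layout); the "finger extended" heuristic is illustrative only, so refer to the CustomGesture scene scripts for the supported approach:

```csharp
using UnityEngine;
using ViveHandTracking;

// Illustrative sketch of a skeleton-mode check: a finger is treated as
// "extended" when its tip is far from the wrist relative to hand size.
public class GestureSketch : MonoBehaviour {
  // Assumed joint indices (verify against the SDK's skeleton layout).
  const int Wrist = 0, IndexTip = 8, MiddleTip = 12;

  void Update() {
    GestureResult left = GestureProvider.LeftHand;
    // Skeleton data is only available in skeleton mode.
    if (left == null || !GestureProvider.HaveSkeleton)
      return;

    Vector3 wrist = left.points[Wrist];
    float indexLen = Vector3.Distance(wrist, left.points[IndexTip]);
    float middleLen = Vector3.Distance(wrist, left.points[MiddleTip]);

    // Hypothetical heuristic: index extended while middle is curled.
    if (indexLen > 1.5f * middleLen)
      Debug.Log("Index finger pointing detected (sketch heuristic)");
  }
}
```

The shipped CustomGesture scene defines its Rock and Vive gestures through the plugin's own custom gesture components rather than hand-written heuristics like this; see Add Custom Gestures in Skeleton Mode for the intended workflow.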
Sample scenes also show how to skip controller calibration in WaveVR. See DisableController.cs
for details.