Body Tracking

The VIVE Wave™ XR plugin provides the Body Tracking feature in the Wave XR Plugin and Essence packages (refer to Wave XR Plugin Packages).

Note

  1. VIVE Wave™ Body Tracking supports VRM format avatars and depends on the UniVRM plugin.
  2. If your project does NOT contain the UniVRM plugin, you have to import the UniVRM VRM-0.109.0_7aff.unitypackage before importing the Body Tracking package.
  3. You must use Unity Editor 2020.3 or newer, as required by the VRM-0.109.0_7 plugin.

You can import the Body Tracking package from Project Settings > Wave XR > Essence. The source code and sample will be located at Assets > Wave > Essence > BodyTracking.

../_images/UnityXRBodyTracking01.png

Environment Settings

  1. Go to Project Settings > Player > Other Settings > Configuration. Set the Scripting Backend to “IL2CPP” and Target Architectures to “ARM64”.
../_images/UnityXRBodyTracking02.png
  2. If you use an assembly definition in your project, you need to add the following references to your .asmdef file.
../_images/UnityXRBodyTracking03.png

Device Role and Tracking Mode

In Body Tracking, each input device has a role. By using different Role Mappings, you can apply different Tracking Modes to your avatar.

There are two mapping rules.

Rule 1

Refer to the roles used in Body Tracking Rule 1 below.

../_images/UnityXRBodyTracking04.png

Rule 1 - Role Assignment

The devices used for each role are listed below.

Role       Device
HEAD       HMD (VIVE Focus3 or XR Elite).
WRIST      VIVE Wrist Tracker or the upcoming VIVE self-tracker.
HANDHELD   Both controllers.
HAND       Both hands.
HIP        Upcoming VIVE self-tracker.
ANKLE      Upcoming VIVE self-tracker.

The HMD, VIVE Wrist Tracker, controllers and hands are assigned roles automatically.

The upcoming VIVE self-tracker must be assigned a role manually.

If you have VIVE self-trackers, wear them as shown in the illustration below.

../_images/UnityXRBodyTracking08.png

Rule 1 - Tracking Mode

Different tracking modes use different roles. Refer to the Tracking Mode Role Mappings of Body Tracking Rule 1 below.

../_images/UnityXRBodyTracking05.png

Rule 1 - Example

To briefly summarize the role mappings, Arm Tracking can use any of the role combinations below.

  1. Head + Wrist or
  2. Head + Handheld or
  3. Head + Hand.

After reviewing Rule 1 - Role Assignment, you will find that Arm Tracking can use the following device combinations:

  1. Head + VIVE Wrist Tracker or
  2. Head + Both controllers or
  3. Head + Both hands.

Upper Body Tracking then uses the Arm Tracking roles (devices) plus the HIP role (upcoming self-tracker).

Similarly, Full Body Tracking uses the Upper Body Tracking roles (devices) plus the ANKLE role (upcoming self-tracker).

Rule 2

Refer to the roles used in Body Tracking Rule 2 below.

../_images/UnityXRBodyTracking06.png

Rule 2 - Role Assignment

The devices used for each role are listed below.

Role       Device
HEAD       HMD (VIVE Focus3 or XR Elite).
HANDHELD   Both controllers.
HAND       Both hands.
HIP        Upcoming VIVE self-tracker.
KNEE       Upcoming VIVE self-tracker.
ANKLE      Upcoming VIVE self-tracker.

The HMD, VIVE Wrist Tracker, controllers and hands are assigned roles automatically.

The upcoming VIVE self-tracker must be assigned a role manually.

Rule 2 - Tracking Mode

Refer to the Tracking Mode Role Mappings of Body Tracking Rule 2 below.

../_images/UnityXRBodyTracking07.png

Rule 2 - Example

For example, you can use “Upper Body + Leg” tracking with the role combinations below.

  1. Head + Both controllers + Hip + Knee + Ankle or
  2. Head + Both hands + Hip + Knee + Ankle.

Usage

After importing the Body Tracking package, you can find the source code and sample at Assets > Wave > Essence > BodyTracking.

VIVE Wave™ Body Tracking provides several methods to apply tracking poses to your avatar. We introduce the Humanoid and Human Body Tracking usages here.

Note

Before using Body Tracking, you have to set the avatar to T-Pose as shown below.

../_images/UnityXRBodyTracking19.png

Note

You have to add the Body Manager component to a gameObject in your scene to control the body tracking process.

Note

You have to set TrackingOriginModeFlags to Floor. The plugin assumes a human's height is at least 0.5 m. Refer to XR Rig for the Rig setup.
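
If your rig does not already configure the tracking origin, a minimal sketch for requesting the Floor origin through Unity's XRInputSubsystem is shown below. Whether the request is honored depends on the runtime; this snippet is an illustration rather than part of the Body Tracking package.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class SetFloorTrackingOrigin : MonoBehaviour
{
    void Start()
    {
        // Ask every active XR input subsystem to use the Floor tracking origin.
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetInstances(subsystems);
        foreach (var subsystem in subsystems)
        {
            // TrySetTrackingOriginMode returns false if the runtime rejects the mode.
            bool ok = subsystem.TrySetTrackingOriginMode(TrackingOriginModeFlags.Floor);
            Debug.Log("TrySetTrackingOriginMode(Floor) returned: " + ok);
        }
    }
}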

The simplest usage steps are summarized below.

  1. Add the Body Manager component to a gameObject (usually the Rig) in your scene manually.
  2. Import the VRM format avatar by using Humanoid.
  3. Add the Humanoid Tracking component to the avatar gameObject.
  4. Write code to call HumanoidTracking.BeginTracking() at runtime (see the sketch after this list).
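
As a minimal sketch of step 4, the component below starts and stops tracking from script. It assumes HumanoidTracking.BeginTracking() and StopTracking() take no arguments and that the component lives in the Wave.Essence.BodyTracking namespace; adjust the namespace and calls to match the imported package if they differ.

using UnityEngine;
using Wave.Essence.BodyTracking; // assumed namespace of the Body Tracking package

public class BodyTrackingStarter : MonoBehaviour
{
    [SerializeField] private HumanoidTracking humanoidTracking; // the component on your avatar

    // Hook this to a UI button or call it once all input devices are ready.
    public void Begin()
    {
        if (humanoidTracking != null)
            humanoidTracking.BeginTracking(); // starts tracking, including calibration
    }

    public void Stop()
    {
        if (humanoidTracking != null)
            humanoidTracking.StopTracking();
    }
}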

Humanoid

The common way to use an avatar in Unity is Humanoid.

../_images/UnityXRBodyTracking09.png

There are two file formats used in Humanoid: VRM0 and VRM1.

The VRM1 assets will be imported together with the Body Tracking package and you can find the assets at

  • Assets > UniGLTF
  • Assets > VRM10
  • Assets > VRMShaders.

If you need to use VRM0 format avatars, import the package UniVRM-0.109.0_7aff.unitypackage.

Follow the steps below to apply Humanoid Tracking to your avatar.

  1. Import the VRM avatar (only needed for VRM0)

Select the option Menu > VRM0 > Import from VRM 0.x.

../_images/UnityXRBodyTracking10.png

Select your avatar file and choose the import location. Your avatar will become a prefab. For example:

../_images/UnityXRBodyTracking11.png
  2. Add the avatar to the scene

Drag the avatar.prefab to your scene.

../_images/UnityXRBodyTracking12.png
  3. Use the “Humanoid Tracking” component

Add the component “Humanoid Tracking”.

../_images/UnityXRBodyTracking13.png

Configure the settings.

../_images/UnityXRBodyTracking14.png
  • Avatar Offset: Leave it empty if you do not need to move the avatar (e.g., for teleporting).

When you select Custom Settings, you can configure the avatar’s height instead of using the default height (head height minus toe height).

../_images/UnityXRBodyTracking16.png
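
As an aside, the default height described above (head height minus toe height) can be approximated from a Humanoid rig as sketched below. The helper is purely illustrative and not part of the plugin; it simply measures the head and toe bone positions of the avatar in T-Pose.

using UnityEngine;

public static class AvatarHeightUtil
{
    // Approximate the default avatar height: head bone height minus toe bone height.
    public static float EstimateHeight(Animator humanoid)
    {
        Transform head = humanoid.GetBoneTransform(HumanBodyBones.Head);
        Transform toes = humanoid.GetBoneTransform(HumanBodyBones.LeftToes);
        if (toes == null) // fall back to the foot bone if the rig has no toe bone
            toes = humanoid.GetBoneTransform(HumanBodyBones.LeftFoot);
        return head.position.y - toes.position.y;
    }
}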

Please refer to the Calibration section for details about the Content Calibration option.

Tracked Device Extrinsics

Wondering how Body Tracking works?

We use different devices to represent different roles of the avatar, e.g., a VIVE Wrist Tracker represents the avatar’s wrist.

However, there is a “gap”: the VIVE Wrist Tracker pose is NOT the real wrist pose, in both translation and orientation.

We have to consider the “offset”, which we call the “extrinsic”, between the VIVE Wrist Tracker and the wrist.

../_images/UnityXRBodyTracking15-1.png ../_images/UnityXRBodyTracking15-2.png

When you expand the Tracked Device Extrinsics menu, you will see the default extrinsics provided by the VIVE Wave™ Body Tracking plugin.

You can customize the extrinsics to match your avatar more accurately.
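
To make the idea concrete, the sketch below shows how such an extrinsic, a fixed device-to-joint offset expressed in the device's local frame, would be composed with a tracked device pose. The names are illustrative and not the plugin's API.

using UnityEngine;

public static class ExtrinsicExample
{
    // Compose a device pose (world space) with an extrinsic (device-local offset)
    // to estimate the joint pose, e.g. VIVE Wrist Tracker -> wrist.
    public static void ApplyExtrinsic(
        Vector3 devicePos, Quaternion deviceRot,       // tracked device pose
        Vector3 extrinsicPos, Quaternion extrinsicRot, // offset from device to joint
        out Vector3 jointPos, out Quaternion jointRot)
    {
        jointPos = devicePos + deviceRot * extrinsicPos;
        jointRot = deviceRot * extrinsicRot;
    }
}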

Human Body Tracking

The VIVE Wave™ Body Tracking plugin provides another method named Human Body Tracking to apply tracking poses to an avatar.

If your avatar is NOT imported in Humanoid format, you have to configure the avatar joints manually.

Note

Human Body Tracking is ONLY useful when your avatar is human-like; otherwise you may have to customize the offsets joint-by-joint in code.

Human Body Tracking provides a total of 26 avatar joints, as shown below (in OpenGL coordinates).

../_images/UnityXRBodyTracking17.png

The plugin already transforms the joint poses to Unity coordinates.

Sample Code

You can find the Human Body Tracking sample code at Assets > Wave > Essence > BodyTracking > {version} > Demo > Scripts > BodyTrackingSample.cs.

../_images/UnityXRBodyTracking18.png

In the sample code, we apply the joint poses in three steps.

  1. Change the scale of the avatar gameObject.
     ApplyBodyScale(scale);
  2. Update the position and rotation of the avatar root (Hip).
     avatarBody.Update(JointType.HIP, ref inputBody.root);
  3. Update the rotation of the other joints in order.

Calibration

Before starting the body tracking, you have to calibrate the tracked devices such as the HMD, controllers, hands and VIVE Wrist Tracker.

When the Content Calibration option is enabled, HumanoidTracking.BeginTracking() will call BodyManager.SetStandardPose() after 3 seconds to run the calibration process.

So when you call HumanoidTracking.BeginTracking() to start body tracking at runtime, you have to stand in the Standard Pose illustrated below within 3 seconds.

../_images/UnityXRBodyTracking20.png

The Standard Pose looks like you are standing up straight and pushing against a wall with your hands extended.

The calibration process completes quickly. If you cannot retrieve the IK pose after calibration, first check whether all input device poses are available and correct.

In our demo scene Assets > Wave > Essence > BodyTracking > {version} > Demo > BodyTracking.unity, HumanoidTracking.BeginTracking() will be called when you click the Begin button.

../_images/UnityXRBodyTracking21.png

The demo process is simple:

  1. Select a tracking mode (yellow button) depending on the roles you use, e.g., choose Arm if you use only the HMD and controllers.
../_images/UnityXRBodyTracking05.png ../_images/UnityXRBodyTracking07.png
  2. Click the Begin button.
  3. Face the avatar and stand in the Standard Pose within 3 seconds.
  4. Click the Stop button if you want to change the tracking mode or stop tracking.

If the avatar pose looks wrong, repeat steps 4 -> 2 -> 3.

Note

It is necessary to confirm that the input devices are available before starting body tracking.
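
One way to perform this check with Unity's XR input API is sketched below. It only verifies that connected devices report valid poses; it is not part of the Body Tracking package, and you may want to narrow the check to the devices your chosen tracking mode actually needs.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public static class InputDeviceCheck
{
    // Returns true when at least one XR input device is connected and every
    // connected device is valid and reports a position.
    public static bool AllDevicesReady()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevices(devices);
        if (devices.Count == 0)
            return false;

        foreach (var device in devices)
        {
            if (!device.isValid)
                return false;
            if (!device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 _))
                return false;
        }
        return true;
    }
}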

Body Tracking API

In the Body Tracking feature package, we provide two components: Humanoid Tracking and Body Manager. Their APIs are listed below.

Humanoid Tracking

  • BeginTracking: To start a body tracking process, including calibration.
  • StopTracking: To stop a body tracking process.

Body Manager

Body Manager is the required component that controls body tracking processes and provides avatar joint data. By adding the Body Manager component to your scene, the Tracker Manager (see Tracker) is also used automatically.

There are different modes defined for Body Tracking; currently only 0 (Arm), 1 (Upper Body), 2 (Full Body) and 3 (Upper Body + Leg) are supported (see Device Role and Tracking Mode).

public enum BodyTrackingMode : Int32
{
    UNKNOWNMODE = -1,
    ARMIK = 0,
    UPPERBODYIK = 1,
    FULLBODYIK = 2,

    UPPERIKANDLEGFK = 3, // controller or hand + hip tracker + leg fk
    SPINEIK = 4,    // used internal
    LEGIK = 5,    // used internal
    LEGFK = 6,    // used internal
    SPINEIKANDLEGFK = 7, // hip tracker + leg fk
    MAX = 0x7fffffff
}
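
As an illustration of how these modes line up with the role mappings in Device Role and Tracking Mode, a helper like the one below could pick a supported mode from the trackers you have. It is not part of the plugin API; the selection logic simply mirrors the Rule 1 and Rule 2 tables.

public static class BodyTrackingModeSelector
{
    // Illustrative only: choose a supported BodyTrackingMode from the roles you can provide.
    public static BodyTrackingMode Choose(bool hasHipTracker, bool hasKneeTrackers, bool hasAnkleTrackers)
    {
        if (hasHipTracker && hasKneeTrackers && hasAnkleTrackers)
            return BodyTrackingMode.UPPERIKANDLEGFK; // Upper Body + Leg (Rule 2)
        if (hasHipTracker && hasAnkleTrackers)
            return BodyTrackingMode.FULLBODYIK;      // Full Body (Rule 1)
        if (hasHipTracker)
            return BodyTrackingMode.UPPERBODYIK;     // Upper Body (Rule 1)
        return BodyTrackingMode.ARMIK;               // Arm: Head + Wrist/Handheld/Hand only
    }
}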
  • SetStandardPose: To configure the standard pose according to currently available input devices.

  • CreateBodyTracking: There are three overloads of the CreateBodyTracking function.

    After CreateBodyTracking you will retrieve an ID that represents a body tracking process.

    You may need to provide the avatar information and device extrinsics and specify a BodyTrackingMode, depending on which overload of CreateBodyTracking you use.

  • GetDefaultRotationSpace: This function is available only when using the simplest CreateBodyTracking without avatar information and device extrinsics.

    You can retrieve the rotation spaces of the plugin’s default avatar joints (in T-Pose) after CreateBodyTracking finishes successfully.

    When you set your avatar to T-Pose, you can compare the rotation spaces of your custom avatar and the default avatar (sample code in BodyTrackingSample.cs).

  • GetBodyTrackingInfo: Retrieves the calibrated avatar’s height and scale.

    When using the simplest CreateBodyTracking, the scale is always 1 and the height is used to calculate the new avatar scale (sample code in BodyTrackingSample.cs).

    When using CreateBodyTracking with avatar information, we can retrieve the calibrated scale of the avatar (sample code in HumanoidTracking.cs).

  • GetBodyTrackingPoseOnce: Retrieves the avatar joint poses at any time. Note that we usually call this function once per frame.

  • StartUpdatingBodyTracking, GetBodyTrackingPoses and StopUpdatingBodyTracking: These three functions are paired.

    After calling StartUpdatingBodyTracking with an ID retrieved from CreateBodyTracking, the avatar joint poses will be updated every frame.

    Call GetBodyTrackingPoses with the ID to retrieve the avatar joint poses.

    Call StopUpdatingBodyTracking to stop updating the poses.

FAQ

Compile error after importing the Body Tracking package

Please use Unity 2020.3 LTS or newer.

Library not found

The VIVE Wave™ Body Tracking library is built for arm64. Confirm your APK is built with IL2CPP and arm64.

Invalid HMD height

Set your tracking origin to Floor.

The VIVE Wave™ plugin provides the Wave Rig prefab at Packages > VIVE Wave XR Plugin - Essence > Runtime > Prefabs. Refer to XR Rig for its usage.

Cannot move the avatar

In HumanoidTracking.cs and the other sample code, we apply the joint poses in world space.

Modifying the Avatar Offset of Humanoid Tracking changes the avatar’s pose (both translation and orientation) at runtime.

../_images/UnityXRBodyTracking14.png
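
Building on the FAQ entry above, the sketch below shows one way to move the avatar at runtime: assign a Transform as the Avatar Offset in the Humanoid Tracking inspector, then move that Transform from script. The component and field names here are hypothetical illustrations, not plugin API.

using UnityEngine;

public class TeleportAvatarOffset : MonoBehaviour
{
    [SerializeField] private Transform avatarOffset; // the Transform assigned as Avatar Offset

    // Teleport the avatar by moving the Avatar Offset Transform.
    public void TeleportTo(Vector3 worldPosition, Quaternion worldRotation)
    {
        avatarOffset.SetPositionAndRotation(worldPosition, worldRotation);
    }
}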