Unity SDK 2.6.0 can be downloaded here.
The Unity SDK provides compatibility and smooth integration of Maestro with new and existing XR scenes.
This Unity SDK is designed to be tracking-agnostic, though we provide prefab rigs for Ultraleap and Meta Quest 1/2 hand tracking. If you want compatibility with some other tracking configuration, you can try to roll your own rig by following these steps [TODO].
The Unity SDK provides the following prefab rigs for ease of integration:
It is also possible to roll your own configuration using the SDK components available.
Before installing the Unity SDK package, you’ll need to add a scoped registry entry for Ultraleap’s Unity SDK so that the Package Manager can resolve the dependency. You can do this by following the steps listed here: https://developer.leapmotion.com/unity#setup-openupm.
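In practice this means adding a scoped registry entry to your project's Packages/manifest.json. The fragment below reflects our understanding of Ultraleap's OpenUPM setup (registry URL and scope) — confirm the exact values against the linked instructions:

```json
{
  "scopedRegistries": [
    {
      "name": "Ultraleap",
      "url": "https://package.openupm.com",
      "scopes": [
        "com.ultraleap"
      ]
    }
  ]
}
```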
You can grab a tarball of our latest Unity SDK Package here [TODO]. You can add this package using the Package Manager:
Once you’ve installed the package, you can navigate to the “Samples” section within the Package Manager listing to import the Sandbox sample scene:
The sandbox scene has a variety of interactions including picking up objects, pushing buttons, and finger painting.
In order to run the scene, first ensure that you have enabled the rig that corresponds to your target platform and XR setup. There are currently two rigs: one for Ultraleap hand tracking support and another for Meta Quest 1/2 hand tracking support.
The easiest way to add Maestro compatibility to a Unity scene is by utilizing the provided prefabs: MaestroUltraleapRig and MaestroOculusRig. After installing from a tarball, these can be found in the Packages folder under Contact CI Unity SDK/Runtime/Core/Prefabs/. Simply drag the appropriate prefab into the hierarchy.
By default, any collidable object in the scene provides haptics when touched. The default haptics can be configured via the MaestroManager component, which can be found on the root of the provided prefabs. Under Default Interaction Profile, you can configure the default haptics for the scene:
Grab Type - Currently only Arcade is available; more to come soon!
Interactables Only - Disables these default haptic settings. Only objects that have a MaestroInteractable will have haptics when touched.
Amplitude - Controls the strength of force feedback. The value ranges from 0 (no pull) to 255 (full force feedback).
Vibration Effect - The vibration effect to be applied.
Vibration Options - Options corresponding to the selected vibration effect. Typically the strength of the effect.
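These defaults can also be adjusted from code at startup. The sketch below assumes the profile fields are exposed under names matching the Inspector labels — the actual MaestroManager member names may differ, so treat this as illustrative only:

```csharp
using UnityEngine;

// Hypothetical sketch: adjusts the scene-wide default haptics at startup.
// The member names (defaultInteractionProfile, interactablesOnly, amplitude)
// mirror the Inspector labels but are assumptions about the Maestro API.
public class DefaultHapticsSetup : MonoBehaviour
{
    void Start()
    {
        var manager = FindObjectOfType<MaestroManager>();
        if (manager == null) return;

        // Restrict default haptics to objects with a MaestroInteractable.
        manager.defaultInteractionProfile.interactablesOnly = true;

        // Half-strength force feedback (0 = no pull, 255 = full force).
        manager.defaultInteractionProfile.amplitude = 128;
    }
}
```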
To customize the haptics for a given collidable GameObject, utilize the MaestroInteractable script. The fields available will look very similar to those found on MaestroManager (documented here).
You can also control how an object can be manipulated using the Type field under Configuration:
Static - Unable to be picked up
One hand grab - Can be picked up with either hand
Two hand - Requires both hands to pick up
This script also defines a few events that may be used to easily create complex interactions:
On Grab - Called after the object is grabbed
On Release - Called after the object is released
On Touch(FingerCollider) - Called when a finger touches the object
Un Touch(FingerCollider) - Called when a finger stops touching the object
While Touch(FingerCollider) - Called while a finger touches the object
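These events can be wired up from the Inspector or from code. A minimal sketch, assuming the events are exposed as UnityEvents with members named onGrab and onRelease (the exact member names are assumptions — check the MaestroInteractable script):

```csharp
using UnityEngine;

// Hypothetical sketch: plays a sound while a MaestroInteractable is held.
// The event member names (onGrab, onRelease) are assumptions; verify them
// against the actual MaestroInteractable script.
[RequireComponent(typeof(MaestroInteractable), typeof(AudioSource))]
public class GrabSound : MonoBehaviour
{
    void Awake()
    {
        var interactable = GetComponent<MaestroInteractable>();
        var audioSource = GetComponent<AudioSource>();

        interactable.onGrab.AddListener(() => audioSource.Play());
        interactable.onRelease.AddListener(() => audioSource.Stop());
    }
}
```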
Note that the DLLs used by our SDK are 64-bit only. As such, if the Unity platform settings are set to Any CPU or x86, execution will halt.
To address this, find all DLLs under Contact CI Unity SDK/Runtime/Core/_DLLs and set each DLL's CPU setting to x86_64. If you are performing builds, this also ensures the DLLs are included in 64-bit builds and excluded from 32-bit builds.
You may also have to enable Load on startup under Plugin load settings. The Unity editor will have to be restarted for this to take effect.
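If you have many DLLs to update, the CPU setting can also be applied in bulk from an editor script instead of clicking through each importer. A sketch using Unity's PluginImporter API (the folder path matches the one above; verify the results in each DLL's Inspector afterwards):

```csharp
using UnityEditor;

// Editor utility: marks every plugin under the Maestro DLL folder as
// x86_64-only, both for the editor and for standalone builds.
public static class MaestroDllFixer
{
    [MenuItem("Tools/Maestro/Set DLLs to x86_64")]
    static void SetCpuToX64()
    {
        foreach (PluginImporter importer in PluginImporter.GetAllImporters())
        {
            if (!importer.assetPath.Contains("Contact CI Unity SDK/Runtime/Core/_DLLs"))
                continue;

            // Editor: only load in a 64-bit editor.
            importer.SetEditorData("CPU", "x86_64");

            // Builds: include in 64-bit players, exclude from 32-bit ones.
            importer.SetPlatformData(BuildTarget.StandaloneWindows64, "CPU", "x86_64");
            importer.SetCompatibleWithPlatform(BuildTarget.StandaloneWindows, false);

            importer.SaveAndReimport();
        }
    }
}
```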
My .apk doesn’t have hand tracking! Starting the app gives a “Switch to Controllers” prompt.
In the headset itself, be sure you have hand tracking enabled in your settings, under Hands and Controllers.
In Unity, make sure your Oculus project settings don’t have Hand Tracking Support set to Controllers Only; setting it to Hands Only is recommended. These settings can be found on the main OVRManager component in the scene or under
You need to add the scoped registry for Ultraleap hand tracking support; see Setup above.
You need to import the Oculus Integration for full Quest hand tracking support.
Please contact us at firstname.lastname@example.org with any questions.