WebXR Pass-through on Quest

What is pass-through?

With pass-through, you can take a break from your virtual reality experience and get a real-time glimpse of your surroundings. The sensors and cameras on your headset approximate what you would see if you could look straight through the front of the device. Pass-through appears automatically when you’re setting up or adjusting your Guardian, and some apps use it to blend your real and virtual environments. That blending is what makes pass-through interesting: it brings VR devices closer to mixed reality devices like the Microsoft HoloLens. And, most importantly, it can be used from WebXR as well.

The Meta Quest 2 can display pass-through in black and white, while the Quest Pro can display it in color. Unfortunately, I don’t own a Quest Pro (yet), so the examples below are in black and white.

How to enable pass-through

When creating a WebXR VR session in your JavaScript app, you’d normally call the requestSession function on navigator.xr and pass it the immersive-vr parameter. To get pass-through in your app on the Quest, change that parameter to immersive-ar. The code might look something like this:

let xrSession = await navigator.xr.requestSession("immersive-ar");

Because you are running this on a website, you never know which device is trying to enter AR mode, or whether it supports it at all. To check whether immersive-ar is supported, you can call isSessionSupported before requesting the session.

if (await navigator.xr.isSessionSupported("immersive-ar")) {
  let xrSession = await navigator.xr.requestSession("immersive-ar");
  // do XR stuff
} else {
  // show a message
}

Here’s a nice example of this feature running: https://cabanier.github.io/webxr-samples-1/immersive-ar-session.html

If you’ve ever built a WebXR AR app for Android, this code might look familiar. That’s because it is exactly the same. It’s one of the great things about WebXR: if the device supports a feature, the same code just works.
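Because of that, you can write one entry point that prefers an AR (pass-through) session and falls back to plain VR when it isn’t available. Here’s a minimal sketch of that idea; the function and variable names are just illustrative:

async function startXR() {
  // Prefer pass-through (AR); fall back to a regular VR session.
  const arSupported = await navigator.xr.isSessionSupported("immersive-ar");
  const mode = arSupported ? "immersive-ar" : "immersive-vr";

  const session = await navigator.xr.requestSession(mode);
  const refSpace = await session.requestReferenceSpace("local");

  // Set up your WebGL layer and render loop here, exactly as you
  // would for an immersive-vr session.
  return { session, refSpace, mode };
}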

How to create a WebXR AR app in A-Frame

If you are creating an A-Frame app, getting pass-through to work is even simpler: you have to do nothing :) A-Frame checks whether your device supports the immersive-ar mode and, if it does, shows a button to enter AR.
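A stripped-down scene is enough to see this in action; there is no AR-specific markup in it. The A-Frame version in the script URL below is just an example, and I’ve left out an opaque <a-sky> so the background stays transparent in AR:

<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
    </a-scene>
  </body>
</html>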

For example, you can try running this sample from your Quest to see it work: https://aframe.io/examples/showcase/helloworld/

How to enable it in Wonderland Engine

Wonderland Engine also comes with the ability to enable pass-through mode. To enable it, you’ll have to change two properties in the project settings. The first is to enable AR mode, which can be done in the VR & AR section: in the framework dropdown, under AR, select webxr.

Having this set enables the AR button at runtime if your device supports it. By default, the scene has a grey background; to see the real world, you’ll have to make it transparent. To do this, go to the Rendering section and change the A (alpha channel) of the clearColor to 0.

When you run the application on the Quest, your scene is rendered on top of the pass-through view of your surroundings.

Where to go next?

Viewing 3D models in AR can be fun by itself, but it would be nicer to be able to place them in the real world, and maybe even anchor them so they persist in a specific location. While this is possible, it is too much for this tutorial. So stay tuned for that ;)

VR Controllers with Unity's XR Interaction Toolkit, Part 2

In the last tutorial, I explained how to set up models to use as the hand visualization with Unity’s XR Interaction Toolkit. Today, I want to add a bit of animation to the hands. While working on this, I ran into an issue with the hand from the previous post: although it seemed to be rigged, the rig didn’t work, so I could not animate the fingers to curl the hand into a fist. With the cyber sales going on in the Unity Asset Store, I finally picked up the VR Hand Models Mega Pack, which had been on my wishlist for quite some time. So, for the rest of the tutorial I’ll be using a hand from this pack (but any other correctly rigged hand will work).

Create the controller states

First, we need two animation states per hand: an open hand and a fist. If your model doesn’t come with these, you’ll have to tweak and fiddle with the hand a bit to add a single-frame animation with all the fingers curled into a fist. The pack I used came with a whole bunch of different hand states, so I used those.

Next, we create an animation controller, open it in the Animator window, and add the two animations to it. I’ve renamed one state to ‘Fist’ and the other to ‘Open’, set ‘Open’ as the default state, and added two transitions back and forth between ‘Fist’ and ‘Open’.

To make the XR Interaction Toolkit understand all of this, we need to add two trigger parameters, ‘Grab’ and ‘Open’, by pressing the little + on the ‘Parameters’ tab of the Animator window.

Now, to make the transitions respond to the triggers, select each transition and set its condition: use the ‘Open’ trigger on the transition going to the ‘Open’ state and the ‘Grab’ trigger on the transition going to the ‘Fist’ state.

That’s it for the first hand. I did it all again for the second hand; there may be an easier way, but simply repeating the steps seemed the least complex.

Also, make sure to add an Animator component to the hand prefabs and assign the animation controllers we just created to them. Otherwise, it won’t work, #BTDT.

The last part of the puzzle is to let the controller scripts know that we want this. To do this, select the LeftHand Controller below the ‘XR Origin’ in the scene and find the ‘Animate Model’ section on the XR Controller script. Enable it and fill in the names of the two triggers we created earlier: ‘Grab’ for the select transition and ‘Open’ for the deselect transition.
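For context, the ‘Animate Model’ option essentially fires those animator triggers for you when the controller selects and deselects. If you ever want to drive the animation yourself, for example straight from the grip button, a rough sketch could look like this. The component and field names are my own, not part of the toolkit:

using UnityEngine;
using UnityEngine.XR;

// Hypothetical helper: fires the 'Grab'/'Open' triggers based on the grip button.
// The XR Interaction Toolkit's 'Animate Model' option does this for you;
// this is only a sketch of the idea.
[RequireComponent(typeof(Animator))]
public class HandGripAnimator : MonoBehaviour
{
    public XRNode hand = XRNode.RightHand; // which controller to read

    Animator animator;
    bool wasGripped;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        var device = InputDevices.GetDeviceAtXRNode(hand);
        if (device.TryGetFeatureValue(CommonUsages.gripButton, out bool gripped)
            && gripped != wasGripped)
        {
            animator.SetTrigger(gripped ? "Grab" : "Open");
            wasGripped = gripped;
        }
    }
}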

When you run the application now, you should see the hand models in the place of your controllers, and when you grab, you should see the hands close.

Everything might be correct now, or things might be off, like they were in my case: the rotation was wrong and the animation speed felt weird.

Extra tweaks to make it feel better

If the rotation and location are off, you can add an extra game object as a child of the LeftHand Controller and of the RightHand Controller (I’ve named them LH Parent and RH Parent). These become the parents of the hands, meaning you can use them to tweak the transform a bit. You’ll need to go back and forth between VR and Unity to get these right for your situation. Make sure to drag each parent to the ‘Model Parent’ property of the corresponding XR Controller script.

In my case, I had to change the X-rotation to 90 and move the Y-position to 0.07.

Another thing I tweaked is the duration of the transitions between ‘Open’ and ‘Fist’, which felt way too slow at first. After some fiddling with the values, I settled on a transition duration of 0.05. You can find this setting on the transitions in the animation controllers.

VR Controllers with Unity's XR Interaction Toolkit

A few weeks ago I received my prize for winning the WebXR category in this year’s JS13K. This prompted me to look into VR in Unity again. And, with an eye on MRTK3 hopefully arriving someday, I thought it would be nice to learn more about the XR Interaction Toolkit. This tutorial is mostly documentation for myself, but it may help someone else as well.

I started out with a basic project with OpenXR set up and the XR Interaction Toolkit package, version 2.2.0, installed from the Unity Package Manager.

In the new scene, I added a cube to have something to see when in VR and deleted the original Camera. To get everything to work in VR, I added an ‘XR Origin (VR)’ object. The VR version of this already has two controllers set up.

To add the hands, you’ll need to model something yourself or download a model. For this tutorial, I downloaded one. It’s easiest if it is a rigged model, with bones and such, so it can be animated easily later on. I found a nice hand model on Sketchfab (https://skfb.ly/6WWwQ), downloaded it, and added it to Unity. I created an empty game object, called it ‘RightHand’, and added the downloaded model as a child of it.

To make the hand show up, it needs to be a prefab. So, drop the ‘RightHand’ game object into your prefab folder to create a prefab and remove it from the scene. Now it needs to be attached to the controller. Find the ‘RightHand Controller’ in the scene. It has a script on it called ‘XR Controller (Device Based)’ with a ‘Model Prefab’ property in the Model section. Assign the newly created ‘RightHand’ prefab to this property.

Now you should be able to test it in VR, and you should have a hand somewhere that responds to the movement of the right controller. It might be floating somewhere far away or be REALLY big.

This is still only the right hand, and the origin and scale are probably way off. Fixing that means a lot of going back and forth between VR and Unity. A small trick I use is to peek through the opening next to my nose and look at my hand: my real hand holding the right controller needs to match the size and location of the hand in VR. This might take a lot of tweaking to get right.

When the right hand was done, I created a prefab variant of it called ‘LeftHand’ and set its X scale to -1 to flip it. After this, I assigned this prefab to the ‘Model Prefab’ property of the ‘LeftHand Controller’ game object.

That’s it for this part. The next step will be to add an animation to close the hands when the trigger is pressed.


Credits: “VR Hand” (https://skfb.ly/6WWwQ) by Tauffiq Abdllah is licensed under Creative Commons Attribution (http://creativecommons.org/licenses/by/4.0/).