Platform support

Happy New Year! Over the last few days I've been testing how this Unity scene runs on a Meta Quest 2 vs. a Valve Index. This is only an initial, exploratory effort to reduce the friction of developing for multiple devices. I'm mostly concerned with text readability because of the whole text karaoke aspect. If you're using the Oculus-specific plugins and such in Unity, then you get access to all sorts of nice features, including compositor layers, for better clarity on text.

However, if you move away from OpenXR and develop with the Oculus or SteamVR plugins directly, everything gets a little more complicated. The whole point of using OpenXR is to avoid some of the pain points of developing for multiple runtimes. Because of a push to get things ready for a demo at Meaningful Play (a post for another day), the entire focus had been on getting this to run standalone on a Meta Quest 2. Now that the conference has passed, I've wanted to spend some time checking how it runs elsewhere. Not to mention, the ability to rapidly test iterations without deploying to a device is invaluable, so the motivation is there just to aid in debugging.

So with the Unity project configured to use OpenXR, I've been able to run the scene under Play mode in the Unity Editor with both the Valve Index (via SteamVR on Windows, Linux TBD) as well as the Meta Quest 2 (via Oculus Link on Windows). The Unity project's Android configuration is also set to use OpenXR and utilize the OculusXR and Meta Quest Features. Deploying this and running it on a Quest 2 has still worked, which is very encouraging.
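For reference, the package side of that configuration is small. A minimal sketch of the relevant `Packages/manifest.json` dependencies might look like the following — the package names are Unity's real XR packages, but the version numbers here are illustrative, not the project's actual pins:

```json
{
  "dependencies": {
    "com.unity.xr.management": "4.3.3",
    "com.unity.xr.openxr": "1.6.0"
  }
}
```

The per-platform details (which interaction profiles and feature groups are enabled, e.g. the Meta Quest features on Android) live in the XR Plug-in Management project settings rather than in the manifest.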

Woe de plate-forme*

That about summarizes the good parts of using OpenXR. Now here's some of the bad:

Good things come to those who wait, and evidently compositor layer support is coming to OpenXR at some point. Until then, I would need to switch to using OVR for the provider, which means giving up the use of OpenXR conveniences. The same goes for using anything specific to SteamVR. For instance, the HTC Vive Tracker 3.0 devices work well with the Valve Index base stations – hello low-cost mocap! But these trackers aren't available by default in OpenXR. Valiant efforts have been made to consolidate tracking features from OpenXR and the SteamVR plugin for Unity, but the unofficial solutions given in the thread did not work for me after a night spent trying, which is discouraging. I've come across a couple of other threads of people dealing with similar issues and frustrations at the lack of support. So for the moment, further attempts to supplement lower body IK with additional trackers are on pause.

Speaking of lower body movement, Meta's Movement SDK is another thing we currently miss out on by sticking with OpenXR. We gave it a shot soon after release, but came up short: we never got anything usable out of the Body Tracking SDK.

We followed the release notes as best as we could, but no luck. After that, I figured it was worth waiting until there were more examples floating around, which thankfully didn't take too long to appear. Now that these are out, I'll give it another try soon to see how things stand. The real win would be getting reasonable lower body tracking on a Quest 2.

Of course, these things rely on features specific to the Oculus Utilities plugin, which means supporting them comes at a project management cost. So here's the short list of the things that have been interesting and/but troublesome to support in the same context:

  • Body and hand tracking from the Meta Movement SDK (Quest platforms)
  • Vive trackers (SteamVR and eventually OpenXR)
  • Compositor layers for text readability (Oculus and eventually OpenXR)

What's next?

I'm aspiring to have this game supported across a range of devices. This won't solve the problems that came out of the Kinect's limited longevity, but it will mitigate them as much as possible and ideally make it easier to support newer, better devices as they emerge. OpenXR goes a long way toward making this possible, but the tradeoff involves waiting for certain things. When it comes to delivering the best experience on each specific headset, Unity's advice is to handle it with separate prefabs and/or scenes that cater to that platform's needs. Which... works, but feels a little bad?
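The per-platform prefab pattern can be sketched roughly like this. Everything here is hypothetical — the class, prefab names, and Resources path are placeholders, not the project's actual assets — it just shows the shape of branching on build target at startup:

```csharp
using UnityEngine;

// Hypothetical sketch of per-platform rig selection at startup.
// Prefab names and the Resources path are placeholders.
public class PlatformRigLoader : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        // Standalone Quest build: rig tuned for mobile rendering.
        var prefab = Resources.Load<GameObject>("Rigs/Rig_Quest");
#else
        // PC VR (Quest Link / SteamVR): rig with desktop-quality settings.
        var prefab = Resources.Load<GameObject>("Rigs/Rig_PCVR");
#endif
        Instantiate(prefab);
    }
}
```

This handles scene-level differences cleanly enough, but as noted, it can't touch anything that lives in project settings.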

I'm not very far off from following this pattern, but it doesn't solve everything. For instance, if I wanted to use Vive trackers and specify SteamVR/OpenVR as the XR plugin over OpenXR, that means changing a project setting, not a scene setting. To avoid issues when making a build for one platform vs. another, I might end up having the same repository cloned into adjacent folders with descriptive names - one for the Quest build, another for the SteamVR build, etc. This is more applicable when dealing with the choice of build platform, as the "Switch platforms" option in the Unity build system can take a staggering amount of time to complete due to asset reimports and such.
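The cloned-working-copies idea might look something like the following sketch. The folder names and the `BuildScript.BuildQuest` method are hypothetical; the Unity command-line flags (`-batchmode`, `-quit`, `-projectPath`, `-buildTarget`, `-executeMethod`) are real:

```sh
# One working copy per target platform, to sidestep Unity's slow
# "Switch Platform" reimport. Folder names are placeholders.
git clone <repo-url> project-quest      # Android / Quest builds
git clone <repo-url> project-steamvr    # Windows / SteamVR builds

# Batch-mode build from the Quest copy. BuildScript.BuildQuest is a
# hypothetical static editor method that would call BuildPipeline.BuildPlayer.
Unity -batchmode -quit \
      -projectPath ./project-quest \
      -buildTarget Android \
      -executeMethod BuildScript.BuildQuest
```

Disk space is the obvious cost, but it trades cheaply against hours of reimport time.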

Ultimately, things are looking up for supporting multiple devices. Some of the really cool features may have to wait, but that also helps draw boundaries around core features that deserve attention first before extending further support. As fast as things move, there's always a risk of creating something that works out of madness, only for it to be made obsolete by official packages and specification updates. Good problems to have, but bad for burnout and sanity.

Still, who knows what the next semester will bring? A great deal of effort was made in the fall - general quality-of-life improvements across the board - and the wishlist is still quite long:

  • live multiplayer support
  • downloadable builds for desktop (easy)
  • downloadable apk for Quest (easy to give, harder to install since it requires dev mode)
  • possible pursuit of AppLab support to ease Quest 2 installation (requires a fairly involved review and approval process that may be complicated to satisfy given our features involve deliberately recording voice and movement data)
  • more avatars and play spaces
  • web-based performance viewer
  • possibly even a web-based version? Hello WebXR!
  • internal development for pedagogy and experiment purposes

So, going into 2023, I think we'll have our hands full :)

*Platform woes, but spoken in the same manner as eau de toilette