I wanted to see what it took to set up a proper #MixedReality application that ticks all the boxes (viewing and interacting with reality, hand tracking, spatial sound, etc.) on a #Quest3, using the latest version of #MRTK3. Because, well, you need to be cross-platform these days if you and your apps want to stay relevant in the MR space. It took some tinkering, but I got it to work, and I am glad to see I will be able to carry over my MR investments to different platforms.

https://localjoost.github.io/CubeBouncer-revisited-setting-up-Mixed-Reality-for-Quest-3-with-MRTK3-and-Unity-6/

CubeBouncer revisited - setting up Mixed Reality for Quest 3 with MRTK3 and Unity 6

Like I said in a previous post, now that Microsoft has announced it has stopped producing the HoloLens 2, with no successor announced, it is important to make sure your apps run on other devices too if you want yourself and your apps to remain relevant in the XR space. You have already seen me do quite a few things with the Magic Leap 2, but recently I got myself a Quest 3 as well. And although I played quite a bit with the Quest (2) before, and even published an app for it, I wanted to see how true Mixed Reality worked on the Quest 3. Equally important, I wanted to know how that could be set up with MRTK3. To put it more bluntly: I wanted to see how much of my investment in Mixed Reality could be carried over, and how far.

DotNetByExample - The Next Generation
Two weeks ago I released my #MRTK3 based app Walk the World for #HoloLens2, a week later the exact same app for #MagicLeap2. I promised to blog about how I got that done so easily and quickly, and here it is: my 'recipe' for getting MRTK3-based HoloLens 2 apps to run on Magic Leap 2. https://localjoost.github.io/Making-an-MRTK3-based-HoloLens-2-app-run-on-Magic-Leap-2/ #openxr #MixedReality
Making an MRTK3 based HoloLens 2 app run on Magic Leap 2

This is a bit of an unusual post, as it does not have a project attached to it. Instead, it’s more like a recipe that I followed to get my app “Walk the World”, originally developed for HoloLens 2, to run on Magic Leap 2. Or actually, how I did it after retracing all my steps, cutting out all the useless detours, pitfalls, and dead ends I encountered trying it the first time.

DotNetByExample - The Next Generation

LocalJoost (@localjoost@mstdn.social): "Spent a weekend porting the #MRTK3 version of Walk the World to #MagicLeap2. In the end, not so difficult, but the documentation leaves some things to be desired. If you have a device, download the package at https://www.schaikweb.net/WalktheWorld/ML2/Walk_the_World_MagicLeap_5.1.0.0.apk. Feedback and suggestions welcome" #crossplat

Jun 16, 2024, 04:11 PM

https://mstdn.social/@localjoost/112626939403131691

After I wrote about how you could get the hand *position* while doing an air tap with #MRTK3, two developers asked me simultaneously how you could get the end of the hand *ray*. Noblesse oblige, and I wrote https://localjoost.github.io/Getting-the-hand-ray-end-position-with-MRTK3/
Getting the hand ray end position with MRTK3

Like I wrote before, I sometimes feel like I am Microsoft’s one-man Mixed Reality Q&A department, judging by the number of questions I get. I guess it’s becoming common knowledge that I have the tendency to actually answer a lot of those questions ;).
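To give an idea of the direction: in MRTK3, hand rays are driven by the XR Interaction Toolkit's ray interactor, so a minimal sketch of reading where the ray ends could look like the following. This is my illustration, not code from the post; the `rayInteractor` reference is assumed to be wired up in the inspector to the hand ray on the MRTK3 rig.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch: reads where the hand ray currently ends.
// Assumes 'rayInteractor' is assigned in the inspector to the
// ray interactor on one of the MRTK3 rig's hand controllers.
public class HandRayEndReader : MonoBehaviour
{
    [SerializeField] private XRRayInteractor rayInteractor;

    private void Update()
    {
        // When the ray hits something, the hit point is the ray's end position.
        if (rayInteractor.TryGetCurrent3DRaycastHit(out RaycastHit hit))
        {
            Debug.Log($"Hand ray ends at {hit.point}");
        }
    }
}
```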

DotNetByExample - The Next Generation

#MRTK3 has a subsystem implementation for speech commands that is supported on #HoloLens2. #MagicLeap2 supports voice commands as well; if only someone had taken the trouble of writing a KeywordRecognitionSubsystem implementation for it, it would have had MRTK3 speech command support too.

Well, guess what - someone just did. Me. So now you can have voice commands on *both* devices using the same API, and you only need to flip a checkbox in the MRTK3 settings.
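To show what "the same API" means in practice, here is a minimal sketch of registering a voice command through MRTK3's KeywordRecognitionSubsystem; the keyword string is my own example, and the exact namespaces may vary by MRTK3 version.

```csharp
using MixedReality.Toolkit; // namespaces assumed from MRTK3 GA
using MixedReality.Toolkit.Subsystems;
using UnityEngine;

// Minimal sketch: registers a voice command via the keyword
// recognition subsystem - the same code on HoloLens 2 and Magic Leap 2,
// as long as a KeywordRecognitionSubsystem is enabled in the profile.
public class ToggleByVoice : MonoBehaviour
{
    private void Start()
    {
        var keywordSubsystem =
            XRSubsystemHelpers.GetFirstRunningSubsystem<KeywordRecognitionSubsystem>();
        if (keywordSubsystem != null)
        {
            keywordSubsystem.CreateOrGetEventForKeyword("toggle cube")
                .AddListener(() => Debug.Log("'toggle cube' heard"));
        }
    }
}
```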

https://localjoost.github.io/An-MRTK3-KeywordRecognitionSubsystem-for-Magic-Leap-2/

An MRTK3 KeywordRecognitionSubsystem for Magic Leap 2

As I have shown already, MRTK3 is very versatile and very well suited for cross-platform development. The architecture is extensible through the use of Unity subsystems. Magic Leap has used that to implement a version of the Hands subsystem - which allows hand tracking to work - and you can use it by simply selecting a different implementation of the subsystem in the Android MRTK3 profile:

DotNetByExample - The Next Generation
I get a lot of questions from developers about #MixedReality in general and #MRTK3 in particular. One developer wanted to know how you could find the actual *position* of an air tap. I found out, and of course blogged it: https://localjoost.github.io/Getting-raw-air-taps-and-their-positions-with-MRTK3/
Getting raw air taps and their positions with MRTK3

Over the last few months, I sometimes feel like I am Microsoft’s one-man Mixed Reality Q&A department, as I get lots of requests for tips, samples, and guidance. Apparently, many students and scientists are trying to get to grips with Mixed Reality. I try to answer them all, but sometimes it takes a while, as I attempt to answer them in order of appearance - and I also have a day job as an MR developer. Anyway, this week a developer asked me on GitHub how you could not only get a raw air tap in MRTK3, but also how you could get the tap position.
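As a rough illustration (my sketch, not the blog post's code): MRTK3's HandsAggregatorSubsystem exposes both pinch state and joint poses, so detecting an air tap and reading the index-tip position at that moment could look like this, assuming MRTK3 GA namespaces.

```csharp
using MixedReality.Toolkit; // namespaces assumed from MRTK3 GA
using MixedReality.Toolkit.Subsystems;
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch: detects a raw 'air tap' (pinch) on the right hand and
// reads the index finger tip position via the HandsAggregatorSubsystem.
public class AirTapPositionReader : MonoBehaviour
{
    private void Update()
    {
        var aggregator = XRSubsystemHelpers.HandsAggregator;
        if (aggregator != null &&
            aggregator.TryGetPinchProgress(XRNode.RightHand,
                out _, out bool isPinching, out _) &&
            isPinching &&
            aggregator.TryGetJoint(TrackedHandJoint.IndexTip,
                XRNode.RightHand, out HandJointPose pose))
        {
            Debug.Log($"Air tap at {pose.Position}");
        }
    }
}
```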

DotNetByExample - The Next Generation
After successfully updating #HoloATC to #MRTK3 GA, I felt confident I could upgrade the #Augmedit #Lumi #HoloLens app as well. I was successful indeed - but there were some more *interesting* things I ran into. I updated my blog post from September significantly with my new findings. https://localjoost.github.io/Upgrading-to-MRTK3-GA-from-a-pre-release-some-assembly-required/
Upgrading to MRTK3 GA from a pre-release: some assembly required

[Update - October 6, 2023]

DotNetByExample - The Next Generation

I wanted to see if I could run one of my more far-fetched #MRTK3 #HoloLens2 samples on #MagicLeap2. Step one was getting a Spatial Map. It was dead easy on HoloLens 2, but less so on Magic Leap 2. I managed to make it work, but there were more challenges than I had anticipated. I blogged my findings to help others avoid the same pitfalls I ran into.

https://localjoost.github.io/Using-a-Spatial-Mesh-with-MRTK3-on-Magic-Leap-2/

Using a Spatial Mesh with MRTK3 on Magic Leap 2

After successfully porting the MRTK3 version of HoloATC to Magic Leap 2, I wanted to try some other experiments on it as well. For one of those experiments, I needed to use the Spatial Map. How hard could that be? I just added an ARMeshManager to the XR MRTK Rig like I do for HoloLens, as per my own blog post, and…
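For reference, the HoloLens-style starting point mentioned above - an ARMeshManager on the rig feeding you spatial mesh updates - can be sketched like this with AR Foundation; the rig wiring and mesh prefab assignment are assumed to be done in the editor, and this is only the generic setup, not the Magic Leap specific fixes the post is about.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: listens to spatial mesh updates from an ARMeshManager
// placed on the XR rig (with a MeshFilter-based mesh prefab assigned).
public class SpatialMeshListener : MonoBehaviour
{
    [SerializeField] private ARMeshManager meshManager;

    private void OnEnable() => meshManager.meshesChanged += OnMeshesChanged;
    private void OnDisable() => meshManager.meshesChanged -= OnMeshesChanged;

    private void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        Debug.Log($"Meshes added: {args.added.Count}, " +
                  $"updated: {args.updated.Count}, removed: {args.removed.Count}");
    }
}
```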

DotNetByExample - The Next Generation
With the arrival of #MRTK3 GA, a fellow developer reported my #Quest passthrough sample appeared to be broken after upgrading. I did some investigation, and fixed it. https://localjoost.github.io/Full-underlay-passthrough-transparency-with-MRTK3-GA-on-Quest-2Pro/
Full underlay passthrough transparency with MRTK3 GA on Quest 2/Pro

Hey, didn’t I write about this before? Indeed, I did, but last Monday I got a report from José Rocha that my sample didn’t work anymore after he followed my upgrade tutorial for MRTK3 GA. After trying that myself on the sample, I had to agree he was right: instead of full transparency, I got to see the default Unity Skybox. Ugh.

DotNetByExample - The Next Generation