I got Apple's visionOS Simulator streaming wirelessly to a Meta Quest using ALVR.
Download: https://github.com/zhuowei/VisionOSStereoScreenshots/tree/alvr
Demo of using the visionOS Simulator inside VR:
Thanks to @ShinyQuagsire (author of the original visionOS Simulator -> Quest streaming tool, which runs over wired Meta Quest Link) for helping me with my wireless version.
Also thanks to @jjtech and @keithahern for figuring out simulator input, and to ALVR devs for an amazing streaming tool.

@zhuowei @jjtech @ShinyQuagsire @keithahern Curious, have you tried the server in your `alvr` branch with korejan's ALXR client?

CI builds: https://github.com/korejan/ALXR-nightly

Actual source tree: https://github.com/korejan/ALVR

If not, I'll end up trying it sometime next week with my PSVR1 (via SteamVR-iVRy). [↓]

@zhuowei @jjtech @ShinyQuagsire @keithahern [↑] The ALXR client works with the SteamVR OpenXR runtime, and is compatible (to my knowledge) with bog-standard ALVR servers, so it /should/ work in theory…
@akemin_dayo Thanks! I have not tried it.

I'm using the latest ALVR nightly since it has a few fixes to run the dashboard + server on macOS, which came in handy for this, but I think it's quite a bit ahead of the current ALXR version.

Also, the nightly ALXR is no longer compatible with vanilla ALVR, it seems; I guess only the stable version from last year is?

If ALXR still uses the same Rust->C++ interface as upstream ALVR, then you might be able to just swap in the ALXR server code? I only replace the C++ side and the Rust side is mostly untouched.

I'll take a look at how ALXR's client does passthrough. I think it just takes a mask color, right? I can render a greenscreen (https://notnow.dev/notice/AXTQmrNkveomxqdIrw)
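
If ALXR's passthrough really does just key on a mask color, the client-side compositing step would look something like this minimal NumPy sketch. The key color, frame shapes, and function names here are my assumptions for illustration, not ALXR's actual implementation:

```python
import numpy as np

# Assumed mask color: pure green marks regions to be replaced by passthrough.
KEY = np.array([0, 255, 0], dtype=np.uint8)

def composite(stream_rgb: np.ndarray, camera_rgb: np.ndarray) -> np.ndarray:
    """Replace mask-colored pixels in the streamed frame with camera pixels."""
    mask = np.all(stream_rgb == KEY, axis=-1)  # True where the pixel is keyed out
    out = stream_rgb.copy()
    out[mask] = camera_rgb[mask]               # show camera passthrough there
    return out
```

A real client would key in YUV with a tolerance threshold rather than an exact RGB match, since video compression smears the key color at edges.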

I was planning to stream the actual alpha channel as a black-and-white strip at the bottom of the video, but visionOS doesn't use alpha blending much (only on the window handles).

Most of it is frosted glass, which I can't do anything about, since only Meta Quest Link gets access to the real camera images.

I also recently picked up a PSVR1 with the intention of adding support, so I can also test on PSVR if you want.
Zhuowei Zhang: “My visionOS stereo screenshot library can now take screenshots with a transparent background: https://github.com/zhuowei/VisionOSStereoScreenshots/tree/transparent-background Next: stream this t...”

Zhuowei Zhang (@[email protected]): “My visionOS stereo screenshot library can now take screenshots with a transparent background: https://github.com/zhuowei/VisionOSStereoScreenshots/tree/transparent-background Next: stream this t...”

@zhuowei Oh huh, it's been a long while since I paid any attention to the ALXR project — wasn't aware it's no longer compatible with vanilla ALVR servers.

As far as passthrough goes… how would that even work with SteamVR actually? As far as I'm aware there are no APIs exposed for [↓]

@zhuowei [↑] camera access, assuming you even have a VR headset that /has/ cameras to begin with. (※ The PSVR1 does not.)

(That and I'm actually one of the weird ones that's interested in seeing the 3D environments Apple built for the visionOS simulator in proper VR ;P) [↓]

@zhuowei [↑] Native PSVR1 support would be interesting to see — I was just planning on running the ALXR client on Windows SteamVR and using iVRy like I always do.

I've actually never used my PSVR1 with any other platform… (Well, I /have/ used it with my PS5, but only like… once.)

@akemin_dayo If you want to give it a try: adapt miniserver.mm to work with the ALXR fork's alvr_server Rust crate.
(It looks like the ALVR version that ALXR forked from has a slightly different bindings.h (https://github.com/korejan/ALVR/blob/master/alvr/server/cpp/alvr_server/bindings.h vs https://github.com/alvr-org/ALVR/blob/master/alvr/server/cpp/alvr_server/bindings.h), but it shouldn't be too hard to backport.)
You may need to re-fork EncodePipelineSW/NalParsing from the older version as well.
It shouldn't be too bad: I don't think any of the Rust needs to change?
Otherwise I can take a look this weekend.
@akemin_dayo I don't actually know if ALXR works with vanilla servers; the download page says to only use their nightly servers with their nightly clients, though.
The PSVR1 works on my Apple Silicon Mac mini only in cinematic mode; the mode switch didn't seem to work for me. For VR mode, I had to bypass the control box via MacMorpheus' instructions (https://github.com/emoRaivis/MacMorpheus#setup-instructions), which does work on my Mac mini but is far too much of a hassle. And I don't know how to get the tracking camera working
(I did try porting Monado which has a camera tracker, since ShinyQuagsire is using Monado on Mac for his visionOS streaming, but I didn't get far in trying to get PSVR support enabled on macOS)
@zhuowei Definitely trying this tomorrow. Looks amazing, fantastic work. Never used ALVR though, does this stream it at native Quest resolution and refresh rate? Any compression at all?
@Graphine No, unfortunately. ALVR's normal Windows/Linux servers can be configured to send very high quality video, but my modified version can't use hardware accelerated encode, so it's limited to low resolution.
@Graphine ShinyQuagsire's visionOS->Quest display tool over wired Meta Quest Link has higher resolution, I believe - I'll have to look at how it does it.

@zhuowei Insane

Also, I didn't know you could inject stuff into the Simulator like that, interesting 🤔

@buzzle Yep! @akemin_dayo and @poomsmart even got the entirety of Cydia Substrate running inside the Simulator: https://github.com/akemin-dayo/simject
@zhuowei For your next trick you could stream the visionOS simulator to an iPad, with transparent background. iPad app overlays it over the camera view and sends coordinates from ARKit back to the simulator 👀
@nicolas17 Sure: I can do that - not sure if it'd be as impactful as seeing it from inside a headset, though...

@zhuowei that’s awesome!!! I wondered about that possibility when using the simulator, but I knew I didn’t have the skills to achieve it, so thanks for that

Also, finally a good reason to buy a Quest 😅