Clear my calendar: we're doing visionOS simulator shenanigans!
`LC_ALL=C grep -r berration .` shows that there's a `./usr/lib/libDisplayWarpSupport.dylib` used by `./usr/libexec/wakeboardd`; does that have anything to do with the headset params?
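Mach-O binaries embed the install name of every dylib they link as a plain C string in their load commands, so a raw grep over the runtime root is enough to find all of the warp library's clients. A sketch; `find_linkers` is just a name I made up:

```shell
# List every file under a root that embeds the warp dylib's install
# name (i.e. links it, or at least mentions it). LC_ALL=C keeps grep
# happy on binary bytes; -l prints only the matching filenames.
find_linkers() {
  (cd "$1" && LC_ALL=C grep -rl libDisplayWarpSupport.dylib . 2>/dev/null)
}
# usage: find_linkers /path/to/visionOS-simulator-runtime-root
```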
Stupid questions I need the answers to:
- Has any dataminer found the cursed EyeSight front display code in the simulator yet?
- Does the simulator have the Persona FaceTime avatars?
- Are the "Memories" 3D photos just stereo photos / videos?
"Simulated Senes"
... You had two and a half weeks and you didn't catch this spelling mistake?
All right, if I suspend wakeboardd, the screen stops drawing. So I think that's promising: the binary that mentions warping really does seem to be involved in rendering the output.
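To reproduce the suspend trick: simulator processes are ordinary host processes, so plain POSIX job-control signals work from the host. A sketch; `suspend_proc`/`resume_proc` are names I made up:

```shell
# SIGSTOP freezes the target without killing it (the simulator screen
# stops updating); SIGCONT lets it pick up right where it left off.
suspend_proc() { kill -STOP "$1"; }
resume_proc()  { kill -CONT "$1"; }
# usage: suspend_proc "$(pgrep -x wakeboardd)"
```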
It looks like wakeboardd does do stuff with IOSurfaces:

```
* thread #3, name = 'compositor-rt-thread', stop reason = breakpoint 8.1
  * frame #0: 0x0000000100ba4b74 wakeboardd`rt_sim_display_submit_surface
    frame #1: 0x0000000100b960b0 wakeboardd`rt_sim_hmd_composite + 1640
    frame #2: 0x0000000100bb6484 wakeboardd`rt_thread_main + 896
    frame #3: 0x0000000106cef4c0 libsystem_pthread.dylib`_pthread_start + 104
```

The signature appears to be `rt_sim_display_submit_surface(something, IOSurface*)`.
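For the record, the breakpoint came from an lldb session along these lines. A sketch: the process name is real, but reading the surface out of `$x1` assumes the IOSurface is the second argument under the arm64 calling convention; check the disassembly if it isn't.

```
(lldb) process attach -n wakeboardd
(lldb) breakpoint set -n rt_sim_display_submit_surface
(lldb) continue
(lldb) p (size_t)IOSurfaceGetWidth((IOSurfaceRef)$x1)
(lldb) p (size_t)IOSurfaceGetHeight((IOSurfaceRef)$x1)
```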

... still not sure if this is the right binary though. Ghidra time?
`rt_sim_hmd_composite` does some Metal rendering of an IOSurface.
1) It looks like RealityEnvironment draws the simulator environment:

```
$ xcrun simctl spawn booted defaults read
```

gives

```
"com.apple.RealityEnvironment" = {
    activeSyntheticEnvironment = LivingRoomDay;
};
```
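Presumably that key can be written back through the same mechanism to pick a different environment. A sketch, not verified: the domain and key are from the `defaults read` dump, and LivingRoomDay is the only value I've actually seen, so swapping in another name is at your own risk.

```
$ xcrun simctl spawn booted defaults write com.apple.RealityEnvironment \
      activeSyntheticEnvironment -string LivingRoomDay
```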
2) wakeboardd reads a bunch of defaults whose names begin with `hmd`.
Did Apple seriously not build themselves a 3D stereo view for the simulator?
`rt_sim_hmd_composite` has a fast path where - if there's only one layer, and it's the exact same size and format as the screen - it just sends that layer directly to the (simulator's) screen. If there are multiple fully immersive layers (the compositor supports up to 5) or the format doesn't match, then the compositor calls out to Metal.
If you change the last "1" in the call to `rt_hmd_fill_descriptor_destination` / `rt_hmd_fill_descriptor_layers` to "2", you do get the color and depth textures for the right eye in addition to the left eye (which is what you get with "1"). The renderer never tries to access them, though, of course.
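If you'd rather not repatch the binary on every run, a breakpoint command can flip that constant at call time. A sketch only: which register holds the view count is my assumption from the arm64 calling convention (w2 if it's the third argument), so verify against the disassembly first.

```
(lldb) breakpoint set -n rt_hmd_fill_descriptor_destination
(lldb) breakpoint command add
> register write w2 2
> continue
> DONE
```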
Disappointingly, dumping out the second texture shows that the right eye view is completely blank (solid black). What else do I need to tweak before apps will render to the right eye?