What we’ve seen of visionOS is really impressive. But thinking about the bits that are currently missing…

HUDs weren’t shown. Maybe they look weird when they aren’t anchored to your space. Maybe their applications are no good if other people in the space can’t see them.

But I wanna see more AR applications for drivers and pilots and surgeons.

Likewise we never saw any in-person shared experiences.

Whether multiple Vision devices would need to coordinate wirelessly (and know their relative positions in meatspace), or whether it’d just be a FaceTime call with someone in the same room, you have to assume Apple’s own industrial designers and mechanical designers want to collaborate on 3D models.

More on this: proximity-based SharePlay is needed just to solve the case of your boss walking up to your desk and asking for a demo. Or of you swiveling around and asking a coworker for help on a problem.

Otherwise you gotta go to different rooms and set up a meeting to present your work and talk through it.

Seems like the kind of feature that’s *so* important for office work that you can’t even imagine Apple putting headsets on employees’ desks until it’s solved.

Back to WFH everybody!

@clarko That reminds me of one other thought I’ve been having: how do I share a screen (or an item in space) with one person, but keep other stuff private? If I have Messages open as well as the Xcode window I’m pairing with another dev on, how does the UI assure me that they can see some of what I see, but not all of it?
@Soroush Big overlap with the existing Mac screencasting API
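To sketch what that overlap looks like today: on the Mac, ScreenCaptureKit already lets you scope a capture to a single window, which is exactly the “share the Xcode window, never the Messages window” shape. This is a hypothetical sketch of how a shared-space stream *could* reuse that model, not anything Apple has shown for visionOS; the function name and window-picking logic are my assumptions.

```swift
import ScreenCaptureKit

// Hypothetical: stream exactly one window (the Xcode window we're
// pairing in), so everything else on screen stays private.
func shareXcodeWindowOnly() async throws -> SCStream? {
    // Enumerate shareable content; skip off-screen windows.
    let content = try await SCShareableContent.excludingDesktopWindows(
        false, onScreenWindowsOnly: true)

    // Pick just the window we want to share.
    guard let xcode = content.windows.first(where: {
        $0.owningApplication?.bundleIdentifier == "com.apple.dt.Xcode"
    }) else { return nil }

    // desktopIndependentWindow scopes the capture to that one window —
    // Messages (or anything else) never enters the stream.
    let filter = SCContentFilter(desktopIndependentWindow: xcode)
    return SCStream(filter: filter,
                    configuration: SCStreamConfiguration(),
                    delegate: nil)
}
```

A shared-space UI could then badge the one shared window, the way macOS outlines the window being captured, so you always know what the other person can and can’t see.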