I was incredibly fortunate to have the opportunity to try Vision Pro this week. The experience was overwhelming in the best possible way. I'm still collecting my thoughts for a longer Design Diary post, but it was so transformative that I'm finding it hard to put the experience into words.

As I work on that, I wanted to ask for any questions folks have about using it, and more generally about the platform from a developer/designer perspective. I'll do my best to answer them.

@_Davidsmith Did you get to ask or experience how notifications work on visionOS?
@_Davidsmith Is it a single user account platform (like iOS) or can multiple people ‘log in’ to the same headset? ie if there was one in a family, could each person use it with their own iCloud account?
@_Davidsmith As amazing as 2D windowed computing looks to be in that “infinite canvas”, those apps aren’t necessarily something that can’t be done well on an existing platform like the Mac or iPad. What kind of applications and experiences could you see being enabled by Vision Pro / visionOS that are truly unique, and could help make spatial computing a true paradigm shift?

@kathrynelrod @_Davidsmith

Peripherals à la William Gibson’s “The Peripheral”

#visionpro would allow you to control and experience the viewpoint of a remote “vehicle” in an incredibly real way. The possibilities are almost infinite.

Drive your own vehicle from a far off parking spot to the door. “VisionPro Valet”.

Control a “robot” to safely work in a dangerous situation. Firefighters, police (a police officer not afraid for their life might make better decisions), remote surgery, underwater work…

@_Davidsmith Did you get to try any widgets? How about widgets with input? Does it feel like enough space to have lots of them always around?
@_Davidsmith any idea if you could have a window on a table in front of you, or is positioning only on the vertical axis? Imagining working on digital paper on your desk.
@tchaten @_Davidsmith so many uses for things on the desk, on the floor, widgets above you can glance up at. Hope they will eventually track a pencil/pen for very specialized situations. Unfortunately for me, I want a gigantically wide, curved screen for an RDP session so I can do my Windows work.
@_Davidsmith Does it have any haptics? I'm not entirely sure how it might work in a headset, but Apple's haptic feedback is one of the most under-appreciated and magical elements of their products (a real hardware/software integration only they pull off)
@DrChris PSVR2 has haptics in the headset itself, so it’s at least feasible.
@_Davidsmith Also, did they reveal how unlock will work? Face ID (eye ID??) or is it iPhone-linked? Wondering how easily you can demo the headset to friends and family.
Optic ID will unlock Apple’s new Vision Pro headset

Apple’s third major biometric authentication system is the iris-scanning Optic ID, which will be used for the Vision Pro mixed reality headset introduced at WWDC. It follows Touch ID and Face ID introduced on the iPhone.

The Verge
Apple Vision Pro has a two user-account limit: Yours and a guest

It sort of supports multiple users—but they'll need to bring their own lenses.

Macworld
@_Davidsmith As someone who appreciates spending time in nature and who has apps focused on being active, how do you see the device contributing to that part of your work?
@_Davidsmith I’m curious if there are any breadcrumbs around App Intents and what that API might mean for the present/future of the experience on that platform
@matthewcassinelli @_Davidsmith my App Intents lab engineer didn't explicitly say it but hinted for support saying that (paraphrasing) "all currently used frameworks will be available for visionOS"
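For context on what that hint implies in practice: if the existing App Intents framework does carry over to visionOS unchanged, an intent would look the same as it does today on iOS. A minimal sketch (the intent name and parameter are hypothetical, purely for illustration):

```swift
import AppIntents

// A minimal App Intent — assuming the current App Intents API
// is available on visionOS as the lab engineer hinted.
struct OpenFavoriteSceneIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Favorite Scene"

    // Hypothetical parameter, for illustration only.
    @Parameter(title: "Scene Name")
    var sceneName: String

    func perform() async throws -> some IntentResult {
        // App-specific work would go here.
        return .result()
    }
}
```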
@_Davidsmith Does it get hot? Does it really have fans? Does the air from the fans blow on your face? Is it lightweight? Can you see it replacing the Studio Display or an external monitor on Mac? Have you tried the virtual keyboard? How does it feel without tactile feedback?
@_Davidsmith Did it feel like you could do productive development work for an extended time? Like running Xcode from a laptop for 2-3 hours? How was text quality compared to a laptop/monitor (at code densities)?
@_Davidsmith Do you have any ideas for developing for it? Will you develop for it?
@_Davidsmith Do you know if it will be possible to manipulate the representation of the “real” world? Example: sitting in front of a table with the device on, could you modify the color/shape of the table?
@_Davidsmith I know more is to come on a visionOS SDK, but I’m interested in your opinion on the following: I haven’t done any ARKit or RealityKit features in my app yet. If I want to start with an eye towards my app supporting Vision Pro, what would you suggest focusing on to move straight to volume and space views for the headset?
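For anyone wondering what "volume and space views" look like in code: based on Apple's WWDC23 session material, a visionOS app declares them as scene types in SwiftUI. A sketch, assuming the announced API ships as shown (view names like `GlobeView` are hypothetical placeholders):

```swift
import SwiftUI

@main
struct MyVisionApp: App {
    var body: some Scene {
        // A volumetric window: a bounded 3D "volume" the user
        // can place alongside regular 2D windows.
        WindowGroup(id: "globe") {
            GlobeView() // hypothetical view, for illustration
        }
        .windowStyle(.volumetric)

        // A full immersive space the user can step into.
        ImmersiveSpace(id: "starfield") {
            StarfieldView() // hypothetical view, for illustration
        }
    }
}
```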
@_Davidsmith Not sure if you wear contacts, so you might not know, but I’m wondering what the corrective lens situation is for near-sighted folks. I work without glasses and can focus fine at close range and work on my MBP without glasses. I see just fine up to about 1.5-2m (-2.25 diopter). Given the goggles sit close to your eyes, would they require corrective lenses in that situation?

@_Davidsmith you probably won’t have this information yet, but I keep wondering if the Vision Pro will interact with Apple Watch hardware similar to how, in the keynote demo, it “syncs” with a Mac when you look at a MacBook Pro?

Would be so cool to be wearing the Vision Pro and look at your Apple Watch, and some special informatics or UX happens in the spatial view! Any chance this was mentioned or demoed in the hands-on?

@_Davidsmith As I asked and partially answered in this toot: https://qoto.org/@danb/110505148037732448 , what is #VisionPro much better at than laptops/phones, etc? The answer helps you then figure out which types of applications it's better for, and then for which apps now might be a good time to build/rethink.
Dan Bricklin (@[email protected])

Thinking about Apple #VisionPro vs laptops/iPads/phones/etc: *** What is #VisionPro much better at? *** Some obvious ones:
- Lots of pixels in wide field of view & lots of perceived 3rd-dimension space => more places/ways to put/organize things
- Eye tracking instead of mouse/touch => standing/sitting, no desk/lap, quicker, hands-free
- Finger(s?) flexing vs finger-tip press => richer gesture language
- More immersive sound w/ transparency => quality, aural cues including location
More?

Qoto Mastodon
@_Davidsmith there are contradicting reports on whether the headset fills your FOV completely or there are dark areas at the periphery. What was your experience?
@_Davidsmith I’d love to hear of any existing apps today that you feel fit well or don’t fit well in this new world. I’m an app designer and want to start dreaming up what’s possible in the next 6 months!

@_Davidsmith was there any discussion about accessibility?

My particular interests would have to do with drooping/lazy eye.

@_Davidsmith so many good questions here. Lots to still be uncovered I think.
@_Davidsmith Do you think it has “legs”? In other words does it seem like something that might break through into common use or does it seem like another VR/AR toy for enthusiasts?
@_Davidsmith I’m curious about how it would be to use in “computer mode” while lying down. (I have back issues that make sitting (or standing) in front of a computer somewhat burdensome.)
@_Davidsmith I don’t always follow your design diary, so when you post that note, please post here that you posted it so I can go read it and others can read it as well. I look forward to it. Thank you.

@_Davidsmith What will the stand alone computing experience be like? In other words can it completely replace the need for an iPad or laptop/desktop for someone who does not do heavy computing?

My work issued laptop is, unfortunately, a PC and my non-work computing needs are shrinking. Could it eventually replace both my aging 2016 MacBook Pro —which mostly gets used now for filing taxes, zoom calls, expense tracking, etc.— and my original 12.9” iPad Pro?

@_Davidsmith my 2 biggest questions: considering it has an M2 could it theoretically replace a Mac for development work, rather than just mirror its screen? Or will future improvements allow you to pull windows off the Mac screen into the virtual space? Or emulate multiple screens at least?

what is the deal with “not all Rx supported” for the custom lenses? what are the limitations there?

@_Davidsmith how do you think the platform will evolve? I don’t doubt with the cash reserves Apple has they can go long and wait for the market to grow, either through lower cost HW or increasing demand through increased platform utility, but where do you think they want to take it? I’d love to know!

@_Davidsmith hey, thanks for the opportunity to ask! Since no one I have heard has said anything about it, I'm wondering what walking around in the immersive environments feels like. Are there any guides for furniture and walls? Another question: how do you type on the virtual keyboard? With your fingers or with your eyes?

I'd appreciate it if you have the answers to those questions!

@_Davidsmith How did it feel regarding content density and size of the apps? Can you get a lot more content on that big real-life canvas without feeling overwhelmed, or is it more comfortable with a similar amount of content to Mac or iPad apps, just at real-life size?
@_Davidsmith realistically, how useful is it with only a 2-hour battery life?
@_Davidsmith How excited are you and other developers about the Vision Pro? Is that excitement just because it's amazing tech, or are you seeing a massive dev opportunity?
@_Davidsmith do you think it will be used for work, as a productivity device?
@_Davidsmith in the demos, everything seems to happen in a specific plane within the field of view. Is the software snapping to those planes a bit like a suggestion, or do you have to do the work to place things?
@_Davidsmith like the Photos app, do you think it’s possible to have it recognise QR codes or QR-like codes?
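Worth noting: on-device QR detection already exists in Apple's Vision framework, so the question is really whether visionOS apps will be given camera frames to feed it (not confirmed so far). A sketch of what detection looks like with the existing API:

```swift
import CoreGraphics
import Vision

// Detect QR code payloads in an image using Apple's Vision
// framework (available on existing Apple platforms; whether
// visionOS apps get camera access to supply the image is an
// open question).
func detectQRCodes(in cgImage: CGImage) throws -> [String] {
    let request = VNDetectBarcodesRequest()
    request.symbologies = [.qr]          // limit to QR codes

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try handler.perform([request])

    // Each observation carries the decoded string payload, if any.
    return (request.results ?? []).compactMap { $0.payloadStringValue }
}
```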