A little louder, a reminder:

"Good UX" mostly means "I've seen this before."

Have you ever watched an adult who doesn't share your cultural baggage approach a doorknob for the first time? They'll start by pulling it, then pushing it. There's nothing "intuitive" about turning a round doorknob. But you've been trained, so you don't even notice.

"...but Apple, but the iPhone" - the iPhone was never "easy to use" or "intuitive". Apple bombarded TV with training videos disguised as ads for six months before release.

"You just point and click and drag, everyone knows that" - you spent hundreds of hours training to "just" point and click and drag, but you didn't call it training; you called it Solitaire and Minesweeper.

You practised.

Today, if we want a better user interface for any computing - and I think we do, and I think it's possible - we have two choices: entirely new tech - clean-break, fresh-start, 100% new - or acknowledging and owning that we're going to spend some time fighting reflexes honed over decades.

This is not an "it's the user's fault" or "it's never the user's fault" argument. There's no "fault" here; there is responsibility. There's an implicit social contract in computing, and navigating change matters when we're using the tools we rely on.

Change is an _accessibility problem_ that demands assistive technologies to navigate, and building that tech is the responsibility of the developer.

"We put a second floor on your house!" Great! "Climb a rope to get there." Excuse me, what.

The biggest own-goal I've ever experienced in this space was Windows 8.

Extremely Late Hot Take: Metro was Actually Pretty Good.

But the thing they bungled, hard, was making the Win8 changes _accessible_.

If you didn't see the first-run "swipe in from off the screen" training montage, the reflexes you might have from a phone or iPad were all 100% wrong and did nothing.

The first time I sat down in front of one of those machines, it took me an hour to figure out how to log in at all.

@mhoye I click the start menu button to open the start menu

I get a full screen of spinning tiles that hides everything else

@gabrielesvelto @mhoye I quite liked this idea in theory — when the start menu is open, of course I'm not doing anything else, so why should anything else need to be on the screen?

the problem in practice was that the start screen was full of email notifications and photos and news stories and so on, so opening it often felt like entering a room, looking around, and saying "now what did I come in here for?"

that, and, in modern UIs the launcher menu is also the search bar which is also the calculator, and in those cases you actually might want to refer to stuff on the screen. but that was still a ways off back in the Windows 8 era

@andrewt @gabrielesvelto I have a lot of feelings about this, but I have a strong sense that there's room in the world for two buttons, one of which is "I'm looking around" and one of which is "I'm going somewhere specific." Putting them in the same place means they'll always be undermining the operator's intent.
@andrewt @gabrielesvelto (If you want an _astounding_ trainwreck of an example of this: the bottom left corner of the screen, where the start menu used to be - the single most valuable pixel in the entire computer industry - somehow got hijacked by the weather-and-news-widget team in Windows 11, and now the start menu is centered on the bottom of the screen by default. An incredible tragedy.)