A little louder, a reminder:

"Good UX" mostly means "I've seen this before."

Have you ever seen an adult without your cultural baggage approach a doorknob for the first time? They'll start by pulling it, then pushing it. There's nothing "intuitive" about turning a round doorknob. But you've been trained, so you don't even notice.

"...but Apple, but the iphone", the iPhone was never "easy to use" or "intuitive". They bombarded TV with training videos disguised as ads for 6 months pre-release.

"You just point and click and drag everyone knows that" you spent hundreds of hours training to "just" point and click and drag, but you didn't call it training, you called it solitaire and minesweeper.

You practised.

Today, if we want a better user interface for any computing - and I think we do, and it's possible - we have two choices: entirely new tech - clean-break, fresh-start, 100% new - or acknowledging and owning that we're going to spend some time fighting reflexes honed over decades.

This is not an "it's the user's fault" or "it's never the user's fault" argument. There's no "fault" here; there is responsibility. There's an implicit social contract in computing about navigating change in the tools we rely on.

Change is an _accessibility problem_ that demands assistive technologies to navigate, and building that tech is the responsibility of the developer.

"We put a 2nd floor on your house", great! "Climb a rope to get there", excuse me what.

The biggest own-goal I've ever experienced in this space was Windows 8.

Extremely Late Hot Take: Metro was Actually Pretty Good.

But the thing they bungled, hard, was making the Win8 changes _accessible_.

If you didn't see the first-run "swipe from off the screen" training montage, the reflexes you might have from a phone or iPad were 100% wrong and did nothing.

The first time I sat down in front of one of them it took me an hour to figure out how to log in at all.

And... not to put too fine a point on this, but I have been computering for a very, very long time. I have Seen Some Shit And I have Fixed More Than My Share Of That Shit. I am Quite A Bit Better Than OK at solving the problems that live near computers.

And it's been a long, long time since I've sat down in front of any computer of any kind and felt that helpless. Like I didn't even know where to start. Decades.

And making people feel helpless is the cardinal sin of UX design.

Anyway, I think there's been a generational shift in developers' perceptions of their obligations to that social contract, and that it's made this industry substantially worse. The belief that "everyone knows" how to use the computers that are everywhere now means that nobody believes they need to bother keeping training materials and docs up to date, so nobody does.

So developers think the costs of change are zero, because it's only the operators who bear those costs, fully and invisibly.

Like, Windows documentation was once world-class: on-device, available at any moment, and excellent. Today "Help" is a hyperlink to trash.

On a modern Mac, you can ask a question and get a few 'maybe' bullet points for your trouble. On System 7 - the _stone_ age - the _OS would walk you through each step_ of solving your problem. It would say "click this thing" and then draw a little circle around that thing. Apple's reputation comes from this period, when nobody was ever afraid.

@mhoye I think Apple spent a decade hoping that Macs would end up running iOS. (Of course, phones suck, too.)
@billseitz @mhoye I assume they still do, tho I suspect they're too ossified to pull it off now (small mercy).