A little louder, a reminder:

"Good UX" mostly means "I've seen this before."

Have you ever seen an adult without your cultural baggage approach a doorknob for the first time? They'll start by pulling it, then pushing it. There's nothing "intuitive" about turning a round doorknob. But you've been trained, so you don't even notice.

"...but Apple, but the iPhone" - the iPhone was never "easy to use" or "intuitive". They bombarded TV with training videos disguised as ads for six months pre-release.

"You just point and click and drag, everyone knows that" - you spent hundreds of hours training to "just" point and click and drag, but you didn't call it training; you called it solitaire and minesweeper.

You practised.

Today, if we want a better user interface for any computing - and I think we do, and it's possible - we have two choices: entirely new tech - clean-break, fresh-start new - or acknowledging and owning that we're going to spend some time fighting reflexes honed over decades.

This is not an "it's the user's fault" or "it's never the user's fault" argument. There's no "fault" here; there is responsibility. There's an implicit social contract in computing, and navigating change matters when we're using the tools we rely on.

Change is an _accessibility problem_ that demands assistive technologies to navigate, and building that tech is the responsibility of the developer.

"We put a 2nd floor on your house" - great! "Climb a rope to get there" - excuse me, what.

@mhoye

Recognizing the fundamental difference between fault and responsibility is deep wisdom a lot of people don't get. Good thread.

@tbortels I started life as a sysadmin, and one of the truisms I've been repeating since about 1998 is, a lot of things that aren't my fault are still my responsibility, and a lot of things that aren't my responsibility are still my problem.