A little louder, a reminder:

"Good UX" mostly means "I've seen this before."

Have you ever seen an adult without your cultural baggage approach a doorknob for the first time? They'll start by pulling it, then pushing it. There's nothing "intuitive" about turning a round doorknob. But you've been trained, so you don't even notice.

"...but Apple, but the iphone", the iPhone was never "easy to use" or "intuitive". They bombarded TV with training videos disguised as ads for 6 months pre-release.

"You just point and click and drag everyone knows that" you spent hundreds of hours training to "just" point and click and drag, but you didn't call it training, you called it solitaire and minesweeper.

You practised.

Today, if we want a better user interface for any computing - and I think we do, and it's possible - we have two choices: entirely new tech (clean-break, fresh-start, 100% new), or acknowledging and owning that we're going to spend some time fighting reflexes honed over decades.

This is not an "it's the user's fault" or "it's never the user's fault" argument. There's no "fault" here; there is responsibility. There's an implicit social contract in computing, a contract about navigating change, and it matters when we're using the tools we rely on.

Change is an _accessibility problem_ that demands assistive technologies to navigate, and building that tech is the responsibility of the developer.

"We put a 2nd floor on your house", great! "Climb a rope to get there", excuse me what.

The biggest own-goal I've ever experienced in this space was Windows 8.

Extremely Late Hot Take: Metro was Actually Pretty Good.

But the thing they bungled, hard, was making the Win8 changes _accessible_.

If you didn't see the first-run "swipe from off the screen" training montage, all the reflexes you might have from a phone or iPad were 100% wrong and did nothing.

The first time I sat down in front of one of them it took me an hour to figure out how to log in at all.

And... not to put too fine a point on this, but I have been computering for a very, very long time. I have Seen Some Shit And I have Fixed More Than My Share Of That Shit. I am Quite A Bit Better Than OK at solving the problems that live near computers.

And it's been a long, long time since I've sat down in front of any computer of any kind and felt that helpless. Like I didn't even know where to start. Decades.

And making people feel helpless is the cardinal sin of UX design.

Anyway, I think there's been a generational shift in developers' perceptions of their obligations to that social contract, and that it's made this industry substantially worse. The belief that "everyone knows" how to use the computers that are everywhere now means that nobody believes they need to keep training materials and docs up to date, so nobody bothers.

So developers think the costs of change are zero, because it's only the operators that bear those costs fully and invisibly.

Like, Windows documentation once upon a time was world-class, it was on-device, available at any moment and excellent. Today "Help" is a hyperlink to trash.

On a modern Mac, you can ask a question and get a few 'maybe' bullet points for your trouble. On System 7 - the _stone_ age - the _OS would walk you through each step_ of solving your problem. It would say "click this thing" and then draw a little circle around the thing. Apple's reputation comes from this period, when nobody was ever afraid.

Instead we’re moving fast, or at least running in circles, and eroding things that I think are going to matter over the long term, like trust and confidence in our industry, consistency, stability and reliability, pluralist democracy, you know. Basic good-repair-of-society stuff like that.

Just to add one last note to this:

One of the major disconnects I've seen in my career between engineering and UX/UI design is rooted in this misconception.

The Engineer asks: we made a new feature, it does these new things, can I have a button for this new thing?

The design team hears: We want an applied anthropology research exercise. If our product has an international audience, it is fifty or two hundred research exercises.

What the Engineer thought they asked for: a little picture.

You can see how both sides of this conversation find the other side frustrating to deal with.

Engineers believe they've solved a major problem, and now "just" need a tiny label to make that solution discoverable.

Interface designers believe that "just" having some new ideas is the easy part, but making new ideas legible and discoverable is a major problem.

Both are correct. But one of those domains is... let's say far less prone to the practice of humility than the other.

So when you look at interfaces that are obviously rushed, derivative or just straight up knockoffs of some other preexisting thing, this is part of what you're seeing. It might not be the only thing you're seeing, but it's kaleidoscopic; reflections after reflections of a set of mismatched expectations about how to best communicate new ideas to people whose heads are already full of old ideas.
@mhoye have you seen Tantacrul's videos on the design (flaws) of various sheet music engraving software?
@jackeric no! URL me, my friend.
@mhoye ooh! there's a whole series - the finale came out just a few weeks back. this is probably a good starting point. he's a composer and a software developer but you don't need to know much about music to follow. and they're very funny
https://youtu.be/dKx1wnXClcI?si=belius_crashed
Music Software & Bad Interface Design: Avid’s Sibelius

@jackeric @mhoye this is an amazing video!
@mhoye We solved this problem in our product organisation by not having a design team.
@mhoye My favourite thread of the day. Great. Thank you! 👏

@mhoye

i think jenn schiffer's 2014 talk about how "your grandparents probably didn't have node" helps put a lot of things into perspective for the farthest-flung version of this problem -- developers assuming their audience is only developers. but in a world where the UX designers at the tech megacorps are constantly pushing quarter-baked overhauls and paradigm changes out into the world... unlike the old world where Windows and Mac were teaching the user some of the fundamentals, the platform is often more of an adversary nowadays. we can't assume that even developers aren't having their 'struggling to use windows 8' moment, let alone people who can't devote so much time to staying familiar with the ever-changing interfaces on their featureless squircle-screen. but as long as participation in society demands these dangfangled machines, it's our responsibility to make learning them worth the investment.

i think a lot about unobtrusively tutorializing the human-computer-interactions we take for granted, from the perspective of game development. it can feel unlikely, perhaps even embarrassing, to consider that your own app could ever be the first thing of its nature that a person has ever seen -- but you can't assume that it's not!

take for example the three-week free-time project my friends and i submitted to a game jam last year, a 2D platformer for the gameboy advance ("getting around it with pheasant birdy"). as soon as i felt it growing in scope from 'tech demo' to 'playable game', designing and implementing a tutorial subsystem immediately became a non-negotiable feature to me, and was one of the first things i worked on once i had the basics of the engine in place. naively, one might think that the demographic of "people who own GBA flashcarts/emulators and download homebrew game jam entries" wouldn't need to be taught "press ➡️ to walk", "press 🅰️ to jump", "press ⬆️ to climb", since those would be obvious fundamentals to anyone who's played any game in the tradition of super mario bros within the past four decades, and surely cutting that scope on such a time-limited project would let us make the rest of it more substantial... and that may be true. but what about the parents/coworkers/etc. of my dev team and me, some of whom never touched anything video-game with their own hands after playing a few rounds of pac-man/tetris/galaxian in the 80s and deciding games weren't their cup of tea, or younger relatives whose experience of gaming is primarily mobile gachas? (we even embedded a javascript-based GBA emulator on the game's page with a special build of the ROM that replaces the button icons with keyboard keys -- many people we'd want to share it with don't need to know how to download and use an emulator.)

we sorely need to instill a patient, empathetic, noone-left-behind mindset in our development cultures. this probably means we have to re-educate a lot of arrogant technocrats who think "patience? teaching? isn't that what the chatbot we added is for?"

Jenn Schiffer - Your Grandparents Probably Didn't Have Node [ Thunder Plains 2014 ]

@mhoye I miss balloon help.
@yoasif It was good, yeah.
@mhoye I remember at one point I was the only person in the CS department who could print documents on Windows machine to a printer attached to a Unix machine. When asked about this sorcery, I just said I used Windows "help" and followed its instructions exactly.
@dabeaz @mhoye Far too few things are self-documenting.
@mhoye insert MSDN CD 3/4
@mhoye I think Apple spent a decade hoping that Macs would end up running iOS. (Of course, phones suck, too.)
@billseitz @mhoye I assume they still do, tho I suspect they're too ossified to pull it off now (small mercy).

@mhoye There has also been a shift in incentives.

With the boxed or pay per upgrade model, the incentive is to get users to develop their skills so they want to buy (or ask their employer to buy) the next version

With rolling upgrades and freemium models, the incentive is to make users bad at the software so they click more of the deceptive upsells

Users who learn and do the productivity tips have lower LTV... https://arstechnica.com/gadgets/2025/11/what-i-do-to-clean-up-a-clean-install-of-windows-11-23h2-and-edge/?utm_social-type=owned

How to declutter, quiet down, and take the AI out of Windows 11 25H2

A new major Windows 11 release means a new guide for cleaning up the OS.

@mhoye I suspect that as the number of computer users 100x, the number of manual readers stayed flat.
C++ Wage Slave (@[email protected])

@[email protected] @[email protected] I write nothing more creative than documentation for the software I produce, but people misunderstand that, too. The experience of people misunderstanding my docs, or simply failing to read them, was one of the things that turned me away from evangelicalism. (I bet you didn't see that coming.) I wasn't using parables to hint at ineffable spiritual truths in the face of religious persecution: I was just explaining how software worked, how to see its current state, and how to configure and maintain systems. People who were paid to read and understand this material would not or could not do so: their repeated questions made it obvious. Within a very few years, instead of reading TFM, people developed folk stories of commands they could type that usually did something that could be mistaken for success. They veered constantly off-course, and I kept having to drag them back. They consulted each other, rather than the docs, and developed their own mythology about how the software worked. They intuitively felt they knew the software better than I did, because my approach to problem-solving was careful and methodical but they knew a golden shortcut. If concrete, human-level explanations, written out literally, landed so badly, there's no chance that people will have remembered the figurative and unfathomable teachings of Jesus seventy years after his death, written them down accurately and fully, and built from them a useful picture of worlds seen and unseen and the will of God. I'm sorry; it just doesn't ring true.


@mhoye

This is me in front of any neoliberal means testing form: "How do they expect anyone to fill this in? I have a degree in astrophysics and decades of work in data analysis and I can't fill this in."

(Then I realized that most people could fill it in because they didn't take it seriously and just wrote in anything, which is what I now do.)

@richpuchalsky And here we find ourselves, in the best of all possible worlds.