A little louder, a reminder:

"Good UX" mostly means "I've seen this before."

Have you ever seen an adult without your cultural baggage approach a doorknob for the first time? They'll start by pulling it, then pushing it. There's nothing "intuitive" about turning a round doorknob. But you've been trained, so you don't even notice.

"...but Apple, but the iphone", the iPhone was never "easy to use" or "intuitive". They bombarded TV with training videos disguised as ads for 6 months pre-release.

"You just point and click and drag everyone knows that" you spent hundreds of hours training to "just" point and click and drag, but you didn't call it training, you called it solitaire and minesweeper.

You practised.

Today, if we want a better user interface for computing - and I think we do, and I think it's possible - we have two choices: entirely new, clean-break, fresh-start tech, or acknowledging and owning that we're going to spend some time fighting reflexes honed over decades.

This is not an "it's the user's fault" or "it's never the user's fault" argument. There's no "fault" here; there is responsibility. There's an implicit social contract in computing, and it matters most when the tools we rely on change underneath us.

Change is an _accessibility problem_ that demands assistive technologies to navigate, and building that tech is the responsibility of the developer.

"We put a 2nd floor on your house", great! "Climb a rope to get there", excuse me what.

The biggest own-goal I've ever experienced in this space was Windows 8.

Extremely Late Hot Take: Metro was Actually Pretty Good.

But the thing they bungled, hard, was making the Win8 changes _accessible_.

If you didn't see the first-run "swipe from off the screen" training montage, all the reflexes you might have had from a phone or iPad were 100% wrong and did nothing.

The first time I sat down in front of one of them it took me an hour to figure out how to log in at all.

And... not to put too fine a point on this, but I have been computering for a very, very long time. I have Seen Some Shit And I have Fixed More Than My Share Of That Shit. I am Quite A Bit Better Than OK at solving the problems that live near computers.

And it's been a long, long time since I've sat down in front of any computer of any kind and felt that helpless. Like I didn't even know where to start. Decades.

And making people feel helpless is the cardinal sin of UX design.

Anyway, I think there's been a generational shift in developers' perceptions of their obligations to that social contract, and it's made this industry substantially worse. The belief that "everyone knows" how to use the computers that are everywhere now means nobody believes they need to teach anyone, so nobody bothers keeping training materials and docs up to date.

So developers think the costs of change are zero, because it's only the operators that bear those costs fully and invisibly.

Like, Windows documentation once upon a time was world-class, it was on-device, available at any moment and excellent. Today "Help" is a hyperlink to trash.

On a modern Mac, you can ask a question and get a few 'maybe' bullet points for your trouble. On System 7 - the _stone_ age - the _OS would walk you through each step_ of solving your problem. It would say "click this thing", and then draw a little circle around that thing. Apple's reputation comes from this period, when nobody was ever afraid.

Instead we’re moving fast, or at least running in circles, and eroding things that I think are going to matter over the long term, like trust and confidence in our industry, consistency, stability and reliability, pluralist democracy, you know. Basic good-repair-of-society stuff like that.

Just to add one last note to this:

One of the major disconnects I've seen in my career between engineering and UX/UI design is rooted in this misconception.

The Engineer asks: we made a new feature, it does these new things, can I have a button for this new thing?

The design team hears: We want an applied anthropology research exercise. If our product has an international audience, it is fifty or two hundred research exercises.

What the Engineer thought they asked for: a little picture.

You can see how both sides of this conversation find the other side frustrating to deal with.

Engineers believe they've solved a major problem, and now "just" need a tiny label to make that solution discoverable.

Interface designers believe that "just" having some new ideas is the easy part, but making new ideas legible and discoverable is a major problem.

Both are correct. But one of those domains is... let's say far less prone to the practice of humility than the other.

So when you look at interfaces that are obviously rushed, derivative or just straight-up knockoffs of some other preexisting thing, this is part of what you're seeing. It might not be the only thing you're seeing, but it's kaleidoscopic: reflection after reflection of a set of mismatched expectations about how best to communicate new ideas to people whose heads are already full of old ideas.
@mhoye have you seen Tantacrul's videos on the design (flaws) of various sheet music engraving software?
@jackeric no! URL me, my friend.
@mhoye ooh! there's a whole series - the finale came out just a few weeks back. this is probably a good starting point. he's a composer and a software developer but you don't need to know much about music to follow. and they're very funny
https://youtu.be/dKx1wnXClcI?si=belius_crashed
Music Software & Bad Interface Design: Avid’s Sibelius

@jackeric @mhoye this is an amazing video!
@mhoye We solved this problem in our product organisation by not having a design team.
@mhoye My favourite thread of the day. Great. Thank you! 👏

@mhoye

i think jenn schiffer's 2014 talk about how "your grandparents probably didn't have node" helps put a lot of things into perspective for the farthest-flung version of this problem -- developers assuming their audience is only developers. but in a world where the UX designers at the tech megacorps are constantly pushing quarter-baked overhauls and paradigm changes out into the world... unlike the old world where Windows and Mac were teaching the user some of the fundamentals, the platform is often more of an adversary nowadays. we can't assume that even developers aren't having their 'struggling to use windows 8' moment, let alone people who can't devote so much time to staying familiar with the ever-changing interfaces on their featureless squircle-screen. but as long as participation in society demands these dangfangled machines, it's our responsibility to make learning them worth the investment.

i think a lot about unobtrusively tutorializing the human-computer-interactions we take for granted, from the perspective of game development. it can feel unlikely, perhaps even embarrassing, to consider that your own app could ever be the first thing of its nature that a person has ever seen -- but you can't assume that it's not!

take for example the three-week free-time project my friends and i submitted to a game jam last year, a 2D platformer for the gameboy advance ("getting around it with pheasant birdy"). as soon as i felt it growing in scope from 'tech demo' to 'playable game', designing and implementing a tutorial subsystem immediately became a non-negotiable feature to me, and was one of the first things i worked on once i had the basics of the engine in place.

naively, one might think that the demographic of "people who own GBA flashcarts/emulators and download homebrew game jam entries" wouldn't need to be taught "press ➡️ to walk", "press 🅰️ to jump", "press ⬆️ to climb", since those would be obvious fundamentals to anyone who's played any game in the tradition of super mario bros within the past four decades, and surely cutting that scope on such a time-limited project would let us make the rest of it more substantial... and that may be true, but what about the parents/coworkers/etc. of my dev team and me, some of whom never touched anything video-game with their own hands after playing a few rounds of pac-man/tetris/galaxian in the 80s and deciding games weren't their cup of tea, or younger relatives whose experience of gaming is primarily mobile gachas?

(we even embedded a javascript-based GBA emulator on the game's page with a special build of the ROM that replaces the button icons with keyboard keys -- many people we'd want to share it with don't need to know how to download and use an emulator.)

we sorely need to instill a patient, empathetic, noone-left-behind mindset in our development cultures. this probably means we have to re-educate a lot of arrogant technocrats who think "patience? teaching? isn't that what the chatbot we added is for?"

Jenn Schiffer - Your Grandparents Probably Didn't Have Node [ Thunder Plains 2014 ]

@mhoye I miss balloon help.
@yoasif It was good, yeah.
@mhoye I remember at one point I was the only person in the CS department who could print documents on Windows machine to a printer attached to a Unix machine. When asked about this sorcery, I just said I used Windows "help" and followed its instructions exactly.
@dabeaz @mhoye Far too few things are self-documenting.
@mhoye insert MSDN CD 3/4
@mhoye I think Apple spent a decade hoping that Macs would end up running iOS. (Of course, phones suck, too.)
@billseitz @mhoye I assume they still do, tho I suspect they're too ossified to pull it off now (small mercy).

@mhoye There has also been a shift in incentives.

With the boxed or pay-per-upgrade model, the incentive is to get users to develop their skills so they want to buy (or ask their employer to buy) the next version.

With rolling upgrades and freemium models, the incentive is to make users bad at the software so they click more of the deceptive upsells

Users who learn and do the productivity tips have lower LTV... https://arstechnica.com/gadgets/2025/11/what-i-do-to-clean-up-a-clean-install-of-windows-11-23h2-and-edge/?utm_social-type=owned

How to declutter, quiet down, and take the AI out of Windows 11 25H2

A new major Windows 11 release means a new guide for cleaning up the OS.

@mhoye I suspect that as the number of computer users 100x, the number of manual readers stayed flat.
C++ Wage Slave (@[email protected])

@[email protected] @[email protected] I write nothing more creative than documentation for the software I produce, but people misunderstand that, too. The experience of people misunderstanding my docs, or simply failing to read them, was one of the things that turned me away from evangelicalism. (I bet you didn't see that coming.)

I wasn't using parables to hint at ineffable spiritual truths in the face of religious persecution: I was just explaining how software worked, how to see its current state, and how to configure and maintain systems. People who were paid to read and understand this material would not or could not do so: their repeated questions made it obvious.

Within a very few years, instead of reading TFM, people developed folk stories of commands they could type that usually did something that could be mistaken for success. They veered constantly off-course, and I kept having to drag them back. They consulted each other, rather than the docs, and developed their own mythology about how the software worked. They intuitively felt they knew the software better than I did, because my approach to problem-solving was careful and methodical but they knew a golden shortcut.

If concrete, human-level explanations, written out literally, landed so badly, there's no chance that people will have remembered the figurative and unfathomable teachings of Jesus seventy years after his death, written them down accurately and fully, and built from them a useful picture of worlds seen and unseen and the will of God. I'm sorry; it just doesn't ring true.


@mhoye

This is me in front of any neoliberal means testing form: "How do they expect anyone to fill this in? I have a degree in astrophysics and decades of work in data analysis and I can't fill this in."

(Then I realized that most people could fill it in because they didn't take it seriously and just wrote in anything, which is what I now do.)

@richpuchalsky And here we find ourselves, in the best of all possible worlds.
@mhoye IMHO, Windows Phone 7 and 8 had by far the best phone UX. I had a client with WP7, and had to set up mail and a few other things at random times, and it was just so intuitive, at least to me. My mother also had a Lumia 920, and she still misses that phone.
@jernej__s @mhoye funny. Even with “Windows Phone 7” literally one line up, my brain parsed “WP7” as “Wordperfect 7” and i was wondering why that was at all relevant when talking about windows phone :)
Yes, I also struggle with the stupid soft keyboard on my phone…

@mhoye I click the start menu button to open the start menu

I get a full screen of spinning tiles that hides everything else

@gabrielesvelto @mhoye I quite liked this idea in theory — when the start menu is open, of course I'm not doing anything else, so why should anything else need to be on the screen?

the problem in practice was that the start screen was full of email notifications and photos and news stories and so on, so opening it often felt like entering a room, looking around, and saying "now what did I come in here for?"

that, and, in modern UIs the launcher menu is also the search bar which is also the calculator, and in those cases you actually might want to refer to stuff on the screen. but that was a way off back in the windows 8 era

@andrewt @gabrielesvelto I have a lot of feelings about this, but I have a strong sense that there's room in the world for two buttons, one of which is "I'm looking around" and one of which is "I'm going somewhere specific." Putting them in the same place means they'll always be undermining the operator's intent.
@andrewt @gabrielesvelto (If you want an _astounding_ trainwreck of an example of this: in Windows 11, the bottom left corner of the screen, where the start menu used to be, the single most valuable pixel in the entire computer industry, somehow got hijacked by the weather-and-news-widget team, and the start menu is now centered on the bottom of the screen by default. An incredible tragedy.)
@mhoye
My Windows PC (not my work one) runs Windows 8.1 (I mostly use Linux now) and, despite the easier start button, the start menu is so hateful I just put things on the desktop or in a folder in the PATH variable so I can use the run dialogue box...

@mhoye
> "We put a 2nd floor on your house", great! "Climb a rope to get there", excuse me what.

oh, I see you've been to my attic

@hyc @mhoye Always wondered how one is supposed to get heavy stuff into/out of that. Does it include some sort of pulley/winch system?
@mhoye "Oh, and we sealed all the doors and windows on the ground floor. You have a second floor now, why would you want to keep using the ground floor?"

@mhoye

Recognizing the fundamental difference between fault and responsibility is deep wisdom a lot of people don't get. Good thread.

@tbortels I started life as a sysadmin, and one of the truisms I've been repeating since about 1998 is, a lot of things that aren't my fault are still my responsibility, and a lot of things that aren't my responsibility are still my problem.
@mhoye wait wait hold on is "Minesweeper as an educational tool for building muscle memory for using a mouse" a known thing already / intentional? Because I've never heard of it before and it seems brilliant

@Osmose @mhoye afaik Minesweeper was just a little game one of the devs made for fun and threw it in

I think Solitaire was the mouse training one though!

@hazelnot @Osmose @mhoye This reminds me of how Miyamoto once talked about how difficult it was to get players to look around with the OOT camera (And specifically, to look up at something) - and then I suddenly remember that Gohma battle in the Deku Tree *refused* to start until you looked up and saw Gohma's eye.
@Osmose
It's why it was included in Windows by default. It's why Windows always had a mouse game (originally reversi, then solitaire, then a whole suite) to get new computer users used to the mouse! They basically dropped them in Windows 10 (and possibly 8?) as they assumed no-one needed them anymore, but with touchscreens now more common than mice, that assumption is looking stupid...
@mhoye
Why computers come with Solitaire and Minesweeper

The games were reportedly designed to teach users skills they would need using these newfangled personal computers.

@Osmose @mhoye Oh, yeah. That was explicitly the reason when it first came out. I think it got forgotten because early adopters were already familiar with the mouse, and ppl who didn't know about using a mouse came later.
@mhoye I still remember the HyperCard stack that came with my first Mac that went through things like point, click, drag, feed the fish. Big paper manuals full of screenshots were also a thing, and they assumed far less familiarity with operational concepts. These days I feel like I could use one of those just to know what capabilities have been hidden behind invisible gestures.
@neal @mhoye and all those resources taught you to use a UI that was shared by all the apps you had - nowadays every app and web site seems to want to invent their own so they can use their favourite colours. And then changes it every few years. Users just seem resigned to being lost 🤷
@mhoye Anyone who thinks "just drag" is intuitive has never tried to help an 85 year old do it. Weak muscles, shaky arms, sticky old mouse, a bit of a tremor... it's actually _very_ hard to describe the motion they need to make and then all of that!
@aredridel @mhoye we've had so much trouble at work trying to make sure our shiny drag-and-drop list editor works for people who can't drag. there are so many things you need to think of and end up building twice if you really want an inclusive product
@andrewt @aredridel If you don't have a lot of cultural bench depth among the people in the room at the earliest stages of some product's life, people who feel safe and empowered to speak up, then an awful lot of your design and iteration process is going to end up wasted.
@mhoye @aredridel we've definitely found that, absolutely
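The "building twice" pattern this sub-thread describes, one interaction logic shared between drag-and-drop and accessible button controls, might be sketched like this in TypeScript. All names here are illustrative, not taken from any actual product:

```typescript
// Sketch: write the reorder operation once, then expose it through two
// input methods, so people who can't drag get the same capability.

// Pure reorder logic: every input method calls this single function.
function moveItem<T>(list: readonly T[], from: number, to: number): T[] {
  const copy = [...list];
  const [item] = copy.splice(from, 1); // remove the dragged/selected item
  copy.splice(to, 0, item);            // insert it at its destination
  return copy;
}

// Input method 1: a drop handler hands us source and target indices.
function onDrop<T>(list: readonly T[], from: number, to: number): T[] {
  return moveItem(list, from, to);
}

// Input method 2: keyboard-friendly "move up" / "move down" buttons
// move one step at a time, and no-op at the ends of the list.
function moveUp<T>(list: readonly T[], index: number): T[] {
  return index > 0 ? moveItem(list, index, index - 1) : [...list];
}

function moveDown<T>(list: readonly T[], index: number): T[] {
  return index < list.length - 1 ? moveItem(list, index, index + 1) : [...list];
}
```

The point of the sketch is only that the second input method is cheap once the logic is factored out; the expensive part in practice is everything around it (focus management, announcements for screen readers, discoverability of the buttons).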
@mhoye I'm dusting off old MS-DOS games for the kids, and it's interesting how many of the games are, at their core, training on how to interface with a computer
@mhoye This is off-topic, but that explains why my late grandfather never liked drag and drop. He was an avid player of FreeCell, not Solitaire! He probably played thousands of hours of FreeCell over the years.

Microsoft FreeCell did not allow dragging cards. You had to click on a card then click on the destination. It was updated to drag and drop in Windows Vista. When my grandfather got a new computer that had Windows 7, he simply couldn't play it. Luckily the FreeCell files from Windows XP still worked on Windows 7 so he could continue playing.

@mhoye In the first decade or so of the Mac, there was even explicit training. Early Macs came with a “guided tour” split across a training program and an *audio cassette*: https://youtube.com/watch?v=iTNDm-LC_Js

Because in 1984, nobody expected you to know how to use a mouse and GUI, but you did probably have some idea of how to use a tape player. And with that, and a good tutorial with practice sections, you could learn.

Guided Tour of Macintosh Plus (#MARCHintosh stream 2024-03-01)

@mhoye fighting reflexes, like when Apple inverted the scroll direction?
If you think there are no intuitive UIs, watch little children use an iPad.
@sn Yeah, precisely like that. Apple changed the metaphor from "interacting with the viewpane into the document" to "interacting with the document". I think an improvement, once I got used to it? Going back feels weird now.
@mhoye This is seen in typography as well. A font's readability qualities are strongly tied to how often people see the font every day.