#4: Bring Back Idiomatic Design

I’m part of the desktop software generation. From Windows 95 to Windows 7, I grew up using mostly-offline software on computers operated via mouse and keyboard, well before tablets and smartphones. Recently, I’ve been missing one particular part of that era: its consistency in design.

Loeber on Substack

As the author identifies, the idioms come from the use of system frameworks that steer you towards idiomatic implementations.

The system UI frameworks are tremendously detailed and handle so many corner cases you'd never think of. They allow you to graduate into being a power user over time.

Windows has Win32, and it was easier to use its controls than to roll your own custom ones. (Shame they left the UI side of Win32 to rot.)

macOS has AppKit, which enforces a ton. You can't change the height of a native button, for example.

iOS has UIKit, similar deal.

The web has nothing. You gotta roll your own, and it'll be half-baked at best. And since building for modern desktop platforms is horrible, the framework-less web is being used there too.

The author may have identified that "the idioms come from the use of system frameworks", but they got just about everything wrong about why apps are not consistent on the web (e.g. I was baffled by the reasons listed under the "this lack of homogeneity is for two reasons" section).

First, what he calls "the desktop era" wasn't so much a desktop era as a Windows era - Windows ran the vast majority of desktops (and furthermore, there were plenty of inconsistencies between Windows and Mac). So, as you point out regarding the Win32 API, developers had essentially one way to do things, or at least one that was by far the easiest. Developers weren't so much "following design idioms" as "doing what is easy to do on Windows".

The web started out as a document sharing system, and only gradually and organically turned into an application platform. There was simply no single default, "easiest" way to do things (and despite that, I remember when it seemed like the web converged all at once onto Bootstrap, because it became the easiest and most "standard" way to do things).

In other words, I totally agree with you. You can have all the "standard idioms" that you want, but unless you have a single company providing and writing easy to use, default frameworks, you'll always have lots of different ways of doing things.

Yeah the author conveniently ignores the fact that the UX of Mac apps was radically different to that of PC apps, so it’s not that designers/developers were somehow more enlightened back then, it’s just that they were “on rails”

> Developers weren't so much "following design idioms" as "doing what is easy to do on Windows".

Most people only use one computer. Inconsistency between platforms has no bearing on users. But inconsistency of applications on one platform is a nightmare for training. And accessibility suffers.

I don't disagree, but my point was about the author's incorrect diagnosis of the reason (and solution) for the problem, not that the problem doesn't exist.

As a sibling commenter put it, previously developers had "rails" that were governed by MS and Apple. The very nature of the web means no such rails exist, and saying "hey guys, let's all get back to design idioms!" is not going to fix the problem.

Well, and worse, Windows was itself a hive of inconsistency. The most obvious example of UI consistency failing as an idea was that Microsoft's own teams didn't care about it at all. People my age always have rose-tinted glasses about this. Even the screenshot of Word the author chose is telling, because Office rolled its own widget toolkit. No other Windows apps had menus that looked like that, with the stripe down the left hand side, or that kind of redundant menu-duplicating sidebar. They made many other apps that ignored or duplicated core UI paradigms too. Visual Studio, Encarta, Windows Media Player... the list went on and on.

The Windows I remember was in some ways actually less consistent than what we have now. It was common for apps to be themeable, to use weirdly shaped windows, to have very different icon themes or button colors, etc. Every app developer wanted to have a strong brand, which meant not using the default UI choices. And Microsoft's UI guidelines weren't strong enough to generate consistency - even basic things like where the settings window could be found weren't consistent. Sometimes it was Edit > Preferences. Sometimes File > Settings. Sometimes zooming was under View, sometimes under Window.

The big problem with the web and the newer web-derived mobile paradigms is the conflation of theme and widget library under the name "design system". The native desktop era was relatively good at keeping these concepts separated, but the web isn't; the result is a morass of very low-effort, crappy widgets that often fail at the subtle details MS/Apple got right. And browsers can't help, because every other year designers decide that the basic behaviors of e.g. text fields need to change in ways that wouldn't be supported by the browser's own widgets.

“Brand” and “branding” are arguably the most important thing -not- mentioned in the article. The commercial incentives to differentiate are powerful enough to kick a lot of UX out of the way.

Now that all we do is “experience” a “journey,” it’s more about the user doing what the app wants instead of the other way around

I partially agree with you, but additionally there's a whole set of employees who would be clearly redundant in any given company if that company decided to just use a simple, idiomatic, off the shelf UI system. Or even to implement one but without attempting to reinvent well understood patterns.

One reason so many single-person products are so nice is because that single developer didn't have the time and resources to try to re-think how buttons or drop downs or tabs should work. Instead, they just followed existing patterns.

Meanwhile, when you have 3 designers and 5 engineers, with the natural ratio of Figma sketches to production-ready implementations being at least an order of magnitude, the only way to justify the design headcount is to make shit complicated.

But every company I worked at in the past 10 years or so eventually coalesced around a singular "design system" managed by one person or a small core team. But that just goes back to my original point - every company had their own design system, and there is not a single, industry-wide set of "rails".

The bigger issue I see with the "got to keep lots of designers employed" problem is the series of pointless, trend-following redesigns you'd see all the time. That said, I've seen design departments get absolutely slaughtered at a lot of web/SaaS companies in the past 3 years. A lot of the issues designers were working on in web and mobile for the 25 years prior are now essentially "solved problems", and so, except for the integration of AI (where I've seen nearly every company just add a chat box and that AI star icon), it looks like there is a lot less to do.

Conventions already existed in DOS (CUA) and MacOS. The point is, every operating system had its user interface conventions, and there was a strong move from at least the mid-1980s to roughly the mid-2000s that applications should conform to the respective OS conventions. The cross-platform aspect of the web and then of mobile destroyed that.

The web was designed for interactive documents, not desktop applications. Its layout engine was inspired by typesetting (floats, blocks), and a lot of its elements only make sense for text (<i>, <span>, <strong>, ...). There's also no allowance for dynamic data (no built-in list virtualization) and poor support for custom components (canvas and SVG are not great in that regard).
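To make the virtualization point concrete: every web app that shows a big list has to reimplement roughly the following math itself, because the platform only knows how to lay out a document. This is a minimal sketch; all names are illustrative, not from any real library.

```typescript
interface VisibleRange {
  start: number;    // index of the first row to render
  end: number;      // index one past the last row to render
  offsetPx: number; // translateY to apply so rendered rows line up with the scroll
}

// Given a fixed row height, compute which slice of a large list actually
// needs DOM nodes. Everything outside the slice stays un-instantiated.
function visibleRange(
  scrollTop: number,   // current scroll offset of the container, in px
  viewportPx: number,  // height of the visible area, in px
  rowPx: number,       // fixed height of each row, in px
  totalRows: number,   // total number of rows in the data set
  overscan = 3,        // extra rows above/below to hide flicker while scrolling
): VisibleRange {
  const first = Math.floor(scrollTop / rowPx);
  const visible = Math.ceil(viewportPx / rowPx);
  const start = Math.max(0, first - overscan);
  const end = Math.min(totalRows, first + visible + overscan);
  return { start, end, offsetPx: start * rowPx };
}

// 10,000 rows of 20px each, a 400px viewport, scrolled to 1,000px:
// only 26 rows get instantiated instead of 10,000.
const r = visibleRange(1000, 400, 20, 10000);
```

Native list widgets on Windows and macOS have done this internally for decades; on the web it's a per-app (or per-library) reinvention.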

> building for modern desktop platforms is horrible, the framework-less web is being used there too.

I think it's more related to PM wanting to "brand" their product and developers optimizing things for themselves (in the short term), not for their users.

> You can't change the height of a native button, for example.

You can definitely do so, it's just not obvious or straightforward in many contexts.

> The web has nothing. You gotta roll your own, and it'll be half-baked at best. And since building for modern desktop platforms is horrible, the framework-less web is being used there too.

This feels like the root cause to me as well. Or more specifically, the web does have idioms, the problem is that those idioms are still stuck in 1980 and assume the web is a collection of science papers with hyperlinks and the occasional image, data table and submittable form.

This is where the "favourites" list and the ability to select any text on a web page came from.

Web apps not only have to build an application UI completely from scratch, they also have to do it on top of a document UI that "wants" to do something completely different.
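A sketch of what "fighting the document" looks like in practice: one of the first things many app-style web widgets do is cancel the behaviors the browser installs for documents. The event names are standard DOM events; the function and interface names are illustrative, not from any real library.

```typescript
// Document behaviors an app-like widget typically has to suppress.
const documentDefaults = [
  "selectstart", // text selection - documents want selectable text, toolbars don't
  "dragstart",   // native dragging of images and links
  "contextmenu", // the browser's own right-click menu
] as const;

// Minimal EventTarget shape, so the sketch isn't tied to a real DOM.
interface Listenable {
  addEventListener(type: string, fn: (e: { preventDefault(): void }) => void): void;
}

// Attach preventDefault handlers for each document behavior; returns the
// list of suppressed event types, for illustration.
function suppressDocumentDefaults(el: Listenable): string[] {
  for (const type of documentDefaults) {
    el.addEventListener(type, (e) => e.preventDefault());
  }
  return [...documentDefaults];
}
```

None of this exists in a native toolkit, because a native button was never selectable text in the first place.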

Modern browsers have toned down those idioms and essentially made it "easier to fight them", but didn't remove or improve them.

"The Web" has evolved into a pretty bad UI API. I kind of wish that the web had stuck to documents with hyperlinks, and something else had emerged as a cross-platform application SDK. Combining them both into HTML/CSS/JS was a mistake IMO.

Those aren’t the only reasons. When you’re used to how your operating system does things consistently, as a developer you naturally want your application to also behave the way you’re used to in that environment.

This eroded on the web, because a web page was a bit of a different “boxed” environment, and completely broke down with the rise of mobile, because the desktop conventions didn’t directly translate to touch and small screens, and (this goes back to your point) the developers of mobile OSs introduced equivalent conventions only half-heartedly.

For example, long-press could have been a consistent idiom for what right-click used to be on desktop, but that wasn’t done initially and later was never consistently promoted, competing with Share menus, ellipsis menus and whatnot.