Seriously, if Apple copy Microsoft with a stunt like this, that'd be my cue to buy a Framework laptop and switch 100% to Linux for work.
(Which would be enormously painful as Scrivener isn't supported on Linux and it's been my work platform for the past 15 years.)
NB: only distributions with X.org rather than Wayland and sysv init instead of systemd need apply.
@cstross Telling the people who write display servers and protocols how to do their jobs is well above my paygrade.
For better or worse, the ecosystem seems to have mostly moved from X11 to Wayland. And personally, I'm not quite old enough yet to want to expend my energy on going against the grain, there.
You are wrong. X11 had *nothing* to do with the VGA standard.
X11 predates VGA graphics by three years and wasn't intended to run on PCs or deal with a BIOS at all—it was designed for workstations with a variety of graphics hardware. I remember it on Sun 3/60 kit circa 1989 ...
@khleedril @cstross @lonjil @hko X11 was a network protocol. It came from project Athena, launched in 1983 as a joint project between MIT, DEC and IBM to produce a "campus-wide computing environment".
X11 was designed to pop up GUI windows on a different physical machine than the program was running on, potentially with different OS on different hardware at each end. That was central to the design.
The xfree86 clowns broke a lot of that over the years "optimizing", but that's not X11's fault.
@khleedril @cstross @lonjil @hko Back at Rutgers I loved playing a game called "xbattle" which was implemented as 1 game process opening windows on a bunch of different machines (listed on the command line) so people could play against each other in a shared map.
If you could trivially do that in 1992 on a LAN, one computer with 4 monitors does not require significant new plumbing from X11.
Alas people wrapped the protocol with layers of shared libraries, each with "simplifying assumptions"...
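The plumbing xbattle relied on really is that simple: every X server is addressed by a display string like `otherhost:0`, and a client connects to whichever servers it's told to (a TCP connection goes to port 6000 plus the display number). As a rough illustration only — not anyone's actual library code — parsing that naming scheme looks like:

```python
def parse_display(name):
    """Split an X11 display string like 'otherhost:0.1' into
    (host, display, screen). An empty host means a local connection;
    over TCP, the server listens on port 6000 + display."""
    host, sep, rest = name.rpartition(":")
    if not sep:
        raise ValueError("not a display string: %r" % name)
    display, _, screen = rest.partition(".")
    return host, int(display), int(screen) if screen else 0
```

So a game listing several machines on the command line is just a client opening one socket per named server and drawing windows on each — no new plumbing needed for four monitors on one box, either.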
@cstross @khleedril @lonjil @hko 20 years ago I wrote an extensive historical analysis of how Moore's Law's consistent advance was in part market collusion, in order to explain why Itanium failed in the context of SCO's lawsuit against IBM.
I'm still kind of proud of that writeup:
@khleedril @cstross @lonjil @hko With the result that "nobody programs against libX11.so anymore, use gtk or qt to talk to libx" and I'm going "libX11 is _itself_ a wrapper around a documented protocol!"
POSIX was a common subset of the shared Unix API, an attempt at simplifying and stabilizing: "you can rely on this; there may be extensions, but they can be ignored."
Nobody even tried to do that for X11...
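To make that concrete: the wire format libX11 wraps is fully documented in the core X protocol spec. Even the very first thing a client sends — the 12-byte connection setup request — can be built by hand. A sketch, following the spec's encoding (little-endian here), not production code:

```python
import struct

def x11_setup_request(auth_name=b"", auth_data=b""):
    """Build the X11 connection setup request: byte-order marker,
    protocol version 11.0, then the authorization name and data,
    each padded to a 4-byte boundary (per the core protocol spec)."""
    def pad4(b):
        return b + b"\x00" * ((4 - len(b) % 4) % 4)
    header = struct.pack(
        "<BxHHHHxx",            # order, pad, major, minor, name len, data len, pad
        ord("l"),               # 'l' = little-endian values follow
        11, 0,                  # protocol major/minor version
        len(auth_name), len(auth_data),
    )
    return header + pad4(auth_name) + pad4(auth_data)
```

Write those bytes to a socket connected to an X server and it answers with its setup block; everything after that is likewise just tagged requests and events on the same socket — which is all libX11 is marshalling for you.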
@cstross @lonjil @hko The most important innovation of Wayland is convincing people post-2010 to give away their time for free to maintain it.
X works fine for now, but bitrot is slowly chipping away at the ability to successfully run it, and it's going to continue that way unless someone volunteers for the thankless, unpaid job of maintaining it.
@StephanSchulz @cstross @hko fuck no!
It's this mentality that "CPU cycles are essentially free, storage is practically unlimited, network bandwidth is a given" that makes software that could run on a Super Nintendo bloat to the point that you need a state-of-the-art computer bought in the last two months to run it.
@cstross @StephanSchulz @hko I think you mentioned “with a single 8 *MHz* 68020”
Just imagine a Mac IIfx with an 8 GHz CPU. The mind boggles!
@cstross @hko mostly Wayland is a performance improvement for applications running locally on a computer, at the expense of applications which rely on the X windows client/server model. I don't know enough about the engineering issues to speak to its advantages if any, but Wayland has basic user interface problems.
It is a consistent problem of Linux that usability invariably takes second place to engineering issues.
@ravenonthill that's actually bullshit and there are practically no contemporary toolkits that use x primitives anymore – and to get a decent desktop experience you have to work around the protocol. this is one of the reasons why x is so painful to develop.
(and i actually did set up and then run regular x protocol over network around ~2005, with thin clients running x servers and connecting to a beef-ish application server. it was bad even then.)
i mean, wayland is by no means perfect, but you clearly have no bloody idea about how x works, and why its developers all moved to work on wayland.
@cstross at this stage systemd is an old, stable building block for most linux systems. if you don't demand to tear out launchd from macosx, why would you care what does precisely the same job on linux?
ditto with wayland, which, by the way, is developed by the same people who worked on x(free|org). regarding xwayland – how frequently do you use quartz's x server on macosx? xwayland is like that, but much better integrated than xquartz.
@mawhrin @graphite Stop trying to shame me into learning something new that I don't need! I'm about 80% of my way through my life expectancy, and I want to keep my remaining brain cpu cycles for stuff I find interesting and that brings me joy. Being forced to learn new software just because some devs think it's cool and neat to have something to work on does not bring me joy (or help me write my novels).
Only reason I can't stick with old kit is: security holes grow and hardware decays.
@cstross i don't, and i'm not recommending switching (back) to linux if other options work for you.
it's just that these days actively ignoring wayland and systemd requires more tedious work from the end-user, not less, and people considering linux must be aware of that.
systemd is not new: it's old, stable and boring (debian switched to systemd in 2014), and unless you want or need to know about it, you probably won't even touch it. (hence my launchd parallel; they do precisely the same low-level set of jobs).
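(for the curious: the low-level job in question looks much the same on both systems – a short declarative file describing a daemon, which the init system supervises. a hypothetical unit, with made-up name and paths, purely as illustration:)

```ini
# /etc/systemd/system/example-daemon.service — hypothetical example
[Unit]
Description=Example daemon
After=network.target

[Service]
ExecStart=/usr/local/bin/example-daemon
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

then `systemctl enable --now example-daemon` – roughly the counterpart of dropping a plist into /Library/LaunchDaemons and loading it with launchctl.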
as for wayland and xwayland, xwayland is the only x server that receives active development these days. if you want to use only x on linux, you either need someone handling all potential issues for you (like with steamdeck) or you're setting yourself up to deal with various little problems in a bitrotting code base, with a dwindling number of people who actually want to engage with them.
(for example, you might encounter a mysterious case of electron apps infuriatingly crashing mid-typing – something that made me ultimately switch to wayland-based desktop last year.)
@cstross JFC.
Last night, I was working on a high-priority incident at work, and my boss produced a script to do some of the work. About 10 minutes into trying to make this thing work:
Me: "This method call doesn't have that parameter at all. Was this generated by AI?"
Him: "Yes"
Me: "Fuck that, then. I'm rewriting every line of this using the documentation."
He still thinks it's a good idea. 
@simon_lucy @darkling @cstross there's a lot of accept and then edit, so I do end up reading it all.
But one thing I like is that when I have a function with a return type declared and I've e.g. just declared an iterable, I can just type "for " and copilot will suggest a loop that at least roughly does what I want to do
@darkling @cstross I have a colleague who was also declaring a bullshit generator to be the best solution to Life, the Universe and Everything.
At the moment all so-called AI is just glorified lorem ipsum bullshit generator crap. It's worse than the way I used to abuse Dreamweaver in 1999, 2000 when I was just starting out.
@darkling
> He still thinks it's a good idea. :facepalm:
Definitely. That flavour of manager *loathes* the period of cognitive contemplation needed before actually producing something tangible.
He wants that period reduced and, ideally, eliminated, to get directly to the "we're doing something!" reward step.
If he can just press a button to "get things started", that's a win for him. Doesn't matter at all how much worse it makes the result.