It's late enough to be hacker hours, if you're as old as I am. Gonna write down a bunch of rambly thoughts about #xz and #autoconf and capital-F Free Software sustainability and all that jazz. Plan is to edit it into a Proper Blog Post™ tomorrow. Rest of the thread will be unlisted but boosts and responses are encouraged.

Starting with the very specific: I do not think it was an accident that the xz backdoor's exploit chain started with a modified version of a third-party .m4 file to be compiled into xz's configure script.

It's possible to write incomprehensible, underhanded code in any programming language. There's competitions for it, even. But when you have a programming language, or perhaps a mashup of two languages, that everyone *expects* not to be able to understand — no matter how careful the author is — well, then you have what we might call an attractive nuisance. And when blobs of code in that language are passed around in copy-and-paste fashion without much review or testing or version control, that makes it an even easier target.

So, in my capacity as one of the last few people still keeping autoconf limping along, I'm thinking pretty hard about what could be done to replace its implementation language, and concurrently what could be done to improve development practice for both autoconf and its extensions (the macro archive, gnulib, etc.)

Side bar 1: I know a lot of people are saying "time to scrap autotools for good, everyone should just use cmake/meson/GN/Bazel/..." I have a couple of different responses to that depending on my mood, but the important one right now is: do you honestly believe that your replacement of choice is *enough* better, readability-wise, that folks will actually review patches to build machinery carefully enough to catch this kind of insider attack?

On the subject of implementation language, I have one half-baked idea and one castle in the air.

The half-baked idea is: Suppose ./configure continues to be a shell script, but it ceases to be a *generated* shell script. No more M4. Similarly, the Makefile continues to be a Makefile but it ceases to be generated from Makefile.am. Instead, there is a large library of shell functions and a somewhat smaller library of Make rules that you include and then use.

For ./configure I'm fairly confident it would be possible to do this and remain compatible with POSIX.1-2001 "shell and utilities". (Little known fact: for a long time now, autoconf scripts *do* use shell functions! Internally, wrapped in multiple layers of M4 goo, but still — we haven't insisted on backcompat all the way to System V sh in a long, long time.) For Makefiles I believe it would be necessary to insist on GNU Make.

This would definitely be an improvement on the status quo, but would it be *enough* of one? And would it be less work than migration to something else? (It would be a compatibility break and it would *not* be possible to automate the conversion. Lots of work for everyone no matter what.)

Suppose that's not good enough. Bourne shell is still a shitty programming language, and in particular it is really dang hard to read, especially if you're worried about malicious insiders. Which we are.

Now we have another problem. The #1 selling point for autotools vs all other build orchestrators is "no build dependencies if you're working from tarballs," and the only reason that works is you can count on /bin/sh to exist on anything that purports to be Unix. If we want to stop using /bin/sh, we're going to have to make people install something else first, and that something else needs to be a small and stable Twinkie. Python need not apply (sorry, Meson).

What's small and stable enough? Lua is already too large, and at the same time, too limited.

There's one language that's famous for being tiny, flexible, and pleasantly readable once you wrap your head around it: Forth.

If I had investments to live off, I would be sorely tempted to take the next year or so and write my own Forth that was also a shell language and a build orchestrator, and then have a look at rewriting Autoconf in *that.* This is the castle in the air.

Side bar 2: Let's table the whole "shouldn't everyone build from git nowadays?" discussion. I'm quite sure the xz insider could've found a way to hide the stage 0 exploit in a checked-in file. If you care about ways to make the output of "make dist" verifiable and reproducible, and to facilitate building from VCS checkout for those who want that, we're actually having a productive discussion about that on one of the autotools mailing lists right now.

(Not sure which list — I sort them all into one mailbox — and I have to warn you that several other less helpful conversations are happening under the same subject line.)

Moving to the more general.

I said this over on the autoconf lists earlier today: just as I think it is a mistake to focus on the stage 0 exploit having been concealed by not checking it into the VCS, I also think it is a mistake to focus on the next few stages having been concealed in a binary file. There are binary files that are naturally editable and auditable as themselves (raster images, for instance) and there are text files that nobody wants to look at at all (ever tried to fix a merge conflict in an SVG image?)

A more interesting line to draw, IMO, is between code and tests. I feel quite confident in saying that the files written to $prefix by "make install" should never need to have any sort of dependence on the project's test suite, and that is something that ought to be possible to detect mechanically (the biggest challenge is determining what files of the source repo are exclusively part of the test suite).
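As a toy illustration of what such a mechanical check could look like — assuming, purely for the sketch, that the test suite lives under a `tests/` directory — one could scan an install tree for any embedded reference to a test-suite file name. (A real tool would need a much smarter notion of "dependence"; incidentally, `bad-3-corrupt_lzma2.xz` is the name of an actual test file the xz backdoor used.)

```shell
#!/bin/sh
# Hypothetical post-install audit: given an install tree and a
# test-suite directory, fail if any installed file embeds the name
# of a test-suite file. Crude by design; a sketch, not a tool.
audit_install_tree () {
    install_dir=$1 tests_dir=$2 rc=0
    for t in $(find "$tests_dir" -type f); do
        if grep -rlF "$(basename "$t")" "$install_dir" >/dev/null 2>&1; then
            echo "install tree references test file: $t"
            rc=1
        fi
    done
    return $rc
}

# Toy demonstration with throwaway directories:
demo=$(mktemp -d)
mkdir -p "$demo/prefix/bin" "$demo/tests"
printf 'payload\n' > "$demo/tests/bad-3-corrupt_lzma2.xz"
printf 'loads bad-3-corrupt_lzma2.xz at build time\n' > "$demo/prefix/bin/liblzma-ish"
if audit_install_tree "$demo/prefix" "$demo/tests"; then
    result=clean
else
    result=tainted
fi
echo "$result"
rm -rf "$demo"
```

The hard part, as noted above, is step zero: deciding which files are exclusively test suite in the first place.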

tea break

Last bit. Community, sustainability, and trust.

The early free software movement (1983–1994 give or take) was, as I've heard the tales, consciously revolutionary, and, as revolutions often do, it ran on the spare time of relatively young people with time and energy to spare.

I came on the scene in 1997, right about the time it became reasonably possible to run Linux as your only desktop OS if you knew what you were doing — or, to put it another way, right about the time the original goal of the GNU Project had been achieved.

Like many other revolutions, GNU had no answer, and still doesn't, to the question: now what?

This is not the only reason the young, energetic revolutionaries of 1997 are now the exhausted maintainers of an archipelago of individual "projects" that sort of add up to a computing environment that one might fairly describe as "the worst (except for all the others)". But I think it's an important reason.

Side bar 3: In the middle 1990s someone — either Eric Raymond or Guy Steele — wrote as part of the "Portrait of J. Random Hacker" appendix to the Jargon File

> [Among hackers] racial and ethnic prejudice is notably uncommon and tends to be met with freezing contempt.

This was not true even at the time, and twenty years later ESR was cheerfully making common cause with Vox Day and the Sad Puppies.

I'm a white guy (albeit some of my grandparents weren't). I already knew how to program when I got to college. If I'd made different choices in the early 2000s, I could very well now be sitting on enough investment income to take a sabbatical and invent a new shell language.

When we look around and say "where do we find the helping hands we so desperately need?" we must recognize that part of the problem is that hacking was never as inclusive a club as we claimed.

(This sidebar is not *only* a response to the commenters who saw a name like "Jia Tan" and immediately started hating on China as a whole.)

Last thought for tonight: Riding on the time and energy of revolutionaries no longer works. Giving away stuff for free and then asking corporations to pay us *never* worked. Grants from governments and NGOs works only for the stuff you can successfully write grants for, which is almost never "five years' salary for invisible maintenance tasks." What's left?

@zwol

Who needs the maintenance? The devs?

Do we need better platforms for collecting money for developers, or even for forks? The sub culture on Twitch shows that people are willing to pay, if they're used to it.
But then only a few may end up with most of the money. And the hacker scene has already had enough problems with its stars.

We don't need more competition. It leads to "an archipelago of individual 'projects'."

Maybe the question is:
How can we federate open source software?

@zwol

Federate as in: "I am able to do this. You are able to do that. This one has that thing. If we bring everything together and work on it, we will all have what we wanted. To work together successfully, we need to agree upon a standard."

No competition. No price. No market.

But we still need people who know and can do stuff, and people who have stuff. Just without competition.

(This doesn't sound very convincing or conclusive, but I am still sending this post.)

@AdeptVeritatis Thank you for clarifying; that is essentially what I meant by "mutual aid network" ;-)
@AdeptVeritatis Good questions. It seems to me that answers may come from the space of "trade unions" and "mutual aid networks" and "community organization", but I don't think I'm the right person to spearhead that part of the solution.
@AdeptVeritatis Also, I think that a discussion *solely* in terms of "people need to get paid" is going to leave out some important bits.

@zwol
I'm not smart enough to hang here so just gonna add

> toasting in an epic bread

and if you start a patreon to fund your forth project I'm in for ten bucks

@zwol I've been trying to get governments to fund this kind of stuff without requiring people to file grant applications first, but it's tough going... I think if we could lobby for more people who understand FOSS to be appointed as state IT directors, that could help, though. It would require political organizing and lobbying, but it could be worth it.
@zwol Specifically, bird-dogging gubernatorial candidates would be one strategy. If candidates for governor in your state hold publicly-open town halls, go and ask, "What sort of qualities would you look for when nominating a state IT director?" and then grade them based on how FOSS-maintenance-friendly their answers are.

@zwol I'm hoping that a general push towards realizing that less complex code is easier for new people to contribute to helps this, but we've got a long way to go.

along with curriculums and corporate hiring practices making people think they need to know a lot more than they do to get involved with open source...

@NireBryce @zwol

>making people think they need to know a lot more than they do

I had this exact experience the first time I made a contribution: it was so easy, very much against my expectations

@LikesCalendars @NireBryce Several times while I was teaching at CMU, people who'd taken my class (sophomore level "introduction to computer systems") told me that they went in expecting to hate it and find it incomprehensible, but then they really enjoyed the experience and now they were planning to take more systems courses

I don't know if I'll ever teach again, but if I could find a less exhausting way to convey that one piece of enlightenment—that the machine is not magic and you *can* understand it—to the general public...

@zwol What's left is taxes; computers are to a large extent the public communications infrastructure, and ought to be public in the full sense.

(And even if not in the full sense, substantially. Because having it work at all depends on dreary stuff you have to pay people to do, just like clearing clogged culverts.)

@zwol I'm pretty sure that was esr.
@zwol (semi-related: earlier today, I saw a reddit thread asking "why aren't there more women in STEM?" and told my friends I wanted to comment "men in STEM.")
@maco Oof, yeah. I didn't quote it but that same page of the Jargon File has a bunch of unconscious gender essentialism as well (the usual "the wimminz just aren't *interested*" horseshit).
@zwol in CI/CD build systems the test build is separate (at least it should be). It should depend on the actual build, but the only output that's kept from the test build are the test results.
@lyda That's not enough of a firewall; the xz backdoor pulled data out of the testsuite during the normal build.