A few thoughts on Astral / OpenAI, now that the emotions have sat for a bit.

First, let me start by noting that AI is an attack on open source: inherently, by necessity, and at a structural level. That argument is bigger than Astral, but the short version is that you cannot simultaneously expand the public commons and work toward its enclosure; moreover, if the public commons does not stand for the public good, then it's not really a commons anymore.

Second, the unfortunate reality of the software and hardware industries is that funding for the public commons is nearly non-existent. To get anything done, it either has to serve the interests of some company, or has to get done by tricking leadership of one or more companies into believing that their interests are best served by expanding the public commons.

So yeah, lots of folks in the industry have to walk the line between doing good within a system and that system being extractive.

The consequence of those two facts strikes me as being that lots of people are doing good work, much of it at evil companies, and that that tension pretty much defines this motherfucking cursed industry. If your job depends on making AI numbers go up, your job depends on undermining open source. Sometimes you can malicious-compliance that into helping open source as well, hoping the two balance out in an overall helpful direction.

Point being, I'm not criticizing specific individuals here. While I think there are some specific individuals who have made this situation demonstrably worse, on purpose, and for their own personal ends, that's not germane here, and so I'll keep those specifics to myself for now.

Rather, I want to talk about exit strategies.

Because let's face it, we depend on some pretty fucked up shit in software development. Much of the shit that we depend on that isn't currently and actively fucked up is in immediate danger of becoming fucked up, a la OpenAI buying out Astral.

So it's a matter of knowing how, when you adopt a new tool or technology or whatever else, you will eventually stop using it.

Using GitHub was great back in the day. They gave OSS projects a lot of free shit that was hard to get elsewhere. But it's clear in retrospect that we needed more and better exit strategies.

The Astral buyout is a good reminder that uv came with a very sensible exit strategy almost built in: reliance on openly developed, openly published specs. But that only works because PDM exists, and that in turn only works because PEPs are collaboratively developed, and so forth.
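To make that exit strategy concrete: because project metadata is standardized (PEP 621) and builds go through a standardized interface (PEP 517), a project that sticks to those fields isn't married to any one tool. A minimal sketch, with a hypothetical project name:

```toml
# pyproject.toml -- only standardized fields, no tool-specific tables.
[project]
name = "example-project"          # hypothetical
version = "0.1.0"
requires-python = ">=3.9"
dependencies = ["requests>=2.31"]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```

uv, PDM, or plain pip can all consume this same file; if one tool gets bought out, you switch tools without touching your project definition. That's what "reliance on openly developed specs" buys you.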

@xgranade Though I thought everyone learned their lesson with sourceforge.net in 2013, when they started bundling stuff with your installer?

@pinskia There's a difference between learning your lesson and affording it. GitHub meant I didn't need to spend time managing a VCS, paying for hosting, and the like; I could get to the brass tacks of writing my project.

That's not for nothing.

@xgranade @pinskia We could've had a mostly-p2p answer to that though.

Most of the NAT hole punching that overlay networks rely on these days was possible even back then (IPv6 adoption is only vaguely better now).

Add a content-addressed layer like git-torrent and some manner of indexing, and we could've avoided this.

@lispi314 @pinskia No disagreement, but we didn't have that at the time GitHub started getting popular. For me, personally, that subsidy enabled a lot.

@xgranade @pinskia We had Gnutella/LimeWire, BitTorrent (the updatable-torrent BEPs are pretty old), I2P, and Freenet (now Hyphanet).

They were just promptly disregarded.

@lispi314 @pinskia As source control systems, though?

@xgranade @pinskia As transports and/or backing for version-controlled source artifacts, yes.

There's no real reason that wouldn't work.

Literally all that was necessary was to glue together systems that already existed.

(Or figure out some adaptable schema that could be independently implemented over each and over new things, to prevent single points of failure on the scale of entire networks.)
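For what it's worth, git maps onto content-addressed transports so directly because every git object already *is* content-addressed: its ID is just a SHA-1 over a small header plus the bytes, so any network that can fetch data by hash can serve as a backing store. A minimal sketch of that addressing scheme:

```python
import hashlib

def git_blob_id(data: bytes) -> str:
    """Compute the ID git assigns a blob: SHA-1 over "blob <len>\\0" + data."""
    header = f"blob {len(data)}\0".encode()
    return hashlib.sha1(header + data).hexdigest()

# The well-known ID for the content "hello\n", same as `git hash-object --stdin`.
print(git_blob_id(b"hello\n"))  # → ce013625030ba8dba906f756967f9e9ca394464a
```

Since the ID depends only on the content, any peer can verify what it fetched, which is exactly the property BitTorrent-style networks are built around. The glue work is mostly indexing and transport, not inventing a new data model.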

@lispi314 @pinskia Right, but I don't think it's irrational for someone starting a new project circa 2010 to use a subsidized system and get to work on their project instead of building out that glue.

My point was that what I think *was* irrational was not having an exit strategy, perhaps such as building that better system out in parallel.

@xgranade @pinskia Personally, had there been p2p options when I started, and had I known about them, I would have used them.

Unfortunately when I got started I had neither sufficient understanding nor skill to implement that myself.

And then by the time I could reasonably consider doing so, it was already far too late: no one would even deign to consider alternatives, so people fell back on paltry, far more brittle options like self-hosting.

That does tie back to the irrational part, you see: the unwillingness to merely entertain alternatives. (And the demoralization it causes for others.)

@lispi314 @pinskia Part of where I think I disagree is the separation between what's irrational for an individual under limited resources and what's irrational for a community writ large.

Like I no longer say "just use Linux," but rather "have an exit plan for when Windows or macOS are no longer options." The world would be better with more people using Linux, hands-down, and in many ways it's already more usable than Windows or macOS.

But not everyone is in a good position to challenge that.

@xgranade @pinskia Part of the reason I don't say "just use Linux" is that I'm not sure Linux won't also melt into slop.

There was also the implied assumption that distributions were sufficiently standalone to endure.

Which they aren't. And while the other UNIX-likes may be... some of them are also at slop risk.

@lispi314 @pinskia Well, there's that. systemd, Python, and who the fuck knows what else have all started melting.

I think that's extremely bad, *and also* that Windows and macOS add additional harms on top of that, such that "just use Linux" is, in some limited scenarios, still reasonable advice.

Regardless, though, I absolutely agree that Linux is in real danger of melting into slop, and fast.

@xgranade @pinskia

"the separation between what's irrational for an individual under limited resources and what's irrational for a community writ large"

Well, frequently it starts with one.

But yes, there is a difference in how much it applies, and in how it gets systematized.

That kind of lock-in is hard for some individuals to fight if their livelihoods depend on Word.

Honestly that should be considered anticompetitive collusion (and it was pretty deliberately fostered by Microsoft).

Stuff shouldn't depend exclusively on proprietary formats & interfaces.

@lispi314 @pinskia I mean, not everyone can solve every problem, though? I think of like immigration lawyers I know who work in Word and don't challenge that lock-in because they're busy doing pro bono work to rescue people from ICE.

I think both problems are critical to solve, but with differently shaped fallouts, and on different timescales... it's fair for someone to focus on one kind of injustice and not the other?

@xgranade @pinskia

"it's fair for someone to focus on one kind of injustice and not the other?"

It is. My statement is also a criticism of the legal system: that lock-in wasn't sued into oblivion years ago as the blatantly anticompetitive measure it is, because making use of the legal system is (deliberately) priced out of the reach of the majority.