@krismicinski @shriramk @gwozniak @steve @jfdm @csgordon @lindsey @jeremysiek as I've been playing around with these things the last few weeks, doing so as a practitioner who is keenly aware that a) the world is changing, b) I did not ask it to change, and c) it's changing in a way that feels bewildering given my professional experience thus far, I have a few thoughts about the externalities.
First, I think there is a lot of emotion surrounding these things. Beyond the obvious "change can be hard" thing, lots of people have invested significant time and energy over the course of many years of their (innately finite) lives building up skills that are quickly becoming irrelevant. Many people have forged their entire identities and sense of self around being programmers, or software engineers, or whatever the kids are calling it these days. For probably the first time in their lives, they're looking at automation coming for them in a serious way, and it's scary and challenging their sense of self. The fear of, "if I'm not this, what am I?" is real.
I get that. I have deep empathy for that. We _all_ should. Honestly, I'm half there myself.
There's also the sense that it's being driven by a set of out-of-touch members of the billionaire class who are pushing this relentlessly, literally flying above us in their private jets, with no thought to those of us down at the bottom who are going to get crushed under the weight of this juggernaut. We are building the pyramids ever higher for the glory of the great pharaoh. What are a few slaves crushed under giant blocks of stone along the way?
Beyond that, if this becomes a de facto necessary part of the production of software, then it seems to me that the means of production of software is going to be controlled by the very few players that can afford to field this technology. And I truly believe the actual cost is much higher than what we're being charged now; they're burning VC money to make it cheap. But what happens when that runs out? I keep hearing people talk about "we'll train our own models and run the inference engines locally!" Ok, good luck with that: that's years away. Meanwhile, Google is moving to build small nuclear reactors next to data centers, and that tells me that we're not going to see this outside of the big players any time soon.
Further, energy demands (and water! people always forget the water!) are _increasing_ as these machines get better, not decreasing. Techniques like MoE may lead to them increasing at a slower (non-exponential) rate, but it's still superlinear; eyeballing it, it looks quadratic-ish. Contrast with, say, computers themselves: compare the power draw of ENIAC versus a cell phone; the cell phone is many orders of magnitude more capable while consuming orders of magnitude _less_ power. Unless we can figure out how to make power consumption sublinear as model capability increases, I don't see how this is at all sustainable in the long term.
Then what?