Back in January I was looking around for some positive "pro-AI" analysis of the ethics of the problem <https://mastodon.social/@glyph/115908558259725802> and it looks like I finally got what I wanted: <https://types.pl/@wilbowma/116247527449271232>

I definitely don't think I'm fully convinced, but there's more than enough here to sit with for a while and consider. It's such a relief that someone is taking the ethical question *seriously* though.

William J. Bowman🇨🇦 (@[email protected])

I think if I spend any more time on this, I'll risk doing more harm than good: new blog post on "AI" and ethics. https://www.williamjbowman.com/blog/2026/03/13/against-vibes-part-2-ought-you-use-a-generative-model/


@glyph I think I disagree with almost every word in that post, but it's at least clear enough what I'm disagreeing with, which is refreshing?

I do think it's telling, though, that he frames one of the pillars of opposition to AI as an intellectual property argument rather than a labor rights argument. In fairness, he does revisit labor rights later, but I still wouldn't have thought of IP issues in genAI as being moral, per se?

@xgranade @glyph He starts to lose me in the second paragraph: 'if you look into any one of the arguments, the details are (shocking) a little more complicated'. He implies here that the other side doesn't engage with the detail and nuance, but that he does. This does not seem to set the stage for good-faith engagement. 1/3
@xgranade @glyph The ethical framework he sets out is unsophisticated. He condemns utilitarianism (the claim that this is 'the dominant ethics pretty much everywhere' is... well, a claim) and consequentialism, but his own ethical framework is itself consequentialist. 2/3
@xgranade @glyph Importantly, it entails that there can never be any ethical obligation to act for the good: the only obligation is to avoid doing bad. It is perfectly ethical to passively allow the world to slide into a worse state. Much of the rest of what he writes can be rejected on this basis alone. 3/3