the frustrating thing is that literally anyone who has thought about language and technology for fifteen consecutive seconds could have told you that autocomplete and other writing tools influence beliefs (and *have* been telling you this, over and over, for decades). the other frustrating thing is that slop-pushers *brag about their ability to do this* and right-wing actors are actively exploiting it, but in polite company everyone pretends that's not the case https://mathstodon.xyz/@gregeganSF/116219772468880168
Greg Egan (@[email protected])

“AI-powered writing tools are increasingly integrated into our e-mails and phones. Now a new study finds biased AI suggestions can sway users’ beliefs” “We told people before, and after, to be careful, that the AI is going to be (or was) biased, and nothing helped,” Naaman said. “Their attitudes about the issues still shifted.” https://www.scientificamerican.com/article/ai-autocomplete-doesnt-just-change-how-you-write-it-changes-how-you-think/

@aparrish I think the "search engine" set the stage for this: the erroneous idea that you could search and find *an answer* (when in truth you found ... something).

Now people use "AI" just like a search engine, and accept the results by default.

@aparrish Personal experience: I write a blog about local issues, and almost nobody else writes about these things. Ask an AI a related question and __you get my answer!__

And I've sat in meetings where someone will "ask AI" and then proceed to read out my answer. It's accepted as authoritative — after all, it came from AI! That's great for me and my little cause, I guess. But also: yikes!