I’m working on an AI policy for my org that allows us to opt out of AI note-taking and prohibits AI in our comms/storytelling. Here is my list of reasons for the policy, but my board is asking me to cite sources. Can you help me with any good references you would cite for any of these? (Or an edit or restatement where I’ve gotten something wrong or inaccurate?)

*If you want to argue about why I shouldn’t have this policy, kindly crawl into a hole in the ground and cover yourself with soil.

@seachanger I probably do, but I’d need to do some cross-referencing I can’t do at the moment.

https://ai-sucks-actually.fyi/

AI Sucks, Actually

that's it, that's the thesis

[…] Even if the accuracy problems were solved, and AI-generated summaries reliably captured all the essential points of a text, it would still be a bad idea to use them. Creating your own summaries is a crucial step in any literature study. When you read and summarize a text, you create the neural connections necessary to memorize and apply the information well in an exam, experiment, or research paper. Generating it with a click is a harmful form of cognitive offloading and will erode these skills. Writing it yourself will reveal the nuances of an academic text and allow you to register those elements that you deem essential to whatever you are working on. 

https://www.tue.nl/en/our-university/library/library-news/24-02-2026-are-ai-generated-summaries-suitable-for-studying-and-research

@darby3

#theaicon #aihype #llm

Are AI-generated summaries suitable for studying and research?

Despite didactic, ethical, and environmental concerns, the use of GenAI is on the rise in academia. For most applications, the jury is still out on whether and how they will benefit education and research in the long term. But it’s already safe to conclude that one popular use case is, in fact, a bad one: AI-generated summaries.

@oatmeal @darby3 Yes, if we don't use our skills, they will degrade.