I swear to god I’m so fucking sick of AI
@danirabbit I am deeply concerned that companies wielding Large Language Models (LLMs) are siphoning away the collective intelligence of humanity. Every conversation we have, every creative spark we discuss, is ultimately fed into these models—word by word, delivered directly to these tech giants. Yet, these corporations never truly commit to not retaining your ideas. They are, in essence, absorbing your creativity. Is this logic sound? Is this our current reality?
@LucasAegis @danirabbit absorbing creativity implies that it is a finite resource which depletes when someone... experiences it? which is obviously wrong :p

as bad as LLMs and tech companies are, humans won't stop doing creative stuff just because someone or something takes in those ideas, people make creative things for unrelated reasons tbh
@froge @danirabbit You missed my point. I’m not saying human creativity is a finite resource that 'depletes.' I’m saying that when you interface with a cloud-based LLM, your specific, private ideas are leaked and harvested without consent or compensation. The tech giants provide zero guarantees of confidentiality. It’s not about creativity running out; it’s about the systemic extraction of individual intellectual property. My spark doesn't die, but the company shouldn't get to own it just because I used their tool.
@LucasAegis @danirabbit oh yeah, that's fair, people really should treat them like shouting your ideas in public where they'll get recorded and reused forever, dealing with a cloud LLM is basically packaging your thoughts into a product for the company

people who want to keep their ideas private just really shouldn't use those systems at all tbh
@froge @danirabbit Technology should be a tool for humans, not a vacuum for our thoughts. The problem isn’t the AI itself; it’s the lack of oversight. Every major industry—aviation, medicine, construction—operates under strict global safety standards. AI currently has no equivalent. We are letting these models grow ‘wild’ because of a geopolitical arms race. We shouldn’t have to choose between ‘using AI’ and ‘keeping our privacy.’ The industry needs to mature into a regulated infrastructure.