
Communities across the United States are pushing back against resource-draining data centers being built to fuel artificial intelligence and crypto ventures. In Maine, state legislators recently passed a first-in-the-country statewide moratorium on large data centers. “Maine residents are concerned about the impacts of data centers on both their electric rates and other utility rates, as well as on our wonderful environment,” says Democratic state Representative Melanie Sachs, who sponsored the bill designed to give legislators time to develop regulations around new data center construction. Sachs says developers have been operating in “complete secrecy,” refusing to engage with community stakeholders, while their plans appear to provide “limited economic opportunity with very few local jobs.” The bill goes to Maine Governor Janet Mills’s desk next.

As tech companies scramble to build massive new data centers to power artificial intelligence, marginalized communities are bearing the brunt of the environmental harms. In Memphis, Tennessee, Elon Musk’s xAI operates over two dozen methane gas-burning turbines without legal permits to power its data centers, Colossus 1 and Colossus 2, polluting the nation’s largest majority-Black city with toxic emissions. The NAACP is suing xAI for violating the Clean Air Act. “We are, unfortunately, a cautionary tale about what will and possibly can happen if you don’t have the right rules and guardrails in place,” says KeShaun Pearson, the executive director of Memphis Community Against Pollution. Pearson says pollution from xAI’s energy generation is already “at a level even higher than our Memphis International Airport.” Meanwhile, the company has created far fewer jobs than it initially promised. “This has been terrible for our region, and it’s terrible for our future, because our community is going to continue to suffer. Our children have the highest rate of ER visits for respiratory illnesses and issues in the state of Tennessee, and it’s only going to continue to get worse.”
In a world where AI porn (or other bad things) can be created and posted in minutes...
Where algorithms boost it to thousands (or millions) of people...
Who can then repost and share in a click...
Across a half-dozen major tech platforms...
Who really thinks that a 48-hour takedown (after a user report) is a solution?
Imagine applying that safety approach to cars or meds.
#RegulateTech
Sure, but requiring users to monitor the entire internet to report crimes is still a horrifically bad approach to online safety.
The platforms themselves must bear the burden of monitoring (and acting).
#RegulateTech
Nick Robinson is perfectly correct on the threat from partisan news.
Look what Fox, Sinclair and Limbaugh did to America. Look what GB News and Murdoch's papers are doing to the UK.
News: Meta warns of "worse" user experience because of enshittification.
..oh wait, no, they're blaming regulators despite decades of making their own products worse to juice profits.
BBC News - Meta warns of 'worse' experience for European users
https://www.bbc.com/news/articles/czd3mey1ej2o
100% agree!
The tech bros' disinfo machine is democracy's greatest threat. Europe absolutely must #RegulateTech algorithms to stop radicalisation. It can start by banning Twitter and Truth Social.
And while I agree with your entire thread, may I suggest you review the verbs used when discussing America and Trump. The threat is already here. It is not "increasing" or "becoming". There is no time for delay. We must start acting on the threat now.