Cloudflare announces AI Labyrinth, which uses AI-generated content to confuse and waste the resources of AI Crawlers and bots that ignore “no crawl” directives.
Thank you for catching that. Even reading through again, I couldn’t find it while skimming. With the mention of X2 and RSS, I assumed that paragraph would just be more technical description outside my knowledge. Instead, what I did home in on was
“No real human would go four links deep into a maze of AI-generated nonsense.”
Leading me to be pessimistic.
If you’re dumb enough and care little enough about the truth, I’m not really going to try coming at you with rationality and sense. I’m down to do an accelerationism here. fuck it. burn it down.
remember: these companies all run at a loss. if we can hold them off for a while, they’ll stop getting so much investment.
that’s the entire point of laws, though, and it was already being used for that.
giving the laws better law stuff will not improve them. the law is malevolent. you cannot fix it by offering to help.
They aren’t poisoning the data with disinformation.
They’re poisoning it with accurate, but irrelevant information.
For example, if a bot is crawling sites relating to computer programming, or weather, this tool might lure the crawler into pages related to animal facts, or human biology.
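The idea described above can be sketched in a few lines. This is a hypothetical illustration of the concept, not Cloudflare's actual implementation: the site topic, the decoy facts, and the `page_for` function are all made up for the example. The point is that the decoy content is true (octopuses really do have three hearts), just irrelevant to what the crawler was after.

```python
# A toy sketch of the "accurate but irrelevant" decoy idea.
# Names and topics here are illustrative assumptions, not Cloudflare's code.

SITE_TOPIC = "programming"

# Real facts, just unrelated to the site's subject: poisoning with truth,
# not disinformation.
DECOY_FACTS = {
    "animal facts": "Octopuses have three hearts.",
    "human biology": "The adult human body has 206 bones.",
}

def page_for(visitor_is_suspected_bot: bool) -> str:
    """Serve the real page to people, a decoy page to suspected bots."""
    if not visitor_is_suspected_bot:
        return f"Real article about {SITE_TOPIC}"
    # Steer the bot toward any topic unrelated to the site's subject.
    topic, fact = next(iter(DECOY_FACTS.items()))
    return f"Decoy page on {topic}: {fact}"

print(page_for(False))  # a human gets the real content
print(page_for(True))   # a bot wanders into accurate but irrelevant facts
```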
while allowing legitimate users and verified crawlers to browse normally.
What is a “verified crawler” though? What worries me is: is it only big companies like Google that are allowed to have them now?
Any accessibility service will also see the “hidden links”, and while a blind person with a screen reader will notice if they wander off into generated pages, it will waste their time too.
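For what it’s worth, the web platform does have standard markup for exactly this case, and a careful deployment could use it; whether Cloudflare’s actual implementation does is not something I can confirm, so treat this as a sketch of what’s possible. `aria-hidden="true"` asks assistive technology to ignore an element, `tabindex="-1"` keeps it out of keyboard navigation, and `rel="nofollow"` tells well-behaved crawlers not to follow the link; scrapers that ignore such directives would still fall in. The `decoy_link` helper and the URL are hypothetical.

```python
def decoy_link(href: str, label: str) -> str:
    """Build a decoy link marked so screen readers and polite crawlers skip it.

    aria-hidden="true" asks assistive tech to ignore the element,
    tabindex="-1" removes it from keyboard focus order, and
    rel="nofollow" asks compliant crawlers not to follow it.
    Misbehaving scrapers that ignore these signals still take the bait.
    """
    return (f'<a href="{href}" rel="nofollow" aria-hidden="true" '
            f'tabindex="-1">{label}</a>')

print(decoy_link("/maze/animal-facts", "More reading"))
```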
Also, I don’t know about you, but I absolutely have a use for crawling X, Google maps, Reddit, YouTube, and getting information from there without interacting with the service myself.
yeah. it’s pretty fucked. hopefully it’s temporary.
so do we make everything inaccessible to everyone, or just inaccessible to disabled people? we don’t have a way to include them yet. we should work on it, but we are not the ones who fucked accessibility.
yeah. search engine web crawlers are a public service. they are responsible. but we are in a conflict. we must struggle tooth and nail against capital for every nice thing.
He is not wrong. Unless people start to take steps, this dependency on tech will be used to chain most of us. Granted, these chains will be the kindest and gentlest chains seen in a long time.
Social revolution lives on in decentralized services, like this; the true battles will be later though. This year is a mild warm up. I can’t imagine the challenges that await many
so you’re saying it’s a niche toy you can get if you really want one, but nobody’s pushing that shit on you, and if you never want to talk about one again in your life, you can probably do that?
I would like it if large language models were in this position. I don’t think you understand the degree to which our productive capacity and infrastructure are committed to this technology. it’s a lot. basically all the cutting-edge computer chips being made are specialist chips for processing large (whatever) models, along with the engineering to back them. that means everything else is bumped down a generation or five.
then there’s the amount of electricity being put towards these things. we are in a climate disaster, we do not have green energy, and these things are drawing energy in the high-GW/low-TW range.
water is also a lot more precarious than a lot of people want to think about. these things use lots and lots of good drinkable water to cool those specialist chips, and then it’s just thrown out, because I assume that’s cheaper than cooling the hot water back down.
It’s difficult to imagine a group of people voluntarily amassing and then using the resources necessary for “AI” absent the desire to cash in on their investment.
I mean Dmitry Pospelov was arguing for AI control in the Soviet Union clear back in the 70s.