OK here's a theory: #ChatGPT's #Atlas browser is not really a browser but in fact a way for OpenAI to circumvent scrape blockers. It's a distributed, human-powered scraper more than anything else.
Given how widely loathed AI is and how damaging AI scrapers have become, #OpenAI's IP ranges have ended up on quite a lot of block lists, and many servers outright terminate any connection from them. Then there are things like #Anubis or #Iocaine that further frustrate #LLM scraping.
But what if you DIDN'T need to worry about any of that? What if you could use civilian IP addresses with "organic" traffic patterns, and have humans solve captchas, provide proof of work for Anubis, or get around Iocaine? All this for free -- you don't even need to pay people for it.
I would be REALLY interested to see what telemetry Atlas sends back. 100% certain it sends back things like the URL and rendered HTML output, possibly also user interaction patterns ("a normal human on this website moves their mouse first to the 'I am not a bot' captcha, then clicks it"). They don't have to respect robots.txt because, well, the traffic comes from organic visitors...
Am I crazy?
EDIT: And what do you know?! I was correct. https://tldr.nettime.org/@remixtures/115419472139725665