There's been a lot of chatter over the past day about whether platforms should slow down the release of large language models. I'm persuaded that we should:
https://www.platformer.news/p/the-ai-industry-really-should-slow
@caseynewton @DAIR put it well today, too
https://www.dair-institute.org/blog/letter-statement-March2023

On GPT-3: meta-learning, scaling, implications, and deep theory. The scaling hypothesis: neural nets absorb data & compute, generalizing and becoming more Bayesian as problems get harder, manifesting new abilities even at scales that are trivial by global standards. The deep learning revolution has begun as foretold.