The AI feedback loop: Researchers warn of ‘model collapse’ as AI trains on AI-generated content
As far as I know, that technique is mainly used so a bigger, better model can generate training data for a smaller, more efficient model, bringing it a bit closer to the big model's level.
Have there been any cases of an already state-of-the-art model using this method to improve itself?
Sorta. This “model collapse” thing is basically an urban legend at this point.
The kernel of truth is this: a model learns stuff. When you use that model to generate training data, it will not output everything it has learned, so the second-generation model will know less than the first. Repeat this a few times and you are left with nothing. But that setup assumes each generation trains purely on the previous generation's output; it's hard to see how this would become a problem in the real world, where training sets are never wholesale replaced by a single model's outputs.
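The pure-feedback setting can be illustrated with a toy model (a sketch I'm adding, not anything from the article): "train" a Gaussian on some data by fitting its mean and standard deviation, sample a new dataset from the fit, and repeat. Each generation tends to lose a little spread, so the distribution's diversity drains away over many iterations, which is the shrinking-knowledge effect described above in miniature.

```python
import random
import statistics

random.seed(0)

def retrain(samples):
    # "Train" a toy model: fit a Gaussian (mean, stdev) to the data,
    # then generate the next generation's training set by sampling
    # only from that fitted model -- no fresh outside data is mixed in.
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return [random.gauss(mu, sigma) for _ in samples]

# Generation 0: diverse "human" data from a standard normal.
data = [random.gauss(0.0, 1.0) for _ in range(20)]
stdevs = [statistics.stdev(data)]

# Each generation sees only the previous generation's samples.
for _ in range(2000):
    data = retrain(data)
    stdevs.append(statistics.stdev(data))

print(f"generation 0 stdev:    {stdevs[0]:.4f}")
print(f"generation 2000 stdev: {stdevs[-1]:.4f}")
```

The fitted standard deviation follows a random walk with a slight downward drift (the sample estimate is biased low), so after enough generations the spread collapses toward zero. The moment you mix fresh real data back in each round, this drift largely disappears, which is why the all-synthetic assumption matters.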
Inbreeding is a good analogy if you know what the actual problem with it is: you lose genetic diversity. Still, breeders deliberately use it to fix desired traits, and so does nature (genetic bottlenecks, the founder effect).