If you want a rigorous analysis of why statistical #AI models collapse when recursively trained on their own data, without external supervision or constraints, read this amazing paper from last year.
If you want a visual intuition for what model collapse looks like, watch this video.
When an AI stares at its own reflection for too long, and its inference is rooted purely in statistics rather than reasoning, collapse becomes statistically inevitable.
Keep this in mind whenever you hear someone talking about “AI models learning from their own outputs” without addressing the statistical parrot issue.
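For intuition, here is a minimal Python sketch of the mechanism (my own toy reconstruction, not the paper's code): the simplest case the paper analyzes is a one-dimensional Gaussian refit by maximum likelihood, generation after generation, on samples drawn from the previous generation's fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: the "real" data distribution, a standard normal.
mu, sigma = 0.0, 1.0
n = 20  # samples per generation; a small n makes the drift visible quickly

for gen in range(1, 51):
    # Each generation sees only data generated by the previous model...
    samples = rng.normal(mu, sigma, size=n)
    # ...and "training" is just a maximum-likelihood Gaussian fit.
    mu, sigma = samples.mean(), samples.std()
    if gen % 10 == 0:
        print(f"gen {gen:2d}: mu = {mu:+.3f}, sigma = {sigma:.3f}")
```

The mean random-walks while the fitted sigma shrinks toward zero: every refit loses a little tail mass to finite sampling, and without fresh real data there is nothing to restore it. A larger n slows the drift but does not stop it, which matches the paper's Gaussian analysis.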
AI models collapse when trained on recursively generated data - Nature
Analysis shows that indiscriminately training generative artificial intelligence on real and generated content, usually done by scraping data from the Internet, can lead to a collapse in the ability of the models to generate diverse, high-quality output.
