Model collapse, also termed "data cannibalism," describes a degradation in AI model performance that occurs when models are trained repeatedly on synthetic data generated by other AI systems rather than on novel human-created data. The resulting feedback loop of AI-generated training data erodes accuracy and can drive models toward nonsensical outputs.
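The feedback loop described above can be illustrated with a minimal, deterministic sketch of one collapse mechanism: each "generation" trains on the previous generation's output, but common decoding strategies drop low-probability tokens, so the tail of the distribution is lost for good. The 20-token Zipf-like vocabulary and the 0.05 cutoff below are illustrative assumptions, not details from the source.

```python
import math

def truncate_and_renormalize(probs, cutoff=0.05):
    """Zero out tokens below `cutoff` (as truncated sampling would),
    then renormalize -- mimicking training on the filtered output."""
    kept = [p if p >= cutoff else 0.0 for p in probs]
    total = sum(kept)
    return [p / total for p in kept]

def entropy(probs):
    """Shannon entropy in bits: a simple proxy for output diversity."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Generation 0: a Zipf-like "human" distribution over 20 tokens.
probs = [1.0 / (i + 1) for i in range(20)]
total = sum(probs)
probs = [p / total for p in probs]

support, ent = [], []
for gen in range(5):
    support.append(sum(1 for p in probs if p > 0))
    ent.append(entropy(probs))
    # Next generation trains only on the previous model's truncated output.
    probs = truncate_and_renormalize(probs)

print(support)                       # vocabulary (support) shrinks, then stays shrunk
print([round(e, 3) for e in ent])    # diversity (entropy) declines
```

Running this, the support collapses from 20 tokens to a small head of the distribution after a single generation and never recovers, and entropy drops accordingly: once rare tokens are filtered out of the training data, no later generation can reintroduce them, which is the core of the degradation the summary describes.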
Summary written by gemini-2.5-flash-lite from 1 source.
IMPACT Repeated training on AI-generated data can lead to model performance degradation, impacting the reliability and accuracy of future AI systems.
RANK_REASON The cluster describes a phenomenon related to AI training data, but does not announce a new model, research paper, or product release.