Semantic Drift in Recursive Systems Is a Rate Problem
Semantic drift occurs when models trained recursively on their own outputs, or on heavily endogenous data, gradually lose meaning, coherence, or reliability. The concept appears in discussions of synthetic-data contamination, self-training loops, and long-term self-improvement. This article proposes understanding semantic drift as a rate problem. Recursive learning...
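The self-training loop described above can be illustrated with a toy simulation (my sketch, not from the article): repeatedly fit a Gaussian to a finite sample of the previous generation's own output, then sample the next generation from the fitted model. Finite-sample estimation error compounds generation after generation, and how fast it compounds is exactly the kind of rate the article's framing points at. All names and parameters here are hypothetical illustrations.

```python
import random
import statistics

def self_training_drift(generations=2000, sample_size=20, seed=0):
    """Toy recursive loop (illustrative, not the article's method):
    each generation, fit a Gaussian to a finite sample drawn from the
    previous generation's fitted model, then use the new fit as the
    next data source. Estimation error compounds, so the fitted std
    drifts away from the true value (1.0) over generations."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0          # the original "real" data distribution
    stds = [sigma]
    for _ in range(generations):
        # Sample endogenous data from the current fitted model.
        sample = [rng.gauss(mu, sigma) for _ in range(sample_size)]
        # Refit from the finite sample; the fit inherits sampling noise.
        mu = statistics.fmean(sample)
        sigma = statistics.stdev(sample)
        stds.append(sigma)
    return stds

stds = self_training_drift()
```

Plotting `stds` against generation number shows the compounding: the smaller `sample_size` is, the faster the fitted distribution wanders from the original one, which is why drift is naturally described as a rate rather than a binary failure.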