AI can learn directly from the passage of time at unlimited scale—no human annotation required.
Time provides free supervision. Humans learn from experience with no labels—we constantly form expectations about the world, notice when we're wrong, and update our models accordingly.
"Future-as-Label" teaches AI to learn the same way. The passage of time provides labels that require no annotation.
This unlocks effectively unlimited training data from real-world data streams, with no human annotation bottleneck.
Here we apply this to historical news streams, then evaluate the trained model on public Metaculus questions.
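The core idea can be sketched in a few lines. This is an illustrative sketch, not the paper's actual pipeline: the names (`Example`, `make_training_example`) and fields are hypothetical. The key property is that the label is read off from what actually happened after the question date, so no annotator is needed.

```python
# Hypothetical sketch of "Future-as-Label": the label for a forecasting
# question posed at time t is simply the outcome observed at t + horizon.
# All names here are illustrative, not taken from the paper.
from dataclasses import dataclass

@dataclass
class Example:
    question: str     # e.g. "Will X happen by <date>?"
    asked_at: str     # only information available before this date may be used
    resolved_at: str  # the later date that supplies the label
    label: int        # 1 if the event happened by resolved_at, else 0

def make_training_example(news_item, asked_at, resolved_at, happened):
    """Turn a historical news item into a labeled forecasting example.
    No human annotation: `happened` is recovered from later events."""
    return Example(
        question=f"Will the event described here occur? {news_item}",
        asked_at=asked_at,
        resolved_at=resolved_at,
        label=int(happened),
    )
```

Because the corpus is historical, the "future" is already known at training time, so labels come for free from the stream itself.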
Result: fine-tuning with "Future-as-Label" reduces Brier score by 27% relative to the base model (Qwen3-32B) and halves calibration error, even outperforming a 7× larger model (Qwen3-235B) of the same generation.
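For readers unfamiliar with the metric: the Brier score is the mean squared error between predicted probabilities and binary outcomes, so lower is better and a 27% reduction is a large gain. A minimal sketch (not from the paper):

```python
# Brier score for probabilistic binary forecasts: mean squared error
# between predicted probabilities and 0/1 outcomes. Lower is better;
# always predicting 0.5 scores 0.25.
def brier_score(probs, outcomes):
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

forecasts = [0.9, 0.2, 0.7, 0.4]  # model's predicted probabilities
outcomes  = [1,   0,   1,   0  ]  # what actually happened ("future as label")
print(brier_score(forecasts, outcomes))  # ≈ 0.075
```

A perfectly calibrated and confident forecaster scores 0; the reported calibration error additionally measures how closely stated probabilities match observed frequencies.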
Full paper: https://arxiv.org/abs/2601.06336