Abstract
We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor $M$ from the true distribution $\mu$ by the algorithmic complexity of $\mu$. Here we assume we are at a time $t > 1$ and have already observed $x = x_1 \ldots x_t$. We bound the future prediction performance on $x_{t+1} x_{t+2} \ldots$ by a new variant of the algorithmic complexity of $\mu$ given $x$, plus the complexity of the randomness deficiency of $x$. The new complexity is monotone in its condition, in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
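For context, the classical Solomonoff bound referenced above is usually stated, for binary alphabets, roughly as follows. This is a standard textbook formulation rather than this paper's own statement; the exact constant and the choice of complexity variant (here prefix complexity $K$) differ across presentations:

```latex
% Solomonoff's total-deviation bound (standard binary-alphabet form).
% Expectation is over sequences x_{<t} drawn from the true measure \mu;
% K(\mu) is the prefix complexity of (a program computing) \mu.
\sum_{t=1}^{\infty} \mathbf{E}_{\mu}\Big[\big(M(x_t{=}1 \mid x_{<t}) - \mu(x_t{=}1 \mid x_{<t})\big)^{2}\Big]
  \;\le\; \tfrac{\ln 2}{2}\, K(\mu)
```

The abstract's contribution is a posterior analogue: the corresponding tail sum from $t+1$ onward is bounded by a monotone conditional complexity of $\mu$ given the observed prefix $x$, plus the complexity of the randomness deficiency of $x$; the precise statement is given in the paper.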
Original language | English (US)
---|---
State | Published - 2006
Event | Kolmogorov Complexity and Applications 2006 - Wadern, Germany
Duration | Jan 29, 2006 → Feb 3, 2006
Conference
Conference | Kolmogorov Complexity and Applications 2006
---|---
Country/Territory | Germany
City | Wadern
Period | 01/29/06 → 02/03/06
Keywords
- future loss
- Kolmogorov complexity
- monotone conditional complexity
- online sequential prediction
- posterior bounds
- randomness deficiency
- Solomonoff prior
- total error
ASJC Scopus subject areas
- Software
- Hardware and Architecture
- Control and Systems Engineering