Algorithmic probability
Around 1960, Ray Solomonoff invented the concept of algorithmic probability. Take a universal computer and feed it a random input program, for example by flipping an unbiased coin whenever the machine asks for another bit. The program will compute some output, possibly an infinite one.
The algorithmic probability of a given finite output prefix q is the sum of the probabilities of all programs that compute an output starting with q. So certain long objects can still have high probability, namely those computed by short programs.
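Written out, with U a universal prefix machine and program length measured in bits (this notation is assumed here; the text above fixes none), each coin-flip program p has probability 2^{-\ell(p)}, so the definition reads:

```latex
M(q) \;=\; \sum_{p \,:\, U(p)\ \text{starts with}\ q} 2^{-\ell(p)}
```

A minimal Monte Carlo sketch of the same idea, assuming a deliberately tiny toy machine (its opcodes and all parameter values below are invented for illustration; it is not a real universal computer):

```python
import random

def run_toy_machine(bits, max_steps=64):
    """Interpret a bit list on a toy machine. Opcodes, 2 bits each:
    00 -> emit 0, 01 -> emit 1, 10 -> double the output so far, 11 -> halt."""
    out, i, steps = [], 0, 0
    while i + 1 < len(bits) and steps < max_steps and len(out) < 4096:
        op = (bits[i], bits[i + 1])
        i, steps = i + 2, steps + 1
        if op == (0, 0):
            out.append(0)
        elif op == (0, 1):
            out.append(1)
        elif op == (1, 0):
            out.extend(out)   # one cheap opcode can double a long output
        else:
            break             # halt
    return out, steps

def estimate_alp(prefix, samples=100_000, program_len=16):
    """Monte Carlo estimate of the probability that the output of a
    random coin-flip program starts with `prefix` (noisy for rare prefixes)."""
    hits = 0
    for _ in range(samples):
        program = [random.getrandbits(1) for _ in range(program_len)]
        out, _ = run_toy_machine(program)
        if out[:len(prefix)] == prefix:
            hits += 1
    return hits / samples

# The doubling opcode gives a run of zeros very short programs, so it
# scores far higher than an irregular string of the same length, which
# essentially has to be spelled out bit by bit:
print(estimate_alp([0] * 8))                   # roughly a few times 10^-3
print(estimate_alp([0, 1, 1, 0, 1, 0, 0, 1]))  # roughly 10^-5, often 0 hits
```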
Algorithmic probability is the main ingredient of Ray Solomonoff's theory of inductive inference, the theory of prediction based on observations. Given a sequence of symbols, which symbol will come next? Solomonoff's theory provides an answer that is optimal in a certain sense: if the observed sequence is drawn from any computable probability distribution, the predictions converge rapidly to the true conditional probabilities. The predictor itself, however, is incomputable. Unlike Karl Popper's informal theory of induction, Solomonoff's is mathematically sound.
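Concretely, Solomonoff's predictor scores a candidate continuation x of the observed sequence q by the conditional probability

```latex
M(x \mid q) \;=\; \frac{M(qx)}{M(q)}
```

so continuations that keep the sequence compressible are preferred. With the toy estimator sketched above (again only an illustration, and very noisy for longer sequences):

```python
# After observing q = 0000, compare the two possible next bits.
# The ratios need not sum to 1: some programs halt after printing
# exactly q, which is why M is only a semimeasure.
q = [0, 0, 0, 0]
m_q = estimate_alp(q)
print(estimate_alp(q + [0]) / m_q)  # large: the regular continuation
print(estimate_alp(q + [1]) / m_q)  # much smaller
```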
Algorithmic probability is closely related to the concept of Kolmogorov complexity. The Kolmogorov complexity of a computable object is the length of the shortest program that computes it. The invariance theorem shows that the choice of universal computer matters little: switching machines changes the complexity of every object by at most an additive constant.
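In the same notation as before, with V any other machine:

```latex
K(x) \;=\; \min\{\ell(p) : U(p) = x\}, \qquad
K_U(x) \;\le\; K_V(x) + c_{U,V} \quad \text{for all } x
```

The constant c_{U,V} is essentially the length of an interpreter for V written for U and does not depend on x. On the finite toy machine above, the analogue of K can even be found by exhaustive search (illustration only; the real Kolmogorov complexity is incomputable):

```python
from itertools import product

def toy_complexity(target, max_len=16):
    """Length in bits of the shortest toy-machine program whose output
    is exactly `target`; None if no program up to max_len bits works."""
    for n in range(2, max_len + 1, 2):        # programs use 2-bit opcodes
        for bits in product((0, 1), repeat=n):
            out, _ = run_toy_machine(list(bits))
            if out == target:
                return n
    return None

print(toy_complexity([0] * 8))                   # 8: emit 0, then double three times
print(toy_complexity([0, 1, 1, 0, 1, 0, 0, 1]))  # 16: has to be spelled out
```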
Solomonoff's enumerable measure is universal in a powerful sense: it dominates every enumerable semimeasure up to a multiplicative constant. It ignores computation time, however. The book by Ming Li and Paul Vitanyi includes material about time-bounded algorithmic probability measures. Juergen Schmidhuber's Speed Prior is instead based on the fastest way of computing objects, not the shortest program.
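To make the time dependence concrete, here is a time-discounted variant of the toy estimator, only loosely in the spirit of the Speed Prior (Schmidhuber defines it via his FAST search algorithm, not via the simple 1/steps penalty assumed here): a matching program now contributes 1/steps instead of 1, so fast programs carry more weight.

```python
def estimate_speedish(prefix, samples=100_000, program_len=16):
    """Like estimate_alp, but each program that matches `prefix`
    contributes 1/steps, penalizing slow computations.
    An illustrative discount, NOT Schmidhuber's exact construction."""
    total = 0.0
    for _ in range(samples):
        program = [random.getrandbits(1) for _ in range(program_len)]
        out, steps = run_toy_machine(program)
        if steps and out[:len(prefix)] == prefix:
            total += 1.0 / steps
    return total / samples

print(estimate_speedish([0] * 8))  # zeros still win: they are fast AND short
```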