Imagine a bingo draw, where numbered balls are picked randomly. If you want to reproduce this process with a computer, you will find that it is not as easy as you might think. Computing is completely deterministic by nature, and reproducing or simulating naturally random processes is a particularly delicate matter.
The problem is how to use a computer to generate random numbers. In fact, this is impossible! We can program a computer to generate a sequence of numbers following a certain law. Although the output values of this sequence might look random (according to criteria that we will discuss in this section), the existence of a deterministic law behind them tells us that, precisely, these are not random numbers at all!
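As a concrete illustration, the sketch below implements one such deterministic law, a linear congruential generator; the multiplier, increment, and modulus are the classic Numerical Recipes values, chosen here purely for illustration.

\begin{verbatim}
# Minimal linear congruential generator (LCG):
#   x_{n+1} = (a*x_n + c) mod m
# The constants a, c, m are the classic Numerical Recipes values,
# used here purely for illustration.

def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Yield n pseudo-random integers in [0, m) from a fixed seed."""
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        yield x

# The same seed always reproduces the same "random" sequence:
print(list(lcg(seed=42, n=5)))
print(list(lcg(seed=42, n=5)))  # identical output -- fully deterministic
\end{verbatim}

Running it twice with the same seed produces the identical sequence, which is exactly what betrays its deterministic origin.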
We define a sequence of numbers as ``random'' if there are no correlations among the numbers in the sequence. A random sequence can still follow a distribution, i.e.\ the probability of a given number appearing in the sequence corresponds to some distribution. If the distribution is uniform, all numbers are equally likely to appear. Mathematically, the likelihood of a number $x$ occurring is described by a distribution function $P(x)$: the probability of finding $x$ in the interval $[x, x+dx]$ is given by $P(x)\,dx$. The usual random number generators provided by compilers or libraries produce a uniform distribution between 0 and 1, i.e.\ $P(x)=1$ for $0 \le x < 1$ and $P(x)=0$ otherwise. Ideally, these numbers occur with equal probability, and each one is independent of the previous one.
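As a quick numerical check of this uniformity (a sketch using Python's standard \texttt{random} module; any library generator of this kind would do), we can compare the empirical mean and variance of a large sample with the exact values for $P(x)=1$ on $[0,1)$, namely $1/2$ and $1/12$:

\begin{verbatim}
import random

# Draw N samples from the library's uniform generator on [0, 1)
# and compare the empirical moments with the exact ones for P(x) = 1:
# mean = 1/2, variance = 1/12.
N = 100_000
samples = [random.random() for _ in range(N)]

mean = sum(samples) / N
var = sum((x - mean) ** 2 for x in samples) / N

print(f"mean     = {mean:.4f}  (exact: 0.5)")
print(f"variance = {var:.4f}  (exact: {1/12:.4f})")
\end{verbatim}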
In the computer, the sequences are ``pseudo-random'': knowing a number $x_n$ and the preceding $x_{n-1}$, we can predict the next one, $x_{n+1}$. This is evident in the correlations among the numbers. Some sophisticated pseudo-random number generators do a good job of hiding this fact from our eyes, but if you look hard enough, you will eventually figure it out.
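A classic demonstration of such hidden correlations (a sketch using the infamous RANDU generator, a standard cautionary example rather than any generator discussed above) is that every three consecutive RANDU outputs satisfy an exact linear relation:

\begin{verbatim}
# The infamous RANDU generator, x_{n+1} = 65539 * x_n mod 2^31,
# hides its determinism poorly: since 65539 = 2^16 + 3, every three
# consecutive outputs obey x_{n+2} = 6*x_{n+1} - 9*x_n (mod 2^31).
m = 2**31
x = 1  # RANDU requires an odd seed
seq = []
for _ in range(1000):
    x = (65539 * x) % m
    seq.append(x)

# Verify the hidden correlation on every consecutive triple:
assert all((seq[i + 2] - 6 * seq[i + 1] + 9 * seq[i]) % m == 0
           for i in range(len(seq) - 2))
print("All triples satisfy x_{n+2} = 6 x_{n+1} - 9 x_n (mod 2^31)")
\end{verbatim}

Plotted in three dimensions, consecutive triples $(x_n, x_{n+1}, x_{n+2})$ from RANDU fall on just 15 planes, which is how this defect became widely known.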