Abstract:
In this lecture, we briefly discuss the role of probability in computing. Probability enters when some information about a system is unknown to us. The simplest probabilistic system is a uniformly random bit. Using deterministic operations together with a supply of independent random bits, arbitrary probabilistic operations can be realized. The Shannon entropy $H(p) = -\sum_i p_i \log_2 p_i$ of a probability distribution $p$ measures the randomness of this mixed state. This quantity has an operational meaning: $m$ copies of the distribution $p$ can be prepared by deterministic transformations using $\approx m H(p)$ random bits; conversely, $n$ copies of the distribution $p$ can be compressed to $\approx n H(p)$ random bits. A language (decision problem) is efficiently solvable by probabilistic circuits if there exists a uniform family of polynomial-size probabilistic circuits that decides it with high probability.
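
To make the claim about realizing arbitrary probabilistic operations concrete, here is a minimal Python sketch (the names `fair_bit`, `sample`, and `shannon_entropy` are illustrative, not from the lecture). It is a Knuth-Yao-style procedure: independent fair bits repeatedly halve a dyadic subinterval of $[0, 1)$, and a symbol is output as soon as the interval falls inside one bucket of the cumulative distribution.

```python
from bisect import bisect_right
from fractions import Fraction
import math
import random

def fair_bit():
    """One independent, uniformly random bit: the basic resource."""
    return random.getrandbits(1)

def sample(p, bit=fair_bit):
    """Sample index i with probability p[i] using only fair bits.

    Maintains a dyadic interval [lo, hi) containing a uniform real in
    [0, 1); each bit halves the interval, and symbol i is output as
    soon as the interval lies inside one bucket of the cumulative
    distribution, i.e. c[i] <= lo < hi <= c[i+1].
    """
    c = [Fraction(0)]
    for q in p:
        c.append(c[-1] + Fraction(q))
    lo, hi = Fraction(0), Fraction(1)
    while True:
        i = bisect_right(c, lo) - 1      # bucket whose left edge is <= lo
        if hi <= c[i + 1]:
            return i                     # interval resolved: output i
        mid = (lo + hi) / 2
        if bit():
            lo = mid                     # bit 1: keep the upper half
        else:
            hi = mid                     # bit 0: keep the lower half

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i in bits (0 log 0 taken as 0)."""
    return -sum(float(q) * math.log2(q) for q in p if q > 0)
```

For instance, `sample([Fraction(1, 3), Fraction(2, 3)])` returns `0` with probability exactly $1/3$; the procedure terminates with probability $1$, and the expected number of bits it consumes is known to be less than $H(p) + 2$.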
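
The preparation claim, $m$ copies of $p$ from $\approx m H(p)$ random bits, can be checked empirically by running the same sampler on the product distribution $p^m$ and counting the fair bits consumed per copy; the Knuth-Yao bound gives fewer than $m H(p) + 2$ expected bits for the whole block, so the per-copy cost approaches $H(p)$. The experiment below reuses `sample` and `shannon_entropy` from the previous sketch; `CountingBits` is an illustrative helper.

```python
import itertools
import random
from fractions import Fraction

class CountingBits:
    """Fair-bit source that counts how many bits it hands out."""
    def __init__(self):
        self.count = 0
    def __call__(self):
        self.count += 1
        return random.getrandbits(1)

# Joint distribution of m independent copies: Pr[t] = prod_j p[t_j].
p, m = [Fraction(1, 3), Fraction(2, 3)], 8
joint = []
for t in itertools.product(range(len(p)), repeat=m):
    q = Fraction(1)
    for j in t:
        q *= p[j]
    joint.append(q)

bits, trials = CountingBits(), 2000
for _ in range(trials):
    sample(joint, bit=bits)              # sampler from the sketch above
print(f"H(p)      = {shannon_entropy(p):.4f} bits per copy")
print(f"cost      = {bits.count / (trials * m):.4f} bits per copy")
```

For this $p$, a single copy costs about $2$ bits on average, while $m = 8$ already brings the per-copy cost below $1.2$ bits, against $H(p) \approx 0.918$; the overhead shrinks like $2/m$.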
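
For the converse, compressing $n$ copies of $p$ into $\approx n H(p)$ bits, the same interval picture yields a minimal arithmetic coder; this is a sketch with illustrative names (`encode`, `decode`, `cumulative`), not necessarily the construction used in the lecture. The encoder narrows $[0, 1)$ by the source probabilities and emits the shortest dyadic interval contained in the result, so the code length is about $-\log_2 \Pr[\text{sequence}] + 2$ bits, whose expectation is about $n H(p) + 2$.

```python
import math
from fractions import Fraction

def cumulative(p):
    """Cumulative boundaries: bucket s is [c[s], c[s+1])."""
    c = [Fraction(0)]
    for q in p:
        c.append(c[-1] + Fraction(q))
    return c

def encode(seq, p):
    """Arithmetic coding: map seq to about -log2 Pr[seq] + 2 bits."""
    c = cumulative(p)
    lo, hi = Fraction(0), Fraction(1)
    for s in seq:
        w = hi - lo                      # narrow to the bucket of s
        lo, hi = lo + w * c[s], lo + w * c[s + 1]
    # Shortest dyadic interval [a/2^k, (a+1)/2^k) inside [lo, hi).
    k = 0
    while True:
        k += 1
        a = math.ceil(lo * 2**k)         # smallest dyadic point >= lo
        if Fraction(a + 1, 2**k) <= hi:
            break
    return [(a >> (k - 1 - i)) & 1 for i in range(k)]  # MSB first

def decode(bits, n, p):
    """Recover n symbols by tracking which bucket the code point is in."""
    c = cumulative(p)
    v = sum(Fraction(b, 2**(i + 1)) for i, b in enumerate(bits))
    lo, hi, out = Fraction(0), Fraction(1), []
    for _ in range(n):
        w = hi - lo
        for s in range(len(p)):          # bucket containing v
            if lo + w * c[s] <= v < lo + w * c[s + 1]:
                out.append(s)
                lo, hi = lo + w * c[s], lo + w * c[s + 1]
                break
    return out
```

With `p = [Fraction(1, 3), Fraction(2, 3)]`, a typical 1000-symbol sample encodes to roughly $1000 \cdot H(p) \approx 918$ bits, and `decode(encode(seq, p), len(seq), p)` recovers `seq` exactly.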