Exponentiation makes the difference between the bit size of this
line and the number (< 2^{300}) of particles in the known
universe. The expulsion of exponential time algorithms from computer
theory in the 1960s created a deep gap between deterministic computation
and -- formerly its unremarkable tools -- randomness and nondeterminism.
These two "freedoms" of computation preserved their reputation as some
of the most mysterious phenomena in science and seem to play an ever
more noticeable role in computer theory. We have learned little in the
past decades about the power of either, but a vague pattern is emerging
in their relationships.

A nondeterministic task is to invert an easily computable function. It is widely believed to require, in general, exponential computing time. Great efforts, however, have failed to prove the existence of such one-way (easy to compute, infeasible to invert) functions (owf). Randomness, too, preserves its mysterious reputation: it now seems it can be generated deterministically. A simple program can transform a short random "seed" into an unlimited array of random bits. As long as the seed is not disclosed, no computational power that could fit in the whole universe can distinguish this array from a truly random one. This conjecture is proven equivalent to the existence of owf. I will survey a number of papers that have resulted in this theorem.

Although fundamental in many areas of science, randomness is really "native" to computer science. Its computational nature was clarified by Kolmogorov. He and his followers (see the survey [Kolmogorov Uspenskii 1987]) built in the 1960s--1970s the first successful theory of random objects, defined roughly as those that cannot be computed from short descriptions. Kolmogorov also suggested in the 1960s that randomness may have an important relationship with nondeterminism; namely, that the task of finding a "non-randomness" witness (i.e., a short, fast program generating a given string) may be a good candidate for proving that exhaustive search cannot be avoided (in today's terms, P < NP).

The next step came from the cryptography of the 1980s. Many applications
require only a few properties of random strings. So, they do not care
about the fundamental issues of randomness, as long as some laws (e.g.,
of big numbers) hold. Cryptographers, in contrast, need to prevent the
adversary from taking advantage of *any* regularities in their
random strings, no matter how peculiar, making "perfect" randomness an
important practical concern. [Blum Micali], [Yao] (see some other
details also in [Levin 87]) used the idea of a *hard-core* or
*hidden* bit:

Suppose from a length-preserving one-way function **f(x)** it is
hard to compute not only **x** but even one bit **b(x)** of it
(easily computable from **x**). Assume that even guessing **b(x)**
with any noticeable correlation is infeasible. If **f** is
bijective (i.e. one-to-one), **f(x)** and **b(x)** are both
random and *appear to be* independent to any feasible test,
thus increasing the initial amount **n = |x|** of randomness by one
bit. Then, a short random seed **x** can be transformed into an
arbitrarily long string **b(f^{(i)}(x)), i = 1, 2, ...**, which
passes any feasible randomness test. [Blum Micali] established such a
hard-core bit for a specific owf (modular exponentiation, whose
inversion is the discrete logarithm problem).
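The iteration above can be sketched in a few lines. The **f** and **b** below are toy stand-ins of my own (an invertible affine map and a parity bit; neither is actually one-way or hard-core); they only show the structure of the [Blum Micali] generator, not its security.

```python
def f(x: int, n: int = 32) -> int:
    """Toy length-preserving bijection on n-bit ints (stand-in for a owf).
    An affine map x -> c*x + 1 mod 2^n is invertible whenever c is odd."""
    return (x * 0x9E3779B1 + 1) % (1 << n)

def b(x: int) -> int:
    """Toy one-bit predicate: parity of x's bits (stand-in for a hard-core bit)."""
    return bin(x).count("1") % 2

def pseudorandom_bits(seed: int, m: int) -> list[int]:
    """Expand a short seed x into the m bits b(f(x)), b(f^(2)(x)), ..., b(f^(m)(x))."""
    x, out = seed, []
    for _ in range(m):
        x = f(x)          # iterate the bijection
        out.append(b(x))  # output one "hidden" bit per step
    return out

print(pseudorandom_bits(12345, 16))
```

With a genuinely one-way bijective **f** and a genuinely hard-core **b**, the same loop would yield bits that no feasible test distinguishes from truly random ones.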

[Levin 87] conjectured and
[Goldreich Levin] and [Levin 93] proved that the boolean inner product
**b(x, r) = (x_{1}r_{1} + ... + x_{n}r_{n}
mod 2)** provides a hard-core bit for every owf.
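This predicate itself is trivial to compute. A minimal sketch, packing the bit vectors **x** and **r** into Python ints (a representation choice of mine, not from the papers):

```python
def inner_product_bit(x: int, r: int) -> int:
    """b(x, r) = x_1 r_1 + ... + x_n r_n (mod 2):
    count the positions where both x and r have a 1 bit, take parity."""
    return bin(x & r).count("1") % 2

# x = 0b1011 and r = 0b0110 share exactly one set bit, so b(x, r) = 1.
print(inner_product_bit(0b1011, 0b0110))
```

Note that **b** is linear in **r** over GF(2): b(x, r XOR r') = b(x, r) XOR b(x, r'), the property exploited in the [Goldreich Levin] proof.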

- W. Alexi, B. Chor, O. Goldreich, and C.P. Schnorr. RSA and Rabin functions: Certain parts are as hard as the whole. *SIAM J. Comput.* 17:194-209, 1988.
- L. Babai, L. Fortnow, L. Levin, and M. Szegedy. Checking computations in polylogarithmic time. *Proc., ACM Symp. on Theory of Computing,* pp. 21-31, 1991.
- M. Blum, S. Micali. How to generate cryptographically strong sequences of pseudo-random bits. *SIAM J. Comput.* 13:850-864, 1984.
- O. Goldreich, S. Goldwasser, S. Micali. How to construct random functions. *J. ACM* 33(4):792-807, 1986.
- O. Goldreich, L. Levin. A hard-core predicate for all one-way functions. *Proc., ACM Symp. on Theory of Computing,* pp. 25-32, 1989.
- J. Hastad, R. Impagliazzo, L. Levin, M. Luby. Construction of a pseudo-random generator from any one-way function. *SIAM J. Comput.* 28(4):1364-1396, 1999. (Awarded an "Outstanding Paper Prize" of the Society for Industrial and Applied Mathematics in 2003.)
- D.E. Knuth. The Art of Computer Programming, v. 2: *Seminumerical Algorithms,* 3rd edition, 1997, Section 3.5.F. This section is new to the 3rd edition and so available on the Web in pp. 10, 29-36 of the file of changes http://www-cs-faculty.stanford.edu/~knuth/err2-2e.ps.gz
- A.N. Kolmogorov, V.A. Uspenskii. Algorithms and randomness. *Theoria Veroyatnostey i ee Primeneniya = Theory of Probability and its Applications* 3(32):389-412, 1987.
- L. Levin. One-way functions and pseudorandom generators. *Combinatorica* 7(4):357-363, 1987.
- L. Levin. Randomness and nondeterminism. *J. Symb. Logic* 58(3):1102-1103, 1993.
- U.V. Vazirani. Efficiency considerations in using semi-random sources. *Proc., ACM Symp. on Theory of Computing,* pp. 160-168, 1987.
- A.C. Yao. Theory and applications of trapdoor functions. *Proc., IEEE Symp. on Foundations of Computer Sci.,* pp. 80-91, 1982.