\newpage\subsection{Randomized Algorithms and Random Inputs.\label{average}}
{\bf\em Las-Vegas} algorithms, unlike Monte-Carlo, never give wrong
answers. Unlucky coin-flips just make them run longer than expected.
Quick-Sort is a simple example. It is about as fast as the best
deterministic sorters and is popular for its simplicity. It sorts an array
$a[1..n]$ of $n>2$ numbers by choosing in it a random {\em pivot},
splitting the remaining array in two by comparing with the pivot, and
calling itself recursively on each half.
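The recursion just described can be sketched in Python (an illustration of ours, not part of the text):

```python
import random

def quicksort(a):
    """Las-Vegas Quick-Sort: a random pivot never yields a wrong
    answer; unlucky flips only make the run longer."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)                  # random pivot
    left = [x for x in a if x < pivot]        # strictly smaller entries
    mid = [x for x in a if x == pivot]        # entries equal to the pivot
    right = [x for x in a if x > pivot]       # strictly larger entries
    return quicksort(left) + mid + quicksort(right)
```

(Splitting into three parts rather than two handles repeated entries; the analysis below assumes distinct numbers.)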
For easy reference, replace the array entries with their positions $1,...,n$ in
the {\em sorted output} (no effect on the algorithm). Denote by $t(i)$ the
(random) time at which $i$ is chosen as a pivot. Then $i$ will ever be compared with $j$
iff either $t(i)$ or $t(j)$ is the smallest among $t(i),t(i+1),\ldots,t(j)$.
This happens with probability $2/(|j-i|+1)$. So, the expected number of comparisons
is $\sum_{i,j>i} 2/(1+j-i)= 2(-2n+ (n+1)\sum_{k=1}^n 1/k)= 2n(\ln n-O(1))$.
Note that the expectation of a sum of random variables is
the sum of their expectations (not true, say, for a product).
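The closed form can be checked numerically. The sketch below (ours) uses exact rationals to compare the sum of pairwise probabilities with $2(-2n+(n+1)\sum_{k=1}^n 1/k)$:

```python
from fractions import Fraction

def expected_comparisons(n):
    # Sum of pairwise comparison probabilities 2/(j-i+1) over all i<j.
    return sum(Fraction(2, j - i + 1)
               for i in range(1, n + 1) for j in range(i + 1, n + 1))

def closed_form(n):
    # 2(-2n + (n+1)*H_n), with H_n the n-th harmonic number.
    H = sum(Fraction(1, k) for k in range(1, n + 1))
    return 2 * (-2 * n + (n + 1) * H)
```

The two expressions agree exactly for every $n$, e.g. both give $1$ at $n=2$.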
The above Monte-Carlo and Las-Vegas algorithms require choosing strings {\em at
random} with uniform distribution. We mentally picture that as flipping a coin.
(Computers use {\em pseudo-random generators} rather than coins in hope, rarely
supported by proofs, that their outputs have all the statistical properties of
truly random coin flips needed for the analysis of the algorithm.)
\paragraph {Random Inputs} to Deterministic Algorithms are analyzed similarly
to algorithms which flip coins themselves, but the two settings should not be confused.
Consider an example: Someone is interested in knowing whether or not certain
graphs contain Hamiltonian Cycles. He offers graphs and pays \$100 if we show
either that the graph {\em has} a Hamiltonian Cycle or that it {\em has none}.
The Hamiltonian Cycle problem is NP-Complete, so it should be very hard for
{\em some}, but not necessarily for {\em most} graphs. In fact, if our patron
chooses the graphs uniformly, a fast algorithm can earn us the \$100 {\em most
of the time}! Let all graphs with $n$ nodes and, say, $k(n)=n\ln n/4$ edges be
equally likely. A graph with an isolated node has no Hamiltonian Cycle, and such
a node is easy to find; each given node is isolated with probability
$(1-O(1/n))/\sqrt n$. Thus, the probability that {\em
none} of $n$ nodes is isolated (and we lose our \$100) is $O((1-1/\sqrt n)^n)=
O(e^{-\sqrt n})$ and vanishes fast. Similar calculations can be made whenever
$r = \lim 2k(n)/(n\ln n)<1$. If $r>1$, other fast algorithms can actually find
a Hamiltonian Cycle. See: [Johnson 1984], [Karp 1976], [Gurevich 1985]. See
also [Venkatesan, Levin] for a proof that another problem is NP-complete even
on average. How do this HC algorithm and the above primality test differ?
\begin{itemize} \item The primality algorithm works for {\em all} instances.
It tosses the coins itself and can repeat the tosses for a more reliable answer.
The HC algorithm works only for {\em most} instances (those with isolated nodes).
\item In the HC algorithm, we must trust the opponent to follow the presumed
random procedure. If he cheats and produces connected graphs often, our
analysis breaks down. \end{itemize}
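The strategy of looking for isolated nodes can be sketched in Python. The uniform $k$-edge graph model and the specific choice $k(n)\approx n\ln n/4$ (so that $r=\lim 2k(n)/(n\ln n)=1/2<1$) are our illustrative assumptions:

```python
import math
import random

def random_graph(n, k):
    """Pick k distinct edges uniformly among all pairs of n nodes
    (our assumed model of the patron's random choice)."""
    all_edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return random.sample(all_edges, k)

def has_isolated_node(n, edges):
    degree = [0] * n
    for i, j in edges:
        degree[i] += 1
        degree[j] += 1
    return any(d == 0 for d in degree)

def try_earn(n):
    # With about n*ln(n)/4 edges, an isolated node appears with
    # probability -> 1; it certifies the absence of a Hamiltonian Cycle.
    k = int(n * math.log(n) / 4)
    edges = random_graph(n, k)
    return "no HC" if has_isolated_node(n, edges) else "give up"
```

The algorithm never lies: it either exhibits an isolated node or gives up, so it is fast on all inputs and correct on most.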
\paragraph {Symmetry Breaking.}
Randomness comes into Computer Science in many other ways besides those we
considered. Here is a simple example: avoiding conflicts for shared resources.
Several philosophers dine at a circular table. Before each of them is a plate,
and left of each plate is either a knife or a fork arranged so that each diner
has a knife on his right and a fork on his left or vice versa. The problem is
that neighboring diners must share the utensils: neighbors cannot eat at the
same time. How can the philosophers complete the dinner given that all of them
must act in the same way without any central organizer? Trying to grab the
knives and forks at once may turn them into fighting philosophers. Instead they
could each flip a coin, and sit still if it comes up heads, otherwise try to
grab the utensils. If two diners try to grab the same utensil, neither
succeeds. If they repeat this procedure enough times, most likely each
philosopher will eventually get both a knife and a fork without interference.
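This coin-flipping protocol is easy to simulate. The round structure, parameters, and names below are our assumptions, not part of the text:

```python
import random

def dine(n, rounds=1000, seed=0):
    """Randomized symmetry breaking among n philosophers at a round
    table; utensil i sits between diners i-1 and i (a sketch of ours)."""
    rng = random.Random(seed)
    fed = [False] * n
    for _ in range(rounds):
        if all(fed):
            break
        # Each hungry diner flips a coin; heads = sit still this round.
        trying = [i for i in range(n) if not fed[i] and rng.random() < 0.5]
        claims = {}
        for i in trying:
            for u in (i, (i + 1) % n):     # the two utensils next to diner i
                claims.setdefault(u, []).append(i)
        for i in trying:
            # Diner i eats only if nobody else grabbed either utensil.
            if all(len(claims[u]) == 1 for u in (i, (i + 1) % n)):
                fed[i] = True
    return all(fed)
```

In each round a hungry diner eats with probability at least $1/8$ (it flips heads, both neighbors flip tails), so after many rounds everyone has eaten with overwhelming probability.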
We have no time to analyze this and many other scenarios in which
randomness is crucial. Instead we will take a look at the concept of
Randomness itself.