Next: 5.3 Randomness and Complexity. Up: 5 Randomness in Computing. Previous: 5.1 A Monte-Carlo Primality

## 5.2 Randomized Algorithms and Random Inputs.

Las-Vegas algorithms, unlike Monte-Carlo algorithms, never give wrong answers: unlucky coin-flips just make them run longer than expected. Quick-Sort is a simple example. It is about as fast as deterministic sorters, but is popular due to its simplicity. It sorts an array a[1..n] of n > 2 numbers by choosing a random pivot in it, splitting the rest of the array in two by comparing each entry with the pivot, and calling itself recursively on each half.
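
As a sketch (assuming distinct entries; the function and variable names are illustrative, not from the text), randomized Quick-Sort looks like:

```python
import random

def quicksort(a):
    """Las-Vegas sorting: the answer is always correct;
    only the running time depends on the coin flips."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)                # random pivot
    less    = [x for x in a if x < pivot]   # split by comparing with pivot
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```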

For easy reference, replace the array entries with their positions 1,...,n in the sorted output (this renaming has no effect on the algorithm). Denote by t(i) the (random) time at which i is chosen as a pivot. Then i will ever be compared with j iff either t(i) or t(j) is the smallest among t(i),...,t(j), which has 2 chances out of |j-i|+1. So, the expected number of comparisons is ∑_{i<j} 2/(j-i+1) = 2(n+1) ∑_{k=2}^{n} 1/k − 2(n−1) ≈ 2n ln n. Note that the expectation of a sum of variables is the sum of their expectations (not true, say, for products).
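
A quick empirical check of this estimate (a hypothetical simulation, not part of the text; each remaining element is counted as one comparison against the pivot):

```python
import math
import random

def count_comparisons(a):
    """Comparisons made by randomized Quick-Sort on list a (a is consumed)."""
    if len(a) <= 1:
        return 0
    pivot = a.pop(random.randrange(len(a)))   # random pivot
    left  = [x for x in a if x < pivot]
    right = [x for x in a if x > pivot]
    # every remaining element is compared with the pivot once
    return len(a) + count_comparisons(left) + count_comparisons(right)

n, trials = 1000, 50
avg = sum(count_comparisons(list(range(n))) for _ in range(trials)) / trials
print(avg, 2 * n * math.log(n))   # empirical average vs. the 2n ln n estimate
```

The average lands below 2n ln n, as the exact formula above predicts.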

The above Monte-Carlo and Las-Vegas algorithms require choosing strings at random with uniform distribution. We mentally picture that as flipping a coin. (Computers use pseudo-random generators rather than coins in hope, rarely supported by proofs, that their outputs have all the statistical properties of truly random coin flips needed for the analysis of the algorithm.)

#### Random Inputs to Deterministic Algorithms

Such inputs are analyzed similarly to the coin flips of a randomized algorithm, but the two settings should not be confused. Consider an example: someone is interested in knowing whether or not certain graphs contain Hamiltonian Cycles. He offers graphs and pays \$100 if we show either that the graph has a Hamiltonian Cycle or that it has none. The Hamiltonian Cycle problem is NP-complete, so it should be very hard for some, but not necessarily for most, graphs. In fact, if our patron chooses the graphs uniformly, a fast algorithm can earn us the \$100 most of the time! Let all graphs have n nodes and, say, k(n) < (n ln n)/4 edges, and be equally likely. Then we can use the following (deterministic) algorithm: output "No Hamiltonian Cycles" and collect the \$100 if the graph has an isolated node; otherwise, pass on that graph and the money. Now, how often do we get our \$100? The probability that a given node A of the graph is isolated is about (1 − 2/n)^k ≈ e^{−2k/n} > 1/√n. Thus, the probability that none of the n nodes is isolated (and we lose our \$100) is roughly (1 − 1/√n)^n ≈ e^{−√n} and vanishes fast. Similar calculations can be made whenever r = lim 2k(n)/(n ln n) < 1. If r > 1, other fast algorithms can actually find a Hamiltonian Cycle. See: [Johnson 1984], [Karp 1976], [Gurevich 1985]. See also [Venkatesan, Levin] for a proof that another problem is NP-complete even on average. How do this HC algorithm and the above primality test differ?

• The primality algorithm works for all instances. It tosses the coin itself and can repeat the test for a more reliable answer. The HC algorithm works only for most instances (those with isolated nodes).
• In the HC algorithm, we must trust the patron to follow the presumed random procedure. If he cheats and often produces connected graphs, our analysis breaks down.
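
The isolated-node strategy can be sketched as follows (a hypothetical simulation; the model of k distinct edges chosen uniformly, and all names, are illustrative assumptions):

```python
import math
import random

def has_isolated_node(n, edges):
    """True iff some node of an n-node graph touches no edge."""
    touched = set()
    for u, v in edges:
        touched.update((u, v))
    return len(touched) < n

def random_graph(n, k):
    """k distinct edges chosen uniformly among n labeled nodes."""
    all_edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
    return random.sample(all_edges, k)

n = 100
k = int(n * math.log(n) / 4)   # edge count below the r = 1 threshold
trials = 200
# Answer "No Hamiltonian Cycles" (and collect $100) whenever a node is isolated.
wins = sum(has_isolated_node(n, random_graph(n, k)) for _ in range(trials))
print(wins / trials)           # fraction of graphs where the strategy pays off
```

With k this far below (n ln n)/2, nearly every sampled graph has an isolated node, so the strategy pays off almost always.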

#### Symmetry Breaking.

Randomness comes into Computer Science in many other ways besides those we considered. Here is a simple example: avoiding conflicts for shared resources.

Several philosophers dine at a circular table. Before each of them is a plate, and left of each plate is either a knife or a fork, arranged so that each diner has a knife on his right and a fork on his left, or vice versa. The problem is that neighboring diners must share the utensils: neighbors cannot eat at the same time. How can the philosophers complete the dinner, given that all of them must act in the same way without any central organizer? Trying to grab the knives and forks all at once may turn them into fighting philosophers. Instead, each could flip a coin and sit still if it comes up heads, otherwise try to grab the utensils. If two diners try to grab the same utensil, neither succeeds. If they repeat this procedure enough times, most likely each philosopher will eventually get both a knife and a fork without interference.
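
A minimal simulation of this protocol (a sketch; the synchronous-round model and all names are illustrative assumptions, not from the text):

```python
import random

def dine(n, max_rounds=1000):
    """Coin-flip symmetry breaking: each round, every hungry philosopher
    flips a coin; on tails he tries to grab both adjacent utensils,
    succeeding only if neither neighbor grabs at the same time."""
    hungry = set(range(n))
    for rounds in range(1, max_rounds + 1):
        trying = {p for p in hungry if random.random() < 0.5}  # tails: grab
        for p in trying:
            # p shares one utensil with neighbor p-1, the other with p+1
            if (p - 1) % n not in trying and (p + 1) % n not in trying:
                hungry.discard(p)   # got both utensils, eats, then leaves
        if not hungry:
            return rounds           # every philosopher has eaten
    return None                     # very unlucky coins; did not finish

print(dine(5))   # rounds until all five philosophers have eaten
```

Two adjacent grabbers block each other, so only philosophers whose neighbors stay still succeed in a given round; with probability 1 everyone eventually eats.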

We have no time to actually analyze these and many other scenarios where randomness is crucial. Instead, we will take a look at the concept of randomness itself.


Leonid Levin
Wed Aug 21 20:35:42 EDT 1996