An example of one of my favorite surprise calculations comes up in the context of quantum computing. The idea is that if you have a process that gets things right 90% of the time, and you repeat the process, and the success or failure of one instance of the process has no effect on any other instance, then the chance of being successful n times in a row is 0.9^n. Thus this table:

      n    0.9^n
      1      90%
      2      81%
      5      59%
     10      35%
    100    .003%
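The table is easy to reproduce; here is a quick sketch in Python (the success probability 0.9 and the repetition counts are the ones discussed above):

```python
p = 0.9  # probability one instance of the process succeeds

# With independent instances, the chance of n successes in a row is p**n.
for n in (1, 2, 5, 10, 100):
    print(f"{n:>3} repetitions: {p**n:.3%} chance of succeeding every time")
```
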

In just ten iterations, the chance of being correct ten times in a row is a mere 35%. The chance of being correct 100 times in a row is .003%. This phenomenon accounts for the following interesting graph. Note that as n increases, the graph of x^n on [0, 1] looks more and more like an “L”. The vertical line is x = 1 and the dots locate the powers of 0.9.

One of the more interesting applications of this idea, which I think I read about in the literature on random matrices, concerns the random distribution of points in space. Suppose we distribute points randomly in an n-dimensional hypercube with unit sides. The hypercube’s volume would be 1^n = 1 hypercubic unit. Now imagine a cube with sides of 0.9 units inserted inside the unit hypercube. Its volume would be 0.9^n hypercubic units. If n = 100, a small dimension in this day and age of large data sets, all but .003% of the volume would be near the “surface” of the hypercube. A random point (vector) in the unit hypercube would have a norm of nearly 1 (use an n-ball instead of a cube). Interesting.
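The n-ball version can be checked by simulation. The radius of a point drawn uniformly from the unit n-ball is distributed as U^(1/n), where U is uniform on [0, 1], so for large n nearly every sample sits just under the surface (a minimal Monte Carlo sketch; the trial count is arbitrary):

```python
import random

def random_ball_radius(n):
    # Radius of a uniform random point in the unit n-ball:
    # P(r <= t) = t^n, so r is distributed as U**(1/n).
    return random.random() ** (1.0 / n)

n, trials = 100, 10_000
radii = [random_ball_radius(n) for _ in range(trials)]

mean_radius = sum(radii) / trials          # theory: n/(n+1) = 0.9901...
near_surface = sum(r > 0.9 for r in radii) / trials  # theory: 1 - 0.9**100
print(f"mean radius:            {mean_radius:.4f}")
print(f"fraction with r > 0.9:  {near_surface:.4f}")
```

With n = 100 the fraction of samples inside radius 0.9 is about 0.9^100 ≈ .003%, so in 10,000 trials essentially every point lands in the thin shell near the surface.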