
The Poisson Distribution

Updated Feb 14, 2006.

The Poisson Distribution arises in a number of contexts as the distribution of a random number of points, for example the number of clicks of a Geiger counter in one second, the number of raisins in a box of raisin bran, the number of blades of grass in a randomly chosen square inch of lawn, and so forth.

The formula for the probability of observing k of whatever is being counted when the expected number is m is

p(k) = m^k e^(−m) ⁄ k!

where e is the base of the natural logarithms and k! indicates the factorial function (use the e^x key on a scientific calculator to calculate e^(−m) and the x! key to calculate k!).

Theoretically any count from zero on up is possible, but the probability of large counts falls off very rapidly.
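
For concreteness, here is a minimal Python sketch of this formula (the mean m = 9 is just an illustrative value, not a figure from the text):

    from math import exp, factorial

    def poisson_prob(k, m):
        """Probability of observing exactly k counts when the expected count is m."""
        return m ** k * exp(-m) / factorial(k)

    m = 9.0                        # illustrative average count, e.g. raisins per box
    for k in range(16):
        print(f"p({k:2d}) = {poisson_prob(k, m):.4f}")

Printing the first few values shows how quickly the probabilities of large counts die off.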

In a lottery, the number of winners cannot have an exact Poisson distribution for two reasons.

The first issue is not a serious problem; the Poisson distribution would be an extremely good approximation if it were not for the other issue. The second is more serious. Many players (about 70%) buy quick picks, which are completely random, but other players choose numbers they think are lucky, and that is not random. If every player chose a quick pick, the Poisson distribution would be an almost perfect approximation. Since they don't, it is not quite right. However, we will assume the Poisson distribution is correct to keep things simple.

The reason the unconditional distribution of the number of jackpot winners and the conditional distribution of the number of other winners given that you win are the same has to do with the assumption that all players choose their numbers completely at random, which is required for the Poisson distribution to be correct. Under that assumption, whether you win or not does not change the probability of anyone else winning: everyone has the same 1 in 146.1 million chance of winning, and their ticket choices have nothing to do with yours.
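
A quick simulation makes this independence concrete. The sketch below uses a made-up miniature lottery (a 1 in 100 chance of winning and 500 other players, nothing like the real 1 in 146.1 million game) so that it runs in a few seconds; the point is only that the average number of other winners comes out about the same whether or not we condition on your ticket winning.

    import random

    random.seed(42)
    p = 1 / 100          # hypothetical chance that any one ticket wins
    n_others = 500       # hypothetical number of other players
    trials = 10_000

    others_all = []
    others_given_you_win = []
    for _ in range(trials):
        you_win = random.random() < p
        others = sum(random.random() < p for _ in range(n_others))
        others_all.append(others)
        if you_win:
            others_given_you_win.append(others)

    print("average other winners, all trials:   ",
          sum(others_all) / len(others_all))
    print("average other winners, given you win:",
          sum(others_given_you_win) / len(others_given_you_win))
    # Both averages come out near n_others * p = 5, because your win is
    # independent of everyone else's completely random picks.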

Your Expected Winnings

If you win and there are k other winners, then the jackpot gets split k + 1 ways, and the amount you win is J ⁄ (k + 1), where J is the size of the jackpot.

Your expected winnings are calculated just like any other expectation: multiply the amount you win in each case, which is J ⁄ (k + 1), by the probability of that case, which is m^k e^(−m) ⁄ k!, and sum. The sum runs over k from zero to infinity, so it appears to require calculus to sum this infinite series.
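
In practice the infinite series is nothing to worry about: the terms shrink so fast that a few dozen of them are plenty. Here is a sketch of the direct summation in Python, with a purely illustrative jackpot J and mean m (not figures from the text):

    from math import exp, factorial

    J = 100e6     # hypothetical jackpot of $100 million
    m = 1.5       # hypothetical mean number of other winners

    expected = 0.0
    for k in range(50):                          # terms beyond k = 50 are negligible
        p_k = m ** k * exp(-m) / factorial(k)    # probability of k other winners
        expected += (J / (k + 1)) * p_k          # your share in that case, weighted by its probability

    print(f"expected winnings by direct summation: ${expected:,.2f}")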

Fortunately, there is a trick that allows us to see what the expectation is without doing the infinite sum. The terms in the infinite sum are

a_k = J ⁄ (k + 1) × m^k e^(−m) ⁄ k! = J m^k e^(−m) ⁄ (k + 1)!

Let W denote the sum of the a_k as k runs from zero to infinity, which is the expectation we are trying to calculate.

If we multiply each term by m we get

m a_k = J m^(k + 1) e^(−m) ⁄ (k + 1)! = J p(k + 1)

where p(k) is the Poisson probability defined above. The probabilities p(k) must sum to one as k goes from zero to infinity by the properties of probability. Because of the k + 1 above, the first term is J p(1). If we were to add an additional term J p(0), the series would sum to J (because the probabilities sum to one). Thus the series sums to

J [1 − p(0)] = m W

(Recall that we multiplied each term by m, so the sum is m W rather than W.) Solving for W gives

W = J [1 − p(0)] ⁄ m = J (1 − e^(−m)) ⁄ m
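
With the same illustrative J and m as in the summation sketch above, the closed form gives the same answer:

    from math import exp

    J = 100e6     # same hypothetical jackpot as before
    m = 1.5       # same hypothetical mean number of other winners

    W = J * (1 - exp(-m)) / m
    print(f"expected winnings from the closed form: ${W:,.2f}")
    # This matches the direct summation above, confirming W = J (1 - e^(-m)) / m.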