Deriving The Probability Function Of The Poisson Distribution

Let's say the probability of success is \(p = 0.001\) and there are \(n = 1000\) trials. We can apply the binomial distribution here:

$$P(X=k) = {n \choose k} p^k (1-p)^{n-k}$$
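
This formula can be computed directly; here is a minimal Python sketch (the helper name `binomial_pmf` is mine, not from the text):

```python
from math import comb

# Binomial PMF: P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Probability of exactly one success with p = 0.001 and n = 1000 trials
print(binomial_pmf(1, 1000, 0.001))  # roughly 0.368
```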

where \(k\) is the number of successes. The expected value of \(X\) is \(\mu = np\). Suppose we know \(\mu\) but know nothing about \(p\) and \(n\), except that \(p\) is very small and \(n\) is very large. Can we build a model with \(\mu\) as the only parameter? Let's create a new probability function \(P_n\) by expanding the binomial probability function:

$$P_n (X=k) = \left( \frac{n!}{(n-k)!k!} \right) \left( p^k \right) \left( (1-p)^n\right) \left( (1-p)^{-k} \right)$$

Since \(\mu = np\), we can substitute \(p = \frac{\mu}{n}\) to get rid of \(p\):

$$P_n (X=k) = \left( \frac{n!}{(n-k)!k!} \right) \left( \frac{\mu}{n} \right)^k \left(1-\frac{\mu}{n}\right)^n \left(1-\frac{\mu}{n}\right)^{-k} $$

If \(n\) is very large, then \(\frac{\mu}{n}\) is close to \(0\), so the last factor is very close to \(1\):

$$P_n (X=k) = \left( \frac{n!}{(n-k)!k!} \right) \left( \frac{\mu}{n} \right)^k \left(1-\frac{\mu}{n}\right)^n (1) $$

In this article, I represent \(e^{x}\) as the limit \(e^{x} = \lim_{n \to \infty} \left(1 + \frac{x}{n}\right)^n\); applying it with \(x = -\mu\) turns the third factor into \(e^{-\mu}\):

$$P_n (X=k) = \left( \frac{n!}{(n-k)!k!} \right) \left( \frac{\mu}{n} \right)^k (e^{-\mu}) (1) $$
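
This limit is easy to sanity-check numerically; a small sketch:

```python
import math

# (1 - mu/n)^n approaches e^(-mu) as n grows
mu = 1.0
for n in (10, 100, 10_000):
    print(n, (1 - mu / n) ** n, math.exp(-mu))
```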

Dividing \(n!\) by \((n-k)!\) cancels every factor from \(n-k\) down to \(1\), leaving the \(k\) factors from \(n\) down to \(n-k+1\):

$$P_n (X=k) = \left( \frac{(n)(n-1)(n-2) * \ldots * (n-k+2)(n-k+1)}{k!} \right) \left( \frac{\mu}{n} \right)^k (e^{-\mu}) (1) $$

For the next step, I will switch the denominators \(k!\) and \(n^k\): \(n^k\) moves under the product of factors, and \(k!\) under \(\mu^k\):

$$P_n (X=k) = \left( \frac{(n)(n-1)(n-2) * \ldots * (n-k+2)(n-k+1)}{n^k} \right) \left( \frac{\mu ^k}{k!} \right) (e^{-\mu}) $$

Since there are \(k\) factors in both the numerator and the denominator, each factor in the numerator can be paired with one \(n\) from \(n^k\):

$$P_n (X=k) = \left( \left( \frac{n}{n}\right) \left( \frac{n-1}{n} \right) * \ldots * \left(\frac{n-k+1}{n} \right)\right) \left( \frac{\mu ^k}{k!} \right) (e^{-\mu}) $$
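
Assuming some concrete values for \(n\) and \(k\) (my own choice, for illustration), the product of these factors is indeed close to one:

```python
# Product of (n - j)/n for j = 0 .. k-1; close to 1 when n >> k
n, k = 10_000, 5
product = 1.0
for j in range(k):
    product *= (n - j) / n
print(product)
```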

As \(n\) approaches infinity, each of these factors approaches one. So for very large \(n\):

$$P_n (X=k) = (1) \left( \frac{\mu ^k}{k!} \right) (e^{-\mu}) $$

Therefore, when \(n\) is very large and \(p\) is very small, the probability function is:

$$P_n (X=k) = \left( \frac{\mu ^k}{k!} \right) (e^{-\mu}) $$
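
A quick comparison of this formula against the original binomial probabilities, using \(n = 1000\) and \(p = 0.001\) from the beginning of the derivation (a sketch; the helper names are mine):

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, mu):
    return mu**k / factorial(k) * exp(-mu)

n, p = 1000, 0.001
mu = n * p  # mu = 1.0
for k in range(5):
    # The two columns agree to several decimal places
    print(k, binomial_pmf(k, n, p), poisson_pmf(k, mu))
```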

If we sum the probabilities from \(k=0\) to \(k=\infty\), do they add up to one?

$$\sum_{k=0}^{\infty} \left( \frac{\mu ^k}{k!} \right) (e^{-\mu}) = 1 ?$$

Using the series representation \(e^{x} = \sum_{k=0}^{\infty} \frac{x^k}{k!}\), we can rewrite the sum above as:

$$(e^{\mu}) (e^{-\mu}) = 1 $$
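
We can also verify the sum numerically, truncating the infinite series at a large \(k\) (a sketch; the choice \(\mu = 3\) is arbitrary):

```python
from math import exp, factorial

# Poisson probabilities should sum to 1 over all k
mu = 3.0
total = sum(mu**k / factorial(k) * exp(-mu) for k in range(100))
print(total)  # very close to 1
```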

And we know that this is true. Also, since the variance of a binomial distribution is \(np(1-p)\), and \(p\) is very small (so \(1-p \approx 1\)), the variance of the Poisson distribution would be:

$$(np)(1-p) \approx (\mu)(1) = \mu $$
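
Computing the mean and variance directly from the Poisson probability function confirms that both come out to \(\mu\) (a sketch; the truncation at \(k = 50\) assumes the tail beyond that is negligible):

```python
from math import exp, factorial

mu = 2.5

def pmf(k):
    return mu**k / factorial(k) * exp(-mu)

# Mean and variance from the definition, truncating the tail
mean = sum(k * pmf(k) for k in range(50))
variance = sum((k - mean) ** 2 * pmf(k) for k in range(50))
print(mean, variance)  # both close to mu
```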
