# The Binomial Distribution

A random variable X has a binomial distribution if

$P(X = k) = \binom{n}{k} p^k (1 - p)^{n-k}$ for $k = 0, 1, 2, \ldots, n$,

$P(X = k) = 0$ elsewhere, (4.3)

where 0 < p < 1. This distribution arises, for example, if we randomly draw n balls with replacement from a bowl containing K red balls and N – K white balls, where K /N = p. The random variable X is then the number of red balls in the sample.
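To make the sampling story concrete, the following Python sketch (names and parameter values are illustrative, not from the text) draws n balls with replacement and compares the empirical frequency of the event $\{X = 2\}$ with the pmf in (4.3):

```python
import math
import random

def binomial_pmf(k, n, p):
    # P(X = k) for Binomial(n, p), as in (4.3).
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def count_reds(n, K, N, rng):
    # One sample: draw n balls WITH replacement from a bowl containing
    # K red balls out of N; each draw is red with probability p = K/N.
    return sum(1 for _ in range(n) if rng.random() < K / N)

rng = random.Random(42)
n, K, N = 10, 3, 10                 # p = K/N = 0.3
trials = 100_000
hits = sum(count_reds(n, K, N, rng) == 2 for _ in range(trials))
print(hits / trials)                # empirical P(X = 2)
print(binomial_pmf(2, n, K / N))    # exact value, approx. 0.2335
```

The empirical frequency settles near the exact pmf value as the number of trials grows.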

We have seen in Chapter 1 that the binomial probabilities are limits of hypergeometric probabilities: If both N and K converge to infinity such that $K/N \to p$, then for fixed n and k, (4.1) converges to (4.3). This also suggests that the expectation and variance of the binomial distribution are the limits of the expectation and variance of the hypergeometric distribution, respectively:

$E[X] = np$, (4.4)

$\mathrm{var}(X) = np(1 - p)$. (4.5)
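The convergence of the hypergeometric probabilities to the binomial ones can also be checked numerically. The sketch below (parameter values are illustrative, not from the text) evaluates the hypergeometric pmf for growing N with K/N held at p and compares it with (4.3):

```python
import math

def hypergeometric_pmf(k, n, K, N):
    # P(X = k) when drawing n balls WITHOUT replacement: K red out of N.
    return math.comb(K, k) * math.comb(N - K, n - k) / math.comb(N, n)

def binomial_pmf(k, n, p):
    # P(X = k) for Binomial(n, p), as in (4.3).
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, k, p = 5, 2, 0.3
for N in (10, 100, 10_000):
    K = int(round(N * p))  # keep K/N = p for these values of N
    print(N, hypergeometric_pmf(k, n, K, N))
print("binomial limit:", binomial_pmf(k, n, p))  # approx. 0.3087
```

As N grows, the printed hypergeometric values approach the binomial value, in line with the limit statement above.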

As we will see in Chapter 6, convergence of distributions does not in general imply convergence of expectations and variances, except when the random variables involved are uniformly bounded. In this case the conjecture is true because the distributions involved are uniformly bounded: $P[0 \le X \le n] = 1$. However, it is not hard to verify (4.4) and (4.5) directly from the moment-generating function:

$$
M_B(t) = \sum_{k=0}^{n} \exp(t \cdot k) \binom{n}{k} p^k (1 - p)^{n-k}
       = \sum_{k=0}^{n} \binom{n}{k} (p \cdot e^t)^k (1 - p)^{n-k}
       = (p \cdot e^t + 1 - p)^n.
$$
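As a numerical sanity check (the parameter values below are chosen purely for illustration), the closed form $(p e^t + 1 - p)^n$ can be compared with the defining sum, and differentiating it at $t = 0$ recovers $E[X] = np$, consistent with (4.4):

```python
import math

def mgf_sum(t, n, p):
    # Defining sum E[exp(tX)] over the binomial pmf.
    return sum(math.comb(n, k) * (p * math.exp(t))**k * (1 - p)**(n - k)
               for k in range(n + 1))

def mgf_closed(t, n, p):
    # Closed form derived above.
    return (p * math.exp(t) + 1 - p)**n

n, p = 8, 0.25
for t in (-1.0, 0.0, 0.5):
    assert abs(mgf_sum(t, n, p) - mgf_closed(t, n, p)) < 1e-9

# M'(0) = E[X] = np, approximated here by a central finite difference.
h = 1e-6
d1 = (mgf_closed(h, n, p) - mgf_closed(-h, n, p)) / (2 * h)
print(d1)  # approx. n * p = 2.0
```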

Similarly, the characteristic function is

$\varphi_B(t) = (p \cdot e^{i t} + 1 - p)^n.$
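The same kind of check works for the characteristic function using complex arithmetic; in this sketch (parameters again illustrative) the defining sum $E[e^{itX}]$ is compared with the closed form at a few values of t:

```python
import cmath
import math

def cf_sum(t, n, p):
    # Defining sum E[exp(itX)] over the binomial pmf.
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * cmath.exp(1j * t * k)
               for k in range(n + 1))

def cf_closed(t, n, p):
    # Closed form of the binomial characteristic function.
    return (p * cmath.exp(1j * t) + 1 - p)**n

n, p = 6, 0.5
for t in (0.0, 1.0, 2.5):
    print(t, abs(cf_sum(t, n, p) - cf_closed(t, n, p)))  # each approx. 0
```

Unlike the MGF, the characteristic function exists for every distribution, which is why the book develops both.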
