The Poisson Distribution
A random variable X is Poisson(λ)-distributed if for k = 0, 1, 2, 3, … and some λ > 0,
$$P(X = k) = \frac{\exp(-\lambda)\,\lambda^{k}}{k!}. \qquad (4.7)$$
Recall that the Poisson probabilities are limits of the binomial probabilities (4.3) for n → ∞ and p ↓ 0 such that np → λ. It is left as an exercise to show that the expectation, variance, moment-generating function, and characteristic function of the Poisson(λ) distribution are
$$E[X] = \lambda, \quad \operatorname{var}(X) = \lambda, \quad m_{P}(t) = \exp[\lambda(e^{t} - 1)], \quad \varphi_{P}(t) = \exp[\lambda(e^{i\cdot t} - 1)].$$
4.1.2. The Negative Binomial Distribution
Consider a sequence of independent repetitions of a random experiment with constant probability p of success. Let the random variable X be the total number of failures in this sequence before the mth success, where m ≥ 1. Thus, X + m is equal to the number of trials necessary to produce exactly m successes. The probability P(X = k), k = 0, 1, 2, … is the product of the probability of obtaining exactly m − 1 successes in the first k + m − 1 trials, which is equal to
$$\binom{k+m-1}{m-1}\, p^{m-1}(1-p)^{k},$$
and the probability p of a success on the (k + m)th trial. Thus,
$$P(X = k) = \binom{k+m-1}{m-1}\, p^{m}(1-p)^{k}, \quad k = 0, 1, 2, 3, \ldots.$$
This distribution is called the negative binomial (m, p) – abbreviated NB (m, p) – distribution.
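As a numerical sanity check (a sketch of my own, not from the text; the names below are invented), the NB(m, p) probabilities can be evaluated directly to confirm that they sum to one and that the pmf indeed factors into the binomial term times the probability p of the final success:

```python
import math

def nb_pmf(k, m, p):
    # P(X = k) for X ~ NB(m, p): k failures before the m-th success
    return math.comb(k + m - 1, m - 1) * p**m * (1 - p)**k

m, p = 4, 0.3
# Probabilities sum to 1 (truncated support; the omitted tail is negligible)
total = sum(nb_pmf(k, m, p) for k in range(2000))
assert abs(total - 1.0) < 1e-10

# The pmf is the probability of exactly m-1 successes in the first
# k+m-1 trials, times the probability p of a success on trial k+m
k = 7
m_minus_1_successes = math.comb(k + m - 1, m - 1) * p**(m - 1) * (1 - p)**k
assert abs(nb_pmf(k, m, p) - m_minus_1_successes * p) < 1e-12
```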
It is easy to verify from the preceding argument that an NB(m, p)-distributed random variable can be generated as the sum of m independent NB(1, p)-distributed random variables (i.e., if $X_{1,1}, \ldots, X_{1,m}$ are independent NB(1, p) distributed, then $X = \sum_{j=1}^{m} X_{1,j}$ is NB(m, p) distributed). The moment-generating function of the NB(1, p) distribution is
$$m_{\mathrm{NB}(1,p)}(t) = \sum_{k=0}^{\infty} \exp(k \cdot t)\, p\,(1-p)^{k} = p \sum_{k=0}^{\infty} \left((1-p)\,e^{t}\right)^{k} = \frac{p}{1-(1-p)\,e^{t}},$$
provided that t < −ln(1 − p); hence, the moment-generating function of the NB(m, p) distribution is
$$m_{\mathrm{NB}(m,p)}(t) = \left(\frac{p}{1-(1-p)\,e^{t}}\right)^{m}, \quad t < -\ln(1-p). \qquad (4.12)$$
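The closed form (4.12) can be checked against the defining series E[exp(tX)] at a point inside the convergence region; the sketch below (my own names, not from the text) truncates the series, whose terms decay geometrically:

```python
import math

def nb_pmf(k, m, p):
    # P(X = k) for X ~ NB(m, p)
    return math.comb(k + m - 1, m - 1) * p**m * (1 - p)**k

m, p, t = 2, 0.5, 0.3
assert t < -math.log(1 - p)  # inside the convergence region of (4.12)

# E[exp(t*X)] computed directly from the pmf (truncated series)
mgf_series = sum(math.exp(t * k) * nb_pmf(k, m, p) for k in range(1000))
mgf_closed = (p / (1 - (1 - p) * math.exp(t))) ** m
assert abs(mgf_series - mgf_closed) < 1e-10
```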
Replacing t by i · t in (4.12) yields the characteristic function
$$\varphi_{\mathrm{NB}(m,p)}(t) = \left(\frac{p}{1-(1-p)\,e^{i\cdot t}}\right)^{m}.$$
It is now easy to verify, using the moment-generating function, that for an NB(m, p)-distributed random variable X,
$$E[X] = m(1-p)/p, \quad \operatorname{var}(X) = m(1-p)/p^{2}.$$
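Both the sum representation and these moments can be illustrated by simulation (a sketch under my own naming, not from the text): summing m independent NB(1, p) draws and comparing the sample mean and variance with m(1 − p)/p and m(1 − p)/p²:

```python
import random

random.seed(0)

def nb1_draw(p):
    # One NB(1, p) draw: the number of failures before the first success
    failures = 0
    while random.random() >= p:  # success with probability p on each trial
        failures += 1
    return failures

m, p, n = 3, 0.4, 200_000
# By the sum representation, each sample below is NB(m, p) distributed
samples = [sum(nb1_draw(p) for _ in range(m)) for _ in range(n)]

sample_mean = sum(samples) / n
sample_var = sum((x - sample_mean) ** 2 for x in samples) / n

assert abs(sample_mean - m * (1 - p) / p) < 0.05   # E[X] = 4.5
assert abs(sample_var - m * (1 - p) / p**2) < 0.3  # var(X) = 11.25
```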