# Distributions and Transformations

This chapter reviews the most important univariate distributions and shows how to derive their expectations, variances, moment-generating functions (if they exist), and characteristic functions. Many distributions arise as transformations of random variables or vectors. Therefore, the problem of how the distribution of Y = g(X) is related to the distribution of X for a Borel-measurable function or mapping g(x) is also addressed.

In Chapter 1 I introduced three “natural” discrete distributions, namely, the hypergeometric, binomial, and Poisson distributions. The first two are natural in the sense that they arise from the way the random sample involved is drawn, and the last is natural because it is a limit of the binomial distribution. A fourth “natural” discrete distribution I will discuss is the negative binomial distribution.

## 4.1.1. The Hypergeometric Distribution

Recall that a random variable X has a hypergeometric distribution if

$$P(X = k) = \frac{\binom{K}{k}\binom{N-K}{n-k}}{\binom{N}{n}} \quad \text{for } k = 0, 1, \ldots, \min(n, K),$$

and P(X = k) = 0 elsewhere, where 0 < n < N and 0 < K < N are natural numbers. This distribution arises, for example, if we randomly draw n balls without replacement from a bowl containing K red balls and N − K white balls. The random variable X is then the number of red balls in the sample. In almost all applications of this distribution, n < K, and thus I will focus on that case only.
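As a quick numerical sketch of this probability mass function, the following uses only the standard-library `math.comb`; the parameter values n = 5, K = 8, N = 20 are illustrative choices, not taken from the text:

```python
from math import comb

def hypergeom_pmf(k, n, K, N):
    """P(X = k): probability of k red balls in n draws without
    replacement from N balls, K of which are red."""
    if k < 0 or k > min(n, K) or n - k > N - K:
        return 0.0
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Illustrative parameters: draw n = 5 balls from N = 20, K = 8 red.
n, K, N = 5, 8, 20
pmf = [hypergeom_pmf(k, n, K, N) for k in range(min(n, K) + 1)]
print(sum(pmf))  # the probabilities sum to 1
```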

The moment-generating function involved cannot be simplified further than its definition,

$$m_H(t) = \sum_{k=0}^{\min(n,K)} \exp(t \cdot k)\, P(X = k),$$

and the same applies to the characteristic function. Therefore, we have to derive the expectation directly:
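Since the moment-generating function has no closed form, it can only be evaluated term by term. A minimal sketch of that direct summation (illustrative parameters; `math.comb` returns 0 when the lower index exceeds the upper, which handles the support automatically here):

```python
from math import comb, exp

def hypergeom_mgf(t, n, K, N):
    """m_H(t) = sum_k exp(t*k) P(X = k), evaluated term by term."""
    total = comb(N, n)
    return sum(exp(t * k) * comb(K, k) * comb(N - K, n - k) / total
               for k in range(min(n, K) + 1))

# Sanity check: any moment-generating function satisfies m(0) = 1.
print(hypergeom_mgf(0.0, 5, 8, 20))
```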

$$E[X] = \sum_{k=0}^{\min(n,K)} k\,\frac{\binom{K}{k}\binom{N-K}{n-k}}{\binom{N}{n}} = n\frac{K}{N}\sum_{k=0}^{\min(n-1,K-1)} \frac{\binom{K-1}{k}\binom{(N-1)-(K-1)}{(n-1)-k}}{\binom{N-1}{n-1}} = n\frac{K}{N},$$

where the last equality holds because the sum in the middle expression runs over all probabilities of a hypergeometric distribution with parameters n − 1, K − 1, N − 1 and is therefore equal to 1.
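The closed-form result E[X] = nK/N can be checked numerically by brute-force summation; again the parameter values below are illustrative:

```python
from math import comb

# Illustrative parameters.
n, K, N = 5, 8, 20
total = comb(N, n)

# E[X] computed directly from the definition of the pmf.
mean = sum(k * comb(K, k) * comb(N - K, n - k) / total
           for k in range(min(n, K) + 1))

print(mean, n * K / N)  # the two values agree
```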

Along similar lines it follows that

$$E[X(X-1)] = n(n-1)\frac{K(K-1)}{N(N-1)};$$

hence,

$$\mathrm{var}(X) = E[X(X-1)] + E[X] - (E[X])^2 = n\,\frac{K}{N}\left(1 - \frac{K}{N}\right)\frac{N-n}{N-1}.$$
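The variance formula can be verified the same way, by computing the variance directly from the pmf and comparing it with the closed form (illustrative parameters as before):

```python
from math import comb

# Illustrative parameters.
n, K, N = 5, 8, 20
total = comb(N, n)
pmf = [comb(K, k) * comb(N - K, n - k) / total
       for k in range(min(n, K) + 1)]

# Mean and variance computed directly from the pmf.
mean = sum(k * p for k, p in enumerate(pmf))
var = sum((k - mean) ** 2 * p for k, p in enumerate(pmf))

# Closed form: n (K/N)(1 - K/N)(N - n)/(N - 1).
closed_form = n * (K / N) * (1 - K / N) * (N - n) / (N - 1)
print(var, closed_form)
```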
