Let us first review the definition of the convergence of a sequence of real numbers.

DEFINITION 6.1.1 A sequence of real numbers {a_n}, n = 1, 2, . . . , is said to converge to a real number a if for any ε > 0 there exists an integer N such that for all n > N we have

(6.1.1) |a_n − a| < ε.

We write a_n → a as n → ∞ or lim_{n→∞} a_n = a (n → ∞ will be omitted if it is obvious from the context).
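Definition 6.1.1 can be checked numerically. The sketch below, using the sequence a_n = 1/n converging to a = 0 (our own example, not from the text), searches for the smallest N that works for a given ε.

```python
# For a given tolerance eps, find the smallest N such that
# |a_n - a| < eps for every n > N (checked up to n_max terms).
def find_N(a_seq, a, eps, n_max=10**4):
    N = 0
    for n in range(1, n_max + 1):
        if abs(a_seq(n) - a) >= eps:
            N = n  # the inequality fails at n, so N must be at least n
    return N

a_seq = lambda n: 1.0 / n  # a_n = 1/n converges to a = 0
for eps in (0.1, 0.01, 0.001):
    print(eps, find_N(a_seq, 0.0, eps))
```

As ε shrinks, the required N grows (here N = 1/ε), which is exactly the "for any ε > 0 there exists an N" structure of the definition.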

Now we want to generalize Definition 6.1.1 to a sequence of random variables. If a_n were a random variable, we could not have (6.1.1) exactly, because it would be sometimes true and sometimes false. We could only talk about the probability of (6.1.1) being true. This suggests that we should modify the definition in such a way that the conclusion states that



We have now studied all the rules of probability for discrete events: the axioms of probability and conditional probability and the definition of independence. The following are examples of calculating probabilities using these rules.

EXAMPLE 2.5.1 Using the axioms of conditional probability, we shall solve the same problem that appears in Example 2.3.4. In the present approach we recognize only two types of cards, aces and nonaces, without paying any attention to the other characteristics (suits or numbers). We shall first compute the probability that three aces turn up in a particular sequence: for example, suppose the first three cards are aces and the last two nonaces. Let A_i denote the event that the ith card is an ace and let N_i denote the event that the ith card is a nonace...
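The card-by-card multiplication of conditional probabilities described above can be sketched as follows. We assume a standard 52-card deck with 4 aces and a 5-card deal (our reading of the example; the full statement is truncated here), and use exact rational arithmetic.

```python
from fractions import Fraction

def prob_sequence(pattern, aces=4, total=52):
    """P(a particular ace/nonace sequence), e.g. pattern 'AAANN' means
    the first three cards are aces and the last two are nonaces.
    Each factor is the conditional probability of the next card given
    the cards already drawn."""
    p = Fraction(1)
    a, n = aces, total - aces   # aces and nonaces remaining in the deck
    t = total                   # cards remaining in the deck
    for c in pattern:
        if c == 'A':
            p *= Fraction(a, t)
            a -= 1
        else:
            p *= Fraction(n, t)
            n -= 1
        t -= 1
    return p

# P(A1)P(A2|A1)P(A3|A1,A2)P(N4|...)P(N5|...) = (4/52)(3/51)(2/50)(48/49)(47/48)
print(prob_sequence('AAANN'))
```

Any other ordering of three aces and two nonaces gives the same value, since the same factors appear in a different order; this symmetry is what lets the example proceed by counting the orderings.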


As we have seen so far, a discrete random variable is characterized by specifying the probability with which the random variable takes each single value, but this cannot be done for a continuous random variable. Conversely, a continuous random variable has a density function but a discrete random variable does not. This dichotomy can sometimes be a source of inconvenience, and in certain situations it is better to treat all the random variables in a unified way. This can be done by using a cumulative distribution function (or, more simply, a distribution function), which can be defined for any random variable.

DEFINITION 3.5.1 The (cumulative) distribution function of a random variable X, denoted by F(·), is defined by

(3.5.1) F(x) = P(X ≤ x) for every real x.
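The unifying role of the distribution function can be seen in a small sketch: both a discrete and a continuous random variable have an F of exactly the same kind, a nondecreasing function from 0 to 1. The die and uniform examples below are our own illustrations, not taken from the text.

```python
import math

def cdf_die(x):
    """Distribution function of a fair six-sided die: a step function
    that jumps by 1/6 at each of the points 1, 2, ..., 6."""
    return min(max(math.floor(x), 0), 6) / 6

def cdf_uniform(x):
    """Distribution function of a uniform variable on [0, 1]:
    a continuous function, F(x) = x on [0, 1]."""
    return min(max(x, 0.0), 1.0)

# Both satisfy F(-inf) = 0 and F(+inf) = 1 and are nondecreasing.
print(cdf_die(3.5), cdf_uniform(0.25))
```

The only difference between the two cases is how F increases: in jumps for the discrete variable, continuously for the continuous one.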

From the definition and t...


Given a sequence of random variables {X_i}, i = 1, 2, . . . , define X̄_n = n^{-1} Σ_{i=1}^n X_i. A law of large numbers (LLN) specifies the conditions under which X̄_n − EX̄_n converges to 0 in probability. This law is sometimes referred to as a weak law of large numbers to distinguish it from a strong law of large numbers, which concerns almost sure convergence. We do not use the strong law of large numbers, however, and therefore the distinction is unnecessary here.

In many applications the simplest way to show X̄_n − EX̄_n → 0 in probability is to show that X̄_n − EX̄_n → 0 in mean square and then to apply Theorem 6.1.1 (Chebyshev). In certain situations it will be easier to apply

THEOREM 6.2.1 (Khinchine) Let {X_i} be independent and identically distributed (i.i.d.) with EX_i = μ. Then X̄_n → μ in probability.
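Khinchine's theorem is easy to watch at work in a simulation. The sketch below draws i.i.d. exponential variables with mean μ = 2 (our own choice of distribution, picked because it has a finite mean) and shows the sample mean settling near μ as n grows.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
mu = 2.0        # EX_i = mu; expovariate(1/mu) has mean mu

for n in (10, 1000, 100000):
    xbar = sum(random.expovariate(1 / mu) for _ in range(n)) / n
    print(n, xbar)  # xbar drifts toward mu = 2 as n grows
```

Note that the theorem requires only the existence of the mean, not of the variance, which is what distinguishes it from the Chebyshev route described above.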

Note that the conclusio...


We have already loosely defined the term random variable in Section 1.2 as a random mechanism whose outcomes are real numbers. We have mentioned discrete and continuous random variables: the discrete random variable takes a countable number of real numbers with preassigned probabilities, and the continuous random variable takes a continuum of values in the real line according to the rule determined by a density function. Later in this chapter we shall also mention a random variable that is a mixture of these two types. In general, we can simply state

DEFINITION 3.1.1 A random variable is a variable that takes values accord­ing to a certain probability distribution.
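The two types mentioned above can be sketched in code: a discrete variable drawing from a countable set with preassigned probabilities, and a continuous one governed by a density. The particular distributions here are our own examples, not taken from the text.

```python
import random

random.seed(1)

# Discrete: a countable set of values with preassigned probabilities.
values, probs = [0, 1, 2], [0.5, 0.3, 0.2]
x_discrete = random.choices(values, weights=probs, k=1)[0]

# Continuous: a continuum of values determined by a density
# (here the standard normal density, via random.gauss).
x_continuous = random.gauss(0.0, 1.0)

print(x_discrete, x_continuous)
```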

When we speak of a “variable,” we think of all the possible values it can take; wh...
