INTRODUCTION TO STATISTICS AND ECONOMETRICS

Bayes’ Theorem

Bayes’ theorem follows easily from the rules of probability but is listed separately here because of its special usefulness.

THEOREM 2.4.2 (Bayes) Let events A_1, A_2, . . . , A_n be mutually exclusive such that P(A_1 ∪ A_2 ∪ . . . ∪ A_n) = 1 and P(A_i) > 0 for each i. Let E be an arbitrary event such that P(E) > 0. Then

P(A_i | E) = P(E | A_i)P(A_i) / Σ_{j=1}^n P(E | A_j)P(A_j).

Proof. From Theorem 2.4.1, we have

(2.4.3) P(A_i | E) = P(E ∩ A_i) / P(E).

Since E ∩ A_1, E ∩ A_2, . . . , E ∩ A_n are mutually exclusive and their union is equal to E, we have, from axiom (3) of probability,

(2.4.4) P(E) = Σ_{j=1}^n P(E ∩ A_j).

Thus the theorem follows from (2.4.3) and (2.4.4) and by noting that P(E ∩ A_j) = P(E | A_j)P(A_j) by Theorem 2.4.1. □
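To make the theorem concrete, here is a minimal numerical sketch (a hypothetical example of my own, not from the text): two mutually exclusive events A_1 and A_2 play the role of the A_i, E is an observed event, and the posterior probabilities are computed exactly as in (2.4.3) and (2.4.4). All probabilities below are invented for illustration.

```python
# Numerical illustration of Bayes' theorem (hypothetical numbers).
# A1, A2 are mutually exclusive with P(A1) + P(A2) = 1; E is an observed event.
priors = {"A1": 0.01, "A2": 0.99}        # P(A_i)
likelihoods = {"A1": 0.95, "A2": 0.05}   # P(E | A_i)

# (2.4.4) with Theorem 2.4.1: P(E) = sum_j P(E | A_j) P(A_j)
p_e = sum(likelihoods[a] * priors[a] for a in priors)

# (2.4.3): P(A_i | E) = P(E | A_i) P(A_i) / P(E)
posteriors = {a: likelihoods[a] * priors[a] / p_e for a in priors}

print(p_e)         # P(E) = 0.059
print(posteriors)  # P(A1 | E) ≈ 0.161, P(A2 | E) ≈ 0.839
```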


Conditional Density

We shall extend the notion of conditional density in Definitions 3.3.2 and 3.3.3 to the case of bivariate random variables. We shall consider first the situation where the conditioning event has a positive probability and second the situation where the conditioning event has zero probability. Under the first situation we shall define both the joint conditional density and the conditional density involving only one of the variables. A generalization of Definition 3.3.3 is straightforward:

DEFINITION 3.4.2 Let (X, Y) have the joint density f(x, y) and let S be a subset of the plane such that P[(X, Y) ∈ S] > 0. Then the conditional density of (X, Y) given (X, Y) ∈ S, denoted by f(x, y | S), is defined by

(3.4.21) f(x, y | S) = f(x, y) / P[(X, Y) ∈ S] for (x, y) ∈ S,

= 0 otherwise.
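As an illustration of Definition 3.4.2 (my own example, not the book's): take X and Y to be independent standard normal variables and S the positive quadrant, so that P[(X, Y) ∈ S] = 1/4 and (3.4.21) gives f(x, y | S) = 4f(x, y) on S. The sketch below, assuming NumPy, checks this numerically.

```python
import numpy as np

rng = np.random.default_rng(0)

# Joint density of two independent standard normals.
def f(x, y):
    return np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)

# S = positive quadrant; P[(X, Y) in S] = 1/4 by symmetry.
p_s = 0.25

# Conditional density per (3.4.21): f(x, y | S) = f(x, y) / P[S] on S, else 0.
def f_cond(x, y):
    return np.where((x > 0) & (y > 0), f(x, y) / p_s, 0.0)

# Monte Carlo check of P[S]: fraction of draws landing in S.
xs, ys = rng.standard_normal(100_000), rng.standard_normal(100_000)
print(np.mean((xs > 0) & (ys > 0)))        # ≈ 0.25
print(f_cond(1.0, 1.0), 4 * f(1.0, 1.0))   # equal on S
```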

We are...


MULTIVARIATE NORMAL RANDOM VARIABLES

In this section we present results on multivariate normal variables in matrix notation. The student unfamiliar with matrix analysis should read Chapter 11 before this section. The results of this section will not be used directly until Section 9.7 and Chapters 12 and 13.

Let x be an n-dimensional column vector with Ex = μ and Vx = Σ. (Throughout this section, a matrix is denoted by a boldface capital letter and a vector by a boldface lowercase letter.) We write their elements explicitly as follows:

x = (x_1, x_2, . . . , x_n)′, μ = (μ_1, μ_2, . . . , μ_n)′, Σ = {σ_ij}, i, j = 1, 2, . . . , n.

Note that σ_ij = Cov(x_i, x_j), i, j = 1, 2, . . . , n, and, in particular, σ_ii = Vx_i, i = 1, 2, . . . , n. We sometimes write σ_i^2 for σ_ii.
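A quick numerical sketch of these identities (my own illustration, assuming NumPy): estimate Σ from simulated draws and confirm that the diagonal entries σ_ii are the variances Vx_i and that Σ is symmetric.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate draws of a 3-dimensional random vector with correlated components.
z = rng.standard_normal((50_000, 3))
x = z @ np.array([[1.0, 0.5, 0.0],
                  [0.0, 1.0, 0.3],
                  [0.0, 0.0, 1.0]])   # linear mixing induces covariance

sigma = np.cov(x, rowvar=False)      # sample estimate of Σ = {σ_ij}

# σ_ij = Cov(x_i, x_j); in particular σ_ii = Vx_i, and Σ is symmetric.
print(np.allclose(sigma[0, 0], np.var(x[:, 0], ddof=1)))  # True
print(np.allclose(sigma, sigma.T))                        # True
```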

DEFINITION 5.4.1 We say x is multivariate normal with mean μ and variance-covariance matrix Σ, denoted N(μ, Σ), if its density is given by

f(x) = (2π)^{-n/2} |Σ|^{-1/2} exp[-(1/2)(x − μ)′Σ^{-1}(x − μ)].
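The following is a minimal sketch, not the book's code, of evaluating this density with NumPy; it cross-checks against scipy.stats.multivariate_normal, which implements the same formula. The vector mu, the matrix sigma, and the point x are invented for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Density of N(mu, Sigma) per Definition 5.4.1.
def mvn_density(x, mu, sigma):
    n = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(sigma, d)   # (x - mu)' Sigma^{-1} (x - mu)
    norm = (2 * np.pi) ** (-n / 2) * np.linalg.det(sigma) ** (-0.5)
    return norm * np.exp(-0.5 * quad)

mu = np.array([0.0, 1.0])
sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
x = np.array([0.5, 0.5])

print(mvn_density(x, mu, sigma))
print(multivariate_normal(mean=mu, cov=sigma).pdf(x))  # should match
```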

Statistical Independence

We shall first define the concept of statistical (stochastic) independence for a pair of events. Henceforth it will be referred to simply as “independence.”

DEFINITION 2.4.1 Events A and B are said to be independent if P(A) = P(A | B).

The term “independence” has a clear intuitive meaning. It means that the probability of occurrence of A is not affected by whether or not B has occurred. Because of Theorem 2.4.1, the above equality is equivalent to P(A)P(B) = P(A ∩ B) or to P(B) = P(B | A).

Since the outcome of the second toss of a coin can reasonably be assumed to be independent of the outcome of the first, the above formula enables us to calculate the probability of obtaining heads twice in a row when tossing a fair coin: P(A ∩ B) = P(A)P(B) = (1/2)(1/2) = 1/4.
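A small simulation (my own sketch, not from the text) confirms this value:

```python
import random

random.seed(0)
trials = 100_000

# Toss two fair coins independently and count double heads.
count = 0
for _ in range(trials):
    first = random.random() < 0.5
    second = random.random() < 0.5   # independent of the first toss
    if first and second:
        count += 1

print(count / trials)  # ≈ 0.25 = P(A)P(B) = (1/2)(1/2)
```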

Definition 2.4...


Independence

Finally, we shall define the notion of independence between two continuous random variables.

DEFINITION 3.4.6 Continuous random variables X and Y are said to be independent if f(x, y) = f(x)f(y) for all x and y.

This definition can be shown to be equivalent to stating

P(x_1 ≤ X ≤ x_2, y_1 ≤ Y ≤ y_2) = P(x_1 ≤ X ≤ x_2)P(y_1 ≤ Y ≤ y_2)

for all x_1, x_2, y_1, y_2 such that x_1 ≤ x_2 and y_1 ≤ y_2. Thus stated, its connection to Definition 3.2.3, which defined independence for a pair of discrete random variables, is more apparent.

Definition 3.4.6 implies that in order to check the independence between a pair of continuous random variables, we should obtain the marginal densities and check whether their product equals the joint density. This may be a time-consuming process...
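As a sketch of that check on an invented example (not from the book): for the joint density f(x, y) = 4xy on the unit square, the marginal densities are f(x) = 2x and f(y) = 2y, and their product recovers the joint density, so X and Y are independent by Definition 3.4.6. The code below, assuming NumPy and SciPy, performs the check numerically.

```python
import numpy as np
from scipy import integrate

# Example joint density on the unit square: f(x, y) = 4xy.
def joint(x, y):
    return 4 * x * y

# Marginal of X: integrate the joint density over y.
def marginal_x(x):
    return integrate.quad(lambda y: joint(x, y), 0, 1)[0]   # = 2x

# Marginal of Y: integrate the joint density over x.
def marginal_y(y):
    return integrate.quad(lambda x: joint(x, y), 0, 1)[0]   # = 2y

# Independence check per Definition 3.4.6: joint equals product of marginals.
for x, y in [(0.2, 0.7), (0.5, 0.5), (0.9, 0.1)]:
    print(np.isclose(joint(x, y), marginal_x(x) * marginal_y(y)))  # True
```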
