A Course in Mathematical Statistics, Part 4


$$\int_{-\infty}^{\infty} e^{-itx}\phi(t)\,dt = 2\lim_{T\to\infty}\int_{-\infty}^{\infty} f(y)\,\frac{\sin T(y-x)}{y-x}\,dy. \tag{8}$$

Setting T(y − x) = z, expression (8) becomes

$$\int_{-\infty}^{\infty} e^{-itx}\phi(t)\,dt = 2\lim_{T\to\infty}\int_{-\infty}^{\infty} f\!\left(x+\frac{z}{T}\right)\frac{\sin z}{z}\,dz = 2f(x)\int_{-\infty}^{\infty}\frac{\sin z}{z}\,dz = 2\pi f(x),$$

by taking the limit under the integral sign, and by using continuity of f and the fact that

$$\int_{-\infty}^{\infty}\frac{\sin z}{z}\,dz = \pi.$$

Solving for f(x), we have

$$f(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-itx}\phi(t)\,dt,$$

as asserted. ▲

EXAMPLE 1 Let X be B(n, p). In the next section, it will be seen that φ_X(t) = (pe^{it} + q)^n. Let us apply (i) to this expression. First of all, we have

$$\frac{1}{2T}\int_{-T}^{T} e^{-itx}\phi_X(t)\,dt = \frac{1}{2T}\int_{-T}^{T}\left(pe^{it}+q\right)^n e^{-itx}\,dt = \frac{1}{2T}\int_{-T}^{T}\sum_{r=0}^{n}\binom{n}{r}\left(pe^{it}\right)^r q^{n-r} e^{-itx}\,dt$$

$$= \sum_{r=0}^{n}\binom{n}{r}p^r q^{n-r}\,\frac{1}{2T}\int_{-T}^{T} e^{i(r-x)t}\,dt = \sum_{\substack{r=0\\ r\ne x}}^{n}\binom{n}{r}p^r q^{n-r}\,\frac{e^{i(r-x)T}-e^{-i(r-x)T}}{2Ti(r-x)} + \binom{n}{x}p^x q^{n-x}$$

$$= \sum_{\substack{r=0\\ r\ne x}}^{n}\binom{n}{r}p^r q^{n-r}\,\frac{\sin\big[(r-x)T\big]}{(r-x)T} + \binom{n}{x}p^x q^{n-x}.$$

Taking the limit as T → ∞, we get the desired result, namely

$$f(x) = \binom{n}{x}p^x q^{n-x}.$$

(One could also use (i′) for calculating f(x), since φ is, clearly, periodic with period 2π.)

EXAMPLE 2 For an example of the continuous type, let X be N(0, 1). In the next section, we will see that φ_X(t) = e^{−t²/2}. Since |φ(t)| = e^{−t²/2}, we know that ∫|φ(t)| dt < ∞, so that (ii′) applies. Thus we have

$$f(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-itx}\phi(t)\,dt = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-itx}e^{-t^2/2}\,dt = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-\frac{1}{2}\left(t^2+2itx\right)}\,dt$$

$$= \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-\frac{1}{2}\left[(t+ix)^2-(ix)^2\right]}\,dt = \frac{1}{2\pi}\,e^{-x^2/2}\int_{-\infty}^{\infty} e^{-\frac{1}{2}(t+ix)^2}\,dt = \frac{1}{2\pi}\,e^{-x^2/2}\int_{-\infty}^{\infty} e^{-u^2/2}\,du$$

$$= \frac{1}{2\pi}\,e^{-x^2/2}\cdot\sqrt{2\pi} = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2},$$

as was to be shown.

THEOREM 3 (Uniqueness Theorem) There is a one-to-one correspondence between the characteristic function and the p.d.f. of a random variable.

PROOF The p.d.f. of an r.v. determines its ch.f. through the definition of the ch.f. The converse, which is the involved part of the theorem, follows from Theorem 2. ▲
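Before turning to the exercises, here is a quick numerical sanity check of Example 1 (our addition, not part of the text; n = 5 and p = 0.3 are arbitrary illustrative values). It recovers the B(n, p) p.m.f. from the ch.f. via the periodic form (i′) of the inversion formula, integrating over one period [−π, π].

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import binom

n, p = 5, 0.3          # arbitrary illustrative parameters
q = 1 - p

def phi(t):
    """ch.f. of B(n, p): (p*e^{it} + q)^n."""
    return (p * np.exp(1j * t) + q) ** n

def pmf_by_inversion(x):
    # Version (i') for a ch.f. with period 2*pi:
    # f(x) = (1 / (2*pi)) * integral_{-pi}^{pi} e^{-itx} phi(t) dt.
    # The imaginary part integrates to 0, so only the real part is kept.
    val, _ = quad(lambda t: (np.exp(-1j * t * x) * phi(t)).real, -np.pi, np.pi)
    return val / (2 * np.pi)

for x in range(n + 1):
    assert abs(pmf_by_inversion(x) - binom.pmf(x, n, p)) < 1e-7

print("inversion formula reproduces the B(n, p) p.m.f.")
```

The same experiment with the full form (i) and a large T converges much more slowly, since the error terms decay only like sin[(r − x)T]/[(r − x)T]; this is why the periodic variant is the convenient one for lattice distributions.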
Exercises

6.2.1 Show that for any r.v. X and every t ∈ ℝ, one has |Ee^{itX}| ≤ E|e^{itX}| (= 1). (Hint: If z = a + ib, a, b ∈ ℝ, recall that |z| = √(a² + b²). Also use Exercise 5.4.7 in Chapter 5 in order to conclude that (EY)² ≤ EY² for any r.v. Y.)

6.2.2 Write out detailed proofs for parts (iii) and (vii) of Theorem 1 and justify the use of Lemmas C, D.

6.2.3 For any r.v. X with ch.f. φ_X, show that φ_{−X}(t) = φ̄_X(t), t ∈ ℝ, where the bar over φ_X denotes conjugate, that is, if z = a + ib, a, b ∈ ℝ, then z̄ = a − ib.

6.2.4 Show that the ch.f. φ_X of an r.v. X is real if and only if the p.d.f. f_X of X is symmetric about 0 (that is, f_X(−x) = f_X(x), x ∈ ℝ). (Hint: If φ_X is real, then the conclusion is reached by means of the previous exercise and Theorem 2. If f_X is symmetric, show that f_{−X}(x) = f_X(−x), x ∈ ℝ.)

6.2.5 Let X be an r.v. with p.d.f. f and ch.f. φ given by: φ(t) = 1 − |t| if |t| ≤ 1 and φ(t) = 0 if |t| > 1. Use the appropriate inversion formula to find f.

6.2.6 Consider the r.v. X with ch.f. φ(t) = e^{−|t|}, t ∈ ℝ, and utilize Theorem 2(ii′) in order to determine the p.d.f. of X.

6.3 The Characteristic Functions of Some Random Variables

In this section, the ch.f.'s of some commonly occurring distributions will be derived, both for illustrative purposes and for later use.

6.3.1 Discrete Case

1. Let X be B(n, p). Then φ_X(t) = (pe^{it} + q)^n. In fact,

$$\phi_X(t) = \sum_{x=0}^{n} e^{itx}\binom{n}{x}p^x q^{n-x} = \sum_{x=0}^{n}\binom{n}{x}\left(pe^{it}\right)^x q^{n-x} = \left(pe^{it}+q\right)^n.$$

Hence

$$\frac{d}{dt}\phi_X(t)\Big|_{t=0} = n\left(pe^{it}+q\right)^{n-1} ipe^{it}\Big|_{t=0} = inp,$$

so that E(X) = np. Also,

$$\frac{d^2}{dt^2}\phi_X(t)\Big|_{t=0} = inp\,\frac{d}{dt}\left[\left(pe^{it}+q\right)^{n-1}e^{it}\right]_{t=0} = inp\left[(n-1)\left(pe^{it}+q\right)^{n-2}ipe^{it}\cdot e^{it} + \left(pe^{it}+q\right)^{n-1}ie^{it}\right]_{t=0}$$

$$= -np\left[(n-1)p+1\right] = -\left[n(n-1)p^2+np\right],$$

so that

$$E(X^2) = n(n-1)p^2+np \quad\text{and}\quad \sigma^2(X) = E(X^2)-(EX)^2 = n(n-1)p^2+np-n^2p^2 = np-np^2 = np(1-p) = npq;$$

that is, σ²(X) = npq.

2. Let X be P(λ). Then φ_X(t) = e^{λe^{it} − λ}. In fact,

$$\phi_X(t) = \sum_{x=0}^{\infty} e^{itx}\,\frac{e^{-\lambda}\lambda^x}{x!} = e^{-\lambda}\sum_{x=0}^{\infty}\frac{\left(\lambda e^{it}\right)^x}{x!} = e^{-\lambda}e^{\lambda e^{it}} = e^{\lambda e^{it}-\lambda}.$$

Hence

$$\frac{d}{dt}\phi_X(t)\Big|_{t=0} = e^{\lambda e^{it}-\lambda}\,i\lambda e^{it}\Big|_{t=0} = i\lambda,$$

so that E(X) = λ. Also,

$$\frac{d^2}{dt^2}\phi_X(t)\Big|_{t=0} = \frac{d}{dt}\left[i\lambda e^{\lambda e^{it}-\lambda+it}\right]_{t=0} = i\lambda e^{\lambda e^{it}-\lambda+it}\left(i\lambda e^{it}+i\right)\Big|_{t=0} = i^2\lambda(\lambda+1) = -\lambda(\lambda+1),$$

so that

$$\sigma^2(X) = E(X^2)-(EX)^2 = \lambda(\lambda+1)-\lambda^2 = \lambda;$$

that is, σ²(X) = λ.

6.3.2 Continuous Case

1. Let X be N(μ, σ²). Then φ_X(t) = e^{itμ − σ²t²/2}, and, in particular, if X is N(0, 1), then φ_X(t) = e^{−t²/2}.

If X is N(μ, σ²), then (X − μ)/σ is N(0, 1). Thus

$$\phi_{(X-\mu)/\sigma}(t) = \phi_{\frac{1}{\sigma}X-\frac{\mu}{\sigma}}(t) = e^{-it\mu/\sigma}\,\phi_X\!\left(\frac{t}{\sigma}\right), \quad\text{and so}\quad \phi_X\!\left(\frac{t}{\sigma}\right) = e^{it\mu/\sigma}\,\phi_{(X-\mu)/\sigma}(t).$$

So it suffices to find the ch.f. of an N(0, 1) r.v. Y, say. Now

$$\phi_Y(t) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{ity}e^{-y^2/2}\,dy = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-\frac{1}{2}(y-it)^2}e^{-t^2/2}\,dy = e^{-t^2/2}.$$

Hence φ_X(t/σ) = e^{itμ/σ} e^{−t²/2}, and replacing t/σ by t, we get, finally:

$$\phi_X(t) = \exp\left(it\mu-\frac{\sigma^2t^2}{2}\right).$$

Hence

$$\frac{d}{dt}\phi_X(t)\Big|_{t=0} = \exp\left(it\mu-\frac{\sigma^2t^2}{2}\right)\left(i\mu-\sigma^2t\right)\Big|_{t=0} = i\mu, \quad\text{so that } E(X)=\mu,$$

and

$$\frac{d^2}{dt^2}\phi_X(t)\Big|_{t=0} = \left[\exp\left(it\mu-\frac{\sigma^2t^2}{2}\right)\left(i\mu-\sigma^2t\right)^2-\sigma^2\exp\left(it\mu-\frac{\sigma^2t^2}{2}\right)\right]_{t=0} = -\mu^2-\sigma^2 = i^2\left(\mu^2+\sigma^2\right).$$

Then E(X²) = μ² + σ² and σ²(X) = μ² + σ² − μ² = σ².

2. Let X be Gamma distributed with parameters α and β. Then φ_X(t) = (1 − iβt)^{−α}. In fact,

$$\phi_X(t) = \frac{1}{\Gamma(\alpha)\beta^\alpha}\int_0^{\infty} e^{itx}x^{\alpha-1}e^{-x/\beta}\,dx = \frac{1}{\Gamma(\alpha)\beta^\alpha}\int_0^{\infty} x^{\alpha-1}e^{-x(1-i\beta t)/\beta}\,dx.$$

Setting x(1 − iβt) = y, we get

$$x = \frac{y}{1-i\beta t},\qquad dx = \frac{dy}{1-i\beta t},\qquad y\in[0,\infty).$$

Hence the above expression becomes

$$\frac{1}{\Gamma(\alpha)\beta^\alpha}\int_0^{\infty}\left(\frac{y}{1-i\beta t}\right)^{\alpha-1} e^{-y/\beta}\,\frac{dy}{1-i\beta t} = \left(1-i\beta t\right)^{-\alpha}\frac{1}{\Gamma(\alpha)\beta^\alpha}\int_0^{\infty} y^{\alpha-1}e^{-y/\beta}\,dy = \left(1-i\beta t\right)^{-\alpha}.$$

Therefore

$$\frac{d}{dt}\phi_X(t)\Big|_{t=0} = \frac{i\alpha\beta}{\left(1-i\beta t\right)^{\alpha+1}}\Big|_{t=0} = i\alpha\beta,$$

so that E(X) = αβ, and

$$\frac{d^2}{dt^2}\phi_X(t)\Big|_{t=0} = \frac{i^2\alpha(\alpha+1)\beta^2}{\left(1-i\beta t\right)^{\alpha+2}}\Big|_{t=0} = i^2\alpha(\alpha+1)\beta^2,$$

so that E(X²) = α(α + 1)β². Thus σ²(X) = α(α + 1)β² − α²β² = αβ².
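Before specializing the parameters, here is a small numerical check (our addition; α = 3, β = 2 are arbitrary test values) that differentiating φ_X(t) = (1 − iβt)^{−α} at t = 0 reproduces the moments just derived; central finite differences stand in for the exact derivatives.

```python
import numpy as np

alpha, beta = 3.0, 2.0   # arbitrary test values

def phi(t):
    """ch.f. of the Gamma(alpha, beta) distribution."""
    return (1 - 1j * beta * t) ** (-alpha)

h = 1e-5
d1 = (phi(h) - phi(-h)) / (2 * h)              # approximates phi'(0)  = i E(X)
d2 = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2  # approximates phi''(0) = i^2 E(X^2)

EX  = (d1 / 1j).real
EX2 = (d2 / 1j ** 2).real

print(EX,  alpha * beta)                        # both approx. 6
print(EX2, alpha * (alpha + 1) * beta ** 2)     # both approx. 48
print(EX2 - EX ** 2, alpha * beta ** 2)         # variance, both approx. 12
```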
For α = r/2, β = 2, we get the corresponding quantities for χ²_r, and for α = 1, β = 1/λ, we get the corresponding quantities for the Negative Exponential distribution. So

$$\phi_X(t) = \left(1-2it\right)^{-r/2}, \qquad \phi_X(t) = \left(1-\frac{it}{\lambda}\right)^{-1} = \frac{\lambda}{\lambda-it},$$

respectively.

3. Let X be Cauchy distributed with μ = 0 and σ = 1. Then φ_X(t) = e^{−|t|}. In fact,

$$\phi_X(t) = \frac{1}{\pi}\int_{-\infty}^{\infty} e^{itx}\,\frac{1}{1+x^2}\,dx = \frac{1}{\pi}\int_{-\infty}^{\infty}\frac{\cos(tx)}{1+x^2}\,dx + \frac{i}{\pi}\int_{-\infty}^{\infty}\frac{\sin(tx)}{1+x^2}\,dx = \frac{2}{\pi}\int_0^{\infty}\frac{\cos(tx)}{1+x^2}\,dx,$$

because

$$\int_{-\infty}^{\infty}\frac{\sin(tx)}{1+x^2}\,dx = 0,$$

since sin(tx) is an odd function, and cos(tx) is an even function. Further, it can be shown by complex variables theory that

$$\int_0^{\infty}\frac{\cos(tx)}{1+x^2}\,dx = \frac{\pi}{2}\,e^{-|t|}.$$

Hence φ_X(t) = e^{−|t|}. Now

$$\frac{d}{dt}\phi_X(t) = \frac{d}{dt}\,e^{-|t|}$$

does not exist for t = 0. This is consistent with the fact of nonexistence of E(X), as has been seen in Chapter 5.

Exercises

6.3.1 Let X be an r.v. with p.d.f. f given in Exercise 3.2.13 of Chapter 3. Derive its ch.f. φ, and calculate EX, E[X(X − 1)], σ²(X), provided they are finite.

6.3.2 Let X be an r.v. with p.d.f. f given in Exercise 3.2.14 of Chapter 3. Derive its ch.f. φ, and calculate EX, E[X(X − 1)], σ²(X), provided they are finite.

6.3.3 Let X be an r.v. with p.d.f. f given by f(x) = λe^{−λ(x−α)} I_{(α,∞)}(x). Find its ch.f. φ, and calculate EX, σ²(X), provided they are finite.

6.3.4 Let X be an r.v. distributed as Negative Binomial with parameters r and p.

i) Show that its ch.f., φ, is given by

$$\phi(t) = \frac{p^r}{\left(1-qe^{it}\right)^r};$$

ii) By differentiating φ, show that EX = rq/p and σ²(X) = rq/p²;

iii) Find the quantities mentioned in (i) and (ii) for the Geometric distribution.

6.3.5 Let X be an r.v. distributed as U(α, β).

i) Show that its ch.f., φ, is given by

$$\phi(t) = \frac{e^{it\beta}-e^{it\alpha}}{it(\beta-\alpha)};$$

ii) By differentiating φ, show that EX = (α + β)/2 and σ²(X) = (α − β)²/12.

6.3.6 Consider the r.v. X with p.d.f. f given in Exercise 3.3.14(ii) of Chapter 3, and by using the ch.f. of X, calculate EXⁿ, n = 1, 2, . . . , provided they are finite.

6.4 Definitions and Basic Theorems—The Multidimensional Case

In this section, versions of Theorems 1, 2 and 3 are presented for the case that the r.v. X is replaced by a k-dimensional r. vector X. Their interpretation, usefulness and usage are analogous to those given in the one-dimensional case.

To this end, let now X = (X₁, . . . , X_k)′ be a random vector. Then the ch.f. of the r. vector X, or the joint ch.f. of the r.v.'s X₁, . . . , X_k, denoted by φ_X or φ_{X₁, . . . , X_k}, is defined as follows:

$$\phi_{X_1,\dots,X_k}(t_1,\dots,t_k) = E\left[e^{it_1X_1+it_2X_2+\cdots+it_kX_k}\right],\qquad t_j\in\mathbb{R},\ j=1,2,\dots,k.$$

The ch.f. φ_{X₁, . . . , X_k} always exists by an obvious generalization of Lemmas A, A′ and B, B′.
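The defining expectation is straightforward to evaluate for any concrete random vector. The sketch below (our illustration; the joint p.m.f. is an arbitrary toy table) computes a bivariate ch.f. both by the defining sum and by Monte Carlo simulation, as a check that the two agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary toy joint p.m.f. on {0,1} x {0,1,2}.
support = [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]
probs   = [0.10, 0.20, 0.10, 0.25, 0.15, 0.20]

def phi_exact(t1, t2):
    # phi(t1, t2) = E exp(i(t1*X1 + t2*X2)), computed by the defining sum.
    return sum(p * np.exp(1j * (t1 * x1 + t2 * x2))
               for (x1, x2), p in zip(support, probs))

def phi_mc(t1, t2, m=200_000):
    # The same expectation estimated from m simulated copies of (X1, X2).
    idx = rng.choice(len(support), size=m, p=probs)
    x = np.array(support)[idx]
    return np.mean(np.exp(1j * (t1 * x[:, 0] + t2 * x[:, 1])))

t1, t2 = 0.7, -1.3
print(phi_exact(t1, t2))   # exact value
print(phi_mc(t1, t2))      # estimate; agrees to roughly 2-3 decimals
```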
The joint ch.f. φ_{X₁, . . . , X_k} satisfies properties analogous to properties (i)–(vii). That is, one has

THEOREM 1′ (Some properties of ch.f.'s)

i′) φ_{X₁, . . . , X_k}(0, . . . , 0) = 1.

ii′) |φ_{X₁, . . . , X_k}(t₁, . . . , t_k)| ≤ 1.

iii′) φ_{X₁, . . . , X_k} is uniformly continuous.

iv′) φ_{X₁+d₁, . . . , X_k+d_k}(t₁, . . . , t_k) = e^{it₁d₁+···+it_kd_k} φ_{X₁, . . . , X_k}(t₁, . . . , t_k).

v′) φ_{c₁X₁, . . . , c_kX_k}(t₁, . . . , t_k) = φ_{X₁, . . . , X_k}(c₁t₁, . . . , c_kt_k).

vi′) φ_{c₁X₁+d₁, . . . , c_kX_k+d_k}(t₁, . . . , t_k) = e^{it₁d₁+···+it_kd_k} φ_{X₁, . . . , X_k}(c₁t₁, . . . , c_kt_k).

vii′) If the absolute (n₁, . . . , n_k)-joint moment, as well as all lower order joint moments of X₁, . . . , X_k are finite, then

$$\frac{\partial^{n_1+\cdots+n_k}}{\partial t_1^{n_1}\cdots\partial t_k^{n_k}}\,\phi_{X_1,\dots,X_k}(t_1,\dots,t_k)\Big|_{t_1=\cdots=t_k=0} = i^{\sum_{j=1}^{k}n_j}\,E\left(X_1^{n_1}\cdots X_k^{n_k}\right),$$

and, in particular,

$$\frac{\partial^{n}}{\partial t_j^{n}}\,\phi_{X_1,\dots,X_k}(t_1,\dots,t_k)\Big|_{t_1=\cdots=t_k=0} = i^{n}\,E\left(X_j^{n}\right),\qquad j=1,2,\dots,k.$$

viii′) If in φ_{X₁, . . . , X_k}(t₁, . . . , t_k) we set t_{j₁} = · · · = t_{j_n} = 0, then the resulting expression is the joint ch.f. of the r.v.'s X_{i₁}, . . . , X_{i_m}, where the j's and the i's are different and m + n = k.

Multidimensional versions of Theorem 2 and Theorem 3 also hold true. We give their formulations below.

THEOREM 2′ (Inversion formula) Let X = (X₁, . . . , X_k)′ be an r. vector with p.d.f. f and ch.f. φ. Then

i)
$$f_{X_1,\dots,X_k}(x_1,\dots,x_k) = \lim_{T\to\infty}\left(\frac{1}{2T}\right)^{k}\int_{-T}^{T}\cdots\int_{-T}^{T} e^{-it_1x_1-\cdots-it_kx_k}\,\phi_{X_1,\dots,X_k}(t_1,\dots,t_k)\,dt_1\cdots dt_k,$$

if X is of the discrete type, and

ii)
$$f_{X_1,\dots,X_k}(x_1,\dots,x_k) = \lim_{h\to0}\lim_{T\to\infty}\left(\frac{1}{2\pi}\right)^{k}\int_{-T}^{T}\cdots\int_{-T}^{T}\prod_{j=1}^{k}\left(\frac{1-e^{-it_jh}}{it_jh}\right) e^{-it_1x_1-\cdots-it_kx_k}\,\phi_{X_1,\dots,X_k}(t_1,\dots,t_k)\,dt_1\cdots dt_k,$$

if X is of the continuous type, with the analog of (ii′) holding if the integral of |φ_{X₁, . . . , X_k}(t₁, . . . , t_k)| is finite.

THEOREM 3′ (Uniqueness Theorem) There is a one-to-one correspondence between the ch.f. and the p.d.f. of an r. vector.

PROOFS The justification of Theorem 1′ is entirely analogous to that given for Theorem 1, and so is the proof of Theorem 2′. As for Theorem 3′, the fact that the p.d.f. of X determines its ch.f. follows from the definition of the ch.f. That the ch.f. of X determines its p.d.f. follows from Theorem 2′. ▲

6.4.1 The Ch.f. of the Multinomial Distribution

Let X = (X₁, . . . , X_k)′ be Multinomially distributed; that is,

$$P(X_1=x_1,\dots,X_k=x_k) = \frac{n!}{x_1!\cdots x_k!}\,p_1^{x_1}\cdots p_k^{x_k}.$$

Then

$$\phi_{X_1,\dots,X_k}(t_1,\dots,t_k) = \left(p_1e^{it_1}+\cdots+p_ke^{it_k}\right)^{n}.$$

In fact,

$$\phi_{X_1,\dots,X_k}(t_1,\dots,t_k) = \sum_{x_1,\dots,x_k} e^{it_1x_1+\cdots+it_kx_k}\,\frac{n!}{x_1!\cdots x_k!}\,p_1^{x_1}\cdots p_k^{x_k} = \sum_{x_1,\dots,x_k}\frac{n!}{x_1!\cdots x_k!}\left(p_1e^{it_1}\right)^{x_1}\cdots\left(p_ke^{it_k}\right)^{x_k} = \left(p_1e^{it_1}+\cdots+p_ke^{it_k}\right)^{n}.$$

Hence

$$\frac{\partial^{k}}{\partial t_1\cdots\partial t_k}\,\phi_{X_1,\dots,X_k}(t_1,\dots,t_k)\Big|_{t_1=\cdots=t_k=0} = i^{k}\,n(n-1)\cdots(n-k+1)\,p_1p_2\cdots p_k.$$

Hence

$$E\left(X_1X_2\cdots X_k\right) = n(n-1)\cdots(n-k+1)\,p_1p_2\cdots p_k.$$

Finally, the ch.f. of a (measurable) function g(X) of the r. vector X = (X₁, . . . , X_k)′ is defined by:

$$\phi_{g(\mathbf{X})}(t) = E\left[e^{itg(\mathbf{X})}\right] = \begin{cases}\displaystyle\sum_{\mathbf{x}} e^{itg(x_1,\dots,x_k)}\,f(x_1,\dots,x_k),\\[2ex]\displaystyle\int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} e^{itg(x_1,\dots,x_k)}\,f(x_1,\dots,x_k)\,dx_1\cdots dx_k.\end{cases}$$

Exercise

6.4.1 (Cramér–Wold) Consider the r.v.'s X_j, j = 1, . . . , k and for c_j ∈ ℝ, j = 1, . . . , k, set

$$Y_c = \sum_{j=1}^{k} c_jX_j.$$

Then

i) Show that φ_{Y_c}(t) = φ_{X₁, . . . , X_k}(c₁t, . . . , c_kt), t ∈ ℝ, and φ_{X₁, . . . , X_k}(c₁, . . . , c_k) = φ_{Y_c}(1);

ii) Conclude that the distribution of the X's determines the distribution of Y_c for every c_j ∈ ℝ, j = 1, . . . , k. Conversely, the distribution of the X's is determined by the distribution of Y_c for every c_j ∈ ℝ, j = 1, . . . , k.
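Exercise 6.4.1(i) and the multinomial ch.f. of Section 6.4.1 combine into a concrete numerical check (our addition; n, p and c are arbitrary choices): the closed-form joint ch.f. evaluated at (c₁, . . . , c_k) should match a simulation estimate of φ_{Y_c}(1).

```python
import numpy as np

rng = np.random.default_rng(1)

n = 10
p = np.array([0.2, 0.3, 0.5])      # multinomial cell probabilities
c = np.array([0.4, -1.0, 2.5])     # arbitrary coefficients c_j

# Closed-form multinomial joint ch.f. at (c_1, c_2, c_3):
# (p_1 e^{i c_1} + p_2 e^{i c_2} + p_3 e^{i c_3})^n.
phi_joint = (p @ np.exp(1j * c)) ** n

# Monte Carlo estimate of phi_{Y_c}(1) = E exp(i * sum_j c_j X_j).
X = rng.multinomial(n, p, size=400_000)   # rows are draws of (X_1, X_2, X_3)
phi_Yc = np.mean(np.exp(1j * (X @ c)))

print(phi_joint)  # exact
print(phi_Yc)     # estimate; should agree to roughly 2-3 decimals
```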
6.5 The Moment Generating Function and Factorial Moment Generating Function of a Random Variable

The ch.f. of an r.v. or an r. vector is a function defined on the entire real line and taking values in the complex plane. Those readers who are not well versed in matters related to complex-valued functions may feel uncomfortable in dealing with ch.f.'s. There is a partial remedy to this potential problem, and that is to replace a ch.f. by an entity which is called the moment generating function. However, there is a price to be paid for this: namely, a moment generating function may exist (in the sense of being finite) only for t = 0. There are cases where it exists for t's lying in a proper subset of ℝ (containing 0), and yet other cases where the moment generating function exists for all real t. All three cases will be illustrated by examples below.

First, consider the case of an r.v. X. Then the moment generating function (m.g.f.) M_X (or just M when no confusion is possible) of a random variable X, which is also called the Laplace transform of f, is defined by M_X(t) = E(e^{tX}), t ∈ ℝ, if this expectation exists. For t = 0, M_X(0) always exists and equals 1. However, it may fail to exist for t ≠ 0. If M_X(t) exists, then formally φ_X(t) = M_X(it) and therefore the m.g.f. satisfies most of the properties analogous to properties (i)–(vii) cited above in connection with the ch.f., under suitable conditions. In particular, property (vii) in Theorem 1 yields

$$\frac{d^n}{dt^n}M_X(t)\Big|_{t=0} = E\left(X^n\right),$$

provided Lemma D applies. In fact,

$$\frac{d^n}{dt^n}M_X(t)\Big|_{t=0} = \frac{d^n}{dt^n}E\left(e^{tX}\right)\Big|_{t=0} = E\left(\frac{d^n}{dt^n}e^{tX}\right)\Big|_{t=0} = E\left(X^ne^{tX}\right)\Big|_{t=0} = E\left(X^n\right).$$

This is the property from which the m.g.f. derives its name.

[...] Lemma D applies, so that the interchange of the order of differentiation and expectation is valid. Hence

$$\frac{d^n}{dt^n}\,\eta_X(t)\Big|_{t=1} = E\left[X(X-1)\cdots(X-n+1)\right]. \tag{9}$$

The factorial m.g.f. derives its name from the property just established. As has already been seen in the first two examples in Section 2 of Chapter 5, factorial moments are especially valuable in calculating the variance of discrete r.v.'s. Indeed, [...]
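As a worked illustration of these two generating functions (our addition; the Poisson forms M_X(t) = exp{λ(e^t − 1)} and η_X(t) = e^{λ(t−1)} are the standard ones and are not derived in this excerpt), the following sympy sketch recovers EX, E(X²) and, via relation (9), the second factorial moment E[X(X − 1)].

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lam', positive=True)

M   = sp.exp(lam * (sp.exp(t) - 1))   # m.g.f. of P(lam):           E e^{tX}
eta = sp.exp(lam * (t - 1))           # factorial m.g.f. of P(lam): E t^X

EX    = sp.diff(M, t).subs(t, 0)        # -> lam
EX2   = sp.diff(M, t, 2).subs(t, 0)     # -> lam**2 + lam
EXXm1 = sp.diff(eta, t, 2).subs(t, 1)   # relation (9): E[X(X-1)] -> lam**2

print(sp.simplify(EX), sp.simplify(EX2), sp.simplify(EXXm1))
print(sp.simplify(EX2 - EX ** 2))       # variance of P(lam): lam
```

The last line reproduces the familiar route to the Poisson variance through factorial moments: σ²(X) = E[X(X − 1)] + EX − (EX)².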
[...] with parameter p, show that the r.v. X = X₁ + · · · + X_r has the Negative Binomial distribution with parameters r and p.

7.3.3 The life of a certain part in a new automobile is an r.v. X whose p.d.f. is Negative Exponential with parameter λ = 0.005 day.

i) Find the expected life of the part; [...]

[...] which again establishes the independence of X_j, j = 1, . . . , k, by Theorem 1(ii). ▲

REMARK 4 A version of this theorem involving m.g.f.'s can be formulated, if the m.g.f.'s exist.

COROLLARY Let X₁, X₂ have the Bivariate Normal distribution. Then X₁, X₂ are independent if and only if they are uncorrelated.

PROOF We have seen that (see Bivariate Normal in Section [...]

[...] β's are the r.v.'s X and Y independent?

7.4* Independence of Classes of Events and Related Results

In this section, we give an alternative definition of independence of r.v.'s, which allows us to present a proof of Lemma 1. An additional result, Lemma 3, is also stated, which provides a parsimonious way of checking the independence of r.v.'s.

To start with, consider the probability space (S, A, P) and recall that k events A₁, . . . , A_k are said to be independent if for all 2 ≤ m ≤ k and all 1 ≤ i₁ < · · · < i_m ≤ k, it holds that

$$P\left(A_{i_1}\cap\cdots\cap A_{i_m}\right) = P\left(A_{i_1}\right)\cdots P\left(A_{i_m}\right).$$

This definition is extended to any subclasses C_j, j = 1, . . . , k, as follows:

DEFINITION 2 We say that C_j, j = 1, . . . , k are (stochastically) independent (or independent in the probability [...]

[...] dependent or negatively quadrant dependent, respectively. In particular, if X₁ and X₂ have the Bivariate Normal distribution, it can be seen that they are positively quadrant dependent or negatively quadrant dependent according to whether ρ ≥ 0 or ρ < 0.

6.5.23 Verify the validity of relation (13).

6.5.24 i) If the r.v.'s X₁ and X₂ have the Bivariate Normal distribution with parameters μ₁, [...]

[...] sample space, consider a class of events associated with this space, and let P be a probability function defined on the class of events. In Chapter 2 (Section 2.3), the concept of independence of events was defined and was heavily used there, as well as in subsequent chapters. Independence carries over to r.v.'s also, and is the most basic assumption made in this book. Independence of r.v.'s, in essence, [...]

[...] X₁ and X₂ have a Bivariate Normal distribution if and only if for every c₁, c₂ ∈ ℝ, Y_c = c₁X₁ + c₂X₂ is normally distributed;

ii) In either case, show that c₁X₁ + c₂X₂ + c₃ is also normally distributed for any c₃ ∈ ℝ.

Chapter 7 Stochastic Independence with Some Applications

7.1 Stochastic Independence: Criteria of Independence

Let S be a sample [...]

[...] σ-fields is implied by the independence of the classes C_j, j = 1, . . . , k, is an involved result in probability theory and it cannot be discussed here. ▲

PROOF We may now proceed with the proof of Lemma 1. In the first place, if X is an r.v. and A_X = X⁻¹(B), and if g(X) is a measurable function of X and A_{g(X)} = [g(X)]⁻¹(B), then A_{g(X)} ⊆ A_X. In fact, let A ∈ A_{g(X)}. Then there exists B ∈ B such that A = [g(X)]⁻¹(B) [...]

[...] its factorial m.g.f. in order to calculate its kth factorial moment. Compare with Exercise 5.2.4 in Chapter 5.

6.5.7 Let X be an r.v. distributed as Negative Binomial with parameters r and p.

i) Show that its m.g.f. and factorial m.g.f., M(t) and η(t), respectively, are given by

$$M_X(t) = \frac{p^r}{\left(1-qe^{t}\right)^r},\quad t < -\log q,\qquad \eta_X(t) = \frac{p^r}{\left(1-qt\right)^r},\quad t < \frac{1}{q};$$

ii) By differentiation, show that EX = rq/p and σ²(X) = rq/p².
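Exercise 6.5.7 also lends itself to a numerical check (our addition; r = 4, p = 0.35 are arbitrary test values): differentiate the claimed m.g.f. at t = 0 by finite differences and compare with rq/p and rq/p². The comparison with scipy assumes its nbinom uses the same "failures before the r-th success" convention as the text.

```python
import numpy as np
from scipy.stats import nbinom

r, p = 4, 0.35          # arbitrary test values
q = 1 - p

def M(t):
    # m.g.f. claimed in Exercise 6.5.7(i); finite for t < -log(q).
    return p ** r / (1 - q * np.exp(t)) ** r

h = 1e-5
EX  = (M(h) - M(-h)) / (2 * h)            # M'(0)  = E X
EX2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2  # M''(0) = E X^2

print(EX, r * q / p)                       # both approx. 7.4286
print(EX2 - EX ** 2, r * q / p ** 2)       # both approx. 21.2245
print(np.isclose(EX, nbinom.mean(r, p), atol=1e-3),
      np.isclose(EX2 - EX ** 2, nbinom.var(r, p), atol=1e-3))
```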
