RESEARCH - Open Access

Probability inequalities for END sequence and their applications

Aiting Shen
Correspondence: baret@sohu.com
School of Mathematical Science, Anhui University, Hefei 230039, PR China

Abstract

Some probability inequalities for extended negatively dependent (END) sequences are provided. Using these probability inequalities, we present some moment inequalities, in particular a Rosenthal-type inequality for END sequences. Finally, we study the asymptotic approximation of the inverse moment for nonnegative END sequences with finite first moments, which generalizes and improves the corresponding results of Wu et al. [Stat. Probab. Lett. 79, 1366-1371 (2009)], Wang et al. [Stat. Probab. Lett. 80, 452-461 (2010)], and Sung [J. Inequal. Appl. 2010, Article ID 823767, 13 pp. (2010). doi:10.1155/2010/823767].

MSC(2000): 60E15; 62G20.

Keywords: extended negatively dependent sequence, probability inequality, moment inequality, inverse moment

1 Introduction

It is well known that probability inequalities play an important role in various proofs of limit theorems. In particular, they provide a measure of the convergence rate for the strong law of large numbers. The main purpose of this article is to provide some probability inequalities for extended negatively dependent (END) sequences, a class that contains independent sequences, NA sequences, and NOD sequences as special cases. These probability inequalities for END random variables are mainly inspired by Fakoor and Azarnoosh [1] and Asadian et al. [2]. Using the probability inequalities, we can further study moment inequalities and the asymptotic approximation of the inverse moment for END sequences. First, we recall the definitions of NOD and END sequences.

Definition 1.1 (cf. Joag-Dev and Proschan [3]).
A finite collection of random variables $X_1, X_2, \ldots, X_n$ is said to be negatively upper orthant dependent (NUOD) if for all real numbers $x_1, x_2, \ldots, x_n$,

$$P(X_i > x_i,\ i = 1, 2, \ldots, n) \le \prod_{i=1}^{n} P(X_i > x_i), \qquad (1.1)$$

and negatively lower orthant dependent (NLOD) if for all real numbers $x_1, x_2, \ldots, x_n$,

$$P(X_i \le x_i,\ i = 1, 2, \ldots, n) \le \prod_{i=1}^{n} P(X_i \le x_i). \qquad (1.2)$$

Shen Journal of Inequalities and Applications 2011, 2011:98. http://www.journalofinequalitiesandapplications.com/content/2011/1/98. © 2011 Shen; licensee Springer. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A finite collection of random variables $X_1, X_2, \ldots, X_n$ is said to be negatively orthant dependent (NOD) if it is both NUOD and NLOD. An infinite sequence $\{X_n, n \ge 1\}$ is said to be NOD if every finite subcollection is NOD.

Definition 1.2 (cf. Liu [4]). We call random variables $\{X_n, n \ge 1\}$ END if there exists a constant $M > 0$ such that both

$$P(X_1 > x_1, X_2 > x_2, \ldots, X_n > x_n) \le M \prod_{i=1}^{n} P(X_i > x_i) \qquad (1.3)$$

and

$$P(X_1 \le x_1, X_2 \le x_2, \ldots, X_n \le x_n) \le M \prod_{i=1}^{n} P(X_i \le x_i) \qquad (1.4)$$

hold for each $n \ge 1$ and all real numbers $x_1, x_2, \ldots, x_n$.

The concept of an END sequence was introduced by Liu [4], and several applications of END sequences have been found. For example, Liu [4] obtained precise large deviations for dependent random variables with heavy tails, and Liu [5] studied the sufficient and necessary conditions of moderate deviations for dependent random variables with heavy tails. It is easily seen that independent random variables and NOD random variables are END. Joag-Dev and Proschan [3] pointed out that NA random variables are NOD; thus, NA random variables are END.
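Definitions 1.1 and 1.2 can be checked exactly on a small example. Multinomial counts are NA (Joag-Dev and Proschan [3]), hence NOD and END with $M = 1$. The Python sketch below (an illustration, not part of the paper) enumerates all outcomes of two trials over three equiprobable categories and verifies (1.3) and (1.4) with $M = 1$ on a grid of thresholds:

```python
from itertools import product

# Multinomial counts are a classical NOD example, hence END with M = 1.
# We verify (1.3) and (1.4) exactly for 2 trials over 3 equiprobable
# categories by enumerating all 3^2 equally likely outcome sequences.
outcomes = list(product(range(3), repeat=2))   # each outcome has prob 1/9

def counts(o):
    # category counts (N1, N2, N3) of an outcome sequence
    return tuple(o.count(c) for c in range(3))

def prob(event):
    # exact probability of an event defined on the count vector
    return sum(1 for o in outcomes if event(counts(o))) / len(outcomes)

for x in product([0, 1], repeat=3):            # a small grid of thresholds
    joint_upper = prob(lambda N: all(N[i] > x[i] for i in range(3)))
    prod_upper = 1.0
    for i in range(3):
        prod_upper *= prob(lambda N, i=i: N[i] > x[i])
    assert joint_upper <= prod_upper + 1e-12   # (1.3) with M = 1

    joint_lower = prob(lambda N: all(N[i] <= x[i] for i in range(3)))
    prod_lower = 1.0
    for i in range(3):
        prod_lower *= prob(lambda N, i=i: N[i] <= x[i])
    assert joint_lower <= prod_lower + 1e-12   # (1.4) with M = 1
print("NUOD/NLOD inequalities hold with M = 1 for multinomial counts")
```

For instance, at the threshold $x = (1, 1, 1)$ the joint lower-orthant probability is $6/9$, while the product of marginals is $(8/9)^3 \approx 0.702$, consistent with (1.4).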
Since the END condition is much weaker than independence, negative association, or the NOD condition, studying the limit behavior of END sequences is of interest.

Throughout the article, let $\{X_n, n \ge 1\}$ be a sequence of END random variables defined on a fixed probability space $(\Omega, \mathcal{F}, P)$ with respective distribution functions $F_1, F_2, \ldots$. Denote $X^+ = \max\{0, X\}$. Here $c_n \sim d_n$ means $c_n d_n^{-1} \to 1$ as $n \to \infty$, and $c_n = o(d_n)$ means $c_n d_n^{-1} \to 0$ as $n \to \infty$. Let $M$ and $C$ be positive constants which may differ from place to place. Set

$$M_{t,n} = \sum_{i=1}^{n} E|X_i|^t, \qquad S_n = \sum_{i=1}^{n} X_i, \quad n \ge 1.$$

The following lemma is useful.

Lemma 1.1 (cf. Liu [5]). Let random variables $X_1, X_2, \ldots, X_n$ be END.
(i) If $f_1, f_2, \ldots, f_n$ are all nondecreasing (or all nonincreasing) functions, then the random variables $f_1(X_1), f_2(X_2), \ldots, f_n(X_n)$ are END.
(ii) For each $n \ge 1$, there exists a constant $M > 0$ such that

$$E\left(\prod_{j=1}^{n} X_j^+\right) \le M \prod_{j=1}^{n} E X_j^+. \qquad (1.5)$$

Lemma 1.2. Let $\{X_n, n \ge 1\}$ be a sequence of END random variables and $\{t_n, n \ge 1\}$ a sequence of nonnegative numbers (or nonpositive numbers). Then for each $n \ge 1$, there exists a constant $M > 0$ such that

$$E\left(\prod_{i=1}^{n} e^{t_i X_i}\right) \le M \prod_{i=1}^{n} E e^{t_i X_i}. \qquad (1.6)$$

As a byproduct, for any $t \in \mathbb{R}$,

$$E\left(\prod_{i=1}^{n} e^{t X_i}\right) \le M \prod_{i=1}^{n} E e^{t X_i}. \qquad (1.7)$$

Proof. The desired result follows from Lemma 1.1 (i) and (ii) immediately. □

The article is organized as follows: probability inequalities for END sequences are provided in Section 2, moment inequalities for END sequences are presented in Section 3, and the asymptotic approximation of the inverse moment for nonnegative END sequences is studied in Section 4.
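Before proceeding, inequality (1.7) can be checked exactly on a minimal negatively associated (hence END, with $M = 1$) pair. The pair below, uniform on $\{(0,1), (1,0)\}$, is an illustrative construction not taken from the paper; here $X_1 + X_2 = 1$ identically, each marginal is Bernoulli(1/2), and (1.7) reduces to the AM-GM inequality:

```python
import math

# Minimal exact check of (1.7) with M = 1: (X1, X2) uniform on
# {(0,1), (1,0)} is negatively associated, hence END.  Since
# X1 + X2 = 1 identically, E exp(t(X1 + X2)) = e^t, while each
# marginal moment is E e^{tX_i} = (1 + e^t)/2.
for t in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    lhs = math.exp(t)                      # E e^{t(X1 + X2)}
    marginal = (1 + math.exp(t)) / 2       # E e^{t X_i}, X_i ~ Bernoulli(1/2)
    assert lhs <= marginal ** 2 + 1e-12    # (1.7) with M = 1, by AM-GM
print("E prod e^{t X_i} <= prod E e^{t X_i} holds for this NA pair")
```

The inequality holds for every real $t$ because $(1 + e^t)/2 \ge e^{t/2}$ by the arithmetic-geometric mean inequality.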
2 Probability inequalities for sums of END sequence

In this section, we give some probability inequalities for END random variables, which can be applied to obtain moment inequalities and the strong law of large numbers. The proofs of the probability inequalities for END random variables are mainly inspired by Fakoor and Azarnoosh [1] and Asadian et al. [2]. Let $x, y$ be arbitrary positive numbers.

Theorem 2.1. Let $0 < t \le 1$. Then there exists a positive constant $M$ such that

$$P(S_n \ge x) \le \sum_{i=1}^{n} P(X_i \ge y) + M \exp\left\{\frac{x}{y} - \frac{x}{y}\log\left(1 + \frac{x y^{t-1}}{M_{t,n}}\right)\right\}. \qquad (2.1)$$

If $x y^{t-1} > M_{t,n}$, then

$$P(S_n \ge x) \le \sum_{i=1}^{n} P(X_i \ge y) + M \exp\left\{\frac{x}{y} - \frac{M_{t,n}}{y^t} - \frac{x}{y}\log\left(\frac{x y^{t-1}}{M_{t,n}}\right)\right\}. \qquad (2.2)$$

Proof. For $y > 0$, denote $Y_i = \min(X_i, y)$, $i = 1, 2, \ldots, n$, and $T_n = \sum_{i=1}^{n} Y_i$, $n \ge 1$. It is easy to check that

$$\{S_n \ge x\} \subset \{T_n \ne S_n\} \cup \{T_n \ge x\},$$

which implies that for any positive number $h$,

$$P(S_n \ge x) \le P(T_n \ne S_n) + P(T_n \ge x) \le \sum_{i=1}^{n} P(X_i \ge y) + e^{-hx} E e^{h T_n}. \qquad (2.3)$$

Lemma 1.1 (i) implies that $Y_1, Y_2, \ldots, Y_n$ are still END random variables. It follows from (2.3) and Lemma 1.2 that

$$P(S_n \ge x) \le \sum_{i=1}^{n} P(X_i \ge y) + M e^{-hx} \prod_{i=1}^{n} E e^{h Y_i}, \qquad (2.4)$$

where $M$ is a positive constant. For $0 < t \le 1$, the function $(e^{hu} - 1)/u^t$ is increasing in $u > 0$. Thus,

$$
\begin{aligned}
E e^{h Y_i} &= \int_{-\infty}^{y} (e^{hu} - 1)\,dF_i(u) + \int_{y}^{\infty} (e^{hy} - 1)\,dF_i(u) + 1 \\
&\le \int_{0}^{y} (e^{hu} - 1)\,dF_i(u) + \int_{y}^{\infty} (e^{hy} - 1)\,dF_i(u) + 1 \\
&\le \frac{e^{hy} - 1}{y^t} \int_{0}^{y} u^t\,dF_i(u) + \frac{e^{hy} - 1}{y^t} \int_{y}^{\infty} u^t\,dF_i(u) + 1 \\
&\le 1 + \frac{e^{hy} - 1}{y^t} E|X_i|^t \le \exp\left\{\frac{e^{hy} - 1}{y^t} E|X_i|^t\right\}.
\end{aligned}
$$

Combining the inequality above with (2.4), we get

$$P(S_n \ge x) \le \sum_{i=1}^{n} P(X_i \ge y) + M \exp\left\{\frac{e^{hy} - 1}{y^t} M_{t,n} - hx\right\}. \qquad (2.5)$$

Taking $h = \frac{1}{y}\log\left(1 + \frac{x y^{t-1}}{M_{t,n}}\right)$ in the right-hand side of (2.5), we obtain (2.1) immediately.
If $x y^{t-1} > M_{t,n}$, then the right-hand side of (2.5) attains its minimum at $h = \frac{1}{y}\log\left(\frac{x y^{t-1}}{M_{t,n}}\right)$. Substituting this value of $h$ into the right-hand side of (2.5), we get (2.2) immediately. This completes the proof of the theorem. □

By Theorem 2.1, we get the following Theorem 2.2 immediately.

Theorem 2.2. Let $0 < t \le 1$. Then there exists a positive constant $M$ such that

$$P(|S_n| \ge x) \le \sum_{i=1}^{n} P(|X_i| \ge y) + 2M \exp\left\{\frac{x}{y} - \frac{x}{y}\log\left(1 + \frac{x y^{t-1}}{M_{t,n}}\right)\right\}. \qquad (2.6)$$

If $x y^{t-1} > M_{t,n}$, then

$$P(|S_n| \ge x) \le \sum_{i=1}^{n} P(|X_i| \ge y) + 2M \exp\left\{\frac{x}{y} - \frac{M_{t,n}}{y^t} - \frac{x}{y}\log\left(\frac{x y^{t-1}}{M_{t,n}}\right)\right\}. \qquad (2.7)$$

Theorem 2.3. Assume that $EX_i = 0$ for each $i \ge 1$. Then for any $h, x, y > 0$, there exists a positive constant $M$ such that

$$P(|S_n| \ge x) \le \sum_{i=1}^{n} P(|X_i| \ge y) + 2M \exp\left\{\frac{e^{hy} - 1 - hy}{y^2} M_{2,n} - hx\right\}. \qquad (2.8)$$

If we take $h = \frac{1}{y}\log\left(1 + \frac{xy}{M_{2,n}}\right)$, then

$$P(|S_n| \ge x) \le \sum_{i=1}^{n} P(|X_i| \ge y) + 2M \exp\left\{\frac{x}{y} - \frac{x}{y}\log\left(1 + \frac{xy}{M_{2,n}}\right)\right\}. \qquad (2.9)$$

Proof. We use the same notation as in Theorem 2.1. It is easy to see that $(e^{hu} - 1 - hu)/u^2$ is nondecreasing on the real line. Therefore,

$$
\begin{aligned}
E e^{h Y_i} &\le 1 + h E X_i + \int_{-\infty}^{y} (e^{hu} - 1 - hu)\,dF_i(u) + \int_{y}^{\infty} (e^{hy} - 1 - hy)\,dF_i(u) \\
&= 1 + \int_{-\infty}^{y} \frac{e^{hu} - 1 - hu}{u^2}\, u^2\,dF_i(u) + \int_{y}^{\infty} (e^{hy} - 1 - hy)\,dF_i(u) \\
&\le 1 + \frac{e^{hy} - 1 - hy}{y^2}\left(\int_{-\infty}^{y} u^2\,dF_i(u) + \int_{y}^{\infty} y^2\,dF_i(u)\right) \\
&\le 1 + \frac{e^{hy} - 1 - hy}{y^2} E X_i^2 \le \exp\left\{\frac{e^{hy} - 1 - hy}{y^2} E X_i^2\right\},
\end{aligned}
$$

where the equality uses $EX_i = 0$. This implies that

$$P(S_n \ge x) \le \sum_{i=1}^{n} P(X_i \ge y) + M \exp\left\{\frac{e^{hy} - 1 - hy}{y^2} M_{2,n} - hx\right\}.$$

Replacing $X_i$ by $-X_i$, we have

$$P(-S_n \ge x) \le \sum_{i=1}^{n} P(-X_i \ge y) + M \exp\left\{\frac{e^{hy} - 1 - hy}{y^2} M_{2,n} - hx\right\}.$$

Therefore, (2.8) follows from the statements above immediately, and the stated choice of $h$ yields the desired result (2.9). The proof is completed. □

Theorem 2.4. Assume that $EX_i = 0$ and $|X_i| \le C$ for each $i \ge 1$, where $C$ is a positive constant.
Denote $B_n = \sum_{i=1}^{n} E X_i^2$ for each $n \ge 1$. Then for any $x > 0$, there exists a positive constant $M$ such that

$$P(S_n \ge x) \le M \exp\left\{-\frac{x}{2C} \operatorname{arcsinh}\left(\frac{Cx}{2B_n}\right)\right\} \qquad (2.10)$$

and

$$P(|S_n| \ge x) \le 2M \exp\left\{-\frac{x}{2C} \operatorname{arcsinh}\left(\frac{Cx}{2B_n}\right)\right\}. \qquad (2.11)$$

Proof. It is easily seen that

$$e^x - x - 1 \le e^x + e^{-x} - 2 = 2(\cosh x - 1) = 2(\cosh|x| - 1), \quad x \in \mathbb{R},$$

and

$$2(\cosh x - 1) \le x \sinh x, \quad x \ge 0.$$

Thus, for all $\alpha > 0$ and $i = 1, 2, \ldots, n$, we get

$$
\begin{aligned}
E(e^{\alpha X_i} - 1) &= E(e^{\alpha X_i} - \alpha X_i - 1) \le 2E(\cosh \alpha X_i - 1) = 2E(\cosh \alpha|X_i| - 1) \\
&\le E(\alpha|X_i| \sinh \alpha|X_i|) = E\left(\alpha^2 X_i^2\, \frac{\sinh \alpha|X_i|}{\alpha|X_i|}\right) \le \frac{\alpha E X_i^2}{C} \sinh \alpha C.
\end{aligned}
$$

The last inequality follows from the fact that the function $\frac{\sinh x}{x}$ is nondecreasing on the half-line $(0, \infty)$. Since $x = x - 1 + 1 \le e^{x-1}$ for all $x \in \mathbb{R}$, we have by Lemma 1.2 that

$$E\left(\prod_{i=1}^{n} e^{\alpha X_i}\right) \le M \prod_{i=1}^{n} E e^{\alpha X_i} \le M \prod_{i=1}^{n} \exp\left(E e^{\alpha X_i} - 1\right) \le M \exp\left\{\frac{\alpha B_n \sinh \alpha C}{C}\right\}.$$

Therefore, for all $\alpha > 0$ and $x > 0$, we have

$$P(S_n \ge x) \le e^{-\alpha x} E e^{\alpha S_n} \le M \exp\left\{\alpha\left(\frac{B_n \sinh \alpha C}{C} - x\right)\right\}. \qquad (2.12)$$

Taking $\alpha = \frac{1}{C}\operatorname{arcsinh}\left(\frac{Cx}{2B_n}\right)$ in the right-hand side of (2.12), we see that $\frac{B_n \sinh \alpha C}{C} = \frac{x}{2}$, and (2.10) follows. Since $\{-X_n, n \ge 1\}$ is still a sequence of END random variables by Lemma 1.1, we have by (2.10) that

$$P(-S_n \ge x) \le M \exp\left\{-\frac{x}{2C} \operatorname{arcsinh}\left(\frac{Cx}{2B_n}\right)\right\}. \qquad (2.13)$$

Hence, (2.11) follows from (2.10) and (2.13) immediately. This completes the proof of the theorem. □

Theorem 2.5. Assume that $EX_i = 0$ and $|X_i| \le C$ for each $i \ge 1$, where $C$ is a positive constant. If $B_n = \sum_{i=1}^{n} E X_i^2 = O(n)$, then $n^{-1} S_n \to 0$ completely and, in consequence, $n^{-1} S_n \to 0$ a.s.

Proof. For any $\varepsilon > 0$, we have by Theorem 2.4 that

$$P(S_n \ge n\varepsilon) \le M \exp\left\{-\frac{n\varepsilon}{2C} \operatorname{arcsinh}\left(\frac{Cn\varepsilon}{2B_n}\right)\right\} \le M \exp\{-nD\},$$

where $D$ is a positive constant.
Therefore,

$$\sum_{n=1}^{\infty} P(S_n \ge n\varepsilon) < \infty,$$

which implies that $n^{-1} S_n \to 0$ completely and, in consequence, $n^{-1} S_n \to 0$ a.s. by the Borel-Cantelli lemma. The proof is completed. □

Theorem 2.6. Assume that $EX_n^2 < \infty$ and $ES_n \le 0$ for each $n \ge 1$. Denote $s_n = ES_n^2$. If there exists a nondecreasing sequence of positive numbers $\{c_n, n \ge 1\}$ such that $P(S_n \le c_n) = 1$, then for any $x > 0$,

$$P(S_n \ge x) \le \exp\left\{-\frac{x^2}{2(s_n + x c_n)}\left(1 + \frac{2}{3}\log\left(1 + \frac{x c_n}{s_n}\right)\right)\right\}. \qquad (2.14)$$

In order to prove Theorem 2.6, the following lemma is useful.

Lemma 2.1 (cf. Shao [6]). For any $x \ge 0$,

$$\log(1 + x) \ge \frac{x}{1+x} + \frac{x^2}{2(1+x)^2}\left(1 + \frac{2}{3}\log(1 + x)\right).$$

Proof of Theorem 2.6. Noting that $(e^x - 1 - x)/x^2$ is nondecreasing on the real line, for any $h > 0$ and $n \ge 1$, we have

$$
\begin{aligned}
E e^{h S_n} &= 1 + h E S_n + E\left(\frac{e^{h S_n} - 1 - h S_n}{(h S_n)^2}\,(h S_n)^2\right) \le 1 + E\left(\frac{e^{h c_n} - 1 - h c_n}{(h c_n)^2}\,(h S_n)^2\right) \\
&= 1 + \frac{e^{h c_n} - 1 - h c_n}{c_n^2}\, s_n \le \exp\left\{\frac{e^{h c_n} - 1 - h c_n}{c_n^2}\, s_n\right\}.
\end{aligned}
$$

Hence,

$$P(S_n \ge x) \le e^{-hx} E e^{h S_n} \le \exp\left\{\frac{e^{h c_n} - 1 - h c_n}{c_n^2}\, s_n - hx\right\}. \qquad (2.15)$$

Taking $h = \frac{1}{c_n}\log\left(1 + \frac{x c_n}{s_n}\right)$ in the right-hand side of (2.15), we obtain

$$P(S_n \ge x) \le \exp\left\{\frac{x}{c_n} - \frac{x}{c_n}\left(1 + \frac{s_n}{x c_n}\right)\log\left(1 + \frac{x c_n}{s_n}\right)\right\}. \qquad (2.16)$$

By Lemma 2.1, we get

$$
\begin{aligned}
\frac{x}{c_n}\left(1 + \frac{s_n}{x c_n}\right)\log\left(1 + \frac{x c_n}{s_n}\right) &\ge \frac{x}{c_n}\left(1 + \frac{s_n}{x c_n}\right)\left[\frac{x c_n}{s_n + x c_n} + \frac{1}{2}\left(\frac{x c_n}{s_n + x c_n}\right)^2\left(1 + \frac{2}{3}\log\left(1 + \frac{x c_n}{s_n}\right)\right)\right] \\
&= \frac{x}{c_n} + \frac{x^2}{2(s_n + x c_n)}\left(1 + \frac{2}{3}\log\left(1 + \frac{x c_n}{s_n}\right)\right).
\end{aligned}
$$

The desired result (2.14) follows from the above inequality and (2.16) immediately. □

3 Moment inequalities for END sequence

In this section, we present some moment inequalities, in particular a Rosenthal-type inequality for END sequences, by means of the probability inequalities obtained in Section 2. The proofs are also inspired by Asadian et al. [2].
The Rosenthal-type inequality will be applied in Section 4 to prove the asymptotic approximation of the inverse moment for nonnegative END random variables.

Theorem 3.1. Let $0 < t \le 1$ and let $g(x)$ be a nonnegative even function that is nondecreasing on the half-line $[0, \infty)$. Assume that $g(0) = 0$ and $Eg(X_i) < \infty$ for each $i \ge 1$. Then for every $r > 0$, there exists a positive constant $M$ such that

$$Eg(S_n) \le \sum_{i=1}^{n} Eg(r X_i) + 2M e^r \int_{0}^{\infty}\left(1 + \frac{x^t}{r^{t-1} M_{t,n}}\right)^{-r} dg(x). \qquad (3.1)$$

Proof. Taking $y = \frac{x}{r}$ in Theorem 2.2, we have

$$P(|S_n| \ge x) \le \sum_{i=1}^{n} P\left(|X_i| \ge \frac{x}{r}\right) + 2M e^r\left(1 + \frac{x^t}{r^{t-1} M_{t,n}}\right)^{-r},$$

which implies that

$$\int_{0}^{\infty} P(|S_n| \ge x)\,dg(x) \le \sum_{i=1}^{n} \int_{0}^{\infty} P(r|X_i| \ge x)\,dg(x) + 2M e^r \int_{0}^{\infty}\left(1 + \frac{x^t}{r^{t-1} M_{t,n}}\right)^{-r} dg(x).$$

Therefore, the desired result (3.1) follows from the inequality above and Lemma 2.4 in Petrov [7] immediately. This completes the proof of the theorem. □

Corollary 3.1. Let $0 < t \le 1$, $p \ge t$, and $E|X_i|^p < \infty$ for each $i \ge 1$. Then there exists a positive constant $C(p, t)$ depending only on $p$ and $t$ such that

$$E|S_n|^p \le C(p, t)\left(M_{p,n} + M_{t,n}^{p/t}\right). \qquad (3.2)$$

Proof. Taking $g(x) = |x|^p$, $p \ge t$, in Theorem 3.1, we get

$$E|S_n|^p \le r^p \sum_{i=1}^{n} E|X_i|^p + 2pM e^r \int_{0}^{\infty} x^{p-1}\left(1 + \frac{x^t}{r^{t-1} M_{t,n}}\right)^{-r} dx. \qquad (3.3)$$

It is easy to check that

$$I \doteq \int_{0}^{\infty} x^{p-1}\left(1 + \frac{x^t}{r^{t-1} M_{t,n}}\right)^{-r} dx = \int_{0}^{\infty} x^{p-1}\left(\frac{r^{t-1} M_{t,n}}{r^{t-1} M_{t,n} + x^t}\right)^{r} dx = \int_{0}^{\infty} x^{p-1}\left(1 - \frac{x^t}{r^{t-1} M_{t,n} + x^t}\right)^{r} dx.$$

If we substitute $y = \frac{x^t}{r^{t-1} M_{t,n} + x^t}$ in the last integral, then for $r > p/t$ we have

$$I = \frac{r^{p - p/t} M_{t,n}^{p/t}}{t} \int_{0}^{1} y^{\frac{p}{t}-1}(1 - y)^{r - \frac{p}{t} - 1}\,dy = \frac{r^{p - p/t} M_{t,n}^{p/t}}{t}\, B\left(\frac{p}{t}, r - \frac{p}{t}\right),$$

where $B(\alpha, \beta) = \int_{0}^{1} x^{\alpha-1}(1 - x)^{\beta-1}\,dx$, $\alpha, \beta > 0$, is the Beta function. Substituting $I$ into (3.3) and choosing

$$C(p, t) = \max\left\{r^p,\ 2pM e^r B\left(\frac{p}{t}, r - \frac{p}{t}\right)\frac{r^{p - p/t}}{t}\right\},$$

we obtain the desired result (3.2) immediately. The proof is completed. □
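Once $r > p/t$ is fixed, the constant $C(p, t)$ in the proof above is fully explicit. The sketch below evaluates it for the illustrative choices $p = 2$, $t = 1$, $r = 3$, and $M = 1$; all of these are assumptions made for the example, since $M$ depends on the underlying END sequence:

```python
import math

# Evaluating the explicit constant C(p, t) from Corollary 3.1 for the
# illustrative choices p = 2, t = 1, r = 3, M = 1 (r > p/t is required
# for the integral I to converge).
def beta_fn(a, b):
    # Beta function via B(a, b) = Gamma(a) Gamma(b) / Gamma(a + b)
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

p, t, r, M = 2.0, 1.0, 3.0, 1.0
assert r > p / t

C_pt = max(r ** p,
           2 * p * M * math.exp(r) * beta_fn(p / t, r - p / t)
           * r ** (p - p / t) / t)
print(f"B(p/t, r - p/t) = {beta_fn(p / t, r - p / t):.4f}, C(p, t) = {C_pt:.4f}")
```

Here $B(2, 1) = \Gamma(2)\Gamma(1)/\Gamma(3) = 1/2$, so the second branch of the maximum equals $2e^3 \approx 40.17$ and dominates $r^p = 9$.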
Similarly to the proofs of Theorem 3.1 and Corollary 3.1, we can obtain the following Theorem 3.2 and Corollary 3.2 using Theorem 2.3. The details are omitted.

Theorem 3.2. Let $EX_i = 0$ for each $i \ge 1$. Assume that the conditions of Theorem 3.1 are satisfied. Then for every $r > 0$, there exists a positive constant $M$ such that

$$Eg(S_n) \le \sum_{i=1}^{n} Eg(r X_i) + 2M e^r \int_{0}^{\infty}\left(1 + \frac{x^2}{r M_{2,n}}\right)^{-r} dg(x). \qquad (3.4)$$

Corollary 3.2 (Rosenthal-type inequality). Let $p \ge 2$, $EX_i = 0$, and $E|X_i|^p < \infty$ for each $i \ge 1$. Then there exists a positive constant $C_p$ depending only on $p$ such that

$$E|S_n|^p \le C_p\left[\sum_{i=1}^{n} E|X_i|^p + \left(\sum_{i=1}^{n} E|X_i|^2\right)^{p/2}\right]. \qquad (3.5)$$

4 Asymptotic approximation of inverse moment for nonnegative END random variables

Recently, Wu et al. [8] studied the asymptotic approximation of the inverse moment for nonnegative independent random variables by means of the truncation method and Bernstein's inequality, and obtained the following result:

Theorem A. Let $\{Z_n, n \ge 1\}$ be a sequence of independent, nonnegative, and non-degenerate random variables. Suppose that
(i) $EZ_n^2 < \infty$, $\forall n \ge 1$;
(ii) $EX_n \to \infty$ as $n \to \infty$, where

$$X_n = \sum_{i=1}^{n} Z_i \Big/ B_n, \qquad B_n^2 = \sum_{i=1}^{n} \operatorname{Var} Z_i;$$

(iii) there exists a finite positive constant $C_1$ not depending on $n$ such that $\sup_{1 \le i \le n} EZ_i / B_n \le C_1$;
(iv) for some $\eta > 0$,

$$B_n^{-2} \sum_{i=1}^{n} E Z_i^2 I(Z_i > \eta B_n) \to 0, \quad n \to \infty. \qquad (4.1)$$

Then, for all real numbers $a > 0$ and $\alpha > 0$,

$$E(a + X_n)^{-\alpha} \sim (a + EX_n)^{-\alpha}, \quad n \to \infty. \qquad (4.2)$$

Wang et al. [9] pointed out that condition (iii) in Theorem A can be removed and extended the result for independent random variables to the case of NOD random variables. Shi et al. [10] obtained (4.2) for $B_n = 1$ and pointed out that the existence of finite second moments is not required.
Sung [11] studied the asymptotic approximation of inverse moments for nonnegative random variables satisfying a Rosenthal-type inequality. For more details about the asymptotic approximation of inverse moments, one can refer to Garcia and Palacios [12], Kaluszka and Okolewski [13], Hu et al. [14], and so on.

The main purpose of this section is to show that (4.2) holds under very mild conditions. Our results extend and improve the results of Wu et al. [8], Wang et al. [9], and Sung [11]. We now state and prove the results on the asymptotic approximation of inverse moments for nonnegative END random variables.

Theorem 4.1. Let $\{Z_n, n \ge 1\}$ be a sequence of nonnegative END random variables and $\{B_n, n \ge 1\}$ a sequence of positive constants. Suppose that
(i) $EZ_n < \infty$, $\forall n \ge 1$;
(ii) $\mu_n \doteq EX_n \to \infty$ as $n \to \infty$, where $X_n = B_n^{-1} \sum_{k=1}^{n} Z_k$;
(iii) there exists some $b > 0$ such that

$$\frac{\sum_{k=1}^{n} E Z_k I(Z_k > b B_n)}{\sum_{k=1}^{n} E Z_k} \to 0, \quad n \to \infty. \qquad (4.3)$$

Then (4.2) holds for all real numbers $a > 0$ and $\alpha > 0$.

Proof. It is easily seen that $f(x) = (a + x)^{-\alpha}$ is a convex function of $x$ on $[0, \infty)$, so by Jensen's inequality we have

$$E(a + X_n)^{-\alpha} \ge (a + EX_n)^{-\alpha}, \qquad (4.4)$$

which implies that

$$\liminf_{n \to \infty}\, (a + EX_n)^{\alpha}\, E(a + X_n)^{-\alpha} \ge 1. \qquad (4.5)$$

To prove (4.2), it is enough to show that

$$\limsup_{n \to \infty}\, (a + EX_n)^{\alpha}\, E(a + X_n)^{-\alpha} \le 1. \qquad (4.6)$$

In order to prove (4.6), we need only show that for all $\delta \in (0, 1)$,

$$\limsup_{n \to \infty}\, (a + EX_n)^{\alpha}\, E(a + X_n)^{-\alpha} \le (1 - \delta)^{-\alpha}. \qquad (4.7)$$

By (iii), we can see that for all $\delta \in (0, 1)$ there exists $n(\delta) > 0$ such that

$$\sum_{k=1}^{n} E Z_k I(Z_k > b B_n) \le \frac{\delta}{4} \sum_{k=1}^{n} E Z_k, \quad n \ge n(\delta). \qquad (4.8)$$

Let

$$U_n = B_n^{-1} \sum_{k=1}^{n} \left[Z_k I(Z_k \le b B_n) + b B_n I(Z_k > b B_n)\right]$$

and write

$$E(a + X_n)^{-\alpha} = E(a + X_n)^{-\alpha} I(U_n \ge \mu_n - \delta\mu_n) + E(a + X_n)^{-\alpha} I(U_n < \mu_n - \delta\mu_n) \doteq Q_1 + Q_2. \qquad (4.9)$$
For $Q_1$, since $X_n \ge U_n$, we have

$$Q_1 \le E(a + X_n)^{-\alpha} I(X_n \ge \mu_n - \delta\mu_n) \le (a + \mu_n - \delta\mu_n)^{-\alpha}. \qquad (4.10)$$

By (4.8), we have for $n \ge n(\delta)$ that

$$
\begin{aligned}
|\mu_n - EU_n| &= \left|B_n^{-1}\sum_{k=1}^{n} E Z_k I(Z_k > b B_n) - B_n^{-1}\sum_{k=1}^{n} b B_n\, E I(Z_k > b B_n)\right| \\
&\le B_n^{-1}\sum_{k=1}^{n} E Z_k I(Z_k > b B_n) + B_n^{-1}\sum_{k=1}^{n} b B_n\, E I(Z_k > b B_n) \\
&\le B_n^{-1}\sum_{k=1}^{n} E Z_k I(Z_k > b B_n) + B_n^{-1}\sum_{k=1}^{n} E Z_k I(Z_k > b B_n) \\
&= 2 B_n^{-1}\sum_{k=1}^{n} E Z_k I(Z_k > b B_n) \le \delta\mu_n/2. \qquad (4.11)
\end{aligned}
$$

For each $n \ge 1$, it is easy to see that $\{Z_k I(Z_k \le b B_n) + b B_n I(Z_k > b B_n),\ 1 \le k \le n\}$ are END random variables by Lemma 1.1. Therefore, by (4.11), Markov's inequality, Corollary 3.2, and the $C_r$ inequality, for any $p > 2$ and $n \ge n(\delta)$,

$$
\begin{aligned}
Q_2 &\le a^{-\alpha} P(U_n < \mu_n - \delta\mu_n) = a^{-\alpha} P\big(EU_n - U_n > \delta\mu_n - (\mu_n - EU_n)\big) \\
&\le a^{-\alpha} P(EU_n - U_n > \delta\mu_n/2) \le a^{-\alpha} P(|U_n - EU_n| > \delta\mu_n/2) \le C \mu_n^{-p}\, E|U_n - EU_n|^p \\
&\le C \mu_n^{-p}\left[B_n^{-2}\sum_{k=1}^{n} E Z_k^2 I(Z_k \le b B_n) + B_n^{-2}\sum_{k=1}^{n} b^2 B_n^2\, E I(Z_k > b B_n)\right]^{p/2} \\
&\quad + C \mu_n^{-p}\left[B_n^{-p}\sum_{k=1}^{n} E Z_k^p I(Z_k \le b B_n) + B_n^{-p}\sum_{k=1}^{n} b^p B_n^p\, E I(Z_k > b B_n)\right] \\
&\le C \mu_n^{-p}\left[B_n^{-1}\sum_{k=1}^{n} E Z_k I(Z_k \le b B_n) + B_n^{-1}\sum_{k=1}^{n} E Z_k I(Z_k > b B_n)\right]^{p/2} \\
&\quad + C \mu_n^{-p} B_n^{-1}\sum_{k=1}^{n} E Z_k I(Z_k \le b B_n) + C \mu_n^{-p} B_n^{-1}\sum_{k=1}^{n} E Z_k I(Z_k > b B_n) \\
&= C \mu_n^{-p}\left(\mu_n^{p/2} + \mu_n\right) = C\left(\mu_n^{-p/2} + \mu_n^{1-p}\right). \qquad (4.12)
\end{aligned}
$$

Taking $p > \max\{2, 2\alpha, \alpha + 1\}$, we have by (4.9), (4.10), and (4.12) that

$$\limsup_{n \to \infty}\, (a + \mu_n)^{\alpha}\, E(a + X_n)^{-\alpha} \le \limsup_{n \to \infty}\, (a + \mu_n)^{\alpha}(a + \mu_n - \delta\mu_n)^{-\alpha} + \limsup_{n \to \infty}\, (a + \mu_n)^{\alpha}\left(C\mu_n^{-p/2} + C\mu_n^{1-p}\right) = (1 - \delta)^{-\alpha},$$

which implies (4.7). This completes the proof of the theorem. □

[…]
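The conclusion (4.2) of Theorem 4.1 is easy to observe numerically. The Monte-Carlo sketch below takes $Z_k$ i.i.d. Exp(1) (independent, hence END) and $B_n = \sqrt{n}$, so that $X_n = S_n/\sqrt{n}$, $\mu_n = EX_n = \sqrt{n} \to \infty$, and the truncation condition (4.3) holds; the Exp(1) model and all parameter values are illustrative choices, not taken from the paper:

```python
import random

# Monte-Carlo illustration of Theorem 4.1: for Z_k i.i.d. Exp(1) and
# B_n = sqrt(n), X_n = S_n / sqrt(n) satisfies (i)-(iii), so
# E(a + X_n)^{-alpha} should be close to (a + EX_n)^{-alpha} for large n.
random.seed(1)
n, a, alpha, trials = 400, 1.0, 2.0, 100_000
mu = n ** 0.5                          # EX_n = n / sqrt(n) = sqrt(n) = 20

acc = 0.0
for _ in range(trials):
    s_n = random.gammavariate(n, 1.0)  # S_n: a sum of n Exp(1) variables
    acc += (a + s_n / n ** 0.5) ** (-alpha)
lhs = acc / trials                     # Monte-Carlo estimate of E(a + X_n)^{-alpha}
rhs = (a + mu) ** (-alpha)             # (a + EX_n)^{-alpha} = 21^{-2}
print(f"E(a+X_n)^-a ~ {lhs:.6f}, (a+EX_n)^-a = {rhs:.6f}, ratio = {lhs / rhs:.4f}")
```

By Jensen's inequality (4.4) the estimate always sits slightly above $(a + EX_n)^{-\alpha}$, and the ratio of the two sides is close to 1, as (4.2) predicts.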
Remark 4.1. Theorem 4.1 in this article generalizes and improves the corresponding results of Wu et al. [8], Wang et al. [9], and Sung [11]. First, Theorem 4.1 in this article is based on the first-moment condition $EZ_n < \infty$, $\forall n \ge 1$, whereas the condition $EZ_n^2 < \infty$, $\forall n \ge 1$, is required in the above cited references. Second, $\{B_n, n \ge 1\}$ is an arbitrary sequence of positive constants in Theorem 4.1, while $B_n^2 = \sum_{i=1}^{n} \operatorname{Var} Z_i$ in the above cited references. If we take $B_n \equiv 1$, we can get the asymptotic approximation of inverse moments for the partial sums of nonnegative END random variables. Third, (4.3) is weaker than (4.1). Actually, by the condition (4.1), we can […]

Acknowledgements

The author was most grateful to the Editor Andrei Volodin and an anonymous referee for the careful reading of the manuscript and valuable suggestions which helped in significantly improving an earlier version of this article. The research was supported by […] Province Universities (2010SQRL016ZD) and Youth Science Research Fund of Anhui University (2009QN011A).

Authors' contributions

Some probability inequalities and moment inequalities for extended negatively dependent (END) sequences are provided. The asymptotic approximation of the inverse moment for nonnegative END sequences with finite first moments is obtained. All authors read and approved the final manuscript.

[…] October 2011

References

1. Fakoor, V, Azarnoosh, HA: Probability inequalities for sums of negatively dependent random variables. Pak. J. Stat. 21(3), 257-264 (2005)
2. Asadian, N, Fakoor, V, Bozorgnia, A: Rosenthal's type inequalities for negatively orthant dependent random variables. JIRSS 5(1-2), 69-75 (2006)
3. Joag-Dev, K, Proschan, F: Negative association of random variables with applications. Ann. Stat. 11(1), […]
4. Liu, L: Precise large deviations for dependent random variables with heavy tails. Stat. Probab. Lett. 79, 1290-1298 (2009). doi:10.1016/j.spl.2009.02.001
5. Liu, L: Necessary and sufficient conditions for moderate deviations of dependent random variables with heavy tails. Sci. China Ser. A Math. 53(6), 1421-1434 (2010). doi:10.1007/s11425-010-4012-9
6. Shao, QM: A comparison theorem on moment inequalities between negatively associated and independent random variables. J. Theoret. Probab. 13, 343-356 (2000). doi:10.1023/A:1007849609234
7. Petrov, VV: Limit Theorems of Probability Theory: Sequences of Independent Random Variables. Clarendon Press, Oxford (1995)
8. Wu, TJ, Shi, XP, Miao, BQ: Asymptotic approximation of inverse moments of nonnegative random variables. Stat. Probab. Lett. 79, 1366-1371 (2009)
9. Wang, […], Ling, NX: Exponential inequalities and inverse moment for NOD sequence. Stat. Probab. Lett. 80, 452-461 (2010). doi:10.1016/j.spl.2009.11.023
10. Shi, XP, Wu, YH, Liu, Y: A note on asymptotic approximations of inverse moments of nonnegative random variables. Stat. Probab. Lett. 80, 1260-1264 (2010). doi:10.1016/j.spl.2010.04.004
11. Sung, SH: On inverse moments for a class of nonnegative random variables. J. Inequal. Appl. 2010, Article ID 823767, 13 pp. (2010). doi:10.1155/2010/823767
12. Garcia, […], Palacios, JL: On inverse moments of nonnegative random variables. Stat. Probab. Lett. 53, 235-239 (2001). doi:10.1016/S0167-7152(01)00008-6
13. Kaluszka, M, Okolewski, A: On Fatou-type lemma for monotone moments of weakly convergent random variables. Stat. Probab. Lett. 66, 45-50 (2004)
14. […], Chen, EB: On inverse moments of nonnegative weakly convergent random variables. Acta Math. Appl. Sin. 30, 361-367 (2007) (in Chinese)

doi:10.1186/1029-242X-2011-98
Cite this article as: Shen: Probability inequalities for END sequence and their applications. Journal of Inequalities and Applications 2011, 2011:98.