Cramér-type large deviations for independent random variables


Let X_1, X_2, … be a sequence of independent random variables with E X_i = 0 and 0 < E X_i² < ∞ for i ≥ 1. Set

S_n = Σ_{i=1}^n X_i,   V_n² = Σ_{i=1}^n X_i²,   B_n² = Σ_{i=1}^n E X_i².

It is well known that the central limit theorem holds if the Lindeberg condition is satisfied. There are mainly two approaches to estimating the error of the normal approximation. One approach is to study the absolute error in the central limit theorem via Berry-Esseen bounds or Edgeworth expansions. Another approach is to estimate the relative error of P(S_n ≥ x B_n) to 1 - Φ(x). One of the typical results in this direction is the so-called Cramér large deviation. If X_1, X_2, … are a sequence of i.i.d. random variables with zero means and a finite moment generating function E e^{t X_1} < ∞ for t in a neighborhood of zero, then for x ≥ 0 and x = o(n^{1/2}),

P(S_n ≥ x B_n) / (1 - Φ(x)) = exp{ (x³/n^{1/2}) λ(x/n^{1/2}) } [ 1 + O((1 + x)/n^{1/2}) ],   (1)

where λ(t) is the so-called Cramér series (see [Petrov (1975), Chapter VIII] for details). In particular, if E e^{t_0 |X_1|^{1/2}} < ∞ for some t_0 > 0, then

P(S_n ≥ x B_n) / (1 - Φ(x)) → 1   (2)

holds uniformly for x ∈ (0, o(n^{1/6})). The moment condition E e^{t_0 |X_1|^{1/2}} < ∞ is necessary. Similar results are also available for independent but not necessarily identically distributed random variables under a finite moment generating function condition.
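The relative-error viewpoint is easy to explore numerically. The following minimal Monte Carlo sketch (not taken from the text; the distribution, sample size and levels are illustrative choices) estimates P(S_n ≥ x B_n)/(1 - Φ(x)) for i.i.d. centered exponential summands, whose moment generating function is finite near the origin, so the classical results (1)-(2) apply.

import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
n, reps = 100, 100_000
# Centered Exp(1) summands: E X_i = 0, E X_i^2 = 1, hence B_n^2 = n.
X = rng.exponential(1.0, size=(reps, n)) - 1.0
S = X.sum(axis=1)
B_n = sqrt(n)

for x in (0.5, 1.5, 2.5):
    gauss_tail = 0.5 * erfc(x / sqrt(2.0))   # 1 - Phi(x)
    ratio = (S >= x * B_n).mean() / gauss_tail
    print(f"x = {x}: estimated P(S_n >= x B_n)/(1 - Phi(x)) = {ratio:.3f}")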

Shao (1999) established a (2)-type result for self-normalized sums under a finite third-moment condition only. More precisely, he showed that if E|X_1|^{2+δ} < ∞ for some 0 < δ ≤ 1, then

P(S_n ≥ x V_n) / (1 - Φ(x)) → 1   (3)

holds uniformly for x ∈ (0, o(n^{δ/(2(2+δ))})). Recently, several papers have focused on self-normalized limit theorems for independent but not necessarily identically distributed random variables. Bentkus, Bloznelis and Götze (1996) obtained the following Berry-Esseen bound:

|P(S_n/V_n ≥ x) - (1 - Φ(x))| ≤ A ( B_n^{-2} Σ_{i=1}^n E X_i² I{|X_i| > B_n} + B_n^{-3} Σ_{i=1}^n E|X_i|³ I{|X_i| ≤ B_n} ),

where A is an absolute constant. Assuming only finite third moments, Wang and Jing (1999) derived exponential non-uniform Berry-Esseen bounds.

Chistyakov and Götze (1999) refined Wang and Jing's results and obtained, among others, the following result: if X_1, X_2, … are symmetric independent random variables with finite third moments, then

P(S_n/V_n ≥ x) = (1 - Φ(x)) ( 1 + O(1)(1 + x)³ B_n^{-3} Σ_{i=1}^n E|X_i|³ )   (4)

for 0 ≤ x ≤ B_n / ( Σ_{i=1}^n E|X_i|³ )^{1/3}, where O(1) is bounded by an absolute constant.

Result (4) is useful because it provides not only the relative error but also a Berry-Esseen rate of convergence. Although the assumption of symmetry allows one to assert that

P(S_n/V_n ≥ x) = (1 - Φ(x)) ( 1 + O(1) min(1, (1 + x)³ B_n^{-3} Σ_{i=1}^n E|X_i|³) )   for all x ≥ 0,

it not only takes away the main difficulty in proving a self-normalized limit theorem but also limits its potential applications. Jing, Shao and Wang (2002) recently obtained a Cramér-type large deviation result for general independent random variables. In particular, they showed that (4) remains valid for non-symmetric independent random variables. Let

Δ_{n,x} = ((1 + x)²/B_n²) Σ_{i=1}^n E X_i² I{|X_i| > B_n/(1 + x)} + ((1 + x)³/B_n³) Σ_{i=1}^n E|X_i|³ I{|X_i| ≤ B_n/(1 + x)}

for x ≥ 0.
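The quantity Δ_{n,x} is directly computable once the truncated second and third moments of the X_i are available. The sketch below is not from Jing, Shao and Wang; the function name and the use of Monte Carlo estimates of the truncated moments (here for n = 50 standard normal X_i) are purely illustrative assumptions.

import numpy as np

def delta_nx(samples, x):
    """Estimate Delta_{n,x} from samples of shape (n, m): m draws of each X_i."""
    second = (samples ** 2).mean(axis=1)              # estimates E X_i^2
    Bn = np.sqrt(second.sum())                        # estimates B_n
    cut = Bn / (1.0 + x)
    trunc2 = np.where(np.abs(samples) > cut, samples ** 2, 0.0).mean(axis=1)
    trunc3 = np.where(np.abs(samples) <= cut, np.abs(samples) ** 3, 0.0).mean(axis=1)
    return ((1 + x) ** 2 / Bn ** 2) * trunc2.sum() + ((1 + x) ** 3 / Bn ** 3) * trunc3.sum()

rng = np.random.default_rng(1)
draws = rng.standard_normal((50, 100_000))            # n = 50 variables, 10^5 draws each
for x in (0.0, 1.0, 3.0):
    print(f"x = {x}: Delta_(n,x) estimate = {delta_nx(draws, x):.4f}")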

Theorem 7.1. [Jing, Shao and Wang (2002)]. There is an absolute constant A (> 1) such that

P(S_n ≥ x V_n) / (1 - Φ(x)) = e^{O(1) Δ_{n,x}}

for all x ≥ 0 satisfying

x² max_{1≤i≤n} E X_i² ≤ B_n²

and

Δ_{n,x} ≤ (1 + x)²/A,

where |O(1)| ≤ A.

Theorem 7.1 provides a very general framework. The following result is a direct consequence of the above general theorem.

Theorem 7.2. [Jing, Shao and Wang (2002)]. Let {a_n, n ≥ 1} be a sequence of positive numbers. Assume that

a_n² ≤ B_n² / max_{1≤i≤n} E X_i²   (5)

and

∀ ε > 0,   B_n^{-2} Σ_{i=1}^n E X_i² I{|X_i| > ε B_n/(1 + a_n)} → 0   as n → ∞.   (6)

Then

ln P(S_n/V_n ≥ x) / ln(1 - Φ(x)) → 1   (7)

holds uniformly for x ∈ (0, a_n).
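To indicate, as a rough sketch only rather than the authors' argument, how Theorem 7.2 follows from Theorem 7.1: condition (5) gives x² max_{1≤i≤n} E X_i² ≤ B_n² for all x ∈ (0, a_n), while for any ε > 0 and each i,

E|X_i|³ I{|X_i| ≤ B_n/(1 + x)} ≤ (ε B_n/(1 + x)) E X_i² + (B_n/(1 + x)) E X_i² I{|X_i| > ε B_n/(1 + x)},

so that, for x ∈ (0, a_n),

Δ_{n,x} ≤ (1 + x)² ( B_n^{-2} Σ_{i=1}^n E X_i² I{|X_i| > B_n/(1 + a_n)} + ε + B_n^{-2} Σ_{i=1}^n E X_i² I{|X_i| > ε B_n/(1 + a_n)} ) = (1 + x)² (ε + o(1))

by (6). Since ε is arbitrary, Δ_{n,x} = o((1 + x)²) uniformly in x ∈ (0, a_n); in particular both side conditions of Theorem 7.1 hold for large n, and ln P(S_n/V_n ≥ x) = ln(1 - Φ(x)) + O(1) Δ_{n,x}. Since 1 - Φ(x) ≤ e^{-x²/2}/2, we have |ln(1 - Φ(x))| ≥ (1 + x²)/2 ≥ (1 + x)²/4, so the remainder is o(1)|ln(1 - Φ(x))| uniformly, which is (7).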

When the X_i's have a finite (2 + δ)-th moment for some 0 < δ ≤ 1, one obtains (4) without assuming any symmetry condition.

Theorem 7.3. [Jing, Shao and Wang (2002)]. Let 0 < δ ≤ 1 and set

L_{n,δ} = Σ_{i=1}^n E|X_i|^{2+δ},   d_{n,δ} = B_n / L_{n,δ}^{1/(2+δ)}.

Then

P(S_n/V_n ≥ x) = (1 - Φ(x)) ( 1 + O(1) (1 + x)^{2+δ} / d_{n,δ}^{2+δ} )   (8)

for 0 ≤ x ≤ d_{n,δ}, where O(1) is bounded by an absolute constant. In particular, if d_{n,δ} → ∞ as n → ∞, we have

P(S_n ≥ x V_n) / (1 - Φ(x)) → 1   (9)

uniformly in 0 ≤ x ≤ o(d_{n,δ}).
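As a purely illustrative numerical companion to (8)-(9) (not taken from the paper; the distribution, sample size and levels are arbitrary choices), the following Monte Carlo sketch estimates P(S_n/V_n ≥ x)/(1 - Φ(x)) for i.i.d. centered Pareto-type summands, which have a finite third moment but no finite moment generating function, so only the self-normalized results apply here.

import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
n, reps, a = 100, 100_000, 4.0
# Centered Lomax (Pareto II) summands: E|X_i|^3 < infinity for a > 3, but E e^{t X_i} = infinity.
X = rng.pareto(a, size=(reps, n)) - 1.0 / (a - 1.0)
S = X.sum(axis=1)
V = np.sqrt((X ** 2).sum(axis=1))

for x in (1.0, 2.0, 3.0):
    gauss_tail = 0.5 * erfc(x / sqrt(2.0))   # 1 - Phi(x)
    ratio = (S >= x * V).mean() / gauss_tail
    print(f"x = {x}: estimated P(S_n/V_n >= x)/(1 - Phi(x)) = {ratio:.3f}")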

By the fact that 1 - Φ(x) ≤ 2 e^{-x²/2}/(1 + x) for x ≥ 0, it follows from (8) that the exponential non-uniform Berry-Esseen bound

|P(S_n/V_n ≥ x) - (1 - Φ(x))| ≤ A (1 + x)^{1+δ} e^{-x²/2} / d_{n,δ}^{2+δ}   (10)

holds for 0 ≤ x ≤ d_{n,δ}.
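For completeness, the step from (8) to (10) is a one-line verification (with the factor 2 absorbed into the absolute constant A): by (8) and the Gaussian tail bound just quoted,

|P(S_n/V_n ≥ x) - (1 - Φ(x))| = (1 - Φ(x)) |O(1)| (1 + x)^{2+δ} / d_{n,δ}^{2+δ} ≤ (2 e^{-x²/2}/(1 + x)) · A (1 + x)^{2+δ} / d_{n,δ}^{2+δ} = 2A (1 + x)^{1+δ} e^{-x²/2} / d_{n,δ}^{2+δ}.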

Theorem 7.1 has been successfully applied to study the studentized bootstrap and the self-normalized law of the iterated logarithm. The proof of Theorem 7.1 is quite complicated. Here we give a complete proof for the following result, which only requires condition (6).

Theorem 7.4. Let {x_n} be a sequence of real numbers such that x_n → ∞ and x_n = o(B_n). Assume that

∀ ε > 0,   B_n^{-2} Σ_{i=1}^n E X_i² I{|X_i| > ε B_n/x_n} → 0.   (11)

Then we have

ln P(S_n/V_n ≥ x_n) ∼ -x_n²/2.   (12)

As a direct consequence of Theorem 7.4, we have the following self-normalized law of the iterated logarithm for independent random variables: if B_n → ∞ and

∀ ε > 0,   B_n^{-2} Σ_{i=1}^n E X_i² I{|X_i| > ε B_n/(log log B_n)^{1/2}} → 0,

then

limsup_{n→∞} S_n / ( V_n (2 log log B_n)^{1/2} ) = 1   a.s.   (13)

(13) was proved in [16] under the additional assumption that max_{1≤i≤n} E X_i² ≤ (1/A) B_n² / log log B_n.

Proof of Theorem 7.4. It suffices to show that for any 0 < ε < 1/2,

P(S_n/V_n ≥ x_n) ≤ exp(-(1 - ε) x_n²/2)   (14)

and

P(S_n/V_n ≥ x_n) ≥ exp(-(1 + ε) x_n²/2)   (15)

for sufficiently large n. Let η = η_ε > 0 be a constant that will be specified later, and define τ = η² B_n/x_n. Set

X̄_i = X_i I{|X_i| ≤ τ},   S̄_n = Σ_{i=1}^n X̄_i,   V̄_n² = Σ_{i=1}^n X̄_i².

Observe that

P(S_n/V_n ≥ x_n) ≤ P(S̄_n/V_n ≥ (1 - η) x_n) + P( Σ_{i=1}^n X_i I{|X_i| > τ} / V_n ≥ η x_n )   (16)
  ≤ P(S̄_n/V_n ≥ (1 - η) x_n) + P( Σ_{i=1}^n I{|X_i| > τ} ≥ (η x_n)² )
  ≤ P(S̄_n ≥ (1 - η)^{3/2} x_n B_n) + P(V̄_n² ≤ (1 - η) B_n²) + P( Σ_{i=1}^n I{|X_i| > τ} ≥ (η x_n)² ).

By condition (11), we have

Σ_{i=1}^n P(|X_i| > τ) ≤ τ^{-2} Σ_{i=1}^n E X_i² I{|X_i| > τ} = o(x_n²).

Therefore

P( Σ_{i=1}^n I{|X_i| > τ} ≥ (η x_n)² ) ≤ ( e Σ_{i=1}^n P(|X_i| > τ) / (η x_n)² )^{(η x_n)²}   (17)
  = o(1)^{η² x_n²} ≤ exp(-2 x_n²)

for n sufficiently large.

Note that

E V̄_n² = B_n² - Σ_{i=1}^n E X_i² I{|X_i| > τ} = (1 - o(1)) B_n² ≥ (1 - η/2) B_n²

for sufficiently large n. Hence, by the Hoeffding inequality,

P(V̄_n² ≤ (1 - η) B_n²) ≤ exp( - (η B_n²/2)² / ( 2 Σ_{i=1}^n E X_i⁴ I{|X_i| ≤ τ} ) )   (18)
  ≤ exp( - (η B_n²/2)² / (2 τ² B_n²) )
  = exp( - x_n²/(8 η²) ) ≤ exp(-2 x_n²),

provided that η ≤ 1/16 and that n is sufficiently large.

We now estimate P(S̄_n ≥ (1 - η)^{3/2} x_n B_n). Observe that

|E S̄_n| = | Σ_{i=1}^n E X_i I{|X_i| > τ} | ≤ τ^{-1} Σ_{i=1}^n E X_i² I{|X_i| > τ} = o(1) x_n B_n.

It follows from the Bennett-Hoeffding inequality that

P(S̄_n ≥ (1 - η)^{3/2} x_n B_n) ≤ P(S̄_n - E S̄_n ≥ (1 - 2η) x_n B_n)   (19)
  ≤ exp( - ((1 - 2η) x_n B_n)² / ( 2(1 + η²) B_n² ) )
  ≤ exp( -(1 - ε) x_n²/2 ),

provided that (1 - 2η)²/(1 + η²) ≥ 1 - ε.

This proves (14), by (16) - (19).
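One routine bookkeeping step, supplied here for convenience rather than taken from the source: applying the bounds above with ε/2 in place of ε (that is, choosing η so that (1 - 2η)²/(1 + η²) ≥ 1 - ε/2), (16)-(19) give

P(S_n/V_n ≥ x_n) ≤ exp(-(1 - ε/2) x_n²/2) + 2 exp(-2 x_n²) ≤ 3 exp(-(1 - ε/2) x_n²/2) ≤ exp(-(1 - ε) x_n²/2)

for sufficiently large n, since x_n → ∞.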

To prove (15), let 0 < ε < 1/2 and 1/4 > η = η_ε > 0, and set G = {1 ≤ i ≤ n : x_n² E X_i² > η³ B_n²} and H = {1 ≤ i ≤ n : x_n² E X_i² ≤ η³ B_n²}. First we show that

#G = o(x_n²)   and   Σ_{i∈G} E X_i² = o(B_n²).   (20)

Note that for i ∈ G,

η³ (B_n/x_n)² < E X_i² = E X_i² I{|X_i| ≤ η² B_n/x_n} + E X_i² I{|X_i| > η² B_n/x_n}
  ≤ η⁴ (B_n/x_n)² + E X_i² I{|X_i| > η² B_n/x_n}.

Hence

E X_i² I{|X_i| > η² B_n/x_n} ≥ η⁴ (B_n/x_n)²

for i ∈ G, and by (11),

η⁴ (B_n/x_n)² #G ≤ Σ_{i=1}^n E X_i² I{|X_i| > η² B_n/x_n} = o(B_n²),

which proves the first part of (20). As to the second part of (20), we have

Σ_{i∈G} E X_i² = Σ_{i∈G} ( E X_i² I{|X_i| ≤ η² B_n/x_n} + E X_i² I{|X_i| > η² B_n/x_n} )
  ≤ Σ_{i∈G} (η² B_n/x_n)² + Σ_{i=1}^n E X_i² I{|X_i| > η² B_n/x_n}
  = o(x_n²) (B_n/x_n)² + o(B_n²) = o(B_n²),

as desired.

Now we show that we only need to focus on i ∈ H. Let

S_H = Σ_{i∈H} X_i,   S_G = Σ_{i∈G} X_i,   V_H² = Σ_{i∈H} X_i²,   V_G² = Σ_{i∈G} X_i².

Noting that

|S_G|/V_n ≤ (#G)^{1/2} = o(x_n) ≤ η x_n

for sufficiently large n (by the Cauchy-Schwarz inequality and the first part of (20)), we have

P(S_n/V_n ≥ x_n) = P(S_H/V_n ≥ x_n - S_G/V_n)   (21)
  ≥ P(S_H ≥ (1 + η) x_n V_n)
  ≥ P(S_H ≥ (1 + η) x_n (V_H² + η B_n²)^{1/2}, V_G² ≤ η B_n²)
  = P(S_H ≥ (1 + η) x_n (V_H² + η B_n²)^{1/2}) P(V_G² ≤ η B_n²).

From (20) we obtain that

P(V_G² ≤ η B_n²) ≥ 1 - E V_G²/(η B_n²) ≥ 1/2   (22)

for n sufficiently large.

Let τ = η² B_n/x_n, and let {Y_i, i ∈ H} be independent random variables such that Y_i has the distribution of X_i conditioned on |X_i| ≤ τ. Put

S_H^Y = Σ_{i∈H} Y_i   and   (V_H^Y)² = Σ_{i∈H} Y_i².

Note that

Σ_{i∈H} E Y_i² = Σ_{i∈H} E X_i² I{|X_i| ≤ τ} / P(|X_i| ≤ τ)
  ≤ Σ_{i∈H} E X_i² I{|X_i| ≤ τ} / (1 - η) ≤ B_n²/(1 - η),

Σ_{i∈H} E Y_i² ≥ Σ_{i∈H} E X_i² I{|X_i| ≤ τ}
  = Σ_{i∈H} E X_i² - Σ_{i∈H} E X_i² I{|X_i| > τ}
  = B_n² - Σ_{i∈G} E X_i² - Σ_{i∈H} E X_i² I{|X_i| > τ}
  = (1 - o(1)) B_n² ≥ (1 - η) B_n²,

and

Σ_{i∈H} |E Y_i| ≤ 2 τ^{-1} Σ_{i∈H} E X_i² I{|X_i| > τ} = o(x_n B_n).

We have

P(S_H ≥ (1 + η) x_n (V_H² + η B_n²)^{1/2})   (23)
  ≥ P(S_H ≥ (1 + η) x_n (V_H² + η B_n²)^{1/2}, max_{i∈H} |X_i| ≤ τ)
  = P(max_{i∈H} |X_i| ≤ τ) P(S_H^Y ≥ (1 + η) x_n ((V_H^Y)² + η B_n²)^{1/2})
  ≥ P(max_{i∈H} |X_i| ≤ τ) P(S_H^Y ≥ (1 + η) x_n ((V_H^Y)² + η B_n²)^{1/2}, (V_H^Y)² ≤ (1 + 2η) B_n²)
  ≥ P(max_{i∈H} |X_i| ≤ τ) [ P(S_H^Y ≥ (1 + η)(1 + 3η) x_n B_n) - P((V_H^Y)² > (1 + 2η) B_n²) ].

Similar to the proof of (18), we have

P((V_H^Y)² > (1 + 2η) B_n²) ≤ exp(-2 x_n²)   (24)

for sufficiently large n.

Also note that

P(max_{i∈H} |X_i| ≤ τ) = Π_{i∈H} ( 1 - P(|X_i| > τ) )   (25)
  ≥ Π_{i∈H} ( 1 - τ^{-2} E X_i² I{|X_i| > τ} )
  ≥ exp( -2 Σ_{i∈H} τ^{-2} E X_i² I{|X_i| > τ} )
  = exp(-o(1) x_n²).

Finally, by the Kolmogorov inequality

P(S_H^Y ≥ (1 + η)(1 + 3η) x_n B_n)   (26)
  ≥ exp( - (1 + ε/2)(1 + η)²(1 + 3η)² x_n² B_n² / ( 2(1 - η) B_n² ) )
  ≥ exp( -(1 + ε) x_n²/2 )   (27)

for sufficiently large n, provided that η_ε is chosen sufficiently small.

This proves (15) by combining the above inequalities.
