CENTRAL LIMIT THEOREM OF LINEAR SPECTRAL STATISTICS FOR LARGE DIMENSIONAL RANDOM MATRICES

WANG XIAOYING
(B.Sc. Northeast Normal University, China)

A THESIS SUBMITTED FOR THE DEGREE OF DOCTOR OF PHILOSOPHY
DEPARTMENT OF STATISTICS AND APPLIED PROBABILITY
NATIONAL UNIVERSITY OF SINGAPORE
2009

Acknowledgements

I would like to express my deep and sincere gratitude to my supervisors, Professor Bai Zhidong and Associate Professor Zhou Wang. Their valuable guidance and continuous support have been crucial to the completion of this thesis. I appreciate all the time and effort they have spent helping me solve the problems I encountered. I have learned many things from them, especially regarding academic research and character building.

Special acknowledgements are also due to Assistant Professor Pan Guangming and Mr. Wang Xiping for discussions on various topics of large dimensional random matrix theory.

It is a great pleasure to record my thanks to my dear friends Ms. Zhao Wanting, Ms. Zhao Jingyuan, Ms. Zhang Rongli, Ms. Li Hua, Ms. Zhang Xiaoe, Ms. Li Xiang, Mr. Khang Tsung Fei, Mr. Li Mengxin, Mr. Deng Niantao, Mr. Su Yue, Mr. Wang Daqing, and Mr. Loke Chok Kang, who have given me much help not only in my study but also in my daily life. Sincere thanks to all my friends who helped me in one way or another for their friendship and encouragement.

On a personal note, I thank my parents, husband, sisters and brother for their endless love and continuous support during the entire period of my PhD programme. I also thank my baby for giving me many happy times and a sense of responsibility.

Finally, I would like to thank the other members and staff of the department for their help in various ways and for providing such a pleasant studying environment.
I also wish to express my gratitude to the university and the department for supporting me through an NUS research scholarship.

Contents

Acknowledgements ii
Summary vii
Introduction
1.1 Large Dimensional Random Matrices
1.2 Spectral Analysis of LDRM
1.3 Methodologies
1.3.1 Moment Method
1.3.2 Stieltjes Transform
1.3.3 Orthogonal Polynomial Decomposition 11
1.4 Organization of the Thesis 12
Literature Review 14
2.1 Limiting Spectral Distribution (LSD) of LDRM 14
2.1.1 Wigner Matrix 14
2.1.2 Sample Covariance Matrix 16
2.1.3 Product of Two Random Matrices 18
2.2 Limits of Extreme Eigenvalues 19
2.3 Convergence Rate of ESD 21
2.4 CLT of Linear Spectral Statistics (LSS) 22
CLT of LSS for Wigner Matrices 26
3.1 Introduction and Main Result 26
3.2 Bernstein Polynomial Approximation 29
3.3 Truncation and Preliminary Formulae 33
3.3.1 Simplification by Truncation 33
3.3.2 Preliminary Formulae 35
3.4 The Mean Function of LSS 36
3.5 Convergence of ∆ − E∆ 49
CLT of LSS for Sample Covariance Matrices 61
4.1 Introduction and Main Result 61
4.2 Bernstein Polynomial Approximations 65
4.3 Simplification by Truncation and Normalization 68
4.4 Convergence of ∆ − E∆ 70
4.5 The Mean Function of LSS 89
Conclusion and Further Research 96
5.1 Conclusion and Discussion 96
5.2 Future Research 98
Bibliography 100

Summary

With the rapid development of computer science, large dimensional data have become increasingly common in various disciplines. Such data resist conventional multivariate analyses that rely on large sample theory, since the number of variables for each observation can be very large and comparable to the sample size. Classical multivariate analysis appears to suffer intolerable errors when dealing with large dimensional data. Consequently, a new approach, large dimensional random matrix (LDRM) theory, has been proposed to replace the classical large sample theory.

Spectral analysis of LDRM plays an important role in large dimensional data analysis. After finding the limiting spectral distribution (LSD) of the empirical spectral distribution (ESD, i.e. the empirical distribution of the eigenvalues) of an LDRM, one can easily derive the limit of the corresponding linear spectral statistics (LSS). Then, in order to conduct further statistical inference, it is important to find the limiting distribution of the LSS of LDRM.

A general conjecture puts the convergence rate of the ESD to the LSD at the order of $O(n^{-1})$. If this is true, then it seems natural to consider the asymptotic properties of the empirical process $G_n(x) = n(F_n(x) - F(x))$. Unfortunately, many lines of evidence show that the process $G_n(x)$ cannot converge in any metric space.
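Although the process $G_n(x)$ itself does not converge, an individual linear spectral statistic $G_n(f)$ does fluctuate on the $O(1)$ scale, which is what makes a CLT for it plausible. The following is a minimal numerical sketch of this fact; the Gaussian entries and the test function $f(x) = x^2$ are chosen purely for illustration and are not the general moment conditions of the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

def wigner(n):
    # real symmetric matrix with i.i.d. N(0, 1) entries on and above the diagonal
    a = rng.standard_normal((n, n))
    return (np.triu(a) + np.triu(a, 1).T) / np.sqrt(n)

def G_n(f, eigs, lsd_integral):
    # G_n(f) = sum_i f(lambda_i) - n * int f dF, with F the semicircular law
    return f(eigs).sum() - len(eigs) * lsd_integral

# int x^2 dF(x) = 1 for the semicircular law on [-2, 2]
vals = np.array([G_n(np.square, np.linalg.eigvalsh(wigner(400)), 1.0)
                 for _ in range(200)])
# no 1/sqrt(n) rescaling: the fluctuations of G_n(f) are already O(1)
print(vals.mean(), vals.std())
```

Even at $n = 400$ the sample standard deviation stays around a constant (about 2 for this particular ensemble and $f$), in contrast to classical settings where a $\sqrt{n}$ normalization would be required.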
As an alternative, we turn to the limiting distributions of the LSS of the process $G_n(x)$. In this thesis, using the Bernstein polynomial approximation and the Stieltjes transform method, under suitable moment conditions, we prove the central limit theorem (CLT) of the LSS for a generalized regular class $C^4$ of kernel functions, for large dimensional Wigner matrices and sample covariance matrices. These asymptotic properties of the LSS suggest that more efficient statistical inferences, such as hypothesis testing and the construction of confidence intervals or regions for a class of population parameters, are possible. The improved criteria on the constraint conditions of the kernel functions in our results should also provide a better understanding of the asymptotic properties of the ESD of the corresponding LDRM.

List of Notations

Chapter 3

$W_n = (x_{ij})_{1\le i,j\le n}$: Wigner matrix
$F_n(x)$: the ESD of $\frac{1}{\sqrt n}W_n$
$F(x)$: the semicircular law
$U$: an open set of the real line containing the interval $[-2, 2]$
$G_n(f) = n\int_{-\infty}^{\infty} f(x)[F_n - F](dx)$, $f \in C^4(U)$
$s_n(z)$, $s(z)$: the Stieltjes transforms of $F_n(x)$ and $F(x)$
$D(z) = \big(\frac{1}{\sqrt n}W_n - zI_n\big)^{-1}$
$W_n(k)$: the submatrix extracted from $W_n$ by removing its $k$-th row and $k$-th column
$\gamma_m$: the contour formed by the boundary of the rectangle with vertices $(\pm a \pm i/\sqrt m)$
$\gamma_m^h$, $\gamma_m^v$: the union of the two horizontal/vertical parts of $\gamma_m$

Chapter 4: CLT of LSS for Sample Covariance Matrices

Taking the trace, dividing by $p$ and taking the expectation, we derive
$$
\omega_n(z) = -Es_n(z) - \frac{1}{z(E\underline{s}_n(z)+1)} = -\frac{1}{pz(E\underline{s}_n(z)+1)}\sum_{j=1}^n E\big(\beta_j(z)d_j(z)\big) = -\frac{J_n(z)}{pz(E\underline{s}_n(z)+1)}, \qquad (4.29)
$$
where
$$
d_j(z) = r_j^*D_j^{-1}(z)r_j - \frac{1}{n}\operatorname{tr} ED^{-1}(z).
$$
On the other hand, by the identity $E\underline{s}_n(z) = -(1-y_n)z^{-1} + y_nEs_n(z)$, we have
$$
\frac{y_nz}{E\underline{s}_n(z)}\,\omega_n(z) = -z - \frac{1}{E\underline{s}_n(z)} + \frac{y_n}{E\underline{s}_n(z)+1} = R_n(z),
$$
which implies that
$$
E\underline{s}_n(z) = \Big(-z + \frac{y_n}{E\underline{s}_n(z)+1} - R_n(z)\Big)^{-1}. \qquad (4.30)
$$
For $\underline{s}_n^0(z)$: since $s_n^0(z) = \big(1-y_n-z-y_nzs_n^0(z)\big)^{-1}$ and $\underline{s}_n^0(z) = -(1-y_n)z^{-1} + y_ns_n^0(z)$, we have
$$
\underline{s}_n^0(z) = \Big(-z + \frac{y_n}{\underline{s}_n^0(z)+1}\Big)^{-1}. \qquad (4.31)
$$
By (4.30) and (4.31), we get
$$
E\underline{s}_n(z) - \underline{s}_n^0(z) = E\underline{s}_n(z)\underline{s}_n^0(z)\Big(\frac{y_n}{\underline{s}_n^0(z)+1} - \frac{y_n}{E\underline{s}_n(z)+1} + R_n(z)\Big) = \frac{y_nE\underline{s}_n(z)\underline{s}_n^0(z)}{(\underline{s}_n^0(z)+1)(E\underline{s}_n(z)+1)}\big(E\underline{s}_n(z)-\underline{s}_n^0(z)\big) + E\underline{s}_n(z)\underline{s}_n^0(z)R_n(z),
$$
which, combined with (4.29), leads to
$$
n\big(E\underline{s}_n(z)-\underline{s}_n^0(z)\big) = \frac{nE\underline{s}_n(z)\underline{s}_n^0(z)R_n(z)}{1-\dfrac{y_nE\underline{s}_n(z)\underline{s}_n^0(z)}{(\underline{s}_n^0(z)+1)(E\underline{s}_n(z)+1)}}, \qquad nE\underline{s}_n(z)\underline{s}_n^0(z)R_n(z) = y_nz\,\underline{s}_n^0(z)\,n\omega_n(z) = -\frac{\underline{s}_n^0(z)}{E\underline{s}_n(z)+1}J_n(z). \qquad (4.32)
$$
Thus, in order to find the limit of $M_n(z) = n[E\underline{s}_n(z)-\underline{s}_n^0(z)]$, it suffices to find the limit of $J_n(z)$.

Let $\bar d_j(z) = r_j^*D_j^{-1}(z)r_j - \frac{1}{n}\operatorname{tr}D^{-1}(z)$ and $\bar J_n(z) = \sum_{j=1}^n E(\beta_j(z)\bar d_j(z))$. By (4.11), we have
$$
J_n(z) = \bar J_n(z) + \sum_{j=1}^n E\Big[\beta_j(z)\Big(\frac{1}{n}\operatorname{tr}D^{-1}(z)-\frac{1}{n}\operatorname{tr}ED^{-1}(z)\Big)\Big] = \bar J_n(z) + \sum_{j=1}^n E\Big[\big(\beta_j(z)-\bar\beta_j(z)\big)\Big(\frac{1}{n}\operatorname{tr}D^{-1}(z)-\frac{1}{n}\operatorname{tr}ED^{-1}(z)\Big)\Big] = \bar J_n(z) - T_1(z) + T_2(z),
$$
where, from (4.22),
$$
T_1(z) = \sum_{j=1}^n E\Big[\bar\beta_j^2(z)\delta_j(z)\Big(\frac{1}{n}\operatorname{tr}D^{-1}(z)-\frac{1}{n}\operatorname{tr}ED^{-1}(z)\Big)\Big] = \sum_{j=1}^n E\Big[\bar\beta_j^2(z)\delta_j(z)\,\frac{1}{n}\Big(\operatorname{tr}\big(D^{-1}(z)-D_j^{-1}(z)\big)-E\operatorname{tr}\big(D^{-1}(z)-D_j^{-1}(z)\big)\Big)\Big],
$$
so that
$$
|T_1(z)| \le \sum_{j=1}^n \frac{K}{nv}\cdot\frac{K}{\sqrt n\,v} = \frac{K}{\sqrt n\,v^2}.
$$
It follows from (4.3) of Bai and Silverstein (1998) that for $l \ge 1$,
$$
E\Big|\frac{1}{n}\operatorname{tr}D^{-1}(z)-\frac{1}{n}E\operatorname{tr}D^{-1}(z)\Big|^{l} \le \frac{K_l}{(\sqrt n\,v)^{l}}. \qquad (4.33)
$$
Hence,
$$
|T_2(z)| = \Big|\sum_{j=1}^n E\Big[\beta_j(z)\bar\beta_j^2(z)\delta_j^2(z)\Big(\frac{1}{n}\operatorname{tr}D^{-1}(z)-\frac{1}{n}\operatorname{tr}ED^{-1}(z)\Big)\Big]\Big| \le \frac{K}{\sqrt n\,v^3}.
$$
From the above estimates on $T_1(z)$ and $T_2(z)$, we claim that $J_n(z) = \bar J_n(z) + \bar\varepsilon_n$, where here and in the sequel $\bar\varepsilon_n = O\big((\sqrt n\,v^3)^{-1}\big)$.

Now we only need to consider the limit of $\bar J_n(z)$. By (4.10), we write
$$
\bar J_n(z) = \sum_{j=1}^n E\big[(\beta_j(z)-\bar\beta_j(z))\varepsilon_j(z)\big] + \frac{1}{n}\sum_{j=1}^n E\big[\beta_j(z)\operatorname{tr}\big(D_j^{-1}(z)-D^{-1}(z)\big)\big]
$$
$$
= -\sum_{j=1}^n E\big(\bar\beta_j^2(z)\varepsilon_j^2(z)\big) + \sum_{j=1}^n E\big(\bar\beta_j^2(z)\beta_j(z)\varepsilon_j^3(z)\big) + \frac{1}{n}\sum_{j=1}^n E\big(\beta_j^2(z)r_j^*D_j^{-2}(z)r_j\big) \equiv \bar J_{n1}(z) + \bar J_{n2}(z) + \bar J_{n3}(z).
$$
From Lemma 3.4.1 and Lemma 4.4.4, we find
$$
|\bar J_{n2}(z)| \le K\sum_{j=1}^n \big(E|\varepsilon_j^6(z)|\big)^{1/2} \le \frac{K}{\sqrt n\,v^3}.
$$
By Lemma 4.4.4, $\bar\beta_j(z)$ and $\beta_j(z)$ can be replaced by $-z\underline{s}(z)$; thus we get
$$
\bar J_{n3}(z) = \frac{z^2\underline{s}^2(z)}{n^2}\sum_{j=1}^n E\operatorname{tr}D_j^{-2}(z) + \bar\varepsilon_n = z^2\underline{s}^2(z)\psi_n(z) + \bar\varepsilon_n,
$$
where $\psi_n(z) = \frac{1}{n^2}\sum_{j=1}^n E\operatorname{tr}D_j^{-2}(z)$. By the identity for quadratic forms (4.18) and the fact from Lemma 4.4.5 that $E[D_j^{-1}(z)]_{ii}$ can be replaced by $s(z) = -z^{-1}(\underline{s}(z)+1)^{-1}$, we have
$$
\bar J_{n1}(z) = -z^2\underline{s}^2(z)\sum_{j=1}^n E\varepsilon_j^2(z) + \bar\varepsilon_n = -\frac{z^2\underline{s}^2(z)}{n^2}\sum_{j=1}^n\Big(\kappa_2\sum_{i=1}^p E[D_j^{-1}(z)]_{ii}^2 + (\kappa_1+1)E\operatorname{tr}D_j^{-2}(z)\Big) + \bar\varepsilon_n \qquad (4.34)
$$
$$
= -y_n\kappa_2k^2(z) - z^2\underline{s}^2(z)(\kappa_1+1)\psi_n(z) + \bar\varepsilon_n,
$$
where $\kappa_1$, $\kappa_2$ and $k(z)$ were defined in Theorem 4.1.1.

Now our goal is to find the limit of $\psi_n(z)$. Using the expansion of $D_j^{-1}(z)$ in (4.21), we get
$$
\psi_n(z) = \frac{y_n}{z^2(\underline{s}(z)+1)^2} + \frac{k^2(z)}{n^2}\sum_{j=1}^n\sum_{i,l\ne j} E\operatorname{tr}\Big[\Big(r_ir_i^*-\frac{1}{n}I\Big)D_{ij}^{-1}(z)D_{lj}^{-1}(z)\Big(r_lr_l^*-\frac{1}{n}I\Big)\Big] + \bar\varepsilon_n.
$$
Note that the cross terms ($i\ne l$) vanish if either $D_{ij}^{-1}(z)$ or $D_{lj}^{-1}(z)$ is replaced by $D_{lij}^{-1}(z)$, where $D_{lij}(z) = D_{ij}(z) - r_lr_l^* = D_{lj}(z) - r_ir_i^*$ and
$$
D_{ij}^{-1}(z) - D_{lij}^{-1}(z) = -\frac{D_{lij}^{-1}(z)r_lr_l^*D_{lij}^{-1}(z)}{1+r_l^*D_{lij}^{-1}(z)r_l}.
$$
Therefore, by (4.22), the sum of the cross terms is negligible, bounded by $K/(\sqrt n\,v^3)$; thus we find
$$
\frac{1}{n^2}\sum_{j=1}^n\sum_{i,l\ne j} E\operatorname{tr}\Big[\Big(r_ir_i^*-\frac{1}{n}I\Big)D_{ij}^{-1}(z)D_{lj}^{-1}(z)\Big(r_lr_l^*-\frac{1}{n}I\Big)\Big] = \frac{1}{n^2}\sum_{j=1}^n\sum_{i\ne j} E\big[(r_i^*D_{ij}^{-2}(z)r_i)(r_i^*r_i)\big] + \bar\varepsilon_n = \frac{1}{n^4}\sum_{j=1}^n\sum_{i\ne j} E\operatorname{tr}D_{ij}^{-2}(z)\,(p+O(1)) + \bar\varepsilon_n = y_n\psi_n(z) + \bar\varepsilon_n.
$$
From the above, we get
$$
\psi_n(z) = \frac{y_n}{z^2(\underline{s}(z)+1)^2} + y_nk^2(z)\psi_n(z) + \bar\varepsilon_n.
$$
Combined with (4.34), we have
$$
J_n(z) = -\kappa_2y_nk^2(z) - \frac{\kappa_1y_nk^2(z)}{1-y_nk^2(z)} + \bar\varepsilon_n.
$$
Thus, from (4.32), it follows that
$$
M_n(z) = \frac{nE\underline{s}_n(z)\underline{s}_n^0(z)R_n(z)}{1-\dfrac{y_nE\underline{s}_n(z)\underline{s}_n^0(z)}{(\underline{s}_n^0(z)+1)(E\underline{s}_n(z)+1)}} = \frac{-\dfrac{\underline{s}_n^0(z)}{E\underline{s}_n(z)+1}J_n(z)}{1-\dfrac{y_nE\underline{s}_n(z)\underline{s}_n^0(z)}{(\underline{s}_n^0(z)+1)(E\underline{s}_n(z)+1)}} = \frac{\kappa_1y_nk^3(z)}{(1-y_nk^2(z))^2} + \frac{\kappa_2y_nk^3(z)}{1-y_nk^2(z)} + \bar\varepsilon_n \equiv M_1(z) + M_2(z) + \bar\varepsilon_n.
$$
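The self-consistent equation (4.31) for $\underline{s}_n^0(z)$ can also be solved numerically, which gives a quick sanity check on this kind of derivation. Below is a minimal sketch assuming the standard Marčenko-Pastur setting with scale parameter 1; the fixed-point iteration and the quadratic closed form are classical facts about the Marčenko-Pastur law, not steps of the proof:

```python
import cmath

def stieltjes_fixed_point(z, y, iters=200):
    # iterate s <- 1 / (-z + y / (1 + s)), the (4.31)-type self-consistent
    # equation for the companion Stieltjes transform of the MP law
    s = -1.0 / z  # initial guess from the large-|z| expansion
    for _ in range(iters):
        s = 1.0 / (-z + y / (1.0 + s))
    return s

def stieltjes_closed_form(z, y):
    # the same transform as the root of z*s^2 + (z + 1 - y)*s + 1 = 0
    # with positive imaginary part when Im z > 0
    d = cmath.sqrt((z + 1 - y) ** 2 - 4 * z)
    r1 = (-(z + 1 - y) + d) / (2 * z)
    r2 = (-(z + 1 - y) - d) / (2 * z)
    return r1 if r1.imag > 0 else r2

z, y = 1.0 + 0.5j, 0.5
assert abs(stieltjes_fixed_point(z, y) - stieltjes_closed_form(z, y)) < 1e-10
```

The iteration converges for $z$ in the upper half plane because the map sends the upper half plane into itself, so the two computations agree to machine precision.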
Therefore, we can calculate the mean function in the following two parts. First,
$$
-\frac{1}{2\pi i}\int_{\gamma_m^h} f_m(z)M_1(z)\,dz = -\frac{\kappa_1}{2\pi i}\int_{\gamma_m^h} f_m(z)\frac{y_nk^3(z)}{(1-y_nk^2(z))^2}\,dz = \frac{\kappa_1}{4\pi i}\int_{\gamma_m^h} f_m(z)\frac{d}{dz}\ln\big(1-y_nk^2(z)\big)\,dz
$$
$$
= -\frac{\kappa_1}{4\pi i}\int_{\gamma_m^h} f_m'(z)\ln\big(1-y_nk^2(z)\big)\,dz \;\longrightarrow\; \frac{\kappa_1}{2\pi}\int_a^b f'(x)\arg\big(1-yk^2(x)\big)\,dx
$$
as $n\to\infty$, $a_l\to a$ and $b_r\to b$. Similarly,
$$
-\frac{1}{2\pi i}\int_{\gamma_m^h} f_m(z)M_2(z)\,dz = -\frac{\kappa_2}{2\pi i}\int_{\gamma_m^h} f_m(z)\frac{y_nk^3(z)}{1-y_nk^2(z)}\,dz \;\longrightarrow\; -\frac{\kappa_2}{\pi}\int_a^b f(x)\operatorname{Im}\frac{yk^3(x)}{1-yk^2(x)}\,dx.
$$
Hence, summing the two terms, we obtain the mean function of the limiting distribution in (4.3). This completes the proof of Theorem 4.1.1.

Chapter 5
Conclusion and Further Research

5.1 Conclusion and Discussion

The purpose of this thesis is to investigate the asymptotic properties of the LSS of some LDRM. Our main results, Theorem 3.1.1 and Theorem 4.1.1, enable us to perform statistical inference, such as hypothesis testing and the construction of confidence intervals or regions, on a class of population parameters $\theta = \int f(x)\,dF(x)$. In both theorems, we prove the CLT of the LSS for a more general regular class (namely $C^4$) of test functions, rather than the analytic test functions required in Bai and Yao (2005) for Wigner matrices and in Bai and Silverstein (2004) for sample covariance matrices. Pastur and Lytova (2008) studied the asymptotic distribution of $n\int f(x)\,d(F_n(x)-EF_n(x))$ under the conditions that the fourth cumulant of the off-diagonal elements of Wigner matrices is zero and that the Fourier transform of $f(x)$ has a finite fifth moment, which is equivalent to $f$ having a fifth continuous derivative. Their conditions are stronger than ours.

In the definition of $G_n(f)$, the LSS $\hat\theta = \int f(x)\,dF_n(x)$ can be regarded as an estimator of $\theta = \int f(x)\,dF(x)$. We remind the reader that centering at $\theta = \int f(x)\,dF(x)$ itself, instead of at $E\hat\theta = E\int f(x)\,dF_n(x)$, has a strong statistical meaning in applications of Theorem 3.1.1 and Theorem 4.1.1. For example, in quantum mechanics, a Wigner matrix is a discretized version of a random linear transformation in a Hilbert space, and the semicircular law is derived under ideal assumptions. Quantum physicists may want to test the validity of these ideal assumptions. They may therefore suitably select one or several $f$'s so that the corresponding $\theta$'s characterize the semicircular law. Using the limiting distribution of $G_n(f) = n(\hat\theta - \theta)$, one may perform a statistical test of the ideal hypothesis. However, one cannot apply the limiting distribution of $n(\hat\theta - E\hat\theta)$ in the above test.

In this thesis, for the first time, we introduced the Bernstein polynomial approximation into the study of LDRM. Since Bernstein polynomials are analytic functions, we can use the Cauchy theorem, so that the convergence of the ESD is converted into the convergence of the corresponding Stieltjes transforms. Following this procedure, we obtain the CLT of the LSS. This approximation procedure may provide a novel way to explore the convergence rates of other general matrices.

5.2 Future Research

Our present results suggest several potentially fruitful lines of inquiry.

Firstly, examining the proofs of Theorem 3.1.1 and Theorem 4.1.1, one may find that the assumption of a fourth continuous derivative of $f$ is related to the convergence rate of $\|F_n - F\|$. If the convergence rate of $\|F_n - F\|$ is improved and/or proved under weaker conditions, then the results of Theorem 3.1.1 and Theorem 4.1.1 would hold under the new, weakened conditions. We conjecture that our results still hold under the condition that the fourth moments of the entries of the random matrices are finite and $f$ has a first continuous derivative.
Secondly, the theorems derived in this study are based on the assumption that the entries of the random matrices are mutually independent. However, random matrices without an independence structure arise in a number of practical situations. One could extend our work to related structures of practical interest, such as $T^{1/2}Y_kY_k^*T^{1/2}$, the generalized deformation of sample covariance matrices studied by Silverstein (1995), and the large dimensional sample covariance matrices without independence structure within columns in Bai and Zhou (2008).

Thirdly, finding the limiting spectral distributions and convergence rates of other types of matrices may serve practical purposes. Examples include the following three types of matrices, listed as unsolved problems in Bai (1999): the autocovariance matrix in time series analysis, the information matrix in a polynomial regression model, and the derivative of a transition matrix in a Markov process. Additional examples include the multivariate F-matrix, a product of two random matrices first considered by Wachter (1980), and the Hankel, Toeplitz and symmetric Markov matrices in Bryc, Dembo and Jiang (2006).

Finally, the asymptotic behavior of the eigenvectors (also known as the eigenmatrix in the literature) of LDRM is another research issue in LDRM theory. Compared with the extensive research on eigenvalues, there have been very few results on the asymptotic properties of the eigenvectors of LDRM. The main difficulty seems to be that, contrary to the deterministic limit of the eigenvalues, the eigenvectors behave in a completely chaotic manner that is difficult to describe. Silverstein (1979, 1981) first defined a stochastic process to characterize the randomness of eigenvectors. His results supported the conjecture that the orthonormal eigenvectors of large sample covariance matrices are asymptotically Haar-distributed.
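For Gaussian entries the Haar property is in fact exact, by orthogonal invariance, which makes the conjectured behaviour easy to illustrate numerically: if the eigenvector matrix is Haar-distributed, the rescaled entries $\sqrt{p}\,u_i$ of an eigenvector $u$ should look like standard normal samples. A minimal sketch, in which the dimensions and the choice of the top eigenvector are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def top_eigenvector(p, n):
    # eigenvector of the largest eigenvalue of a p x p sample covariance matrix
    x = rng.standard_normal((p, n))
    _, vecs = np.linalg.eigh(x @ x.T / n)
    # an eigenvector is only defined up to sign, so attach a random global sign
    return vecs[:, -1] * rng.choice((-1.0, 1.0))

p, n, reps = 50, 200, 200
# pool the rescaled entries sqrt(p) * u_i over many replications
entries = np.concatenate([np.sqrt(p) * top_eigenvector(p, n) for _ in range(reps)])
# mean ~ 0, variance ~ 1, fourth moment ~ 3, as for N(0, 1) samples
print(entries.mean(), entries.var(), (entries ** 4).mean())
```

For non-Gaussian entries the same experiment probes exactly the open question discussed above, since the orthogonal invariance argument no longer applies.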
Different eigenvector statistics arising from dissipative systems can be found in Pepłowski and Haake (1993). Much work remains to be done to clarify the asymptotic properties of eigenvectors.

Bibliography

[1] Arnold, L. (1967). On the asymptotic distribution of the eigenvalues of random matrices. J. Math. Anal. Appl. 20 262-268.
[2] Arnold, L. (1971). On Wigner's semicircle law for the eigenvalues of random matrices. Z. Wahrsch. Verw. Gebiete 19 191-198.
[3] Bai, Z. D. (1993a). Convergence rate of expected spectral distributions of large random matrices. Part I. Wigner matrices. Ann. Probab. 21 625-648.
[4] Bai, Z. D. (1993b). Convergence rate of expected spectral distributions of large random matrices. Part II. Sample covariance matrices. Ann. Probab. 21 649-672.
[5] Bai, Z. D. (1999). Methodologies in spectral analysis of large dimensional random matrices, a review. Stat. Sin. 9 611-677.
[6] Bai, Z. D., Miao, B. Q. and Tsay, J. (1997). A note on the convergence rate of the spectral distributions of large random matrices. Statist. Probab. Lett. 34 95-102.
[7] Bai, Z. D., Miao, B. Q. and Tsay, J. (1999). Remark on the convergence rate of the spectral distributions of Wigner matrices. J. Theo. Probab. 12(2) 301-311.
[8] Bai, Z. D., Miao, B. and Tsay, J. (2002). Convergence rates of the spectral distributions of large Wigner matrices. Int. Math. J. 65-90.
[9] Bai, Z. D., Miao, B. Q. and Yao, J. F. (2003). Convergence rates of spectral distributions of large sample covariance matrices. SIAM J. Matrix Anal. Appl. 25(1) 105-127.
[10] Bai, Z. D. and Saranadasa, H. (1996). Effect of high dimension: comparison of significance tests for a high dimensional two sample problem. Stat. Sin. 6 311-329.
[11] Bai, Z. D. and Silverstein, J. W. (1998). No eigenvalues outside the support of the limiting spectral distribution of large dimensional random matrices. Ann. Probab. 26 316-345.
[12] Bai, Z. D. and Silverstein, J. W. (2004).
CLT for linear spectral statistics of large-dimensional sample covariance matrices. Ann. Probab. 32 553-605.
[13] Bai, Z. D. and Silverstein, J. W. (2006). Spectral Analysis of Large Dimensional Random Matrices. Mathematics Monograph Series 2, Science Press, Beijing.
[14] Bai, Z. D. and Yao, J. (2005). On the convergence of the spectral empirical process of Wigner matrices. Bernoulli 11 1059-1092.
[15] Bai, Z. D. and Yin, Y. Q. (1988). Necessary and sufficient conditions for the almost sure convergence of the largest eigenvalue of Wigner matrices. Ann. Probab. 16 1729-1741.
[16] Bai, Z. D. and Yin, Y. Q. (1993). Limit of the smallest eigenvalue of large dimensional covariance matrix. Ann. Probab. 21 1275-1294.
[17] Bai, Z. D., Yin, Y. Q. and Krishnaiah, P. R. (1986). On limiting spectral distribution of product of two random matrices when the underlying distribution is isotropic. J. Multiv. Anal. 19 189-200.
[18] Bai, Z. D., Yin, Y. Q. and Krishnaiah, P. R. (1987). On the limiting empirical distribution function of the eigenvalues of a multivariate F matrix. Theory Probab. Appl. 32 490-500.
[19] Bai, Z. D. and Zhou, W. (2008). Large sample covariance matrices without independence structures in columns. Stat. Sin. 18 425-442.
[20] Billingsley, P. (1995). Probability and Measure. 3rd ed. Wiley, New York.
[21] Bryc, W., Dembo, A. and Jiang, T. (2006). Spectral measure of large random Hankel, Markov and Toeplitz matrices. Ann. Probab. 34(1) 1-38.
[22] Burkholder, D. L. (1973). Distribution function inequalities for martingales. Ann. Probab. 1 19-42.
[23] Costin, O. and Lebowitz, J. (1995). Gaussian fluctuations in random matrices. Physical Review Letters 75 69-72.
[24] Diaconis, P. and Evans, S. N. (2001). Linear functionals of eigenvalues of random matrices. Trans. Amer. Math. Soc. 353(7) 2615-2633.
[25] Donoho, D. (2000). High-dimensional data analysis: the curses and blessings of dimensionality.
Lecture in “Math Challenges of the 21st Century” conference, American Math. Society.
[26] Dempster, A. P. (1958). A high dimensional two sample significance test. Ann. Math. Statist. 29 995-1010.
[27] Geman, S. (1980). A limit theorem for the norm of random matrices. Ann. Probab. 8 252-261.
[28] Anderson, G. W. and Zeitouni, O. (2006). A CLT for a band matrix model. Probab. Theory Relat. Fields 134 283-338.
[29] Grenander, U. (1963). Probabilities on Algebraic Structures. John Wiley, New York-London.
[30] Grenander, U. and Silverstein, J. W. (1977). Spectral analysis of networks with random topologies. SIAM J. Appl. Math. 32(2) 499-519.
[31] Huber, P. J. (1973). Robust regression: asymptotics, conjectures and Monte Carlo. Ann. Stat. 1 799-821.
[32] Jiang, T. F. (2004). The limiting distributions of eigenvalues of sample correlation matrices. Sankhyā 66 35-48.
[33] Johansson, K. (1998). On fluctuations of eigenvalues of random Hermitian matrices. Duke Math. J. 91 151-204.
[34] Jonsson, D. (1982). Some limit theorems for the eigenvalues of a sample covariance matrix. J. Multivariate Anal. 12 1-38.
[35] Khorunzhy, A. M., Khoruzhenko, B. A. and Pastur, L. A. (1996). Asymptotic properties of large random matrices with independent entries. J. Math. Phys. 37 5033-5060.
[36] Loève, M. (1977). Probability Theory. I. Fourth Edition, Springer-Verlag, New York.
[37] Marčenko, V. A. and Pastur, L. A. (1967). Distribution of eigenvalues for some sets of random matrices. Math. USSR-Sb. 1 457-483.
[38] Mehta, M. L. (1990). Random Matrices, 2nd ed. Academic Press, New York.
[39] Pastur, L. A. and Lytova, A. (2008). Central limit theorem for linear eigenvalue statistics of random matrices with independent entries. arXiv:0809.4698.
[40] Pepłowski, P. and Haake, F. (1993). Random-matrix theory and eigenvector statistics of dissipative dynamical systems. J. Phys. A: Math. Gen. 26 3473-3479.
[41] Rao, C. R. and Rao, M. B.
(2001). Matrix Algebra and Its Applications to Statistics and Econometrics. World Scientific, River Edge, NJ.
[42] Silverstein, J. W. (1979). On the randomness of eigenvectors generated from networks with random topologies. SIAM J. Appl. Math. 37 235-245.
[43] Silverstein, J. W. (1984). Comments on a result of Yin, Bai and Krishnaiah for large dimensional multivariate F matrices. J. Multiv. Anal. 15 408-409.
[44] Silverstein, J. W. (1985a). The limiting eigenvalue distribution of a multivariate F matrix. SIAM J. Math. Anal. 16 641-646.
[45] Silverstein, J. W. (1985b). The smallest eigenvalue of a large dimensional Wishart matrix. Ann. Probab. 13(4) 1364-1368.
[46] Silverstein, J. W. (1989). On the weak limit of the largest eigenvalue of a large dimensional sample covariance matrix. J. Multivariate Anal. 30 307-311.
[47] Silverstein, J. W. (1995). Strong convergence of the empirical distribution of eigenvalues of large dimensional random matrices. J. Multivariate Anal. 55 331-339.
[48] Silverstein, J. W. (1981). Describing the behavior of eigenvectors of random matrices using sequences of measures on orthogonal groups. SIAM J. Math. Anal. 12(2) 274-281.
[49] Silverstein, J. W. and Bai, Z. D. (1995). On the empirical distribution of eigenvalues of a class of large dimensional random matrices. J. Multivariate Anal. 54 175-192.
[50] Sinai, Y. and Soshnikov, A. (1998). Central limit theorem for traces of large random symmetric matrices with independent elements. Bol. Soc. Brasil. Mat. (N.S.) 29 1-24.
[51] Wachter, K. W. (1978). The strong limits of random matrix spectra for sample matrices of independent elements. Ann. Stat. 6(1) 1-18.
[52] Wachter, K. W. (1980). The limiting empirical measure of multiple discriminant ratios. Ann. Stat. 8(1) 937-957.
[53] Wigner, E. P. (1955). Characteristic vectors of bordered matrices with infinite dimensions. Ann. Math. 62 548-564.
[54] Wigner, E. P. (1958).
On the distribution of the roots of certain symmetric matrices. Ann. Math. 67 325-327.
[55] Yin, Y. Q. (1986). Limiting spectral distribution for a class of random matrices. J. Multivariate Anal. 20 50-68.
[56] Yin, Y. Q., Bai, Z. D. and Krishnaiah, P. R. (1983). Limiting behavior of the eigenvalues of a multivariate F matrix. J. Multivariate Anal. 13 508-516.
[57] Yin, Y. Q., Bai, Z. D. and Krishnaiah, P. R. (1988). On the limit of the largest eigenvalue of the large dimensional sample covariance matrix. Probab. Theory Related Fields 78 509-521.
[58] Yin, Y. Q. and Krishnaiah, P. R. (1983). A limit theorem for the eigenvalues of product of two random matrices. J. Multivariate Anal. 13 489-507.