Basic Principles and Applications of Probability Theory

A.V. Skorokhod

Edited by Yu.V. Prokhorov
Translated by B.D. Seckler

A.V. Skorokhod
Department of Statistics and Probability
Michigan State University
East Lansing, MI 48824, USA

Yu.V. Prokhorov (Editor)
Russian Academy of Science
Steklov Mathematical Institute
ul. Gubkina
117966 Moscow, Russia

B.D. Seckler (Translator)
19 Ramsey Road
Great Neck, NY 11023-1611, USA
e-mail: bersec@aol.com

Original Russian edition published by VINITI, Moscow 1989. Title of the Russian edition: Teoriya Veroyatnostej. Published in the series: Itogi Nauki i Tekhniki, Sovremennye Problemy Matematiki, Fundamental'nye Napravleniya, Tom 43.

Library of Congress Control Number: 2004110444
Mathematics Subject Classification (2000): 60Axx, 60Dxx, 60Fxx, 60Gxx, 60Jxx, 62Cxx, 94Axx
ISBN 3-540-54686-3 Springer Berlin Heidelberg New York

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer. Violations are liable for prosecution under the German Copyright Law.

Springer is a part of Springer Science+Business Media
springeronline.com

© Springer-Verlag Berlin Heidelberg 2005
Printed in Germany

The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Typeset by Steingraeber Satztechnik GmbH, Heidelberg, using a Springer TeX macro package.
Cover design: Erich Kirchner, Heidelberg
Printing: Mercedes-Druck, Berlin. Binding: Stein + Lehmann, Berlin
Printed on acid-free paper 46/3142LK

Contents

I. Probability. Basic Notions. Structure. Methods
II. Markov Processes and Probability Applications in Analysis
III. Applied Probability
Author Index
Subject Index

I. Probability. Basic Notions. Structure. Methods

Contents

1 Introduction
  1.1 The Nature of Randomness
    1.1.1 Determinism and Chaos
    1.1.2 Unpredictability and Randomness
    1.1.3 Sources of Randomness
    1.1.4 The Role of Chance
  1.2 Formalization of Randomness
    1.2.1 Selection from Among Several Possibilities. Experiments. Events
    1.2.2 Relative Frequencies. Probability as an Ideal Relative Frequency
    1.2.3 The Definition of Probability
  1.3 Problems of Probability Theory
    1.3.1 Probability and Measure Theory
    1.3.2 Independence
    1.3.3 Asymptotic Behavior of Stochastic Systems
    1.3.4 Stochastic Analysis

2 Probability Space
  2.1 Finite Probability Space
    2.1.1 Combinatorial Analysis
    2.1.2 Conditional Probability
    2.1.3 Bernoulli's Scheme. Limit Theorems
  2.2 Definition of Probability Space
    2.2.1 σ-algebras. Probability
    2.2.2 Random Variables. Expectation
    2.2.3 Conditional Expectation
    2.2.4 Regular Conditional Distributions
    2.2.5 Spaces of Random Variables. Convergence
  2.3 Random Mappings
    2.3.1 Random Elements
    2.3.2 Random Functions
    2.3.3 Random Elements in Linear Spaces
  2.4 Construction of Probability Spaces
    2.4.1 Finite-dimensional Space
    2.4.2 Function Spaces
    2.4.3 Linear Topological Spaces. Weak Distributions
    2.4.4 The Minlos-Sazonov Theorem
3 Independence
  3.1 Independence of σ-Algebras
    3.1.1 Independent Algebras
    3.1.2 Conditions for the Independence of σ-Algebras
    3.1.3 Infinite Sequences of Independent σ-Algebras
    3.1.4 Independent Random Variables
  3.2 Sequences of Independent Random Variables
    3.2.1 Sums of Independent Random Variables
    3.2.2 Kolmogorov's Inequality
    3.2.3 Convergence of Series of Independent Random Variables
    3.2.4 The Strong Law of Large Numbers
  3.3 Random Walks
    3.3.1 The Renewal Scheme
    3.3.2 Recurrency
    3.3.3 Ladder Functionals
  3.4 Processes with Independent Increments
    3.4.1 Definition
    3.4.2 Stochastically Continuous Processes
    3.4.3 Lévy's Formula
  3.5 Product Measures
    3.5.1 Definition
    3.5.2 Absolute Continuity and Singularity of Measures
    3.5.3 Kakutani's Theorem
    3.5.4 Absolute Continuity of Gaussian Product Measures

4 General Theory of Stochastic Processes and Random Functions
  4.1 Regular Modifications
    4.1.1 Separable Random Functions
    4.1.2 Continuous Stochastic Processes
    4.1.3 Processes With at Most Jump Discontinuities
    4.1.4 Markov Processes
  4.2 Measurability
    4.2.1 Existence of a Measurable Modification
    4.2.2 Mean-Square Integration
    4.2.3 Expansion of a Random Function in an Orthogonal Series
  4.3 Adapted Processes
    4.3.1 Stopping Times
    4.3.2 Progressive Measurability
    4.3.3 Completely Measurable and Predictable σ-Algebras
    4.3.4 Completely Measurable and Predictable Processes
  4.4 Martingales
    4.4.1 Definition and Simplest Properties
    4.4.2 Inequalities. Existence of the Limit
    4.4.3 Continuous Parameter
  4.5 Stochastic Integrals and Integral Representations of Random Functions
    4.5.1 Random Measures
    4.5.2 Karhunen's Theorem
    4.5.3 Spectral Representation of Some Random Functions

5 Limit Theorems
  5.1 Weak Convergence of Distributions
    5.1.1 Weak Convergence of Measures in Metric Spaces
    5.1.2 Weak Compactness
    5.1.3 Weak Convergence of Measures in R^d
  5.2 Ergodic Theorems
    5.2.1 Measure-Preserving Transformations
    5.2.2 Birkhoff's Theorem
    5.2.3 Metric Transitivity
  5.3 Central Limit Theorem and Invariance Principle
    5.3.1 Identically Distributed Terms
    5.3.2 Lindeberg's Theorem
    5.3.3 Donsker-Prokhorov Theorem

Historic and Bibliographic Comments

References

Introduction

Probability theory arose originally in connection with games of chance, and then for a long time it was used primarily to investigate the credibility of testimony of witnesses in the "ethical" sciences. Nevertheless, probability has become a very powerful mathematical tool for understanding those aspects of the world that cannot be described by deterministic laws. Probability has succeeded in finding strict determinate relationships where chance seemed to reign, and so terming them "laws of chance", a name combining such contrasting notions, appears to be quite justified. This introductory chapter discusses such notions as determinism, chaos and randomness, predictability and unpredictability, and some initial approaches to formalizing randomness, and it surveys certain problems that can be solved by probability theory. This will perhaps give one an idea of the extent to which the theory can answer questions arising in specific random occurrences and of the character of the answers provided by the theory.

1.1 The Nature of Randomness

The phrase "by chance" has no single meaning in ordinary language. For instance, it may mean unpremeditated, nonobligatory, unexpected, and so on.
Its opposite sense is simpler: "not by chance" signifies obliged to or bound to (happen). In philosophy, necessity counteracts randomness. Necessity signifies conforming to law: it can be expressed by an exact law. The basic laws of mechanics, physics and astronomy can be formulated in terms of precise quantitative relations which must hold with ironclad necessity. True, this state of affairs existed in the classical period, when science did not delve into the microworld. But even before that, chance had been encountered in everyday life at practically every step. Birth and death, and even the entire life of a person, is a chain of chance occurrences that cannot be computed or foreseen with the aid of determinate laws. What then can be studied, how can it be studied, and what sort of answers may be obtained in a world of chance? Science can merely treat what is intrinsic in occurrences, and so it is important to extract the essential features of a chance occurrence that we shall take into account in what follows.

1.1.1 Determinism and Chaos

In a deterministic world, randomness must be absent: such a world is absolutely subject to laws that specify its state uniquely at each moment of time. This idea of the world (setting aside philosophical and theological considerations) existed among mathematicians and physicists in the 18th and 19th centuries (Newton, Laplace, etc.). However, such a world was all the same unpredictable because of its complex arrangement: in order to determine a future state, it is necessary to know its present state absolutely precisely, and that is impossible.

It is more promising to apply determinism to individual phenomena or aggregates of them. There is a determinate relationship between occurrences if one entails the other necessarily. The heating of water to 100°C under standard atmospheric pressure, let us say, implies that the water will boil. Thus, in a determinate situation, there is complete order in a system of phenomena or in the objects to which these phenomena pertain. People have observed that kind of order in the motion of the planets (and also of the Moon and Sun), and this order has made it possible to predict celestial occurrences like lunar and solar eclipses. Such order can be observed in the disposition of molecules in a crystal (it is easy to give other examples of complete order). The most precise idea of complete order is expressed by a collection of absolutely indistinguishable objects.

In contrast to a deterministic world would be a chaotic world in which no relationships are present. The ancient Greeks had some notion of such a chaotic world: according to their conception, the existing world arose out of a primary chaos. Again, if we confine ourselves just to some group of objects, then we may regard this system as completely chaotic if the things are entirely distinct. We are excluding the possibility of comparing the objects and ascertaining relationships among them (including even causal relationships). Both of these cases are similar: the selection of one (or several) of the objects from the collection yields no information. In the first case, we know right away that all of the objects are identical, and in the second, the heterogeneity of the objects makes it impossible to draw any conclusions about the remaining ones. Observe that this is not the only way in which these two contrasting situations resemble one another. As might be expected, according to Hegel's laws of logic, these totally contrasting situations describe the exact same situation.
If the objects in a chaotic system are impossible to compare, then one cannot distinguish between them, so that instead of complete disorder we have complete order.

1.1.2 Unpredictability and Randomness

A large number of phenomena exist that are neither completely determinate nor completely chaotic. To describe them, one may use a system of nonidentical …

4.2 Nonlinear Filtering

(a) Discrete time. Suppose that $p_k = P\{\tau = k\}$ and the distribution of the $\varepsilon_k$ are given. The $\varepsilon_k$ will be taken to be independent and identically distributed. If $F(x)$ is the distribution function of $\varepsilon_k$, then it is assumed that $F(x-1) = P\{\varepsilon_k + 1 < x\}$ is absolutely continuous with respect to $F(x)$ and that the function
$$\varphi(x) = \frac{dF(x-1)}{dF(x)}$$
is positive $dF(x)$-almost everywhere. Then $dF(x+1)/dF(x) = 1/\varphi(x+1)$ or, in other words, $F(x)$ is also absolutely continuous with respect to $F(x-1)$. If $F(x-1)$ and $F(x)$ were mutually singular, then we could determine $\tau$ expeditiously, without delay, from the observed values of $x_k = \theta_k + \varepsilon_k$ as follows. Let $A$ be a Borel set such that $P\{\varepsilon_1 \in A\} = 0$, $P\{\varepsilon_1 + 1 \in A\} = 1$. Then $\tau = k$ if $I_A(x_i) = 0$ when $i < k$ and $I_A(x_k) = 1$. Therefore assuming the equivalence of $F(x-1)$ and $F(x)$ makes the problem more meaningful.

It is natural to imagine that there is a loss $a_{nk} > 0$ incurred by having decided $\tau = k$ and having stopped the process at that time when in fact $\tau = n$. The problem is then to minimize the loss. We shall seek a sequential solution to the problem: at each moment of time, a decision is made to stop the process or to continue it, and the decision is made on the basis of observations of the process up to that time. Introduce the variables
$$z_k = \prod_{i=1}^{k} \frac{1}{\varphi(x_i)}, \qquad z_0 = 1.$$
This is also obviously an observable sequence. It turns out that to make a decision it is sufficient to know this sequence, since the conditional distribution of $\tau$ given $x_1,\dots,x_n$ can be expressed in terms of $z_1,\dots,z_n$ as follows:
$$P\{\tau = m \mid x_1,\dots,x_n\} = \frac{p_m z_{m-1}}{\sum_{i=1}^{n} p_i z_{i-1} + z_n \sum_{j=n+1}^{\infty} p_j}, \qquad m \le n,$$
$$P\{\tau = m \mid x_1,\dots,x_n\} = \frac{p_m z_n}{\sum_{i=1}^{n} p_i z_{i-1} + z_n \sum_{j=n+1}^{\infty} p_j}, \qquad m > n. \qquad (4.2.2)$$
The expressions (4.2.2) are the filtering formulas for the change-point problem. Knowing these probabilities, we can evaluate the quantity
$$W_n(x_1,\dots,x_n) = \sum_m a_{mn}\, P\{\tau = m \mid x_1,\dots,x_n\},$$
the expected loss if the process is stopped at time $n$ given the observations $x_1,\dots,x_n$. There now remains the optimum stopping problem for the sequence $\{\eta_n = W_n(x_1,\dots,x_n),\ n \ge 1\}$: find a stopping time $\zeta$ that minimizes $E\eta_\zeta$. This kind of problem was treated in the theory of controlled processes.
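The posterior (4.2.2) is easy to evaluate numerically. The sketch below is not from the book: it assumes, purely for illustration, standard normal noise, so that $\varphi(x) = dF(x-1)/dF(x) = e^{x-1/2}$ and $1/\varphi(x) = e^{1/2-x}$, together with a geometric prior for $\tau$ truncated to a long array; the function name and the simulated data are likewise illustrative choices.

```python
import numpy as np

def changepoint_posterior(x, p, phi):
    """Posterior of the change time tau given x_1..x_n, formula (4.2.2).

    x   : observed values x_k = theta_k + eps_k
    p   : prior probabilities, p[m-1] = P{tau = m}; assumed long enough
          that the mass beyond len(x) is approximately 1 - sum(p[:n])
    phi : likelihood ratio phi(x) = dF(x-1)/dF(x)
    Returns (posterior over m = 1..n, total mass P{tau > n | x}).
    """
    n = len(x)
    # z_k = prod_{i<=k} 1/phi(x_i), with z_0 = 1
    z = np.concatenate(([1.0], np.cumprod(1.0 / phi(np.asarray(x)))))
    tail = 1.0 - np.sum(p[:n])                      # sum_{j>n} p_j
    denom = np.sum(p[:n] * z[:n]) + z[n] * tail
    return p[:n] * z[:n] / denom, z[n] * tail / denom

# Illustrative data: geometric prior, standard normal noise, true tau = 30.
rng = np.random.default_rng(0)
q = 0.05
p = q * (1.0 - q) ** np.arange(10_000)              # P{tau = m} = q(1-q)^(m-1)
tau = 30
k = np.arange(1, 61)
x = rng.normal(size=60) + (k >= tau)                # x_k = eps_k + I{k >= tau}

post, beyond = changepoint_posterior(x, p, phi=lambda t: np.exp(t - 0.5))
print("posterior mode:", int(np.argmax(post)) + 1, " P{tau > n | x}:", round(beyond, 4))
```

The expected loss $W_n$ of the text is then just a weighted sum of these posterior probabilities, and the optimum stopping time $\zeta$ would be obtained by solving the associated optimal stopping problem for $\eta_n$; neither step is attempted in this sketch.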
(b) Continuous time. In the case of continuous time, it is natural to replace $\theta_t$, $\varepsilon_t$ and $x_t$ by integrals involving these processes (instead of a process with independent values, $\varepsilon_t$ is considered to be a process with independent increments). Therefore the observed process (denoted as before by $x_t$) is taken to be $x_t = \int_0^t I\{\tau \le s\}\,\dots$

…

For $m > n$,
$$P\{\theta_m = j \mid x_0,\dots,x_n\} = \frac{\sum_{i,i'} \hat\alpha_n(i, x_0,\dots,x_n, i')\,p_{i'j}(m-n)}{\alpha_n(x_0,\dots,x_n)}. \qquad (4.2.12)$$
Thus the requisite conditional probabilities are expressed in terms of the functions $\hat\alpha_n(i, x_0,\dots,x_n, j)$, which can be computed recursively using (4.2.9), (4.2.8), (4.2.10), (4.2.11) and (4.2.12), with $\hat\alpha_0(i, y_0, j) = \varphi_i(y_0)\,\delta_{ij}$. Formulas (4.2.8)–(4.2.12) are the filtering equations for a time-discrete Markov chain. The advantage of these equations is that all of the conditional probabilities are expressed in terms of the exact same function $\hat\alpha_n(i, x_0,\dots,x_n, j)$ (taking (4.2.8) into account), satisfying the recursion equation (4.2.9). This fact makes it possible to carry these formulas over to continuous time.

(b) Continuous time. Now suppose that $\{\theta_t,\ t \ge 0\}$ is a homogeneous Markov process with $r$ states $\{1,2,\dots,r\}$ and transition probabilities $p_{ij}(t)$, $i,j \in \{1,\dots,r\}$, satisfying the relation
$$\lim_{t \downarrow 0} \frac{p_{ij}(t) - \delta_{ij}}{t} = a_{ij}.$$
Consequently, the probabilities $p_i(t) = P\{\theta_t = i\}$ satisfy the forward Kolmogorov equation
$$\frac{d}{dt}\, p_i(t) = \sum_j p_j(t)\, a_{ji}.$$
Let $c_i$, $i \in \{1,\dots,r\}$, be a real function. The observed process is
$$x_t = \int_0^t c(\theta_s)\,ds + \xi_t, \qquad (4.2.13)$$
where $\xi_t$ is a Wiener process for which $E\xi_t = 0$ and $V\xi_t = bt$. Our aim is to construct the best estimate for $c(\theta_s)$ from observations of the process $x_u$ on the interval $[0,t]$ or, in other words, to find $E(c(\theta_s) \mid x_u,\ u \le t)$.

It is convenient to denote the path of the process $\theta_u$ on $[s,t]$ by $\theta^s_t$. Similarly, $x^s_t$ is the path of $x_u$ on $[s,t]$. Let
$$\alpha(t, \theta^0_t, x^0_t) = \lim_{n\to\infty} \alpha_n(\theta^h_0,\dots,\theta^h_n, x^h_0,\dots,x^h_n) = \lim_{n\to\infty} \prod_{k=0}^{n} \varphi^h_{\theta^h_k}(x^h_k),$$
where $h = t/n$, $\theta^h_k = \theta_{kh}$, $x^h_k = x_{kh} - x_{(k-1)h}$, and
$$\varphi^h_\theta(x) = \frac{1}{\sqrt{2\pi bh}} \exp\Bigl(-\frac{(x - hc(\theta))^2}{2bh}\Bigr) = \exp\Bigl(\frac{c(\theta)x}{b} - \frac{c^2(\theta)h}{2b}\Bigr)\,(2\pi bh)^{-1/2} \exp\Bigl(-\frac{x^2}{2bh}\Bigr).$$
Thus
$$\alpha(t, \theta^0_t, x^0_t) = \exp\Bigl(\frac{1}{b}\int_0^t c(\theta_s)\,dx_s - \frac{1}{2b}\int_0^t c^2(\theta_s)\,ds\Bigr). \qquad (4.2.14)$$
The first integral on the right is defined for any continuous function since $c(\theta_s)$ is a step-function. Let
$$\hat\alpha(i, t, y^0_t, j) = E\bigl(\alpha(t, \theta^0_t, y^0_t) \mid \theta_0 = i,\ \theta_t = j\bigr), \qquad (4.2.15)$$
where $y_u$ is an arbitrary continuous random function. Taking the limit in (4.2.10)–(4.2.12) after multiplying by $c(j)$ and summing over $j$, we find that
$$E(c(\theta_t) \mid x^0_t) = \frac{1}{\alpha_t(x^0_t)} \sum_{i,j} c(j)\,\hat\alpha(i, t, x^0_t, j)\,p_i(0)\,p_{ij}(t), \qquad (4.2.16)$$
$$E(c(\theta_s) \mid x^0_t) = \frac{1}{\alpha_t(x^0_t)} \sum_{i,j,k} c(j)\,\hat\alpha(i, s, x^0_s, j)\,\hat\alpha(j, t-s, x^s_t, k)\,p_i(0)\,p_{ij}(s)\,p_{jk}(t-s), \qquad s < t, \qquad (4.2.17)$$
and
$$E(c(\theta_s) \mid x^0_t) = \frac{1}{\alpha_t(x^0_t)} \sum_{i,j,k} c(j)\,\hat\alpha(i, t, x^0_t, k)\,p_i(0)\,p_{ik}(t)\,p_{kj}(s-t), \qquad s > t, \qquad (4.2.18)$$
where
$$\alpha_t(x^0_t) = \sum_{i,k} \hat\alpha(i, t, x^0_t, k)\,p_i(0)\,p_{ik}(t). \qquad (4.2.19)$$
To determine the functions $\hat\alpha(i, t, x^0_t, k)$, in terms of which the requisite conditional probabilities are expressed, it is convenient to introduce the functions
$$\beta_{ij}(t) = \hat\alpha(i, t, x^0_t, j)\,p_{ij}(t). \qquad (4.2.20)$$
Passage to the limit in (4.2.9) yields a system of stochastic differential equations for $\beta_{ij}(t)$:
$$d\beta_{ij}(t) = \sum_k \beta_{ik}(t)\,a_{kj}\,dt + \beta_{ij}(t)\,c_j\,dx_t, \qquad (4.2.21)$$
which must be solved subject to the initial condition $\beta_{ij}(0) = \delta_{ij}$. Such equations have been studied in the theory of Markov processes.
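As a rough numerical illustration (not from the book), the sketch below simulates a two-state chain, integrates the system (4.2.21) with a simple Euler step, and forms $E(c(\theta_t) \mid x^0_t)$ from (4.2.16) and (4.2.19), using the fact that both are linear in $p_i(0)\beta_{ij}(t)$. It assumes $b = 1$; the generator, the values of $c_i$, the step size and the per-step renormalization of $\beta$ (a numerical convenience that cancels in the ratio) are choices introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative two-state setup (b = 1).
A = np.array([[-0.5,  0.5],      # generator a_{ij}; rows sum to zero
              [ 1.0, -1.0]])
c = np.array([0.0, 2.0])         # the function c_i on the states
p0 = np.array([2/3, 1/3])        # initial distribution p_i(0)
dt, n = 1e-3, 20_000

# Simulate theta_t and the observation increments dx_t of (4.2.13).
theta = np.empty(n, dtype=int)
theta[0] = rng.choice(2, p=p0)
dx = np.empty(n)
for k in range(n):
    dx[k] = c[theta[k]] * dt + np.sqrt(dt) * rng.normal()        # b = 1
    if k + 1 < n:
        i = theta[k]
        theta[k + 1] = 1 - i if rng.random() < -A[i, i] * dt else i

# Euler step for (4.2.21): d beta = beta A dt + beta diag(c) dx_t, beta(0) = I.
beta = np.eye(2)
est = np.empty(n)
for k in range(n):
    w = p0 @ beta                # w_j = sum_i p_i(0) beta_{ij}(t)
    est[k] = (w @ c) / w.sum()   # E(c(theta_t) | x_t^0) by (4.2.16), (4.2.19)
    beta = beta + (beta @ A) * dt + (beta * c) * dx[k]
    beta /= beta.sum()           # rescaling; drops out of the ratio above

print("mean absolute filtering error:", round(float(np.mean(np.abs(est - c[theta]))), 3))
```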
Historic and Bibliographic Comments

General questions of mathematical statistics are handled by Cramér (1974), Neyman (1950), Van der Waerden (1969) and Zacks (1971). Jerzy Neyman was one of the eminent statisticians of the twentieth century, and his book familiarizes the nonspecialist with the essence of probability and statistical problems in a way both simple and profound. The main thrust of Cramér (1974) is his formulation of statistical problems and presentation of methods for solving them; the book contains many interesting and meaningful examples. Zacks (1971) treats the concepts of statistics, the related problems and also ways of solving them; the book is theoretical and is intended for specialists. Also touching on the content of the first chapter is the book by Wald (1947), which contains a presentation of sequential-analysis methods for testing statistical hypotheses.

The books by Bellman (1957), Dynkin (1975), Gikhman (1977), Kalman (1969), Kushner (1967), Krylov (1977) and Shiryaev (1976) consider problems on controlled processes. Bellman derives equations for the control cost of controlled Markov random processes. Gikhman (1977) gives the general theory of time-discrete and time-continuous controlled processes, including both Markov chains and Markov processes. Dynkin (1975) presents the general concepts of controlled Markov processes, derives the equations for the control cost and proves the existence of optimum controls. Kalman (1969) gives in particular an introduction to the theory of controlled random processes. Krylov's book (1977) is devoted to controlled processes defined by stochastic differential equations of diffusion type; he studies the nonlinear partial differential equations which are Bellman's equations for an optimum control. The book is intended for specialists. A main theme of Shiryaev's book (1976) is the optimum stopping of a Markov chain or Markov process; in particular, it solves the change-point problem.

Problems in information theory are considered by Feinstein (1958), McMillan (1953) and Shannon (1948). Feinstein discusses the basic concepts and theorems of information theory. McMillan's article is devoted to the capacity of a Markov-type communication channel and proves a corresponding version of Shannon's theorem.

The books and papers by Bucy (1965), Cramér (1940), Kolmogorov (1941), Liptser (1974), Wiener (1949) and Yaglom (1952) are devoted to filtering. Bucy discusses the theory of nonlinear filtering. Liptser (1974) contains much material on martingale theory and stochastic equations; its basic aim is to construct a theory of nonlinear filtering, and it is intended for specialists. Wiener (1949) presents the theory of extrapolation, interpolation, filtering and smoothing of stationary sequences, and methods based on the factorization of analytic functions; the general theory is illustrated by the solution of engineering problems. Kolmogorov (1941) reduces problems involving stationary sequences to problems of analysis in Hilbert spaces; this is a fundamental approach to solving them. Cramér's article (1940) presents some solutions to basic problems involving stationary processes. Yaglom's large review article (1952) contains the basic results, with complete proofs, including those due to the author, on the theory of stationary processes.

Author Index

Arzelà, C., 136
Bayes, T., 22ff, 209ff
Bellman, R., 223, 230, 234, 273
Bernoulli, J., 16, 24, 25, 61, 67
Bernstein, S.N., 23
Birkhoff, G.D., 126
Bochner, S., 45ff, 118
Boltzmann, L., 20
Borel, E., 29ff, 56ff, 82, 85, 93ff, 139, 141, 161, 180, 202ff, 267
Bose, S., 20
Buffon, G., 29
Cantelli, F.P., 56, 65, 66, 82, 97, 180
Cauchy, A., 64, 90, 145, 166, 167, 182, 184
Chapman, D.G., 99, 149ff
Chebyshev, P.L., 35, 60, 61, 197
Clapeyron, B.
Dirac, P., 20
Dirichlet, P.G., 185
Donsker, M., 135
Doob, J.L., 94, 139ff
Einstein, A., 20
Fatou, P., 90
Fermi, E., 20
Fourier, J.B.J., 44, 75, 76, 145, 206, 262ff
Fubini, G., 88, 100, 102
Gikhman, I.I., 139, 145, 273
Hegel, G.W.F.
Itô, K., 145
Jordan, C., 88
Kac, M., 145, 167, 168, 187, 189
Kakutani, S., 88
Karhunen, K., 116, 118
Khinchin, A.Ya., 145, 187, 189
Kolmogorov, A.N., 14ff, 48, 56, 61ff, 82, 87, 94ff, 139, 141ff, 165, 187, 189, 200, 201, 271, 274
Laplace, P., 6, 26, 40, 59, 68, 176
Lebesgue, H., 29, 37ff, 93, 102, 121, 179
Lévy, P., 79, 83, 86
Lindeberg, J.W., 133ff
Lyapunov, A.A., 134
Markov, A.A., 98, 99, 125, 139, 145ff, 165ff, 177, 178, 187, 189, 223ff, 240ff, 269ff
Maxwell, J.C., 8, 20
Minlos, R.A., 51
Mises, R. von, 14, 67
Moivre, A. de, 26, 40
Newton, I.
Neyman, J., 203ff, 273
Nikodym, D., 15, 32, 87
Pearson, E.S., 203ff
Petrovsky, I.G., 145, 187, 189
Plato, 13
Poisson, S., 26, 39, 83, 85
Prokhorov, Yu.V., 135ff
Radon, J., 15, 32, 87, 121
Riemann, B., 101ff
Sazonov, V.V., 51
Shannon, C., 249ff, 273ff
Smirnov, N.V., 200, 201
Stirling, J., 26
Wiener, N., 74, 84, 85, 104, 135, 145, 162, 176ff, 202, 268ff
Yaglom, A.M., 262, 274
Subject Index

A ∨ B, 54
σ-algebra (field), 14, 27
  Borel, 29
  cylinder, 47
  predictable, 107
  tail, 56
ε-entropy, 240
ε-optimum, 218
a ∧ b, a ∨ b, 75
I_A, 12, 88
ν ≪ µ, 88
µ_n ⇒ µ, 119
Absolute continuity, 15
Adapted processes, 104ff
Adaptedness, 106
Additive control cost, 223
Algebra(s), 24ff, 46, 47, 53ff, 100ff, 140ff, 210
  independent, 54
  of events, 11, 24
  product, 24
Almost surely, 35
Amount of information, 235ff
Analytic set, 221, 222
Arithmetic distribution, 69
Atom, 21, 28
Axioms, 21, 28
  Kolmogorov, 14
  von Mises, 14
Backward (first) equation, 156ff
Bayes theorem, 22
Bayesian decision, 209
Bellman's equation, 223, 230
Bernoulli's scheme, 24
Bernoulli's theorem, 16, 25
Binomial distribution, 25, 26, 39
Binomial probabilities, 22
Birkhoff's theorem, 126
Bochner's theorem, 45
Borel σ-algebra, 29
Borel-Cantelli lemma, 56
Bose-Einstein statistics, 20
Brownian motion, 8, 84
Buffon's problem, 29
Cauchy problem, 145
Causes (hypotheses), 22
Central limit theorem, 132, 202
Central moment, 31
Change-point problem, 266ff, 273
Channel capacity, 246ff
  ergodic, 254
Chaos
Chapman-Kolmogorov equation, 99, 149ff
Characteristic function, 44ff, 59, 69, 74, 75, 85–87, 132
  general, 86
  joint (n-dimensional), 42ff
Characteristic functional, 44ff
Charge, 150
Chebyshev's inequality, 35
Chebyshev's theorem, 60
Coding, 248
Communication channel, 244
  determinate (noiseless), 245
  noisy, 245
Complete
  group of events, 22ff
  measurable process, 108
  set of functions, 45, 57, 88, 122
Completeness, 35, 105
Conditional expectation, 31–33, 129, 136
Conditional probability, 21ff, 32ff, 213ff, 238
  regular, 34
Confidence interval, 197
Consistency, 42, 49
Continuous random variable, 29ff
Control cost, 216ff
  additive, 223
Control strategy, 217
Controlled diffusion process, 233
Controlled stochastic process, 215
  discrete, 215
  Markov, 223
Convergence
  almost surely, 35
  in probability, 35
  of sequences, 82
  weak, 119
  with probability 1, 13, 16, 35
Convolution equation, 74
Covariance (function), 42ff, 101ff, 162, 176, 178, 202, 205, 259
Critical region, 203, 204
Cylinder σ-algebra, 47
Cylinder set, 47ff
Cylinder set base, 50
DeMoivre-Laplace theorem, 26
Decision function, 208ff
  minimax, 209
Decoding, 248ff
Delay time, 40
Density, 30, 32, 39–41, 44, 88, 155, 162, 176, 204, 205, 212, 240ff, 260ff
Denumerable homogeneous Markov process, 161
Determinism
Diffusion coefficients, 161ff, 171, 178, 233, 234
Diffusion operator, 162, 165
Diffusion process, 161ff, 171, 178, 233
Dirichlet problem, 173
Discrete random variable, 30
Disorder problem, see Change-point problem, 266
Distribution, 17, 19, 26ff, 58ff, 64ff, 74ff, 85ff, 93, 99ff, 104, 120ff, 132ff, 147ff, 155, 162, 167, 177, 189, 196ff, 216ff, 223, 226, 240, 245, 251, 267ff
  n-dimensional, 41
  continuous, 155, 240
  discrete, 39
  exponential, 39, 40
  geometric, 39
  normal (Gaussian), 40, 41
  Poisson, 26, 39, 83
  symmetric, 62
  uniform, 39, 41
  weak, 50, 52
Distribution function(s), 30, 38, 39, 49, 59, 198
  finite-dimensional, 42, 43, 48, 124, 150
  joint, 41, 42, 58
  sample (empirical), 198, 199
Distribution span, 69, 71
Donsker-Prokhorov theorem, 135
Elementary events, 11ff, 19ff, 236
Elliptic equations, 185
Empirical distribution function, 198ff
Encounter problem, 29
Entropy, 235ff, 249ff
  conditional, 238, 241
  properties, 237
Equal likelihood, 10, 28
Ergodic channel, 253ff
Ergodic theorem, 17
  maximal, 127
Estimator, 196ff, 209, 265
  unbiased, 196ff
Events, 7, 10ff, 19ff, 38, 39, 53ff, 61ff, 71, 80, 96, 105, 113, 147, 149, 195, 225, 236, 238
  complete group, 22ff
  mutually independent, 23
Exit time (first), 154, 170
Expectation, 30ff, 60, 68, 110, 111, 129, 132, 140ff, 166, 175ff, 186, 208, 211, 218, 266
Expected value, see Expectation
Fermi-Dirac statistics, 20
Filtering, 257, 266ff
  nonlinear, 274
  nonpredictive, 265
Finite-dimensional distributions, 48, 99, 125, 149, 151, 228
  consistency, 48
Flow of σ-algebras, 105, 108, 170
Forward (second) equation, 156, 159, 271
Fundamental sequence, 37
Gaussian distribution, 40ff, 87, 202, 205
Gaussian measure, 87, 91
Gaussian random function, 44, 104
Generating function, 60
Geometric distribution, 39
Geometrical probabilities, 28
Goodness-of-fit, 200
Harmonic function, 226
Homogeneous Markov chain, 125, 226
  denumerable, 161
  temporally, 151
Independence, 15, 16, 23, 53ff, 137, 139, 177
Independent algebras, 24, 53
Independent increments, 78ff, 104, 110, 140, 176ff, 268
  stationary, 86, 110
Independent random elements, 57
Indicator function, 12, 28
Indistinguishable processes, 109
Interval estimation, 197
Invariance principle, 132
Jordan decomposition, 88
Jump component, 85
Kac's formula, 167
Karhunen theorem, 116
Kolmogorov equation(s), 156, 162, 271
Kolmogorov's inequality, 61
Kolmogorov's theorems, 48, 96
Kolmogorov's three-series theorem, 65
Kolmogorov's zero-one law, 56
Kolmogorov-Smirnov test, 200
Ladder functionals, 74
Laplace transform, 59, 68
Law of large numbers, 10, 16, 17, 60, 61, 130
  strong, 65, 199
Law of normal fluctuations, 17
Law of rare events, 26
Lévy's decomposition, 79
Lévy's formula, 83
Lindeberg's condition, 133
Lindeberg's theorem, 133
Lyapunov's theorem, 134
Markov chain, 125
  controlled, 223ff
  filtering, 269
Markov process
  controlled, 234
  definition, 98
  filtering, 269
  homogeneous (with stationary transition probabilities), 151, 225ff, 242, 271
  realization, 225
Markov property, 125
Martingale, 110ff, 150ff, 164ff, 182ff, 274
  uniformly integrable, 114
Maximal ergodic theorem, 127
Maxwell distribution
Maxwell-Boltzmann statistics, 20
Mean, 5, 30, 41ff, 69, 83ff, 101ff, 132, 162, 179, 199ff, 248, 258
Measurable mapping, 38, 42, 87, 124
Measurable random function, 100
Measure(s), 15, 24, 38, 45ff, 57, 86ff, 102, 116, 119, 121ff, 136, 140, 145ff, 156, 159, 165, 177, 202ff
  absolute continuity of, 87
  continuity set of, 120
  derivative of, 88
  ergodic, 131
  invariant, 125, 140
  orthogonal, 88
  product, 88, 91, 206
  Radon, 121
  singular, 87, 88
  tight, 121
  weakly compact, 122
Measure-preserving transformation, 124
Message(s), 205, 236, 237, 244ff
Metric transitivity, 130
Minimax decision function, 209
Minimax risk, 210
Minlos-Sazonov theorem, 51
Modification, 79, 80, 93, 97ff, 109, 110, 135, 153, 161, 205
  measurable, 100ff, 109, 110
  of martingales, 114, 115
  regular, 93
Moment, 31
Moment function, 43
Monotone collection, 27, 109
  theorem, 27
Multinomial probabilities, 25
Neyman-Pearson theorem, 204
Noisy (noiseless) channel, 245ff
Nonarithmetic distribution, 70
Nonlinear filtering, 274
Nonpredictive filtering, 265
Nonrandomized control, 225
Normal distribution, 44, 132, 133, 162, 200, 205, 206
Null hypothesis, 202
Optimum (ε-)control, 216, 225ff, 234
Optimum stopping, 273
Optimum strategy, 218
Order statistics, 198, 199
Overshoot, 74, 77
Overshoot time, 74
Parabolic equations, 145
Phase space, 38, 147
Pointwise estimation, 196
Poisson distribution, 26, 39
Poisson process, 83
Poisson's theorem, 26
Positive-definite function, 45
Power, 203
Predictable σ-algebra, 107
Predictable process, 108
Predictable sets, 108
Predictable stopping time, 106ff
  completely, 106
Prediction, 17, 257ff
Probability, 7ff, 30ff, 61ff, 73, 77ff, 93ff, 110ff, 123ff, 145ff, 177ff, 187ff, 193ff, 225, 229, 235ff, 249ff, 266, 269, 273
  definition, 13
  geometrical, 29
  of type I, type II errors, 202
  transition, 99, 125, 126, 148ff, 156ff, 177, 187, 225, 229, 269
Probability space(s), 24, 29, 46, 64, 140
  family of, 147
Process with independent increments, 78, 80, 83, 90, 104, 110, 268
  discrete, 78
Product measures, 86ff, 91, 206
Product of algebras, 24
Progressive measurability, 106
Purely discontinuous process, 156, 159
Radon measure, 121
Radon-Nikodym theorem, 15, 32
Random element(s), 34, 38, 49, 57, 120, 121ff
  distribution of, 44
  independent, 53, 57
Random experiment, 10ff, 38
Random field, 42
Random function, 17, 42ff, 93ff, 100ff, 116, 259, 272
  Gaussian, 44
  measurable, 47
  regular, 94
  separable, 94
Random mapping, 38
Random measure, 115ff
Random polygonal path, 135
Random process, see Stochastic process
Random sample, 196, 198
Random variable(s), 29ff, 44, 53, 59ff, 78, 100, 110ff, 132, 139, 140, 198, 202, 207, 241ff, 258, 269
  discrete, 30
  distribution of, 30
  independent, 59, 60ff, 78, 132, 140
Random vector, 40ff, 241
Random walk, 67, 71ff, 227, 228
  arithmetic, 71
  recurrent, 71
  semibounded, 77
Randomized strategy, 217
Ratio theorem, 127
Recurrency, 74
Regular modification, 94
Relative frequency, 12ff, 22, 67, 133, 196ff
Renewal function, 68
Renewal scheme, 67
Renewal time, 68
Risk function, 208
Sample data, 195ff, 208
Sample distribution function, 198
Sample mean, 199, 209
Sample moment, 199
Sample points, 11
Sample quantile, 199
Sample space, 11
Sample standard deviation, 199
Sample variance, 199
Sampling, 199
Sampling parameters, 199
Self-similar process, 178
Semibounded walk, 77
Semimartingale, 111
Separable random functions, 94
Sequential decision, 211
Shannon's theorem, 249
Shifting operator, 124, 130
Sigma algebra, 27
  cylinder, 48
  predictable, 107
  tail, 56
Signal detection, 205
Space of elementary events, 11
Spectral density, 260, 265
Spectral function, 259
State space, 147
Stationary process, 130, 244, 254
  ergodic, 130
Stationary sequence, 125, 130, 131, 239ff, 259, 264
  ergodic, 131
  strict-sense, 124
  wide-sense, 117, 259
Statistic, 196ff
  sufficient, 196, 198
Step-controls, 228, 230, 233
Stirling's formula, 26
Stochastic analysis, 17
Stochastic equivalence, 93
Stochastic integral, 116, 179, 181
Stochastic interval, 107
Stochastic process(es), 42, 43, 53, 67, 104, 139, 140, 145, 193, 215
  consistent, 49
  continuous, 79
  controlled, 193, 215
Stochastically continuous process, 80, 83, 99
Stopping time, 105ff, 170, 211, 225, 227, 267
  predictable, 106ff
Strategy, 217, 218, 225, 229, 230
Subharmonic function, 227
Submartingale, 111ff, 227
Sufficient statistic, 196, 198
Superharmonic function, 226
Supermartingale, 111ff
Symmetrization, 64, 82
Tail σ-algebra, 56
Tests of hypotheses, 202ff
Three-series theorem, 78, 79
Tightness, 121
Total probability formula, 22
Transition probability, 99, 125, 126, 148ff, 162, 177, 187, 225, 269
Transport coefficient (vector), 162, 165
Type I, type II probability errors, 202
Unbiased estimator, 196ff
Uniform distribution, 39, 41
Uniform stochastic continuity, 79
Uniform tightness, 121
Uniformly integrable, 37, 91, 114
  martingale, 114
Unimprovable tests, 204
Variance, 31, 40, 60, 85, 87, 91, 132, 133, 162, 196, 199, 200, 206, 209
Variational series, 198
Weak compactness, 122, 136
Weak convergence, 119
Weak distribution, 50, 52
Weak-continuity condition, 220, 224
Wiener measure, 178
Wiener process, 176ff
