
Concepts of Probability Theory, Second Revised Edition (Dover Books on Mathematics)


DOCUMENT INFORMATION

Basic information

Pages: 599
File size: 11.08 MB

CONTENT


CONCEPTS OF PROBABILITY THEORY

PAUL E. PFEIFFER
Department of Mathematical Sciences, Rice University

Second Revised Edition

DOVER PUBLICATIONS, INC., NEW YORK

To Mamma and Daddy, who contributed to this book by teaching me that hard work can be satisfying if the task is worthwhile.

Copyright © 1965 by Paul E. Pfeiffer
Copyright © 1978 by Paul E. Pfeiffer
All rights reserved.

Bibliographical Note: This Dover edition, first published in 1978, is an unabridged republication of the work originally published by McGraw-Hill Book Company, New York, in 1965. This Dover edition has been extensively revised and updated by the author.

Library of Congress Catalog Card Number: 78-53757
ISBN 13: 978-0-486-63677-1
ISBN 10: 0-486-63677-1

Manufactured in the United States by Courier Corporation
www.doverpublications.com

Preface

My purpose in writing this book is to provide for students of science, engineering, and mathematics a course at the junior or senior level which lays a firm theoretical foundation for the application of probability theory to physical and other real-world problems. Mastery of the material presented should provide an excellent background for further study in such rapidly developing areas as statistical decision theory, reliability theory, dynamic programming, statistical game theory, coding and information theory, communication and control in the presence of noise, etc., as well as in classical sampling statistics. My teaching experience has shown that students of engineering and science can master the essential features of a precise mathematical model in a way that clarifies their thinking and extends their ability to make significant applications of the theory. While the ultimate aim of the development in this book is the application of probability theory to problems of practical import and interest, the central task undertaken is an exposition of the basic concepts of probability theory, including a substantial
introduction to the idea of a random process. Rather than provide a treatise on applications and techniques, I have attempted to provide a clear development of the fundamental concepts and theoretical perspectives which guide the formulation of problems and the discovery of methods of solution. Considerable attention is given to the task of translating real-world problems into the precise concepts of the model, so that the problem is stated unambiguously and may be attacked with all the resources provided by the mathematical theory as well as physical insight.

The rich theory of probability may be constructed on the essentially simple conceptual framework of that mathematical model generally known as the Kolmogorov model. The central features of this model may be grasped with the aid of certain graphical, mechanical, and notational representations which facilitate the formulation and visualization of concepts and relationships. Highly sophisticated techniques are seen as the means of performing conceptually simple tasks. The whole theory is formulated in a way that makes contact with both the literature of applications and the literature on pure mathematics. At many points I have borrowed specifically and explicitly from one or the other; the resulting treatment points the way to extend such use of the vast reservoir of knowledge in this important and rapidly growing field.

In introducing the basic model, I have appealed to the notion of probability as an idealization of the concept of the relative frequency of occurrence of an event in a large number of repeated trials. In most places, the primary interpretation of probability has been in terms of this familiar concept. I have also made considerable use of the idea that probability indicates the uncertainty regarding the outcome of a trial before the result is known. Various thinkers are currently advocating other approaches to formulating the mathematical model, and hence to interpreting its features. These alternative ways
of developing and interpreting the model do not alter its character or the strategies and techniques for dealing with it. They serve to increase confidence in the usefulness and “naturalness” of the model and to point to the desirability of achieving a mastery of the theory based upon it.

The background assumed in the book is provided in the freshman and sophomore mathematics courses in many universities. A knowledge of limits, differentiation, and integration is essential. Some acquaintance with the rudiments of set theory is assumed in the text, but an Appendix provides a brief treatment of the necessary material to aid the student who does not have the required background. Many students from the high schools offering instruction in the so-called new mathematics will be familiar with most of the material on sets before entering the university. Although some applications are made to physical problems, very little technical background is needed to understand these applications. The book should be suitable for a course offered in an engineering or science department, or for a course offered by a department of mathematics for students of engineering or science. The practicing engineer or scientist whose formal education did not provide a satisfactory course in probability theory should be able to use the book for self-study.

It has been my personal experience, as well as my observation of others, that success in dealing with abstract systems rests in large part on the ability to find concrete mental images and constructs which serve as aids in visualizing, remembering, and relating the abstract concepts and ideas. This being so, success in teaching abstract systems depends in similar measure on making explicit use of the most satisfactory images, diagrams, and other conceptual aids in the act of communicating ideas. The literature on probability—both works on pure mathematics and on practical applications—contains a number of such aids to clear thinking, but these aids have not
always been exploited fully and efficiently. I can lay little claim to originality in the sense of novelty of ideas or results. Yet I believe the synthesis presented in this book, with its systematic exploitation of several ideas and techniques which ordinarily play only a marginal role in the literature known to me, provides an approach to probability theory which has some definite pedagogical advantages. Among the features of this presentation which may deserve mention are:

1. A full exploitation of the concept of probability as mass; in particular, the idea that a random variable produces a point-by-point mass transfer from the basic probability space is introduced and utilized in a manner that has proved helpful.
2. Exploitation of minterm maps, minterm expansions, binary designators, and other notions and techniques from the theory of switching, or logic, networks as an aid to systematizing the handling of compound events.
3. Use of the indicator function for events (sets) to provide analytical expressions for discrete-valued random variables.
4. Development of the basic ideas of integration on an abstract space to give unity to the various expressions for mathematical expectation. The mass picture is exploited here in a very significant way to make these ideas comprehensible.
5. Development of a calculus of mathematical expectations which simplifies many arguments that are otherwise burdened with unnecessary details.

None of these ideas or approaches is new. Some have been utilized to good advantage in the literature. What may be claimed is that the systematic exploitation in the manner of this book has provided a treatment of the topic that seems to communicate to my students in a way that no other treatment with which I am familiar has been able to. This work is written in the hope that this treatment may be equally helpful to others who are seeking a more adequate grasp of this fascinating and powerful subject.

Acknowledgments

My indebtedness to the current literature is so great
that it surely must be apparent. I am certainly aware of that debt. Among the many helpful comments and suggestions by students and colleagues, I am particularly grateful for my discussions with Dr. A. J. Welch and Dr. Shu Lin, graduate students at the time of writing. The critical reviews—from different points of view—by Dr. H. D. Brunk, my former professor, and Dr. John G. Truxal, as well as those by several unnamed reviewers, have aided and stimulated me to produce a better book than I could have written without such counsel. The patient and dedicated work of Mrs. Arlene McCourt and Mrs. Velma T. Goodwin has been an immeasurable aid in producing a manuscript. I can only hope that the pride and care they showed in their part of the work is matched in some measure by the quality of the contents.

PAUL E. PFEIFFER

Contents

PREFACE

Chapter 1. Introduction
1-1. Basic Ideas and the Classical Definition
1-2. Motivation for a More General Theory
     Selected References

Chapter 2. A Mathematical Model for Probability
2-1. In Search of a Model
2-2. A Model for Events and Their Occurrence
2-3. A Formal Definition of Probability
2-4. An Auxiliary Model—Probability as Mass
2-5. Conditional Probability
2-6. Independence in Probability Theory
2-7. Some Techniques for Handling Events
2-8. Further Results on Independent Events
2-9. Some Comments on Strategy
     Problems
     Selected References

Chapter 3. Random Variables and Probability Distributions
3-1. Random Variables and Events
3-2. Random Variables and Mass Distributions
3-3. Discrete Random Variables
3-4. Probability Distribution Functions
3-5. Families of Random Variables and Vector-valued Random Variables
3-6. Joint Distribution Functions
3-7. Independent Random Variables
3-8. Functions of Random Variables
3-9. Distributions for Functions of Random Variables
3-10. Almost-sure Relationships
     Problems
     Selected References

Chapter 4. Sums and Integrals
4-1. Integrals of Riemann and Lebesgue
4-2. Integral of a Simple Random Variable
4-3. Some Basic Limit Theorems
4-4. Integrable Random Variables
4-5. The Lebesgue-Stieltjes Integral
4-6. Transformation of Integrals
     Selected References

Chapter 5. Mathematical Expectation
5-1. Definition and Fundamental Formulas
5-2. Some Properties of Mathematical Expectation
5-3. The Mean Value of a Random Variable
5-4. Variance and Standard Deviation
5-5. Random Samples and Random Variables
5-6. Probability and Information
5-7. Moment-generating and Characteristic Functions
     Problems
     Selected References

Chapter 6. Sequences and Sums of Random Variables
6-1. Law of Large Numbers (Weak Form)
6-2. Bounds on Sums of Independent Random Variables

Bibliography

ROSENBLATT, MURRAY [1962]: “Random Processes,” Oxford University Press, New York. [7]
SHANNON, C. E. [1948]: A Mathematical Theory of Communication, Bell System Tech. J., vol. 27, pp. 379–423, 623–656. [5]
—— [1957]: Certain Results in Coding Theory for Noisy Channels, Information and Control, vol. 1, pp. 6–25. [6]
USPENSKY, J. V. [1937]: “Introduction to Mathematical Probability,” McGraw-Hill Book Company, New York. [1]
WADSWORTH, GEORGE A., AND JOSEPH G. BRYAN [1960]: “Introduction to Probability and Random Variables,” McGraw-Hill Book Company, New York. [3, 5]
WAX, NELSON (ed.) [1954]: “Selected Papers on Noise and Stochastic Processes,” Dover Publications, Inc., New York.
WIDDER, DAVID VERNON [1941]: “The Laplace Transform,” Princeton University Press, Princeton, N.J. [5]
WOZENCRAFT, JOHN M., AND BARNEY REIFFEN [1961]: “Sequential Decoding,” The M.I.T. Press, Cambridge, Mass., and John Wiley & Sons, Inc., New York. [6]
YAGLOM, A. M. [1962]: “An Introduction to the Theory of Stationary Random Functions,” Prentice-Hall, Inc., Englewood Cliffs, N.J. [7]

Index

Absolutely continuous, 134, 141–142, 213–214 Additive classes of sets, 373 completely additive classes, 374 Additivity property, of integrals, 206, 210 of probability, 5, 29 Algebra of sets, 373 Almost-sure relationships, 183–187 (See also Relationships holding with probability one) Autocorrelation function, 334–339 Average probability, 47–48   Basic set (see Space, basic) Bayes’ rule, 49 Bernoulli trials, 86–93, 281–284 Bernoulli’s theorem, 277 Binary designator, for Boolean functions, 75–76, 79 for minterms, 72 Binary entropy function, 255, 269, 282–283 Binary symmetric channel, 255–256 Binomial coefficient, 88, 366 Binomial distribution, 88, 143, 232, 238, 271, 280, 318 summed, 88 Boolean function, 73–82 binary designator for, 75–76, 79 of independent events, 83 minterm expansion for, 74 Borel-Cantelli lemma, 295 Borel field, 374 Borel functions, 164, 268, 376 Borel sets, 116, 374 Borel’s Theorem, 291   Canonical form for discrete random variables, 124–125 reduced, 124 Cartesian product of sets, 370–371 Central limit theorem, 263, 292–294 Chapman-Kolmogorov equation, 308 Characteristic function, 258–263 for gaussian random variables, 262, 355 Chebyshev inequality, 242, 244 and law of large numbers, 276 Chernoff bound, 280–281 and Bernoulli trials, 281–284 Choice variable, 28, 300, 348, 352 Classes of sets = classes of events, 21–23, 371–374 algebra, 373 Borel field, 374 countable, 22, 28, 371 disjoint, 372 field, 373 independent, 58 intersection of, 23, 367 monotone, 373 sigma
field, 27–28, 374, 375 union of, 22, 367 Classical probability, 2, 31, 40 Coding, random, 47 Combinations, 364–365 Communication channel, 50 Complete system of events, 23 Conditional independence, 94 Conditional probability, 41–54 definition of, 42 as probability measure, 43 product rule for, 44 Convergence, almost surely, 285 almost uniformly, 285 in the mean of order p, 285 in probability, 285 with probability one, 285 relationships among types of, 289–290 set of convergence points, 286 Convolution, 177 Correlation coefficient for two random variables, 331 Countable class, 22, 28, 371 Counting, principle of, 364 Counting processes, 302, 319 Poisson, 319–323, 358 Covariance factor for random variables, 331 Covariance function for random process, 334 Cross-correlation function for two random processes, 339   DeMorgan’s rules, 372 Density functions, basic properties, 142 definition of, 142 for functions of random variables, 168–183 (See also Joint density functions) Difference of sets, 367 Discrete random variables, canonical form, 124–125 reduced, 125 expectation of, 228–229 Disjoint class, 372 Disjunctive union, 367 Distribution functions, for random processes, 323–325 first order, 323 second order, 323 for step process, 326–329 Distribution functions, for random variables, 132 for absolutely continuous random variables, 134–135 basic properties of, 139–141 for discrete random variables, 133–134 for functions of single random variable, 168–172 for functions of two random variables, 172–183, 195–196 (See also Joint distribution functions; Probability distributions)   Ehrenfest diffusion model, 317 Element, 15, 28, 367 Ensemble average, 221 (See also Expectation) Entropy, 252 binary entropy function, 250, 269, 282–283 as mathematical expectation, 253–254 properties of, 257–258 source entropy, 256 Equal [P] (see Relationships holding with probability one) Ergodic processes, 352 Ergodic states for Markov chains, 314–316 Events, 14, 16–17 combinations of, 17–19 
intersection of, 19 union of, 19 complementary, 19 complete system of, 23 disjoint, 19 impossible, 19 independent classes of, 58, 61–66 joint, 23 mutually exclusive, 19, 30 occurrence of, 15–17, 28 as sets, 16–17, 78 sure, 19 (See also Space, basic) Expectation, mathematical, 221–225 definition of, 221 properties of, 226–228   Field of sets, 373 Fourier transform, 178 Function, 109–112, 111–120 Function, Borel, 162, 268, 376 characteristic, 259, 355 density (see Density functions) domain of, 109 generating, 308 indicator, 72–74 inverse, 113–116 (See also Inverse mapping) measurability of, 116–119, 375–376 moment-generating (see Moment-generating function) orthogonal, 266 positive semi-definite, 340, 355 range of, 110 value of, 110   Gaussian distribution, 136–137, 226, 239–241, 262–263 joint gaussian (or normal), 191–193, 355–356 Gaussian random processes, 356 strongly normal, 356 Generating function, 308 Generating sets for a partition, 71   Iff = “if and only if,” 5, 16 Image points, 110 Increments of a random process, 319 independent, 319 stationary, 319 Independence, stochastic, 54–66 of Boolean functions of events, 83 conditional, 94 definition of, 56 of functions of independent random variables, 167 pairwise, 58 product rule for independent events, 56 of random variables, 156–162 Independent classes, of events, 58, 61–66 of random variables, 157 Index set, 22, 371 Indicator function, 72–74 and discrete random variables, 124–125 mapping produced by, 111 variance of, 237 Inequalities, Chebyshev, 242, 244 Jensen, 228 Markov, 228 Schwarz, 226–227 Information and uncertainty, 246–247 conditional, 250 mutual, 248 self-, 249 Integrable random variables, 209 Integrals, Lebesgue, 198–203 Lebesgue-Stieltjes, 212–215 with respect to probability measure, 203–211 properties of, 205–208, 210–211 Riemann, 198–199 transformation of, 215–218 Inverse mapping, 113–115 inverse image, 113 preservation of set operations by, 114–115   Joint density functions, 153 Joint distribution
functions, 151–153 Joint gaussian distribution, 191–192, 355–356 Joint mapping produced by random variables, 145 Joint probability distributions, 147–149 discrete random variables, 149–151 uniform over rectangle, 154–155 Joint probability measure, 149   Kolmogorov, A N., 10   Laplace transform, 177, 182, 259 Law of large numbers, Bernoulli’s theorem, 277 Borel’s theorem, 291–292 strong, 290–292 weak, 275–279 Lebesgue dominated-convergence theorem, 290 Lebesgue measure, 141, 200, 301 Limit of sequence of sets, 372–373 Linear predictor, 359 Linearity, of expectations, 226 of integrals, 205–206, 210   Mapping, 109 (See also Function) Marginal probabilities and distributions, 149–151 Markov processes, 302–304 chain dependence, 303 ergodic states, 314–316 homogeneous chains, 304 irreducible chains, 312 period of chain, 314 period of state, 303 recurrence time, 314 state space, 303 stationary chains, 304 stationary distribution, 316 transition diagram, 311–312 transition matrix, 304 transition probabilities, 304, 308 transition time for state, 303 urn model for chain, 305–306 waiting-time distribution, 313–314 Marquand diagram, 76 Mass, probability as, 37–39 transfer of, by random variables, 120 Mass distribution, induced, by random variables, 120–122, 147–149 by function of random variables, 164–175 (See also Probability distributions) Mathematical expectation, 219–225 Matrices, stochastic, 305 transition matrix for Markov chain, 304 Mean-square properties of random processes, 340–342 continuity [mean 2], 340 derivative [mean 2], 341–342 Mean-value function for random process, 330 Mean value for random variable, 230 and center of mass, 231 and central tendency, 234 Measurability condition, 116–119, 376 Measurable function, 376 random variable as, 116, 376 Measure, Lebesgue, 141, 201, 301 Lebesgue-Stieltjes, 212 probability as, 29 totally, finite, 205 Minterm, 71, 79 expansion theorem, 74 Map, 76–82, 368 Model, auxiliary, mathematical, 8, 10–14 Modulo addition, 192 
Moment-generating function, 259 for gaussian random variables, 262 and Laplace-Stieltjes transform, 259 properties of, 260–261 for sums of independent random variables, 261 Monotone class, 373   Normal distribution (see Gaussian distribution) Normal processes (see Gaussian random processes)   Object points, 110 Orthogonal functions, 266 Outcomes, as an element, 15, 28, 367 elementary, 14–18, 28   Partition, 23–27, 67–69 associated with simple random variable, 126–127 of event, 25 generated by a class, 70–71 joint, 68 product, 68 Periodic random process, 360 Permutations, 364–365 Poisson distribution, 144, 232, 238 Poisson process, 320–323, 358 mean rate for, 322 Polya, 101 Positive semi-definite functions, 340 quadratic forms, 355 Probability, classical, 2, 31, 40 conditional (see Conditional probability) definition, 5, 28–30 determination of, 36–37 as function of sets, 29 as mass, 37–40 as measure, 10, 29–30, 38–39 Probability density functions (see Density functions) Probability distribution functions (see Distribution functions, for random variables) Probability distributions, 132–144 absolutely continuous, 134, 141–142, 213–214 gaussian (normal), 136–137, 231, 239–241, 262–263, 350–351, 355–356 triangular, 135–136 uniform, 122, 135, 231, 261 discrete, 125–126, 213 binomial, 88, 143, 232, 238, 271, 280, 318 exponential, 144, 178, 223, 232, 272 gamma, 144 geometric, 141, 223 negative binomial, 143 Poisson, 144, 232, 238 and independence, 157–161 singular continuous, 213 part of, 213, 222, 224 (See also Joint probability distributions) Probability mass, 37–39 Probability measure, 29, 39 conditional, 42–43 induced, by function of random variables, 165–166 by random variable, 120–121 joint probability measure induced by pair of random variables, 147–150 (See also Integrals) Probability schemes, 246–247, 251–252 average uncertainty for, 252–253 independent schemes, 252 joint schemes, 252 Probability system, 29 Probability vector, 251 Probability-weighted average, 
203, 220 average probability, 47–48 expectation as, 220 Product rule, for conditional probability, 44 for distribution and density functions, 160–161 for independent classes of events, 58–60 for independent events, 56 for independent random variables, 157 for integrals of, 208, 211 for mathematical expectation of, 227   Radon-Nikodym theorem, 142 Random coding, 47 Random digits, 172, 296 Random processes, 299 covariance function for, 334 cross-correlation function for two, 339 distribution functions (see Distribution functions) gaussian, 356 increments of, 319 index set for, 299 mean-square properties of, 340–342 mean-value function for, 330 member of, 300 order of, 330 parameter set for, 299 random function, 299 random sequence, 299 realization of, 300 sample function from, 300 typical function for, 346, 350–353 Random samples, 190 and random variables, 243–245 sample average, 244, 294 Random sampling, 91–93, 190 Random sinusoid, 301, 359 Random variables, 116 absolutely continuous, 141–142, 213, 214 Borel functions of, 164 correlation coefficient for two, 331 discrete, 123 elementary, 123 equality of, 185 functions of, 162–163 distribution functions for, 163–179, 191–192 independence of, 162–163 mappings produced by, 160–161 independent, 152 approximating simple functions for, 157–158 functions of, 167–168 independent classes of, 157 integrable, 209 as limits of sequences of simple random variables, 129–132 and mass distributions, 120–123 mean value for, 230–231 measurability of, 116–120, 202, 375–376 Random variables, and random samples, 243–245 sequences of, variance property for, 276 simple (see Simple random variables) transfer of mass by, 120 uncorrelated, 236, 276–277, 331 Random walk, 188, 306–307 Relationships holding with probability one, 183–187 convergence of sequences, 285 equality, of classes of events, 184 of events, 183, 196 of random variables, 185–186 Relative frequency, 2, Reliability theory, chain model in, 181–182 parallel systems in, 60–61 
partially, 91 serial systems in, 32, 60 standby redundancy in, 178–179 Repeated trials, 69–70, 117–118, 229   Sample mean, 244, 294 Sample space (see Space, basic) Sampling, random, 91–93, 190 from infinite population, 243 with replacement, 243 Schwarz inequality for expectations, 227, 230 Sets, 15–17, 367–374 basic, 15, 367 Borel, 116, 374 cartesian product of, 370 classes of (see Classes of sets = classes of events) complement of, 19, 367 coordinate, 370 empty, 19, 367 events as, 16–17, 28 generating, for a partition, 71 index, 22, 371 operations on, 367 rectangle, 148, 370 relationships between, 368–369 sequences of, 372 Sigma field, 27–29, 373–375 determined by random variable, 119 generated by a class, 374 Simple random variables, 123 Simple random variables, functions of, 161–162 reduction to canonical form, 124–126 partition determined by, 126–127 as step functions, 123 Space, basic, 15, 28–29 as organizing concept, 97 sample, 15 Standard deviation, 235 (See also Variance) State probabilities for Markov chain, 305 State space for Markov chain, 303–304 absorbing state, 312 initial state, 303 Stationary distribution for Markov chain, 316 Stationary random processes, 342–345 second-order, 344 wide-sense stationary, 344 Statistical regularity, 5, 12 Stochastic correlation and time correlation, 349–353 Stochastic independence, 56 (See also Independence, stochastic) Stochastic matrix, 305 Stochastic process (see Random processes) Strategy, 30, 96–98 Strong law of large numbers, 290–292 Borel’s theorem, 291 Subadditivity property, 35 Success-failure model, 309 Symmetric difference, 367   Time average = time mean, 347–348 Time correlation, 348–351 autocorrelation, 348 cross-correlation, 348 and stochastic correlation, 349–353 Total probability, 43 Transformation, 109 (See also Function) Transition probabilities for Markov processes, 304 higher order, 308 Trial, 14–15 Bernoulli, 86–91 independent, 63 repeated, 69, 117–118, 229 Truncation, method of, 278 Typical 
functions for random processes, 346, 350–353   Uncertainty (see Information and uncertainty) Uncorrelated random variables, 236, 276–277, 331   Variance, 235 Variance, properties of, 235–277 and radius of gyration, 230 as second moment about center of mass, 235 Variance property for sequences of random variables, 276 Venn diagram, 21, 67, 77, 368 (See also Minterm maps)

6-3. Types of Convergence
6-4. The Strong Law of Large Numbers
6-5. The Central Limit Theorem
     Problems
     Selected References

Chapter 7. Random Processes
7-1. The General Concept of a Random Process
7-2. Constant Markov Chains
7-3. Increments of Processes; The Poisson Process
7-4. Distribution Functions for Random Processes
7-5. Processes Consisting of Step Functions
7-6. Expectations; Correlation and Covariance Functions
7-7. Stationary Random Processes
7-8. Expectations and Time Averages; Typical Functions
7-9. Gaussian Random Processes
     Problems
     Selected References

Appendixes
Appendix A. Some Elements of Combinatorial Analysis
Appendix B. Some Topics in Set Theory
Appendix C. Measurability of Functions
Appendix D. Proofs of Some Theorems
Appendix E. Integrals of Complex-valued Random Variables
Appendix F. Summary of Properties and Key Theorems
BIBLIOGRAPHY
INDEX

[...]... corresponding side of one of the dice. The second number represents the corresponding side of the second die. Thus the number pair (3, 2) indicates the appearance of side 3 of the first die and of side 2 of the second die. The sides are usually numbered according to the number of spots thereon. We have said there are 36 such pairs. For each possibility on the first die there are six possibilities on the second...
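The counting in this excerpt can be checked directly. The following Python sketch (mine, not the book's) enumerates the 36 pairs and applies the classical definition of probability; the two events used are assumptions chosen for illustration:

```python
from itertools import product
from fractions import Fraction

# Enumerate the pairs (i, j): side i of the first die with side j of the
# second. For each of the six possibilities on the first die there are six
# on the second, giving 6 * 6 = 36 pairs.
pairs = list(product(range(1, 7), repeat=2))
assert len(pairs) == 36

# Classical definition: number of favorable outcomes over total outcomes.
p_pair = Fraction(sum(1 for p in pairs if p == (3, 2)), len(pairs))
p_sum7 = Fraction(sum(1 for i, j in pairs if i + j == 7), len(pairs))
print(p_pair, p_sum7)  # 1/36 1/6
```

The same enumeration extends to any event expressible as a subset of the 36 pairs.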
sum of the individual probabilities.
4. The probability that an event will not occur is 1 minus the probability of the occurrence of the event.
5. If the occurrence of one event implies the occurrence of a second event, the relative frequency of occurrence of the second event must be at least as great as that of the first event. Thus the probability of the second event must be at least as great as that of. [...]... classical monograph [1933] many streams of development. This monograph is now available in English translation under the title Foundations of the Theory of Probability [1956]. The Kolmogorov model presents mathematical probability as a special case of abstract measure theory. Our exposition utilizes some concrete but essentially sound representations to aid in grasping the abstract concepts and relations of this... [1957, chaps. 2 through 4] Upon this simple base a magnificent mathematical structure has been erected. Introduction of the laws of compound probability and of the concepts of conditional probability, random variables, and mathematical expectation has provided a mathematical system rich in content and powerful in its application. As an example of the range of such theory, one should examine a work such... situations in which there is a continuum of possibilities. In such situations, some physical variable may be observed: the height of an individual, the value of an electric current in a wire, the amount of water in a tank, etc. Each of the continuum of possible values of these variables is to be considered a possible outcome. A second limitation inherent in the classical theory is the assumption of equally
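The relative-frequency reading of properties 4 and 5 can be illustrated by simulation. The sketch below is my own illustration (not the book's); the events and the seed are assumptions chosen so the run is repeatable:

```python
import random

random.seed(1)  # fixed seed so the run is repeatable
N = 100_000
throws = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(N)]

# Event A ("first die shows 6") implies event B ("at least one die shows 6"),
# so the relative frequency of B is at least that of A (property 5).
freq_A = sum(1 for i, j in throws if i == 6) / N
freq_B = sum(1 for i, j in throws if i == 6 or j == 6) / N
assert freq_B >= freq_A

# Property 4: the relative frequency of "B does not occur" is 1 minus
# that of B, since the two counts are complementary.
freq_not_B = sum(1 for i, j in throws if i != 6 and j != 6) / N
assert abs(freq_not_B - (1 - freq_B)) < 1e-12
print(freq_A, freq_B)  # near 1/6 and 11/36, respectively
```

The monotonicity holds on every run, not just in the limit: whenever A occurs, B occurs, so B's count can never be smaller.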
be able to move from one part of our system to another with freedom and insight. For clarity and emphasis, we may find it helpful to indicate the important transitions in the following manner:

A → B: Translation of real-world concepts, relations, and problems into terms of the concepts of the mathematical model.
B → A: Interpretation of the mathematical concepts and results in terms of real-world phenomena... strategies of problem solution suggested by a grasp of the essential content of the theory. Applications emphasize the translation of physical assumptions into statements involving the precise concepts of the mathematical model. It is assumed in this chapter that the reader is reasonably familiar with the elements of set theory and the elementary operations with sets. Adequate treatments of this material are... present the concepts and their relations with considerable precision, although we do not always attempt to give the most general formulation. At many places we borrow mathematical theorems without proof. We sometimes note critical questions without making a detailed examination; we merely indicate how they have been resolved. Emphasis is on concepts, content of theorems, interpretations, and strategies of problem... considerations, we do not attempt to examine the theory constructed upon the foundation of the classical definition of probability; instead, we turn immediately to the more general model. Not only does this general theory include the classical theory as a special case; it is often simpler to develop the more general concepts in spite of certain abstractions—and then examine specific problems from the
history of ideas as well as in the development of the technical aspects of probability theory in its early stages. DAVID
[1957]: “An Introduction to Probability Theory and Its Applications,” vol. 1, 2d ed. An introduction and an extensive treatment of probability theory in the case of a finite or countably infinite number of possible outcomes. Chapters 2, 3, and 4 provide a rather extensive treatment of the
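The general model these excerpts point toward can be sketched concretely on a finite basic space. The Python below is my own illustration (the events A and B are assumptions, not the book's examples); it defines a probability measure on subsets under the equally likely assumption and checks the additivity, complement, and monotonicity properties listed earlier:

```python
from itertools import product
from fractions import Fraction

# Finite basic space: the 36 equally likely outcomes of throwing two dice.
space = set(product(range(1, 7), repeat=2))

def P(event):
    """Probability measure, assuming equally likely outcomes (classical case)."""
    assert event <= space  # events are subsets of the basic space
    return Fraction(len(event), len(space))

A = {(i, j) for i, j in space if i + j == 7}  # the spots sum to seven
B = {(i, j) for i, j in space if i == j}      # doubles

# Additivity for mutually exclusive events: A and B are disjoint,
# since i + j = 7 and i = j cannot hold at once.
assert A & B == set()
assert P(A | B) == P(A) + P(B)

# Complement rule and monotonicity.
assert P(space - A) == 1 - P(A)
assert P(A) <= P(A | B)

print(P(A), P(B))  # 1/6 1/6
```

On a continuum of outcomes the same properties are retained, but the equally likely counting is replaced by a measure, which is the step the Kolmogorov model takes.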

Date posted: 14/07/2016, 15:24
