
A First Course in Probability




DOCUMENT INFORMATION

Pages: 545
File size: 3.03 MB

Contents of the document

A FIRST COURSE IN PROBABILITY
Eighth Edition

Sheldon Ross
University of Southern California

Pearson Prentice Hall, Upper Saddle River, New Jersey 07458

Library of Congress Cataloging-in-Publication Data: Ross, Sheldon M. A first course in probability / Sheldon Ross. — 8th ed. Includes bibliographical references and index. ISBN-13: 978-0-13-603313-4; ISBN-10: 0-13-603313-X. 1. Probabilities—Textbooks. I. Title. QA273.R83 2010 519.2—dc22 2008033720

Editor in Chief, Mathematics and Statistics: Deirdre Lynch; Senior Project Editor: Rachel S. Reeve; Assistant Editor: Christina Lepre; Editorial Assistant: Dana Jones; Project Manager: Robert S. Merenoff; Associate Managing Editor: Bayani Mendoza de Leon; Senior Managing Editor: Linda Mihatov Behrens; Senior Operations Supervisor: Diane Peirano; Marketing Assistant: Kathleen DeChavez; Creative Director: Jayne Conte; Art Director/Designer: Bruce Kenselaar; AV Project Manager: Thomas Benfatti; Compositor: Integra Software Services Pvt. Ltd., Pondicherry, India; Cover Image Credit: Getty Images, Inc.

© 2010, 2006, 2002, 1998, 1994, 1988, 1984, 1976 by Pearson Education, Inc., Pearson Prentice Hall, Upper Saddle River, NJ 07458. All rights reserved. No part of this book may be reproduced, in any form or by any means, without permission in writing from the publisher. Pearson Prentice Hall™ is a trademark of Pearson Education, Inc. Printed in the United States of America.

Pearson Education, Ltd., London; Pearson Education Australia PTY Limited, Sydney; Pearson Education Singapore, Pte. Ltd.; Pearson Education North Asia Ltd., Hong Kong; Pearson Education Canada, Ltd., Toronto; Pearson Educación de México, S.A. de C.V.; Pearson Education – Japan, Tokyo; Pearson Education Malaysia, Pte. Ltd.; Pearson Education, Upper Saddle River, New Jersey

For Rebecca

Contents

Preface
1. Combinatorial Analysis: 1.1 Introduction; 1.2 The Basic Principle of Counting; 1.3 Permutations; 1.4 Combinations; 1.5 Multinomial Coefficients; 1.6 The Number of Integer Solutions of Equations
2. Axioms of Probability: 2.1 Introduction; 2.2 Sample Space and Events; 2.3 Axioms of Probability; 2.4 Some Simple Propositions; 2.5 Sample Spaces Having Equally Likely Outcomes; 2.6 Probability as a Continuous Set Function; 2.7 Probability as a Measure of Belief
3. Conditional Probability and Independence: 3.1 Introduction; 3.2 Conditional Probabilities; 3.3 Bayes's Formula; 3.4 Independent Events; 3.5 P(·|F) Is a Probability
4. Random Variables: 4.1 Random Variables; 4.2 Discrete Random Variables; 4.3 Expected Value; 4.4 Expectation of a Function of a Random Variable; 4.5 Variance; 4.6 The Bernoulli and Binomial Random Variables (4.6.1 Properties of Binomial Random Variables; 4.6.2 Computing the Binomial Distribution Function); 4.7 The Poisson Random Variable (4.7.1 Computing the Poisson Distribution Function); 4.8 Other Discrete Probability Distributions (4.8.1 The Geometric Random Variable; 4.8.2 The Negative Binomial Random Variable; 4.8.3 The Hypergeometric Random Variable; 4.8.4 The Zeta (or Zipf) Distribution); 4.9 Expected Value of Sums of Random Variables; 4.10 Properties of the Cumulative Distribution Function
5. Continuous Random Variables: 5.1 Introduction; 5.2 Expectation and Variance of Continuous Random Variables; 5.3 The Uniform Random Variable; 5.4 Normal Random Variables (5.4.1 The Normal Approximation to the Binomial Distribution); 5.5 Exponential Random Variables (5.5.1 Hazard Rate Functions); 5.6 Other Continuous Distributions (5.6.1 The Gamma Distribution; 5.6.2 The Weibull Distribution; 5.6.3 The Cauchy Distribution; 5.6.4 The Beta Distribution); 5.7 The Distribution of a Function of a Random Variable
6. Jointly Distributed Random Variables: 6.1 Joint Distribution Functions; 6.2 Independent Random Variables; 6.3 Sums of Independent Random Variables (6.3.1 Identically Distributed Uniform Random Variables; 6.3.2 Gamma Random Variables; 6.3.3 Normal Random Variables; 6.3.4 Poisson and Binomial Random Variables; 6.3.5 Geometric Random Variables); 6.4 Conditional Distributions: Discrete Case; 6.5 Conditional Distributions: Continuous Case; 6.6 Order Statistics; 6.7 Joint Probability Distribution of Functions of Random Variables; 6.8 Exchangeable Random Variables
7. Properties of Expectation: 7.1 Introduction; 7.2 Expectation of Sums of Random Variables (7.2.1 Obtaining Bounds from Expectations via the Probabilistic Method; 7.2.2 The Maximum–Minimums Identity); 7.3 Moments of the Number of Events that Occur; 7.4 Covariance, Variance of Sums, and Correlations; 7.5 Conditional Expectation (7.5.1 Definitions; 7.5.2 Computing Expectations by Conditioning; 7.5.3 Computing Probabilities by Conditioning; 7.5.4 Conditional Variance); 7.6 Conditional Expectation and Prediction; 7.7 Moment Generating Functions (7.7.1 Joint Moment Generating Functions); 7.8 Additional Properties of Normal Random Variables (7.8.1 The Multivariate Normal Distribution; 7.8.2 The Joint Distribution of the Sample Mean and Sample Variance); 7.9 General Definition of Expectation
8. Limit Theorems: 8.1 Introduction; 8.2 Chebyshev's Inequality and the Weak Law of Large Numbers; 8.3 The Central Limit Theorem; 8.4 The Strong Law of Large Numbers; 8.5 Other Inequalities; 8.6 Bounding the Error Probability When Approximating a Sum of Independent Bernoulli Random Variables by a Poisson Random Variable
9. Additional Topics in Probability: 9.1 The Poisson Process; 9.2 Markov Chains; 9.3 Surprise, Uncertainty, and Entropy; 9.4 Coding Theory and Entropy
References

(Each chapter closes with a Summary, Problems, Theoretical Exercises, and Self-Test Problems and Exercises.)

Solutions to Self-Test Problems and Exercises

where Pi(65) is the probability that a Poisson random variable with mean 30i is greater than 65. That is,

  Pi(65) = 1 − Σ_{j=0}^{65} e^(−30i) (30i)^j / j!

Because a Poisson random variable with mean 30i has the same distribution as the sum of 30i independent Poisson random variables with mean 1, it follows from the central limit theorem that its distribution is approximately normal with mean and variance both equal to 30i. Consequently, with X being a Poisson random variable with mean 30i and Z a standard normal random variable, we can approximate Pi(65) as follows:

  Pi(65) = P{X > 65} = P{X ≥ 65.5}
         = P{(X − 30i)/√(30i) ≥ (65.5 − 30i)/√(30i)}
         ≈ P{Z ≥ (65.5 − 30i)/√(30i)}

Therefore,

  P2(65) ≈ P{Z ≥ .7100} ≈ .2389
  P3(65) ≈ P{Z ≥ −2.583} ≈ .9951
  P4(65) ≈ P{Z ≥ −4.975} ≈ 1

leading to the result P{X > 65} ≈ .7447. If we had mistakenly assumed that X was approximately normal, we would have obtained the approximate answer .8244. (The exact probability is .7440.)
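The normal approximation above, continuity correction included, is easy to check numerically. The following sketch is not part of the text: it recomputes Pi(65) using only the error function from Python's standard library, and the final average assumes, as the solution's arithmetic suggests, that i is equally likely to be 2, 3, or 4.

```python
from math import erf, sqrt

def phi(z):
    # Standard normal CDF, written in terms of the error function.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def p_i(i, cutoff=65):
    # Normal approximation with continuity correction to
    # P{Poisson(30i) > cutoff} = P{Z >= (cutoff + 0.5 - 30i) / sqrt(30i)}.
    mean = 30 * i
    return 1.0 - phi((cutoff + 0.5 - mean) / sqrt(mean))

for i in (2, 3, 4):
    print(f"P{i}(65) ~ {p_i(i):.4f}")  # close to .2389, .9951, and 1
print(f"average ~ {(p_i(2) + p_i(3) + p_i(4)) / 3:.4f}")  # close to .7447
```

The average lands a hair below .7447 because the solution rounds each Pi(65) to four places before averaging.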
8.13 Take logarithms and then apply the strong law of large numbers to obtain

  log[(∏_{i=1}^n X_i)^{1/n}] = (1/n) Σ_{i=1}^n log(X_i) → E[log(X_1)]

Therefore,

  (∏_{i=1}^n X_i)^{1/n} → e^{E[log(X_1)]}

CHAPTER 9

9.1 From axiom (iii), it follows that the number of events that occur between times 8 and 10 has the same distribution as the number of events that occur by time 2 and thus is a Poisson random variable with mean 6. Hence, we obtain the following solutions for parts (a) and (b):
(a) P{N(10) − N(8) = 0} = e^{−6}
(b) E[N(10) − N(8)] = 6
(c) It follows from axioms (ii) and (iii) that, from any point in time onward, the process of events occurring is a Poisson process with rate λ. Hence, the expected time of the fifth event after 2 P.M. is 2 + E[S5] = 2 + 5/3. That is, the expected time of this event is 3:40 P.M.

9.2 (a)
  P{N(1/3) = 2 | N(1) = 2}
  = P{N(1/3) = 2, N(1) = 2} / P{N(1) = 2}
  = P{N(1/3) = 2, N(1) − N(1/3) = 0} / P{N(1) = 2}
  = P{N(1/3) = 2} P{N(1) − N(1/3) = 0} / P{N(1) = 2}   (by axiom (ii))
  = P{N(1/3) = 2} P{N(2/3) = 0} / P{N(1) = 2}          (by axiom (iii))
  = [e^{−λ/3} (λ/3)²/2!] e^{−2λ/3} / [e^{−λ} λ²/2!]
  = 1/9
(b)
  P{N(1/2) ≥ 1 | N(1) = 2} = 1 − P{N(1/2) = 0 | N(1) = 2}
  = 1 − P{N(1/2) = 0, N(1) = 2} / P{N(1) = 2}
  = 1 − P{N(1/2) = 0, N(1) − N(1/2) = 2} / P{N(1) = 2}
  = 1 − P{N(1/2) = 0} P{N(1) − N(1/2) = 2} / P{N(1) = 2}
  = 1 − P{N(1/2) = 0} P{N(1/2) = 2} / P{N(1) = 2}
  = 1 − e^{−λ/2} e^{−λ/2} (λ/2)²/2! / [e^{−λ} λ²/2!]
  = 1 − 1/4 = 3/4

9.3 Fix a point on the road and let X_n equal 0 if the nth vehicle to pass is a car and 1 if it is a truck, n ≥ 1. We now suppose that the sequence X_n, n ≥ 1, is a Markov chain with transition probabilities

  P_{0,0} = 5/6, P_{0,1} = 1/6, P_{1,0} = 4/5, P_{1,1} = 1/5

Then the long-run proportions are the solution of
  π0 = π0(5/6) + π1(4/5)
  π1 = π0(1/6) + π1(1/5)
  π0 + π1 = 1
Solving the preceding equations gives π0 = 24/29, π1 = 5/29. Thus, 2400/29 ≈ 83 percent of the vehicles on the road are cars.

9.4 The successive weather classifications constitute a Markov chain. If the states are 0 for rainy, 1 for sunny, and 2 for overcast, then the transition probability matrix is

      | 0    1/2  1/2 |
  P = | 1/3  1/3  1/3 |
      | 1/3  1/3  1/3 |

The long-run proportions satisfy
  π0 = π1(1/3) + π2(1/3)
  π1 = π0(1/2) + π1(1/3) + π2(1/3)
  π2 = π0(1/2) + π1(1/3) + π2(1/3)
  1 = π0 + π1 + π2
The solution of the preceding system of equations is π0 = 1/4, π1 = 3/8, π2 = 3/8. Hence, three-eighths of the days are sunny and one-fourth are rainy.

9.5 (a) A direct computation yields H(X)/H(Y) ≈ 1.06.
(b) Both random variables take on two of their values with the same probabilities .35 and .05. The difference is that if they do not take on either of those values, then X, but not Y, is equally likely to take on any of its three remaining possible values. Hence, from Theoretical Exercise 13, we would expect the result of part (a).

CHAPTER 10

10.1 (a) 1 = C ∫₀¹ eˣ dx, so C = 1/(e − 1)
(b) F(x) = C ∫₀ˣ e^y dy = (eˣ − 1)/(e − 1), 0 ≤ x ≤ 1
Hence, if we let X = F⁻¹(U), then
  U = (e^X − 1)/(e − 1), or X = log(U(e − 1) + 1)
Thus, we can simulate the random variable X by generating a random number U and then setting X = log(U(e − 1) + 1).

10.2 Use the acceptance–rejection method with g(x) = 1, 0 < x < 1. Calculus shows that the maximum value of f(x)/g(x) occurs at a value of x, 0 < x < 1, such that
  2x − 6x² + 4x³ = 0
or, equivalently, when
  4x² − 6x + 2 = (4x − 2)(x − 1) = 0
The maximum thus occurs when x = 1/2, and it follows that
  C = max f(x)/g(x) = 30(1/4 − 2/8 + 1/16) = 15/8
Hence, the algorithm is as follows:
  Step 1. Generate a random number U1.
  Step 2. Generate a random number U2.
  Step 3. If U2 ≤ 16(U1² − 2U1³ + U1⁴), set X = U1; else return to Step 1.

10.3 It is most efficient to check the higher probability values first, as in the following algorithm:
  Step 1. Generate a random number U.
  Step 2. If U ≤ .35, set X equal to the value having probability .35 and stop.
  Step 3. If U ≤ .65, set X equal to the value having probability .30 and stop.
  Step 4. If U ≤ .85, set X equal to the value having probability .20 and stop.
  Step 5. Otherwise, set X equal to the remaining value.

10.4 2μ − X

10.5 (a) Generate 2n independent exponential random variables with mean 1, X_i, Y_i, i = 1, …, n, and then use the estimator Σ_{i=1}^n e^{X_i Y_i}/n.
(b) We can use XY as a control variate to obtain an estimator of the type
  Σ_{i=1}^n (e^{X_i Y_i} + c X_i Y_i)/n
Another possibility would be to use XY + X²Y²/2 as the control variate and so obtain an estimator of the type
  Σ_{i=1}^n (e^{X_i Y_i} + c[X_i Y_i + X_i² Y_i²/2 − 1/2])/n
The motivation behind the preceding formula is based on the fact that the first three terms of the Maclaurin series expansion of e^{xy} are 1 + xy + (x²y²)/2.

Index

A
Absolutely continuous random variables, See Continuous random variables Algorithm, polar, 453 Analytical Theory of Probability (Laplace), 399 Answers to selected problems, 456–457 Antithetic variables, variance reduction, 450–451 Archimedes, 208 Ars Conjectandi (The Art of Conjecturing), 142, 391 Associative laws, 25 Axiom, defined, 27 Axioms of probability, 26–29

B
Banach match problem, 158–159 Basic principle of counting, 1–3 proof of, Bayes, Thomas, 74 Bayes's formula, 65–79, 101 Bell, E. T., 208 Bernoulli, Jacques, 142–143 Bernoulli, James, 134, 143, 391 Bernoulli, Nicholas, 391 Bernoulli random variables, 134–139, 403 Bernoulli trials, 112 Bernstein polynomials, 414 Bertrand, Joseph L. F., 197 Bertrand's paradox, 197 Best-prize problem, 344–346 Beta distribution, 218–219 Binary symmetric channel, 433 Binomial coefficients, 7, 15 Binomial
distribution, normal approximation to, 204–207 Binomial random variables, 134–139, 259–260 binomial distribution function, computing, 142–143 moments of, 316–317 properties of, 139–141 simulating, 448 variance of, 325–331 Binomial theorem, combinatorial proof of, 8–9 proof by mathematical induction, Bits, 426 Bivariate normal distribution, 268–269 Boole's inequality, 300–301 Borel, É., 403 Box–Muller approach, 445 Branching process, 383 Buffon's needle problem, 243–246

C
Cantor distribution, 381 Cauchy distribution, 217–218 Center of gravity, 128 Central limit theorem, 198, 412 Channel capacity, 434, 436 Chapman–Kolmogorov equations, 421–423 Chebyshev's inequality: defined, 389 one-sided, 403–407 and weak law of large numbers, 388–391 Chernoff bounds, 407–409 Chi-squared distribution, 216, 255 Chi-squared random variable, simulating, 446–447 Coding theory: binary symmetric channel, 433 and entropy, 428–434 noiseless coding theorem, 429–431, 433–434 Combinatorial analysis, 1–21 combinations, 5–9 integer solutions of equations, number of, 12–15 multinomial coefficients, 9–12 permutations, 3–5 principle of counting, 1–3 Combinatorial identity, Commutative laws, 25 Complement, 24, 49 Conditional covariance formula, 372, 381 Conditional distributions: continuous case, 266–274 bivariate normal distribution, 268–269 discrete case, 263–266 Conditional expectation, 331–349, 371 computing expectations by conditioning, 333–343 computing probabilities by conditioning, 344–347 best-prize problem, 344–346 conditional variance, 347–349 definitions, 331–333 and prediction, 349–353 Conditional independence, 98 Conditional probability, 58–65 Bayes's formula, 65–79 independent events, 79–93 Conditional probability density function, 286 Conditional probability mass function, 286 Conditional variance, 347–349 variance of a sum of a random number of random variables, 349 Conditional variance formula, 372 Conditioning: computing expectations by, 333–343 computing probabilities by, 344–347 best-prize problem, 344–346 variance reduction by, 451–452 Continuity correction, 205 Continuous random variables, 186–231 beta distribution, 218–219 Cauchy distribution, 217–218 expectation of, 190–194 gamma distribution, 215–216 simulation of: general techniques for, 440–447 inverse transformation method, 441–442, 453 rejection method, 442–444, 453–454 Weibull distribution, 216–217 Control variates, variance reduction, 452–453 Convolution, 252 Correlation, 371 Correlation coefficient, 322–331 Counting, basic principle of, 1–3 proof of, Coupon-collecting problem, 318–319 singletons in, 321–322 Coupon collecting with unequal probabilities, 314–315 Covariance, 322–323, 371 defined, 322 Cumulative distribution function (distribution function), 123–125 properties, 168–170

D
DeMoivre, Abraham, 198, 207–208, 393 DeMoivre–Laplace limit theorem, 204 DeMorgan's laws, 26 Dependent random variables, 241 Deviations, 324 Discrete distributions: simulation from, 447–449 binomial random variable, 448 geometric distribution, 447–448 Poisson random variable, 449 Discrete probability distribution, 358 Discrete random variables, 123–125, 171 Distribution function (distribution function), 123, 170 Distributions: beta, 218–219 binomial, normal approximation to, 204–207 bivariate normal, 268–269 Cantor, 381 Cauchy, 217–218 chi-squared, 216, 255 conditional, 263–274 continuous probability, 359 discrete, 447–449 discrete probability, 358 gamma, 215–216 Gaussian, 207 geometric, 447–448 Laplace, 211 marginal, 233 multinomial, 240 multivariate, 372 multivariate normal, 365–367 n-Erlang, 216 negative hypergeometric, 319 normal, 356–357 Weibull, 216–217 zeta (Zipf), 163–164 Distributive laws, 25 Double exponential random variable, 211fn Duration of play, problem of, 89–90

E
Edges, 91 Ehrenfest, Paul and Tatyana, 421 Entropy, 426–428, 435 and coding theory, 428–434 Equations, number of integer solutions of, 12–15 Ergodic Markov chain, 423–424 Events, 23–26 independent, 79–93, 101 mutually exclusive, 24, 49 odds of, 101 pairwise independent, 147 Exchangeable random variables, 282–285 Expectation, See Continuous random variables conditional, 331–349, 371 and prediction, 349–353 correlations, 322–331 covariance, 322–323 general definition of, 369–370 moment generating functions, 354–365 binomial distribution with parameters n and p, 355 continuous probability distribution, 359 determination of the distribution, 358 discrete probability distribution, 358 exponential distribution with parameter λ, 356 independent binomial random variables, sums of, 360 independent normal random variables, sums of, 360 independent Poisson random variables, sums of, 360 joint, 363–365 normal distribution, 356–357 Poisson distribution with mean λ, 355–356 of the sum of a random number of random variables, 361–363 moments of the number of events that occur, 315–322, 319 binomial random variables, moments of, 316–317 coupon-collecting problem, 318–319, 321–322 hypergeometric random variables, moments of, 317–318 moments in the match problem, 318 negative hypergeometric random variables, 319–321 probabilistic method, obtaining bounds from expectations via, 311–312 properties of, 297–387 of sums of random variables, 298–315 Boole's inequality, 300–301 coupon-collecting problems, 303 coupon collecting with unequal probabilities, 314–315 expectation of a binomial random variable, 301 expected number of matches, 303 expected number of runs, 304–305 hypergeometric random variable, mean of, 302 maximum–minimums identity, 313–314 negative binomial random variable, mean of, 301–302 probability of a union of events, 308–310 quick-sort algorithm, analyzing, 306–308 random walk in a plane, 305–306 sample mean, 300 variance of sums, 322–331 Expected value (expectation), 125–128, 171 Exponential random variables, 208–214, 223 hazard rate functions, 212–214

F
Failure rate function, 212, 223
Fermat, Pierre de, 85–86, 89 Fermat's combinatorial identity, 18 Finite population, sampling from, 326–331 First moment of X, 132 Functional system,

G
Galton, Francis, 399 Gambler's ruin problem, 87–88 Gamma distribution, 215–216 Gamma function, 215, 223 Gamma random variables, 254–255 Gauss, Karl Friedrich, 207–208 Gaussian curve, 207 Gaussian distribution, 207 Generalized basic principle of counting, Geometric distribution, 447–448 variance of, 340 Geometric random variable with parameter p, 448 Geometric random variables, 155–157, 260–263 Geometrical probability, 197 Goodwill cost, defined, 176

H
Half-life, probabilistic interpretation of (example), 249–251 Hamiltonian path: defined, 311 maximum number of, in a tournament, 311–312 Hazard rate functions, 212–214, 223 Huygens, Christiaan, 86, 89 Hypergeometric random variables, 160–163 moments of, 317–318

I
Identically distributed uniform random variables, 252–254 Importance sampling, 455 Independent Bernoulli random variables, bounding error probability when approximating a sum of, 410–412 Independent binomial random variables, sums of, 260, 360 Independent events, 79–93, 101 Independent increment assumption, 417 Independent normal random variables, sums of, 360 Independent Poisson random variables, sums of, 259–260, 360 Independent random variables, 240–251, 286 binomial random variables, 259–260 Buffon's needle problem, 243–246 conditional distributions: continuous case, 266–274 discrete case, 263–266 gamma random variables, 254–255 geometric random variables, 260–263 half-life, probabilistic interpretation of (example), 249–251 identically distributed uniform random variables, 252–254 normal random variables, 256–259 Poisson random variables, 259–260 random subsets, 246–249 sums of, 252–263 Independent uniform random variables, sum of two, 252–253 Inequality: Boole's, 300–301 Chebyshev's, 388–391 Jensen's, 409 Markov's, 388 Initial probabilities, 99 Integer solutions of equations, number of, 12–15 Interarrival times, sequence of, 418 Intersection, 23–24, 49 Inverse transformation method, 441–442, 453 exponential random variable, simulating, 441 gamma random variable, simulating, 441–442

J
Jacobian determinant, 280 Jensen's inequality, 409 Joint cumulative probability distribution function, 232–240, 282–285 Joint density function of order statistics, 270 Joint distribution functions, 232–240 joint cumulative probability distribution function, 232–240, 282–285 joint probability mass function, 233–234 marginal distributions, 233 multinomial distribution, 240 Joint moment generating functions, 363–365 Joint probability density function, 235–239, 285 Joint probability mass function, 233–234 Jointly continuous random variables, 235, 239, 285 Jointly distributed random variables, 232–297 joint distribution functions, 232–240 joint probability density function, 235–239, 285 marginal probability mass functions, 234

K
Kelley strategy, 378 Khinchine, A. Y., 391 Kolmogorov, A. N., 403

L
Laplace distribution, 211 Laplace, Pierre-Simon, 399 Laplace's rule of succession, 98–99 Law of frequency of errors, 399 Laws of large numbers, 388 Legendre theorem, 229 Length of the longest run (example), 148–154 L'Hôpital's rule, 392 Liapounoff, A., 393 Limit theorems, 388–417 central limit theorem, 391–399, 412 Chebyshev's inequality, 388–391 strong law of large numbers, 400–403, 412 weak law of large numbers, 388–391 Lognormal random variable, 258

M
Marginal distributions, 233 Marginal probability mass functions, 234 Markov chain, 419–424, 434 Chapman–Kolmogorov equations, 421–422 ergodic, 423–424 matrix, 420 random walk, 422 transition probabilities, 420 Markov's inequality, 388 Matching problem (example), 41–42, 63 Matrix, 420 Maximum likelihood estimate, 160 Maximum–minimums identity, 313–314 Mean, 132, 171 Measurable events, 29 Measurement signal-to-noise ratio, 414 Memoryless, use of term, 223 Men of Mathematics (Bell), 208 Method of maximum likelihood estimation, 180
waiting time, 419 Poisson random variables, 143–145, 171, 259–260 simulating, 449 ´ Denis, 143 Poisson, Simeon Polar algorithm, 453 Polar method, 445–446 Polya’s urn model, 284 Posterior probability, 99–100 Principle of counting, 1–3 Prior probabilities, 99 Probabilistic method, 93 obtaining bounds from expectations via, 311–312 maximum number of Hamiltonian paths in a tournament, 311–312 Probabilistic Method, The (Alon/Spencer/Erdos), 93fn Probability: axioms of, 26–29 as a continuous set function, 44–48 defining, 26 of an event, 27 geometrical, 197 as a measure of belief, 48–49 multiplication rule, 62–63, 101 personal view of, 48 sample space and events, 22–26 simple propositions, 29–33 subjective view of, 48 Probability density function, 222 defined, 186 Probability mass function, 123, 171, 233 Probability, personal view of, 48 Problem of duration of play, 89–90 Pseudorandom numbers, 439 Q Quick-sort algorithm, analyzing, 306–308 528 Index R Random-number generator, 439 seed, 439 Random numbers, 385, 439, 453 Random permutation, generating, 439–440 Random samples, distribution of the range of, 273–274 Random variables, 117–186, 134–139 Bernoulli, 134–139 binomial, 134–139, 259–260 continuous, 186–231 cumulative distribution function, 123–125 properties of, 168–170 defined, 117, 170 dependent, 241 discrete, 123–125, 171 distribution of a function of, 219–221 exchangeable, 282–285 expectation of a function of, 128–132 expectation of a sum of a random number of, 335 expectation of sums of, 298–315 expected value (expectation), 125–128 sums of, 164–168 exponential, 208 gamma, 254–255 geometric, 155–157, 260–263 hypergeometric, 160–163 Identically distributed uniform, 252–254 independent, 240–251 joint probability distribution of functions of, 274–282 jointly continuous, 239 moment generating functions, 354–365 of the sum of a random number of, 361–363 negative binomial, 157–160 normal, 198–204, 256–259 order statistics, 270–274, 286 Poisson, 143–145, 171, 259–260 
uniform, 194–198 variance, 171 variance of a sum of a random number of, 349 Weibull, 217 zeta (Zipf) distribution, 163–164 Random walk, 422 Rate of transmission of information, 436 Rayleigh density function, 229 Recherches sur la probabilil´e des jugements en mati`ere criminelle et en mati`ere civile (Investigations into the Probability of Verdicts in Criminal and Civil Matters), 143 Rejection method, 442–444, 453–454 simulating a chi-squared random variable, 446–447 simulating a normal random variable, 443–444 polar method, 445–446 Riemann, G F B., 164 S Sample mean, 300, 372 joint distribution of, 367–369 joint distribution of sample variance and, 367–369 Sample median, 272 Sample spaces: and events, 22–26, 49 having equally likely outcomes, 33–44 Sample variance, 324, 372 joint distribution of sample mean and, 367–369 Seed, 439 Selected problems, answers to, 456–457 Self-text problems/exercises, 458–516 Sequence of interarrival times, 418 Sequential updating of information, 99–101 Shannon, Claude, 433 Index Simulation, 438–456 of continuous random variables: general techniques for, 440–447 inverse transformation method, 441–442, 453 rejection method, 442–444, 453–454 defined, 438 from discrete distributions, 447–449 pseudorandom numbers, 439 random-number generator, 439 random numbers, 439, 453 random permutation, generating (example), 439–440 seed, 439 variance reduction techniques, 449–453 Simulation, defined, 246 Singletons, in coupon-collecting problem, 321–322 Size of the zeroth generation, 383 St Petersburg paradox, 175 Standard deviation of X, 134, 171 Standard deviations, 414 Stationary increment assumption, 417 Stieltjes integrals, 369–370 Strong law of large numbers, 400–403, 412 Subjective view of probability, 48 Sums of random variables: expectation of, 298–315 binomial random variable, 300–301 Boole’s inequality, 300–301 coupon-collecting problems, 303 coupon collecting with unequal probabilities, 314–315 expectation of a binomial random variable, 
301 expected number of matches, 303 529 expected number of runs, 304–305 hypergeometric random variable, mean of, 302 maximum–minimums identity, 313–314 negative binomial random variable, mean of, 301–302 probability of a union of events, 308–310 quick-sort algorithm, analyzing, 306–308 random walk in a plane, 305–306 sample mean, 300 Superset, 24 Surprise concept, 425–426 T Theory of games, 175 Transition probabilities, Markov chains, 420 Trials, 82 U Uncertainty, 426–427 Uniform random variables, 194–198 Union, 23–24, 49 Updated probability, 99–100 Updating information sequentially, 99–101 Utility, 130–131 V Variables, 163–164, See also Random variables antithetic, 450–451 Variance, 171 conditional, 347–349 covariance, 322–323 of geometric distribution, 340 sample, 324, 372 Variance reduction: antithetic variables, use of, 450–451 by conditioning, 451–452 control variates, 452–453 techniques, 449–453 530 Index Venn diagram, 24–25 Vertices, 91 W Waiting time, 419 Weak law of large numbers, 388–391 Weibull distribution, 216–217 Weibull random variables, 217 Wheel of fortune game (chuck-a-luck) (example), 136 Wilcoxon sum-of ranks test, 376 Z Zeroth generation, size of, 383 Zeta (Zipf) distribution, 163–164 Zipf, G K., 164 ... arrangements are possible? What if John and Jim can play all instruments, but Jay and Jack can each play only piano and drums? For years, telephone area codes in the United States and Canada consisted... Discrete random variables are dealt with in Chapter 4, continuous random variables in Chapter 5, and jointly distributed random variables in Chapter The important concepts of the expected value and... are 9! = 362,880 possible batting orders 4 Chapter Combinatorial Analysis EXAMPLE 3b A class in probability theory consists of men and women An examination is given, and the students are ranked

Posted: 12/06/2017, 15:33
