Markov Chains: Theory, Algorithms and Applications


Markov Chains

I dedicate this book especially to two exceptional people, my father and my mother.

Markov Chains: Theory, Algorithms and Applications
Bruno Sericola
Series Editor: Nikolaos Limnios

First published 2013 in Great Britain and the United States by ISTE Ltd and John Wiley & Sons, Inc.

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms and licenses issued by the CLA. Enquiries concerning reproduction outside these terms should be sent to the publishers at the undermentioned address:

ISTE Ltd, 27-37 St George's Road, London SW19 4EU, UK (www.iste.co.uk)
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA (www.wiley.com)

© ISTE Ltd 2013. The rights of Bruno Sericola to be identified as the author of this work have been asserted by him in accordance with the Copyright, Designs and Patents Act 1988.

Library of Congress Control Number: 2013936313
British Library Cataloguing-in-Publication Data: a CIP record for this book is available from the British Library.
ISBN: 978-1-84821-493-4
Printed and bound in Great Britain by CPI Group (UK) Ltd., Croydon, Surrey CR0 4YY.

Table of Contents

Preface

Chapter 1. Discrete-Time Markov Chains
  1.1. Definitions and properties
  1.2. Strong Markov property
  1.3. Recurrent and transient states
  1.4. State classification
  1.5. Visits to a state
  1.6. State space decomposition
  1.7. Irreducible and recurrent Markov chains
  1.8. Aperiodic Markov chains
  1.9. Convergence to equilibrium
  1.10. Ergodic theorem
  1.11. First passage times and number of visits
    1.11.1. First passage time to a state
    1.11.2. First passage time to a subset of states
    1.11.3. Expected number of visits
  1.12. Finite Markov chains
  1.13. Absorbing Markov chains
  1.14. Examples
    1.14.1. Two-state chain
    1.14.2. Gambler's ruin
    1.14.3. Success runs
  1.15. Bibliographical notes

Chapter 2. Continuous-Time Markov Chains
  2.1. Definitions and properties
  2.2. Transition functions and infinitesimal generator
  2.3. Kolmogorov's backward equation
  2.4. Kolmogorov's forward equation
  2.5. Existence and uniqueness of the solutions
  2.6. Recurrent and transient states
  2.7. State classification
  2.8. Explosion
  2.9. Irreducible and recurrent Markov chains
  2.10. Convergence to equilibrium
  2.11. Ergodic theorem
  2.12. First passage times
    2.12.1. First passage time to a state
    2.12.2. First passage time to a subset of states
  2.13. Absorbing Markov chains
  2.14. Bibliographical notes

Chapter 3. Birth-and-Death Processes
  3.1. Discrete-time birth-and-death processes
  3.2. Absorbing discrete-time birth-and-death processes
    3.2.1. Passage times and convergence to equilibrium
    3.2.2. Expected number of visits
  3.3. Periodic discrete-time birth-and-death processes
  3.4. Continuous-time pure birth processes
  3.5. Continuous-time birth-and-death processes
    3.5.1. Explosion
    3.5.2. Positive recurrence
    3.5.3. First passage time
    3.5.4. Explosive chain having an invariant probability
    3.5.5. Explosive chain without invariant probability
    3.5.6. Positive or null recurrent embedded chain
  3.6. Absorbing continuous-time birth-and-death processes
    3.6.1. Passage times and convergence to equilibrium
    3.6.2. Explosion
  3.7. Bibliographical notes

Chapter 4. Uniformization
  4.1. Introduction
  4.2. Banach spaces and algebra
  4.3. Infinite matrices and vectors
  4.4. Poisson process
    4.4.1. Order statistics
    4.4.2. Weighted Poisson distribution computation
    4.4.3. Truncation threshold computation
  4.5. Uniformizable Markov chains
  4.6. First passage time to a subset of states
  4.7. Finite Markov chains
  4.8. Transient regime
    4.8.1. State probabilities computation
    4.8.2. First passage time distribution computation
    4.8.3. Application to birth-and-death processes
  4.9. Bibliographical notes

Chapter 5. Queues
  5.1. The M/M/1 queue
    5.1.1. State probabilities
    5.1.2. Busy period distribution
  5.2. The M/M/c queue
  5.3. The M/M/∞ queue
  5.4. Phase-type distributions
  5.5. Markovian arrival processes
    5.5.1. Definition and transient regime
    5.5.2. Joint distribution of the interarrival times
    5.5.3. Phase-type renewal processes
    5.5.4. Markov modulated Poisson processes
  5.6. Batch Markovian arrival process
    5.6.1. Definition and transient regime
    5.6.2. Joint distribution of the interarrival times
  5.7. Block-structured Markov chains
    5.7.1. Transient regime of SFL chains
    5.7.2. Transient regime of SFR chains
  5.8. Applications
    5.8.1. The M/PH/1 queue
    5.8.2. The PH/M/1 queue
    5.8.3. The PH/PH/1 queue
    5.8.4. The PH/PH/c queue
    5.8.5. The BMAP/PH/1 queue
    5.8.6. The BMAP/PH/c queue
  5.9. Bibliographical notes

Appendix. Basic Results
Bibliography
Index

Appendix: Basic Results

THEOREM A1.7.– MONOTONE CONVERGENCE THEOREM FOR EVENTS – If $(A_n)_{n \geq 0}$ is an increasing sequence of events of $\mathcal{F}$, that is such that $A_n \subseteq A_{n+1}$, then we have:
$$\mathbb{P}\left\{ \bigcup_{n=0}^{\infty} A_n \right\} = \lim_{n \longrightarrow \infty} \mathbb{P}\{A_n\}.$$
If $(A_n)_{n \geq 0}$ is a decreasing sequence of events of $\mathcal{F}$, that is such that $A_{n+1} \subseteq A_n$, then we have:
$$\mathbb{P}\left\{ \bigcap_{n=0}^{\infty} A_n \right\} = \lim_{n \longrightarrow \infty} \mathbb{P}\{A_n\}.$$

THEOREM A1.8.– MONOTONE CONVERGENCE THEOREM FOR EXPECTATIONS – If $(X_n)_{n \geq 0}$ is an increasing sequence of non-negative random variables, then we have:
$$\mathbb{E}\left\{ \lim_{n \longrightarrow \infty} X_n \right\} = \lim_{n \longrightarrow \infty} \mathbb{E}\{X_n\}.$$

LEMMA A1.1.– FATOU'S LEMMA FOR SERIES – If $(u_{n,k})_{n,k \in \mathbb{N}}$ is a double sequence of non-negative real numbers, then we have:
$$\sum_{k=0}^{\infty} \liminf_{n \longrightarrow \infty} u_{n,k} \leq \liminf_{n \longrightarrow \infty} \sum_{k=0}^{\infty} u_{n,k}.$$

THEOREM A1.9.– DOMINATED CONVERGENCE THEOREM FOR SERIES – If $(u_{n,k})_{n,k \in \mathbb{N}}$ is a double sequence of real numbers such that:
1) for all $k \in \mathbb{N}$, $\lim_{n \longrightarrow \infty} u_{n,k} = v_k \in \mathbb{R}$,
2) for all $n, k \in \mathbb{N}$, $|u_{n,k}| \leq w_k$ with $\sum_{k=0}^{\infty} w_k < \infty$,
then, for all $n \geq 0$, the series $\sum_{k} u_{n,k}$ is absolutely convergent and we have:
$$\lim_{n \longrightarrow \infty} \sum_{k=0}^{\infty} u_{n,k} = \sum_{k=0}^{\infty} v_k.$$

THEOREM A1.10.– DOMINATED CONVERGENCE THEOREM FOR CONTINUITY – If $(u_n)_{n \in \mathbb{N}}$ is a sequence of continuous functions from $I$ to $\mathbb{R}$, where $I$ is an interval of $\mathbb{R}$, such that, for all $x \in I$ and for all $n \in \mathbb{N}$, $|u_n(x)| \leq v_n$ with $\sum_{n=0}^{\infty} v_n < \infty$, then the sum $F(x) = \sum_{n=0}^{\infty} u_n(x)$ is well defined for all $x \in I$ and the function $F$ is continuous on $I$.

THEOREM A1.11.– DOMINATED CONVERGENCE THEOREM FOR DIFFERENTIATION – If $(u_n)_{n \in \mathbb{N}}$ is a sequence of differentiable functions from $I$ to $\mathbb{R}$, where $I$ is an interval of $\mathbb{R}$, such that:
1) there exists $y \in I$ such that $\sum_{n=0}^{\infty} |u_n(y)| < \infty$,
2) for all $x \in I$ and for all $n \in \mathbb{N}$, $|u_n'(x)| \leq v_n$ with $\sum_{n=0}^{\infty} v_n < \infty$,
then the sum $F(x) = \sum_{n=0}^{\infty} u_n(x)$ is well defined for all $x \in I$, the function $F$ is differentiable on $I$ and we have $F'(x) = \sum_{n=0}^{\infty} u_n'(x)$.
THEOREM A1.12.– DOMINATED CONVERGENCE THEOREM FOR EXPECTATIONS – If $(X_n)_{n \geq 0}$ is a sequence of real random variables such that:
1) $\lim_{n \longrightarrow \infty} X_n = X$, $\mathbb{P}$-a.s.,
2) for all $n \in \mathbb{N}$, $|X_n| \leq Y$ with $\mathbb{E}\{Y\} < \infty$,
then we have $\mathbb{E}\{|X|\} < \infty$ and
$$\lim_{n \longrightarrow \infty} \mathbb{E}\{X_n\} = \mathbb{E}\{X\}.$$

LEMMA A1.2.– CESÀRO'S LEMMA – Let $(u_n)_{n \geq 1}$ be a sequence of real numbers. If $(u_n)_{n \geq 1}$ converges to $\ell$, then the sequence of Cesàro averages also converges to $\ell$, that is:
$$\lim_{n \longrightarrow \infty} \frac{1}{n} \sum_{k=1}^{n} u_k = \ell.$$

THEOREM A1.13.– CENTRAL LIMIT THEOREM – If $(V_n)_{n \geq 1}$ is a sequence of independent and identically distributed square-integrable real random variables, that is $\mathbb{E}\{V_i^2\} < \infty$, with mean $m$ and standard deviation $\sigma$ with $\sigma > 0$, then the sequence $(Z_n)_{n \geq 1}$, defined by:
$$Z_n = \frac{\displaystyle\sum_{i=1}^{n} V_i - nm}{\sigma \sqrt{n}},$$
converges in distribution, when $n$ tends to infinity, to a random variable with normal distribution $N(0,1)$, that is, for all $x \in \mathbb{R}$, we have:
$$\lim_{n \longrightarrow \infty} \mathbb{P}\{Z_n \leq x\} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-y^2/2} \, dy.$$
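A quick numerical illustration of the central limit theorem stated above (this sketch and its parameter choices are ours, not taken from the book): it draws sums of i.i.d. exponential(1) variables, for which $m = \sigma = 1$, using the fact that such a sum of $n$ terms has a gamma$(n, 1)$ distribution, and compares the empirical distribution of $Z_n$ with the standard normal distribution function.

```python
import numpy as np
from math import erf, sqrt

def std_normal_cdf(x: float) -> float:
    # Phi(x) = (1 / sqrt(2*pi)) * integral_{-inf}^{x} exp(-y^2 / 2) dy
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# V_i ~ exponential(1), so m = sigma = 1 and V_1 + ... + V_n ~ gamma(n, 1).
n, samples = 2000, 50_000
m = sigma = 1.0
rng = np.random.default_rng(0)
sums = rng.gamma(shape=n, scale=1.0, size=samples)   # draws of V_1 + ... + V_n
Z = (sums - n * m) / (sigma * np.sqrt(n))            # Z_n as in theorem A1.13

for x in (-1.0, 0.0, 1.0, 2.0):
    print(f"x = {x:+.1f}   P{{Z_n <= x}} ~ {np.mean(Z <= x):.4f}   Phi(x) = {std_normal_cdf(x):.4f}")
```

With these sample sizes the empirical and exact values should agree to roughly two decimal places, consistent with a Monte Carlo error of order $1/\sqrt{50000}$.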
Index

A: absorbing Markov chain, 71, 184, 228; absorbing state, 10, 98; absorption, 74; absorption probability, 188; absorption time, 75, 187; accessible state, 12, 137; aperiodic Markov chain, 30; aperiodic state, 30; aperiodicity, 30; arrival rate (instantaneous, 332; stationary, 332); arrivals, 287.

B: ballot numbers, 302; Banach algebra, 239; Banach space, 238; batch Markovian arrival process, 342; batch phase-type renewal process, 352; birth-and-death process (continuous-time, 213; continuous-time absorbing, 228; discrete-time, 191; discrete-time absorbing, 201; discrete-time periodic, 208; uniformizable, 282); block lower Hessenberg matrix, 354; block upper Hessenberg matrix, 353; block-structured Markov chain, 352 (SFL, 353; SFR, 353); busy period, 289, 313, 318, 322.

C: capacity, 288; Catalan's numbers, 296; Cesàro's lemma, 385; Chapman–Kolmogorov equations, 4, 95; closed set, 20; computational algorithm (occupation period duration distribution: SFL chains, 362, SFR chains, 369; state probabilities: SFL chains, 356, 357, SFR chains, 366; first passage time distribution: birth-and-death process, 281; state probabilities: birth-and-death process, 279; Poisson distribution, 258; truncation threshold of the Poisson distribution, 262); concatenation of Markov chains, 35; convergence to equilibrium, 34, 162; coupling, 34, 38.

D: discipline, 288 (FIFO, 287; LIFO, 287; PS, 287); distribution (geometric, 236; Poisson, 255; uniform, 252).

E: embedded Markov chain, 107, 227; ergodic theorem, 44, 166; Erlang distribution, 141; explosion, 91, 141, 215, 231; explosive Markov chain, 141, 225; exponential distribution, 95.

F: Fatou's lemma, 383; finite Markov chain, 69; first passage time (to a state, 53, 172; to a subset of states, 58, 177); Fubini's theorem, 381; fundamental matrix, 65.

G, H: gcd, 30; generating function, 291; hitting time(s), 8, 72, 130, 177, 187; homogeneous Markov chain, 1, 92.

I: ideal of ℕ, 32 (principal, 32); infinite matrix, 3, 157; infinite vector, 3, 157, 243; infinitesimal generator, 108; initial distribution, 2, 93; interarrival time distribution, 287; interarrival times (joint distribution, 336, 349); invariant measure, 22, 150; invariant probability, 26, 154; irreducible Markov chain, 13, 137; irreducible set, 20.

K: Kendall's notation, 288; Kolmogorov (backward differential equation, 110; backward integral equation, 108; forward differential equation, 126; forward integral equation, 114).

M: Markov chain (embedded, 237; equivalent, 236; finite, 275; uniformizable, 263; uniformized, 268); Markov chain of the G/M/1 type, 354; Markov chain of the M/G/1 type, 354; Markov modulated Poisson process, 342; Markovian arrival process, 326; minimal process, 92, 106, 141.

N: normed vector space, 238; null recurrent birth-and-death process, 192; null recurrent Markov chain, 26, 139; null recurrent set, 52; null recurrent state, 26, 51, 139; number of arrivals on [0, t], 332.

O: occupation period, 289, 311; order statistics, 253.

P: passage time (to a state, 192; to a subset of states, 274); period of a state, 30; periodicity, 31, 52; phase of the arrival process, 343; phase process, 326; phase-type distribution, 323 (irreducible representation, 325); phase-type renewal process, 341; Poisson distribution, 259, 320, 321; Poisson process, 249; positive recurrent birth-and-death process, 192; positive recurrent Markov chain, 26, 139; positive recurrent set, 52; positive recurrent state, 26, 51, 139; potential kernel, 291; probability of absorption, 74; pure birth process, 209.

Q: quasi-birth-and-death process, 354, 371; queue, 287 (BMAP/G/1, 354; BMAP/PH/1, 377; BMAP/PH/c, 378; M/M/1, 288; M/M/∞, 319; M/M/c, 316; M/G/∞, 322; M/PH/1, 371; M/PH/∞, 322; M/PH/c, 375; MAP/PH/1, 377; MAP/PH/c, 380; PH/M/1, 372; PH/M/c, 376; PH/PH/1, 372; PH/PH/c, 373); queue capacity, 288.

R: recurrence, 52, 139; recurrent birth-and-death process, 192; recurrent Markov chain, 10, 131; recurrent set, 20; recurrent state, 10, 13, 16, 18, 131; Reuter's criterion, 147, 215, 231.

S: server, 287; service discipline, 288; service times distribution, 287; services, 287; skeleton, 148; sojourn time, 104; state probabilities, 276, 290; stationary distribution, 38, 166; stationary probability, 38, 166; stationary regime, 38, 166; Stirling formula, 261; stochastic matrix, 2, 100; stopping time, 6, 92; strong law of large numbers, 43; strong Markov property, 7, 93.

T: theorem (central limit, 385; dominated convergence, 384; monotone convergence, 382); transience, 52, 139; transient birth-and-death process, 192; transient Markov chain, 10, 131; transient regime, 38, 166, 276 (SFL chains, 355; SFR chains, 364); transient set, 20; transient state, 10, 13, 16, 131; transition function, 93; transition probability matrix, 2, 107; transition rate matrix, 108; truncation threshold (Poisson distribution, 258).

U, V: uniformization rate, 268; visits to a state, 14, 204; visits to states of a set, 68.
[...]

... ∈ ℕ and, for all $i, j \in S$, we have:
$$\mathbb{P}\{X_{n+k} = j \mid X_k = i\} = \mathbb{P}\{X_n = j \mid X_0 = i\}.$$
All the following Markov chains are considered homogeneous. The term Markov chain in this chapter will thus designate a homogeneous discrete-time Markov chain. We consider, for all $i, j \in S$,
$$P_{i,j} = \mathbb{P}\{X_n = j \mid X_{n-1} = i\}$$
and we define the transition probability matrix $P$ of the Markov ...

[...]

... chapter finite Markov chains as well as absorbing Markov chains. Finally, three examples are proposed and covered thoroughly. Chapter 2 discusses continuous-time Markov chains. We detail precisely the way the backward and forward Kolmogorov equations are obtained and examine the existence and uniqueness of their solutions. We also treat in this chapter the phenomenon of explosion. This occurs when the Markov chain ...

[...]

... communication networks and applied probability. It is structured as follows. Chapter 1 deals with discrete-time Markov chains. We describe not only their stationary behavior, with the study of convergence to equilibrium and the ergodic theorem, but also their transient behavior, together with the study of the first passage times to a state and a subset of states ...
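To make the definitions in the excerpt above concrete, here is a short illustrative sketch (ours, not the book's; the two-state chain and the values of a and b are arbitrary choices echoing the two-state example listed in section 1.14.1). It builds a transition probability matrix P, checks the Chapman–Kolmogorov identity numerically, and compares a simulation of the chain with the entries of a power of P.

```python
import numpy as np

# Two-state chain on S = {0, 1} with P = [[1-a, a], [b, 1-b]]; a, b chosen arbitrarily.
a, b = 0.3, 0.6
P = np.array([[1 - a, a],
              [b, 1 - b]])

# Chapman-Kolmogorov: (P^{n1+n2})_{i,j} = sum_k (P^{n1})_{i,k} (P^{n2})_{k,j}.
n1, n2 = 4, 7
assert np.allclose(np.linalg.matrix_power(P, n1 + n2),
                   np.linalg.matrix_power(P, n1) @ np.linalg.matrix_power(P, n2))

# Homogeneity: P{X_{n+k} = j | X_k = i} does not depend on k, so an empirical
# estimate of P{X_n = j | X_0 = i} should match the (i, j) entry of P^n.
rng = np.random.default_rng(1)

def step_n_times(i: int, n: int) -> int:
    """Simulate n transitions of the chain started in state i."""
    state = i
    for _ in range(n):
        state = rng.choice(2, p=P[state])
    return state

i, j, n = 0, 1, 5
runs = 20_000
empirical = sum(step_n_times(i, n) == j for _ in range(runs)) / runs
print("empirical P{X_5 = 1 | X_0 = 0} ~", round(empirical, 4))
print("(P^5)[0, 1]                    =", round(np.linalg.matrix_power(P, n)[i, j], 4))

# For this two-state chain the limiting (stationary) distribution is (b/(a+b), a/(a+b)).
print("stationary distribution:", np.array([b, a]) / (a + b))
```

Matrix powers suffice here because the state space is finite; on the countable state spaces considered in the book, the same identities hold with the sums over k becoming infinite series.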
[...]

... discrete-time and continuous-time Markov chains on a countable state space. This study is both theoretical and practical, including applications to birth-and-death processes and queuing theory. It is addressed to all researchers and engineers in need of stochastic models to evaluate and predict the behavior of systems they develop. It is particularly appropriate for academics, students, engineers and researchers ...

[...]

... definition of $\tau(j)$ for the second equality and the fact that $X_{\tau(j)} = j$ when $\tau(j) < \infty$ for the fourth equality. The sixth equality uses the Markov property, since $\tau(j)$ is a stopping time, and the seventh equality uses the homogeneity of the Markov chain $X$. Taking $i = j$, we obtain, for $\ell \geq 1$,
$$\mathbb{P}\{N_j > \ell \mid X_0 = j\} = f_{j,j} \, \mathbb{P}\{N_j > \ell - 1 \mid X_0 = j\},$$
therefore, for all $\ell \geq 0$, ...

[...]

... that $j \longrightarrow i$. Therefore, we have $i \longleftrightarrow j$ and $i$ recurrent. Theorem 1.10 then states that $j$ is recurrent. We finally obtain $j \neq i$, $j \longrightarrow i$ and $j$ recurrent. The same approach, by interchanging the roles of $i$ and $j$, gives us $f_{i,j} = 1$.

DEFINITION 1.9.– A non-empty subset $C$ of states of $S$ is said to be closed if, for all $i \in C$ and for all $j \notin C$, we have $P_{i,j}$ ...

[...]

... 1.1.– If state $j$ is transient then, for all $i \in S$,
$$\sum_{n=1}^{\infty} (P^n)_{i,j} < \infty,$$
and, therefore,
$$\lim_{n \longrightarrow \infty} (P^n)_{i,j} = 0 \quad \text{and} \quad \lim_{n \longrightarrow \infty} \mathbb{P}\{X_n = j\} = 0.$$
PROOF.– Let us resume equation [1.2], that is:
$$(P^n)_{i,j} = \sum_{k=1}^{n} f_{i,j}^{(k)} (P^{n-k})_{j,j}.$$
Here again, summing over $n$, using Fubini's theorem and because $(P^0)_{j,j} = 1$, we obtain:
$$\sum_{n=1}^{\infty} (P^n)_{i,j} = \sum_{n=1}^{\infty} \sum_{k=1}^{n} f_{i,j}^{(k)} \ldots$$

[...]

... transient. PROOF.– If $i \longleftrightarrow j$ then there exist integers $\ell \geq 0$ and $m \geq 0$ such that $(P^{\ell})_{i,j} > 0$ and $(P^m)_{j,i} > 0$. For all $n \geq 0$, we have:
$$(P^{\ell+n+m})_{i,i} = \sum_{k \in S} (P^{\ell})_{i,k} (P^{n+m})_{k,i} \geq (P^{\ell})_{i,j} (P^{n+m})_{j,i}$$
and
$$(P^{n+m})_{j,i} = \sum_{k \in S} (P^n)_{j,k} (P^m)_{k,i} \geq (P^n)_{j,j} (P^m)_{j,i},$$
and thus:
$$(P^{\ell+m+n})_{i,i} \geq (P^{\ell})_{i,j} (P^n)_{j,j} (P^m)_{j,i}.$$
Summing over $n$, we obtain ...

[...]

... obtained using theorem 1.4 because $A \cap \{T = n\} \in \mathcal{F}_n$.

1.3. Recurrent and transient states

Let us recall that the random variable $\tau(j)$ that counts the number of transitions necessary to reach state $j$ is defined by:
$$\tau(j) = \inf\{n \geq 1 \mid X_n = j\},$$
where $\tau(j) = \infty$ if this set is empty. For all $i, j \in S$ and, for all $n \geq 1$, we define:
$$f_{i,j}^{(n)} = \mathbb{P}\{\tau(j) = n \mid \ldots$$

[...]

... transient if and only if $N_j < \infty$, $\mathbb{P}_j$-a.s. Likewise, in the statement of theorem 1.16, we could write $\tau(j) < \infty$, $\mathbb{P}$-a.s., in place of $\mathbb{P}\{\tau(j) < \infty\} = 1$.

1.6. State space decomposition

LEMMA 1.1.– For all $i, j \in S$ such that $i \neq j$, we have:
$$i \longrightarrow j \iff f_{i,j} > 0.$$
PROOF.– Let $i$ and $j$ be two states of $S$ such that $i \neq j$. If $i \longrightarrow j$ then, by definition and since (P ...

[...]

... } is a Markov chain on the state space $S$ then, for all $n \geq 0$ and for all $i \in S$, conditional on $\{X_n = i\}$, the process $\{X_{n+p}, p \in \mathbb{N}\}$ is a Markov ...
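The proof fragments above involve the first passage time $\tau(j)$, the probabilities $f_{i,j}$ and the number of visits $N_j$. Under the convention that $N_j$ counts the visit at time 0, the recursion quoted above gives $\mathbb{P}\{N_j > \ell \mid X_0 = j\} = (f_{j,j})^{\ell}$, hence $\mathbb{E}\{N_j \mid X_0 = j\} = 1/(1 - f_{j,j})$ when $j$ is transient. The following sketch is ours, not the book's: it checks this geometric tail on a small gambler's-ruin chain (arbitrary parameters), computing $f_{j,j}$ from the standard fundamental-matrix identity $\mathbb{E}\{N_j \mid X_0 = i\} = ((I - Q)^{-1})_{i,j}$ for transient states.

```python
import numpy as np

# Gambler's ruin on {0, 1, ..., 5}: 0 and 5 absorbing, probability p of moving up.
# Parameters and names are illustrative choices, not taken from the book.
p, N = 0.4, 5
P = np.zeros((N + 1, N + 1))
P[0, 0] = P[N, N] = 1.0
for s in range(1, N):
    P[s, s + 1] = p
    P[s, s - 1] = 1 - p

transient = list(range(1, N))                 # interior states are transient
Q = P[np.ix_(transient, transient)]
M = np.linalg.inv(np.eye(N - 1) - Q)          # M[i, j] = E{N_j | X_0 = transient[i]}

j = 2
k = transient.index(j)
f_jj = 1.0 - 1.0 / M[k, k]                    # since E{N_j | X_0 = j} = 1 / (1 - f_{j,j})
print("E{N_j | X_0 = j} =", round(M[k, k], 4), "  f_{j,j} =", round(f_jj, 4))

# Simulation: N_j = number of times the chain occupies j (time 0 included) before absorption.
rng = np.random.default_rng(2)

def visits_to_j(start: int) -> int:
    state, count = start, 0
    while state not in (0, N):
        if state == j:
            count += 1
        state = state + 1 if rng.random() < p else state - 1
    return count

counts = np.array([visits_to_j(j) for _ in range(20_000)])
for level in (1, 2, 3):
    print(f"P{{N_j > {level}}} ~ {np.mean(counts > level):.4f}   f_jj^{level} = {f_jj**level:.4f}")
```

The empirical tail probabilities should match the geometric values up to Monte Carlo noise, and the fundamental-matrix computation extends in the same way to any finite set of transient states.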

Ngày đăng: 17/12/2016, 16:51

Từ khóa liên quan

Mục lục
