8 BOUNDS ON THE BUFFER OCCUPANCY PROBABILITY WITH SELF-SIMILAR INPUT TRAFFIC

N. LIKHANOV
Institute for Problems of Information Transmission, Russian Academy of Science, Moscow, Russia

8.1 INTRODUCTION

High-quality traffic measurements indicate that actual traffic behavior over high-speed networks shows self-similar features. These include an analysis of hundreds of millions of observed packets on several Ethernet LANs [7, 8], and an analysis of a few million observed frames of variable bit rate (VBR) video services [1]. In these studies, packet traffic appears to be statistically self-similar [2, 11]. Self-similar traffic is characterized by "burstiness" across an extremely wide range of time scales [7]. This behavior of aggregate Ethernet traffic is very different from that of conventional traffic models (e.g., Poisson, batch Poisson, Markov modulated Poisson process [4]). Many studies of the design, control, and performance of high-speed and cell-relay networks have been based on traditional traffic models, and it is likely that many of those results need major revision when self-similar traffic models are considered [18].

Self-similarity manifests itself in a variety of ways: a spectral density that diverges at the origin, a nonsummable autocorrelation function (indicating long-range dependence), an index of dispersion of counts (IDC) that increases monotonically with the sample time T, and so on [7]. A key parameter characterizing self-similar processes is the so-called Hurst parameter, H, which is designed to capture the degree of self-similarity.

Self-similar process models can be derived in different ways. One way is to construct the self-similar process as a sum of independent sources with a special form of the autocorrelation function. If the peak rate of each source is taken to zero as the number of sources goes to infinity, models like those of Mandelbrot [11] and Taqqu and Lévy [15] are obtained. Queueing analysis for these kinds of processes is given in Chapters 4 and 5 of this volume. Another approach is to consider on/off sources with constant peak rate while the number of sources goes to infinity. In this way we obtain self-similar processes in which the number of new sessions arriving per time slot is a Poisson random variable [9]. This process was originally proposed by Cox [2], and its queueing analysis has been carried out recently by many authors [3, 6, 9, 10, 13, 17]. The main results of these papers are presented in this volume: Chapter 9 gives a complete overview of the topic; Chapters 7 and 10 present some particular results for the above model as well as results for the model with a finite number of on/off sources; models close to on/off processes with Poisson session arrivals are considered in Chapter 11; and Chapter 6 treats a queueing system with instantaneous arrivals.

In this chapter we identify the class of all self-similar processes with independent sessions arriving as Poisson random variables. For the particular case of the Cox model, we derive asymptotic bounds for the buffer overflow probability.
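As an illustration of the index of dispersion of counts mentioned above, the following minimal sketch (not part of the original chapter) computes IDC(T) = Var[N(T)]/E[N(T)] for a sequence of per-slot packet counts. The function name `idc`, the synthetic Poisson test trace, and the window sizes are assumptions made for the example only.

```python
import numpy as np

def idc(counts, window_sizes):
    """Index of dispersion of counts: IDC(T) = Var[N(T)] / E[N(T)],
    where N(T) is the number of arrivals in a window of T consecutive slots."""
    counts = np.asarray(counts, dtype=float)
    result = {}
    for T in window_sizes:
        n = (len(counts) // T) * T          # use only complete windows
        if n == 0:
            continue
        block_sums = counts[:n].reshape(-1, T).sum(axis=1)   # N(T) per window
        result[T] = block_sums.var() / block_sums.mean()
    return result

# For Poisson traffic IDC(T) stays near 1 at every scale;
# for self-similar traffic it keeps growing with T.
rng = np.random.default_rng(0)
poisson_counts = rng.poisson(10.0, size=200_000)
print(idc(poisson_counts, [1, 10, 100, 1000]))
```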
Compare with Chapter 9, Section 10.4 of Chapter 10, and Section 7.4 of Chapter 7, where asymptotic bounds are presented for a wide class of processes beyond the self-similar one. Here we focus on the self-similar case (Pareto distribution of the active period). For this case we present some new bounds, which are more accurate than the best currently known bounds [3, 10].

This chapter is organized in the following way. First, we give the definition of second-order self-similar traffic and some well-known, but useful, relations between the variance, autocorrelation, and spectral density functions. This is followed by the construction of a class of second-order self-similar processes. Finally, the asymptotic queueing behavior for a particular form of the processes from our class is analyzed.

8.2 SECOND-ORDER SELF-SIMILAR PROCESSES

We consider a discrete-time stationary stochastic process $X = (\ldots, X_{-1}, X_0, X_1, \ldots)$ with constant mean $\mu = EX_i$ and finite variance $\sigma^2 = \operatorname{Var} X_i := E\{(X_i - \mu)^2\}$. For convenience, we also consider the process $x = (\ldots, x_{-1}, x_0, x_1, \ldots)$ with $x_i := X_i - \mu$, that is, the process $X$ centered to zero mean: $Ex_i = 0$, $\operatorname{Var} x_i = Ex_i^2 = \sigma^2$. The autocorrelation function of the processes $x$ and $X$ is
$$r(k) = E\{(X_i - \mu)(X_{i+k} - \mu)\}/\sigma^2 = E\{x_i x_{i+k}\}/\sigma^2, \qquad k = 0, \pm 1, \pm 2, \ldots.$$
Since the process $X$ is stationary, $r(k)$, $\operatorname{Var} X_i$, and $EX_i$ do not depend on $i$.

Denote
$$X^{(m)} = (\ldots, X^{(m)}_{-1}, X^{(m)}_0, X^{(m)}_1, \ldots), \qquad x^{(m)} = (\ldots, x^{(m)}_{-1}, x^{(m)}_0, x^{(m)}_1, \ldots), \qquad m = 1, 2, 3, \ldots,$$
where
$$X^{(m)}_i := \frac{1}{m}\bigl(X_{im+1} + \cdots + X_{im+m}\bigr), \qquad x^{(m)}_i := x_{im+1} + \cdots + x_{im+m}.$$
We have $EX^{(m)}_i = \mu$, $\operatorname{Var} X^{(m)}_i = \frac{1}{m^2}\operatorname{Var} x^{(m)}_i$, and the autocorrelation function $r^{(m)}(k)$ of the processes $X^{(m)}_i$, $x^{(m)}_i$ is
$$r^{(m)}(k) = E\{(X^{(m)}_i - \mu)(X^{(m)}_{i+k} - \mu)\}/\operatorname{Var} X^{(m)}_i = E\{x^{(m)}_i x^{(m)}_{i+k}\}/\operatorname{Var} x^{(m)}_i, \qquad k = 0, \pm 1, \pm 2, \ldots.$$

First we recall some well-known relations between the autocorrelation function and the variance of the sum [2, 16] (see also Chapter 2). We have
$$m^2 \operatorname{Var} X^{(m)}_0 = \operatorname{Var} x^{(m)}_0 = E\{(x_1 + x_2 + \cdots + x_m)(x_1 + x_2 + \cdots + x_m)\} = \sigma^2 \sum_{i=1}^{m}\sum_{j=1}^{m} r(i-j) = \sigma^2\Bigl(m + 2\sum_{i=1}^{m-1}(m-i)\,r(i)\Bigr), \qquad (8.1)$$
where we use $r(-k) = r(k)$, $k = 0, 1, 2, \ldots$. From the above equations we have
$$\operatorname{Var} x^{(m+1)}_0 - \operatorname{Var} x^{(m)}_0 = \sigma^2 + 2\sigma^2\sum_{i=1}^{m} r(i).$$
Next,
$$\bigl(\operatorname{Var} x^{(m+1)}_0 - \operatorname{Var} x^{(m)}_0\bigr) - \bigl(\operatorname{Var} x^{(m)}_0 - \operatorname{Var} x^{(m-1)}_0\bigr) = 2\sigma^2 r(m).$$
If we define $\operatorname{Var} x^{(0)}_i = 0$, then
$$r(m) = \frac{1}{2\sigma^2}\bigl(\operatorname{Var} x^{(m+1)}_0 - 2\operatorname{Var} x^{(m)}_0 + \operatorname{Var} x^{(m-1)}_0\bigr). \qquad (8.2)$$
The same equation for $r^{(k)}(m)$ is
$$r^{(k)}(m) = \frac{1}{2\operatorname{Var} x^{(k)}_0}\bigl(\operatorname{Var} x^{(k(m+1))}_0 - 2\operatorname{Var} x^{(km)}_0 + \operatorname{Var} x^{(k(m-1))}_0\bigr). \qquad (8.3)$$

One more important statistical characteristic of a stationary process is its spectral density function $f(\omega)$. For processes with a continuous spectrum we have
$$f(\omega) := \frac{\sigma^2}{2\pi}\sum_{k=-\infty}^{\infty} r(k)\,e^{-ik\omega}, \qquad r(k) = \frac{1}{\sigma^2}\int_{-\pi}^{\pi} f(\omega)\,e^{ik\omega}\,d\omega. \qquad (8.4)$$
Let us express $\operatorname{Var} x^{(m)}_0$ via the spectral density function $f(\omega)$. Substituting Eq. (8.4) into (8.1) we have
$$\operatorname{Var} x^{(m)}_0 = \sum_{k=1}^{m}\sum_{j=1}^{m}\int_{-\pi}^{\pi} f(\omega)\,e^{i\omega(k-j)}\,d\omega = \int_{-\pi}^{\pi} f(\omega)\sum_{k=1}^{m}\sum_{j=1}^{m} e^{i\omega(k-j)}\,d\omega = \int_{-\pi}^{\pi}\left|\frac{1 - e^{i\omega m}}{1 - e^{i\omega}}\right|^2 f(\omega)\,d\omega = \int_{-\pi}^{\pi}\frac{1 - \cos\omega m}{1 - \cos\omega}\,f(\omega)\,d\omega. \qquad (8.5)$$

Now we can give the definition of a second-order self-similar process.

Definition 8.2.1. A stationary process $X = (\ldots, X_{-1}, X_0, X_1, \ldots)$ with finite mean $\mu = EX_i < \infty$ and variance $\sigma^2 = \operatorname{Var} X_i < \infty$ is called exactly second-order self-similar with parameter $0 < \beta < 1$ if
$$r(k) = \tfrac{1}{2}\bigl[(k+1)^{2-\beta} - 2k^{2-\beta} + (k-1)^{2-\beta}\bigr] \qquad (8.6)$$
for all $k = 1, 2, \ldots$. The parameter $\beta$ and the Hurst parameter $H$ are related as $H = 1 - \beta/2$, $\tfrac{1}{2} < H < 1$.

Let us discuss the given definition. Substituting Eq. (8.6) into (8.1), it is easy to see that for a self-similar process
$$\operatorname{Var} x^{(m)}_0 = \sigma^2 m^{2-\beta}, \qquad m = 1, 2, \ldots. \qquad (8.7)$$
Conversely, from Eq. (8.2) we can easily conclude that if $\operatorname{Var} x^{(m)}_0 = \sigma^2 m^{2-\beta}$, then the autocorrelation function $r(k)$ satisfies Eq. (8.6). This means that Eq. (8.7) is equivalent to Eq. (8.6) and can be used in the above definition instead of Eq. (8.6).
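The equivalence of Eqs. (8.6) and (8.7) is easy to check numerically. The following sketch (not part of the original text) evaluates $r(k)$ from Eq. (8.6), inserts it into the variance relation (8.1), and compares the result with $\sigma^2 m^{2-\beta}$; the choice $\beta = 0.6$ (i.e., $H = 0.7$) and $\sigma^2 = 1$ is arbitrary.

```python
import numpy as np

def r(k, beta):
    """Exactly second-order self-similar autocorrelation, Eq. (8.6)."""
    k = float(k)
    return 0.5 * ((k + 1) ** (2 - beta) - 2 * k ** (2 - beta) + (k - 1) ** (2 - beta))

def var_of_sum(m, beta, sigma2=1.0):
    """Var x^(m)_0 computed from r(k) via the double-sum relation (8.1)."""
    return sigma2 * (m + 2 * sum((m - i) * r(i, beta) for i in range(1, m)))

beta = 0.6                      # Hurst parameter H = 1 - beta/2 = 0.7
for m in (1, 2, 5, 10, 100):
    print(m, var_of_sum(m, beta), m ** (2 - beta))   # the two columns should agree
```

The two printed columns agree to floating-point accuracy, which is exactly the content of Eq. (8.7).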
Substituting Eq. (8.7) into (8.3) we get
$$r^{(k)}(m) = r(m), \qquad m = \pm 1, \pm 2, \ldots, \quad k = 1, 2, \ldots. \qquad (8.8)$$
Equation (8.8) clarifies the sense of the definition of a self-similar process: the original process $X$ and its aggregated processes $X^{(m)}$ have the same correlation structure.

The spectral density function of the self-similar process was found by Sinai [14]:
$$f(\omega) = \sigma^2 c(\beta)\,(1 - \cos\omega)\sum_{k=-\infty}^{\infty}\frac{1}{|\omega/2\pi + k|^{3-\beta}}, \qquad (8.9)$$
where $c(\beta)$ is a normalizing constant. It is easy to see from Eq. (8.5) that, if the process $X$ has spectral density function (8.9), its variance is given by (8.7). As $\omega \to 0$, from Eq. (8.9) we obtain $f(\omega) \sim \mathrm{const}\cdot\omega^{-(1-\beta)}$. We use $f(x) \sim g(x)$ in the sense that the ratio $f(x)/g(x)$ tends to 1 in the indicated limit.

Now we conclude that an exactly second-order self-similar process can be defined via its autocorrelation function (8.6), the variance of the sum (8.7), or the spectral density function (8.9), and all these definitions are equivalent. Meanwhile, for the definition of an asymptotically second-order self-similar process, it is important to specify which characteristic is used. We will use the following definition.

Definition 8.2.2. A stationary process $X = (\ldots, X_{-1}, X_0, X_1, \ldots)$ with finite mean $\mu = EX_i < \infty$ and variance $\sigma^2 = \operatorname{Var} X_i < \infty$ is called asymptotically second-order self-similar with parameter $0 < \beta < 1$ if
$$\lim_{m\to\infty} r^{(m)}(k) = \tfrac{1}{2}\bigl[(k+1)^{2-\beta} - 2k^{2-\beta} + (k-1)^{2-\beta}\bigr] \qquad (8.10)$$
for all $k = 1, 2, \ldots$.

The sense of this definition is that, for sufficiently large $m$, the processes $X^{(m)}$ have essentially the same autocorrelation function, equal to (8.6). As can be seen from Eqs. (8.1) and (8.5), $\operatorname{Var} X^{(m)}_0$ is a double sum of the autocorrelation function or, equivalently, an integral of the spectral density function. These relations help to interpret the behavior of $r(k)$, $\operatorname{Var} X^{(m)}_0$, and $f(\omega)$ obtained from real traffic measurements. Usually self-similarity of data traffic is established by analyzing the variance $\operatorname{Var} X^{(m)}_0$ [5, 12]. However, the behavior of the variance $\operatorname{Var} X^{(m)}_0$ can be close to self-similar while the behavior of the autocorrelation is quite different from the theoretical one, since by Eq. (8.2) the autocorrelation function is a second difference of the variance $\operatorname{Var} x^{(m)}_0$. It is also important that, as can be seen from Eq. (8.5), even small harmonics at low frequencies can have a dramatic influence on the behavior of $\operatorname{Var} X^{(m)}_0$ over a substantial range of $m$ values.
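The variance-based (variance-time) analysis mentioned in the preceding paragraph can be summarized by a short estimation sketch; it is illustrative only and not taken from the chapter. The slope of $\log \operatorname{Var} X^{(m)}_0$ against $\log m$ is approximately $-\beta$, so $\hat H = 1 - \hat\beta/2$. The function names, the block sizes, and the white-noise test series are assumptions for the example.

```python
import numpy as np

def aggregated_variance(x, ms):
    """Sample Var X^(m)_0 for non-overlapping block means of the series x."""
    x = np.asarray(x, dtype=float)
    out = []
    for m in ms:
        n = (len(x) // m) * m
        block_means = x[:n].reshape(-1, m).mean(axis=1)   # X^(m)_i
        out.append(block_means.var())
    return np.array(out)

def estimate_hurst(x, ms):
    """Fit log Var X^(m) ~ -beta * log m and return H = 1 - beta/2."""
    v = aggregated_variance(x, ms)
    slope, _ = np.polyfit(np.log(ms), np.log(v), 1)
    beta_hat = -slope
    return 1.0 - beta_hat / 2.0

rng = np.random.default_rng(1)
ms = np.array([2 ** j for j in range(1, 11)])
print(estimate_hurst(rng.normal(size=2 ** 18), ms))
```

For the white-noise test series the estimate comes out near $H = 0.5$; long-range-dependent traffic gives values noticeably above 0.5.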
8.3 MODEL OF SELF-SIMILAR TRAFFIC

Consider a Poisson process on the time axis with intensity $\lambda$. Let $y = (\ldots, y_{-1}, y_0, y_1, \ldots)$, where $y_t$ is the number of Poisson points in the interval $[t, t+1)$, $t = 0, \pm 1, \pm 2, \ldots$. The random variables $y_t$ are independent and identically distributed with $\Pr\{y_t = k\} = (\lambda^k/k!)\,e^{-\lambda}$. Suppose that $y_t$ is the number of new active sources arriving to the system at moment $t$. To each source we assign a random variable $\tau_{t,i}$, the length of its active period, $\tau_{t,i} \in \{1, 2, \ldots\}$, $t = 0, \pm 1, \pm 2, \ldots$, $i = 1, 2, \ldots$, and a random process $c_{t,i}(n)$, the rate of cell generation at moment $n$ from the beginning of the active period, $c_{t,i}(n) \in \mathbb{R}_+$, $n = 0, 1, 2, \ldots$. The random pairs $(\tau_{t,i}, c_{t,i}(\cdot))$ are independent and identically distributed and also independent of the process $y$. We define our process $Y = (\ldots, Y_{-1}, Y_0, Y_1, \ldots)$ as
$$Y_t := \sum_{k=-\infty}^{t}\sum_{j=1}^{y_k} c_{k,j}(t-k)\,I\{\tau_{k,j} > t-k\}, \qquad (8.11)$$
where $I\{A\}$ is the indicator function of the event $A$:
$$I\{\tau_{k,j} > t-k\} = \begin{cases} 1, & \text{if } \tau_{k,j} > t-k,\\ 0, & \text{otherwise.}\end{cases}$$
Here and later we adopt the convention that a sum $\sum_{i=j}^{k}$ equals zero (whatever its summand) if $k < j$. The random variable $Y_t$ is the total cell generation rate of all active sources at time $t$. If $EY_t < \infty$, the process $Y$ is stationary and ergodic.

The process $Y$ can also be obtained as a sum of the processes of $M$ independent sources, where each source can be only in the active or the passive state. In the passive state a source does not generate cells, while in the active state it generates cells with rate $c_{t,i}(n)$. Suppose that the length of the passive period $u$ is an arbitrary random variable with mean $\bar u$, and the lengths of the active periods are the $\tau_{t,i}$. If we now let $M \to \infty$ so that $M/(\bar u + E\tau_{t,i}) \to \lambda$ and $\Pr\{u < M\} \to 0$, then the resulting process is statistically the same as the process $Y$.
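A minimal simulation sketch of the model (8.11) may help to make the construction concrete; it is not taken from the chapter. It assumes the special case the chapter later concentrates on: constant cell rate $c_{k,j}(n) = 1$ during the active period and a heavy-tailed (Pareto-type) integer session length. The function name `simulate_Y`, the Lomax-based sampler for $\tau$, and the parameter values are illustrative choices only.

```python
import numpy as np

def simulate_Y(T, lam, alpha, seed=0):
    """Simulate Y_t, t = 0..T-1, from Eq. (8.11) with constant rate c_{k,j}(n) = 1:
    y_t ~ Poisson(lam) new sessions arrive in slot t, and each stays active for a
    heavy-tailed number of slots tau with roughly P{tau > k} ~ k**(-alpha)."""
    rng = np.random.default_rng(seed)
    Y = np.zeros(T)
    for t in range(T):
        for _ in range(rng.poisson(lam)):
            # discrete Pareto-type duration: tau >= 1, tail index alpha
            tau = int(np.ceil(rng.pareto(alpha) + 1.0))
            Y[t:t + tau] += 1.0   # this source contributes rate 1 while it is active
    return Y

# alpha in (1, 2): finite mean but infinite variance of the active period.
Y = simulate_Y(T=50_000, lam=2.0, alpha=1.5)
print(Y.mean(), Y.var())
```

For a tail exponent $\alpha \in (1, 2)$ the session length has finite mean but infinite variance, which is the long-range-dependent regime; roughly, $\beta = \alpha - 1$ for this input.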
Let us calculate the mean, variance, and autocorrelation function of the process $Y$. From Eq. (8.11), using the fact that the variance of a sum of independent random variables equals the sum of the variances, we have
$$\sigma^2 = \operatorname{Var} Y_t = \sum_{k=-\infty}^{t}\operatorname{Var}\Biggl[\sum_{j=1}^{y_k} c_{k,j}(t-k)\,I\{\tau_{k,j} > t-k\}\Biggr].$$
Replacing $k$ by $i = t - k + 1$ and taking into account that the pairs $(\tau_{t,i}, c_{t,i}(\cdot))$ are identically distributed, we can write
$$\operatorname{Var} Y_t = \sum_{i=1}^{\infty}\operatorname{Var}\Biggl[\sum_{j=1}^{y_0} c_{0,j}(i-1)\,I\{\tau_{0,j} > i-1\}\Biggr]. \qquad (8.12)$$
Denote
$$\kappa_{i,j} := c_{0,j}(i-1)\,I\{\tau_{0,j} > i-1\}, \qquad i = 1, 2, \ldots,\; j = 1, 2, \ldots.$$
We have
$$S := E\Biggl[\sum_{j=1}^{y_0}\kappa_{i,j}\Biggr] = \lambda E\kappa_{i,1}.$$
Let us consider
$$\operatorname{Var}\Biggl[\sum_{j=1}^{y_0} c_{0,j}(i-1)\,I\{\tau_{0,j} > i-1\}\Biggr] = \operatorname{Var}\Biggl[\sum_{j=1}^{y_0}\kappa_{i,j}\Biggr] = E\Biggl[\sum_{j=1}^{y_0}\kappa_{i,j} - S\Biggr]^2 = \sum_{m=0}^{\infty}\Pr\{y_0 = m\}\,E\Biggl[\Bigl(\sum_{j=1}^{y_0}\kappa_{i,j} - S\Bigr)^2\Bigm|\, y_0 = m\Biggr]$$
$$= \sum_{m=0}^{\infty}\Pr\{y_0 = m\}\,E\Biggl[\sum_{j=1}^{m}\kappa_{i,j} - S\Biggr]^2 = \sum_{m=0}^{\infty}\Pr\{y_0 = m\}\Biggl[E\Bigl(\sum_{j=1}^{m}\kappa_{i,j}\Bigr)^2 - 2SmE\kappa_{i,1} + S^2\Biggr]$$
$$= \sum_{m=0}^{\infty}\Pr\{y_0 = m\}\bigl[m\operatorname{Var}\kappa_{i,1} + m^2(E\kappa_{i,1})^2 - 2SmE\kappa_{i,1} + S^2\bigr] = \lambda\operatorname{Var}\kappa_{i,1} + \lambda(E\kappa_{i,1})^2 = \lambda E\kappa_{i,1}^2, \qquad (8.13)$$
where $\Pr\{y_0 = m\}$ is the Poisson distribution. From Eqs. (8.12) and (8.13) we have
$$\sigma^2 = \operatorname{Var} Y_t = \lambda\sum_{i=1}^{\infty} E\kappa_{i,1}^2 = \lambda\sum_{i=1}^{\infty} E\bigl[c_{0,1}^2(i-1)\,I\{\tau_{0,1} > i-1\}\bigr].$$

Now we calculate the autocorrelation function $r(k)$. For any given $t$ and $k$ define the following random variables:
$$Z_{1,i} := \sum_{j=1}^{y_i} c_{i,j}(t-i)\,I\{\tau_{i,j} > t+k-i\}, \qquad Z_{2,i} := \sum_{j=1}^{y_i} c_{i,j}(t+k-i)\,I\{\tau_{i,j} > t+k-i\},$$
$$z := \sum_{i=-\infty}^{t}\sum_{j=1}^{y_i} c_{i,j}(t-i)\,I\{t+k-i \ge \tau_{i,j} > t-i\},$$
$$Z_1 := \sum_{i=-\infty}^{t} Z_{1,i}, \qquad Z_2 := \sum_{i=-\infty}^{t} Z_{2,i}, \qquad w := \sum_{i=t+1}^{t+k} Z_{2,i}. \qquad (8.14)$$
We have
$$Y_t = z + Z_1, \qquad Y_{t+k} = w + Z_2. \qquad (8.15)$$
Since $w$ and $z$ are independent random variables and also independent of $Z_1$ and $Z_2$,
$$r(k) = \frac{1}{\sigma^2}E\bigl[(Y_t - EY_t)(Y_{t+k} - EY_{t+k})\bigr] = \frac{1}{\sigma^2}E\bigl[(Z_1 - EZ_1)(Z_2 - EZ_2)\bigr].$$
Using the independence of $Z_{1,j}$ and $Z_{2,i}$ for $i \ne j$, we have
$$\sigma^2 r(k) = E\bigl[(Z_1 - EZ_1)(Z_2 - EZ_2)\bigr] = E\Biggl[\sum_{i=-\infty}^{t}(Z_{1,i} - EZ_{1,i})\sum_{i=-\infty}^{t}(Z_{2,i} - EZ_{2,i})\Biggr] = \sum_{i=-\infty}^{t} E\bigl[(Z_{1,i} - EZ_{1,i})(Z_{2,i} - EZ_{2,i})\bigr]. \qquad (8.16)$$
Consider the term $E[(Z_{1,i} - EZ_{1,i})(Z_{2,i} - EZ_{2,i})]$. For any given $i$ define the random variables
$$W_{1,j} := c_{i,j}(t-i)\,I\{\tau_{i,j} > t+k-i\}, \qquad W_{2,j} := c_{i,j}(t+k-i)\,I\{\tau_{i,j} > t+k-i\}.$$
We have
$$E\bigl[(Z_{1,i} - EZ_{1,i})(Z_{2,i} - EZ_{2,i})\bigr] = \sum_{m=0}^{\infty}\Pr\{y_0 = m\}\,E\bigl[(Z_{1,i} - EZ_{1,i})(Z_{2,i} - EZ_{2,i}) \bigm| y_i = m\bigr]$$
$$= \sum_{m=0}^{\infty}\Pr\{y_0 = m\}\,E\Biggl[\Bigl(\sum_{j=1}^{m} W_{1,j} - EZ_{1,i}\Bigr)\Bigl(\sum_{j=1}^{m} W_{2,j} - EZ_{2,i}\Bigr)\Biggr] = \sum_{m=0}^{\infty}\Pr\{y_0 = m\}\Biggl[E\Bigl(\sum_{j=1}^{m} W_{1,j}\Bigr)\Bigl(\sum_{j=1}^{m} W_{2,j}\Bigr) - EZ_{1,i}\,EZ_{2,i}\Biggr]. \qquad (8.17)$$
Using the fact that the pairs $(W_{1,j}, W_{2,j})$ are independent and identically distributed, we obtain
$$E\Biggl[\Bigl(\sum_{j=1}^{m} W_{1,j}\Bigr)\Bigl(\sum_{j=1}^{m} W_{2,j}\Bigr)\Biggr] = (m-1)m\,EW_{1,1}\,EW_{2,1} + m\,E[W_{1,1}W_{2,1}], \qquad EZ_{1,i} = \lambda EW_{1,1}, \qquad EZ_{2,i} = \lambda EW_{2,1}. \qquad (8.18)$$
Substituting Eq. (8.18) into (8.17) and taking into account that $y_0$ has a Poisson distribution, we get
$$E\bigl[(Z_{1,i} - EZ_{1,i})(Z_{2,i} - EZ_{2,i})\bigr] = \sum_{m=0}^{\infty}\Pr\{y_0 = m\}\bigl[\bigl((m-1)m - \lambda^2\bigr)EW_{1,1}\,EW_{2,1} + m\,EW_{1,1}W_{2,1}\bigr] = \lambda E[W_{1,1}W_{2,1}] = \lambda E\bigl[c_{i,1}(t-i)\,c_{i,1}(t+k-i)\,I\{\tau_{i,1} > t+k-i\}\bigr].$$
Finally, using Eq. (8.16), we have
$$\sigma^2 r(k) = \lambda\sum_{i=-\infty}^{t} E\bigl[c_{0,1}(t-i)\,c_{0,1}(t+k-i)\,I\{\tau_{0,1} > t+k-i\}\bigr] = \lambda\sum_{i=0}^{\infty} E\bigl[c_{0,1}(i)\,c_{0,1}(i+k)\,I\{\tau_{0,1} > i+k\}\bigr]. \qquad (8.19)$$
The mean value of $Y_t$ is
$$EY_t = \lambda\sum_{i=0}^{\infty} E\bigl[c_{0,1}(i)\,I\{\tau_{0,1} > i\}\bigr]. \qquad (8.20)$$

A self-similar process $Y$ can be obtained from the following theorem.

Theorem 8.3.1. The process $Y = (\ldots, Y_{-1}, Y_0, Y_1, \ldots)$ defined by Eq. (8.11), with finite mean $\mu = EY_t < \infty$ and variance $\sigma^2 = \operatorname{Var} Y_t < \infty$, is exactly second-order self-similar with parameter $0 < \beta < 1$ if
$$\sum_{i=0}^{\infty} E\bigl[c_{0,1}(i)\,c_{0,1}(i+k)\,I\{\tau_{0,1} > i+k\}\bigr] = \frac{\sigma^2}{2\lambda}\bigl[(k+1)^{2-\beta} - 2k^{2-\beta} + (k-1)^{2-\beta}\bigr] \qquad (8.21)$$
for all $k = 1, 2, \ldots$.

Proof. The theorem follows directly from the definition (8.6) of a self-similar process and the expression (8.19) for the autocorrelation function. □

Corollary 8.3.2. If the random process $c_{k,j}(i)$ is constant, $c_{k,j}(i) = c_{k,j}$ for all $i = 0, 1, 2, \ldots$, then the process $Y = (\ldots, Y_{-1}, Y_0, Y_1, \ldots)$ defined by Eq. (8.11), with finite mean $\mu = EY_t < \infty$ and variance $\sigma^2 = \operatorname{Var} Y_t < \infty$, is exactly second-order self-similar with parameter $0 < \beta < 1$ if
$$\Pr\{\tau_{0,1} > k\}\,E\bigl[c_{0,1}^2 \bigm| \tau_{0,1} > k\bigr] = -\frac{\sigma^2}{2\lambda}\,\Delta^3(k-1)^{2-\beta} := -\frac{\sigma^2}{2\lambda}\bigl[(k+2)^{2-\beta} - 3(k+1)^{2-\beta} + 3k^{2-\beta} - (k-1)^{2-\beta}\bigr] \qquad (8.22)$$
for all $k = 0, 1, 2, \ldots$, where, for convenience, we set $(-1)^{2-\beta} := 1$ and $0^{2-\beta} := 0$.

Proof. Since $c_{0,1}(i)$ does not depend on $i$, we can write
$$E\bigl[c_{0,1}(i)\,c_{0,1}(i+k)\,I\{\tau_{0,1} > i+k\}\bigr] = \Pr\{\tau_{0,1} > i+k\}\,E\bigl[c_{0,1}^2 \bigm| \tau_{0,1} > i+k\bigr].$$
Substituting this into Eq. (8.21) and subtracting Eq. (8.21) for $k$ and $k+1$, we obtain Eq. (8.22). □

Equation (8.22) can also be written in the following form:
$$\Pr\{\tau_{0,1} = k\}\,E\bigl[c_{0,1}^2 \bigm| \tau_{0,1} = k\bigr] = \frac{\sigma^2}{2\lambda}\,\Delta^4(k-2)^{2-\beta} := \frac{\sigma^2}{2\lambda}\bigl[(k+2)^{2-\beta} - 4(k+1)^{2-\beta} + 6k^{2-\beta} - 4(k-1)^{2-\beta} + (k-2)^{2-\beta}\bigr], \qquad k = 1, 2, \ldots. \qquad (8.23)$$

Corollary 8.3.3. If the random process $c_{k,j}(i)$ is independent of $\tau_{k,j}$ and stationary, then the process $Y = (\ldots, Y_{-1}, Y_0, Y_1, \ldots)$ defined by Eq. (8.11), with finite mean $\mu = EY_t < \infty$ and variance $\sigma^2 = \operatorname{Var} Y_t < \infty$, is exactly second-order self-similar with parameter $0 < \beta < 1$ if
$$\bigl(\sigma_c^2 r_c(k) + m_c^2\bigr)\Pr\{\tau_{0,1} > k\} = -\frac{\sigma^2}{2\lambda}\,\Delta^3(k-1)^{2-\beta} \qquad (8.24)$$
for all $k = 0, 1, 2, \ldots$, where $\sigma_c^2 r_c(k) := E\bigl[(c_{0,1}(i) - m_c)(c_{0,1}(i+k) - m_c)\bigr]$ and $m_c = Ec_{0,1}(i)$.

Proof. Since $c_{0,1}(i)$ is a stationary process and does not depend on $\tau_{0,1}$, we have
$$E\bigl[c_{0,1}(i)\,c_{0,1}(i+k)\,I\{\tau_{0,1} > i+k\}\bigr] = E\bigl[c_{0,1}(i)\,c_{0,1}(i+k)\bigr]\Pr\{\tau_{0,1} > i+k\} = \bigl(\sigma_c^2 r_c(k) + m_c^2\bigr)\Pr\{\tau_{0,1} > i+k\}.$$
Substituting this into Eq. (8.21) and subtracting Eq. (8.21) for $k$ and $k+1$, we obtain Eq. (8.24). □

Equation (8.24) can also be rewritten in the following form:
$$\bigl(\sigma_c^2 r_c(k) + m_c^2\bigr)\Pr\{\tau_{0,1} = k\} = \frac{\sigma^2}{2\lambda}\,\Delta^4(k-2)^{2-\beta}, \qquad k = 1, 2, \ldots. \qquad (8.25)$$

[...]

... for large values of $z$, while the process $Y^{(1)}_t$ will give an average load to the queueing system.

First, we derive an upper bound for the probability $\Pr\{n_t > z\}$. Define
$$S_k := \sum_{i=0}^{k-1} Y_{t-i}, \qquad S^{(j)}_k := \sum_{i=0}^{k-1} Y^{(j)}_{t-i}, \qquad j = 1, 2. \qquad (8.34)$$
We have
$$\Pr\{n_t > z\} = \Pr\Bigl\{\max_{k\ge 1}\{S_k - kC\} > z\Bigr\},$$
where $C$ is the constant service rate of the queue. For any $0 < \delta_1 < C - \mu$ and $0 < \delta_2 < 1$ the following inequality is true:

[...]
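Since the preview breaks off at this point, the analytical bound cannot be reproduced here. As a rough empirical counterpart (not taken from the chapter), the sketch below feeds the session traffic of Section 8.3 into a queue served at constant rate $C$ via the Lindley recursion $n_t = \max(0, n_{t-1} + Y_t - C)$ and estimates $\Pr\{n_t > z\}$ by the empirical tail. The generator, the parameter values, and the thresholds are illustrative assumptions.

```python
import numpy as np

def buffer_tail(Y, C, thresholds):
    """Lindley recursion n_t = max(0, n_{t-1} + Y_t - C) for a queue served at
    constant rate C, followed by an empirical estimate of P{n_t > z}."""
    n = 0.0
    occupancy = np.empty(len(Y))
    for t, y in enumerate(Y):
        n = max(0.0, n + y - C)
        occupancy[t] = n
    return {z: float(np.mean(occupancy > z)) for z in thresholds}

def simulate_Y(T, lam, alpha, seed=0):
    """Compact version of the session-level generator sketched after Eq. (8.11)."""
    rng = np.random.default_rng(seed)
    Y = np.zeros(T)
    for t in range(T):
        for _ in range(rng.poisson(lam)):
            tau = int(np.ceil(rng.pareto(alpha) + 1.0))  # heavy-tailed session length
            Y[t:t + tau] += 1.0                          # rate-1 source active for tau slots
    return Y

# Offered load is roughly lam * E[tau] / C; keep it below 1 so the queue is stable.
Y = simulate_Y(T=200_000, lam=2.0, alpha=1.5)
print(buffer_tail(Y, C=10.0, thresholds=[10, 50, 100, 500]))
```

With a heavy-tailed session length the estimated tail decays much more slowly in $z$ than it would for Poisson-type input, which is the phenomenon the bounds of this chapter quantify.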