
Networking Theory and Fundamentals - Lectures 9 & 10


TCOM 501: Networking Theory & Fundamentals
Lectures 9 & 10: The M/G/1 Queue
Prof. Yannis A. Korilis

10-2 Topics
- M/G/1 queue
- Pollaczek-Khinchin (P-K) formula
- Embedded Markov chain observed at departure epochs
- Pollaczek-Khinchin transform equation
- Queues with vacations
- Priority queueing

10-3 M/G/1 Queue
- Arrival process: Poisson with rate λ
- Single server, infinite waiting room
- Service times: independent, identically distributed, following a general distribution, and independent of the arrival process
- Main results: determine the average time a customer spends in the queue waiting for service (Pollaczek-Khinchin formula); the stationary distribution can be calculated only for special cases

10-4 M/G/1 Queue – Notation
- W_i: waiting time of customer i
- X_i: service time of customer i
- Q_i: number of customers waiting in queue (excluding the one in service) upon arrival of customer i
- R_i: residual service time seen by customer i, i.e. the time until the customer found in service by customer i completes service
- A_i: number of arrivals during the service time X_i of customer i

Service times:
- X_1, X_2, … are independent, identically distributed random variables, independent of the inter-arrival times
- They follow a general distribution characterized by its pdf f_X(x) or cdf F_X(x)
- Common mean E[X] = 1/µ and common second moment E[X²]

10-5 M/G/1 Queue – State Representation
- {N(t) : t ≥ 0} is not a Markov process: the time spent in each state is not exponential
- R(t) = the time until the customer that is in service at time t completes service
- {(N(t), R(t)) : t ≥ 0} is a continuous-time Markov process, but the state space is not a countable set
- Finding the stationary distribution can be a rather challenging task

Goals:
- Calculate the average number of customers and the average time delay without first calculating the stationary distribution
- Pollaczek-Khinchin (P-K) formula:
  E[W] = λ E[X²] / (2(1 − λE[X]))
- To find the stationary distribution, we use the embedded Markov chain defined by observing N(t) at departure epochs only, together with transform methods

10-6 A Result from Probability Theory
Proposition (sum of a random number of random variables): Let N be a random variable taking values 0, 1, 2, … with mean E[N], and let X_1, X_2, …, X_N be iid random variables with common mean E[X]. Then
  E[X_1 + … + X_N] = E[X] · E[N]
Proof: Given that N = n, the expected value of the sum is
  E[ ∑_{j=1}^{N} X_j | N = n ] = E[ ∑_{j=1}^{n} X_j ] = n E[X]
Then
  E[ ∑_{j=1}^{N} X_j ] = ∑_{n=0}^{∞} E[ ∑_{j=1}^{N} X_j | N = n ] P{N = n} = ∑_{n=0}^{∞} n E[X] P{N = n} = E[X] ∑_{n=0}^{∞} n P{N = n} = E[X] E[N]

10-7 Pollaczek-Khinchin Formula
- Assume the FCFS discipline
- The waiting time of customer i is
  W_i = R_i + X_{i−1} + X_{i−2} + … + X_{i−Q_i} = R_i + ∑_{j=i−Q_i}^{i−1} X_j
- Take expectations on both sides and let i → ∞, assuming the limits exist:
  E[W_i] = E[R_i] + E[ ∑_{j=i−Q_i}^{i−1} X_j ] = E[R_i] + E[X] E[Q_i]  ⇒  E[W] = E[R] + E[X] E[Q]
- The averages E[Q] and E[R] in the above equation are those seen by an arriving customer.
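The step E[∑ X_j] = E[X] E[Q_i] relies on the proposition of slide 10-6. A minimal Monte Carlo check is sketched below; NumPy and the particular choices N ~ Poisson, X ~ exponential are illustrative assumptions, not part of the lecture notes.

```python
import numpy as np

# Monte Carlo check of the proposition on slide 10-6:
#   E[X_1 + ... + X_N] = E[X] * E[N]
# for a random number N of iid terms X_j, with N independent of the X_j.
rng = np.random.default_rng(0)

mean_n = 4.0            # E[N]  (assumed: N ~ Poisson with mean 4)
mean_x = 0.5            # E[X]  (assumed: X ~ exponential with mean 0.5)
trials = 200_000

totals = np.empty(trials)
for k in range(trials):
    n = rng.poisson(mean_n)                        # random number of terms
    totals[k] = rng.exponential(mean_x, n).sum()   # sum of n iid terms

print("empirical E[X_1 + ... + X_N]:", totals.mean())    # close to 2.0
print("E[X] * E[N]                 :", mean_x * mean_n)   # exactly 2.0
```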
10-7 Pollaczek-Khinchin Formula (cont.)
- Poisson arrivals and lack of anticipation: the averages seen by an arriving customer are equal to the averages seen by an outside observer (PASTA property)
- Little's theorem applied to the waiting area only: E[Q] = λ E[W]
- Therefore
  E[W] = E[R] + E[X] · λ E[W] = E[R] + ρ E[W]  ⇒  E[W] = E[R] / (1 − ρ)
- ρ = λ E[X] = λ/µ is the utilization factor, the proportion of time the server is busy: ρ = λ E[X] = 1 − p_0
- It remains to calculate the average residual time E[R] = lim_{i→∞} E[R_i]

10-8 Average Residual Time
[Figure: sample path of the residual service time R(t) versus t, a sequence of triangles of heights X_1, X_2, …, X_{D(t)}.]
- Graphical calculation of the long-term average of the residual time
- Time-average residual time over [0, t]: (1/t) ∫_0^t R(s) ds
- Consider a time t such that R(t) = 0. Let D(t) be the number of departures in [0, t] and assume that R(0) = 0. From the figure:
  (1/t) ∫_0^t R(s) ds = (1/t) ∑_{i=1}^{D(t)} X_i²/2 = (1/2) · (D(t)/t) · ( ∑_{i=1}^{D(t)} X_i² / D(t) )
  ⇒ lim_{t→∞} (1/t) ∫_0^t R(s) ds = (1/2) · lim_{t→∞} D(t)/t · lim_{t→∞} ∑_{i=1}^{D(t)} X_i² / D(t)
- Ergodicity: long-term time averages equal steady-state averages (with probability 1):
  E[R] = lim_{i→∞} E[R_i] = lim_{t→∞} (1/t) ∫_0^t R(s) ds

10-9 Average Residual Time (cont.)
- lim_{t→∞} D(t)/t is the long-term average departure rate; it should equal the long-term average arrival rate. Long-term averages equal steady-state averages (with probability 1): lim_{t→∞} D(t)/t = λ
- Law of large numbers:
  lim_{t→∞} ∑_{i=1}^{D(t)} X_i² / D(t) = lim_{n→∞} ∑_{i=1}^{n} X_i² / n = E[X²]
- Average residual time: E[R] = (1/2) λ E[X²]
- P-K formula: E[W] = E[R] / (1 − ρ) = λ E[X²] / (2(1 − ρ))

10-10 P-K Formula
P-K formula: E[W] = E[R] / (1 − ρ) = λ E[X²] / (2(1 − ρ))
- Average time a customer spends in the system: E[T] = E[X] + E[W] = 1/µ + λ E[X²] / (2(1 − ρ))
- Average number of customers waiting for service: E[Q] = λ E[W] = λ² E[X²] / (2(1 − ρ))
- Average number of customers in the system (waiting or in service): E[N] = λ E[T] = ρ + λ² E[X²] / (2(1 − ρ))
The averages E[W], E[T], E[Q], E[N] depend on the first two moments of the service time.

[…]

[…] If the moment generating function M_X(t) exists and is finite in some neighborhood of t = 0, it determines the distribution (pdf or pmf) of X uniquely.
Theorem 2: For any positive integer n:
  1. d^n/dt^n M_X(t) = E[X^n e^{tX}]
  2. d^n/dt^n M_X(0) = E[X^n]
Theorem 3: If X and Y are independent random variables: M_{X+Y}(t) = M_X(t) M_Y(t)

10-19 Z-Transforms of Discrete Random Variables
- For a discrete random variable, the moment generating function […] to set z = e^t and define the z-transform (or characteristic function):
  G_X(z) = E[z^X] = ∑_{x_j} z^{x_j} P{X = x_j}
- Let X be a discrete random variable taking values 0, 1, 2, …, and let p_n = P{X = n}. The z-transform is well-defined for |z| < 1:
  G_X(z) = p_0 + z p_1 + z² p_2 + z³ p_3 + … = ∑_{n=0}^{∞} p_n z^n
- The z-transform uniquely determines the distribution of X
- If X and Y are independent random variables: […]
- […] factorial moments can be calculated similarly

10-20 Continuous Random Variables
Distribution (parameters) | pdf f_X(x) | Moment gen. fun. M_X(t) | Mean E[X] | Variance Var(X)
Uniform over (a, b)       | 1/(b − a)  | (e^{tb} − e^{ta}) / (t(b − a)) | (a + b)/2 | (b − a)²/12
Exponential (λ)           | λ e^{−λx}  | λ/(λ − t)                      | 1/λ       | 1/λ²
Normal (µ, σ²)            | (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)} | e^{µt + σ²t²/2} | µ | σ²

[Surviving entries from the companion table of discrete random variables: geometric, M_X(t) = p e^t / (1 − (1 − p) e^t); Poisson, M_X(t) = e^{λ(e^t − 1)}.]

10-22 P-K Transform Equation
We have established:
  L_j = L_{j−1} − 1{L_{j−1} > 0} + A_j = (L_{j−1} − 1)^+ + A_j
Let π_n = lim_{j→∞} P{L_j = n} be the stationary distribution and G_L(z) = ∑_{n=0}^{∞} π_n z^n its z-transform. Noting that (L_{j−1} − 1)^+ and A_j are independent, we have:
  E[z^{L_j}] = E[z^{(L_{j−1} − 1)^+}] · E[z^{A_j}]
At steady state, L_j and L_{j−1} are statistically […]
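To make the z-transform of slide 10-19 concrete, here is a small sketch in plain Python. The Poisson example is an illustrative assumption (its moment generating function appears in the table fragment above, and setting z = e^t gives e^{λ(z − 1)}); the derivative-at-1 step G_X'(1) = E[X] is a standard property of the z-transform that the preview cuts off before stating.

```python
import math

# Build G_X(z) = sum_n p_n z^n directly from the pmf of a Poisson(lam)
# random variable, check it against the closed form exp(lam*(z - 1)),
# and recover the first factorial moment G_X'(1) = E[X] numerically.
lam = 2.5
n_max = 80                       # truncation point; the tail mass is negligible


def poisson_pmf(n: int) -> float:
    """p_n = P{X = n} for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**n / math.factorial(n)


def g(z: float) -> float:
    """z-transform G_X(z) = sum_n p_n z^n, truncated at n_max."""
    return sum(poisson_pmf(n) * z**n for n in range(n_max + 1))


# G_X(z) computed from the pmf agrees with the closed form exp(lam*(z - 1)).
for z in (0.2, 0.5, 0.9):
    print(z, g(z), math.exp(lam * (z - 1)))

# First factorial moment via a numerical derivative at z = 1: G_X'(1) = E[X].
h = 1e-6
print("G_X'(1) ≈", (g(1 + h) - g(1 - h)) / (2 * h), "  E[X] =", lam)
```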
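Finally, a simulation-level check of the section's main result, the P-K formula of slides 10-9 and 10-10. The sketch below generates waiting times with the Lindley recursion W_{i+1} = max(0, W_i + X_i − T_{i+1}); NumPy, the Lindley recursion itself, and the Uniform(0, 1) service distribution are assumptions introduced for illustration, not part of the lecture notes.

```python
import numpy as np

# Simulate an M/G/1 queue and compare the empirical mean waiting time with
# the P-K formula  E[W] = lam * E[X^2] / (2 * (1 - rho)),  rho = lam * E[X].
rng = np.random.default_rng(1)

lam = 1.5                          # Poisson arrival rate
n_customers = 500_000

service = rng.uniform(0.0, 1.0, n_customers)            # X_i: E[X]=1/2, E[X^2]=1/3
interarrival = rng.exponential(1.0 / lam, n_customers)   # T_i: exponential, mean 1/lam

w = np.zeros(n_customers)                                # W_0 = 0
for i in range(n_customers - 1):
    # Lindley recursion: wait of next customer given wait and service of this one
    w[i + 1] = max(0.0, w[i] + service[i] - interarrival[i + 1])

rho = lam * 0.5                    # utilization, lam * E[X] = 0.75
pk = lam * (1.0 / 3.0) / (2.0 * (1.0 - rho))

print("simulated E[W]:", w.mean())   # should be close to 1.0
print("P-K formula   :", pk)         # = 1.5 * (1/3) / (2 * 0.25) = 1.0
```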

