
Queueing Systems I : Chap. 2 Some Important Random Processes



  1. Queueing Systems I : Chap. 2 Some Important Random Processes S.Y. Yang

  2. What is a “Queue”? • Any system where jobs/customers/users arrive looking for service and depart once service is provided.

  3. Notation • Cn • n-th customer to enter the system • N(t) • number of customers in the system at time t • U(t) • the unfinished work in the system at time t • the remaining time required to empty the system of all customers present at time t • The system is said to be busy when U(t) > 0 and idle only when U(t) = 0.

  4. continued

  5. Time-diagram notation for queues

  6. Arrivals and departures • α(t) : number of arrivals in (0,t) • δ(t) : number of departures in (0,t) • N(t) : the number in the system at time t = α(t) − δ(t) • γ(t) : the total time all customers have spent in the system during (0,t) • λt : average arrival rate in (0,t) = α(t)/t • Tt : average system time per customer in (0,t) = γ(t)/α(t) • N̄t : average number of customers during (0,t) = γ(t)/t

  7. Little’s Theorem • N̄t = γ(t)/t = (γ(t)/α(t))·(α(t)/t) = Tt·λt • Let t → ∞ : N̄ = λT The average number of customers in a queueing system is equal to the average arrival rate of customers to that system, times the average time spent in that system.
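
A minimal numerical check of this identity (ours, not from the slides; the helper name littles_check and the sample times are made up): from per-customer arrival and departure times we compute α(t), γ(t), and the three time averages of slide 6, and confirm that N̄t equals λt·Tt.

```python
# Minimal sketch: verify N_t = lambda_t * T_t on synthetic data.
# gamma(t) = total time all customers spent in the system during (0, t),
# i.e. the integral of N(s) over (0, t).

def littles_check(arrivals, departures, t):
    """arrivals[i] and departures[i] are the i-th customer's times."""
    alpha = sum(1 for a in arrivals if a <= t)               # arrivals in (0, t)
    gamma = sum(min(d, t) - a                                # time in system during (0, t)
                for a, d in zip(arrivals, departures) if a <= t)
    lam_t = alpha / t        # average arrival rate in (0, t)
    T_t = gamma / alpha      # average system time per customer
    N_t = gamma / t          # time-average number in system
    return N_t, lam_t * T_t  # identical, by the algebra on this slide

arrivals   = [0.5, 1.2, 2.0, 3.7, 5.1]
departures = [1.0, 2.5, 4.0, 4.5, 6.0]
print(littles_check(arrivals, departures, t=6.0))  # (0.9166..., 0.9166...)
```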

  8. ρ (utilization factor) • ρ ≜ (average arrival rate of customers) × (average service time) = λx̄ • In the case of multiple servers • ρ = E[fraction of busy servers]
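
A short worked example (ours, not from the slides): if customers arrive at an average rate λ = 2 per second and each requires x̄ = 0.4 s of service on average, a single server has ρ = λx̄ = 0.8, i.e., it is busy 80% of the time; with m identical servers the expected number of busy servers is λx̄, so the expected fraction of busy servers is ρ = λx̄/m.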

  9. Discrete-time Markov Chains • Definition : The sequence of random variables X1, X2, … forms a discrete-time Markov chain if P[Xn+1 = j | Xn = i, Xn−1 = in−1, …, X1 = i1] = P[Xn+1 = j | Xn = i] for all n and all states. • The right side of the above equation is referred to as the (one-step) transition probability pij.

  10. Homogeneous Markov chain • If it turns out that the transition probabilities are independent of n, then we have what is referred to as a homogeneous Markov chain.

  11. Irreducible Markov Chain • We say that a Markov chain is irreducible if every state can be reached from every other state; that is, for each pair of states (Ei and Ej) there exists an integer m0 (which may depend upon i and j) such that pij(m0) > 0.

  12. More definitions about Markov chains • Closed • Absorbing state • Reducible • Recurrent/Transient • Periodic/Aperiodic • Mean recurrence time • Recurrent null/Recurrent nonnull • πj(n) ≜ P[system in state Ej at the n-th step]

  13. Irreducible MC: Theorems • Theorem 1 : The states of an irreducible Markov chain are either all transient, or all recurrent nonnull, or all recurrent null. If periodic, then all states have the same period. • Theorem 2 : In an irreducible and aperiodic homogeneous Markov chain the limiting probabilities πj = lim n→∞ πj(n) always exist and are independent of the initial state probability distribution.

  14. Moreover, either • A. all states are transient or all states are recurrent null, in which case πj = 0 for all j and there exists no stationary distribution, or • B. all states are recurrent nonnull, and then πj > 0 for all j, in which case the set {πj} is a stationary probability distribution; the πj are uniquely determined by πj = Σi πi pij together with Σj πj = 1, and πj = 1/Mj where Mj is the mean recurrence time of Ej.
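
In case B the stationary distribution can be computed numerically by solving π = πP together with Σj πj = 1. The sketch below is our illustration (the 3-state matrix P is made up), using NumPy's least-squares solver on the combined system.

```python
# Minimal sketch: stationary distribution of an irreducible, aperiodic,
# recurrent nonnull chain (case B), found from pi P = pi and sum(pi) = 1.
import numpy as np

P = np.array([[0.50, 0.50, 0.00],    # made-up one-step transition matrix
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

n = P.shape[0]
A = np.vstack([P.T - np.eye(n),      # (P^T - I) pi = 0, i.e. pi P = pi ...
               np.ones(n)])          # ... plus the normalization sum(pi) = 1
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)                            # [0.25 0.5 0.25] for this P
```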

  15. Ergodicity • A state is said to be ergodic if it is aperiodic, recurrent, and nonnull. • Moreover, a Markov chain is said to be ergodic if the probability distribution {πj(n)}, as a function of n, always converges to a limiting stationary distribution {πj} which is independent of the initial state distribution.

  16. Transition probability matrix • Transition probability matrix P ≜ [pij] • Probability vector π(n) ≜ (π0(n), π1(n), π2(n), …)

  17. Transient behavior of the system • The probability vector at time n : π(n) = π(n−1)P = π(0)P^n
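
A minimal sketch of this relation (ours; the 2-state matrix P and initial vector are made up), computing π(n) = π(0)P^n directly and watching it approach the limiting distribution of slide 14:

```python
# Minimal sketch: transient behavior pi^(n) = pi^(0) P^n.
import numpy as np

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi0 = np.array([1.0, 0.0])            # start in state E0 with probability 1

for n in (1, 5, 20):
    print(n, pi0 @ np.linalg.matrix_power(P, n))
# The vectors approach the limiting distribution [0.8, 0.2].
```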

  18. General solution using z-Transforms • Defining the vector transform Π(z) ≜ Σn≥0 π(n) z^n and using π(n+1) = π(n)P gives Π(z) = π(0)[I − zP]^(−1). • Inverting this transform term by term recovers the transient probabilities π(n).

  19. Memoryless Property of MC • P[system remains in Ei for exactly m additional steps, given that it has just entered Ei] = (1 − pii) pii^m • Geometric Distribution!
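
A small simulation of this property (ours; pii = 0.7 is an arbitrary choice): at each step the chain stays in Ei with probability pii, and the empirical distribution of the number of additional steps is compared against (1 − pii)·pii^m.

```python
# Minimal sketch: the sojourn time (in steps) in a state of a discrete-time MC
# is geometric, P[exactly m additional steps] = (1 - p_ii) * p_ii**m.
import random

p_ii, trials = 0.7, 100_000
counts = {}
for _ in range(trials):
    m = 0
    while random.random() < p_ii:   # stay one more step with probability p_ii
        m += 1
    counts[m] = counts.get(m, 0) + 1

for m in range(5):
    empirical = counts.get(m, 0) / trials
    print(m, round(empirical, 4), round((1 - p_ii) * p_ii**m, 4))
```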

  20. General (Nonhomogeneous) MC • Multistep transition probabilities : pij(m,n) ≜ P[Xn = j | Xm = i], m ≤ n • One-step transition probability matrix : P(n) ≜ [pij(n−1,n)] • Multistep transition probability matrix : H(m,n) ≜ [pij(m,n)]

  21. Chapman-Kolmogorov equation • pij(m,n) = Σk pik(m,q) pkj(q,n) for any q with m ≤ q ≤ n • Rewritten in matrix form as H(m,n) = H(m,q)H(q,n) • Let q = n−1 : H(m,n) = H(m,n−1)P(n) • Let q = m+1 : H(m,n) = P(m+1)H(m+1,n) • Time-dependent probabilities : π(n) = π(m)H(m,n)
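
A quick numerical check of the matrix form (ours; the step-dependent one-step matrices are made up): building H(m,n) as the product P(m+1)···P(n) and verifying H(0,3) = H(0,2)·H(2,3).

```python
# Minimal sketch: Chapman-Kolmogorov in matrix form, H(m, n) = H(m, q) H(q, n),
# for a nonhomogeneous chain with step-dependent one-step matrices.
import numpy as np

P = {1: np.array([[0.9, 0.1], [0.2, 0.8]]),
     2: np.array([[0.7, 0.3], [0.5, 0.5]]),
     3: np.array([[0.6, 0.4], [0.1, 0.9]])}

def H(m, n):
    """Multistep transition matrix H(m, n) = P(m+1) P(m+2) ... P(n)."""
    out = np.eye(2)
    for k in range(m + 1, n + 1):
        out = out @ P[k]
    return out

print(np.allclose(H(0, 3), H(0, 2) @ H(2, 3)))   # True
```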

  22. Continuous-time MC • Definition : P[X(tn+1) = j | X(tn) = in, X(tn−1) = in−1, …, X(t1) = i1] = P[X(tn+1) = j | X(tn) = in] for any t1 < t2 < … < tn+1 • Memoryless Property : the future evolution depends only on the current state, not on the past history or on how long the process has already been in that state.

  23. Continuous-time MC • The memoryless property leads to the exponential distribution : the pdf for the time the process spends in state Ei is exponential, with a rate parameter that depends only on Ei (the total rate at which the process leaves Ei).

  24. Continuous-time C-K equation • Def. of time-dependent transition probability : pij(s,t) ≜ P[X(t) = j | X(s) = i], s ≤ t • Continuous-time MC C-K equation : pij(s,t) = Σk pik(s,u) pkj(u,t), s ≤ u ≤ t • Represented in matrix form : H(s,t) = H(s,u)H(u,t), where H(s,t) ≜ [pij(s,t)]

  25. continued • If we define Q(t) ≜ lim Δt→0 [H(t, t+Δt) − I]/Δt • Q(t) is called the transition rate matrix; its off-diagonal entries are the instantaneous rates of moving between states, and each of its rows sums to zero.

  26. continued • Forward C-K equation : ∂H(s,t)/∂t = H(s,t)Q(t) • Backward C-K equation : ∂H(s,t)/∂s = −Q(s)H(s,t)

  27. State probabilities • πj(t) ≜ P[X(t) = Ej] • In vector form π(t) = π(s)H(s,t), which gives dπ(t)/dt = π(t)Q(t)

  28. Homogeneous Continuous-time MC • The transition rates are independent of time : Q(t) = Q and H(s,t) depends only on t − s • The state probabilities then satisfy dπ(t)/dt = π(t)Q, with equilibrium solution πQ = 0, Σj πj = 1
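
With a constant Q the solution of dπ(t)/dt = π(t)Q is π(t) = π(0)·e^(Qt), which can be evaluated with a matrix exponential. A minimal sketch (ours; it assumes SciPy is available, and the 2-state rate matrix is made up):

```python
# Minimal sketch: state probabilities of a homogeneous continuous-time MC,
# pi(t) = pi(0) expm(Q t), where each row of the rate matrix Q sums to zero.
import numpy as np
from scipy.linalg import expm

Q = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])
pi0 = np.array([1.0, 0.0])

for t in (0.1, 1.0, 10.0):
    print(t, pi0 @ expm(Q * t))
# The vectors approach the equilibrium [1/3, 2/3], the solution of pi Q = 0.
```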

  29. Birth-Death Process • A Markov process in which transitions from state Ek are permitted only to the neighboring states Ek+1, Ek, and Ek−1. • Birth : a transition from Ek to Ek+1. • Death : a transition from Ek+1 to Ek. • Birth rate : λk • Death rate : μk

  30. continued • Birth and death rates are independent of time and depend only on Ek. • The birth-death process is therefore a continuous-time homogeneous MC.

  31. What do we wish to solve for? • The probability that the population size is k at some time t : Pk(t) ≜ P[N(t) = k]

  32. continued • The forward equations for the birth-death process are, for k ≥ 1 : dPk(t)/dt = −(λk + μk)Pk(t) + λk−1 Pk−1(t) + μk+1 Pk+1(t) • and for k = 0 : dP0(t)/dt = −λ0 P0(t) + μ1 P1(t)

  33. Pure Birth Process (Poisson Process) • μk = 0 for all k, and λk = λ (a constant) for all k • Solving the equations above with P0(0) = 1 gives Pk(t) = (λt)^k e^(−λt) / k!, the Poisson distribution
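
A small simulation of the pure birth process (ours; the rate λ = 1.5 and horizon t = 2 are arbitrary): inter-birth times are exponential with rate λ, and the empirical distribution of the population size N(t) is compared against the Poisson pmf (λt)^k·e^(−λt)/k!.

```python
# Minimal sketch: a pure birth process with constant rate lam and no deaths
# has N(t) ~ Poisson(lam * t).
import math
import random

lam, t, trials = 1.5, 2.0, 100_000
counts = {}
for _ in range(trials):
    clock, k = 0.0, 0
    while True:
        clock += random.expovariate(lam)   # exponential inter-birth time
        if clock > t:
            break
        k += 1
    counts[k] = counts.get(k, 0) + 1

for k in range(6):
    empirical = counts.get(k, 0) / trials
    poisson = (lam * t) ** k * math.exp(-lam * t) / math.factorial(k)
    print(k, round(empirical, 4), round(poisson, 4))
```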

  34. Summary • Notations • Arrivals and Departures • Little’s Theorem • Discrete-time Markov Chain • Continuous-time Markov Chain

  35. References • Leonard Kleinrock, Queueing Systems, Volume I: Theory, John Wiley & Sons, 1975. http://www.lk.cs.ucla.edu/ • Lecture notes by Professor J.K. Choi, http://vega.icu.ac.kr/~bnec/ • Lecture notes by Professor S.K. Bose, http://home.iitk.ac.in/~skb/ee679/ee679.html
