
TCOM 501: Networking Theory & Fundamentals


Presentation Transcript


  1. TCOM 501: Networking Theory & Fundamentals Lecture 2 January 22, 2003 Prof. Yannis A. Korilis

  2. Topics • Delay in Packet Networks • Introduction to Queueing Theory • Review of Probability Theory • The Poisson Process • Little’s Theorem • Proof and Intuitive Explanation • Applications

  3. Sources of Network Delay • Processing Delay • Assume processing power is not a constraint • Queueing Delay • Time buffered waiting for transmission • Transmission Delay • Time to transmit all bits of the packet onto the link • Propagation Delay • Time spent on the link – propagation of the electrical signal • Independent of traffic carried by the link • Focus: Queueing & Transmission Delay

  4. Basic Queueing Model [Figure: arrivals enter a buffer (queued customers), move into service at the server(s), and then depart] • A queue models any service station with: • One or multiple servers • A waiting area or buffer • Customers arrive to receive service • A customer that does not find a free server upon arrival waits in the buffer

  5. Characteristics of a Queue • Number of servers m: one, multiple, infinite • Buffer size b • Service discipline (scheduling): FCFS, LCFS, Processor Sharing (PS), etc. • Arrival process • Service statistics

  6. Arrival Process • τn: interarrival time between customers n and n+1 • τn is a random variable • {τn, n ≥ 1} is a stochastic process • Interarrival times are identically distributed and have a common mean E[τn] = E[τ] = 1/λ • λ is called the arrival rate

  7. Service-Time Process • sn: service time of customer n at the server • {sn, n ≥ 1} is a stochastic process • Service times are identically distributed with common mean E[sn] = E[s] = 1/μ • μ is called the service rate • For packets, are the service times really random?

  8. Queue Descriptors • Generic descriptor: A/S/m/k • A denotes the arrival process • For Poisson arrivals we use M (for Markovian) • S denotes the service-time distribution • M: exponential distribution • D: deterministic service times • G: general distribution • m is the number of servers • k is the max number of customers allowed in the system – either in the buffer or in service • k is omitted when the buffer size is infinite

  9. Queue Descriptors: Examples • M/M/1: Poisson arrivals, exponentially distributed service times, one server, infinite buffer • M/M/m: same as previous with m servers • M/M/m/m: Poisson arrivals, exponentially distributed service times, m servers, no buffering • M/G/1: Poisson arrivals, identically distributed service times following a general distribution, one server, infinite buffer • */D/∞: a constant-delay system

  10. Probability Fundamentals • Exponential Distribution • Memoryless Property • Poisson Distribution • Poisson Process • Definition and Properties • Interarrival Time Distribution • Modeling Arrival and Service Statistics

  11. The Exponential Distribution • A continuous RV X follows the exponential distribution with parameter μ if its probability density function is: f_X(x) = μ e^{−μx} for x ≥ 0, and 0 otherwise • Probability distribution function: F_X(x) = P{X ≤ x} = 1 − e^{−μx}, x ≥ 0

  12. Exponential Distribution (cont.) • Mean and Variance: E[X] = 1/μ, Var(X) = 1/μ² • Proof: see the sketch below
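
The proof equations did not survive extraction; a standard derivation, consistent with the density defined on slide 11, is:

    E[X] = ∫₀^∞ x μ e^{−μx} dx = [−x e^{−μx}]₀^∞ + ∫₀^∞ e^{−μx} dx = 0 + 1/μ = 1/μ
    E[X²] = ∫₀^∞ x² μ e^{−μx} dx = 2/μ²   (integrating by parts twice)
    Var(X) = E[X²] − (E[X])² = 2/μ² − 1/μ² = 1/μ²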

  13. Memoryless Property • Past history has no influence on the future: P{X > s + t | X > s} = P{X > t} for all s, t ≥ 0 • Proof: P{X > s + t | X > s} = P{X > s + t} / P{X > s} = e^{−μ(s+t)} / e^{−μs} = e^{−μt} = P{X > t} • Exponential: the only continuous distribution with the memoryless property

  14. Poisson Distribution • A discrete RV X follows the Poisson distribution with parameter λ if its probability mass function is: P{X = k} = e^{−λ} λ^k / k!, k = 0, 1, 2, … • Wide applicability in modeling the number of random events that occur during a given time interval – the Poisson Process: • Customers that arrive at a post office during a day • Wrong phone calls received during a week • Students that go to the instructor’s office during office hours • … and packets that arrive at a network switch

  15. Poisson Distribution (cont.) • Mean and Variance: E[X] = λ, Var(X) = λ • Proof: see the sketch below
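
The proof equations were lost; a standard derivation consistent with the pmf on slide 14 is:

    E[X] = Σ_{k=0}^∞ k e^{−λ} λ^k / k! = λ e^{−λ} Σ_{k=1}^∞ λ^{k−1}/(k−1)! = λ
    E[X(X−1)] = Σ_{k=2}^∞ k(k−1) e^{−λ} λ^k / k! = λ² e^{−λ} Σ_{k=2}^∞ λ^{k−2}/(k−2)! = λ²
    Var(X) = E[X(X−1)] + E[X] − (E[X])² = λ² + λ − λ² = λ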

  16. Sum of Poisson Random Variables • Xi, i = 1, 2, …, n, are independent RVs • Xi follows the Poisson distribution with parameter λi • Partial sum defined as: Sn = X1 + X2 + … + Xn • Sn follows the Poisson distribution with parameter λ = λ1 + λ2 + … + λn

  17. Sum of Poisson Random Variables (cont.)
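
The content of this proof slide was lost in extraction; a standard argument for n = 2 (the general case follows by induction), consistent with slide 16, computes the convolution of the two pmfs:

    P{X1 + X2 = k} = Σ_{m=0}^k P{X1 = m} P{X2 = k − m}
                   = Σ_{m=0}^k [e^{−λ1} λ1^m / m!] [e^{−λ2} λ2^{k−m} / (k−m)!]
                   = e^{−(λ1+λ2)} (1/k!) Σ_{m=0}^k C(k, m) λ1^m λ2^{k−m}
                   = e^{−(λ1+λ2)} (λ1 + λ2)^k / k!

which is the Poisson pmf with parameter λ1 + λ2.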

  18. Sampling a Poisson Variable • X follows the Poisson distribution with parameter λ • Each of the X arrivals is of type i with probability pi, i = 1, 2, …, n, independently of other arrivals; p1 + p2 + … + pn = 1 • Xi denotes the number of type i arrivals • X1, X2, …, Xn are independent • Xi follows the Poisson distribution with parameter λi = λpi

  19. Sampling a Poisson Variable (cont.)
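
The proof equations here were also lost; a sketch of the standard argument for n = 2 (types 1 and 2 with probabilities p and 1 − p), consistent with slide 18, conditions on the total X:

    P{X1 = k, X2 = m} = P{X1 = k, X2 = m | X = k + m} P{X = k + m}
                      = C(k+m, k) p^k (1−p)^m · e^{−λ} λ^{k+m} / (k+m)!
                      = [e^{−λp} (λp)^k / k!] · [e^{−λ(1−p)} (λ(1−p))^m / m!]

so X1 and X2 are independent Poisson RVs with parameters λp and λ(1 − p).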

  20. Poisson Approximation to Binomial • Binomial distribution with parameters (n, p): P{X = k} = C(n, k) p^k (1 − p)^{n−k}, k = 0, 1, …, n • As n→∞ and p→0, with np = λ moderate, the binomial distribution converges to the Poisson distribution with parameter λ • Proof: see the sketch below
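
A sketch of the standard limit argument, writing p = λ/n:

    P{X = k} = [n! / (k!(n−k)!)] (λ/n)^k (1 − λ/n)^{n−k}
             = (λ^k / k!) · [n(n−1)⋯(n−k+1)/n^k] · (1 − λ/n)^n · (1 − λ/n)^{−k}
             → (λ^k / k!) · 1 · e^{−λ} · 1 = e^{−λ} λ^k / k!   as n→∞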

  21. Poisson Process with Rate λ • {A(t): t ≥ 0} is a counting process • A(t) is the number of events (arrivals) that have occurred from time 0 – when A(0) = 0 – up to time t • A(t) − A(s) is the number of arrivals in the interval (s, t] • Numbers of arrivals in disjoint intervals are independent • The number of arrivals in any interval (t, t + τ] of length τ • Depends only on its length τ • Follows the Poisson distribution with parameter λτ • Average number of arrivals is λτ; λ is the arrival rate

  22. Interarrival-Time Statistics • Interarrival times of a Poisson process are independent and follow the exponential distribution with parameter λ • tn: time of the nth arrival; τn = tn+1 − tn: nth interarrival time • Proof: • Probability distribution function: P{τn ≤ s} = 1 − P{τn > s} = 1 − P{A(tn + s) − A(tn) = 0} = 1 − e^{−λs} • Independence follows from the independence of the numbers of arrivals in disjoint intervals
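
A small simulation can illustrate the equivalence stated on slides 21–22 (exponential interarrivals generate Poisson counts). This is an illustrative sketch, not part of the original lecture; the rate, horizon, and number of runs are arbitrary choices:

    import random
    import math

    def poisson_process_arrivals(rate, horizon):
        """Generate arrival times in [0, horizon) from i.i.d. exponential interarrivals."""
        arrivals, t = [], 0.0
        while True:
            t += random.expovariate(rate)   # exponential interarrival, mean 1/rate
            if t >= horizon:
                return arrivals
            arrivals.append(t)

    rate, tau, runs = 2.0, 3.0, 100_000
    counts = [len(poisson_process_arrivals(rate, tau)) for _ in range(runs)]

    # The empirical distribution of the number of arrivals in an interval of
    # length tau should match the Poisson pmf with parameter rate * tau.
    for k in range(8):
        empirical = counts.count(k) / runs
        pmf = math.exp(-rate * tau) * (rate * tau) ** k / math.factorial(k)
        print(f"k={k}: empirical={empirical:.4f}  Poisson pmf={pmf:.4f}")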

  23. Small Interval Probabilities • For an interval (t, t + δ] of length δ: • P{A(t + δ) − A(t) = 0} = 1 − λδ + o(δ) • P{A(t + δ) − A(t) = 1} = λδ + o(δ) • P{A(t + δ) − A(t) ≥ 2} = o(δ) • Proof: see the sketch below
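
A sketch of the standard proof, expanding the Poisson probabilities for an interval of length δ:

    P{A(t + δ) − A(t) = 0} = e^{−λδ} = 1 − λδ + (λδ)²/2 − … = 1 − λδ + o(δ)
    P{A(t + δ) − A(t) = 1} = λδ e^{−λδ} = λδ (1 − λδ + …) = λδ + o(δ)
    P{A(t + δ) − A(t) ≥ 2} = 1 − [1 − λδ + o(δ)] − [λδ + o(δ)] = o(δ)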

  24. Merging & Splitting Poisson Processes • A1, …, Ak: independent Poisson processes with rates λ1, …, λk • Merged into a single process A = A1 + … + Ak • A is a Poisson process with rate λ = λ1 + … + λk • A: Poisson process with rate λ • Split into processes A1 and A2 independently, with probabilities p and 1 − p respectively • A1 is Poisson with rate λ1 = λp; A2 is Poisson with rate λ2 = λ(1 − p)
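
A sketch of why merging works, using the earlier slides (the splitting direction is analogous, using the sampling result of slides 18–19): for any interval of length τ, A(t + τ) − A(t) = Σ_i [Ai(t + τ) − Ai(t)] is a sum of independent Poisson RVs with parameters λiτ, hence Poisson with parameter (λ1 + … + λk)τ by slides 16–17. Increments of A over disjoint intervals are independent because each Ai has independent increments and the processes are mutually independent, so A satisfies the definition on slide 21 with rate λ1 + … + λk.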

  25. Modeling Arrival Statistics • The Poisson process is widely used to model packet arrivals in numerous networking problems • Justification: provides a good model for the aggregate traffic of a large number of “independent” users • n traffic streams, with independent identically distributed (iid) interarrival times with common distribution F(s) – not necessarily exponential • Arrival rate of each stream: λ/n • As n→∞, the combined stream can be approximated by Poisson under mild conditions on F(s) – e.g., F(0) = 0, F′(0) > 0 • Most important reason for the Poisson assumption: analytic tractability of queueing models

  26. Little’s Theorem • λ: customer arrival rate • N: average number of customers in system • T: average delay per customer in system • Little’s Theorem: for a system in steady state, N = λT [Figure: arrivals at rate λ enter a system holding N customers on average, each spending time T on average]
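
An illustrative numeric example, not from the original slides: if packets arrive at a router at λ = 1000 packets/second and on average N = 50 packets are in the router (queued or in transmission), then by Little’s theorem the average time a packet spends in the router is T = N/λ = 50/1000 s = 50 ms.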

  27. Counting Processes of a Queue [Figure: sample paths of α(t), N(t), and β(t) versus time t] • N(t): number of customers in system at time t • α(t): number of customer arrivals up to time t • β(t): number of customer departures up to time t • Ti: time spent in the system by the ith customer

  28. Time Averages • Time averages over the interval [0, t]: Nt = (1/t) ∫0^t N(s) ds, λt = α(t)/t, Tt = (1/α(t)) Σ_{i=1}^{α(t)} Ti, δt = β(t)/t • Steady-state time averages: Nt → N, λt → λ, Tt → T, δt → δ as t → ∞ • Little’s theorem: N = λT • Applies to any queueing system provided that the limits T, λ, and δ exist, and λ = δ • We give a simple graphical proof under a set of more restrictive assumptions

  29. Proof of Little’s Theorem for FCFS • FCFS system, N(0) = 0 • α(t) and β(t): staircase graphs; N(t) = α(t) − β(t) • Shaded area between the graphs: ∫0^t N(s) ds [Figure: staircase plots of α(t) and β(t), with the delays T1, T2, …, Ti shown as horizontal strips between them] • Assumption: N(t) = 0 infinitely often. For any such t: ∫0^t N(s) ds = Σ_{i=1}^{α(t)} Ti, so Nt = (α(t)/t) · (1/α(t)) Σ_{i=1}^{α(t)} Ti = λt Tt • If the limits Nt → N, Tt → T, λt → λ exist, Little’s formula follows • We will relax the last assumption

  30. Proof of Little’s for FCFS (cont.) [Figure: same staircase plots of α(t) and β(t) with delays T1, T2, …, Ti] • In general – even if the queue is not empty infinitely often: Σ_{i=1}^{β(t)} Ti ≤ ∫0^t N(s) ds ≤ Σ_{i=1}^{α(t)} Ti, which gives δt · (1/β(t)) Σ_{i=1}^{β(t)} Ti ≤ Nt ≤ λt Tt • Result follows assuming the limits Tt → T, λt → λ, and δt → δ exist, and λ = δ
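
A short simulation can back up the graphical proof; this is an illustrative sketch, not part of the original slides. It simulates an M/M/1 FCFS queue (arrival and service rates chosen arbitrarily) and compares the time-average number in system with λ·T:

    import random

    def mm1_fcfs(lam, mu, num_customers, seed=1):
        """Simulate an M/M/1 FCFS queue; return (time-avg N, arrival rate, mean delay)."""
        random.seed(seed)
        arrival, depart = [], []
        t = 0.0
        for _ in range(num_customers):
            t += random.expovariate(lam)              # Poisson arrivals
            arrival.append(t)
        free_at = 0.0
        for a in arrival:
            start = max(a, free_at)                   # FCFS: wait for the server
            free_at = start + random.expovariate(mu)  # exponential service time
            depart.append(free_at)
        horizon = depart[-1]                          # last departure: system is empty here
        area = sum(d - a for a, d in zip(arrival, depart))   # = integral of N(s) over [0, horizon]
        return area / horizon, num_customers / horizon, area / num_customers

    N, lam_hat, T = mm1_fcfs(lam=0.8, mu=1.0, num_customers=200_000)
    # At a time when the system is empty, N_t = lambda_t * T_t holds exactly,
    # mirroring the key step of the graphical proof on slide 29.
    print(f"N = {N:.3f},  lambda*T = {lam_hat * T:.3f}")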

  31. Probabilistic Form of Little’s Theorem • Have considered a single sample function of a stochastic process • Now we focus on the probabilities of the various sample functions of a stochastic process • Probability of n customers in system at time t: pn(t) = P{N(t) = n} • Expected number of customers in system at t: E[N(t)] = Σ_{n=0}^∞ n pn(t)

  32. Probabilistic Form of Little (cont.) • pn(t) and E[N(t)] depend on t and on the initial distribution at t = 0 • We will consider systems that converge to steady state: there exist pn, independent of the initial distribution, such that pn(t) → pn as t → ∞ • Expected number of customers in steady state [stochastic average]: E[N] = Σ_{n=0}^∞ n pn • For an ergodic process, the time average of a sample function is equal to the steady-state expectation, with probability 1: N = lim_{t→∞} Nt = E[N]

  33. Probabilistic Form of Little (cont.) • In principle, we can find the probability distribution of the delay Ti for customer i, and from that the expected value E[Ti], which converges to a steady-state value E[T] = lim_{i→∞} E[Ti] • For an ergodic system: T = lim_{i→∞} (1/i) Σ_{k=1}^{i} Tk = E[T] with probability 1 • Probabilistic form of Little’s formula: E[N] = λ E[T] • Arrival rate defined as λ = lim_{t→∞} E[α(t)]/t

  34. Time vs. Stochastic Averages • “Time averages = Stochastic averages,” for all systems of interest in this course • It holds if a single sample function of the stochastic process contains all possible realizations of the process as t→∞ • Can be justified on the basis of general properties of Markov chains

  35. Moment Generating Function

  36. Discrete Random Variables

  37. Continuous Random Variables
