
Internet Traffic Modeling






Presentation Transcript


  1. Internet Traffic Modeling Authors: Date: 2009-11-17 Broadcom

  2. Why Model Internet Traffic? • Why model traffic? • The traffic model is key to determining the performance of the system: the more accurate the traffic model, the better the system's performance can be quantified. • The traffic model in the evaluation methodology document should focus on capturing the aspects of the application that place special demands on system performance. • In this case, long-range dependence (LRD) is the key characteristic to capture, because the high burstiness resulting from LRD places heavy demands on both the transport and the buffering capability of the system. • System Impact of Traffic Modeling • Network performance degrades gradually with increasing LRD (self-similarity). • The more self-similar the traffic, the slower the queue length decays. • Aggregating streams of self-similar traffic typically intensifies the self-similarity ("burstiness") rather than smoothing it. • The bursty behaviour exacerbates the clustering phenomenon and degrades network performance. • QoS depends on coping with traffic peaks; e.g., the video delay bound may be exceeded.

  3. Packet Counts From the Bellcore Measurements • In 1989, Leland and Wilson began taking high-resolution traffic traces at Bellcore • Ethernet traffic from a large research lab • 100 μs time stamps • Packet length, status, 60 bytes of data • Mostly IP traffic (a little NFS) • Four data sets over a three-year period • Over 100 million packets in the traces • Traces considered representative of normal use • A Poisson process: • When observed on a fine time scale, traffic appears bursty • When aggregated on a coarse time scale, traffic flattens (smooths) to white noise • A self-similar (fractal) process: • When aggregated over a wide range of time scales, traffic maintains its bursty characteristic
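To illustrate the contrast above, here is a minimal sketch (not from the slides; bin size and arrival rate are arbitrary choices) that aggregates a synthetic Poisson packet-count series at coarser and coarser time scales: its variance falls roughly like 1/m, i.e. the trace smooths out, whereas a self-similar trace would decay more slowly and keep a visible share of its burstiness.

```python
# Sketch: aggregate a synthetic Poisson packet-count series at increasing time
# scales m. Its variance decays roughly like 1/m (the trace smooths to white
# noise); a self-similar trace would instead decay like m^(-beta) with beta < 1.
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(lam=10.0, size=1_000_000)   # packets per bin; rate is illustrative

for m in (1, 10, 100, 1000):
    # X^(m): mean over non-overlapping blocks of m bins
    agg = counts[: len(counts) // m * m].reshape(-1, m).mean(axis=1)
    print(f"m={m:5d}  var(X^(m))={agg.var():8.4f}   Poisson prediction ~ {counts.var() / m:8.4f}")
```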

  4. Self-Similarity: Definition and Manifestations • Self-similarity manifests itself in several equivalent fashions: • Slowly decaying variance • Long-range dependence • Non-degenerate autocorrelations • The Hurst effect • Self-similar processes are the simplest way to model processes with long-range dependence – correlations that persist (do not degenerate) across large time scales • The autocorrelation function r(k) of a process with long-range dependence (a statistical measure of the relationship, if any, between the process and itself at different time lags) is not summable: • Σk r(k) = ∞ • r(k) ~ k^(−β) as k → ∞, for 0 < β < 1 • The autocorrelation function follows a power law • Slower decay than for an exponential (short-range) process • The power spectrum is hyperbolic, rising to ∞ at frequency 0 • If Σk r(k) < ∞, the process is short-range dependent • Hurst Parameter • Related to the autocorrelation decay exponent β by H = 1 − β/2
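A rough sketch of how the H = 1 − β/2 relation can be used in practice: estimate β from the slope of the sample autocorrelation on a log-log scale and convert it to H. The helper names, lag range, and placeholder input series are illustrative choices, not part of the slides, and the power-law fit is only meaningful when the correlations are actually long-ranged.

```python
# Sketch of the H = 1 - beta/2 relation: fit the power-law decay of the sample
# autocorrelation r(k) ~ k^(-beta) on a log-log scale. Helper names are ours.
import numpy as np

def sample_acf(x, max_lag):
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

def hurst_from_acf(x, lags=range(10, 200)):
    r = sample_acf(x, max(lags))
    k = np.array(list(lags))
    rk = r[k - 1]
    mask = rk > 0                      # the log-log fit needs positive correlations
    beta = -np.polyfit(np.log(k[mask]), np.log(rk[mask]), 1)[0]
    return 1.0 - beta / 2.0

# Example call on a placeholder series; replace with a per-bin packet-count trace.
x = np.random.default_rng(1).normal(size=50_000)
print("estimated H:", hurst_from_acf(x))
```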

  5. Analysis of Different Traffic • Ethernet Traffic [LT 94]: • Analysis of traffic logs, viewed as packets per time unit, found H to be between 0.8 and 0.95 • Aggregations over many orders of magnitude • Effects seem to increase over time • Initial looks at external traffic pointed to similar behavior • TCP Traffic [PF 95]: • Dominated by the diurnal traffic cycle • A simple statistical test was developed to assess the accuracy of the Poisson assumption: • Exponential distribution of interarrivals • Independence of interarrivals • TELNET and FTP connection interarrivals are well modeled by a Poisson process • Evaluated over periods of minutes and hours • WWW Traffic [CB 97]: • Crovella and Bestavros [CB 97] analyzed WWW logs collected at clients over a 1.5-month period • H was found to be between 0.7 and 0.8

  6. Generating Self-Similar Traffic to Model Internet Traffic in TGad (1) • Traditional traffic models: finite-variance ON/OFF source models • A superposition of such sources behaves like white noise, with only short-range correlations • The lengths of the ON and OFF periods are iid positive random variables Uk • Suppose that U has a hyperbolic tail distribution: P[U > u] ~ c·u^(−α) as u → ∞, with 1 < α < 2 (1) • Property (1) is the infinite variance syndrome, or the Noah Effect • α ≤ 2 implies E(U²) = ∞ • α > 1 ensures that E(U) < ∞, and that S0 is not infinite
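A small sketch of the Noah Effect under the stated assumption 1 < α < 2, using numpy's Pareto sampler (the value α = 1.5 and the sample sizes are arbitrary choices of ours): the sample mean stabilizes, since E(U) is finite, while the sample variance keeps growing with the sample size, since E(U²) is infinite.

```python
# Sketch of the "Noah Effect": for a Pareto-tailed U with 1 < alpha < 2 the sample
# mean settles down (E[U] finite) while the sample variance keeps growing with n
# (E[U^2] infinite). alpha and the sample sizes are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
alpha = 1.5                                  # 1 < alpha < 2
for n in (10**3, 10**5, 10**7):
    u = 1.0 + rng.pareto(alpha, n)           # classical Pareto, u >= 1
    print(f"n={n:>8}  mean={u.mean():8.3f}  var={u.var():>14.1f}")
```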

  7. Generating Self-Similar Traffic to Model Internet Traffic in TGad (2) • Consider a set of M traffic sources that are typical ON/OFF sources • Let the value of M be 20 • The distributions of the ON and OFF times are heavy tailed, with tail exponents α1 and α2 • Ex: hyperexponential, Pareto, or Weibull distribution • The aggregation of these processes leads to a self-similar process with • H = (3 − min(α1, α2))/2 • Choose the values of α1 and α2 according to the desired H, as shown above
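A sketch of the recipe on this slide under illustrative parameter choices of our own (bin size, trace length, Pareto scale, and α1 = α2 = 1.4 are not from the slides): aggregate M = 20 ON/OFF sources whose ON and OFF periods are classical Pareto, which should target roughly H = (3 − 1.4)/2 = 0.8.

```python
# Sketch of the slide's recipe: aggregate M ON/OFF sources whose ON and OFF
# periods are Pareto-distributed with tail indices alpha_on, alpha_off.
# Target: H = (3 - min(alpha_on, alpha_off)) / 2. Parameter values are ours.
import numpy as np

def pareto_periods(alpha, scale, size, rng):
    # Classical Pareto: P[U > u] = (scale/u)^alpha for u >= scale (finite mean for alpha > 1)
    return scale * (1.0 + rng.pareto(alpha, size))

def onoff_source(n_bins, alpha_on, alpha_off, rng, scale=1.0):
    """0/1 activity per time bin for a single ON/OFF source."""
    out = np.zeros(n_bins, dtype=np.int64)
    t, on = 0, bool(rng.random() < 0.5)
    while t < n_bins:
        alpha = alpha_on if on else alpha_off
        length = int(np.ceil(pareto_periods(alpha, scale, 1, rng)[0]))
        if on:
            out[t : t + length] = 1
        t += length
        on = not on
    return out

def aggregate_traffic(M=20, n_bins=100_000, alpha_on=1.4, alpha_off=1.4, seed=0):
    rng = np.random.default_rng(seed)
    return sum(onoff_source(n_bins, alpha_on, alpha_off, rng) for _ in range(M))

traffic = aggregate_traffic()            # expected H ~ (3 - 1.4) / 2 = 0.8
print(traffic[:20])
```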

  8. Graphical Tests for Self-Similarity • How to measure self-similarity? • Variance-time plots • Rely on the slowly decaying variance of a self-similar series • The variance of X^(m) is plotted versus m on a log-log plot • A slope (−β) greater than −1 is indicative of self-similarity • R/S plots • Rely on the rescaled range (R/S) statistic growing like a power law in the number of points n • The plot of R/S versus n on a log-log scale has a slope that estimates H
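A sketch of the variance-time test, with helper names and aggregation levels of our own: compute var(X^(m)) over a range of aggregation levels m, fit the log-log slope −β, and report H = 1 − β/2; a slope shallower than −1 (β < 1) points to self-similarity.

```python
# Sketch of the variance-time test: fit the log-log slope of var(X^(m)) vs m.
# A slope shallower than -1 (beta < 1) suggests self-similarity, H = 1 - beta/2.
import numpy as np

def variance_time(x, ms=(1, 2, 5, 10, 20, 50, 100, 200, 500, 1000)):
    x = np.asarray(x, dtype=float)
    variances = []
    for m in ms:
        agg = x[: len(x) // m * m].reshape(-1, m).mean(axis=1)  # aggregated series X^(m)
        variances.append(agg.var())
    slope = np.polyfit(np.log(ms), np.log(variances), 1)[0]     # slope = -beta
    return 1.0 + slope / 2.0                                     # H = 1 - beta/2

# e.g. applied to the aggregated ON/OFF trace from the previous sketch:
# print("variance-time H estimate:", variance_time(traffic))
print(variance_time(np.random.default_rng(2).poisson(10, 500_000)))  # ~0.5 for Poisson
```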

  9. References • [LT 94] W. Leland, M. Taqqu, W. Willinger, D. Wilson, "On the Self-Similar Nature of Ethernet Traffic," IEEE/ACM Transactions on Networking, 1994. • Baker Award winner • [PF 95] V. Paxson, S. Floyd, "Wide-Area Traffic: The Failure of Poisson Modeling," IEEE/ACM Transactions on Networking, 1995. • [CB 97] M. Crovella, A. Bestavros, "Self-Similarity in World Wide Web Traffic: Evidence and Possible Causes," IEEE/ACM Transactions on Networking, 1997. • T. Karagiannis, M. Molle, M. Faloutsos, A. Broido, "A Nonstationary Poisson View of Internet Traffic," IEEE INFOCOM, 2004. • http://www.itl.nist.gov/div898/handbook/eda/section3/eda366j.htm • http://www.itl.nist.gov/div898/handbook/eda/section3/eda35c.htm • F. Kelly, "Mathematical Modelling of the Internet," Statistical Laboratory, University of Cambridge. • S. Kasahara, "Internet Traffic Modeling: A Markovian Approach to Self-Similar Traffic and Prediction of Loss Probability for Finite Queues," IEICE Transactions on Communications, 2001.
