
Continuous random variables




Presentation Transcript


  1. Continuous random variables • Continuous random variable • Let X be such a random variable • X takes on values in the real line (−infinity, +infinity) or in an interval (lower bound, upper bound) • Instead of using P(X = i) • Use the probability density function fX(t) • fX(t) dt ≈ P(t ≤ X ≤ t + dt)

  2. Cumulative distribution function of a continuous r.v. • The relationship between the cumulative distribution function of a continuous r.v. and fX • FX(x) = P(X ≤ x) = ∫_(−∞)^x fX(t) dt • fX(x) = dFX(x)/dx • Properties of the CDF: FX is non-decreasing, FX(−∞) = 0, FX(+∞) = 1
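
A quick numerical sketch of this CDF–pdf relationship (not part of the original slides). It integrates an exponential pdf with an arbitrary example rate λ = 2 and compares the result with the known closed-form CDF:

```python
# Sketch: F_X(x) = integral of f_X(t) dt from -inf to x, checked numerically
# for an exponential pdf with an arbitrary rate lam = 2.0.
from scipy import integrate
import numpy as np

lam = 2.0
f = lambda t: lam * np.exp(-lam * t)          # pdf, defined for t >= 0

for x in [0.5, 1.0, 2.0]:
    F_numeric, _ = integrate.quad(f, 0.0, x)  # integrate the pdf up to x
    F_closed = 1.0 - np.exp(-lam * x)         # known CDF of the exponential
    print(x, F_numeric, F_closed)             # the two values should agree
```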

  3. Distribution function: properties • Properties of the pdf • fX(t) ≥ 0 for all t • ∫_(−∞)^(+∞) fX(t) dt = 1 • P(a ≤ X ≤ b) = ∫_a^b fX(t) dt

  4. Uniform random variable • X is a uniform random variable on an interval (a, b): fX(t) = 1/(b − a) for a < t < b, and 0 otherwise • Mean: E[X] = (a + b)/2 • Variance: Var(X) = (b − a)²/12
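
A small empirical check of these two formulas (the interval (2, 5) below is an arbitrary illustration, not from the slides):

```python
# Sketch: sample mean and variance of a uniform r.v. on (a, b)
# against (a + b)/2 and (b - a)^2 / 12.
import numpy as np

a, b = 2.0, 5.0
rng = np.random.default_rng(0)
x = rng.uniform(a, b, size=1_000_000)

print(x.mean(), (a + b) / 2)          # sample mean vs. (a + b)/2
print(x.var(), (b - a) ** 2 / 12)     # sample variance vs. (b - a)^2 / 12
```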

  5. Exponential distribution • The exponential distribution • is the foundation of most stochastic processes • Makes Markov processes tick • is used to describe the duration of something • CPU service time • Telephone call duration • Or anything you want to model as a service time

  6. Exponential random variable • A continuous r.v. X • Whose density function is fX(t) = λ e^(−λt) for t ≥ 0 (and 0 for t < 0) • is said to be an exponential r.v. with parameter λ • Mean: E[X] = 1/λ and variance: Var(X) = 1/λ²
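
A quick numerical confirmation of the mean and variance (the rate λ = 0.5 is an arbitrary example):

```python
# Sketch: an exponential r.v. with rate lam has mean 1/lam and variance 1/lam^2.
import numpy as np

lam = 0.5
rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0 / lam, size=1_000_000)  # numpy parameterizes by the mean

print(x.mean(), 1.0 / lam)        # ~ 1/lam
print(x.var(), 1.0 / lam ** 2)    # ~ 1/lam^2
```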

  7. Link between Poisson and Exponential • If the arrival process is Poisson • # arrivals per time unit follows the Poisson distribution with parameter λ • => the inter-arrival time T is exponentially distributed • With mean 1/λ = average inter-arrival time • (Diagram: arrivals marked on a time axis starting at 0; the gap T between consecutive arrivals is exponentially distributed with mean 1/λ)
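
The sketch below illustrates this link by simulation (not from the slides; λ = 3 and the horizon are arbitrary): arrivals are built from exponential gaps, and the per-unit-time counts come out with mean and variance close to λ, as a Poisson distribution requires.

```python
# Sketch: exponential inter-arrival times with rate lam  =>  Poisson(lam) counts per unit time.
import numpy as np

lam, horizon = 3.0, 100_000
rng = np.random.default_rng(2)

gaps = rng.exponential(scale=1.0 / lam, size=int(lam * horizon * 2))  # inter-arrival times
arrivals = np.cumsum(gaps)                                            # arrival instants
arrivals = arrivals[arrivals < horizon]
counts = np.bincount(arrivals.astype(int), minlength=horizon)         # arrivals per unit interval

print(counts.mean(), counts.var(), lam)   # Poisson: mean and variance both ~ lam
print(gaps.mean(), 1.0 / lam)             # mean inter-arrival time ~ 1/lam
```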

  8. Proof • The number of arrivals in a t-second interval • Follows the Poisson distribution with parameter λt • Let T denote the random time of the first arrival • P(T > t) = P(no arrivals in [0, t]) = e^(−λt), so FT(t) = 1 − e^(−λt) • => T is exponentially distributed with parameter λ

  9. Memoryless property • P(X > s + t | X > t) = P(X > s) for all s, t ≥ 0 • Proof: P(X > s + t | X > t) = P(X > s + t) / P(X > t) = e^(−λ(s+t)) / e^(−λt) = e^(−λs) = P(X > s)
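
A Monte Carlo check of the memoryless property (illustrative only; λ, s and t below are arbitrary):

```python
# Sketch: P(X > s + t | X > t) should match P(X > s) = e^(-lam*s) for an exponential X.
import numpy as np

lam, s, t = 1.0, 2.0, 3.0
rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0 / lam, size=2_000_000)

cond = (x > s + t).sum() / (x > t).sum()        # estimate of P(X > s + t | X > t)
print(cond, (x > s).mean(), np.exp(-lam * s))   # all three should be close
```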

  10. Example • Suppose that the amount of time you spend in a bank • is exponential with mean 10 min • What is the probability you spend more than 5 min in the bank? • What is the probability you spend more than 15 min • Given that you are still in the bank after 10 min?
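
Working the example out: a mean of 10 min means rate λ = 1/10, so P(X > 5) = e^(−1/2) ≈ 0.61, and by memorylessness P(X > 15 | X > 10) = P(X > 5) ≈ 0.61 as well. A one-line check:

```python
# Worked answer to the bank example: mean 10 min  =>  rate lam = 1/10.
import math

lam = 1.0 / 10.0
p_more_than_5 = math.exp(-lam * 5)                         # P(X > 5) = e^(-1/2) ~ 0.607
p_15_given_10 = math.exp(-lam * 15) / math.exp(-lam * 10)  # = e^(-1/2) by memorylessness
print(p_more_than_5, p_15_given_10)
```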

  11. Hyper-exponential distributions • H2: with probability p the duration is exponential with rate λ1, with probability 1 − p it is exponential with rate λ2 • Hn: with probability pi the duration is exponential with rate λi, i = 1, …, n (p1 + … + pn = 1) • Advantage • Allows a more sophisticated representation of a service time • While preserving the exponential distribution • And a good chance of analyzing the problem
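
A minimal H2 sampler sketch (not from the slides; p, λ1 and λ2 are arbitrary example values). Its mean is the probability-weighted mix p/λ1 + (1 − p)/λ2:

```python
# Sketch: sample an H2 hyper-exponential service time and check its mean.
import numpy as np

p, lam1, lam2 = 0.3, 2.0, 0.5
rng = np.random.default_rng(4)

branch = rng.random(1_000_000) < p                        # choose the branch with prob. p
x = np.where(branch,
             rng.exponential(1.0 / lam1, size=1_000_000), # Exp(lam1) branch
             rng.exponential(1.0 / lam2, size=1_000_000)) # Exp(lam2) branch

mean_theory = p / lam1 + (1 - p) / lam2                   # E[X] = p/lam1 + (1-p)/lam2
print(x.mean(), mean_theory)
```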

  12. Further properties of the exponential distribution • If X1, …, Xn are independent exponential r.v. • With mean 1/λ, then the pdf of X1 + … + Xn is f(t) = λ e^(−λt) (λt)^(n−1) / (n − 1)! • i.e. the Gamma distribution with parameters n and λ • If X1 and X2 are independent exponential r.v. • With means 1/λ1 and 1/λ2 => P(X1 < X2) = λ1 / (λ1 + λ2)
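
Both properties are easy to spot-check by simulation (the rates and n below are arbitrary examples, not from the slides):

```python
# Sketch: P(X1 < X2) = lam1 / (lam1 + lam2) for independent exponentials,
# and the sum of n i.i.d. Exp(lam) has the Gamma(n, lam) mean n/lam.
import numpy as np

rng = np.random.default_rng(5)

lam1, lam2 = 2.0, 3.0
x1 = rng.exponential(1.0 / lam1, size=1_000_000)
x2 = rng.exponential(1.0 / lam2, size=1_000_000)
print((x1 < x2).mean(), lam1 / (lam1 + lam2))

n, lam = 4, 2.0
s = rng.exponential(1.0 / lam, size=(1_000_000, n)).sum(axis=1)  # X1 + ... + Xn
print(s.mean(), n / lam)                                         # Gamma(n, lam) mean
```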

  13. Further properties of the exponential distribution (cont'd) • X1, X2, …, Xn independent r.v. • Xi follows an exponential distribution with parameter λi => fXi(t) = λi e^(−λi t) • Define X = min{X1, X2, …, Xn} • X is also exponentially distributed, with parameter λ1 + … + λn • Proof: P(X > t) = P(X1 > t, …, Xn > t) = e^(−λ1 t) ⋯ e^(−λn t) = e^(−(λ1 + … + λn) t) • => fX(t) = (λ1 + … + λn) e^(−(λ1 + … + λn) t)
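
A small numerical check of this minimum property (the three rates below are arbitrary):

```python
# Sketch: min of independent exponentials with rates lam_i is Exp(sum(lam_i)).
import numpy as np

lams = np.array([1.0, 2.0, 3.0])
rng = np.random.default_rng(6)
samples = rng.exponential(1.0 / lams, size=(1_000_000, len(lams)))  # one column per X_i
m = samples.min(axis=1)                                             # X = min{X_1, ..., X_n}

rate = lams.sum()
print(m.mean(), 1.0 / rate)                   # mean of the minimum ~ 1 / (lam1+lam2+lam3)
print((m > 0.2).mean(), np.exp(-rate * 0.2))  # tail matches e^(-(sum lam_i) t)
```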

  14. Joint distribution functions • Discrete case • One variable (pmf): P(X = i) • Joint distribution: P(X1 = i1, X2 = i2, …, Xn = in) • Continuous case • One variable (pdf): fX(t) • Joint distribution: fX1,X2,…,Xn(t1, t2, …, tn)

  15. Independent random variables • The random variables X1, X2 • Are said to be independent if, for all a, b: P(X1 ≤ a, X2 ≤ b) = P(X1 ≤ a) P(X2 ≤ b) • Example • Green die: X1 • Red die: X2 • X3 = X1 + X2 • Are X3 and X1 dependent or independent?
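
The answer is that X3 and X1 are dependent, since X3 contains X1. The sketch below (not part of the slides) makes this concrete by comparing a joint probability with the product of marginals:

```python
# Sketch of the dice example: X1 (green die) and X2 (red die) are independent, X3 = X1 + X2.
# If X3 and X1 were independent, P(X3 = 12, X1 = 6) would equal P(X3 = 12) * P(X1 = 6).
import numpy as np

rng = np.random.default_rng(7)
x1 = rng.integers(1, 7, size=1_000_000)
x2 = rng.integers(1, 7, size=1_000_000)
x3 = x1 + x2

p_joint = ((x3 == 12) & (x1 == 6)).mean()      # ~ 1/36
p_prod = (x3 == 12).mean() * (x1 == 6).mean()  # ~ 1/36 * 1/6
print(p_joint, p_prod)                         # unequal => X3 and X1 are dependent
```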

  16. Marginal distribution • Joint distribution • Discrete case • P(X1 = i, X2 = j), for all (i, j) in S1 × S2 • => P(X1 = i) = Σ_j P(X1 = i, X2 = j) • Continuous case • fX1,X2(t1, t2), for all t1, t2 • => fX1(t1) = ∫ fX1,X2(t1, t2) dt2

  17. Expectation of a r.v.: the continuous case • X is a continuous r.v. • Having a probability density function f(x) • The expected value of X is defined by E[X] = ∫ x f(x) dx • For g(X), a function of the r.v. X: E[g(X)] = ∫ g(x) f(x) dx
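
A short sketch of these integrals in practice (the exponential density with λ = 2 and g(x) = x² are arbitrary illustration choices):

```python
# Sketch: E[X] and E[g(X)] by integrating against the density f.
from scipy import integrate
import numpy as np

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)                       # pdf of Exp(lam), x >= 0

EX, _ = integrate.quad(lambda x: x * f(x), 0, np.inf)      # E[X] = 1/lam
EX2, _ = integrate.quad(lambda x: x**2 * f(x), 0, np.inf)  # E[g(X)] with g(x) = x^2
print(EX, 1 / lam)
print(EX2, 2 / lam**2)                                     # E[X^2] = 2/lam^2 for Exp(lam)
```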

  18. Expectation of a r.v.: the continuous case (cont'd) • X1, X2, …, Xn: dependent or independent • E[X1 + X2 + … + Xn] = E[X1] + E[X2] + … + E[Xn] in either case (linearity of expectation) • Example: for the two dice above, E[X3] = E[X1] + E[X2] = 3.5 + 3.5 = 7, even though X3 and X1 are dependent

  19. Variance, auto-correlation, & covariance • Variance • Continuous case: Var(X) = E[(X − E[X])²] = ∫ (x − E[X])² f(x) dx • If X and Y are independent r.v. => Var(X + Y) = Var(X) + Var(Y) • If X and Y are correlated r.v.: Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) • Autocorrelation: RXY = E[XY] • Covariance: Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y]

  20. Conditional probability and conditional expectation: discrete r.v. • X and Y are discrete r.v. • Conditional probability mass function of X given that Y = y: pX|Y(x|y) = P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y) • Conditional expectation of X given that Y = y: E[X | Y = y] = Σ_x x pX|Y(x|y)

  21. Conditional probability and expectation: continuous r.v. • If X and Y have a joint pdf fX,Y(x, y) • Then the conditional probability density function of X given that Y = y is fX|Y(x|y) = fX,Y(x, y) / fY(y) • The conditional expectation of X given that Y = y is E[X | Y = y] = ∫ x fX|Y(x|y) dx

  22. Computing expectations by conditioning • Denote E[X|Y]: the function of the r.v. Y whose value at Y = y is E[X | Y = y] • E[X|Y] is itself a random variable • Property of conditional expectation: E[X] = E[E[X|Y]] (1) • If Y is a discrete r.v.: E[X] = Σ_y E[X | Y = y] P(Y = y) (2) • If Y is continuous with density fY(y): E[X] = ∫ E[X | Y = y] fY(y) dy (3)
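
A toy check of equation (1) (this example is an illustration only, not from the slides): Y is a die roll, and given Y = y, X counts heads in y fair coin flips, so E[X | Y = y] = y/2 and both sides should come out near E[Y]/2 = 1.75.

```python
# Sketch: E[X] = E[E[X|Y]] on a small two-stage example.
import numpy as np

rng = np.random.default_rng(8)
y = rng.integers(1, 7, size=500_000)     # Y ~ fair die
x = rng.binomial(y, 0.5)                 # X | Y = y  ~  Binomial(y, 1/2)

lhs = x.mean()                           # E[X] estimated directly
rhs = (y / 2).mean()                     # E[E[X|Y]] using E[X | Y = y] = y/2
print(lhs, rhs, 3.5 / 2)                 # both ~ 1.75
```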

  23. Proof of equation (2) (and hence (1)) when X and Y are discrete • Σ_y E[X | Y = y] P(Y = y) = Σ_y Σ_x x P(X = x | Y = y) P(Y = y) = Σ_y Σ_x x P(X = x, Y = y) = Σ_x x Σ_y P(X = x, Y = y) = Σ_x x P(X = x) = E[X]
