
Discrete Random Variables and Probability Distributions


Presentation Transcript


  1. Discrete Random Variables and Probability Distributions

  2. Random Variables • Random Variable (RV): A numeric outcome that results from an experiment • For each element of an experiment’s sample space, the random variable takes on exactly one value • Discrete Random Variable: An RV that can take on only a finite or countably infinite set of outcomes • Continuous Random Variable: An RV that can take on any value along a continuum (but may be reported “discretely”) • Random Variables are denoted by upper case letters (Y) • Individual outcomes of an RV are denoted by lower case letters (y)

  3. Probability Distributions • Probability Distribution: A table, graph, or formula that describes the values a random variable can take on and the corresponding probability (discrete RV) or density (continuous RV) • Discrete Probability Distribution: Assigns probabilities (masses) to the individual outcomes • Continuous Probability Distribution: Assigns a density at individual points; probabilities of ranges are obtained by integrating the density function • Discrete probabilities denoted by: p(y) = P(Y=y) • Continuous densities denoted by: f(y) • Cumulative Distribution Function: F(y) = P(Y≤y)
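
A discrete pmf and its CDF can be written down directly as a lookup table. Below is a minimal Python sketch (not part of the original slides) using a hypothetical three-point distribution; the masses p(y) must sum to 1, and F(y) is just the running sum of masses up to y.

# A discrete probability distribution as a table of masses p(y) = P(Y = y),
# with the CDF F(y) = P(Y <= y) obtained by summing masses up to y.
pmf = {0: 0.25, 1: 0.50, 2: 0.25}   # hypothetical discrete RV Y

def cdf(y, pmf=pmf):
    """F(y) = P(Y <= y) for a discrete distribution given as a dict of masses."""
    return sum(p for value, p in pmf.items() if value <= y)

assert abs(sum(pmf.values()) - 1.0) < 1e-12   # masses must sum to 1
print(cdf(1))                                 # P(Y <= 1) = 0.25 + 0.50 = 0.75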

  4. Discrete Probability Distributions

  5. Example – Rolling 2 Dice (Red/Green) Y = Sum of the up faces of the two dice. The table gives the value of y for every element of S.

  6. Rolling 2 Dice – Probability Mass Function & CDF

  7. Rolling 2 Dice – Probability Mass Function

  8. Rolling 2 Dice – Cumulative Distribution Function
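
Slides 5–8 refer to tables and plots that are not reproduced in this transcript. The following Python sketch reconstructs the same information by enumerating the 36 equally likely elements of S (red, green) and tabulating p(y) and F(y) for Y = sum of the two up faces; exact fractions are used so the masses match the familiar x/36 values.

from fractions import Fraction
from itertools import product
from collections import Counter

# Map each of the 36 equally likely outcomes (red, green) to y = red + green.
counts = Counter(r + g for r, g in product(range(1, 7), repeat=2))

pmf = {y: Fraction(c, 36) for y, c in sorted(counts.items())}   # p(y) = P(Y = y)
cdf, running = {}, Fraction(0)
for y, p in pmf.items():
    running += p
    cdf[y] = running                                            # F(y) = P(Y <= y)

for y in pmf:
    print(y, pmf[y], cdf[y])    # e.g. p(7) = 1/6 (= 6/36), F(7) = 7/12 (= 21/36)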

  9. Expected Values of Discrete RV’s • Mean (aka Expected Value) – Long-run average value an RV (or function of an RV) will take on • Variance – Average squared deviation between a realization of an RV (or function of an RV) and its mean • Standard Deviation – Positive square root of the Variance (in the same units as the data) • Notation: • Mean: E(Y) = μ • Variance: V(Y) = σ² • Standard Deviation: σ
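
These definitions translate directly into sums over the pmf: E(Y) = Σ y·p(y), V(Y) = Σ (y − μ)²·p(y), and σ = √V(Y). A minimal Python sketch (not from the slides), applied to a hypothetical pmf:

from math import sqrt

def mean(pmf):
    """E(Y) = sum over y of y * p(y)."""
    return sum(y * p for y, p in pmf.items())

def variance(pmf):
    """V(Y) = E[(Y - mu)^2] = sum over y of (y - mu)^2 * p(y)."""
    mu = mean(pmf)
    return sum((y - mu) ** 2 * p for y, p in pmf.items())

pmf = {0: 0.25, 1: 0.50, 2: 0.25}        # hypothetical discrete RV
mu, var = mean(pmf), variance(pmf)
print(mu, var, sqrt(var))                # 1.0, 0.5, ~0.707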

  10. Expected Values of Discrete RV’s

  11. Expected Values of Linear Functions of Discrete RV’s

  12. Example – Rolling 2 Dice

  13. Tchebysheff’s Theorem/Empirical Rule • Tchebysheff: Suppose Y is any random variable with mean μ and standard deviation σ. Then: P(μ-kσ ≤ Y ≤ μ+kσ) ≥ 1-(1/k²) for k ≥ 1 • k=1: P(μ-1σ ≤ Y ≤ μ+1σ) ≥ 1-(1/1²) = 0 (trivial result) • k=2: P(μ-2σ ≤ Y ≤ μ+2σ) ≥ 1-(1/2²) = ¾ • k=3: P(μ-3σ ≤ Y ≤ μ+3σ) ≥ 1-(1/3²) = 8/9 • Note that this is a very conservative bound, but that it works for any distribution • Empirical Rule (Mound-Shaped Distributions) • k=1: P(μ-1σ ≤ Y ≤ μ+1σ) ≈ 0.68 • k=2: P(μ-2σ ≤ Y ≤ μ+2σ) ≈ 0.95 • k=3: P(μ-3σ ≤ Y ≤ μ+3σ) ≈ 1
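
Because Tchebysheff's bound holds for any distribution, it can be checked directly against the two-dice sum from slides 5–12 (μ = 7, σ² = 35/6). A minimal Python sketch, not from the slides:

from itertools import product
from math import sqrt

outcomes = [r + g for r, g in product(range(1, 7), repeat=2)]   # 36 equally likely sums
mu = sum(outcomes) / 36                                         # 7.0
sigma = sqrt(sum((y - mu) ** 2 for y in outcomes) / 36)         # sqrt(35/6) ~ 2.415

k = 2
within = sum(1 for y in outcomes if mu - k * sigma <= y <= mu + k * sigma) / 36
print(within, ">=", 1 - 1 / k ** 2)     # actual ~0.944, comfortably above the 0.75 bound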

  14. Proof of Tchebysheff’s Theorem

  15. Moment Generating Functions (I)

  16. Moment Generating Functions (II) M(t) is called the moment-generating function for Y, and can be used to derive any non-central moments of the random variable (assuming it exists in a neighborhood around t=0). It is also useful in determining the distributions of functions of random variables.

  17. Probability Generating Functions P(t) is the probability generating function for Y
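
Both generating functions are expectations that can be evaluated directly from a pmf: M(t) = E(e^(tY)) and P(t) = E(t^Y), with M'(0) = E(Y) and P'(1) = E(Y). The Python sketch below (not from the slides) checks these identities numerically with a central difference, using a hypothetical pmf:

from math import exp

pmf = {0: 0.25, 1: 0.50, 2: 0.25}          # hypothetical discrete RV

def M(t):
    """Moment generating function M(t) = E(e^(tY))."""
    return sum(p * exp(t * y) for y, p in pmf.items())

def P(t):
    """Probability generating function P(t) = E(t^Y)."""
    return sum(p * t ** y for y, p in pmf.items())

h = 1e-6
print((M(h) - M(-h)) / (2 * h))          # ~ M'(0) = E(Y) = 1.0
print((P(1 + h) - P(1 - h)) / (2 * h))   # ~ P'(1) = E(Y) = 1.0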

  18. Discrete Uniform Distribution • Suppose Y can take on any integer value between a and b inclusive, each equally likely (e.g., rolling a die, where a=1 and b=6). Then Y follows the discrete uniform distribution.
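
A minimal Python sketch of the discrete uniform pmf p(y) = 1/(b-a+1) for y = a, ..., b, using the die from the slide's example (a = 1, b = 6); the closed-form mean (a+b)/2 is a standard result, not spelled out in the transcript:

a, b = 1, 6                                  # a fair die, as in the slide's example
n = b - a + 1
pmf = {y: 1 / n for y in range(a, b + 1)}    # every value equally likely

mean = sum(y * p for y, p in pmf.items())
print(mean, (a + b) / 2)                     # both ~3.5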

  19. Bernoulli Distribution • An experiment consists of one trial. It can result in one of 2 outcomes: Success or Failure (or a characteristic being Present or Absent). • Probability of Success is p (0<p<1) • Y = 1 if Success (Characteristic Present), 0 if not
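
A minimal Python sketch of a Bernoulli pmf (not from the slides); p = 0.3 is an arbitrary illustrative value, and the printed checks match the standard results E(Y) = p and V(Y) = p(1-p):

p = 0.3                                   # assumed success probability, 0 < p < 1
pmf = {1: p, 0: 1 - p}                    # Y = 1 on Success, 0 on Failure

mean = sum(y * q for y, q in pmf.items())
var = sum((y - mean) ** 2 * q for y, q in pmf.items())
print(mean, var)                          # 0.3 and 0.21, i.e. p and p(1-p)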

  20. Binomial Experiment • Experiment consists of a series of n identical trials • Each trial can end in one of 2 outcomes: Success or Failure • Trials are independent (the outcome of one has no bearing on the outcomes of the others) • Probability of Success, p, is constant for all trials • The random variable Y, the number of Successes in the n trials, is said to follow a Binomial Distribution with parameters n and p • Y can take on the values y=0,1,…,n • Notation: Y~Bin(n,p)
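
The binomial pmf that goes with this setup is p(y) = C(n,y) p^y (1-p)^(n-y) for y = 0, 1, ..., n. A minimal Python sketch (not from the slides) with illustrative parameters n = 10, p = 0.3; the printed mean and variance agree with the np and np(1-p) results of the following slides:

from math import comb

def binom_pmf(y, n, p):
    """P(Y = y) for Y ~ Bin(n, p): C(n, y) * p^y * (1 - p)^(n - y)."""
    return comb(n, y) * p ** y * (1 - p) ** (n - y)

n, p = 10, 0.3                                    # illustrative parameters
pmf = {y: binom_pmf(y, n, p) for y in range(n + 1)}

mean = sum(y * q for y, q in pmf.items())
var = sum((y - mean) ** 2 * q for y, q in pmf.items())
print(mean, var)                                  # ~ n*p = 3.0 and n*p*(1-p) = 2.1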

  21. Binomial Distribution

  22. Binomial Distribution – Expected Value

  23. Binomial Distribution – Variance and S.D.

  24. Binomial Distribution – MGF & PGF

  25. Geometric Distribution • Used to model the number of Bernoulli trials needed until the first Success occurs (P(S)=p) • First Success on Trial 1 → S, y = 1 → p(1) = p • First Success on Trial 2 → FS, y = 2 → p(2) = (1-p)p • First Success on Trial k → F…FS, y = k → p(k) = (1-p)^(k-1) p
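
A minimal Python sketch of the geometric pmf from this slide, plus a numerical check of the standard mean 1/p (not spelled out in the transcript); p = 0.25 is an illustrative value:

p = 0.25                                          # illustrative success probability

def geom_pmf(k):
    """P(first Success occurs on trial k) = (1 - p)^(k - 1) * p, k = 1, 2, ..."""
    return (1 - p) ** (k - 1) * p

# Truncate the infinite support far enough out that the remaining tail is negligible.
approx_mean = sum(k * geom_pmf(k) for k in range(1, 500))
print(approx_mean, 1 / p)                         # both ~ 4.0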

  26. Geometric Distribution - Expectations

  27. Geometric Distribution – MGF & PGF

  28. Negative Binomial Distribution • Used to model the number of trials needed until the rth Success (extension of the Geometric distribution) • Based on there being r-1 Successes in the first y-1 trials, followed by a Success on trial y
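
Turning that construction into a formula gives the standard negative binomial pmf p(y) = C(y-1, r-1) p^r (1-p)^(y-r) for y = r, r+1, ...; the pmf itself is not shown in the transcript. A minimal Python sketch with illustrative r = 3, p = 0.4 (the mean r/p is likewise a standard result):

from math import comb

def negbin_pmf(y, r, p):
    """P(the r-th Success occurs on trial y) = C(y-1, r-1) * p^r * (1-p)^(y-r)."""
    return comb(y - 1, r - 1) * p ** r * (1 - p) ** (y - r)

r, p = 3, 0.4                                     # illustrative parameters
approx_mean = sum(y * negbin_pmf(y, r, p) for y in range(r, 500))
print(approx_mean, r / p)                         # both ~ 7.5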

  29. Poisson Distribution • Distribution often used to model the number of incidences of some characteristic in time or space: • Arrivals of customers in a queue • Number of flaws in a roll of fabric • Number of typos per page of text • Distribution obtained as follows: • Break down the “area” into many small “pieces” (n pieces) • Each “piece” can have only 0 or 1 occurrences (p=P(1)) • Let λ = np ≡ Average number of occurrences over the “area” • Y ≡ # occurrences in the “area” is the sum of 0s & 1s over the “pieces” • Y ~ Bin(n,p) with p = λ/n • Take the limit of the Binomial Distribution as n → ∞ with p = λ/n
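
The limiting argument can be checked numerically: for large n, the Bin(n, λ/n) masses are already very close to the Poisson masses e^(-λ) λ^y / y!. A minimal Python sketch (not from the slides) with illustrative λ = 2 and n = 10,000:

from math import comb, exp, factorial

lam, n = 2.0, 10_000                              # illustrative lambda and a large n
p = lam / n

for y in range(6):
    binom = comb(n, y) * p ** y * (1 - p) ** (n - y)    # Bin(n, lambda/n) mass
    poisson = exp(-lam) * lam ** y / factorial(y)       # limiting Poisson mass
    print(y, round(binom, 6), round(poisson, 6))        # nearly identical columns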

  30. Poisson Distribution - Derivation

  31. Poisson Distribution - Expectations

  32. Poisson Distribution – MGF & PGF

  33. Hypergeometric Distribution • Finite-population generalization of the Binomial Distribution • Population: • N Elements • k Successes (elements with the characteristic of interest) • Sample: • n Elements • Y = # of Successes in the sample (y = 0, 1, …, min(n,k))
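
Counting the equally likely samples gives the standard hypergeometric pmf p(y) = C(k,y) C(N-k, n-y) / C(N,n), which is not reproduced in the transcript. A minimal Python sketch with illustrative N = 20, k = 8, n = 5; the printed mean matches the standard n*k/N:

from math import comb

def hyper_pmf(y, N, k, n):
    """P(Y = y) = C(k, y) * C(N - k, n - y) / C(N, n)."""
    return comb(k, y) * comb(N - k, n - y) / comb(N, n)

N, k, n = 20, 8, 5                                # illustrative population and sample sizes
pmf = {y: hyper_pmf(y, N, k, n) for y in range(0, min(n, k) + 1)}

print(sum(pmf.values()))                               # masses sum to 1
print(sum(y * p for y, p in pmf.items()), n * k / N)   # mean ~ n*k/N = 2.0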
