
Continuous Distributions



Presentation Transcript


  1. Continuous Distributions: The Uniform distribution from a to b. Its density is f(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.

  2. The Normal distribution (mean μ, standard deviation σ), with density f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)) for −∞ < x < ∞.

  3. The Exponential distribution, with density f(x) = λe^(−λx) for x ≥ 0, and 0 otherwise.

  4. Weibull distribution with parameters α and β.

  5. The Weibull density, f(x): (α = 0.9, β = 2), (α = 0.7, β = 2), (α = 0.5, β = 2).
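
As a minimal sketch, the densities above can be evaluated with scipy.stats. The Weibull parameterization is assumed here to be f(x) = αβ x^(β−1) e^(−αx^β), which may differ from the slides' form, and the values of a, b, μ, σ and λ below are arbitrary illustrations:

```python
import numpy as np
from scipy import stats

x = np.linspace(0.01, 3.0, 5)          # a few evaluation points

# Uniform on [a, b]: f(x) = 1/(b - a) for a <= x <= b
a, b = 0.0, 3.0
print(stats.uniform(loc=a, scale=b - a).pdf(x))

# Normal with mean mu and standard deviation sigma
mu, sigma = 1.0, 0.5
print(stats.norm(loc=mu, scale=sigma).pdf(x))

# Exponential with rate lambda: f(x) = lam * exp(-lam * x); scipy uses scale = 1/lam
lam = 2.0
print(stats.expon(scale=1 / lam).pdf(x))

# Weibull, assuming f(x) = alpha*beta*x**(beta-1)*exp(-alpha*x**beta)
# (an assumed parameterization); this maps to scipy's weibull_min
# with shape c = beta and scale = alpha**(-1/beta).
for alpha, beta in [(0.9, 2.0), (0.7, 2.0), (0.5, 2.0)]:
    dist = stats.weibull_min(c=beta, scale=alpha ** (-1 / beta))
    manual = alpha * beta * x ** (beta - 1) * np.exp(-alpha * x ** beta)
    assert np.allclose(dist.pdf(x), manual)
    print(alpha, beta, dist.pdf(x))
```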

  6. The Gamma distribution. Let the continuous random variable X have density function f(x) = λ^α x^(α−1) e^(−λx)/Γ(α) for x > 0, and 0 otherwise. Then X is said to have a Gamma distribution with parameters α and λ.
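
A quick check of this density against scipy.stats.gamma, which uses a shape parameter a = α and scale = 1/λ; the values of α and λ below are arbitrary:

```python
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

# Gamma(alpha, lambda) density in rate form:
#   f(x) = lam**alpha * x**(alpha - 1) * exp(-lam * x) / Gamma(alpha),  x > 0
alpha, lam = 2.5, 1.5
x = np.linspace(0.1, 5.0, 50)

manual = lam ** alpha * x ** (alpha - 1) * np.exp(-lam * x) / gamma_fn(alpha)

# scipy parameterizes the Gamma by shape a and scale = 1/lambda
scipy_pdf = stats.gamma(a=alpha, scale=1 / lam).pdf(x)

assert np.allclose(manual, scipy_pdf)
print(manual[:3], scipy_pdf[:3])
```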

  7. Expectation

  8. Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous); then the expected value of X, E(X), is defined to be E(X) = Σ_x x p(x), and if X is continuous with probability density function f(x), E(X) = ∫ x f(x) dx.
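
Both forms of the definition can be evaluated numerically. The sketch below uses a small made-up probability function p(x) for the discrete case and, for the continuous case, an exponential density with an arbitrary rate λ = 2:

```python
import numpy as np
from scipy import integrate

# Discrete case: E(X) = sum over x of x * p(x)
xs = np.array([0, 1, 2, 3])
ps = np.array([0.1, 0.3, 0.4, 0.2])       # made-up probability function
E_discrete = np.sum(xs * ps)
print("discrete E(X) =", E_discrete)

# Continuous case: E(X) = integral of x * f(x) dx
lam = 2.0
f = lambda x: lam * np.exp(-lam * x)      # exponential density, x >= 0
E_continuous, _ = integrate.quad(lambda x: x * f(x), 0, np.inf)
print("continuous E(X) =", E_continuous, "(exact: 1/lam =", 1 / lam, ")")
```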

  9. Interpretation of E(X) • The expected value of X, E(X), is the centre of gravity of the probability distribution of X. • The expected value of X, E(X), is the long-run average value of X (shown later: Law of Large Numbers).

  10. Example: The Uniform distribution. Suppose X has a uniform distribution from a to b. Then f(x) = 1/(b − a) for a ≤ x ≤ b. The expected value of X is E(X) = ∫_a^b x/(b − a) dx = (b² − a²)/(2(b − a)) = (a + b)/2.
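
A quick numerical confirmation of E(X) = (a + b)/2, with arbitrary endpoints a = 2 and b = 7:

```python
from scipy import integrate

a, b = 2.0, 7.0                       # arbitrary endpoints for the check
f = lambda x: 1.0 / (b - a)           # uniform density on [a, b]
E, _ = integrate.quad(lambda x: x * f(x), a, b)
print(E, (a + b) / 2)                 # both print 4.5
```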

  11. Example: The Normal distribution. Suppose X has a Normal distribution with parameters μ and σ. Then f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)). The expected value of X is E(X) = ∫ x f(x) dx. Make the substitution: z = (x − μ)/σ, so x = μ + σz and dx = σ dz.

  12. Hence E(X) = ∫ (μ + σz)(1/√(2π)) e^(−z²/2) dz = μ ∫ (1/√(2π)) e^(−z²/2) dz + σ ∫ z (1/√(2π)) e^(−z²/2) dz. Now the first integral equals 1 (a density integrates to 1) and the second equals 0 (an odd integrand), so E(X) = μ.

  13. Example: The Gamma distribution. Suppose X has a Gamma distribution with parameters α and λ. Then f(x) = λ^α x^(α−1) e^(−λx)/Γ(α) for x > 0. Note: ∫_0^∞ x^(α−1) e^(−λx) dx = Γ(α)/λ^α. This is a very useful formula when working with the Gamma distribution.

  14. The expected value of X is E(X) = ∫_0^∞ x · λ^α x^(α−1) e^(−λx)/Γ(α) dx = (α/λ) ∫_0^∞ λ^(α+1) x^α e^(−λx)/Γ(α + 1) dx. The remaining integrand is the Gamma(α + 1, λ) density, so this is now equal to 1, giving E(X) = α/λ.

  15. Thus if X has a Gamma(α, λ) distribution then the expected value of X is E(X) = α/λ. Special cases: • Exponential(λ) distribution: α = 1, λ arbitrary, so E(X) = 1/λ. • Chi-square(ν) distribution: α = ν/2, λ = ½, so E(X) = ν.
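
These three expected values can be checked against scipy.stats (scipy parameterizes the Gamma and Exponential by scale = 1/λ; the parameter values below are arbitrary):

```python
from scipy import stats

alpha, lam = 3.0, 2.0
print(stats.gamma(a=alpha, scale=1 / lam).mean(), alpha / lam)   # Gamma: alpha/lam

print(stats.expon(scale=1 / lam).mean(), 1 / lam)                # Exponential: 1/lam

nu = 5
print(stats.chi2(df=nu).mean(), nu)                              # Chi-square: nu
```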

  16. The Gamma distribution

  17. The Exponential distribution

  18. The Chi-square (χ²) distribution

  19. Expectation of functions of Random Variables

  20. Definition. Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous); then the expected value of g(X), E[g(X)], is defined to be E[g(X)] = Σ_x g(x) p(x), and if X is continuous with probability density function f(x), E[g(X)] = ∫ g(x) f(x) dx.

  21. Example: The Uniform distribution. Suppose X has a uniform distribution from 0 to b. Then f(x) = 1/b for 0 ≤ x ≤ b. Find the expected value of A = X². E(A) = E(X²) = ∫_0^b x² (1/b) dx = b²/3. If X is the length of a side of a square (chosen at random from 0 to b) then A is the area of the square, and its expected value is b²/3 = 1/3 the maximum area of the square.
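
A numerical check of E(X²) = b²/3, with an arbitrary b = 4:

```python
from scipy import integrate

b = 4.0                                   # arbitrary side-length bound
f = lambda x: 1.0 / b                     # uniform density on [0, b]
E_area, _ = integrate.quad(lambda x: x**2 * f(x), 0, b)
print(E_area, b**2 / 3)                   # both print 5.333...
```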

  22. Example: The Geometric distribution. Suppose X (discrete) has a geometric distribution with parameter p. Then p(x) = p(1 − p)^(x−1) for x = 1, 2, 3, …. Find the expected value of X and the expected value of X².

  23. Recall: the sum of a geometric series is 1 + r + r² + r³ + … = 1/(1 − r) for |r| < 1. Differentiating both sides with respect to r we get: 1 + 2r + 3r² + … = 1/(1 − r)².

  24. Thus Σ_(x=1)^∞ x r^(x−1) = 1/(1 − r)². This formula could also be developed by noting:

  25. This formula can be used to calculate E(X) = Σ_(x=1)^∞ x p(1 − p)^(x−1) = p Σ_(x=1)^∞ x (1 − p)^(x−1) = p/(1 − (1 − p))² = p/p² = 1/p.

  26. To compute the expected value of X², we need to find a formula for Σ_(x=1)^∞ x² r^(x−1). Note that Σ_(x=0)^∞ r^x = 1/(1 − r). Differentiating with respect to r we get Σ_(x=1)^∞ x r^(x−1) = 1/(1 − r)².

  27. Differentiating again with respect to r we get Σ_(x=2)^∞ x(x − 1) r^(x−2) = 2/(1 − r)³. Thus Σ_(x=1)^∞ x² r^(x−1) = 2r/(1 − r)³ + 1/(1 − r)².

  28. This implies E(X²) = Σ_(x=1)^∞ x² p(1 − p)^(x−1) = p[2(1 − p)/p³ + 1/p²]. Thus E(X²) = (2 − p)/p².

  29. Thus, for the geometric distribution with parameter p, E(X) = 1/p and E(X²) = (2 − p)/p².
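
Both results can be checked against scipy.stats.geom, which uses the same support {1, 2, 3, …}; E(X²) is recovered as var(X) + [E(X)]², and p = 0.3 is an arbitrary choice:

```python
from scipy import stats

p = 0.3
X = stats.geom(p)                  # P(X = x) = p * (1 - p)**(x - 1), x = 1, 2, 3, ...

print(X.mean(), 1 / p)                          # E(X)   = 1/p
E_X2 = X.var() + X.mean() ** 2                  # E(X^2) = var + mean^2
print(E_X2, (2 - p) / p ** 2)                   # E(X^2) = (2 - p)/p^2
```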

  30. Moments of Random Variables

  31. Definition. Let X be a random variable (discrete or continuous); then the k-th moment of X is defined to be μ_k = E(X^k). The first moment of X, μ = μ₁ = E(X), is the center of gravity of the distribution of X. The higher moments give different information regarding the distribution of X.

  32. Definition. Let X be a random variable (discrete or continuous); then the k-th central moment of X is defined to be E[(X − μ)^k], where μ = μ₁ = E(X) = the first moment of X.

  33. The central moments describe how the probability distribution is distributed about the centre of gravity, μ. The 2nd central moment, E[(X − μ)²], depends on the spread of the probability distribution of X about μ. It is called the variance of X and is denoted by the symbol var(X).

  34. Its square root is called the standard deviation of X and is denoted by the symbol σ. The third central moment contains information about the skewness of a distribution.

  35. The third central moment, E[(X − μ)³], contains information about the skewness of a distribution. Measure of skewness: E[(X − μ)³]/σ³.

  36. Positively skewed distribution

  37. Negatively skewed distribution

  38. Symmetric distribution

  39. The fourth central moment, E[(X − μ)⁴], also contains information about the shape of a distribution. The property of shape that is measured by the fourth central moment is called kurtosis. The measure of kurtosis: E[(X − μ)⁴]/σ⁴.
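
A sketch of both measures computed from a large simulated sample; the exponential distribution is used only as an example of a right-skewed distribution (its skewness measure is 2 and its kurtosis measure is 9):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=200_000)   # a right-skewed sample

mu = sample.mean()
sigma = sample.std()
m3 = np.mean((sample - mu) ** 3)                    # third central moment
m4 = np.mean((sample - mu) ** 4)                    # fourth central moment

print("skewness  mu3/sigma^3 =", m3 / sigma ** 3)   # ~2 for the exponential
print("kurtosis  mu4/sigma^4 =", m4 / sigma ** 4)   # ~9 for the exponential

# scipy's helpers: skew() matches mu3/sigma^3; kurtosis(fisher=False) matches mu4/sigma^4
print(stats.skew(sample), stats.kurtosis(sample, fisher=False))
```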

  40. Mesokurtic distribution

  41. Platykurtic distribution

  42. Leptokurtic distribution

  43. Example: The uniform distribution from 0 to 1. Finding the moments: E(X^k) = ∫_0^1 x^k dx = 1/(k + 1).

  44. Finding the central moments: E[(X − ½)^k] = ∫_0^1 (x − ½)^k dx, which is 0 for odd k and (½)^k/(k + 1) for even k; in particular the 2nd, 3rd and 4th central moments are 1/12, 0 and 1/80.

  45. Thus the variance is 1/12. The standard deviation is σ = 1/√12 ≈ 0.289. The measure of skewness is 0/σ³ = 0. The measure of kurtosis is (1/80)/(1/12)² = 1.8.
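
A numerical verification of these values for the Uniform(0, 1) distribution:

```python
from scipy import integrate

f = lambda x: 1.0                      # Uniform(0, 1) density

# raw moments: E(X^k) = 1/(k + 1)
for k in (1, 2, 3, 4):
    mk, _ = integrate.quad(lambda x: x**k * f(x), 0, 1)
    print(f"E(X^{k}) =", mk)           # 1/2, 1/3, 1/4, 1/5

# central moments about mu = 1/2
mu = 0.5
m2, _ = integrate.quad(lambda x: (x - mu)**2 * f(x), 0, 1)   # 1/12
m3, _ = integrate.quad(lambda x: (x - mu)**3 * f(x), 0, 1)   # 0
m4, _ = integrate.quad(lambda x: (x - mu)**4 * f(x), 0, 1)   # 1/80

sigma = m2 ** 0.5
print("std dev  =", sigma)             # ~0.2887
print("skewness =", m3 / sigma**3)     # 0
print("kurtosis =", m4 / sigma**4)     # 1.8
```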

  46. Rules for expectation

  47. Rules: Proof The proof for discrete random variables is similar.

  48. Proof The proof for discrete random variables is similar.

  49. Proof The proof for discrete random variables is similar.

  50. Proof
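
Assuming the rules being proved here are the standard ones, E[c] = c for a constant c, E[aX + b] = aE(X) + b, and E[g₁(X) + g₂(X)] = E[g₁(X)] + E[g₂(X)], they can be checked numerically for a continuous X; the exponential density with λ = 1.5 below is an arbitrary choice:

```python
import numpy as np
from scipy import integrate

# Expectation operator for a continuous X with density f on [0, infinity)
lam = 1.5
f = lambda x: lam * np.exp(-lam * x)               # exponential density, x >= 0
E = lambda g: integrate.quad(lambda x: g(x) * f(x), 0, np.inf)[0]

a, b, c = 3.0, 2.0, 7.0
EX = E(lambda x: x)

print(E(lambda x: c), c)                                # E[c] = c
print(E(lambda x: a * x + b), a * EX + b)               # E[aX + b] = a E(X) + b
print(E(lambda x: x + x**2), EX + E(lambda x: x**2))    # E[g1 + g2] = E[g1] + E[g2]
```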
