Continuous Random Variables Chapter 5

Presentation Transcript


  1. Continuous Random Variables, Chapter 5. Nutan S. Mishra, Department of Mathematics and Statistics, University of South Alabama

  2. Continuous Random Variable. A random variable X is continuous when it takes values on an interval. For example, the GPA of a student: X ∈ [0, 4]; the high day temperature in Mobile: X ∈ (−20, ∞). Recall that for a discrete variable a simple event was described as (X = k), and we could compute P(X = k), which is called the probability mass function. For a continuous variable we change the definition of an event.

  3. Continuous Random Variable. Let X ∈ [0, 4]; then there are infinitely many values which X may take. If we tried to assign a probability to each individual value, we would get P(X = k) → 0 for a continuous variable. Instead we define an event as (x − Δx ≤ X ≤ x + Δx), where Δx is a very tiny increment in x, and we assign the probability P(x − Δx ≤ X ≤ x + Δx) = f(x) dx. The function f(x) is called the probability density function (pdf).

  4. Properties of pdf. A probability density function f(x) satisfies f(x) ≥ 0 for all x and ∫_{−∞}^{∞} f(x) dx = 1, and for any a < b, P(a ≤ X ≤ b) = ∫_a^b f(x) dx.

  5. (Cumulative) Distribution Function. The cumulative distribution function of a continuous random variable X is F(x) = P(X ≤ x) = ∫_{−∞}^x f(t) dt, where f(x) is the probability density function of X.

  6. Relation between f(x) and F(x): F(x) = ∫_{−∞}^x f(t) dt, and f(x) = dF(x)/dx wherever the derivative exists.

  7. Mean and Variance. For a continuous random variable X with pdf f(x), the mean is µ = E(X) = ∫_{−∞}^{∞} x f(x) dx and the variance is σ² = Var(X) = E[(X − µ)²] = ∫_{−∞}^{∞} (x − µ)² f(x) dx = E(X²) − µ².

  8. Exercise 5.2. Let f(x) = k x³ for 0 < x < 1 (and 0 elsewhere). To find the value of k, require ∫_0^1 k x³ dx = k/4 = 1, so k = 4. Thus f(x) = 4x³ for 0 < x < 1. Then P(1/4 < X < 3/4) = ∫_{1/4}^{3/4} 4x³ dx = (3/4)⁴ − (1/4)⁴ = 80/256 = 5/16, and P(X > 2/3) = ∫_{2/3}^1 4x³ dx = 1 − (2/3)⁴ = 65/81 ≈ 0.802.
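
The integrals above can also be checked numerically. A minimal Python sketch (added to this transcript, not from the original slides), assuming SciPy is available:

# Numerical check of Exercise 5.2: f(x) = 4x^3 on (0, 1)
from scipy.integrate import quad

f = lambda x: 4 * x**3          # probability density function
total, _ = quad(f, 0, 1)        # should integrate to 1
p_mid, _ = quad(f, 0.25, 0.75)  # P(1/4 < X < 3/4)
p_tail, _ = quad(f, 2/3, 1)     # P(X > 2/3)
print(total, p_mid, p_tail)     # ~1.0, ~0.3125, ~0.8025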

  9. Exercise 5.7 Exercise 5.13

  10. Probability and density curves • P(a < Y < b) is the area under the density curve between a and b; for example, P(100 < Y < 150) = 0.42. Useful link: http://people.hofstra.edu/faculty/Stefan_Waner/RealWorld/cprob/cprob2.html

  11. Normal Distribution. X is a normal random variate with parameters µ and σ if its probability density function is f(x) = (1/(σ√(2π))) e^{−(x − µ)²/(2σ²)}, −∞ < x < ∞. Here µ and σ are called the parameters of the normal distribution. http://www.willamette.edu/~mjaneba/help/normalcurve.html

  12. Standard Normal Distribution The distribution of a normal random variable with mean 0 and variance 1 is called a standard normal distribution.

  13. Standard Normal Distribution • The letter Z is traditionally used to represent a standard normal random variable. • z is used to represent a particular value of Z. • The standard normal distribution has been tabulated.

  14. Standard Normal Distribution. Given a standard normal distribution, find the area under the curve (a) to the left of z = −1.85, (b) to the left of z = 2.01, (c) to the right of z = −0.99, (d) to the right of z = 1.50, (e) between z = −1.66 and z = 0.58.
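
These areas are normally read from the standard normal table; as a rough cross-check (an addition to this transcript, not part of the slides), they can also be computed with SciPy's norm.cdf:

# Areas under the standard normal curve for slide 14
from scipy.stats import norm

print(norm.cdf(-1.85))                   # (a) area to the left of z = -1.85
print(norm.cdf(2.01))                    # (b) area to the left of z = 2.01
print(1 - norm.cdf(-0.99))               # (c) area to the right of z = -0.99
print(norm.sf(1.50))                     # (d) area to the right of z = 1.50 (sf = 1 - cdf)
print(norm.cdf(0.58) - norm.cdf(-1.66))  # (e) area between z = -1.66 and z = 0.58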

  15. Standard Normal Distribution. Given a standard normal distribution, find the value of k such that (a) P(Z < k) = .1271, (b) P(Z < k) = .9495, (c) P(Z > k) = .8186, (d) P(Z > k) = .0073, (e) P(−0.90 < Z < k) = .1806, (f) P(−k < Z < −1.02) = .1464.
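
Parts (a)-(d) invert the table directly; with software this is the quantile (inverse cdf) function. A hedged sketch using SciPy (not in the original slides); parts (e) and (f) still need the algebraic step of rewriting the interval probability in terms of the cdf first:

# Inverse table lookups for slide 15 via the standard normal quantile function
from scipy.stats import norm

print(norm.ppf(0.1271))       # (a) P(Z < k) = .1271
print(norm.ppf(0.9495))       # (b) P(Z < k) = .9495
print(norm.ppf(1 - 0.8186))   # (c) P(Z > k) = .8186  =>  P(Z < k) = .1814
print(norm.ppf(1 - 0.0073))   # (d) P(Z > k) = .0073  =>  P(Z < k) = .9927
# For (e) and (f), first rewrite the interval probability in terms of the cdf,
# e.g. P(-0.90 < Z < k) = .1806  =>  norm.cdf(k) = norm.cdf(-0.90) + .1806,
# then apply norm.ppf to the right-hand side.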

  16. Normal Distribution • Any normal random variable X can be converted to a standard normal random variable: z = (x − µx)/σx. Useful link (pictures of normal curves borrowed from): http://www.stat.sc.edu/~lynch/509Spring03/25

  17. Normal Distribution. Given a random variable X having a normal distribution with µx = 10 and σx = 2, find the probability that X < 8. [Slide figure: normal curve with the x-axis marked 4, 6, 8, 10, 12, 14, 16 and the corresponding z-axis.]
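
The standardization step and the resulting probability can be reproduced in a few lines of Python (a sketch added here, not part of the slides):

# Slide 17: P(X < 8) when X ~ Normal(mu = 10, sigma = 2)
from scipy.stats import norm

mu, sigma = 10, 2
z = (8 - mu) / sigma                     # standardize: z = (x - mu)/sigma = -1
print(z)                                 # -1.0
print(norm.cdf(z))                       # P(Z < -1) ~ 0.1587
print(norm.cdf(8, loc=mu, scale=sigma))  # same answer without standardizing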

  18. Relationship between the Normal and Binomial Distributions • The normal distribution is often a good approximation to a discrete distribution when the discrete distribution takes on a symmetric bell shape. • Some distributions converge to the normal as their parameters approach certain limits. • Theorem 6.2: If X is a binomial random variable with mean µ = np and variance σ² = npq, then the limiting form of the distribution of Z = (X − np)/√(npq), as n → ∞, is the standard normal distribution n(z; 0, 1).
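
A quick numerical illustration of Theorem 6.2 (added here, not from the slides; the values of n, p and k are arbitrary examples): for moderately large n the binomial cdf is close to the normal cdf with the same mean and standard deviation, and a continuity correction of 0.5 (not mentioned on the slide) usually improves the approximation.

# Normal approximation to Binomial(n, p) at a single point k
from scipy.stats import binom, norm
import math

n, p, k = 100, 0.4, 45                    # example values chosen for illustration
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

exact = binom.cdf(k, n, p)                # P(X <= k), exact binomial
approx = norm.cdf((k + 0.5 - mu) / sigma) # normal approximation with continuity correction
print(exact, approx)                      # the two values should be close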

  19. Exercise 5.19

  20. Uniform distribution. The uniform distribution with parameters α and β has the density function f(x) = 1/(β − α) for α ≤ x ≤ β, and f(x) = 0 elsewhere.
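
As a small aside (not in the slides), scipy.stats parameterizes this distribution by loc = α and scale = β − α, which makes it easy to check the density along with the mean (α + β)/2 and variance (β − α)²/12; the parameter values below are arbitrary examples:

# Uniform distribution on [alpha, beta] via scipy.stats
from scipy.stats import uniform

alpha, beta = 2.0, 5.0                 # example parameters
U = uniform(loc=alpha, scale=beta - alpha)

print(U.pdf(3.0))      # density = 1/(beta - alpha) = 1/3 inside [2, 5]
print(U.pdf(7.0))      # 0 outside the interval
print(U.mean())        # (alpha + beta)/2 = 3.5
print(U.var())         # (beta - alpha)^2 / 12 = 0.75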

  21. Exponential Distribution: Basic Facts • Density: f(x) = λ e^{−λx} for x ≥ 0 (and 0 otherwise) • CDF: F(x) = 1 − e^{−λx} for x ≥ 0 • Mean: 1/λ • Variance: 1/λ²
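
These facts can be cross-checked with scipy.stats.expon, which uses the mean 1/λ as its scale parameter (a sketch added to this transcript, with an arbitrary example rate):

# Exponential(lambda) facts checked numerically
from scipy.stats import expon

lam = 0.5                       # example rate (failures per unit time)
X = expon(scale=1 / lam)        # scipy parameterizes by the mean 1/lambda

print(X.pdf(2.0))               # density lambda * exp(-lambda * x) at x = 2
print(X.cdf(2.0))               # 1 - exp(-lambda * x)
print(X.mean(), X.var())        # 1/lambda = 2, 1/lambda^2 = 4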

  22. Key Property: Memorylessness • Reliability: Amount of time a component has been in service has no effect on the amount of time until it fails • Inter-event times: Amount of time since the last event contains no information about the amount of time until the next event • Service times: Amount of remaining service time is independent of the amount of service time elapsed so far
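
The memoryless property can be written as P(X > s + t | X > s) = P(X > t) for all s, t ≥ 0, and it is easy to confirm numerically (an illustration added here, not from the slides; the rate and times are arbitrary examples):

# Memorylessness of the exponential distribution:
# P(X > s + t | X > s) should equal P(X > t)
from scipy.stats import expon

lam, s, t = 0.5, 3.0, 2.0             # example rate and times
X = expon(scale=1 / lam)

conditional = X.sf(s + t) / X.sf(s)   # P(X > s + t) / P(X > s)
unconditional = X.sf(t)               # P(X > t)
print(conditional, unconditional)     # both equal exp(-lam * t) ~ 0.3679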

  23. Exponential Distribution. The exponential distribution is a very commonly used distribution in reliability engineering. Due to its simplicity, it has been widely employed even in cases to which it does not apply. The exponential distribution is used to describe units that have a constant failure rate. The single-parameter exponential pdf is given by f(T) = λ e^{−λT}, where: • λ = constant failure rate, in failures per unit of measurement (e.g. failures per hour, per cycle, etc.) • λ = 1/m • m = mean time between failures, or to a failure • T = operating time, life or age, in hours, cycles, miles, actuations, etc. This distribution requires the estimation of only one parameter, λ, for its application.

  24. Joint probabilities. The joint probability density function (joint pdf) of a k-dimensional discrete random variable X = (X1, X2, …, Xk) is defined to be f(x1, x2, …, xk) = P(X1 = x1, X2 = x2, …, Xk = xk) for all possible values x = (x1, x2, …, xk) of X. Let (X, Y) have the joint probability function specified in the following table.

  25. Joint distribution Consider

  26. Joint probability distribution. A joint probability distribution function f(x, y) satisfies f(x, y) ≥ 0 for all (x, y) and Σ_x Σ_y f(x, y) = 1. The marginal pdfs of x and y are obtained by summing f(x, y) over the other variable; here is an example with x = 1, 2, 3 and y = 1, 2.

  27. Marginal pdf of x & y: g(x) = Σ_y f(x, y) and h(y) = Σ_x f(x, y). Consider the following example with x = 1, 2, 3 and y = 1, 2.

  28. Independent Random Variables. X and Y are independent if f(x, y) = g(x) h(y) for all (x, y), where g(x) and h(y) are the marginal pdfs of X and Y.
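
The transcript does not reproduce the table used on slides 26-28, so the short sketch below uses a hypothetical joint table for x = 1, 2, 3 and y = 1, 2 (the numbers are made up for illustration, not the ones from the slides) to show how the marginals and the independence check work:

# Marginal pdfs and an independence check for a discrete joint table
# Rows index x = 1, 2, 3; columns index y = 1, 2. Values are hypothetical.
import numpy as np

f = np.array([[0.10, 0.20],
              [0.15, 0.15],
              [0.25, 0.15]])    # joint pdf f(x, y); entries sum to 1

g = f.sum(axis=1)               # marginal pdf of X: g(x) = sum over y of f(x, y)
h = f.sum(axis=0)               # marginal pdf of Y: h(y) = sum over x of f(x, y)
print(g, h)

# X and Y are independent iff f(x, y) = g(x) * h(y) for every cell
independent = np.allclose(f, np.outer(g, h))
print(independent)              # False for this particular table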

  29. Properties of expectations. For a discrete pdf f(x), the expected value of the function u(X) is E[u(X)] = Σ_x u(x) f(x); the mean is µ = E[X] = Σ_x x f(x); the variance is Var(X) = σ² = σ_x² = E[(X − µ)²] = E[X²] − µ². For a continuous pdf f(x), E(X) = ∫ x f(x) dx = mean of X, and E[(X − µ)²] = E(X²) − [E(X)]² = ∫ (x − µ)² f(x) dx = variance of X.

  30. Properties of expectations. E(aX + b) = aE(X) + b and Var(aX + b) = a² Var(X).

  31. Mean and variance of Z. Z = (X − µ)/σ is called the standardized variable; E(Z) = 0 and Var(Z) = 1.

  32. Linear combination of two independent variables. Let x1 and x2 be two independent random variables; then their linear combination y = a x1 + b x2 is a random variable with E(y) = aE(x1) + bE(x2) and Var(y) = a² Var(x1) + b² Var(x2).
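
A short simulation (added here as an illustration, not from the slides; the distributions and coefficients are arbitrary examples) of the rules E(y) = aE(x1) + bE(x2) and Var(y) = a² Var(x1) + b² Var(x2) for independent x1, x2:

# Simulation check of the linear-combination rules for independent variables
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
a, b = 2.0, -3.0

x1 = rng.normal(5.0, 2.0, n)     # independent samples: mean 5, variance 4
x2 = rng.exponential(4.0, n)     # independent samples: mean 4, variance 16
y = a * x1 + b * x2

print(y.mean(), a * 5.0 + b * 4.0)        # both ~ -2
print(y.var(), a**2 * 4.0 + b**2 * 16.0)  # both ~ 160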

  33. Mean and variance of the sample mean. Let x1, x2, …, xn be independent identically distributed random variables (i.e. a sample coming from a population) with common mean µ and common variance σ². The sample mean x̄ = (x1 + x2 + … + xn)/n is a linear combination of these i.i.d. variables and hence is itself a random variable, with E(x̄) = µ and Var(x̄) = σ²/n.
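
The conclusion E(x̄) = µ and Var(x̄) = σ²/n can be seen in a small simulation (a sketch added to this transcript; the normal population and the values of µ, σ and n are just convenient examples, since the result holds for any i.i.d. sample):

# The sample mean of n i.i.d. observations has mean mu and variance sigma^2/n
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n = 10.0, 3.0, 25
reps = 200_000

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)          # one sample mean per replication

print(xbar.mean(), mu)               # ~ 10
print(xbar.var(), sigma**2 / n)      # ~ 9/25 = 0.36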
