
Expectation and Properties of Random Variables

This handout provides a review of expectations, variances, moment generating functions, and properties of random variables. It includes examples and explanations for understanding key concepts in probability.


Presentation Transcript


  1. Handout Ch 4 Practice Session

  2. Calculus Review, Part 2 (1) • Example

  3. Calculus Review, Part 2 (2) • Change of variables • Example: what on earth is this?

  4. Calculus Review, Part 2 (3) • Supplementary integrals for Ch 4

  5. 歸去來析 (a pun; in Taiwanese it sounds like "might as well just die")

  6. Expectation of a Random Variable • Discrete distribution: E(X) = Σx x·f(x) • Continuous distribution: E(X) = ∫ x·f(x) dx • E(X) is called the expected value, mean, or expectation of X. • E(X) can be regarded as the center of gravity of the distribution. • In the discrete case, E(X) exists if and only if Σx |x|·f(x) < ∞. • In the continuous case, E(X) exists if and only if ∫ |x|·f(x) dx < ∞. Whenever X is a bounded random variable, E(X) must exist.
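A quick numerical sketch of the discrete definition, using an invented distribution (not one from the handout): the expectation is the probability-weighted sum of the values, and the sample mean of simulated draws should land close to it.

    import numpy as np

    # Invented discrete distribution: values and their probabilities.
    values = np.array([0, 1, 2, 5])
    probs = np.array([0.1, 0.4, 0.3, 0.2])

    # E(X) = sum over x of x * f(x)
    expectation = np.sum(values * probs)          # 2.0

    # The sample mean of simulated draws should approach E(X).
    rng = np.random.default_rng(0)
    draws = rng.choice(values, size=100_000, p=probs)
    print(expectation, draws.mean())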

  7. The Expectation of a Function • Let Y = r(X); in the discrete case, E(Y) = E[r(X)] = Σx r(x)·f(x). • Let Y = r(X); in the continuous case, E(Y) = E[r(X)] = ∫ r(x)·f(x) dx. • Suppose X has a p.d.f. as follows: • Let ; it can be shown that
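As a sketch of the rule above, with the same invented distribution as before (not the handout's example): E[r(X)] can be computed directly from the distribution of X, without first deriving the distribution of Y = r(X).

    import numpy as np

    values = np.array([0, 1, 2, 5])
    probs = np.array([0.1, 0.4, 0.3, 0.2])

    # E[r(X)] with r(x) = x^2, taken directly against the p.f. of X.
    r = lambda x: x ** 2
    e_r = np.sum(r(values) * probs)               # 6.6

    # Simulation check: apply r to the draws and average.
    rng = np.random.default_rng(1)
    draws = rng.choice(values, size=100_000, p=probs)
    print(e_r, r(draws).mean())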

  8. Example 1 (4.1.3) • In a class of 50 students, the number of students ni of each age i is shown in the following table: age 18, 20 students; age 19, 22 students; age 20, 4 students; age 21, 3 students; age 25, 1 student. • If a student is to be selected at random from the class, what is the expected value of his age?

  9. Solution • E[X] = 18(0.4) + 19(0.44) + 20(0.08) + 21(0.06) + 25(0.02) = 18.92
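A one-line check of the weighted average (the weights are the class proportions 20/50, 22/50, 4/50, 3/50, 1/50):

    ages   = [18, 19, 20, 21, 25]
    counts = [20, 22, 4, 3, 1]          # class of 50 students
    total  = sum(counts)
    print(sum(a * n / total for a, n in zip(ages, counts)))   # 18.92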

  10. Properties of Expectations • If there exists a constant a such that Pr(X ≥ a) = 1, then E(X) ≥ a; likewise, if Pr(X ≤ b) = 1 for a constant b, then E(X) ≤ b. • If X1, …, Xn are n random variables such that each E(Xi) exists, then E(X1 + … + Xn) = E(X1) + … + E(Xn). • For all constants a1, …, an and b, E(a1X1 + … + anXn + b) = a1E(X1) + … + anE(Xn) + b. • Usually E[g(X)] ≠ g(E(X)). Only linear functions g satisfy E[g(X)] = g(E(X)). • If X1, …, Xn are n independent random variables such that each E(Xi) exists, then E(X1X2⋯Xn) = E(X1)E(X2)⋯E(Xn).
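A simulation sketch of two of these properties, using invented distributions with known means: linearity of expectation needs no assumptions beyond existence, while the product rule relies on independence.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200_000

    # Two independent random variables with known means.
    x = rng.exponential(scale=2.0, size=n)   # E(X) = 2
    y = rng.uniform(0.0, 1.0, size=n)        # E(Y) = 0.5

    # Linearity: E(3X + 4Y + 1) = 3*2 + 4*0.5 + 1 = 9.
    print(np.mean(3 * x + 4 * y + 1))

    # Independence: E(XY) = E(X) E(Y) = 1.
    print(np.mean(x * y))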

  11. Example 2 (4.2.7) • Suppose that on each play of a certain game a gambler is equally likely to win or to lose. Suppose that when he wins, his fortune is doubled; and when he loses, his fortune is cut in half. If he begins playing with a given fortune c, what is the expected value of his fortune after n independent plays of the game?

  12. Solution • Let Xi be the factor by which the fortune is multiplied on play i, so Xi equals 2 or 1/2, each with probability 1/2, and E(Xi) = (1/2)(2) + (1/2)(1/2) = 5/4. • The fortune after n plays is cX1X2⋯Xn, and since the plays are independent, its expected value is cE(X1)⋯E(Xn) = c(5/4)^n.
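A Monte Carlo sketch of this answer (the values c = 1 and n = 10 are arbitrary choices, not from the exercise): simulate many gamblers and compare the average final fortune with c(5/4)^n.

    import numpy as np

    rng = np.random.default_rng(3)
    c, n_plays, n_sims = 1.0, 10, 500_000

    # Each play multiplies the fortune by 2 or 1/2, each with probability 1/2.
    factors = rng.choice([2.0, 0.5], size=(n_sims, n_plays))
    final_fortunes = c * factors.prod(axis=1)

    print(final_fortunes.mean())          # sample average of the final fortunes
    print(c * 1.25 ** n_plays)            # c(5/4)^n ≈ 9.313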

  13. Properties of the Variance • Var(X) = 0 if and only if there exists a constant c such that Pr(X = c) = 1. • For constants a and b, Var(aX + b) = a²Var(X). Proof: with μ = E(X), Var(aX + b) = E[(aX + b − aμ − b)²] = a²E[(X − μ)²] = a²Var(X).

  14. Properties of the Variance • If X1, …, Xn are independent random variables, then Var(X1 + … + Xn) = Var(X1) + … + Var(Xn). • If X1, …, Xn are independent random variables, then for all constants a1, …, an and b, Var(a1X1 + … + anXn + b) = a1²Var(X1) + … + an²Var(Xn).
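A simulation sketch of both variance properties from slides 13 and 14, with invented independent variables whose variances are known:

    import numpy as np

    rng = np.random.default_rng(4)
    n = 500_000

    x = rng.exponential(scale=2.0, size=n)   # Var(X) = 4
    y = rng.uniform(0.0, 1.0, size=n)        # Var(Y) = 1/12

    # Var(aX + b) = a^2 Var(X): Var(3X + 7) should be about 9 * 4 = 36.
    print(np.var(3 * x + 7))

    # Independence: Var(X + Y) = Var(X) + Var(Y) = 4 + 1/12.
    print(np.var(x + y), 4 + 1 / 12)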

  15. Example 3 (4.3.4) • Suppose that X is a random variable for which E(X) = μ and Var(X) = σ². • Show that

  16. Solution

  17. Moment Generating Functions • Consider a given random variable X and, for each real number t, let ψ(t) = E(e^(tX)). The function ψ is called the moment generating function (m.g.f.) of X. • Suppose that the m.g.f. of X exists for all values of t in some open interval around t = 0. Then ψ′(0) = E(X). • More generally, the nth derivative at 0 gives the nth moment: ψ⁽ⁿ⁾(0) = E(Xⁿ).
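A symbolic sketch with an invented two-point distribution (Pr(X = 1) = Pr(X = 3) = 1/2, not from the handout): differentiating the m.g.f. at t = 0 recovers the moments.

    import sympy as sp

    t = sp.symbols('t')

    # m.g.f. of the invented distribution: psi(t) = E(e^{tX}).
    psi = sp.Rational(1, 2) * sp.exp(t) + sp.Rational(1, 2) * sp.exp(3 * t)

    mean = sp.diff(psi, t).subs(t, 0)              # psi'(0)  = E(X)   = 2
    second = sp.diff(psi, t, 2).subs(t, 0)         # psi''(0) = E(X^2) = 5
    print(mean, second, second - mean ** 2)        # variance = 1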

  18. Properties of Moment Generating Functions • Let X have m.g.f. ψ1, and let Y = aX + b have m.g.f. ψ2. Then for every value of t such that ψ1(at) exists, ψ2(t) = e^(bt) ψ1(at). Proof: ψ2(t) = E(e^(tY)) = E(e^(t(aX+b))) = e^(bt) E(e^(atX)) = e^(bt) ψ1(at). • Suppose that X1,…, Xn are n independent random variables, and for i = 1,…, n, let ψi denote the m.g.f. of Xi. Let Y = X1 + … + Xn, and let the m.g.f. of Y be denoted by ψ. Then for every value of t such that each ψi(t) exists, we have ψ(t) = ψ1(t)ψ2(t)⋯ψn(t). Proof: ψ(t) = E(e^(t(X1+…+Xn))) = E(e^(tX1)⋯e^(tXn)) = E(e^(tX1))⋯E(e^(tXn)) by independence, which equals ψ1(t)⋯ψn(t).
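Continuing the symbolic sketch above, with Y = 2X + 3 (the constants are arbitrary): the m.g.f. built directly from the distribution of Y matches e^(3t) ψ(2t).

    import sympy as sp

    t = sp.symbols('t')
    psi = sp.Rational(1, 2) * sp.exp(t) + sp.Rational(1, 2) * sp.exp(3 * t)

    # Y = 2X + 3 takes the values 5 and 9, each with probability 1/2.
    psi_y_direct = sp.Rational(1, 2) * sp.exp(5 * t) + sp.Rational(1, 2) * sp.exp(9 * t)
    psi_y_rule = sp.exp(3 * t) * psi.subs(t, 2 * t)

    print(sp.simplify(psi_y_direct - psi_y_rule))   # 0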

  19. The m.g.f. for the Binomial Distribution • Suppose that a random variable X has a binomial distribution with parameters n and p. We can represent X as the sum of n independent Bernoulli random variables X1,…, Xn, each equal to 1 with probability p and 0 with probability 1 − p. • Determine the m.g.f. of X: each Xi has m.g.f. E(e^(tXi)) = pe^t + (1 − p), so by the product property ψ(t) = (pe^t + 1 − p)^n.
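A small symbolic check (n = 3 is chosen only to keep the output short): the m.g.f. computed from the Binomial(3, p) probabilities equals (pe^t + 1 − p)^3.

    import sympy as sp

    t, p = sp.symbols('t p')
    n = 3

    # m.g.f. from the p.f.: sum over k of Pr(X = k) * e^{tk}.
    psi_from_pf = sum(sp.binomial(n, k) * p**k * (1 - p)**(n - k) * sp.exp(t * k)
                      for k in range(n + 1))

    # m.g.f. from the product rule: (p e^t + 1 - p)^n.
    psi_product = (p * sp.exp(t) + 1 - p) ** n

    print(sp.simplify(psi_from_pf - psi_product))   # 0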

  20. Uniqueness of Moment Generating Functions • If the m.g.f.s of two random variables X1 and X2 are identical for all values of t in an open interval around t = 0, then the probability distributions of X1 and X2 must be identical. • The additive property of the binomial distribution: suppose X1 and X2 are independent random variables with binomial distributions with parameters n1 and p and n2 and p. Let the m.g.f. of X1 + X2 be denoted by ψ; then ψ(t) = (pe^t + 1 − p)^(n1) (pe^t + 1 − p)^(n2) = (pe^t + 1 − p)^(n1+n2), so by uniqueness the distribution of X1 + X2 must be the binomial distribution with parameters n1 + n2 and p.
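A simulation sketch of the additive property (n1 = 3, n2 = 4, p = 0.3 are arbitrary): the sum of two independent binomials with a common p should look like a single Binomial(n1 + n2, p).

    import numpy as np

    rng = np.random.default_rng(5)
    n1, n2, p, size = 3, 4, 0.3, 500_000

    # Sum of two independent binomials with the same p ...
    s = rng.binomial(n1, p, size) + rng.binomial(n2, p, size)
    # ... versus a single Binomial(n1 + n2, p).
    b = rng.binomial(n1 + n2, p, size)

    # The empirical distributions should agree up to simulation noise.
    print(np.bincount(s, minlength=n1 + n2 + 1) / size)
    print(np.bincount(b, minlength=n1 + n2 + 1) / size)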

  21. Example 4 (4.4.8) • Suppose that X is a random variable for which the m.g.f. is as follows: • Find the mean and the variance of X

  22. Solution
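The m.g.f. for this exercise is not reproduced in the transcript, so the sketch below uses a made-up m.g.f., ψ(t) = (1/4)e^t + (3/4)e^(2t), purely to illustrate the procedure: the mean is ψ′(0) and the variance is ψ″(0) − ψ′(0)².

    import sympy as sp

    t = sp.symbols('t')

    # Hypothetical m.g.f. standing in for the one from the exercise.
    psi = sp.Rational(1, 4) * sp.exp(t) + sp.Rational(3, 4) * sp.exp(2 * t)

    mean = sp.diff(psi, t).subs(t, 0)                     # 7/4
    variance = sp.diff(psi, t, 2).subs(t, 0) - mean ** 2  # 13/4 - 49/16 = 3/16
    print(mean, variance)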

  23. Properties of Variance and Covariance • If X and Y are random variables such that Var(X) < ∞ and Var(Y) < ∞, then Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y). • Correlation only measures linear relationship. • Two random variables can be dependent but uncorrelated. • Example: Suppose that X can take only the three values −1, 0, and 1, and that each of these three values has the same probability. Let Y = X². Then X and Y are dependent. But E(XY) = E(X³) = E(X) = 0, so Cov(X, Y) = E(XY) − E(X)E(Y) = 0 (uncorrelated).
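An exact check of the dependent-but-uncorrelated example, computing the covariance directly from the three equally likely values:

    # X takes -1, 0, 1 with probability 1/3 each; Y = X^2.
    xs = [-1, 0, 1]
    p = 1 / 3

    e_x  = sum(p * x for x in xs)            # 0
    e_y  = sum(p * x**2 for x in xs)         # 2/3
    e_xy = sum(p * x * x**2 for x in xs)     # E(X^3) = 0

    print(e_xy - e_x * e_y)                  # Cov(X, Y) = 0, yet Y is a function of X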

  24. Example 5 (4.6.11) • Show that two random variables X and Y cannot possibly have the following properties: E(X) = 3, E(Y) = 2, E(X²) = 10, E(Y²) = 29, and E(XY) = 0.

  25. Solution • Var(X) = E(X²) − [E(X)]² = 10 − 9 = 1 and Var(Y) = 29 − 4 = 25. • Cov(X, Y) = E(XY) − E(X)E(Y) = 0 − 6 = −6. • Then ρ(X, Y) = Cov(X, Y)/(σX σY) = −6/(1 · 5) = −1.2, which is impossible because a correlation must satisfy −1 ≤ ρ ≤ 1.
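A quick arithmetic check of the contradiction:

    import math

    e_x, e_y, e_x2, e_y2, e_xy = 3, 2, 10, 29, 0

    var_x = e_x2 - e_x**2            # 1
    var_y = e_y2 - e_y**2            # 25
    cov   = e_xy - e_x * e_y         # -6

    print(cov / math.sqrt(var_x * var_y))   # -1.2, outside [-1, 1]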
