Section 2.5


Presentation Transcript


  1. Section 2.5. Important definition in the text: the definition of the moment generating function (m.g.f.)

Definition 2.5-1. If S is the space for a random variable X with p.m.f. f(x), then

M(t) = Σ_{x∈S} e^(tx) f(x)   and   M(0) = Σ_{x∈S} f(x) = 1
M′(t) = Σ_{x∈S} x e^(tx) f(x)   and   M′(0) = Σ_{x∈S} x f(x) = E(X)
M″(t) = Σ_{x∈S} x² e^(tx) f(x)   and   M″(0) = Σ_{x∈S} x² f(x) = E(X²)
M‴(t) = Σ_{x∈S} x³ e^(tx) f(x)   and   M‴(0) = Σ_{x∈S} x³ f(x) = E(X³)

and in general,

M⁽ⁿ⁾(t) = Σ_{x∈S} xⁿ e^(tx) f(x)   and   M⁽ⁿ⁾(0) = Σ_{x∈S} xⁿ f(x) = E(Xⁿ)
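The two headline facts in Definition 2.5-1, M(0) = 1 and M′(0) = E(X), can be checked numerically. Here is a minimal Python sketch; the p.m.f. on S = {0, 1, 2} is an arbitrary made-up example, not one from the text, and the derivative at 0 is approximated by a central difference:

```python
import math

# A made-up p.m.f. f(x) on S = {0, 1, 2} to illustrate Definition 2.5-1
pmf = {0: 0.3, 1: 0.6, 2: 0.1}

def M(t):
    # M(t) = sum over x in S of e^(tx) f(x)
    return sum(math.exp(t * x) * fx for x, fx in pmf.items())

# M(0) is the total probability, so it must equal 1
print(round(M(0.0), 12))  # 1.0

# M'(0) = E(X); approximate the derivative by a central difference
h = 1e-6
approx_mean = (M(h) - M(-h)) / (2 * h)
exact_mean = sum(x * fx for x, fx in pmf.items())  # 0.8 for this p.m.f.
print(abs(approx_mean - exact_mean) < 1e-6)  # True
```

The same finite-difference trick works for any of the m.g.f.s in the slides below.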

  2. 1. An envelope contains two blue sheets of paper and three green sheets of paper. Two sheets are randomly selected without replacement, and the random variable X is defined to be the number of blue sheets.

(a) Name the type of distribution X has.
X has a hypergeometric distribution with N = 5, N₁ = 2, n = 2.

(b) Find the p.m.f. for X.
f(x) = 3/10 if x = 0; 3/5 if x = 1; 1/10 if x = 2

(c) Find the m.g.f. for X.
M(t) = (3 + 6e^t + e^(2t)) / 10 for −∞ < t < ∞
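A quick sketch (not part of the original slides) recovers this p.m.f. from the hypergeometric formula with exact arithmetic; the numerators 3, 6, 1 over 10 are exactly the coefficients of 1, e^t, e^(2t) in M(t):

```python
from fractions import Fraction
from math import comb

# Hypergeometric p.m.f. with N = 5 sheets, N1 = 2 blue, n = 2 drawn
N, N1, n = 5, 2, 2
f = [Fraction(comb(N1, x) * comb(N - N1, n - x), comb(N, n)) for x in range(n + 1)]
print(f[0], f[1], f[2])              # 3/10 3/5 1/10
# The m.g.f. numerator coefficients of (3 + 6e^t + e^2t)/10 are 10*f(x)
print([int(fx * 10) for fx in f])    # [3, 6, 1]
```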

  3. 2. An envelope contains two blue sheets of paper and three green sheets of paper. Two sheets are randomly selected with replacement, and the random variable X is defined to be the number of blue sheets.

(a) Name the type of distribution X has.
X has a b(2, 0.4) distribution.

(b) Find the p.m.f. for X.
f(x) = 9/25 if x = 0; 12/25 if x = 1; 4/25 if x = 2

(c) Find the m.g.f. for X.
M(t) = (9 + 12e^t + 4e^(2t)) / 25 for −∞ < t < ∞

  4. 3. Suppose the random variable X has m.g.f. M(t) = (e^(−9t) + e^(4t))³ / 8. Find P(X < 0).

M(t) = (e^(−9t) + e^(4t))³ / 8
     = (e^(−27t) + 3e^(−18t)e^(4t) + 3e^(−9t)e^(8t) + e^(12t)) / 8
     = (e^(−27t) + 3e^(−14t) + 3e^(−t) + e^(12t)) / 8
     = (1/8)e^(−27t) + (3/8)e^(−14t) + (3/8)e^(−t) + (1/8)e^(12t)

Matching this against M(t) = Σ_x e^(tx) f(x), the p.m.f. for X must be
f(x) = 1/8 if x = −27; 3/8 if x = −14; 3/8 if x = −1; 1/8 if x = 12

P(X < 0) = 1/8 + 3/8 + 3/8 = 7/8
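The expansion step can be sketched in Python: the binomial theorem turns M(t) into a sum of terms c·e^(kt), and each such term contributes a support point k with probability c. This is an illustrative check, not from the text:

```python
from collections import defaultdict
from fractions import Fraction
from math import comb

# Expand M(t) = (e^(-9t) + e^(4t))^3 / 8 with the binomial theorem:
# choosing j factors of e^(4t) gives the term C(3, j)/8 * e^(kt)
pmf = defaultdict(Fraction)
for j in range(4):
    k = -9 * (3 - j) + 4 * j          # exponent k in e^(kt)
    pmf[k] += Fraction(comb(3, j), 8)

print(sorted(pmf))                               # support points: [-27, -14, -1, 12]
print(sum(p for k, p in pmf.items() if k < 0))   # 7/8, matching P(X < 0)
```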

  5. 4. A random variable X has a b(n, p) distribution. Find the m.g.f. for X, and use the m.g.f. to find E(X) and Var(X).

M(t) = E(e^(tX)) = Σ_{x=0}^{n} e^(tx) C(n, x) p^x (1 − p)^(n−x) = Σ_{x=0}^{n} C(n, x) (pe^t)^x (1 − p)^(n−x) = (pe^t + 1 − p)^n = (1 − p + pe^t)^n for −∞ < t < ∞,
where C(n, x) = n! / [x!(n − x)!] denotes the binomial coefficient.

Since M(t) = (1 − p + pe^t)^n,
M′(t) = n(1 − p + pe^t)^(n−1) pe^t, and
M″(t) = n(1 − p + pe^t)^(n−1) pe^t + n(n − 1)(1 − p + pe^t)^(n−2) p²e^(2t),
then M′(0) = np and M″(0) = np + n(n − 1)p².

E(X) = μ = np
Var(X) = σ² = E(X²) − [E(X)]² = np + n(n − 1)p² − (np)² = np(1 − p)
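The conclusion E(X) = np and Var(X) = np(1 − p) can be confirmed numerically from the m.g.f. alone, using finite differences at t = 0; the values n = 6, p = 0.3 are arbitrary test choices:

```python
import math

# Finite-difference check of M'(0) = np and M''(0) - M'(0)^2 = np(1-p)
# for the binomial m.g.f. M(t) = (1 - p + p e^t)^n
n, p = 6, 0.3

def M(t):
    return (1 - p + p * math.exp(t)) ** n

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)            # approximates M'(0) = E(X)
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # approximates M''(0) = E(X^2)
print(round(m1, 4), round(n * p, 4))                     # 1.8 1.8
print(round(m2 - m1 ** 2, 3), round(n * p * (1 - p), 3)) # 1.26 1.26
```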

  6. Return to Section 2.4. A Bernoulli experiment is one which must result in one of two mutually exclusive and exhaustive outcomes, often labeled in general as "success" and "failure". The probability of "success" is often denoted by p, and the probability of "failure" is often denoted by q, that is, q = 1 − p. If X is a random variable defined to be one (1) when a success occurs and zero (0) when a failure occurs, then X is said to have a Bernoulli distribution with success probability p.

The p.m.f. of X is f(x) = p^x (1 − p)^(1−x) if x = 0, 1.
E(X) = p
Var(X) = p(1 − p)
The m.g.f. of X is M(t) = 1 − p + pe^t for −∞ < t < ∞.

  7. A sequence of Bernoulli trials is a sequence of independent Bernoulli experiments where the probability of the outcome labeled "success", often denoted by p, remains the same on each trial; the probability of "failure" is often denoted by q, that is, q = 1 − p. Suppose X1, X2, ..., Xn make up a sequence of Bernoulli trials. If X = X1 + X2 + ... + Xn, then X is a random variable equal to the number of successes in the sequence of Bernoulli trials, and X is said to have a binomial distribution with success probability p, denoted b(n, p).

The p.m.f. of X is f(x) = [n! / (x!(n − x)!)] p^x (1 − p)^(n−x) if x = 0, 1, …, n.
E(X) = np
Var(X) = np(1 − p)
The m.g.f. of X is M(t) = (1 − p + pe^t)^n for −∞ < t < ∞.

Return to Class Exercise #3 in Section 2.4.

  8. (c) Consider the random variable Q = "the number of clear marbles when 3 marbles are selected at random with replacement" with p.m.f. f(q). Find f(q), E(Q), and Var(Q).

f(q) = C(3, q) (2/7)^q (5/7)^(3−q) if q = 0, 1, 2, 3
E(Q) = μ_Q = (3)(2/7) = 6/7
Var(Q) = σ²_Q = (3)(2/7)(5/7) = 30/49

  9. (d) Consider the random variable V = "the number of clear marbles when 7 marbles are selected at random with replacement" with p.m.f. g(v). Find g(v), E(V), and Var(V).

g(v) = C(7, v) (2/7)^v (5/7)^(7−v) if v = 0, 1, 2, …, 7
E(V) = μ_V = (7)(2/7) = 2
Var(V) = σ²_V = (7)(2/7)(5/7) = 10/7

  10. (e) Consider the random variable W = "the number of colored marbles when 7 marbles are selected at random with replacement" with p.m.f. h(w). Find h(w), E(W), and Var(W). (Note that V + W = 7.)

h(w) = C(7, w) (5/7)^w (2/7)^(7−w) if w = 0, 1, 2, …, 7
E(W) = μ_W = (7)(5/7) = 5
Var(W) = σ²_W = (7)(5/7)(2/7) = 10/7

Return to Section 2.5
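The three binomial answers above all follow from E = np and Var = np(1 − p); a short exact-arithmetic sketch (the counts 2 clear and 5 colored out of 7 come from the exercise) confirms them:

```python
from fractions import Fraction

# (mean, variance) for Q ~ b(3, 2/7), V ~ b(7, 2/7), W ~ b(7, 5/7)
cases = {"Q": (3, Fraction(2, 7)), "V": (7, Fraction(2, 7)), "W": (7, Fraction(5, 7))}
stats = {name: (n * p, n * p * (1 - p)) for name, (n, p) in cases.items()}
print(stats["Q"])  # (Fraction(6, 7), Fraction(30, 49))
print(stats["V"])  # (Fraction(2, 1), Fraction(10, 7))
print(stats["W"])  # (Fraction(5, 1), Fraction(10, 7))
```

Note that V and W share the same variance, as they must, since W = 7 − V.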

  11. 5. A random variable X has p.m.f. f(x) = 2(1/3)^x if x = 1, 2, 3, … .

(a) Verify that f(x) is a p.m.f.
Σ_{x=1}^{∞} 2(1/3)^x = 2 Σ_{x=1}^{∞} (1/3)^x = (2/3) Σ_{x=0}^{∞} (1/3)^x = (2/3) · 1/(1 − 1/3) = 1

(b) Find the m.g.f. for X.
M(t) = E(e^(tX)) = Σ_{x=1}^{∞} e^(tx) (2)(1/3)^x = 2 Σ_{x=1}^{∞} (e^t/3)^x = 2(e^t/3) Σ_{x=1}^{∞} (e^t/3)^(x−1)
     = 2(e^t/3)[1 + (e^t/3) + (e^t/3)² + (e^t/3)³ + (e^t/3)⁴ + …] = 2(e^t/3) · 1/(1 − e^t/3) = 2e^t/(3 − e^t) for t < ln(3)

  12. (c) Use the moment generating function to find E(X) and Var(X).

Since M(t) = 2/(3e^(−t) − 1),
M′(t) = 6e^(−t)(3e^(−t) − 1)^(−2), and
M″(t) = −6e^(−t)(3e^(−t) − 1)^(−2) + 36e^(−2t)(3e^(−t) − 1)^(−3),
then M′(0) = 6/4 = 3/2 and M″(0) = −6/4 + 36/8 = 3.

E(X) = μ = 3/2
Var(X) = σ² = E(X²) − [E(X)]² = 3 − (3/2)² = 3/4
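As a sanity check (not in the original slides), E(X) = 3/2 and Var(X) = 3/4 can also be obtained by summing the series for the moments directly; the tail past x = 100 is negligible:

```python
# Direct series check of E(X) = 3/2 and Var(X) = 3/4 for f(x) = 2(1/3)^x, x = 1, 2, 3, ...
xs = range(1, 101)
probs = [2 * (1 / 3) ** x for x in xs]
total = sum(probs)
mean = sum(x * fx for x, fx in zip(xs, probs))
var = sum(x * x * fx for x, fx in zip(xs, probs)) - mean ** 2
print(round(total, 9), round(mean, 9), round(var, 9))  # 1.0 1.5 0.75
```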

  13. 6. Among all the boxes of cereal produced at a certain factory, 8% are underweight. Boxes are randomly selected and weighed, and the following random variables are defined:
X = number of boxes weighed to get the first underweight box
Y = number of boxes weighed to get the third underweight box
V = number of acceptable boxes weighed to get the first underweight box
W = number of acceptable boxes weighed to get the third underweight box
S = number of boxes weighed before the third underweight box

(a) Find each of the following:
P(X = 1) = 0.08
P(X = 2) = (0.92)(0.08)
P(X = 3) = (0.92)²(0.08)
P(X = 4) = (0.92)³(0.08)

  14. (b) Find each of the following:
P(Y = 3) = (0.08)³
P(Y = 4) = C(3, 1)(0.92)(0.08)³
P(Y = 5) = C(4, 2)(0.92)²(0.08)³
P(Y = 6) = C(5, 3)(0.92)³(0.08)³

(c) Find the p.m.f. of X.
f₁(x) = (0.92)^(x−1)(0.08) if x = 1, 2, 3, …

(d) Find the p.m.f. of Y.
f₂(y) = C(y − 1, y − 3)(0.92)^(y−3)(0.08)³ if y = 3, 4, 5, … (C(y − 1, y − 3) is the same as C(y − 1, 2))

  15. Suppose independent Bernoulli trials are performed until the rth success is observed. If the random variable X is defined to be the number of trials to observe the rth success, then the random variable X is said to have a negative binomial distribution; in the special case where r = 1, we say that X has a geometric distribution.

The p.m.f. of X is f(x) = C(x − 1, r − 1) p^r (1 − p)^(x−r) if x = r, r + 1, r + 2, … (C(x − 1, r − 1) is the same as C(x − 1, x − r)).
E(X), Var(X), and the m.g.f. of X are found in Class Exercise #8.

  16. 7. Do Text Exercise 2.5-19. Let R(t) = ln[M(t)].

R′(t) = M′(t)/M(t)
R″(t) = (M(t)M″(t) − [M′(t)]²) / [M(t)]²

R′(0) = M′(0)/M(0) = E(X)/1 = E(X) = μ
R″(0) = (M(0)M″(0) − [M′(0)]²) / [M(0)]² = (E(X²) − [E(X)]²) / 1² = Var(X) = σ²
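Exercise 2.5-19's conclusion can be spot-checked numerically on a concrete m.g.f.; here, as an illustrative choice, the b(5, 0.4) m.g.f. is used, for which μ = 2 and σ² = 1.2:

```python
import math

# Check R'(0) = mu and R''(0) = sigma^2 where R(t) = ln M(t), for b(5, 0.4)
n, p = 5, 0.4

def R(t):
    return math.log((1 - p + p * math.exp(t)) ** n)

h = 1e-5
r1 = (R(h) - R(-h)) / (2 * h)            # approximates R'(0)
r2 = (R(h) - 2 * R(0) + R(-h)) / h ** 2  # approximates R''(0)
print(round(r1, 4))  # 2.0
print(round(r2, 3))  # 1.2
```

R(t) is the cumulant generating function; taking the logarithm first often simplifies the derivatives, as the negative binomial derivation below shows.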

  17. Look at the handout "Some Well-Known Series":
1/(1 − v) = 1 + v + v² + v³ + v⁴ + v⁵ + v⁶ + … for −1 < v < 1
Take the anti-derivative of both sides to get
−ln(1 − v) = v + v²/2 + v³/3 + v⁴/4 + v⁵/5 + v⁶/6 + … for −1 < v < 1
Let w = 1 − v:
−ln(w) = (1 − w) + (1 − w)²/2 + (1 − w)³/3 + (1 − w)⁴/4 + (1 − w)⁵/5 + (1 − w)⁶/6 + … for 0 < w < 2
ln(w) = (w − 1) − (w − 1)²/2 + (w − 1)³/3 − (w − 1)⁴/4 + (w − 1)⁵/5 − (w − 1)⁶/6 + … for 0 < w < 2

  18. 1/(1 − w) = 1 + w + w² + w³ + w⁴ + w⁵ + w⁶ + … for −1 < w < 1
Take the derivative of both sides to get
1/(1 − w)² = 1 + 2w + 3w² + 4w³ + 5w⁴ + 6w⁵ + 7w⁶ + … for −1 < w < 1
Take the derivative of both sides again to get
2/(1 − w)³ = 2 + (3)(2)w + (4)(3)w² + (5)(4)w³ + (6)(5)w⁴ + (7)(6)w⁵ + … for −1 < w < 1
Divide both sides by 2 to get

  19. 1/(1 − w)³ = 1 + 3w + [(4)(3)/2]w² + [(5)(4)/2]w³ + [(6)(5)/2]w⁴ + [(7)(6)/2]w⁵ + … for −1 < w < 1

The coefficients are C(2, 2), C(3, 2), C(4, 2), C(5, 2), C(6, 2), C(7, 2), …, and we can recognize the general pattern to be

1/(1 − w)^r = Σ_{x=r}^{∞} C(x − 1, r − 1) w^(x−r) for |w| < 1
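The general pattern can be verified numerically by truncating the series; r = 3 and w = 0.4 are arbitrary test values:

```python
from math import comb

# Verify 1/(1-w)^r = sum over x >= r of C(x-1, r-1) w^(x-r), truncated at x = 300
r, w = 3, 0.4
series = sum(comb(x - 1, r - 1) * w ** (x - r) for x in range(r, 300))
closed_form = (1 - w) ** (-r)
print(abs(series - closed_form) < 1e-9)  # True
```

This identity is exactly what makes the negative binomial p.m.f. sum to 1 and is the key step in the m.g.f. derivation below.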

  20. 8. A random variable X has a negative binomial distribution. Find the m.g.f. for X, and use the m.g.f. to find E(X) and Var(X).

M(t) = E(e^(tX)) = Σ_{x=r}^{∞} e^(tx) C(x − 1, r − 1) p^r (1 − p)^(x−r) = Σ_{x=r}^{∞} C(x − 1, r − 1) (pe^t)^r [(1 − p)e^t]^(x−r)
     = (pe^t)^r Σ_{x=r}^{∞} C(x − 1, r − 1) [(1 − p)e^t]^(x−r) = (pe^t)^r / [1 − (1 − p)e^t]^r for t < −ln(1 − p)

R(t) = ln[M(t)] = r ln(pe^t) − r ln[1 − (1 − p)e^t] = r ln(p) + rt − r ln[1 − (1 − p)e^t]
R′(t) = r + r(1 − p)e^t / [1 − (1 − p)e^t]
R″(t) = r(1 − p)e^t / [1 − (1 − p)e^t]²

E(X) = μ = R′(0) = r + r(1 − p)/p = r/p
Var(X) = σ² = R″(0) = r(1 − p)/p²
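The results E(X) = r/p and Var(X) = r(1 − p)/p² can be checked by summing the p.m.f. directly; r = 3 and p = 0.25 are arbitrary test choices, and the support is truncated where the tail is negligible:

```python
from math import comb

# Direct check of E(X) = r/p and Var(X) = r(1-p)/p^2 for the negative binomial
r, p = 3, 0.25
support = range(r, 600)  # tail beyond x = 600 is negligible for these parameters
f = [comb(x - 1, r - 1) * p ** r * (1 - p) ** (x - r) for x in support]
mean = sum(x * fx for x, fx in zip(support, f))
var = sum(x * x * fx for x, fx in zip(support, f)) - mean ** 2
print(round(mean, 6))  # 12.0  (= r/p)
print(round(var, 6))   # 36.0  (= r(1-p)/p^2)
```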

  21. Suppose independent Bernoulli trials are performed until the rth success is observed. If the random variable X is defined to be the number of trials to observe the rth success, then the random variable X is said to have a negative binomial distribution; in the special case where r = 1, we say that X has a geometric distribution.

The p.m.f. of X is f(x) = C(x − 1, r − 1) p^r (1 − p)^(x−r) if x = r, r + 1, r + 2, … (C(x − 1, r − 1) is the same as C(x − 1, x − r)).
E(X) = r/p
Var(X) = r(1 − p)/p²
The m.g.f. of X is M(t) = (pe^t)^r / [1 − (1 − p)e^t]^r for t < −ln(1 − p).

Return to Class Exercise #6

  22. (e) Find each of the following:
E(X) = μ_X = 1/0.08 = 25/2 = 12.5
Var(X) = σ²_X = (1)(0.92)/(0.08)² = 575/4 = 143.75
E(Y) = μ_Y = 3/0.08 = 75/2 = 37.5
Var(Y) = σ²_Y = (3)(0.92)/(0.08)² = 1725/4 = 431.25
E(V) = μ_V = E(X − 1) = 23/2 = 11.5
Var(V) = σ²_V = Var(X − 1) = Var(X) = 575/4 = 143.75

  23. E(W) = μ_W = E(Y − 3) = 69/2 = 34.5
Var(W) = σ²_W = Var(Y − 3) = Var(Y) = 1725/4 = 431.25
E(S) = μ_S = E(Y − 1) = 73/2 = 36.5
Var(S) = σ²_S = Var(Y − 1) = Var(Y) = 1725/4 = 431.25
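All five means and variances above follow from the geometric (r = 1) and negative binomial (r = 3) formulas with p = 0.08, plus the fact that shifting by a constant changes the mean but not the variance; a short sketch confirms the arithmetic:

```python
# Exercise 6(e): means and variances from the geometric/negative binomial formulas
p, r = 0.08, 3
EX, VX = 1 / p, (1 - p) / p ** 2      # X: geometric, r = 1
EY, VY = r / p, r * (1 - p) / p ** 2  # Y: negative binomial, r = 3
print(round(EX, 2), round(VX, 2))     # 12.5 143.75
print(round(EY, 2), round(VY, 2))     # 37.5 431.25
# V = X - 1, W = Y - 3, S = Y - 1: only the means shift
print(round(EX - 1, 2), round(EY - 3, 2), round(EY - 1, 2))  # 11.5 34.5 36.5
```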

  24. (f) Find each of the following:
the p.m.f. of V: f₃(v) = (0.92)^v (0.08) if v = 0, 1, 2, …
the p.m.f. of W: f₄(w) = C(w + 2, w)(0.92)^w (0.08)³ if w = 0, 1, 2, … (C(w + 2, w) is the same as C(w + 2, 2))
the p.m.f. of S: f₅(s) = C(s, s − 2)(0.92)^(s−2)(0.08)³ if s = 2, 3, 4, … (C(s, s − 2) is the same as C(s, 2))

  25. (g) Find the expected number of boxes that must be weighed in order to obtain the third underweight box.
E(Y) = μ_Y = 75/2 = 37.5

(h) Find the probability that exactly 5 boxes must be weighed in order to find the first underweight box (i.e., the fifth box weighed is the first underweight box).
P(X = 5) = (0.92)⁴(0.08) = 0.0573

(i) Find the probability that at least 5 boxes must be weighed in order to find the first underweight box.
P(X ≥ 5) = P(first 4 boxes weighed are acceptable) = (0.92)⁴ = 0.7164

  26. (j) Find the probability that at most 5 boxes must be weighed in order to find the first underweight box.
P(X ≤ 5) = 1 − P(X ≥ 6) = 1 − P(first 5 boxes weighed are acceptable) = 1 − (0.92)⁵ = 0.3409

(k) Find the probability that exactly 8 boxes must be weighed in order to find the third underweight box (i.e., exactly 5 acceptable boxes are weighed before the third underweight box).
P(Y = 8) = C(7, 2)(0.92)⁵(0.08)³ = 0.0071
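The geometric probabilities in parts (h) through (j) are one-liners to verify:

```python
# Exercise 6(h)-(j): geometric probabilities with p = 0.08, q = 0.92
p, q = 0.08, 0.92
print(round(q ** 4 * p, 4))   # 0.0573  P(X = 5)
print(round(q ** 4, 4))       # 0.7164  P(X >= 5)
print(round(1 - q ** 5, 4))   # 0.3409  P(X <= 5)
```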

  27. (l) Find the probability that at least 6 boxes must be weighed in order to find the second underweight box (i.e., at least 4 acceptable boxes are weighed before the second underweight box).
P(first 5 boxes are all acceptable or contain exactly 1 underweight box)
= P(first 5 boxes are all acceptable) + P(first 5 boxes contain exactly 1 underweight box)
= (0.92)⁵ + C(5, 1)(0.92)⁴(0.08) = 0.9456

  28. (m) Find the probability that at most 6 boxes must be weighed in order to find the second underweight box (i.e., at most 4 acceptable boxes are weighed before the second underweight box).
1 − P(at least 7 boxes must be weighed to find the 2nd underweight box)
= 1 − [(0.92)⁶ + C(6, 1)(0.92)⁵(0.08)] = 0.0773
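Parts (k) through (m) use the same 8%/92% figures; checking them numerically:

```python
from math import comb

# Exercise 6(k)-(m): negative binomial probabilities with p = 0.08, q = 0.92
p, q = 0.08, 0.92
print(round(comb(7, 2) * q ** 5 * p ** 3, 4))            # 0.0071  P(Y = 8)
print(round(q ** 5 + comb(5, 1) * q ** 4 * p, 4))        # 0.9456  part (l)
print(round(1 - (q ** 6 + comb(6, 1) * q ** 5 * p), 4))  # 0.0773  part (m)
```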

  29. 9. Find E(X) and Var(X), if the moment generating function of X is:

(a) M(t) = [0.64e^t / (1 − 0.36e^t)]¹⁰ for t < −ln(0.36)
We recognize this as the moment generating function for a negative binomial distribution with p = 0.64 and r = 10.
E(X) = 10/0.64 = 125/8
Var(X) = (10)(0.36)/(0.64)² = 1125/128

(b) M(t) = (4e^(−t) − 3)^(−1) for t < ln(4/3)
After some algebra, we recognize this as the moment generating function for a geometric distribution with p = 1/4.
E(X) = 4
Var(X) = 12

  30. (c) M(t) = (4e^(−5t) − 3)^(−1) for t < (1/5)ln(4/3)
We do not recognize this moment generating function.
M′(t) = 20e^(−5t)(4e^(−5t) − 3)^(−2)
M″(t) = −100e^(−5t)(4e^(−5t) − 3)^(−2) + 800e^(−10t)(4e^(−5t) − 3)^(−3)
M′(0) = 20  M″(0) = 700
E(X) = 20  Var(X) = 700 − (20)² = 300

  31. (d) M(t) = 4/(4 − e^t) − 1/3 for t < ln(4)
We do not recognize this moment generating function. Write M(t) = 4(4 − e^t)^(−1) − 1/3.
M′(t) = 4e^t(4 − e^t)^(−2)
M″(t) = 4e^t(4 − e^t)^(−2) + 8e^(2t)(4 − e^t)^(−3)
M′(0) = 4/9  M″(0) = 4/9 + 8/27 = 20/27
E(X) = 4/9  Var(X) = 20/27 − (4/9)² = 44/81

(e) M(t) = (0.2 + 0.8e^t)²⁷ for −∞ < t < ∞
We recognize this as the moment generating function for a binomial distribution with p = 0.8 and n = 27.
E(X) = (27)(0.8) = 21.6  Var(X) = (27)(0.8)(0.2) = 4.32

  32. 10. (a) Two people, Ms. A and Mr. B, take turns flipping a coin which has probability p of displaying heads, and Ms. A gets to flip first. Suppose the person who gets a head first wins the game. Find the probability that Ms. A wins with a fair coin (i.e., p = 0.5).

P(Ms. A wins) = P(H) + P(TTH) + P(TTTTH) + …
= (1/2) + (1/2)³ + (1/2)⁵ + (1/2)⁷ + …
= (1/2){1 + (1/2)² + [(1/2)²]² + [(1/2)²]³ + …}
= (1/2) · 1/(1 − 1/4) = 2/3

  33. (b) For what value(s) of p will the game in part (a) be fair (i.e., both Ms. A and Mr. B have an equal chance of winning)?

P(Ms. A wins) = P(H) + P(TTH) + P(TTTTH) + …
= p + (1 − p)²p + (1 − p)⁴p + (1 − p)⁶p + …
= p{1 + (1 − p)² + [(1 − p)²]² + [(1 − p)²]³ + …}
= p / [1 − (1 − p)²]

For what value(s) of p is p / [1 − (1 − p)²] = 1/2? Since p / [1 − (1 − p)²] = p / (2p − p²) = 1/(2 − p) > 1/2 for every p with 0 < p ≤ 1, there are no values of p which make this equation true, and therefore it is not possible for the game to be fair.

  34. (c) Suppose Ms. A must get a head to win, and Mr. B must get a tail to win. For what value(s) of p will this game be fair?

P(Ms. A wins) = P(H) + P(THH) + P(THTHH) + …
= p + (1 − p)p² + (1 − p)²p³ + (1 − p)³p⁴ + …
= p{1 + (1 − p)p + [(1 − p)p]² + [(1 − p)p]³ + …}
= p / [1 − (1 − p)p]

For what value(s) of p is p / [1 − (1 − p)p] = 1/2? Solving 2p = 1 − (1 − p)p, that is, p² − 3p + 1 = 0, gives
p = (3 − √5)/2 ≈ 0.382.
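The quadratic in part (c) has two roots; only (3 − √5)/2 lies in (0, 1), and plugging it back in confirms the game is fair at that value:

```python
import math

# Part (c): p^2 - 3p + 1 = 0 follows from p / (1 - (1-p)p) = 1/2;
# the quadratic formula gives the root in (0, 1)
p = (3 - math.sqrt(5)) / 2
print(round(p, 3))  # 0.382
# sanity check: Ms. A's winning probability at this p is exactly 1/2
print(round(p / (1 - (1 - p) * p), 10))  # 0.5
```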

  35. 11. (a) A factory produces pieces of candy in equal proportions of eight different colors: red, orange, purple, pink, blue, green, brown, and white. If random pieces of candy are purchased, find the expected number of pieces of candy to obtain at least one of each color.

Define the following random variables:
X1 = number of purchases to obtain any color
X2 = number of purchases to obtain a color different from the 1st color obtained
X3 = number of purchases to obtain a color different from the 1st and 2nd colors obtained
...
X8 = number of purchases to obtain a color different from the 1st, 2nd, 3rd, …, and 7th colors obtained

The expected number of purchases to obtain one candy of each color is
E(X1 + X2 + X3 + X4 + X5 + X6 + X7 + X8) = E(X1) + E(X2) + … + E(X8).

  36. X1 has a geometric distribution with p = 8/8 = 1, that is, X1 = 1.
X2 has a geometric distribution with p = 7/8.
X3 has a geometric distribution with p = 6/8 = 3/4.
X4 has a geometric distribution with p = 5/8.
X5 has a geometric distribution with p = 4/8 = 1/2.
X6 has a geometric distribution with p = 3/8.
X7 has a geometric distribution with p = 2/8 = 1/4.
X8 has a geometric distribution with p = 1/8.

For k = 1, 2, …, 8, Xk has a geometric distribution with p = (9 − k)/8, and E(Xk) = 8/(9 − k).
E(X1) + E(X2) + … + E(X8) = 8/8 + 8/7 + 8/6 + … + 8/2 + 8/1 ≈ 21.743
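The final sum can be computed exactly with rational arithmetic before rounding:

```python
from fractions import Fraction

# Expected purchases to collect all 8 colors: sum of E(X_k) = 8/(9-k) for k = 1..8
expected = sum(Fraction(8, 9 - k) for k in range(1, 9))
print(expected)                   # 761/35
print(round(float(expected), 3))  # 21.743
```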

  37. (b) Sandy has three pieces of candy in her pocket: a red, an orange, and a purple. She is carrying a lot of material and only has one hand free. With her free hand, she reaches into her pocket to select one piece of candy at random. If it is the purple candy, she holds onto it. If it is not purple, she holds the selected piece in her palm while selecting one of the other two, after which she releases the piece of candy in her palm. She repeats this process until she obtains the purple piece. Find the expected number of selections to obtain the purple piece.

X = number of selections to obtain the purple candy
Space of X = {1, 2, 3, …}
P(X = 1) = 1/3
P(X = 2) = (2/3)(1/2)
P(X = 3) = (2/3)(1/2)²
P(X = 4) = (2/3)(1/2)³

The p.m.f. of X is f(x) = 1/3 if x = 1; (2/3)(1/2)^(x−1) if x = 2, 3, 4, …

  38. M(t) = E(e^(tX)) = (1/3)e^t + Σ_{x=2}^{∞} e^(tx)(2/3)(1/2)^(x−1)
= (1/3)e^t + (e^(2t)/3) Σ_{x=2}^{∞} e^(t(x−2))(1/2)^(x−2)
= (1/3)e^t + (e^(2t)/3)[1 + (e^t/2) + (e^t/2)² + (e^t/2)³ + (e^t/2)⁴ + …]
= (1/3)e^t + (e^(2t)/3) · 1/(1 − e^t/2)
= e^t/3 + 2e^(2t)/(6 − 3e^t) if t < ln(2)

M′(t) = e^t/3 + 4e^(2t)(6 − 3e^t)^(−1) + 2e^(2t)(−1)(6 − 3e^t)^(−2)(−3e^t)
E(X) = M′(0) = 1/3 + 4/3 + 6/9 = 7/3
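The answer E(X) = 7/3 can also be checked without the m.g.f., by summing the p.m.f. directly; the tail past x = 100 is negligible:

```python
# Exercise 11(b): check total probability and E(X) for
# f(1) = 1/3 and f(x) = (2/3)(1/2)^(x-1) for x = 2, 3, 4, ...
total = 1 / 3 + sum((2 / 3) * (1 / 2) ** (x - 1) for x in range(2, 100))
mean = 1 / 3 + sum(x * (2 / 3) * (1 / 2) ** (x - 1) for x in range(2, 100))
print(round(total, 9))  # 1.0
print(round(mean, 9))   # 2.333333333  (= 7/3)
```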

  39. 12. The random variable X has p.m.f.
f(x) = 3/4 if x = 0; (1/5)^x if x = 1, 2, 3, …
Verify that f(x) is a p.m.f., and find E(X).

The sum of the probabilities is
3/4 + 1/5 + (1/5)² + (1/5)³ + … = 3/4 + (1/5) · 1/(1 − 1/5) = 3/4 + 1/4 = 1,
so f(x) is a p.m.f.

E(X) = (0)(3/4) + (1)(1/5) + (2)(1/5)² + (3)(1/5)³ + … = (1/5) · 1/(1 − 1/5)² = 5/16

Note: The mean could also be found by first finding the m.g.f.
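Both sums use the geometric-series identities Σ_{x≥1} v^x = v/(1 − v) and Σ_{x≥1} x v^x = v/(1 − v)², here with v = 1/5; an exact-arithmetic check:

```python
from fractions import Fraction

# Exercise 12: total probability and mean via geometric-series identities, v = 1/5
v = Fraction(1, 5)
total = Fraction(3, 4) + v / (1 - v)
mean = v / (1 - v) ** 2
print(total)  # 1
print(mean)   # 5/16
```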
