
07. Midterm Practice


Presentation Transcript


  1. Conditional Independence 郭俊利 2009/04/13 07. Midterm Practice

  2. 2.1 ~ 4.1 Concept • Homework → p(n) > 1 • Discrete → f(x) > 1 • Continuous → F(x) > 1

  3. Basic Probability • Set • P(A∪B∪C) = P(A) + P(A^c∩B) + P(A^c∩B^c∩C) • Condition • P(A∩B∩C) = P(A) P(B|A) P(C|A∩B) • Independence • P(A∩B) = P(A) P(B) • Problem 1.31 – error bit
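
The set decomposition and the multiplication (chain) rule are easy to sanity-check with a short Monte Carlo sketch; the die-based events A, B, C below are arbitrary choices made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
rolls = rng.integers(1, 7, size=1_000_000)   # one fair die roll per trial

A = rolls <= 4              # illustrative events, not from the slides
B = rolls % 2 == 0
C = rolls >= 3

# P(A∪B∪C) = P(A) + P(A^c∩B) + P(A^c∩B^c∩C)
print(np.mean(A | B | C), np.mean(A) + np.mean(~A & B) + np.mean(~A & ~B & C))

# P(A∩B∩C) = P(A) P(B|A) P(C|A∩B)
print(np.mean(A & B & C), np.mean(A) * np.mean(B[A]) * np.mean(C[A & B]))
```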

  4. Expectation • E[a] = a • E[aX + b] = aE[X] + b • E[g(X)] = Σ_x g(x) p_X(x) • E[X] = Σ_i E[X_i] = np (when the X_i are Bernoulli with the same p) • E[X] = Σ_i P(A_i) E[X | A_i] • E[X] = E[E[X|Y]] • var(aX + b) = a^2 var(X)
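
These rules can be verified numerically; the sketch below picks a fair die for X and an even/odd partition for the A_i, both arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.integers(1, 7, size=1_000_000)    # X: a fair die (illustrative choice)
a, b = 3.0, 2.0

# E[aX + b] = aE[X] + b   and   var(aX + b) = a^2 var(X)
print(np.mean(a * x + b), a * np.mean(x) + b)
print(np.var(a * x + b), a**2 * np.var(x))

# Total expectation: E[X] = Σ_i P(A_i) E[X | A_i] with A_1 = {X even}, A_2 = {X odd}
even = x % 2 == 0
print(np.mean(x),
      np.mean(even) * np.mean(x[even]) + np.mean(~even) * np.mean(x[~even]))
```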

  5. Graph of f(x) • Mean → the center of the graph (no need to find f(x)) • Variance → E[X^2] – (E[X])^2 (needs f(x) or a formula) • [Figure: a density/PMF plot with heights 2/3 and 1/3 and x-axis ticks at 1 and 2]
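
As a worked instance of var(X) = E[X^2] – (E[X])^2, here is a two-point PMF; the values p(1) = 2/3, p(2) = 1/3 are only a guess at what the slide's figure showed.

```python
# Hypothetical PMF read off the slide's figure: p(1) = 2/3, p(2) = 1/3
pmf = {1: 2/3, 2: 1/3}

mean = sum(x * p for x, p in pmf.items())        # E[X]   = 4/3
ex2 = sum(x**2 * p for x, p in pmf.items())      # E[X^2] = 2
print(mean, ex2 - mean**2)                       # var(X) = E[X^2] - (E[X])^2 = 2/9
```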

  6. Other E.V. • Independence (f(x, y) = f_X(x) f_Y(y)) • E[XY] = E[X] E[Y] • var(X + Y) = var(X) + var(Y) • Sum (T = X_1 + … + X_N, with N itself random) • E[T] = E[X] E[N] • var(T) = var(X) E[N] + (E[X])^2 var(N)
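
The random-sum identities can be checked by simulation; the particular choices N ~ Binomial(10, 0.5) and X_i ~ Uniform(0, 2) below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
n = rng.binomial(10, 0.5, size=100_000)                       # N: number of terms
t = np.array([rng.uniform(0, 2, size=k).sum() for k in n])    # T = X_1 + ... + X_N

ex, vx = 1.0, 1/3     # E[X], var(X) for Uniform(0, 2)
en, vn = 5.0, 2.5     # E[N], var(N) for Binomial(10, 0.5)

print(t.mean(), ex * en)                  # E[T] = E[X] E[N]
print(t.var(), vx * en + ex**2 * vn)      # var(T) = var(X) E[N] + (E[X])^2 var(N)
```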

  7. Conditional Sum of Independence 1st • Let X_1, X_2 and X_3 be independent and identically distributed binomial random variables, that is, P(X_i = k) = C(n, k) p^k (1 − p)^(n−k), 0 ≤ k ≤ n. • Compute the PMF of Z = X_1 + X_2 + X_3 • Compute E[Z], var(Z) • E[X_i] = np; var(X_i) = np(1 − p)
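
Since the X_i are i.i.d. Binomial(n, p), Z = X_1 + X_2 + X_3 is Binomial(3n, p), so E[Z] = 3np and var(Z) = 3np(1 − p). A quick scipy check, with n = 4 and p = 0.3 chosen only for illustration:

```python
from scipy.stats import binom

n, p = 4, 0.3                                          # illustrative parameters
mean_z, var_z = binom.stats(3 * n, p, moments="mv")    # Z ~ Binomial(3n, p)
print(mean_z, 3 * n * p)                               # E[Z] = 3np
print(var_z, 3 * n * p * (1 - p))                      # var(Z) = 3np(1 - p)
```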

  8. Joint • f_{X,Y}(x, y) = f_Y(y) f_{X|Y}(x|y) • X and Y are independent, both in [0, 2], and f_{X,Y}(x, y) = xy / 4; find E[f(X, Y)]. Double integration, or… ∵ f(x, y) = f_X(x) f_Y(y) ∴ f_X(x) = x / 2, f_Y(y) = y / 2 (or the factorization f(x) = x, f(y) = y / 4)
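
The marginals follow by integrating the joint density over the other variable, f_X(x) = ∫₀² (xy/4) dy = x/2; a short numerical check of that integral (the sample points x = 0.5, 1.0, 1.5 are arbitrary):

```python
import numpy as np

f = lambda x, y: x * y / 4            # joint density on [0, 2] x [0, 2]
y = np.linspace(0, 2, 10_001)

for x0 in (0.5, 1.0, 1.5):
    fx = np.trapz(f(x0, y), y)        # f_X(x0) = ∫ f(x0, y) dy
    print(fx, x0 / 2)                 # should match x0 / 2
```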

  9. Important Random Variables • Bernoulli: p_X(k) = p (k = 1), 1 − p (k = 0); E[X] = p; var(X) = p(1 − p) • Binomial: p_X(k) = C(n, k) p^k (1 − p)^(n−k); E[X] = np; var(X) = np(1 − p) • Geometric: p_X(k) = (1 − p)^(k−1) p; E[X] = 1/p; var(X) = (1 − p)/p^2
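
The table agrees with scipy.stats, which uses the same conventions (geom counts trials up to and including the first success); p = 0.3 and n = 10 below are arbitrary.

```python
from scipy.stats import bernoulli, binom, geom

p, n = 0.3, 10
print(bernoulli.stats(p, moments="mv"))   # (p, p(1-p))
print(binom.stats(n, p, moments="mv"))    # (np, np(1-p))
print(geom.stats(p, moments="mv"))        # (1/p, (1-p)/p^2)
```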

  10. Geometric random variables • Xiao-Quan plays rock-paper-scissors against another player and keeps playing until he loses. • Find the expected number of rounds. • If Xiao-Quan has already won 3 times and drawn 2 times, how many rounds does he expect to play? • Problems 2.22, 2.23 • Non-memoryless
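
A simulation sketch of the game, under the assumption that each round independently ends in a win, draw, or loss with probability 1/3 each, so the round of the first loss is Geometric(1/3):

```python
import numpy as np

rng = np.random.default_rng(3)
rounds = rng.geometric(1/3, size=500_000)   # round index of the first loss

print(rounds.mean())                        # ≈ 1/p = 3

# Given the first 5 rounds were not losses (3 wins + 2 draws already played):
print(rounds[rounds > 5].mean())            # ≈ 5 + 3, since the geometric is memoryless
```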

  11. Conditional Sum of Independence 2nd • Suppose that X and Y are independent and identically distributed geometric random variables with parameter p, that is, P(X = k) = P(Y = k) = q^(k−1) p, k ≥ 1. • Compute P(X = i | X + Y = n), i = 1, 2, ..., n − 1. • Compute E[X | X + Y = n], var(X | X + Y = n). • P(X = i | X + Y = n) = P(X = i ∩ X + Y = n) / P(X + Y = n) = P(X = i ∩ Y = n − i) / P(X + Y = n) = P(X = i) P(Y = n − i) / P(X + Y = n) = 1 / (n − 1)
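
Because the conditional PMF is uniform on {1, …, n − 1}, E[X | X + Y = n] = n/2 and var(X | X + Y = n) = ((n − 1)^2 − 1)/12. A simulation check, with p = 0.4 and n = 6 chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 0.4, 6                              # illustrative parameters
x = rng.geometric(p, size=2_000_000)
y = rng.geometric(p, size=2_000_000)

cond = x[x + y == n]                       # samples of X given X + Y = n
for i in range(1, n):
    print(i, np.mean(cond == i))           # each ≈ 1/(n-1) = 0.2
print(cond.mean(), n / 2)                  # E[X | X+Y=n]
print(cond.var(), ((n - 1)**2 - 1) / 12)   # var(X | X+Y=n)
```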

  12. Exponential random variable • f(x) = λe^(–λx), x ≥ 0 • P(X ≥ a) = ∫_a^∞ λe^(–λx) dx = –e^(–λx) |_a^∞ = e^(–λa) • F(x) = 1 – e^(–λx) (compare Geometric: F(n) = 1 – (1 – p)^n) • E[X] = 1 / λ • var(X) = 1 / λ^2 (E[X^2] = 2 / λ^2)
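
These facts are easy to confirm by simulation; λ = 0.5 and a = 3 below are arbitrary (note that numpy parametrizes the exponential by its mean 1/λ):

```python
import numpy as np

lam, a = 0.5, 3.0                                  # illustrative values
rng = np.random.default_rng(5)
x = rng.exponential(1 / lam, size=1_000_000)       # scale = 1/λ

print(np.mean(x >= a), np.exp(-lam * a))           # P(X ≥ a) = e^(-λa)
print(x.mean(), 1 / lam)                           # E[X] = 1/λ
print(x.var(), 1 / lam**2)                         # var(X) = 1/λ^2
```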

  13. Exponential Examples • The time spent on a task is modeled as an exponential random variable. On average Xiao-Ming completes the task in 10 hours. What is the probability that Xiao-Ming finishes the task early (ahead of schedule)? • f(x) = ½ λe^(–λx), x ≥ 0; ½ λe^(λx), x < 0. F(x) = ?
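
A sketch of both parts. It assumes "early" means finishing within the 10-hour average, so λ = 1/10 and P(X ≤ 10) = 1 − e^(−1); the two-sided density is the Laplace distribution, whose CDF is ½ e^(λx) for x < 0 and 1 − ½ e^(−λx) for x ≥ 0.

```python
import numpy as np

# Part 1 (assumption: "early" = finishing within the 10-hour average)
lam = 1 / 10
print(1 - np.exp(-lam * 10))        # P(X ≤ 10) = 1 - e^(-1) ≈ 0.632

# Part 2: CDF of the two-sided (Laplace) density ½ λ e^(-λ|x|)
def laplace_cdf(x, lam=1.0):
    x = np.asarray(x, dtype=float)
    return np.where(x < 0, 0.5 * np.exp(lam * x), 1 - 0.5 * np.exp(-lam * x))

print(laplace_cdf([-1.0, 0.0, 1.0]))
```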

  14. Condition • P(A|B) = P(A∩B) / P(B) • p_{X|A}(x) = P({X = x} ∩ A) / P(A) • f_{X|A}(x) = f(x) / P(A), for x ∈ A (and 0 otherwise) • Roll a fair die; what is the PMF p(k) of the outcome, given that the roll is even? • Xiao-Wang's arrival time is uniformly distributed between 7:10 and 7:30. Buses come at 7:15 and 7:30. What is the PDF f(x) of Xiao-Wang's waiting time? • Show that the exponential random variable is memoryless.
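
A simulation sketch of the bus example, assuming the waiting time is the time (in minutes) until the next bus: the resulting density is piecewise constant, 1/10 on (0, 5] and 1/20 on (5, 15].

```python
import numpy as np

rng = np.random.default_rng(6)
arrival = rng.uniform(10, 30, size=1_000_000)               # minutes after 7:00
wait = np.where(arrival < 15, 15 - arrival, 30 - arrival)   # time until the next bus

print(np.mean(wait <= 5) / 5)     # ≈ 1/10  (density on (0, 5])
print(np.mean(wait > 5) / 10)     # ≈ 1/20  (density on (5, 15])
```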

  15. Normal random variable • Standard normal distribution • N(–a) = P(Y ≤ –a) = P(Y ≥ a) = 1 – P(Y ≤ a), so N(–a) = 1 – N(a) • CDF • P(X ≤ a) = P(Y ≤ (a – μ)/σ) = N((a – μ)/σ) • The exam grades are modeled by a normal random variable with mean 60 and standard deviation 20. What is the probability that Xiao-Kuo's grade is higher than 70?
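
Standardizing, P(X > 70) = 1 − N((70 − 60)/20) = 1 − N(0.5) ≈ 0.309; the same number from scipy:

```python
from scipy.stats import norm

mu, sigma = 60, 20
print(norm.sf(70, loc=mu, scale=sigma))    # P(X > 70) ≈ 0.3085
print(1 - norm.cdf((70 - mu) / sigma))     # same value after standardizing
```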

  16. F() and f() • Derived distributions • Linear: Y = aX + b • General: y = g(x), x = h(y) → y = g(h(y)), and f_Y(y) = f_X(h(y)) |h′(y)|
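
A quick check of f_Y(y) = f_X(h(y)) |h′(y)| on a simple monotone map chosen only for illustration: Y = X^2 with X uniform on (0, 1), so h(y) = √y, |h′(y)| = 1/(2√y), f_Y(y) = 1/(2√y) and F_Y(y) = √y.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.uniform(0, 1, size=1_000_000)
y = x**2                                  # Y = g(X) = X^2, so h(y) = sqrt(y)

# f_Y(y) = f_X(h(y)) |h'(y)| = 1/(2*sqrt(y)); integrating gives F_Y(y) = sqrt(y)
for y0 in (0.25, 0.5, 0.75):
    print(np.mean(y <= y0), np.sqrt(y0))
```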

  17. Derived Distribution • Find the PDF of Z = g(X, Y) = Y/X • F_Z(z) = ?, 0 ≤ z ≤ 1 • F_Z(z) = ?, z > 1 • f_Z(z) = ?
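
The slide leaves the formulas blank. Assuming X and Y are independent Uniform(0, 1) (the usual textbook setup, not stated here), F_Z(z) = z/2 for 0 ≤ z ≤ 1 and F_Z(z) = 1 − 1/(2z) for z > 1, which a simulation can confirm:

```python
import numpy as np

# Assumption: X, Y independent Uniform(0, 1); the slide does not say so explicitly.
rng = np.random.default_rng(8)
x = rng.uniform(0, 1, size=1_000_000)
y = rng.uniform(0, 1, size=1_000_000)
z = y / x

for z0 in (0.5, 1.0, 2.0, 4.0):
    exact = z0 / 2 if z0 <= 1 else 1 - 1 / (2 * z0)
    print(z0, np.mean(z <= z0), exact)
```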

  18. Linear Mapping • X is an exponential random variable with f_X(x) = λe^(–λx); let Y = –λX + 2. Find the PDF of Y. • f_Y(y) = (λ / |–λ|) e^(–λ (y – 2)/(–λ)) = e^(y – 2), for y ≤ 2
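
A simulation check that Y = –λX + 2 has density e^(y – 2) on (–∞, 2], equivalently F_Y(y) = e^(y – 2); λ = 1.5 is an arbitrary choice.

```python
import numpy as np

lam = 1.5
rng = np.random.default_rng(9)
x = rng.exponential(1 / lam, size=1_000_000)   # f_X(x) = λ e^(-λx)
y = -lam * x + 2                               # Y = -λX + 2, supported on (-inf, 2]

for y0 in (0.0, 1.0, 1.8):
    print(np.mean(y <= y0), np.exp(y0 - 2))    # F_Y(y0) = e^(y0 - 2)
```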

  19. Travel Problem • Xiao-Hua is driving from Boston to New York, a distance of 180 miles. His speed is uniformly distributed between 30 and 60 mph. What is the distribution of the duration of the trip? • f_V(v) = 1/30, 30 ≤ v ≤ 60; T = 180/V • f_T(t) = f_V(180/t) |dv/dt| = (1/30)(180/t^2) = 6/t^2, 3 ≤ t ≤ 6
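
A simulation of the trip time, compared against f_T(t) = 6/t^2 on [3, 6] hours:

```python
import numpy as np

rng = np.random.default_rng(10)
v = rng.uniform(30, 60, size=1_000_000)   # speed in mph
t = 180 / v                               # trip duration in hours, 3 ≤ t ≤ 6

hist, edges = np.histogram(t, bins=np.linspace(3, 6, 7), density=True)
centers = (edges[:-1] + edges[1:]) / 2
print(np.c_[hist, 6 / centers**2])        # empirical density vs 6/t^2
```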

  20. Convolution • W = X + Y • p_W(w) = Σ_x p_X(x) p_Y(w – x) • W = |X| + 2Y, with p_X(x) = 1/3 for x = –1, 0, 1 and p_Y(y) = 1/2 for y = 0, 1/3 for y = 1, 1/6 for y = 2
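
For the second example the PMF of W = |X| + 2Y can be built by summing p_X(x) p_Y(y) over all pairs with |x| + 2y = w, i.e. a discrete convolution of the PMFs of |X| and 2Y; a sketch using exact fractions:

```python
from collections import defaultdict
from fractions import Fraction as F

p_x = {-1: F(1, 3), 0: F(1, 3), 1: F(1, 3)}
p_y = {0: F(1, 2), 1: F(1, 3), 2: F(1, 6)}

# p_W(w) = Σ p_X(x) p_Y(y) over pairs with |x| + 2y = w
p_w = defaultdict(F)
for x, px in p_x.items():
    for y, py in p_y.items():
        p_w[abs(x) + 2 * y] += px * py

for w in sorted(p_w):
    print(w, p_w[w])       # 0:1/6, 1:1/3, 2:1/9, 3:2/9, 4:1/18, 5:1/9
```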

  21. Maximum • W = max{X, Y} (X, Y uniform on [0, 1]): P(X ≤ w) = P(Y ≤ w) = w, so F_W(w) = P(X ≤ w) P(Y ≤ w) = w^2 and f_W(w) = 2w • W = max{X, Y} (X, Y uniform on 0.1, 0.2, …, 1.0): p_W(w) = F_W(w) – F_W(w – 0.1) = w^2 – (w – 0.1)^2
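
A check of both cases, continuous against F_W(w) = w^2 and discrete against p_W(w) = w^2 – (w – 0.1)^2 (the test point w = 0.7 is arbitrary; independence of X and Y is assumed, as in the derivation above):

```python
import numpy as np

rng = np.random.default_rng(11)

# Continuous: X, Y independent Uniform(0, 1)
w = np.maximum(rng.uniform(size=1_000_000), rng.uniform(size=1_000_000))
print(np.mean(w <= 0.7), 0.7**2)                            # F_W(w) = w^2

# Discrete: X, Y independent uniform on {0.1, 0.2, ..., 1.0}
vals = np.arange(1, 11) / 10
wd = np.maximum(rng.choice(vals, size=1_000_000), rng.choice(vals, size=1_000_000))
print(np.mean(np.isclose(wd, 0.7)), 0.7**2 - 0.6**2)        # p_W(w) = w^2 - (w-0.1)^2
```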
