
CHAPTER 5 Pairs of Random Variables





  1. CHAPTER 5 Pairs of Random Variables Prof. Sang-Jo Yoo sjyoo@inha.ac.kr http://multinet.inha.ac.kr

  2. What are we going to study? • Extend the concepts of earlier chapters to two random variables • Joint pmf, cdf, and pdf, used to calculate the probabilities of events that involve the joint behavior of two random variables • Expected values, used to define joint moments • Correlation between random variables that are not independent • Conditional probabilities involving a pair of random variables

  3. Multiple Random Variables • Vector Random Variable • X is a function that assigns a vector of real numbers to each outcome ζ in S (the sample space of the random experiment). • Example • In the random experiment of selecting one student from a class, define the following functions: H(ζ) = height of student ζ in inches, W(ζ) = weight of student ζ in pounds, A(ζ) = age of student ζ in years. • (H(ζ), W(ζ), A(ζ)) is a vector random variable. • A pair of random variables is a function that assigns a pair of real numbers to each outcome in the sample space S. • Example 5.1 • Example 5.2

  4. Two Random Variables • Events involving a pair of RVs (X, Y) are specified by conditions that we are interested in, e.g., A = {X + Y ≤ 10}, B = {min(X, Y) ≤ 5}, C = {X² + Y² ≤ 100} • To determine the probability that the pair X = (X, Y) is in some region B of the plane, find the equivalent event for B in the underlying sample space S: A = {ζ : (X(ζ), Y(ζ)) ∈ B}, so that P[X ∈ B] = P[A]

  5. Events and Probabilities • For an n-dimensional random variable X = (X₁, X₂, ..., Xₙ), we are particularly interested in events that have the product form A = {X₁ ∈ A₁} ∩ {X₂ ∈ A₂} ∩ ... ∩ {Xₙ ∈ Aₙ}, where Aₖ is a one-dimensional event that involves Xₖ only • Probability of product-form events: P[A] = P[{X₁ ∈ A₁} ∩ {X₂ ∈ A₂} ∩ ... ∩ {Xₙ ∈ Aₙ}] • The probability of a non-product-form event B is approximated by the union of disjoint product-form events

  6. Example: Product-Form Events

  7. Pairs of Discrete Random Variables (1) The vector random variable X = (X, Y) assumes values from some countable set S_{X,Y} = {(x_j, y_k), j = 1, 2, ..., k = 1, 2, ...}. The joint probability mass function of X specifies the probability of the product-form event {X = x_j} ∩ {Y = y_k}: p_{X,Y}(x_j, y_k) = P[{X = x_j} ∩ {Y = y_k}] = P[X = x_j, Y = y_k], j = 1, 2, ..., k = 1, 2, .... It can be interpreted as the long-term relative frequency of the joint event {X = x_j} ∩ {Y = y_k} in a sequence of repetitions of the random experiment. The probability of any event B is the sum of the pmf over the outcomes in B: P[X ∈ B] = Σ_{(x_j, y_k) ∈ B} p_{X,Y}(x_j, y_k). Note 5-1 (Example 5.5) How to show a pmf graphically: Figure 5.5
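
The sum-over-outcomes rule is straightforward to mirror in code. Below is a minimal Python sketch (not from the lecture; the pmf values are invented for illustration) that stores a joint pmf as a dictionary keyed by (x, y) pairs and computes P[B] by summing over the qualifying outcomes.

```python
# A joint pmf as a dictionary mapping (x, y) pairs to probabilities.
# The values here are made up purely for illustration.
pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

assert abs(sum(pmf.values()) - 1.0) < 1e-12  # a valid pmf sums to 1

# P[B] for the event B = {X + Y <= 1}: sum the pmf over outcomes in B
p_B = sum(p for (x, y), p in pmf.items() if x + y <= 1)
print(p_B)  # 0.10 + 0.20 + 0.30 = 0.60
```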

  8. Different Pairs of Random Variables

  9. Pairs of Discrete Random Variables (2) • Marginal probability mass functions • The joint pmf of X = (X, Y) provides the information about the joint behavior of X and Y. • We are also interested in the probabilities of events involving each of the RVs in isolation; these are given by the marginal pmf's: p_X(x_j) = P[X = x_j] = Σ_k p_{X,Y}(x_j, y_k). Similarly, p_Y(y_k) = P[Y = y_k] = Σ_j p_{X,Y}(x_j, y_k).

  10. Pairs of Discrete Random Variables (3) • Example • An urn contains 3 red, 4 white, and 5 blue balls, and 3 balls are drawn without replacement. Let X and Y be the number of red and white balls chosen, respectively; find the joint probability mass function of X and Y: p_{X,Y}(j, k) = C(3, j)C(4, k)C(5, 3 − j − k) / C(12, 3), for j ≥ 0, k ≥ 0, j + k ≤ 3.
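
The combinatorial formula above can be checked directly in Python. The sketch below builds the full joint pmf with math.comb and verifies that it sums to one; the specific probe P[X = 1, Y = 1] is just an illustration.

```python
from math import comb

# Urn example: 3 red, 4 white, 5 blue balls; draw 3 without replacement.
# X = number of red, Y = number of white drawn.
# p(j, k) = C(3, j) * C(4, k) * C(5, 3 - j - k) / C(12, 3)
total = comb(12, 3)

pmf = {}
for j in range(4):          # 0..3 red balls
    for k in range(4 - j):  # 0..3-j white balls; the rest are blue
        pmf[(j, k)] = comb(3, j) * comb(4, k) * comb(5, 3 - j - k) / total

assert abs(sum(pmf.values()) - 1.0) < 1e-12
print(pmf[(1, 1)])  # P[X=1, Y=1] = 3*4*5/220 = 60/220 ≈ 0.273
```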

  11. Pairs of Discrete Random Variables (4) • Example 5.9 • The number of bytes N in a message has a geometric distribution with parameter p and range S_N = {0, 1, 2, ...}, i.e., P[N = k] = (1 − p)p^k. Messages are broken into packets of maximum length M bytes. Let Q be the number of full packets in a message, and R be the number of bytes left over. • Find the joint pmf and the marginal pmf's of Q and R. • Solution • Q is the quotient of the division of N by M, and R is the remainder. Q takes on values in {0, 1, ...}, that is, all non-negative integers; R takes on values in {0, 1, ..., M − 1}. • Interestingly, the joint pmf is relatively easy to compute: P[Q = q, R = r] = P[N = qM + r] = (1 − p)p^{qM+r}.

  12. Pairs of Discrete Random Variables (5) • The marginal pmf of Q is given by p_Q(q) = Σ_{r=0}^{M−1} (1 − p)p^{qM+r} = (1 − p)p^{qM}(1 − p^M)/(1 − p) = p^{qM}(1 − p^M), q = 0, 1, 2, ...; that is, Q is geometric with parameter p^M. • The marginal pmf of R is found to be p_R(r) = Σ_{q=0}^{∞} (1 − p)p^{qM+r} = (1 − p)p^r / (1 − p^M), r = 0, 1, ..., M − 1.
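
A quick numerical check of these closed forms, assuming the geometric pmf P[N = k] = (1 − p)p^k stated on the previous slide (the values p = 0.9 and M = 4 are arbitrary):

```python
# Accumulate the marginals of Q = N // M and R = N % M from the pmf of N
# and compare with the closed forms derived above.
p, M = 0.9, 4

def pmf_N(k):
    return (1 - p) * p**k

p_Q, p_R = {}, {}
for k in range(10_000):  # truncate the infinite sum; the tail is negligible
    q, r = divmod(k, M)
    p_Q[q] = p_Q.get(q, 0.0) + pmf_N(k)
    p_R[r] = p_R.get(r, 0.0) + pmf_N(k)

print(abs(p_Q[2] - p**(2 * M) * (1 - p**M)))   # ~0: matches p_Q(q)
print(abs(p_R[1] - (1 - p) * p / (1 - p**M)))  # ~0: matches p_R(r)
```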

  13. Joint cdf of X and Y (1) • Defined as the probability of the product-form event {X ≤ x} ∩ {Y ≤ y} • Joint cumulative distribution function of X and Y: F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y] • Properties (i) F_{X,Y}(x₁, y₁) ≤ F_{X,Y}(x₂, y₂) whenever x₁ ≤ x₂ and y₁ ≤ y₂; this is because {X ≤ x₁, Y ≤ y₁} is a subset of {X ≤ x₂, Y ≤ y₂}. (ii) F_{X,Y}(x, −∞) = F_{X,Y}(−∞, y) = 0 and F_{X,Y}(∞, ∞) = 1. The first two hold because {X ≤ x, Y ≤ −∞} and {X ≤ −∞, Y ≤ y} are impossible events. (iii) Marginal cumulative distribution functions: F_X(x) = F_{X,Y}(x, ∞) and F_Y(y) = F_{X,Y}(∞, y)

  14. Joint cdf of X and Y (2) (iv) The joint cdf is continuous from the 'north' and the 'east': F_{X,Y}(x, y) = lim_{h→0⁺} F_{X,Y}(x + h, y) = lim_{h→0⁺} F_{X,Y}(x, y + h) • This is a generalization of the right-continuity property of the one-dimensional cdf (v) The probability of the rectangle {x₁ < X ≤ x₂, y₁ < Y ≤ y₂} is given by P[x₁ < X ≤ x₂, y₁ < Y ≤ y₂] = F_{X,Y}(x₂, y₂) − F_{X,Y}(x₂, y₁) − F_{X,Y}(x₁, y₂) + F_{X,Y}(x₁, y₁) • Example 5.12 • F_{X,Y}(x, y) = (1 − e^{−αx})(1 − e^{−βy}) for x ≥ 0, y ≥ 0, and 0 otherwise • Then F_X(x) = 1 − e^{−αx} and F_Y(y) = 1 − e^{−βy}: X and Y are exponentially distributed with respective parameters α and β
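
Property (v) and Example 5.12 combine into a short Python sketch; the values of α, β and the rectangle corners below are arbitrary choices. Because this particular cdf factors, the rectangle probability also equals the product of the two one-dimensional interval probabilities, which gives an independent check.

```python
import math

alpha, beta = 1.0, 2.0

def F(x, y):
    # Joint cdf of Example 5.12 (0 outside the first quadrant)
    if x < 0 or y < 0:
        return 0.0
    return (1 - math.exp(-alpha * x)) * (1 - math.exp(-beta * y))

def rect_prob(x1, x2, y1, y2):
    # P[x1 < X <= x2, y1 < Y <= y2] via the rectangle formula (v)
    return F(x2, y2) - F(x2, y1) - F(x1, y2) + F(x1, y1)

print(rect_prob(0, 1, 0, 1))
print((1 - math.exp(-1.0)) * (1 - math.exp(-2.0)))  # same value
```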

  15. Joint cdf of X and Y (3) The cdf can be used to find the probability of events that can be expressed as unions and intersections of semi-infinite rectangles. In particular, P[{X ≤ x} ∪ {Y ≤ y}] = F_X(x) + F_Y(y) − F_{X,Y}(x, y) and P[X > x, Y > y] = 1 − F_X(x) − F_Y(y) + F_{X,Y}(x, y).

  16. Joint PDF of Two Continuous RVs • Background • The joint cdf allows us to compute the probability of events that correspond to 'rectangular' shapes in the plane. • To compute the probability of events corresponding to regions other than rectangles, note that any reasonable shape can be approximated by a union of rectangles B_{j,k} of infinitesimal width. • The probability of the event can then be approximated by the sum of the probabilities of the infinitesimal rectangles, and if the cdf is sufficiently smooth, the probability of each rectangle can be expressed in terms of a density function.

  17. Joint pdf of two jointly continuous RVs • Joint cdf: F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(x′, y′) dy′ dx′ • The pdf can be obtained from the cdf by differentiation: f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x∂y. When the random variables X and Y are jointly continuous, the probability of an event involving (X, Y) can be expressed as an integral of the joint probability density function. For every event B that is a subset of the plane, we have P[(X, Y) ∈ B] = ∫∫_B f_{X,Y}(x′, y′) dx′ dy′
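
The double-integral formula can be evaluated numerically. The sketch below uses scipy's dblquad with an assumed textbook-style density f(x, y) = x + y on the unit square (not from the lecture) and the triangular event B = {X + Y ≤ 1}.

```python
from scipy.integrate import dblquad

def f(x, y):
    # Valid pdf on [0,1] x [0,1]: nonnegative and integrates to 1
    return x + y

# dblquad integrates func(y, x), with y-limits given as functions of x.
# Here: x from 0 to 1, y from 0 to 1 - x, i.e., the region X + Y <= 1.
p_B, err = dblquad(lambda y, x: f(x, y), 0, 1, lambda x: 0, lambda x: 1 - x)
print(p_B)  # P[X + Y <= 1] = 1/3 for this pdf
```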

  18. Joint pdf of two jointly continuous RVs • The marginal pdf's f_X(x) and f_Y(y) are obtained by differentiating the marginal cdf's F_X(x) = F_{X,Y}(x, ∞) and F_Y(y) = F_{X,Y}(∞, y). • Since F_X(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_{X,Y}(x′, y′) dy′ dx′, differentiation gives f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y′) dy′, and similarly f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x′, y) dx′. • Note 5-2 (Example 5.15, Example 5.16)

  19. Jointly Gaussian random variables X and Y are jointly Gaussian random variables with zero mean and unit variance: f_{X,Y}(x, y) = exp{−(x² − 2ρxy + y²) / (2(1 − ρ²))} / (2π√(1 − ρ²)). To find the marginal, write f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy and complete the square in y in the exponent. The last integral is recognized as a Gaussian pdf with mean ρx and variance 1 − ρ², so the value of the integral is one. Hence f_X(x) = e^{−x²/2}/√(2π): the one-dimensional Gaussian pdf with zero mean and unit variance.
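
The completing-the-square argument can be confirmed numerically: integrating the bivariate pdf over y should reproduce the standard normal pdf at any x. A sketch (the values ρ = 0.6 and x = 0.8 are arbitrary):

```python
import math
from scipy.integrate import quad

rho = 0.6

def f_xy(x, y):
    # Bivariate Gaussian pdf: zero means, unit variances, correlation rho
    c = 2 * math.pi * math.sqrt(1 - rho**2)
    q = (x**2 - 2 * rho * x * y + y**2) / (2 * (1 - rho**2))
    return math.exp(-q) / c

x = 0.8
marginal, _ = quad(lambda y: f_xy(x, y), -20, 20)  # integrate out y
std_normal = math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)
print(marginal, std_normal)  # agree to numerical precision
```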

  20. Independence of two random variables • Two events are independent if the knowledge that one has occurred gives no clue to the likelihood that the other will occur. • Let X and Y be discrete random variables. Let A₁ be the event that X = x and A₂ be the event that Y = y. If X and Y are independent, then A₁ and A₂ are independent: p_{X,Y}(x, y) = P[A₁ ∩ A₂] = P[A₁]P[A₂] = p_X(x)p_Y(y). • For continuous random variables: f_{X,Y}(x, y) = f_X(x)f_Y(y). • Example • X₁ = number of students attending the lecture on a given day • X₂ = number of tests within that week • X₃ = number of students having a cold • X₄ = number of students having had a haircut • Which pairs of random variables are independent?

  21. Independence Definition • Let X and Y be random variables with joint density f_{X,Y} and marginal densities f_X and f_Y, respectively. • X and Y are independent if and only if f_{X,Y}(x, y) = f_X(x)f_Y(y) for all x and y. • Remark • By integrating the above equation, we have ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(x′, y′) dy′ dx′ = ∫_{−∞}^{x} f_X(x′) dx′ ∫_{−∞}^{y} f_Y(y′) dy′, so that F_{X,Y}(x, y) = F_X(x)F_Y(y). • X and Y are independent if and only if their joint cdf is equal to the product of the marginal cdf's. • If X and Y are independent random variables, then the random variables defined by g(X) and h(Y) are also independent.

  22. Example • Consider the jointly Gaussian random variables with the joint pdf given on slide 19. • The product of the marginals equals the joint pdf if and only if ρ = 0. Hence, X and Y are independent if and only if ρ = 0. • What is the interpretation of ρ? It is related to a concept called correlation (to be discussed later).

  23. Joint Moments and Expected Values of a Function of Two Random Variables • Remember: • The expected value of X indicates the center of mass of the distribution of X • The variance, the expected value of (X − m)², provides a measure of the spread of the distribution • We are interested in: • How do X and Y vary together? • Are the variations of X and Y correlated? • If X increases, does Y tend to increase or to decrease? • The joint moments of X and Y, which are defined as expected values of functions of X and Y, provide this information.

  24. Expected value of functions of RVs • Let Z = g(X, Y). The expected value of Z is given by E[Z] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f_{X,Y}(x, y) dx dy in the jointly continuous case, and E[Z] = Σ_i Σ_j g(x_i, y_j) p_{X,Y}(x_i, y_j) in the discrete case. • Sum of random variables (Example 5.24) • Z = X + Y gives E[X + Y] = E[X] + E[Y] • The random variables do not have to be independent for the above formula to hold.
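
The point that independence is not required is easy to demonstrate by Monte Carlo. In the sketch below, Y is deliberately made strongly dependent on X (Y = X² + noise, an arbitrary choice), yet E[X + Y] still equals E[X] + E[Y].

```python
import random

random.seed(0)
n = 200_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x * x + random.gauss(0, 1) for x in xs]  # Y depends strongly on X

mean = lambda v: sum(v) / len(v)
print(mean([x + y for x, y in zip(xs, ys)]))  # ~ E[X] + E[Y] = 0 + 1 = 1
print(mean(xs) + mean(ys))                    # same value
```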

  25. Product of functions of independent random variables: Example 5.25 • Suppose X and Y are independent random variables and g(X, Y) is separable, g(X, Y) = g₁(X)g₂(Y); then E[g₁(X)g₂(Y)] = E[g₁(X)]E[g₂(Y)]. • In general, if X₁, ..., Xₙ are independent random variables, then E[g₁(X₁)g₂(X₂)···gₙ(Xₙ)] = E[g₁(X₁)]E[g₂(X₂)]···E[gₙ(Xₙ)].

  26. Joint Moments, Correlation, and Covariance • The jk-th joint moment of X and Y is defined by E[X^j Y^k] = ∫∫ x^j y^k f_{X,Y}(x, y) dx dy (with the corresponding sum in the discrete case). • The correlation of X and Y is defined as E[XY], especially in electrical engineering. • When E[XY] = 0, X and Y are said to be orthogonal. • The jk-th central moment of X and Y is defined as E[(X − E[X])^j (Y − E[Y])^k]. When j = 2, k = 0 it gives VAR(X); when j = 0, k = 2 it gives VAR(Y); when j = k = 1 it gives COV(X, Y) = E[(X − E[X])(Y − E[Y])].

  27. If XY=0 then X and Y are said to be uncorrelated. If X and Y are independent, then COV(X,Y)=0 so uncorrelated. • Covariance 1. COV[X,Y]=E[XY] if either of the random variables has mean zero. 2. When X and Y are independent, then E [XY]=E [X]E[Y] so that • Correlation coefficient of X and Y Where σX and σy are standard deviations of X and Y, respectively.

  28. Correlation Definitions • E[XY] is large and positive when large values of X tend to occur together with large values of Y. • E[{X − E[X]}{Y − E[Y]}]: the covariance takes the mean values of X and Y into account, so that correlation is measured under equal conditions (conceptually, zero mean). • If a positive (negative) value of X − E[X] tends to be accompanied by a positive (negative) value of Y − E[Y], then COV(X, Y) will be positive. • If they tend to have opposite signs, then COV(X, Y) will be negative. • If they sometimes have the same sign and sometimes opposite signs, then COV(X, Y) will be close to zero. • Correlation coefficient ρ_{X,Y}: • Multiplying either X or Y by a large number will increase the covariance, so the covariance needs to be normalized. • Properties of ρ_{X,Y}: |COV(X, Y)| ≤ σ_X σ_Y, and so −1 ≤ ρ_{X,Y} ≤ 1.

  29. Example 5.28 • Suppose X and Y have the joint pdf given in the example. • The marginal pdf's are found by integrating out the other variable. • The correlation coefficient then follows from ρ_{X,Y} = COV(X, Y)/(σ_X σ_Y).

  30. Conditional Probability and Conditional Expectation • Many random variables of practical interest are not independent: • The output Y of a communication channel must depend on the input X • Consecutive samples of a waveform that varies slowly are likely to be close in value • We are interested in: • Computing the probability of events concerning the random variable Y given that we know X = x • The expected value of Y given X = x

  31. Conditional Probability • Recall the formula: P[A | B] = P[A ∩ B] / P[B], for P[B] > 0. Case 1: X is a Discrete Random Variable • For X and Y discrete random variables, the conditional pmf of Y given X = x is p_Y(y | x) = P[Y = y | X = x] = p_{X,Y}(x, y) / p_X(x), for p_X(x) > 0. • If X and Y are independent, p_Y(y | x) = p_Y(y). • The joint pmf can be calculated from the conditional and marginal pmf's: p_{X,Y}(x, y) = p_Y(y | x) p_X(x) = p_X(x | y) p_Y(y). • The probability of an event A given X = x is calculated as P[Y ∈ A | X = x] = Σ_{y ∈ A} p_Y(y | x). Example 5.29 Example 5.30
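
A small Python sketch of the division rule, reusing the invented joint pmf from the earlier slide: the conditional pmf is the joint pmf renormalized by the marginal.

```python
# Same made-up joint pmf as before, for illustration only
pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def p_X(x):
    # Marginal pmf of X: sum the joint pmf over y
    return sum(p for (xi, y), p in pmf.items() if xi == x)

def p_Y_given_X(y, x):
    # p_Y(y | x) = p_XY(x, y) / p_X(x)
    return pmf[(x, y)] / p_X(x)

print(p_Y_given_X(1, 0))                       # 0.20 / 0.30 ≈ 0.667
print(p_Y_given_X(0, 0) + p_Y_given_X(1, 0))   # conditional pmf sums to 1
```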

  32. Suppose X is a discrete RV and Y is a continuous RV • Conditional cdf of Y given X = x_k: F_Y(y | x_k) = P[Y ≤ y, X = x_k] / P[X = x_k] • Conditional pdf of Y given X = x_k: f_Y(y | x_k) = (d/dy) F_Y(y | x_k) • If X and Y are independent, F_Y(y | x_k) = F_Y(y) and f_Y(y | x_k) = f_Y(y)

  33. Case 2: X is a Continuous Random Variable • If X is a continuous random variable, then P[X = x] = 0, so the conditional cdf cannot be defined by direct division. Suppose X and Y are jointly continuous random variables with a joint pdf that is continuous and non-zero over some region; then the conditional cdf of Y given X = x is defined by F_Y(y | x) = lim_{h→0} P[Y ≤ y | x < X ≤ x + h] • Taking h → 0, F_Y(y | x) = ∫_{−∞}^{y} f_{X,Y}(x, y′) dy′ / f_X(x) • The conditional pdf of Y given X = x is given by f_Y(y | x) = f_{X,Y}(x, y) / f_X(x) • If X and Y are independent, then f_{X,Y}(x, y) = f_X(x)f_Y(y), so f_Y(y | x) = f_Y(y) and F_Y(y | x) = F_Y(y).

  34. Example 5.32 • Suppose the joint pdf of X and Y is as given in the example; find the conditional pdf's f_X(x | y) and f_Y(y | x). • Solution: apply f_Y(y | x) = f_{X,Y}(x, y)/f_X(x), with the marginals obtained by integrating the joint pdf.

  35. Example • X = input voltage; Y = output = input + noise voltage, i.e., Y = X + N. • Here X = +1 or −1 with equal probability, and the noise N is uniformly distributed on [−2, 2]. • When X = 1, Y is uniformly distributed on [−1, 3], so f_Y(y | 1) = 1/4 for −1 ≤ y ≤ 3. • The conditional cdf of Y is F_Y(y | 1) = (y + 1)/4 for −1 ≤ y ≤ 3 (0 below, 1 above). • For example, F_Y(0 | 1) = (0 + 1)/4 = 1/4.
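
A Monte Carlo sketch of this channel confirms the conditional cdf value computed above; the event probed, Y ≤ 0 given X = 1, is one arbitrary example.

```python
import random

# X = ±1 equally likely, N uniform on [-2, 2], Y = X + N.
# Estimate F_Y(0 | X = 1), which the analysis gives as (0 + 1)/4 = 0.25.
random.seed(2)
n = 500_000
count_x1 = 0
count_y_le_0 = 0
for _ in range(n):
    x = random.choice([-1, 1])
    y = x + random.uniform(-2, 2)
    if x == 1:
        count_x1 += 1
        if y <= 0:
            count_y_le_0 += 1

print(count_y_le_0 / count_x1)  # ~0.25
```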

  36. Conditional Expectation (1) The conditional expectation of Y given X = x is given by E[Y | x] = ∫_{−∞}^{∞} y f_Y(y | x) dy. When X and Y are both discrete random variables, E[Y | x] = Σ_k y_k p_Y(y_k | x). On the other hand, E[Y | x] can be viewed as a function of x: g(x) = E[Y | x]. Correspondingly, this gives rise to the random variable g(X) = E[Y | X].

  37. Conditional Expectation (2) What is E[E[Y | X]]? Note that E[E[Y | X]] = E[Y]. Suppose X and Y are jointly continuous random variables; then E[E[Y | X]] = ∫_{−∞}^{∞} E[Y | x] f_X(x) dx = ∫∫ y f_Y(y | x) f_X(x) dy dx = ∫∫ y f_{X,Y}(x, y) dy dx = E[Y]. Generalization [in the above proof, change y to h(y)]: E[h(Y)] = E[E[h(Y) | X]]; and in particular, E[Y^k] = E[E[Y^k | X]].

  38. Example 5.37 Binary Communication System • The input X to a communication channel assumes the values +1 or −1 with probabilities 1/3 and 2/3, respectively. The output Y of the channel is given by Y = X + N, where N is a zero-mean, unit-variance Gaussian RV. • Find the mean of the output Y. • Solution • Since Y is a Gaussian RV with mean +1 when X = +1 and mean −1 when X = −1, the conditional expected values of Y given X are E[Y | +1] = 1 and E[Y | −1] = −1. • Since E[Y] = E[Y | +1]P[X = +1] + E[Y | −1]P[X = −1], we get E[Y] = (1)(1/3) + (−1)(2/3) = −1/3.
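
The total-expectation computation is easy to verify by simulation:

```python
import random

# Example 5.37: X = +1 w.p. 1/3, -1 w.p. 2/3; Y = X + N, N ~ Gaussian(0, 1).
# Total expectation gives E[Y] = (1)(1/3) + (-1)(2/3) = -1/3.
random.seed(3)
n = 500_000
total = 0.0
for _ in range(n):
    x = 1 if random.random() < 1 / 3 else -1
    total += x + random.gauss(0, 1)

print(total / n)  # ~ -0.333
```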

  39. Multiple Random Variables • Joint cdf of X₁, X₂, ..., Xₙ: F_{X₁,...,Xₙ}(x₁, ..., xₙ) = P[X₁ ≤ x₁, ..., Xₙ ≤ xₙ] • Joint pmf of discrete X₁, ..., Xₙ: p_{X₁,...,Xₙ}(x₁, ..., xₙ) = P[X₁ = x₁, ..., Xₙ = xₙ] • Marginal pmf's are obtained by summing the joint pmf over the other variables, e.g., p_{X₁}(x₁) = Σ_{x₂} ··· Σ_{xₙ} p_{X₁,...,Xₙ}(x₁, x₂, ..., xₙ) • Joint pdf of jointly continuous X₁, ..., Xₙ: f_{X₁,...,Xₙ}(x₁, ..., xₙ) = ∂ⁿF_{X₁,...,Xₙ}(x₁, ..., xₙ) / ∂x₁ ··· ∂xₙ • Marginal pdf's are obtained by integrating the joint pdf over the other variables, e.g., f_{X₁}(x₁) = ∫ ··· ∫ f_{X₁,...,Xₙ}(x₁, x₂′, ..., xₙ′) dx₂′ ··· dxₙ′
