
CHAPTER 3


Presentation Transcript


  1. CHAPTER 3

  2. CHAPTER 3 Overview ● Random Variables and Distributions ● Continuous Distributions ● The Distribution Function ● Bivariate Distributions ● Marginal Distributions ● Conditional Distributions ● Functions of a Random Variable

  3. So We are Skipping: Sec 3.7 Multivariate Distributions; Sec 3.9 Functions of Two or More Random Variables

  4. Section 3.1 Random Variables and Discrete Distributions

  5. Definition • A Random Variable X is a real-valued function defined on the sample space S of an experiment. X: S → R • That is, every outcome in S is associated with a real number.

  6. Examples Toss a coin twice; the sample space is S = {HH, HT, TH, TT}. We can define a random variable on this sample space in many ways, for example: 1) let X be the number of heads obtained; 2) let Y be the number of heads minus the number of tails.

  7. Examples 3) Toss a coin repeatedly until we see a head. The number of tosses needed until we observe a head for the first time is a random variable.

  8. Examples 4) Consider the “experiment” of throwing a dart at a rectangular wall. Let X be the x-coordinate of the location hit by the dart. 5) Suppose that a school bus arrives between 7:30 and 7:45 am. We can regard the exact arrival time of the bus as a random variable.

  9. Definitions • A random variable is called discrete if it can assume only a finite or countable number of values. • A random variable is called continuous if its values fill an entire interval (possibly of infinite length).

  10. Distribution of a Discrete Random Variable • The distribution of a discrete random variable is a formula or table that lists all the possible values that the random variable can take, together with the corresponding probabilities.

  11. Requirements A probability distribution must satisfy: 1) P(X=x) ≥ 0 for every possible value x; 2) ∑x P(X=x) = 1.

  12. Example 1: The Uniform Distribution On the Integers • Consider a set of integers, say {1,2,3,…,N}. • The uniform distribution on this set assigns the same weight/probability to each outcome, namely 1/N. • A random variable that has this distribution is referred to as a discrete uniform r.v.

  13. Example 2 Roll two balanced dice. Let X be the sum of dots on the faces up. • The possible values for X are 2,3,4,···,12. • The probabilities P(X=i) for i= 2,3,4,···,12 are: 1/36,2/36,3/36,4/36,5/36,6/36,5/36,···,1/36
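The probabilities listed on this slide can be checked by brute-force enumeration of the 36 equally likely outcomes; a minimal sketch in Python using exact fractions (not part of the original slides):

```python
from fractions import Fraction
from itertools import product

# Exact distribution of X = sum of the dots on two balanced dice.
counts = {}
for d1, d2 in product(range(1, 7), repeat=2):
    s = d1 + d2
    counts[s] = counts.get(s, 0) + 1

pmf = {s: Fraction(c, 36) for s, c in counts.items()}
assert pmf[2] == Fraction(1, 36) and pmf[7] == Fraction(6, 36)
```

The same enumeration idea works for any small finite sample space.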

  14. Example 3: The Binomial Distribution Definitions: • An experiment that results in one of two possible outcomes is called a Bernoulli experiment. We will refer to this experiment as a trial. • We will refer to the two possible outcomes of a single Bernoulli trial as S (for Success) or F (for Failure). • The probabilities of S and F are p and q = 1 − p, respectively. • A Bernoulli process is a process of repeating a Bernoulli trial many times independently.

  15. Definition • Consider a Bernoulli process with a fixed number of trials n. • The number of successes X in these n trials is a random variable, which we call binomial. • We write X ~ b(n,p). • Formula: P(X=x) = C(n,x) p^x q^(n−x), where x = 0,1,2,…,n.
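As a sketch, the binomial formula translates directly into code; `math.comb` computes C(n,x), and the function name `binomial_pmf` is ours, not from the slides:

```python
from fractions import Fraction
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for X ~ b(n, p): C(n, x) p^x q^(n-x)."""
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

p = Fraction(3, 10)                      # exact arithmetic with p = 0.3
probs = [binomial_pmf(x, 5, p) for x in range(6)]
assert sum(probs) == 1                   # a pmf must sum to one
```

Using `Fraction` for p keeps every probability exact rather than a floating-point approximation.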

  16. Geometric Distribution • Consider a Bernoulli process that goes on and on until a success is encountered for the first time. • The number of trials X needed, including the final success, is called a geometric random variable. • Formula: P(X=x) = p q^(x−1), x = 1,2,3,…

  17. Negative Binomial Distribution • Consider a Bernoulli process again. • We will “stop” the process as soon as we encounter a success for the kth time. • The number of trials needed is a random variable, which we call a negative binomial. • Formula: P(X=x) = C(x−1, k−1) p^k q^(x−k); x = k, k+1, …
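The two formulas above are related: with k = 1 the negative binomial reduces to the geometric. A short sketch checking this (function names are ours):

```python
from fractions import Fraction
from math import comb

def geometric_pmf(x, p):
    """P(X = x) = p q^(x-1): first success on trial x (x = 1, 2, ...)."""
    return p * (1 - p) ** (x - 1)

def negbinomial_pmf(x, k, p):
    """P(X = x) = C(x-1, k-1) p^k q^(x-k): k-th success on trial x (x >= k)."""
    return comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k)

p = Fraction(1, 2)
# With k = 1 the negative binomial reduces to the geometric distribution.
assert all(negbinomial_pmf(x, 1, p) == geometric_pmf(x, p) for x in range(1, 20))
```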

  18. Hypergeometric Distribution • A box contains N items, of which D are defective. We select n items randomly, where n ≤ min(D, N − D). • The number of defective items X among the n items selected is called a hypergeometric random variable. • Possible values of X: 0,1,…,n • Probability function: P(x) = C(D,x) C(N−D, n−x) / C(N,n)
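A sketch of the hypergeometric probability function, with illustrative numbers of our own choosing (N = 20, D = 5, n = 4):

```python
from fractions import Fraction
from math import comb

def hypergeom_pmf(x, N, D, n):
    """P(X = x) = C(D, x) C(N-D, n-x) / C(N, n)."""
    return Fraction(comb(D, x) * comb(N - D, n - x), comb(N, n))

# N = 20 items, D = 5 defective, draw n = 4 without replacement.
probs = [hypergeom_pmf(x, 20, 5, 4) for x in range(5)]
assert sum(probs) == 1
```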

  19. Example • Consider a well-shuffled regular deck of cards. Draw the cards successively and without replacement. Let X be the number of cards drawn by the time when the first ace appears. • Is X geometric? • Repeat the problem so that X is the number of cards when the fourth ace appears. • Is X negative binomial?

  20. Section 3.2 Continuous Distributions

  21. The Probability Density Function Recall: for a continuous random variable X, the probability P(X=x) = 0 for any specific value x. So in the continuous case we calculate the probability that X “falls” in some set or interval. The right object that characterizes probabilities in the continuous case is the density f(x) of the random variable (probability density function, or p.d.f. for short). Requirements: f(x) ≥ 0 and ∫_{−∞}^{+∞} f(x) dx = 1.

  22. The Uniform Distribution on an Interval [a,b] • This distribution assigns the same probability to intervals with the same width. • It is a generalization of the discrete uniform probability function which assigns the same probability to all possible values. • The p.d.f. is given by 1/(b-a) for x in [a,b] and zero elsewhere. • Interpretation: the “mass” is uniformly / evenly distributed in the interval [a,b].

  23. Example A random variable X has the p.d.f. f(x) = cx² for 1 ≤ x ≤ 2 and zero otherwise. • Find the value of the constant c and sketch the p.d.f. • Find the value of P(X > 2). The constant c is called the normalizing constant, and if it exists it is unique.
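For this example the normalizing constant follows from requiring c ∫₁² x² dx = c(2³ − 1³)/3 = 7c/3 = 1, so c = 3/7. A one-line check in exact arithmetic:

```python
from fractions import Fraction

# The integral of x^2 over [1, 2] is (2^3 - 1^3)/3; c must make c * integral = 1.
integral = Fraction(2**3 - 1**3, 3)      # = 7/3
c = 1 / integral
assert c == Fraction(3, 7)
```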

  24. Section 3.3 The Distribution Function

  25. Definition • The distribution function of a random variable is defined for every real number x as: F(x) = P(X ≤ x). F(x) is called the cumulative distribution function. It is defined (as we will see) for all types of random variables, e.g. discrete and continuous. Properties: • F(−∞) = 0, F(+∞) = 1. • F(x) is nondecreasing (as x increases). • F(x) is right-continuous, i.e. F(x+) = F(x).

  26. Useful Facts about the Distribution Function • P(X>x) = 1 − F(x) • P(X<x) = F(x−) (for most x's, this is the same as F(x), except when x is a point of discontinuity) • P(X = x) = F(x) − F(x−), i.e. when F is discontinuous at x, the probability of x equals the height of the jump; such an x is called an atom. • A d.f. that is strictly increasing in some interval and that also has jumps is said to be mixed.

  27. The Need for a Mixed d.f. Suppose the voltage X of a certain electric system is a random variable with p.d.f. f(x) = 1/(1+x)² for x > 0 and zero elsewhere. Suppose a voltmeter records the actual voltage when X < 3 but simply records the value 3 when X ≥ 3. The distribution of the recorded value Y is continuous everywhere except at the value 3, which accrues the probability P(X ≥ 3).

  28. Relation with the p.d.f. • Consider a distribution function F(x) that has a derivative. • Hence F is necessarily the distribution of a continuous random variable (why?). • The p.d.f. is defined to be the derivative of F(x): f(x) = dF(x)/dx

  29. The Quantile Function • Consider a d.f. F(x) which is continuous and strictly increasing (so it is invertible). • The inverse function x = F⁻¹(p), 0 ≤ p ≤ 1, is called the quantile function; it provides the pth quantile of the distribution. • EXAMPLE: the median of the distribution is the 0.5 quantile: m = F⁻¹(1/2). • If F(x) is the d.f. of a discrete (or mixed) r.v., then F⁻¹(p) is defined to be the least value x such that F(x) ≥ p.

  30. Quantiles of a Binomial Distribution • For X ~ b(5, 0.3): the median is 1, the 0.25 quantile is 1, and the 90th percentile is 3.
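The "least x such that F(x) ≥ p" rule from the previous slide can be checked directly; a sketch (our own helper names) reproducing the three quantiles quoted above:

```python
from fractions import Fraction
from math import comb

def binomial_cdf(x, n, p):
    """F(x) = P(X <= x) for X ~ b(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(x + 1))

def quantile(p_level, n, p):
    """Least x with F(x) >= p_level (the definition used for discrete d.f.'s)."""
    x = 0
    while binomial_cdf(x, n, p) < p_level:
        x += 1
    return x

p = Fraction(3, 10)                      # X ~ b(5, 0.3), exactly
print(quantile(Fraction(1, 2), 5, p))    # median -> 1
print(quantile(Fraction(1, 4), 5, p))    # 0.25 quantile -> 1
print(quantile(Fraction(9, 10), 5, p))   # 90th percentile -> 3
```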

  31. Section 3.4 Bivariate Distributions

  32. Section Abstract • We generalize the concept of distribution of a single random variable to the case of two r.v.’s where we speak of their joint distribution. We do so by introducing: • The joint p.f. of two discrete r.v.’s • The joint p.d.f. of two continuous r.v.’s • The joint d.f. for any two r.v.’s

  33. Discrete Joint Distributions • Definition by example: A subcommittee of five members is to be formed randomly from among a committee of 10 democrats, 8 republicans and 2 independent members. • Let X and Y be the numbers of democrats and republicans chosen to serve on the subcommittee, respectively. Their joint p.f. is: P(x,y) := P(X = x, Y = y) = C(10,x) C(8,y) C(2, 5−x−y) / C(20,5), for x = 0,1,…,5; y = 0,1,…,5; 3 ≤ x + y ≤ 5
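A sketch verifying that this joint p.f. satisfies the requirement of summing to one over all admissible (x, y) pairs (the function name `p` mirrors the slide's notation):

```python
from fractions import Fraction
from math import comb

def p(x, y):
    """Joint p.f. of (X, Y) = (# democrats, # republicans) on the subcommittee."""
    if not (0 <= x <= 5 and 0 <= y <= 5 and 3 <= x + y <= 5):
        return Fraction(0)
    return Fraction(comb(10, x) * comb(8, y) * comb(2, 5 - x - y), comb(20, 5))

total = sum(p(x, y) for x in range(6) for y in range(6))
assert total == 1                        # the joint p.f. sums to one
```

The constraint x + y ≥ 3 appears because there are only 2 independents available to fill the remaining 5 − x − y seats.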

  34. Requirements • P(x,y) ≥ 0 • Σx Σy p(x,y)=1; where the double sum is taken over all possible values of (x,y) in the xy-plane. The joint probability describes the joint behavior of two random variables.

  35. Another Example of Joint p.f. • Throw a die once and let X be the outcome. Throw a fair coin X times and let Y be the number of heads. Find P(x,y). Solution: • The possible X values range from 1 to 6. • The possible Y values range from 0 to 6. • When X = 2 (e.g.), Y = 0, 1 or 2. • P(2,0) = P(X=2, Y=0) = P(X=2) P(Y=0|X=2) = (1/6)(1/2)² = 1/24.
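The multiplication rule used here, p(x,y) = P(X=x) P(Y=y|X=x), yields the whole table at once; a sketch in exact arithmetic:

```python
from fractions import Fraction
from math import comb

def p(x, y):
    """p(x, y) = P(X = x) * P(Y = y | X = x) = (1/6) * C(x, y) / 2^x."""
    if not (1 <= x <= 6 and 0 <= y <= x):
        return Fraction(0)
    return Fraction(1, 6) * Fraction(comb(x, y), 2**x)

assert p(2, 0) == Fraction(1, 24)        # matches the slide's computation
p_y3 = sum(p(x, 3) for x in range(1, 7)) # marginal P(Y = 3)
```

This also answers the question on the next slide: summing p(x, 3) over x gives P(Y = 3) exactly.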

  36. Cont’d: Table of p(x,y)

  37. Die and Coin Continued • Questions: • Find P(X>4, Y>3) • Find P(Y=3) Solution: • P(X>4, Y>3) = P(5,4) + P(5,5) + P(6,4) + P(6,5) + P(6,6) • P(Y=3) = P(1,3) + P(2,3) + P(3,3) + P(4,3) + P(5,3) + P(6,3) = 0 + 0 + P(3,3) + P(4,3) + P(5,3) + P(6,3)

  38. Continuous Joint Distributions • A real valued function f(x,y) is a joint probability density function (p.d.f.) if: • f(x,y) ≥ 0 • ∫ ∫ f(x,y)dxdy=1; where integration is performed over the entire xy plane. A p.d.f. describes the joint behavior of two continuous random variables: For any region A in the xy plane, the probability that (X,Y) falls within A is the volume of the solid under the graph of f(x,y) and above the region A. Hence the problem of finding the probability is a question about finding a double integral.

  39. Examples Find the constant c that renders the function f(x,y) into a joint p.d.f.: • f(x,y) = cxy², 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and zero elsewhere • f(x,y) = cxy², 0 ≤ x ≤ y ≤ 1, and zero elsewhere. Find P(X > 3/4) in each case.
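Working the two integrals by hand gives c = 6 on the square (since ∫₀¹ x dx ∫₀¹ y² dy = 1/6) and c = 10 on the triangle (since ∫₀¹ ∫₀^y x y² dx dy = 1/10). A crude midpoint Riemann sum, a sketch of our own rather than part of the slides, confirms both densities integrate to approximately one:

```python
def riemann_2d(f, n=400):
    """Midpoint Riemann sum of f over the unit square [0,1] x [0,1]."""
    h = 1.0 / n
    return sum(
        f((i + 0.5) * h, (j + 0.5) * h)
        for i in range(n)
        for j in range(n)
    ) * h * h

square = riemann_2d(lambda x, y: 6 * x * y**2)                        # c = 6
triangle = riemann_2d(lambda x, y: 10 * x * y**2 if x <= y else 0.0)  # c = 10
assert abs(square - 1) < 1e-2 and abs(triangle - 1) < 5e-2
```

The triangle case converges more slowly because the density is discontinuous along the line x = y.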

  40. Will the Chicken Be Safe? • A farmer wants to build a triangular pen for his chickens. He sends his son out to cut the lumber and the boy, without taking any thought as to the ultimate purpose, makes two cuts at two points selected at random. What are the chances that the resulting three pieces can be used to form a triangular pen?
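Under the usual reading of this puzzle (the two cut points are independent and uniform along the board), this is the classical broken-stick problem, whose answer is 1/4: the pieces form a triangle iff every piece is shorter than half the board. A quick Monte Carlo sketch:

```python
import random

def triangle_possible(trials=100_000, seed=0):
    """Estimate P(the three pieces form a triangle) by simulation.

    Two cut points are uniform on a unit-length board; the pieces form a
    triangle iff every piece is shorter than half the board.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        u, v = sorted((rng.random(), rng.random()))
        a, b, c = u, v - u, 1 - v          # the three piece lengths
        if max(a, b, c) < 0.5:
            hits += 1
    return hits / trials

print(triangle_possible())   # close to 0.25, the exact answer
```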

  41. Bivariate Distribution Function • F(x,y) := P(X≤x, Y≤y) • F(−∞,−∞) = 0; F(+∞,+∞) = 1 • F is nondecreasing in both x and y. • Calculating the probability that (X,Y) belongs to a given rectangle using the d.f.: P(a<X≤b and c<Y≤d) = P(a<X≤b and Y≤d) − P(a<X≤b and Y≤c) = P(X≤b and Y≤d) − P(X≤a and Y≤d) − [P(X≤b and Y≤c) − P(X≤a and Y≤c)] = F(b,d) − F(a,d) − F(b,c) + F(a,c)

  42. Calculating the joint p.d.f. from the joint d.f. If both X and Y are continuous, and F(x,y) has partial derivatives in both x and y, then the joint p.d.f. is f(x,y) = ∂²F(x,y)/∂x∂y. Conversely, F(x,y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(s,t) dt ds.

  43. Section 3.5 Marginal Distributions

  44. Definition • Let X, Y have the joint distribution F(x,y). i) The marginal distribution of X is given by: FX(x) := F(x, +∞) ii) The marginal distribution of Y is given by: FY(y) := F(+∞, y). That is, the marginal d.f. of X (resp. Y) is the same as the univariate d.f. of X (resp. Y) ‘recovered’ from the joint d.f. of X and Y.

  45. Discrete Marginals • Let p(x,y) be the joint p.f. of X and Y. • The marginal p.f.'s of X and Y are denoted by pX(x) and pY(y) respectively, where: pX(x) = ∑y p(x,y); pY(y) = ∑x p(x,y)

  46. Example: Table of p(x,y)

  47. Two Continuous Random Variables • X and Y: continuous r.v.'s with joint p.d.f. f(x,y). • The marginal p.d.f. of X is: fX(x) := ∫_{−∞}^{+∞} f(x,y) dy • The marginal p.d.f. of Y is: fY(y) := ∫_{−∞}^{+∞} f(x,y) dx

  48. Independent Random Variables • Def.: Two Random Variables are independent iff their joint d.f. is the product of their marginal d.f.’s: F(x,y) = FX(x)FY(y) • Equivalently, X and Y are indep. iff f(x,y) = fX(x)fY(y) • That is, X and Y are indep if any two events A and B determined by X and Y respectively are independent events e.g. A = {X>2} & B={0 < Y ≤ 12}.

  49. Checking two discrete r.v.'s for Independence • From the table (or formula), verify that: P(X=x, Y=y) = P(X=x) P(Y=y) • If this is the case for all possibilities, then X and Y are independent. • If this equation is violated at least once, then X and Y are dependent. • REMARK: in the case of independence, the rows are all proportional (and so are the columns!)
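The cell-by-cell check described above is mechanical; a sketch (our own helper names, with two small made-up tables) that computes the marginals and tests every cell:

```python
from fractions import Fraction

def marginals(p):
    """Row and column sums of a joint p.f. given as {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), prob in p.items():
        px[x] = px.get(x, 0) + prob
        py[y] = py.get(y, 0) + prob
    return px, py

def independent(p):
    """True iff p(x, y) = pX(x) pY(y) for every cell of the table."""
    px, py = marginals(p)
    return all(prob == px[x] * py[y] for (x, y), prob in p.items())

# A product table (rows proportional -> independent) ...
indep = {(x, y): Fraction(1, 3) * Fraction(1, 2) for x in range(3) for y in range(2)}
# ... and a table concentrated on the diagonal (dependent).
dep = {(0, 0): Fraction(1, 2), (0, 1): Fraction(0),
       (1, 0): Fraction(0), (1, 1): Fraction(1, 2)}
assert independent(indep) and not independent(dep)
```

Note that one violated cell is enough to conclude dependence, as the slide says.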

  50. Discrete Example 1: dependent
