
Chapter 3. Discrete Probability Distributions



  1. Chapter 3. Discrete Probability Distributions
     3.1 The Binomial Distribution
     3.2 The Geometric and Negative Binomial Distributions
     3.3 The Hypergeometric Distribution
     3.4 The Poisson Distribution
     3.5 The Multinomial Distribution

  2. 3.1 The Binomial Distribution
     3.1.1 Bernoulli Random Variables (1/2)
  • Bernoulli random variables model any process that has only two possible outcomes: the outcome of a coin toss, whether a valve is open or shut, whether an item is defective or not.
  • The outcomes are labeled 0 and 1.
  • The random variable is defined by the parameter p, 0 ≤ p ≤ 1, which is the probability that the outcome is 1.

  3. 3.1.1 Bernoulli Random Variables (2/2)
  • Expectations: if X is a Bernoulli random variable with parameter p, then
      E(X) = 0·(1−p) + 1·p = p
      Var(X) = E(X²) − E(X)² = p − p² = p(1−p)

  4. 3.1.2 Definition of the Binomial Distribution (1/5)
  • Consider an experiment consisting of n Bernoulli trials X1, …, Xn that are independent and that each have a constant probability p of success.
  • Then the total number of successes X = X1 + … + Xn is a random variable that has a binomial distribution with parameters n and p, which is written X ~ B(n, p).

  5. 3.1.2 Definition of the Binomial Distribution (2/5)
  • The probability mass function of a B(n, p) random variable is
      P(X = x) = C(n, x) p^x (1−p)^(n−x) for x = 0, 1, …, n,
    with expectation and variance
      E(X) = np and Var(X) = np(1−p)
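The pmf above can be checked numerically. The short Python sketch below (not part of the original slides) computes the B(8, 0.5) probabilities that the next two slides tabulate:

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for X ~ B(n, p): x successes in n independent trials."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# B(8, 0.5): the symmetric example used on the following slides
pmf = [binomial_pmf(x, 8, 0.5) for x in range(9)]
print([round(v, 3) for v in pmf])  # peaks at x = 4 with probability 0.273
```

The probabilities sum to 1 and are symmetric about n/2 = 4, as the symmetry slide below states.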

  6. 3.1.2 Definition of the Binomial Distribution(3/5) • Ex) X~B(8,0.5)

  7. 0.273 0.219 0.219 Probability 0.109 0.109 0.031 0.031 0.004 0.004 0 1 2 3 4 5 6 7 8 x 0 1 2 3 4 5 6 7 8 0.004 0.035 0.144 0.363 0.636 0.855 0.965 0.996 0.1000 3.1.2 Definition of the Binomial Distribution(4/5) • Ex) X~B(8,0.5)

  8. 3.1.2 Definition of the Binomial Distribution (5/5)
  • Symmetric binomial distributions: a B(n, 0.5) distribution is a symmetric probability distribution for any value of the parameter n. The distribution is symmetric about the expected value n/2.

  9. Example 24 : Air Force Scrambles (1/3)
  • 16 planes.
  • There is a probability of 0.25 that the engines of a particular plane do not start at a given attempt.
  • Then the number of planes successfully launched has a binomial distribution with parameters n = 16 and p = 0.75.
  • The expected number of planes launched is E(X) = np = 16 × 0.75 = 12.

  10. Example 24 : Air Force Scrambles (2/3)
  • The variance is Var(X) = np(1−p) = 16 × 0.75 × 0.25 = 3.
  • The probability that exactly 12 planes scramble successfully is
      P(X = 12) = C(16, 12) (0.75)^12 (0.25)^4 ≈ 0.225
  • The probability that at least 14 planes scramble successfully is
      P(X ≥ 14) = P(X = 14) + P(X = 15) + P(X = 16) ≈ 0.197
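These two probabilities are quick to verify with the binomial pmf. A minimal Python check (added here for illustration, not part of the slides):

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for X ~ B(n, p)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 16, 0.75  # Example 24's parameters
print(round(binomial_pmf(12, n, p), 3))                             # 0.225
print(round(sum(binomial_pmf(x, n, p) for x in range(14, 17)), 3))  # 0.197
```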

  11. 0.225 0.208 0.180 0.134 0.110 Probability 0.054 0.052 0.020 0.000 0.000 0.001 0.006 0.010 0.000 0.000 0.000 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 x 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 0.000 0.000 0.000 0.007 0.079 0.369 0.802 0.990 0.000 0.000 0.001 0.027 0.189 0.594 0.936 0.100 Example 24 : Air Force Scrambles(3/3)

  12. Proportion of Successes in Bernoulli Trials
  • Let X ~ B(n, p) and let Y = X/n be the proportion of successes in the n trials.
  • Then E(Y) = p and Var(Y) = p(1−p)/n.
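The mean and variance of the proportion follow from E(X) = np and Var(X) = np(1−p) by dividing by n and n² respectively. A small numerical check (the parameters n = 50, p = 0.3 are arbitrary illustration values):

```python
from math import comb

n, p = 50, 0.3  # arbitrary example parameters
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

# Mean and variance of Y = X/n computed directly from the B(n, p) pmf
mean_y = sum((x / n) * pmf[x] for x in range(n + 1))
var_y = sum((x / n - mean_y) ** 2 * pmf[x] for x in range(n + 1))
print(round(mean_y, 6), round(var_y, 6))  # 0.3 and 0.3*0.7/50 = 0.0042
```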

  13. 3.2 The Geometric and Negative Binomial Distributions
      3.2.1 Definition of the Geometric Distribution (1/2)
  • The number of trials up to and including the first success in a sequence of independent Bernoulli trials with a constant success probability p has a geometric distribution with parameter p.
  • The probability mass function is
      P(X = x) = (1−p)^(x−1) p for x = 1, 2, 3, …

  14. 3.2.1 Definition of the Geometric Distribution (2/2)
  • The cumulative distribution function is
      P(X ≤ x) = 1 − (1−p)^x
  • The expectations are E(X) = 1/p and Var(X) = (1−p)/p².
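A short Python sketch of these two functions, using Example 24's p = 0.75 (the code itself is an added illustration, not from the slides):

```python
def geometric_pmf(x, p):
    """P(X = x): first success on trial x, after x - 1 failures."""
    return (1 - p) ** (x - 1) * p

def geometric_cdf(x, p):
    """P(X <= x) = 1 - (1-p)^x: at least one success in the first x trials."""
    return 1 - (1 - p) ** x

p = 0.75
print(round(geometric_pmf(3, p), 4))  # (0.25)^2 * 0.75 = 0.0469
print(round(geometric_cdf(3, p), 4))  # 1 - (0.25)^3 = 0.9844
```

These are exactly the two quantities used in the Air Force Scrambles continuation below.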

  15. • Expectation of X:
      E(X) = Σ_{x≥1} x (1−p)^(x−1) p = p · (1/p²) = 1/p
  • Variance of X:
      E(X(X−1)) = Σ_{x≥1} x(x−1) (1−p)^(x−1) p = 2(1−p)/p²,
      so Var(X) = E(X(X−1)) + E(X) − E(X)² = 2(1−p)/p² + 1/p − 1/p² = (1−p)/p²

  16. Example 24 : Air Force Scrambles (1/2)
  • If the mechanics are unsuccessful in starting the engines, then they must wait 5 minutes before trying again.
  • The number of attempts needed to start a plane's engine has a geometric distribution with p = 0.75.
  • The probability that the engines start on the third attempt is
      P(X = 3) = (0.25)² × 0.75 ≈ 0.047

  17. Example 24 : Air Force Scrambles (2/2)
  • Launching within 10 minutes of the first attempt means success on one of the first three attempts (made at 0, 5, and 10 minutes), so the probability is
      P(X ≤ 3) = 1 − (0.25)³ ≈ 0.984
  • The expected number of attempts required to start the engines is
      E(X) = 1/p = 1/0.75 ≈ 1.33

  18. 3.2.2 Definition of the Negative Binomial Distribution (1/2)
  • The number of trials up to and including the r-th success in a sequence of independent Bernoulli trials with a constant success probability p has a negative binomial distribution with parameters p and r.
  • The probability mass function is
      P(X = x) = C(x−1, r−1) (1−p)^(x−r) p^r for x = r, r+1, r+2, …

  19. 3.2.2 Definition of the Negative Binomial Distribution (2/2)
  • The expectations are E(X) = r/p and Var(X) = r(1−p)/p².
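The negative binomial pmf is easy to code directly from the definition. The sketch below (an added illustration) reproduces the numbers used in the Personnel Recruitment example that follows:

```python
from math import comb

def negbin_pmf(x, r, p):
    """P(X = x): the r-th success occurs on trial x (x = r, r+1, ...)."""
    return comb(x - 1, r - 1) * (1 - p) ** (x - r) * p ** r

r, p = 3, 0.6  # Example 12: hire three workers, acceptance probability 0.6
print(round(negbin_pmf(6, r, p), 4))                            # 0.1382
print(round(sum(negbin_pmf(x, r, p) for x in range(3, 7)), 4))  # 0.8208
```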

  20. Example 12 : Personnel Recruitment (1/2)
  • Suppose that a company wishes to hire three new workers and each applicant interviewed has a probability of 0.6 of being found acceptable.
  • The total number of applicants that the company needs to interview has a negative binomial distribution with parameters p = 0.6 and r = 3.
  • The probability that exactly six applicants need to be interviewed is
      P(X = 6) = C(5, 2) (0.4)³ (0.6)³ ≈ 0.138

  21. Example 12 : Personnel Recruitment (2/2)
  • If the company has a budget that allows up to six applicants to be interviewed, then the probability that the budget is sufficient is
      P(X ≤ 6) = P(X=3) + P(X=4) + P(X=5) + P(X=6) ≈ 0.2160 + 0.2592 + 0.2074 + 0.1382 ≈ 0.821
  • The expected number of interviews required is E(X) = r/p = 3/0.6 = 5.

  22. 3.3 The Hypergeometric Distribution
      3.3.1 Definition of the Hypergeometric Distribution (1/3)
  • Consider a collection of N items of which r are of a certain kind.
  • If one of the items is chosen at random, the probability that it is of the special kind is clearly r/N.
  • Consequently, if n items are chosen at random with replacement, then the distribution of X, the number of special items chosen, is B(n, r/N).

  23. 3.3.1 Definition of the Hypergeometric Distribution (2/3)
  • However, if n items are chosen at random without replacement, then the distribution of X is the hypergeometric distribution.
  • The hypergeometric distribution has a probability mass function given by
      P(X = x) = C(r, x) C(N−r, n−x) / C(N, n)
    for max(0, n − N + r) ≤ x ≤ min(n, r).

  24. 3.3.1 Definition of the Hypergeometric Distribution (3/3)
  • The expectations are
      E(X) = n r/N and Var(X) = ((N−n)/(N−1)) · n (r/N)(1 − r/N).
  • It represents the distribution of the number of items of a certain kind in a random sample of size n drawn without replacement from a population of size N that contains r items of this kind.
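A direct Python transcription of this pmf and mean, using the milk-container numbers from Example 17 below (the code is an added illustration):

```python
from math import comb

def hypergeom_pmf(x, N, r, n):
    """P(X = x): x special items in a sample of n drawn without
    replacement from N items, of which r are special."""
    return comb(r, x) * comb(N - r, n - x) / comb(N, n)

N, r, n = 16, 6, 5  # Example 17: box of 16 containers, 6 underweight, sample 5
print(round(hypergeom_pmf(2, N, r, n), 3))  # 0.412
print(n * r / N)                            # expected value n*r/N = 1.875
```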

  25. Let X and Y be independent binomial random variables such that X ~ B(n1, p) and Y ~ B(n2, p), and consider the conditional probability mass function of X given that X + Y = n. Since X + Y ~ B(n1 + n2, p),
      P(X = x | X + Y = n) = P(X = x) P(Y = n − x) / P(X + Y = n)
                           = C(n1, x) C(n2, n−x) / C(n1+n2, n),
  because the powers of p and (1−p) cancel. That is, the conditional distribution of X given the value of X + Y is hypergeometric (with N = n1 + n2 and r = n1), and it does not depend on p!
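This cancellation can be confirmed numerically. The sketch below (an added check; the values n1 = 6, n2 = 10, n = 5, p = 0.37 are arbitrary) compares the conditional probabilities with the hypergeometric pmf:

```python
from math import comb

n1, n2, n, p = 6, 10, 5, 0.37  # p is arbitrary; it should cancel out

def b(k, m):
    """pmf of B(m, p) at k."""
    return comb(m, k) * p**k * (1 - p)**(m - k)

# P(X + Y = n) by summing over the ways to split n successes between X and Y
tot = sum(b(k, n1) * b(n - k, n2) for k in range(n + 1))

for x in range(n + 1):
    cond = b(x, n1) * b(n - x, n2) / tot                      # P(X = x | X+Y = n)
    hyp = comb(n1, x) * comb(n2, n - x) / comb(n1 + n2, n)    # hypergeometric pmf
    assert abs(cond - hyp) < 1e-12
print("conditional distribution matches the hypergeometric pmf")
```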

  26. Expectation of X:
  • Let Xi be the following indicator random variable: Xi = 1 if the i-th selection is of the special kind, and 0 otherwise. Then X = X1 + … + Xn is the hypergeometric random variable with parameters (r, N, n).
  • Each selection is equally likely to be any one of the N items, so E(Xi) = P(Xi = 1) = r/N.
  • This gives E(X) = E(X1) + … + E(Xn) = n r/N.
  • Variance of X: derived on the next slide via Cov(Xi, Xj).

  27. Since Xi is a Bernoulli random variable,
      E(Xi²) = E(Xi) = r/N, so Var(Xi) = (r/N)(1 − r/N).
  Also, for i < j,
      E(Xi Xj) = P(Xi = 1, Xj = 1) = (r/N)·((r−1)/(N−1)),
      so Cov(Xi, Xj) = r(r−1)/(N(N−1)) − (r/N)² = −(r/N)(1 − r/N)/(N−1).
  Hence,
      Var(X) = Σ Var(Xi) + 2 Σ_{i<j} Cov(Xi, Xj)
             = n(r/N)(1 − r/N) − n(n−1)(r/N)(1 − r/N)/(N−1)
             = ((N−n)/(N−1)) · n (r/N)(1 − r/N).
  Cf. Let p = r/N. Then E(X) = np and Var(X) = ((N−n)/(N−1)) np(1−p).
  As N goes to infinity, the factor (N−n)/(N−1) tends to 1, and Var(X) tends to the binomial variance np(1−p).
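The variance formula can be sanity-checked against direct enumeration of the pmf. A small added sketch using the Example 17 parameters:

```python
from math import comb

def hypergeom_pmf(x, N, r, n):
    return comb(r, x) * comb(N - r, n - x) / comb(N, n)

N, r, n = 16, 6, 5
p = r / N

# Mean and variance computed directly from the pmf
mean = sum(x * hypergeom_pmf(x, N, r, n) for x in range(n + 1))
var = sum((x - mean) ** 2 * hypergeom_pmf(x, N, r, n) for x in range(n + 1))

# The closed-form variance, and the (larger) binomial variance for comparison
formula = (N - n) / (N - 1) * n * p * (1 - p)
print(round(var, 4), round(formula, 4))  # both 0.8594
print(round(n * p * (1 - p), 4))         # binomial variance 1.1719
```

The without-replacement variance is smaller than the binomial one, reflecting the negative covariance between selections.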

  28. Example 17 : Milk Container Contents (1/3)
  • Suppose that milk is shipped to retail outlets in boxes that hold 16 milk containers. One particular box, which happens to contain six underweight containers, is opened for inspection, and five containers are chosen at random.
  • The number of underweight milk containers in the sample chosen by the inspector has a hypergeometric distribution with N = 16, r = 6, and n = 5.

  29. Example 17 : Milk Container Contents (2/3)
  • The probability that the inspector chooses exactly two underweight containers is
      P(X = 2) = C(6, 2) C(10, 3) / C(16, 5) = 1800/4368 ≈ 0.412
  • The expected number of underweight containers chosen by the inspector is
      E(X) = n r/N = 5 × 6/16 = 1.875

  30. 0.412 0.288 0.206 Probability 0.058 0.034 0.002 1 2 3 4 5 0 Number of underweight milk containers found by inspector Example 17 : Milk Container Contents(3/3)

  31. 3.4 The Poisson Distribution
      3.4.1 Definition of the Poisson Distribution (1/3)
  • The Poisson distribution models, for example, the distribution of:
    • The number of defects in an item
    • The number of radioactive particles emitted by a substance
    • The number of telephone calls received by an operator within a certain time limit
  • That is, the number of "events" that occur within certain specified boundaries.

  32. 3.4.1 Definition of the Poisson Distribution (2/3)
  • A random variable X distributed as a Poisson random variable with parameter λ, which is written X ~ P(λ), has a probability mass function
      P(X = x) = e^(−λ) λ^x / x! for x = 0, 1, 2, …

  33. 3.4.1 Definition of the Poisson Distribution (3/3)
  • The Poisson distribution is often useful to model the number of times that a certain event occurs per unit of time, distance, or volume, and it has a mean and variance both equal to the parameter value λ.
  • The expectations are E(X) = λ and Var(X) = λ.
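The Poisson pmf in a few lines of Python, using λ = 3 as in the Software Errors example below (the code itself is an added illustration):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) = e^(-lam) * lam^x / x! for X ~ P(lam)."""
    return exp(-lam) * lam**x / factorial(x)

lam = 3
print(round(poisson_pmf(0, lam), 4))                         # 0.0498
print(round(sum(poisson_pmf(x, lam) for x in range(3)), 4))  # P(X <= 2) = 0.4232
```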

  34. The Poisson random variable can be used as an approximation for a binomial random variable with parameters (n, p) when n is large and p is small:
  • Let X be a binomial random variable with parameters (n, p), and let λ = np.
  • Then
      P(X = x) = C(n, x) (λ/n)^x (1 − λ/n)^(n−x)
  • For large n and small p,
      (1 − λ/n)^(n−x) ≈ e^(−λ) and n! / ((n−x)! n^x) ≈ 1.
  • Hence,
      P(X = x) ≈ e^(−λ) λ^x / x!
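The quality of the approximation is easy to see numerically. The sketch below (added illustration; the values n = 500, p = 0.01 are arbitrary "large n, small p" choices) prints binomial and Poisson probabilities side by side:

```python
from math import comb, exp, factorial

n, p = 500, 0.01      # large n, small p
lam = n * p           # lambda = np = 5

for x in range(4):
    b = comb(n, x) * p**x * (1 - p)**(n - x)    # exact B(n, p) probability
    po = exp(-lam) * lam**x / factorial(x)      # Poisson(lam) approximation
    print(x, round(b, 4), round(po, 4))         # the two columns nearly agree
```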

  35. Example 3 : Software Errors (1/3)
  • Suppose that the number of errors in a piece of software has a Poisson distribution with parameter λ = 3.
  • [The expected number of errors] = [The variance in the number of errors] = 3.
  • The probability that a piece of software has no errors is
      P(X = 0) = e^(−3) ≈ 0.0498

  36. Example 3 : Software Errors (2/3)
  • The probability that there are three or more errors in a piece of software is
      P(X ≥ 3) = 1 − P(X=0) − P(X=1) − P(X=2)
               = 1 − 0.0498 − 0.1494 − 0.2240 ≈ 0.577

  37. 0.224 0.224 0.168 0.149 Probability 0.101 0.050 0.050 0.022 0.008 0.003 0.001 0 1 2 3 4 5 6 7 8 9 10 11 12 13 Number of software errors 0 1 2 3 4 5 6 7 8 9 10 11 12 13 0.050 0.224 0.168 0.050 0.008 0.001 0.000 0.149 0.224 0.101 0.022 0.003 0.000 0.000 Example 3 : Software Errors(3/3)

  38. 3.5 The Multinomial Distribution
      3.5.1 Definition of the Multinomial Distribution (1/2)
  • Consider a sequence of n independent trials where each individual trial can have k outcomes that occur with constant probability values p1, …, pk with p1 + ··· + pk = 1. The random variables X1, …, Xk that count the number of occurrences of each outcome are said to have a multinomial distribution.
  • The joint probability mass function is
      P(X1 = x1, …, Xk = xk) = (n! / (x1! ··· xk!)) p1^x1 ··· pk^xk
    for nonnegative integer values of the xi satisfying x1 + ··· + xk = n.

  39. 3.5.1 Definition of the Multinomial Distribution (2/2)
  • The random variables X1, …, Xk have expectations and variances given by
      E(Xi) = n pi and Var(Xi) = n pi (1 − pi),
    but they are not independent. (Why? The counts are constrained to satisfy X1 + ··· + Xk = n, so a large value of one count forces the others to be small.)

  40. Example 1 : Machine Breakdowns (1/4)
  • Suppose that the machine breakdowns are attributable to electrical faults, mechanical faults, and operator misuse, and these causes occur with probabilities of 0.2, 0.5, and 0.3, respectively.
  • The engineer is interested in predicting the causes of the next ten breakdowns.
  • X1: the number of breakdowns due to electrical reasons
  • X2: the number of breakdowns due to mechanical reasons
  • X3: the number of breakdowns due to operator misuse

  41. Example 1 : Machine Breakdowns (2/4)
  • X1 + X2 + X3 = 10
  • If the breakdown causes can be assumed to be independent of one another, then the joint probability mass function is
      P(X1 = x1, X2 = x2, X3 = x3) = (10! / (x1! x2! x3!)) 0.2^x1 0.5^x2 0.3^x3
  • The probability that there will be three electrical breakdowns, five mechanical breakdowns, and two misuse breakdowns is
      P(X1 = 3, X2 = 5, X3 = 2) = (10! / (3! 5! 2!)) × 0.2³ × 0.5⁵ × 0.3² ≈ 0.057
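A general multinomial pmf in Python, applied to these numbers (the code is an added illustration):

```python
from math import factorial

def multinomial_pmf(xs, ps):
    """Joint probability of the counts xs under a multinomial
    distribution with n = sum(xs) trials and probabilities ps."""
    n = sum(xs)
    coef = factorial(n)
    for x in xs:
        coef //= factorial(x)   # multinomial coefficient n!/(x1!...xk!)
    prob = float(coef)
    for x, p in zip(xs, ps):
        prob *= p**x
    return prob

# Example 1: ten breakdowns with cause probabilities 0.2, 0.5, 0.3
print(round(multinomial_pmf([3, 5, 2], [0.2, 0.5, 0.3]), 4))  # 0.0567
```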

  42. Example 1 : Machine Breakdowns (3/4)
  • The expected number of electrical breakdowns is E(X1) = 10 × 0.2 = 2.
  • The expected number of mechanical breakdowns is E(X2) = 10 × 0.5 = 5.
  • The expected number of misuse breakdowns is E(X3) = 10 × 0.3 = 3.

  43. Example 1 : Machine Breakdowns (4/4)
  • If the engineer is interested in the probability of there being no more than two electrical breakdowns, then this can be calculated by noting that X1 ~ B(10, 0.2), so that
      P(X1 ≤ 2) = P(X1=0) + P(X1=1) + P(X1=2) ≈ 0.1074 + 0.2684 + 0.3020 ≈ 0.678
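This marginal-binomial shortcut is one line to verify. A short added check:

```python
from math import comb

# Each count in a multinomial is marginally binomial: here X1 ~ B(10, 0.2).
n, p = 10, 0.2
prob = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(3))
print(round(prob, 4))  # P(X1 <= 2) = 0.6778
```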
