
Special Probability Distributions Chapter 5


Presentation Transcript


  1. Special Probability Distributions, Chapter 5 “A throw of the dice will never abolish chance.” Stéphane Mallarmé, French poet MGMT 242

  2. Goals for Chapter 5 • Probability Distributions to understand: • possible outcomes (“choosing” formulas) • Bernoulli Trials • Binomial • Poisson • Uniform • Exponential • Normal • Standard Normal Random Variable; Z-scores MGMT 242

  3. Probability from Possible Outcomes • If we know the number of ways that all possible events can occur, W, and if we know the number of ways a specific event, A, can occur, W(A), and if each way (or possible outcome) is equally likely, then the probability of the event A is given by P(A) = W(A) / W • Example: what is the probability of throwing “7” in craps (two fair dice)? There are 36 = 6 × 6 possible outcomes (i.e., six ways each of the two dice can land); for “7” we can get 1 and 6, 2 and 5, 3 and 4, 4 and 3, 5 and 2, or 6 and 1, so P(throw = 7) = 6/36 = 1/6. MGMT 242
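
A minimal Python sketch of this counting argument, simply enumerating the 36 equally likely outcomes:

    from fractions import Fraction
    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))      # all 36 equally likely (die1, die2) pairs
    W = len(outcomes)                                     # 36
    W_A = sum(1 for d1, d2 in outcomes if d1 + d2 == 7)   # 6 ways to make a 7

    print(Fraction(W_A, W))                               # 1/6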

  4. “Choosing” Formulas for Probability--1 • Use combinatorial (“choosing”) formulas to calculate number of ways for all outcomes and for event of interest to occur: • P(A) = W(A) / W, where W(A) is the number of ways that event A can occur, and W is the total number of ways that all events can occur. MGMT 242

  5. “Choosing” Formulas for Probability--2 • Example: Ex. 5.3, taste test--8 glasses, 4 containing the house brand, 4 the national brand. The tester is to identify the four glasses containing the house brand; how many different choices can he make? W, the number of ways to choose 4 things from 8 objects, is 8! / [4! 4!] = 70 (denoted C(8, 4), “8 choose 4”). • Ex. 5.4, How many of these choices include 3 correct (house brand) and 1 incorrect? W(A) = C(4, 3) × C(4, 1) = 4 × 4 = 16. • What is the probability that the taster gets 3 correct (out of 4) by chance? P(A) = 16 / 70 = 0.229, or about 23%. MGMT 242
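
The same counts can be checked with Python's math.comb (available in Python 3.8+); the variable names W and W_A simply mirror the notation on the slide:

    from math import comb

    W = comb(8, 4)                  # 70: ways to pick which 4 of the 8 glasses are called "house brand"
    W_A = comb(4, 3) * comb(4, 1)   # 16: exactly 3 correct house-brand glasses and 1 incorrect

    print(W, W_A, W_A / W)          # 70 16 0.22857... (about 23%)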

  6. Bernoulli Trials-1 • Count the number of successes in a series of similar events: • number of heads in n coin tosses; • number of defective parts on an assembly line; • number of party-line voters among all voters at a polling place. MGMT 242

  7. Bernoulli Trials-2 If the following conditions are met, each event is a Bernoulli Trial: 1 There are only two possible outcomes for each event--Yes or No; Success or Failure; Test + or -. 2 Statistical independence of success: the probability of success in one event does not depend on whether a success or failure occurred in previous events. 3 The probability of success (π) is constant: the probability of success remains the same for every event. MGMT 242

  8. Examples of non-Bernoulli Trials • Survey households through the year to determine the employment status of the head of household: Not Bernoulli trials--the probability of employment will show a seasonal variation; • Analyze the grade distribution for a graduate course (possible grades: distinction, pass, fail): Not Bernoulli trials--more than two outcomes are possible; • Look at the price of a stock daily over a period of 1 month and check whether the stock price has increased (a success): Not Bernoulli trials--the success of one event is not independent of previous events. MGMT 242

  9. Bernoulli Trials and the Binomial Distribution • For a series of n Bernoulli trials, we can calculate the probability of x successes (e.g., x = 5 heads in 10 tosses) if the order of the success events is not important. • PX(x; n) = C(n, x) π^x (1 − π)^(n − x) is the probability of x successes in n trials, if π is the probability of success in an individual trial. • The coefficient C(n, x) can be derived as follows: if the order of successful trials is not important, then all we have to do is count the number of ways in which x successes can occur in n trials; these are the number of branches in a probability tree (see board demonstration) and give the total probability for x successes. MGMT 242
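
A minimal sketch of this formula in Python, with pi standing in for the success probability π:

    from math import comb

    def binomial_pmf(x, n, pi):
        # PX(x; n) = C(n, x) * pi**x * (1 - pi)**(n - x)
        return comb(n, x) * pi ** x * (1 - pi) ** (n - x)

    print(binomial_pmf(5, 10, 0.5))   # ~0.246, probability of exactly 5 heads in 10 fair tosses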

  10. The Binomial Distribution--Example • Ex. 5.9-11. A 3% discount is given to cash customers at a motel; experience indicates that 30% of all customers will take the discount (π = 0.30) rather than use a credit card. • Probability that exactly 5 of the next 20 customers pay cash? PX(5; 20) = C(20, 5) (0.3)^5 (0.7)^15 = 0.1789 (n = 20, x = 5, π = 0.3); • Probability that 5 or fewer of the next 20 customers pay cash? Need to calculate the cumulative probability: FX(5; 20) = Σ PX(x; 20), where the sum is taken from x = 0 to x = 5. The cumulative probability can be obtained from software or from table values (see App. 1): FX(5; 20) = 0.0008 + 0.0068 + 0.0278 + 0.0716 + 0.1304 + 0.1789 = 0.4163 • What is the probability that at least 5 of the next 20 customers pay cash? P(X ≥ 5) = 1 - FX(4; 20) = 1 - (0.4163 - 0.1789) = 0.7626 MGMT 242
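
If SciPy is available, the same three answers can be read off scipy.stats.binom; the sf (“survival function”) call gives 1 − FX(4; 20) directly:

    from scipy.stats import binom

    n, pi = 20, 0.30

    print(binom.pmf(5, n, pi))   # ~0.179  P(exactly 5 of 20 pay cash)
    print(binom.cdf(5, n, pi))   # ~0.416  P(5 or fewer pay cash)
    print(binom.sf(4, n, pi))    # ~0.763  P(at least 5 pay cash) = 1 - FX(4; 20)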

  11. The Binomial Distribution--Expectation • For the binomial distribution, the expectation value of x, E(x), the mean or long-term expected value for the number of successes, is given by E(x) = nπ • The variance of x is given by V(x) = nπ(1 − π) • Previous example (5.9-5.11): n = 20, π = 0.3, so E(x) = 20 × 0.3 = 6, the average number of people who would pay cash (out of 20); V(x) = 20 × 0.3 × (1 − 0.3) = 4.2 = σX^2, so σX = √4.2 ≈ 2.05; MGMT 242

  12. The Poisson Distribution • The Poisson distribution is useful in considering the frequency of events that occur randomly over time or over spatial dimensions (e.g., the frequency of telephone calls to 911, the number of chocolate bits in a Toll House cookie, the frequency with which ocean liners hit icebergs). • Assumptions and conditions: • Events occur infrequently--two events do not occur simultaneously • Events occur randomly during a time period or in a spatial interval; the probability of occurrence is not affected by previous occurrences. MGMT 242

  13. The Poisson Distribution--Continued • The Poisson distribution is given by the formula PX(x) = e^(−μ) μ^x / x!, where x, the number of events counted during the interval, can have the values x = 0, 1, 2, 3, …, and the quantity μ = E(x) = V(x). (Note that μ is denoted by λ in many other texts; note also that 0! = 1 and that anything to the 0 power equals 1, so PX(0) = e^(−μ).) • Changing the interval changes μ by a proportionate amount. Example: Ex. 5.31, the mean rate of tire failures for logging trucks is 4.0 per 10,000 miles; therefore μ = 0.4 failures per 1,000 miles of driving. MGMT 242

  14. The Poisson Distribution--Continued Example: Ex. 5.32--Tire failures for logging trucks occur according to the Poisson distribution, with μ = 4.0 failures per 10,000 miles as the mean rate. A) If a truck drives 1,000 miles per week, what is the probability that there will be no failures during the week? μ = 0.4 failures per 1,000 miles (see previous slide). PX(0) = e^(−μ) = e^(−0.4) = 0.67 is the probability of no failures (X = 0). B) What is the probability that the truck will have at least two failures (i.e., two or more failures)? P(X ≥ 2) = 1 - PX(0) - PX(1) = 1 - 0.67 - e^(−0.4) (0.4)^1 / 1! = 0.062 MGMT 242
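
A short Python check of both parts, with the Poisson formula written out by hand and μ rescaled from the 10,000-mile rate to the 1,000-mile weekly interval:

    from math import exp, factorial

    def poisson_pmf(x, mu):
        # PX(x) = exp(-mu) * mu**x / x!
        return exp(-mu) * mu ** x / factorial(x)

    mu = 4.0 * (1000 / 10000)        # 4.0 failures per 10,000 miles -> 0.4 per 1,000-mile week

    p_none = poisson_pmf(0, mu)                                   # ~0.67
    p_two_or_more = 1 - poisson_pmf(0, mu) - poisson_pmf(1, mu)   # ~0.062

    print(p_none, p_two_or_more)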

  15. The Uniform Distribution • The Uniform distribution is one of the simplest of all distributions; it is used to describe a situation where the probability density for a value of X is the same within an interval a ≤ X ≤ b: fX(x) = 1/(b − a) (the factor 1/(b − a) ensures that ∫ fX(x) dx = 1 over the interval) • E(x) = (a + b) / 2 for the Uniform distribution, that is, the average or mean value of x is halfway between the limits for x. • V(x) = (b − a)^2 / 12 (derived from V(x) = E(x^2) − [E(x)]^2 and using the formula above for the density function, fX(x) = 1/(b − a)) MGMT 242
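
A rough simulation check of these two formulas; the endpoints a = 2 and b = 8 are arbitrary illustrative values, not taken from the text:

    import random

    a, b = 2.0, 8.0
    sample = [random.uniform(a, b) for _ in range(100_000)]

    mean = sum(sample) / len(sample)
    var = sum((x - mean) ** 2 for x in sample) / len(sample)

    print(mean, (a + b) / 2)         # simulated mean vs. (a + b)/2 = 5.0
    print(var, (b - a) ** 2 / 12)    # simulated variance vs. (b - a)^2/12 = 3.0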

  16. The Uniform Distribution--Example • Ex. 5.39, On summer days, T = the time that a suburban commuter train is late, can be modeled as uniformly distributed between 0 and 20 minutes: P(t ≤ T ≤ t + dt) = (1/20) dt for 0 ≤ t ≤ 20, = 0 otherwise. A) Find the probability that the train is at least 8 minutes late: P(T ≥ 8) = 1 - P(T < 8) = 1 - ∫ from 0 to 8 of (1/20) dt = 1 - 8/20 = 3/5 B) Find the standard deviation for T: V(T) = (20 − 0)^2 / 12 = 33.3 = σT^2, so σT = √33.3 = 5.77 minutes; (Note: the mean late time will be (0 + 20)/2 = 10 minutes) MGMT 242
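
If SciPy is available, the same numbers follow from scipy.stats.uniform, which parameterizes the distribution by loc (the left endpoint) and scale (the width of the interval):

    from scipy.stats import uniform

    T = uniform(loc=0, scale=20)   # uniform on [0, 20]

    print(T.sf(8))     # 0.6 = P(T >= 8), i.e. 3/5
    print(T.std())     # ~5.77 minutes
    print(T.mean())    # 10.0 minutes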

  17. The Exponential Distribution • The exponential distribution is useful in modeling waiting-time problems, or problems involving time to failure (reliability); if the Poisson distribution is a good model for the probability of an event randomly occurring during a given interval of time, then, with the same assumptions, the exponential distribution is an appropriate model for the probability of the time between two events: • fT(t) = λ exp(−λt), for t ≥ 0; fT(t) = 0 for t < 0. • E(t) = 1/λ; • V(t) = 1/λ^2 • P(T ≤ t) = 1 − exp(−λt) is the cumulative distribution function. • (Note that μ for the Poisson distribution, per unit interval, is equal to λ.) MGMT 242

  18. The Exponential Distribution--Example • Ex. 5.44, text. The counter service time for unticketed airline passengers follows an exponential distribution, with a mean time of 5 minutes = 1/λ, so λ = 1/(5 min) = 0.2 min^−1. a) What is P(T ≤ 2.5) (the probability that the service time will be 2.5 minutes or less)? P(T ≤ 2.5) = ∫ from 0 to 2.5 of 0.2 exp(−0.2t) dt = 1 − exp(−0.2 × 2.5) = 0.39 b) The probability that the service time will be longer than 10 minutes: P(T > 10) = 1 − P(T ≤ 10) = exp(−0.2 × 10) = 0.1353 ≈ 0.14 • Ex. 5.45, text. A) What is the expected number of passengers served per minute? μ = 1 × 0.2 = 0.2 per minute C) The probability that no passenger is served within 10 minutes: μ′ = 10 × 0.2 = 2.0 per 10 minutes gives P(X = 0) = e^(−2) = 0.1353 MGMT 242
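
A minimal Python check of these results, including the link back to the Poisson count of services in a 10-minute window:

    from math import exp

    lam = 1 / 5.0                         # mean service time 5 min, so lambda = 0.2 per minute

    p_at_most_2_5 = 1 - exp(-lam * 2.5)   # ~0.39   P(T <= 2.5)
    p_more_than_10 = exp(-lam * 10)       # ~0.135  P(T > 10)

    # Part C via the Poisson count of services in a 10-minute window:
    mu_10min = lam * 10                   # expected number served in 10 minutes = 2.0
    p_none_served = exp(-mu_10min)        # ~0.135, matching P(T > 10)

    print(p_at_most_2_5, p_more_than_10, p_none_served)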

  19. The Normal Probability Distribution • The Normal probability distribution (also called the “Gaussian” or “Bell-shaped Curve”) is the most important distribution in statistics. It can be shown to occur, generally, when a quantity is the sum or average of many components (“The Central Limit Theorem”). A normal distribution is implicitly assumed for many statistical tests (hypothesis tests, confidence limits, t-tests). • fX(x) = (1/(σ√(2π))) exp{−(x − μ)^2 / (2σ^2)}, where μ is the mean value of X (E(X) = μ) and σ is the standard deviation of X (V(x) = σ^2) MGMT 242

  20. The Standard Normal Probability Distribution • The cumulative distribution function, P(X < x), cannot be written as an explicit function of x, but it can be evaluated numerically. However, to do so for every combination of values of μ and σ would be a chore, so a more convenient method is to transform X into a universal, “Standard”, variable, Z, as follows: Z = (X − μ) / σ; • Note that the mean value of Z, E(Z), is 0, and that the variance of Z, V(Z), is 1. • Values of P(Z < z) are given in tables or by software. • Empirical Rule: P(−1 < Z < 1) = 0.68; P(−1.96 < Z < 1.96) = 0.95 MGMT 242
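
P(Z < z) can be evaluated with the error function from Python's math module, which is one way such tables are produced; a minimal sketch checking the Empirical Rule values:

    from math import erf, sqrt

    def phi(z):
        # Cumulative standard normal probability P(Z < z)
        return 0.5 * (1 + erf(z / sqrt(2)))

    print(phi(1) - phi(-1))          # ~0.683  P(-1 < Z < 1)
    print(phi(1.96) - phi(-1.96))    # ~0.950  P(-1.96 < Z < 1.96)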

  21. Z-scores -- Example • The Z-value for a standard normal distribution is often denoted as a “Z-score” (not in this text, however). • Example, Ex. 5.56 (text). The price X of a long-term $1000 bond one year later is normally distributed with mean $980 and SD = $40. A) P(X > $1000) = ? P(X > $1000) = 1 - P(X ≤ $1000); for X = $1000, Z = (1000 − 980)/40 = 0.5; P(Z ≤ 0.5) = 0.5 + 0.1915 = 0.6915 (Table 3, p. 800), so P(X > $1000) = 1 - 0.6915 = 0.3085, or about 31%; • 5.57b. What is the price, x′, such that the probability is 60% that the value of the bond will exceed x′? For P(Z ≤ z′) = 1 - 0.6 = 0.4, z′ = −0.2533 (Table 3), so x′ = −0.2533 × 40 + 980 = $969.87 MGMT 242
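
If SciPy is available, both answers can be checked with scipy.stats.norm; ppf is the inverse of the cumulative distribution function:

    from scipy.stats import norm

    X = norm(loc=980, scale=40)    # bond price: mean $980, SD $40

    print(X.sf(1000))              # ~0.3085  P(X > $1000)
    print(X.ppf(1 - 0.60))         # ~969.9   price x' with P(X > x') = 0.60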
