
Topics






Presentation Transcript


1. Topics • Review of probability theory • Random variables • Conditional probability and conditional expectation • The analysis of variance • Introduction • Single-factor ANOVA • Simple linear regression and correlation • Introduction • The simple linear regression model • Estimation of model parameters • Inferences about the slope parameters Dr. Ahmed M. Sultan

2. Topics (Cont.) • Multivariate regression analysis • When to use multivariate regression • Control variables • Interpreting coefficients • Goodness of fit (R squared statistic) • The exponential distribution and the Poisson process • Queueing theory • The M/M/1 queue • Steady state probabilities • Some performance measures

3. Topics (Cont.) • The M/M/m queue • Steady state probabilities • Some performance measures • The M/M/1/K queue • Steady state probabilities • Some performance measures • Discrete event simulation • Generating pseudo random numbers • Congruential methods for generating pseudo random numbers • Composite generators • Statistical tests for goodness of fit

4. Topics (Cont.) • Generating stochastic variables • The inverse transformation method • Sampling from continuous probability distributions • Data manipulation in MINITAB • Recording and transforming variables • Graphs and charts • Scatter plots • Histograms • Box plots and other charts • Cross tabulation

5. References • Devore, J., “Probability and Statistics for Engineering and the Sciences” • Willig, A., “A Short Introduction to Queueing Theory” • Banks, J., et al., “Discrete-Event System Simulation”

6. Review of probability theory • Laws of probability DEFINITION: If an event E occurs m times in an n-trial experiment, then the probability P(E) is defined as P(E) = lim (m/n) as n → ∞, i.e., the experiment is repeated infinitely often.

7. e.g., in the case of flipping a coin, the longer the experiment is repeated, the closer the estimate of P(H) (or P(T)) will be to the theoretical value of 0.5. In general: 0 ≤ P(E) ≤ 1; P(E) = 0 … E is impossible; P(E) = 1 … E is certain (sure).
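The convergence described on this slide is easy to see in a quick simulation. The Python sketch below is added for illustration and is not part of the original slides; the function name estimate_p_heads is made up for this example. It flips a simulated fair coin n times and prints the relative frequency of heads, which settles near the theoretical value 0.5 as n grows.

    import random

    def estimate_p_heads(n_flips: int, seed: int = 0) -> float:
        """Estimate P(H) as the relative frequency of heads in n_flips simulated tosses."""
        rng = random.Random(seed)
        heads = sum(rng.random() < 0.5 for _ in range(n_flips))
        return heads / n_flips

    # The estimate approaches the theoretical value 0.5 as the number of flips grows.
    for n in (10, 100, 10_000, 1_000_000):
        print(n, estimate_p_heads(n))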

8. HW • In a study to correlate senior-year high school students' scores in mathematics with enrollment in engineering colleges, 1000 students were surveyed; 400 of them had studied mathematics. Engineering enrollment records show that, of the 1000 seniors, 150 of those who enrolled in engineering had studied mathematics and 29 had not. Determine the probability that: • A student who studied mathematics is enrolled in engineering • A student neither studied mathematics nor is enrolled in engineering • A student is not studying engineering

9. 1.1 Addition law of probability EUF … union of E and F; EF … intersection of E and F. If EF = ɸ, E and F are mutually exclusive (the occurrence of one precludes the other). • Addition law: P(EUF) = P(E) + P(F) - P(EF)

10. Example Rolling a die: the sample space is S = {1, 2, 3, 4, 5, 6}, with P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6. Define E = {1, 2, 3, 4} and F = {3, 4, 5}, so EF = {3, 4}. Then P(E) = P(1) + P(2) + P(3) + P(4) = 4/6 = 2/3, P(F) = 3/6 = 1/2, and P(EF) = 2/6 = 1/3, so P(EUF) = P(E) + P(F) - P(EF) = 2/3 + 1/2 - 1/3 = 5/6, which is intuitively clear since EUF = {1, 2, 3, 4, 5}.
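To make the arithmetic on this slide easy to check, here is a short Python sketch added for illustration (the helper name prob is made up, not part of the lecture). It enumerates the outcomes of a fair die and verifies the addition law for these particular E and F.

    from fractions import Fraction

    sample_space = {1, 2, 3, 4, 5, 6}
    E = {1, 2, 3, 4}
    F = {3, 4, 5}

    def prob(event: set) -> Fraction:
        """P(event) for a fair die: favourable outcomes / total outcomes."""
        return Fraction(len(event & sample_space), len(sample_space))

    # Addition law: P(EUF) = P(E) + P(F) - P(EF)
    assert prob(E | F) == prob(E) + prob(F) - prob(E & F)
    print(prob(E), prob(F), prob(E & F), prob(E | F))   # 2/3 1/2 1/3 5/6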

11. HW • A fair die is tossed twice; E and F represent the outcomes of the two tosses. Compute the following probabilities: • The sum of E and F is 11 • The sum of E and F is even • The sum of E and F is odd and greater than 3 • E is even and less than 6, and F is odd and greater than 1 • E is greater than 2 and F is less than 4 • E is 4 and the sum of E and F is odd

12. 1.2 Conditional Probability • The conditional probability of an event E is the probability that the event will occur given the knowledge that an event F has already occurred. This probability is written P(E|F), notation for the probability of E given F. • In the case where events E and F are independent (where event F has no effect on the probability of event E), the conditional probability of event E given event F is simply the probability of event E, that is P(E).

13. …(Cont.) • If events E and F are not independent, then the probability of the intersection of E and F (the probability that both events occur) is defined by P(E and F) = P(F)P(E|F). • From this definition, the conditional probability P(E|F) is easily obtained by dividing by P(F): • P(E|F) = P(EF) / P(F) , P(F) > 0 • Note: This expression is only valid when P(F) is greater than 0.

14. Example In rolling a die, what is the probability that the outcome is 6, given that the roll turned up an even number? Solution E = {6}, F = {2, 4, 6}, thus P(E|F) = P(EF)/P(F) = P(E)/P(F) = (1/6)/(1/2) = 1/3. Note that P(EF) = P(E) because E is a subset of F.
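A quick way to sanity-check this conditional probability is to enumerate the outcomes directly. The sketch below is added for illustration only; prob and cond_prob are hypothetical helper names. It computes P(E|F) = P(EF)/P(F) for the die example above.

    from fractions import Fraction

    sample_space = {1, 2, 3, 4, 5, 6}
    E = {6}          # the outcome is 6
    F = {2, 4, 6}    # the outcome is even

    def prob(event: set) -> Fraction:
        return Fraction(len(event & sample_space), len(sample_space))

    def cond_prob(e: set, f: set) -> Fraction:
        """P(E|F) = P(EF) / P(F), valid only when P(F) > 0."""
        return prob(e & f) / prob(f)

    print(cond_prob(E, F))   # 1/3, matching the slide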

15. Example Ninety percent of flights depart on time. Eighty percent of flights arrive on time. Seventy-five percent of flights depart on time and arrive on time. (a) You are meeting a flight that departed on time. What is the probability that it will arrive on time? (b) You have met a flight, and it arrived on time. What is the probability that it departed on time? (c) Are the events, departing on time and arriving on time, independent?

16. Solution Denote the events A = {arriving on time}, D = {departing on time}. P{A} = 0.8, P{D} = 0.9, P{AD} = 0.75. (a) P{A | D} = P{AD} / P{D} = 0.75 / 0.9 = 0.8333 (b) P{D | A} = P{AD} / P{A} = 0.75 / 0.8 = 0.9375 (c) The events are not independent because P{A | D} ≠ P{A}, P{D | A} ≠ P{D}, and P{AD} ≠ P{A}P{D}. Actually, any one of these inequalities is sufficient to prove that A and D are dependent. Further, we see that P{A | D} > P{A} and P{D | A} > P{D}. In other words, departing on time increases the probability of arriving on time, and vice versa. This perfectly agrees with our intuition.
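The arithmetic and the independence check on this slide can be reproduced in a few lines; the following Python snippet is only an illustration of the calculation, using the probabilities stated in the example.

    p_a, p_d, p_ad = 0.8, 0.9, 0.75   # P{A}, P{D}, P{AD} from the slide

    p_a_given_d = p_ad / p_d          # (a) = 0.8333...
    p_d_given_a = p_ad / p_a          # (b) = 0.9375
    independent = abs(p_ad - p_a * p_d) < 1e-12   # (c) False: A and D are dependent

    print(round(p_a_given_d, 4), round(p_d_given_a, 4), independent)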

17. HW In the die-tossing example, given that the outcome is less than 6, determine: • The probability of getting an even number • The probability of getting an odd number larger than 1

18. HW • You can toss a fair coin up to 7 times. You will win 1000 SR if three tails appear before a head is encountered. What are your chances of winning?

19. HW Graduating high school seniors with an ACT score of at least 26 can apply to two universities, A and B, for admission. The probability of being accepted in A is 0.4 and in B is 0.25. The chance of being accepted in both universities is only 15%. • Determine the probability that the student is accepted in B given that A has granted admission as well • What is the probability that admission will be granted in A given that the student was accepted in B?

20. Random variables Definition: • Consider a random experiment with sample space S. A random variable X(ζ) is a single-valued real function that assigns a real number called the value of X(ζ) to each sample point ζ of S. Often, we use a single letter X for this function in place of X(ζ) and use r.v. to denote the random variable.

21. …(Cont.) • Note that the terminology used here is traditional. Clearly a random variable is not a variable at all in the usual sense; rather, it is a function. • The sample space S is termed the domain of the r.v. X, and the collection of all numbers [values of X(ζ)] is termed the range of the r.v. X. Thus the range of X is a certain subset of the set of all real numbers.

22. …(Cont.) • Note that two or more different sample points might give the same value of X(ζ), but two different numbers in the range cannot be assigned to the same sample point. • EXAMPLE In the experiment of tossing a coin, we might define the r.v. X as: X(H) = 1 X(T) = 0 Note that we could also define another r.v., say Y or Z, with Y(H) = 0, Y(T) = 1 or Z(H) = 1, Z(T) = 2

23. EXAMPLE Consider an experiment of tossing 3 fair coins and counting the number of heads. Certainly, the same model suits the number of girls in a family with 3 children, the number of 1's in a random binary code consisting of 3 characters, etc. Let X be the number of heads (girls, 1's). Prior to an experiment, its value is not known. All we can say is that X has to be an integer between 0 and 3. Since assuming each value is an event, we can compute probabilities: • P{X = 0} = P{three tails} = P{TTT} = (1/2)(1/2)(1/2) = 1/8 • P{X = 1} = P{HTT} + P{THT} + P{TTH} = 3/8 • P{X = 2} = P{HHT} + P{HTH} + P{THH} = 3/8 • P{X = 3} = P{HHH} = 1/8
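The pmf on this slide can be reproduced by brute-force enumeration of the 8 equally likely outcomes. The Python sketch below is an added illustration of that enumeration, not part of the original lecture.

    from collections import Counter
    from fractions import Fraction
    from itertools import product

    # Enumerate all 2**3 equally likely outcomes of tossing 3 fair coins.
    outcomes = list(product("HT", repeat=3))
    counts = Counter(outcome.count("H") for outcome in outcomes)

    # pmf of X = number of heads
    pmf = {x: Fraction(counts[x], len(outcomes)) for x in range(4)}
    print(pmf)   # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}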

24. …(Cont.) • Summarizing:

    x       P{X = x}
    0       1/8
    1       3/8
    2       3/8
    3       1/8
    Total   1

• This table contains everything that is known about random variable X prior to the experiment.

25. …(Cont.) • Before we know the outcome ζ, we cannot tell what X equals. However, we can list all the possible values of X and determine the corresponding probabilities. • The collection of all the probabilities related to X is the distribution of X. The function P(x) = P{X = x} is the probability mass function, or pmf. The cumulative distribution function, or cdf, is defined as F(x) = P{X ≤ x} = Σ_{y ≤ x} P(y)
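As a small illustration of the pmf and cdf definitions (an added sketch, not from the slides), the code below builds F(x) = Σ_{y ≤ x} P(y) from the three-coin pmf computed earlier.

    from fractions import Fraction

    # pmf of the number of heads in 3 fair coin tosses (from the earlier example).
    pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

    def cdf(x: int) -> Fraction:
        """F(x) = P{X <= x} = sum of P(y) over all y <= x."""
        return sum((p for y, p in pmf.items() if y <= x), Fraction(0))

    print([cdf(x) for x in range(4)])   # [1/8, 1/2, 7/8, 1]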

26. Types of random variables • There are two types of random variables: • A discrete random variable can take on only specified, distinct values. • A continuous random variable can take on any value within an interval.

27. Probability distribution for a discrete random variable • A probability distribution for a discrete random variable is a mutually exclusive listing of all possible numerical outcomes for that random variable, such that a particular probability of occurrence is associated with each outcome. • Probability distribution for the toss of a die:

    x   P{X = x}
    1   1/6
    2   1/6
    3   1/6
    4   1/6
    5   1/6
    6   1/6

• This is an example of a uniform distribution.

28. Discrete Probability Distributions • Discrete probability distributions have 3 major properties: • 1) ∑ P(x) = 1, summed over all possible values x • 2) P(x) ≥ 0 for every x • 3) When you substitute a particular value of the random variable into the function, you get the probability that that value will occur. • Three major discrete probability distributions: the Binomial distribution, the Hypergeometric distribution, and the Poisson distribution.
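To tie the three properties to a concrete case, here is a brief Python sketch added for illustration (binomial_pmf is a made-up helper, not from the lecture). It builds the Binomial(n, p) pmf with math.comb and checks that the probabilities are non-negative and sum to 1; the Poisson and hypergeometric pmfs can be checked the same way.

    from fractions import Fraction
    from math import comb

    def binomial_pmf(n: int, p: Fraction) -> dict:
        """P(X = k) = C(n, k) * p**k * (1 - p)**(n - k) for k = 0..n."""
        return {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

    pmf = binomial_pmf(n=3, p=Fraction(1, 2))        # same model as the 3-coin example
    assert all(prob >= 0 for prob in pmf.values())   # property 2: P(x) >= 0
    assert sum(pmf.values()) == 1                    # property 1: sum over all x of P(x) = 1
    print(pmf)   # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}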
