
Exam 2: Rules


Presentation Transcript


  1. Exam 2: Rules Section 2.1 Bring a cheat sheet: one page, two sides. Bring a calculator. Bring your book to use the tables in the back.

  2. Exam 2: Rules Section 2.1 Five questions: one fill in the blanks, one multiple choice, three to solve. One of those three is from your homework. Show work; if you do, you might get partial credit.

  3. In studying uncertainty: Identify the experiment of interest and understand it well (including the associated population) Identify the sample space (all possible outcomes) Identify an appropriate random variable that reflects what you are studying (and simple events based on this random variable) Construct the probability distribution associated with the simple events based on the random variable

  4. Ch3: 3.1 Random Variables 3.2 Probability distributions for discrete random variables 3.3 Expected values 3.4 The Binomial Distribution 3.5 Negative Binomial and Hypergeometric 3.6 The Poisson

  5. Random Variables Section 3.1 A random variable is a function on the outcomes of an experiment, i.e. a function on the outcomes in S. A discrete random variable is one with a sample space that is finite or countably infinite. (Countably infinite = infinite, yet the values can be matched one-to-one with the integers.) A continuous random variable is one whose sample space is continuous, e.g. an entire interval of real numbers.

  6. Probability distributions for discrete rvs Section 3.2 For discrete random variables, we call P(X = x) = p(x) the probability mass function (pmf). From the axioms of probability, we can show that: 1. p(x) >= 0 for every x, and 2. the sum of p(x) over all possible x equals 1. The CDF, F(x), is defined to be F(x) = P(X <= x) = the sum of p(y) over all y <= x.
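As a concrete sketch, the pmf and CDF of a discrete random variable can be written directly in Python; the fair-die example is ours, not from the slides:

```python
# Sketch: pmf and CDF of a discrete random variable (a fair six-sided die).

# pmf: P(X = x) for each outcome x
pmf = {x: 1/6 for x in range(1, 7)}

def cdf(x):
    """F(x) = P(X <= x): sum the pmf over all outcomes <= x."""
    return sum(p for outcome, p in pmf.items() if outcome <= x)

print(sum(pmf.values()))  # the probabilities sum to 1
print(cdf(3))             # P(X <= 3) = 3/6 = 0.5
```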

  7. Probability distributions for discrete rvs Section 3.2 So, for any two integers a, b where a <= b, P(a <= X <= b) = F(b) - F(a - 1). We can also recover the pmf from the CDF if we note that p(x) = F(x) - F(x - 1) for an integer-valued X (in general, p(x) is the jump of F at x).

  8. Expected values Section 3.3 The expected value E(X) of a discrete random variable is the weighted average, or mean, of that random variable: E(X) = sum over x of x * p(x). The variance of a discrete random variable is the weighted average of the squared distance from the mean: V(X) = sum over x of (x - mu)^2 * p(x). The standard deviation is sigma = sqrt(V(X)). Let h(X) be a function and a and b be constants; then E[h(X)] = sum over x of h(x) * p(x), E(aX + b) = a E(X) + b, and V(aX + b) = a^2 V(X).
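The weighted-average formulas above can be checked numerically; the fair die and the constants a, b below are illustrative choices, not from the slides:

```python
# Sketch: E(X), V(X), and sigma for a discrete rv (fair-die example).
import math

pmf = {x: 1/6 for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())               # E(X) = sum x*p(x)
var = sum((x - mean) ** 2 * p for x, p in pmf.items())  # V(X) = sum (x-mu)^2 p(x)
sd = math.sqrt(var)                                     # sigma = sqrt(V(X))

# Linearity: E(aX + b) = a*E(X) + b
a, b = 2, 5
mean_lin = sum((a * x + b) * p for x, p in pmf.items())
print(mean)      # 3.5
print(var)       # 35/12, about 2.9167
print(mean_lin)  # 2*3.5 + 5 = 12.0
```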

  9. Discrete Probability & Expected values Section 3.2-6 Talked about the following distributions: Bernoulli Binomial Hypergeometric Negative Binomial Geometric Poisson

  10. Discrete Probability & Expected values Section 3.2-6 Bernoulli Two possible outcomes, S and F, with probability of success = p. S = {S, F}. The pmf is p(1) = p and p(0) = 1 - p, with E(X) = p and V(X) = p(1 - p).

  11. Discrete Probability & Expected values Section 3.2-6 Binomial The experiment consists of a group of n independent Bernoulli sub-experiments, where n is fixed in advance of the experiment and the probability of a success is p. What we are interested in studying is the number of successes that we may observe in any run of such an experiment.

  12. Discrete Probability & Expected values Section 3.2-6 Binomial The binomial random variable X = the number of successes (S's) among n Bernoulli trials or sub-experiments. We say X is distributed Binomial with parameters n and p, X ~ Bin(n, p). The pmf can be written (notation depends on the book) as b(x; n, p) = C(n, x) p^x (1 - p)^(n - x) for x = 0, 1, ..., n.

  13. Discrete Probability & Expected values Section 3.2-6 Binomial The CDF can be written (also depending on the book) as B(x; n, p) = the sum of b(y; n, p) for y = 0, ..., x. Tabulated in Table A.1, pages 664-666.
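A minimal sketch of the binomial pmf and CDF using Python's math.comb; the values n = 10, p = 0.5 are our example, not from the slides:

```python
# Sketch: binomial pmf and CDF built from the formulas above.
from math import comb

def binom_pmf(x, n, p):
    """b(x; n, p) = C(n, x) p^x (1-p)^(n-x)"""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def binom_cdf(x, n, p):
    """B(x; n, p) = sum of b(y; n, p) for y = 0..x"""
    return sum(binom_pmf(y, n, p) for y in range(x + 1))

print(binom_pmf(5, 10, 0.5))   # 252/1024 = 0.24609375
print(binom_cdf(10, 10, 0.5))  # the whole support sums to 1
```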

  14. The Binomial Distribution Section 3.4 When to use the binomial distribution? When we have n independent Bernoulli trials. When the Bernoulli trials come from sampling n individuals (parts, animals, ...) from a population with replacement. When the Bernoulli trials come from sampling n individuals (parts, animals, ...) from a population of size N without replacement, provided n/N < 5%.

  15. Discrete Probability & Expected values Section 3.2-6 Hypergeometric The experiment consists of drawing a sample of n individuals, without replacement, from a population of size N that contains M successes. Because the draws are made without replacement, the trials are dependent, so this is not a binomial experiment. What we are interested in studying is the number of successes that we may observe in such a sample.

  16. Discrete Probability & Expected values Section 3.2-6 Hypergeometric The hypergeometric random variable X = the number of successes (S’s) among n trials or sub-experiments. We say X is distributed Hypergeometric with parameters N, M and n

  17. Discrete Probability & Expected values Section 3.2-6 Hypergeometric The pmf can be written (depending on the book) as h(x; n, M, N) = C(M, x) C(N - M, n - x) / C(N, n). The CDF is F(x) = the sum of h(y; n, M, N) for y up to x.

  18. Discrete Probability & Expected values Section 3.2-6 Hypergeometric Mean and variance: E(X) = n(M/N) and V(X) = ((N - n)/(N - 1)) * n * (M/N) * (1 - M/N).
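The hypergeometric pmf and its mean can be sketched the same way; the population values N = 20, M = 7, n = 5 are illustrative, not from the slides:

```python
# Sketch: hypergeometric pmf and mean, checked against E(X) = n*M/N.
from math import comb

def hyper_pmf(x, n, M, N):
    """h(x; n, M, N) = C(M, x) C(N-M, n-x) / C(N, n)"""
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

N, M, n = 20, 7, 5
support = range(0, min(n, M) + 1)
total = sum(hyper_pmf(x, n, M, N) for x in support)
mean = sum(x * hyper_pmf(x, n, M, N) for x in support)
print(total)  # pmf sums to 1 over the support
print(mean)   # E(X) = n*M/N = 5*7/20 = 1.75
```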

  19. Discrete Probability & Expected values Section 3.2-6 Negative Binomial The experiment consists of a group of independent Bernoulli sub-experiments, where r (not n), the number of successes we are looking to observe, is fixed in advance of the experiment and the probability of a success is p. What we are interested in studying is the number of failures that precede the rth success. Called negative binomial because instead of fixing the number of trials n we fix the number of successes r.

  20. Discrete Probability & Expected values Section 3.2-6 Negative Binomial The negative binomial random variable X = the number of failures (F's) until the rth success. We say X is distributed negative Binomial with parameters r and p; the pmf is nb(x; r, p) = C(x + r - 1, r - 1) p^r (1 - p)^x for x = 0, 1, 2, ...

  21. Discrete Probability & Expected values Section 3.2-6 Negative Binomial The CDF is F(x) = the sum of nb(y; r, p) for y = 0, ..., x (there is no simple closed form). Also, E(X) = r(1 - p)/p and V(X) = r(1 - p)/p^2.
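A short sketch of the negative binomial pmf; r and p below are example values, and the r = 1 case shows the reduction to the geometric distribution discussed next:

```python
# Sketch: negative binomial pmf (X = failures before the r-th success).
from math import comb

def nb_pmf(x, r, p):
    """nb(x; r, p) = C(x+r-1, r-1) p^r (1-p)^x, x = 0, 1, 2, ..."""
    return comb(x + r - 1, r - 1) * p**r * (1 - p)**x

# With r = 1 this reduces to the geometric pmf p(1-p)^x.
print(nb_pmf(3, 1, 0.5))  # 0.5 * 0.5^3 = 0.0625

# The support is infinite; summing many terms approaches 1.
approx_total = sum(nb_pmf(x, 2, 0.3) for x in range(500))
print(approx_total)
```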

  22. Discrete Probability & Expected values Section 3.2-6 Geometric A special case of the negative binomial is when r = 1; then we call the distribution geometric. The geometric random variable X = the number of failures (F's) until the 1st success. We say X is distributed geometric with parameter p; the pmf is p(x) = (1 - p)^x p for x = 0, 1, 2, ...

  23. Discrete Probability & Expected values Section 3.2-6 Geometric The CDF is F(x) = 1 - (1 - p)^(x + 1) for x = 0, 1, 2, ...
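The geometric pmf and CDF are simple enough to verify against each other; p = 0.25 is an illustrative value, not from the slides:

```python
# Sketch: geometric pmf and closed-form CDF (failures before 1st success).
def geom_pmf(x, p):
    """P(X = x) = (1-p)^x * p, x = 0, 1, 2, ..."""
    return (1 - p) ** x * p

def geom_cdf(x, p):
    """F(x) = 1 - (1-p)^(x+1)"""
    return 1 - (1 - p) ** (x + 1)

p = 0.25
print(geom_pmf(0, p))  # 0.25
print(geom_cdf(2, p))  # 1 - 0.75^3 = 0.578125
# The closed form matches the running sum of the pmf:
print(sum(geom_pmf(x, p) for x in range(3)))  # also 0.578125
```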

  24. Discrete Probability & Expected values Section 3.2-6 Poisson We can get to the Poisson model in two ways: As an approximation of the Binomial distribution As a model describing the Poisson process

  25. Discrete Probability & Expected values Section 3.2-6 Poisson Approximating the Binomial distribution Rules for approximation: The mathematical statement is: if n -> infinity, p -> 0, and np -> lambda > 0, then b(x; n, p) -> e^(-lambda) lambda^x / x!. In practice: if n is large (> 50) and p is small, such that np < 5, then we can approximate Bin(n, p) with Poisson(lambda), where lambda = np.

  26. Discrete Probability & Expected values Section 3.2-6 Poisson Approximating the Binomial distribution Poisson random variable X = the number of successes (S's). We say X is distributed Poisson with parameter lambda; the pmf is p(x; lambda) = e^(-lambda) lambda^x / x! for x = 0, 1, 2, ...

  27. Discrete Probability & Expected values Section 3.2-6 Poisson Approximating the Binomial distribution The CDF is F(x; lambda) = the sum of p(y; lambda) for y = 0, ..., x. Tabulated in Table A.2, page 667.
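The practical approximation rule (n large, np small) can be checked numerically; n = 100 and p = 0.02 are our example values, not from the slides:

```python
# Sketch: Poisson(lambda = n*p) as an approximation to Binomial(n, p).
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x, lam):
    """p(x; lambda) = e^(-lambda) lambda^x / x!"""
    return exp(-lam) * lam**x / factorial(x)

n, p = 100, 0.02
lam = n * p  # lambda = np = 2
for x in range(4):
    # the two pmfs agree to within a few thousandths here
    print(x, binom_pmf(x, n, p), poisson_pmf(x, lam))
```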

  28. Discrete Probability & Expected values Section 3.2-6 Poisson As a model describing the Poisson process This is a process of counting events, usually over time. Assumptions: There exists a parameter alpha > 0 such that, for any short time interval of length delta-t, the probability that exactly one event occurs is approximately alpha * delta-t. There is a very small chance that 2 or more events will occur in such a short interval. The number of events observed in any interval is independent of the number occurring in any other non-overlapping interval.

  29. Discrete Probability & Expected values Section 3.2-6 Poisson As a model describing the Poisson process Poisson random variable X = the number of events within a time period of length t. We say X is distributed Poisson with parameter alpha*t; the pmf is P(X = x) = e^(-alpha t) (alpha t)^x / x! for x = 0, 1, 2, ...

  30. Discrete Probability & Expected values Section 3.2-6 Poisson CDF: Tabulated in Table A.2, page 667
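A sketch of counting events in a Poisson process; the rate alpha = 3 events per hour and window t = 2 hours are illustrative values, not from the slides:

```python
# Sketch: counts over a window of length t in a Poisson process with
# rate alpha are Poisson with parameter lambda = alpha * t.
from math import exp, factorial

def poisson_pmf(x, lam):
    """p(x; lambda) = e^(-lambda) lambda^x / x!"""
    return exp(-lam) * lam**x / factorial(x)

alpha, t = 3.0, 2.0   # e.g. 3 events per hour, watched for 2 hours
lam = alpha * t       # expected count in the window = 6
p_le_4 = sum(poisson_pmf(x, lam) for x in range(5))  # CDF value F(4)
print(lam)     # 6.0
print(p_le_4)  # P(X <= 4), about 0.2851
```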

  31. Ch4: 4.1 Probability Density Functions 4.2 CDFs and Expected Values 4.3 The Normal Distribution 4.4 The Exponential Distribution

  32. Continuous pdfs, CDFs and Expectation Section 4.1-2 For continuous random variables, we call f(x) the probability density function (pdf). From the axioms of probability, we can show that: 1. f(x) >= 0 for all x, and 2. the total area under f(x) equals 1. The CDF is F(x) = P(X <= x) = the integral of f(y) from -infinity to x.

  33. Continuous pdfs, CDFs and Expectation Section 4.1-2 So, for any two numbers a, b where a < b, P(a <= X <= b) = F(b) - F(a). We can also find the pdf from the CDF if we note that f(x) = F'(x), the derivative of the CDF wherever it exists.

  34. Expected values Section 4.2 For a continuous random variable, E(X) = the integral of x f(x) over all x, V(X) = the integral of (x - mu)^2 f(x) over all x, and sigma = sqrt(V(X)).

  35. Continuous random variables Section 4.2-6 Talked about the following distributions: Uniform Normal Exponential

  36. Continuous random variables Section 4.2-6 Uniform X ~ Uniform(A, B) has pdf f(x) = 1/(B - A) for A <= x <= B (and 0 otherwise), with E(X) = (A + B)/2 and V(X) = (B - A)^2/12.

  37. Continuous random variables Section 4.2-6 Normal The most important distribution of classical and applied statistics. The pdf is f(x) = (1/(sigma sqrt(2 pi))) e^(-(x - mu)^2 / (2 sigma^2)). The CDF has no closed form and is evaluated from tables or numerically. Expectation: E(X) = mu and V(X) = sigma^2.

  38. Continuous random variables Section 4.2-6 Normal The standard Normal Z is said to have a standard normal distribution with mean = mu = 0 and standard deviation = sigma = 1; its pdf is f(z) = (1/sqrt(2 pi)) e^(-z^2/2). Its CDF, Phi(z) = P(Z <= z), is provided by Table A.3, pages 668-669.
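Instead of looking up Table A.3, the standard normal CDF can be computed with math.erf via the standard identity Phi(z) = (1 + erf(z/sqrt(2)))/2; this is a sketch, not part of the slides:

```python
# Sketch: standard normal CDF via the error function.
from math import erf, sqrt

def phi(z):
    """Standard normal CDF: Phi(z) = P(Z <= z) = (1 + erf(z/sqrt(2))) / 2."""
    return 0.5 * (1 + erf(z / sqrt(2)))

print(phi(0))                  # 0.5 by symmetry
print(phi(1.96))               # about 0.9750, matching the table entry
print(phi(1.96) - phi(-1.96))  # about 0.95, the familiar 95% interval
```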

  39. Continuous random variables Section 4.2-6 Normal Percentiles z_alpha = the value of Z that cuts off an upper-tail area of alpha; i.e. z_alpha equals the 100(1 - alpha)th percentile of the standard normal. If X ~ Bin(n, p), with np >= 10 and n(1 - p) >= 10, then we can use the normal distribution to approximate this distribution as follows: X is approximately Normal with mean np and standard deviation sqrt(np(1 - p)).

  40. Continuous random variables Section 4.2-6 Exponential Commonly used to model component lifetime (if that component can be assumed not to change over time) and times between occurrences of events in a Poisson process. It is the continuous analog of, and a good approximation to, the geometric distribution.

  41. Continuous random variables Section 4.2-6 Exponential We say that a random variable is exponentially distributed, X ~ Exp(lambda), governed by parameter lambda, if the pdf of its distribution is f(x) = lambda e^(-lambda x) for x >= 0 (and 0 otherwise). The CDF is F(x) = 1 - e^(-lambda x) for x >= 0. Expectation: E(X) = 1/lambda, and also sigma = 1/lambda.
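A quick sketch of the exponential pdf, CDF, and mean; lambda = 0.5 is an illustrative rate, not from the slides:

```python
# Sketch: exponential distribution with rate lambda.
from math import exp

lam = 0.5  # rate parameter (our example value)

def exp_pdf(x):
    """f(x) = lambda * e^(-lambda x), x >= 0"""
    return lam * exp(-lam * x) if x >= 0 else 0.0

def exp_cdf(x):
    """F(x) = 1 - e^(-lambda x), x >= 0"""
    return 1 - exp(-lam * x) if x >= 0 else 0.0

# P(1 <= X <= 2) = F(2) - F(1) = e^(-0.5) - e^(-1), about 0.2387
print(exp_cdf(2) - exp_cdf(1))
print(1 / lam)  # E(X) = 1/lambda = 2.0
```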

  42. For any type of random variable Chebyshev's rule: no matter what probability distribution you are looking at, the chance that an observed simple event of an experiment (from now on we will hand-wave it and call it an outcome) falls within k standard deviations of the mean of the distribution is at least 1 - 1/k^2. In simple math: P(|X - mu| < k sigma) >= 1 - 1/k^2, or equivalently P(|X - mu| >= k sigma) <= 1/k^2, for any k > 1.
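Chebyshev's bound holds for any distribution, so it can be checked against a concrete one; here the fair die (our example) with k = 1.2:

```python
# Sketch: verify P(|X - mu| < k*sigma) >= 1 - 1/k^2 on the fair-die pmf.
import math

pmf = {x: 1/6 for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())                          # 3.5
sigma = math.sqrt(sum((x - mu)**2 * p for x, p in pmf.items()))  # sqrt(35/12)

k = 1.2
bound = 1 - 1 / k**2  # Chebyshev lower bound, about 0.3056
actual = sum(p for x, p in pmf.items() if abs(x - mu) < k * sigma)
print(bound)
print(actual)  # 4/6: the actual probability is at least the bound
```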
