
MOMENT GENERATING FUNCTION AND STATISTICAL DISTRIBUTIONS

Presentation Transcript


  1. MOMENT GENERATING FUNCTION AND STATISTICAL DISTRIBUTIONS

  2. MOMENT GENERATING FUNCTION The m.g.f. of a random variable X is defined as M(t) = E[e^(tX)] for t ∈ (-h, h), for some h > 0.

  3. Properties of m.g.f. • M(0) = E[1] = 1 • If a r.v. X has m.g.f. M(t), then Y = aX + b has m.g.f. M_Y(t) = e^(bt) M(at) • The m.g.f. does not always exist (e.g. the Cauchy distribution has none)
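The two properties above can be checked numerically. A minimal sketch, using a Bernoulli(p) variable with the assumed example value p = 0.3 (its m.g.f., M(t) = 1 - p + p·e^t, is a standard result, not given on the slide):

```python
import math

p = 0.3
M = lambda t: 1 - p + p * math.exp(t)   # m.g.f. of Bernoulli(p)

# Property 1: M(0) = E[1] = 1
assert abs(M(0) - 1.0) < 1e-12

# E[X] = M'(0), approximated by a central difference
h = 1e-6
m1 = (M(h) - M(-h)) / (2 * h)
print(round(m1, 6))  # ≈ p = 0.3

# Property 2: Y = aX + b has m.g.f. M_Y(t) = e^(bt) * M(at)
a, b = 2.0, 1.0
M_Y = lambda t: math.exp(b * t) * M(a * t)
m1_Y = (M_Y(h) - M_Y(-h)) / (2 * h)
print(round(m1_Y, 6))  # ≈ a*E[X] + b = 1.6
```

The derivative check works because moments are derivatives of the m.g.f. at t = 0, which is what makes it "moment generating".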

  4. Example • Suppose that X has the following p.d.f. Find the m.g.f., expectation, and variance.

  5. CHARACTERISTIC FUNCTION The c.h.f. of a random variable X is defined as φ(t) = E[e^(itX)] for all real numbers t. The c.h.f. always exists.

  6. Uniqueness Theorem: • If two r.v.s have m.g.f.s that exist and are equal, then they have the same distribution. • If two r.v.s have the same distribution, then they have the same m.g.f. (if it exists). Similar statements hold for the c.h.f.

  7. STATISTICAL DISTRIBUTIONS

  8. SOME DISCRETE PROBABILITY DISTRIBUTIONS Degenerate, Uniform, Bernoulli, Binomial, Poisson, Negative Binomial, Geometric, Hypergeometric

  9. DEGENERATE DISTRIBUTION • An rv X is degenerate at point k if P(X = k) = 1. The cdf: F(x) = 0 for x < k and F(x) = 1 for x ≥ k.

  10. UNIFORM DISTRIBUTION • A finite number of equally spaced values are equally likely to be observed. • Example: throw a fair die. P(X=1)=…=P(X=6)=1/6

  11. BERNOULLI DISTRIBUTION • A Bernoulli trial is an experiment with only two outcomes. An r.v. X has a Bernoulli(p) distribution if P(X = 1) = p and P(X = 0) = 1 - p, where 0 ≤ p ≤ 1.

  12. BINOMIAL DISTRIBUTION • Define an rv Y as the total number of successes in n Bernoulli trials, where: • 1. There are n trials (n is finite and fixed). • 2. Each trial can result in a success or a failure. • 3. The probability p of success is the same for all the trials. • 4. All the trials of the experiment are independent.

  13. BINOMIAL DISTRIBUTION • Example: • There are black and white balls in a box. Select and record the color of the ball. Put it back and re-pick (sampling with replacement). • n: number of independent and identical trials • p: probability of success (e.g. probability of picking a black ball) • X: number of successes in n trials

  14. BINOMIAL THEOREM • For any real numbers x and y and integer n > 0, (x + y)^n = Σ_{k=0}^{n} C(n, k) x^k y^(n-k), where C(n, k) = n!/(k!(n-k)!).

  15. BINOMIAL DISTRIBUTION • If Y ~ Bin(n, p), then P(Y = y) = C(n, y) p^y (1 - p)^(n-y) for y = 0, 1, ..., n, with E(Y) = np and Var(Y) = np(1 - p).
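The p.m.f. and moment formulas above can be verified directly. A short sketch with the assumed example values n = 10, p = 0.4 (not from the slides):

```python
from math import comb

n, p = 10, 0.4

def binom_pmf(y):
    # P(Y = y) = C(n, y) p^y (1-p)^(n-y)
    return comb(n, y) * p**y * (1 - p)**(n - y)

probs = [binom_pmf(y) for y in range(n + 1)]
assert abs(sum(probs) - 1.0) < 1e-12          # p.m.f. sums to 1

# Check E(Y) = np = 4 and Var(Y) = np(1-p) = 2.4 by direct summation
mean = sum(y * q for y, q in zip(range(n + 1), probs))
var = sum((y - mean) ** 2 * q for y, q in zip(range(n + 1), probs))
print(round(mean, 6), round(var, 6))  # ≈ 4.0 and 2.4
```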

  16. POISSON DISTRIBUTION • The number of occurrences in a given time interval can be modeled by the Poisson distribution. • e.g. waiting for bus, waiting for customers to arrive in a bank. • Another application is in spatial distributions. • e.g. modeling the distribution of bomb hits in an area or the distribution of fish in a lake.

  17. POISSON DISTRIBUTION • If X ~ Poi(λ), then P(X = x) = e^(-λ) λ^x / x! for x = 0, 1, 2, ..., and E(X) = Var(X) = λ.

  18. Relationship between Binomial and Poisson Let λ = np. The m.g.f. of Poisson(λ) is exp(λ(e^t - 1)). The m.g.f. of Bin(n, p) is (1 - p + pe^t)^n = (1 + λ(e^t - 1)/n)^n, which converges to exp(λ(e^t - 1)) as n → ∞. So the limiting distribution of the Binomial rv (n → ∞, p → 0 with np = λ fixed) is the Poisson distribution.

  19. NEGATIVE BINOMIAL DISTRIBUTION (PASCAL OR WAITING TIME DISTRIBUTION) • X: number of failures observed before the r-th success in a sequence of Bernoulli trials; or, alternatively, • Y: total number of Bernoulli trials required to get a fixed number of successes, such as r successes.

  20. NEGATIVE BINOMIAL DISTRIBUTION (PASCAL OR WAITING TIME DISTRIBUTION) X ~ NB(r, p): P(X = x) = C(x + r - 1, x) p^r (1 - p)^x for x = 0, 1, 2, ..., with E(X) = r(1 - p)/p and Var(X) = r(1 - p)/p^2.
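A quick numerical check of the NB(r, p) p.m.f. in the "failures before the r-th success" form, using the assumed example values r = 3, p = 0.5 (the infinite sum is truncated where the tail is negligible):

```python
from math import comb

r, p = 3, 0.5

def nb_pmf(x):
    # P(X = x) = C(x+r-1, x) p^r (1-p)^x
    return comb(x + r - 1, x) * p**r * (1 - p)**x

# Truncated sums over x = 0..199; the remaining tail is ~0.5^200
total = sum(nb_pmf(x) for x in range(200))
mean = sum(x * nb_pmf(x) for x in range(200))
print(round(total, 6), round(mean, 6))  # ≈ 1.0 and r(1-p)/p = 3.0
```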

  21. NEGATIVE BINOMIAL DISTRIBUTION • An alternative form of the pdf, in terms of the total number of trials Y = X + r: P(Y = y) = C(y - 1, r - 1) p^r (1 - p)^(y-r) for y = r, r + 1, ...

  22. GEOMETRIC DISTRIBUTION • Distribution of the number of Bernoulli trials required to get the first success. • It is the special case of the Negative Binomial Distribution with r = 1. X ~ Geometric(p): P(X = x) = p(1 - p)^(x-1) for x = 1, 2, ...

  23. GEOMETRIC DISTRIBUTION • Example: If the probability is 0.001 that a light bulb will fail on any given day, then what is the probability that it will last at least 30 days? • Solution: The bulb lasts at least 30 days if no failure occurs in the first 30 trials, so P(X > 30) = (1 - 0.001)^30 = 0.999^30 ≈ 0.97.
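The light-bulb example reduces to a one-line computation (assuming, as above, the reading "no failure in the first 30 days"):

```python
# Daily failure probability p = 0.001; survival past day 30 for a
# Geometric(p) waiting time is P(X > 30) = (1 - p)^30.
p = 0.001
prob = (1 - p) ** 30
print(round(prob, 4))  # ≈ 0.9704
```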

  24. HYPERGEOMETRIC DISTRIBUTION • A box contains N marbles. Of these, M are red. Suppose that n marbles are drawn randomly from the box without replacement. The distribution of the number of red marbles X is X ~ Hypergeometric(N, M, n): P(X = x) = C(M, x) C(N - M, n - x) / C(N, n). It deals with a finite population.
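A sketch of the hypergeometric p.m.f. with the assumed example values N = 20, M = 7, n = 5 (not from the slide), including a check of the known mean nM/N:

```python
from math import comb

N, M, n = 20, 7, 5

def hyper_pmf(x):
    # P(X = x) = C(M, x) C(N-M, n-x) / C(N, n)
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

probs = [hyper_pmf(x) for x in range(min(n, M) + 1)]
assert abs(sum(probs) - 1.0) < 1e-12

mean = sum(x * q for x, q in enumerate(probs))
print(round(mean, 4))  # E(X) = n*M/N = 1.75
```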

  25. SOME CONTINUOUS PROBABILITY DISTRIBUTIONS Uniform, Normal, Exponential, Gamma, Chi-Square, Beta Distributions

  26. Uniform Distribution • A random variable X is said to be uniformly distributed over [a, b] if its density function is f(x) = 1/(b - a) for a ≤ x ≤ b. • The expected value and the variance are E(X) = (a + b)/2 and Var(X) = (b - a)^2/12.

  27. Uniform Distribution • Example 1 • The daily sale of gasoline is uniformly distributed between 2,000 and 5,000 gallons. Find the probability that sales are: • Between 2,500 and 3,000 gallons • More than 4,000 gallons • Exactly 2,500 gallons f(x) = 1/(5000 - 2000) = 1/3000 for x in [2000, 5000]. P(2500 ≤ X ≤ 3000) = (3000 - 2500)(1/3000) = .1667

  28. Uniform Distribution • Example 1 - continued • More than 4,000 gallons: P(X ≥ 4000) = (5000 - 4000)(1/3000) = .333

  29. Uniform Distribution • Example 1 - continued • Exactly 2,500 gallons: P(X = 2500) = (2500 - 2500)(1/3000) = 0
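All three gasoline-sales probabilities follow from one rule: for a uniform density, probability is just interval length times 1/(b - a). A minimal sketch:

```python
# Uniform(2000, 5000): f(x) = 1/3000 on [2000, 5000]
a, b = 2000.0, 5000.0
density = 1.0 / (b - a)

def unif_prob(lo, hi):
    # P(lo <= X <= hi) for an interval inside [a, b]
    return (hi - lo) * density

print(round(unif_prob(2500, 3000), 4))  # 0.1667
print(round(unif_prob(4000, 5000), 4))  # 0.3333
print(unif_prob(2500, 2500))            # 0.0 -- a single point has zero probability
```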

  30. Normal Distribution • This is the most popular continuous distribution. • Many distributions can be approximated by a normal distribution. • The normal distribution is the cornerstone distribution of statistical inference.

  31. Normal Distribution • A random variable X with mean μ and variance σ² is normally distributed if its probability density function is given by f(x) = (1/(σ√(2π))) e^(-(x-μ)²/(2σ²)), -∞ < x < ∞.

  32. The Shape of the Normal Distribution The normal distribution is bell shaped, and symmetrical around μ. Why symmetrical? Let μ = 100. Then f(110) = f(90), since x = 110 and x = 90 lie the same distance from the mean and the density depends on x only through (x - μ)².

  33. The Effects of μ and σ • How does the standard deviation affect the shape of f(x)? Larger σ (e.g. σ = 2, 3, 4) makes the curve flatter and wider. • How does the expected value affect the location of f(x)? Increasing μ (e.g. μ = 10, 11, 12) shifts the curve to the right without changing its shape.

  34. Finding Normal Probabilities • Two facts help calculate normal probabilities: • The normal distribution is symmetrical. • Any normal distribution can be transformed into a specific normal distribution called… “STANDARD NORMAL DISTRIBUTION” Example The amount of time it takes to assemble a computer is normally distributed, with a mean of 50 minutes and a standard deviation of 10 minutes. What is the probability that a computer is assembled in a time between 45 and 60 minutes?

  35. STANDARD NORMAL DISTRIBUTION • Normal distribution with mean 0 and variance 1. • If X ~ N(μ, σ²), then Z = (X - μ)/σ ~ N(0, 1). Note: Z is known as the z-score. • "~" means "distributed as".

  36. Finding Normal Probabilities • Solution • If X denotes the assembly time of a computer, we seek the probability P(45<X<60). • This probability can be calculated by creating a new normal variable, the standard normal variable Z = (X - μ)/σ, with E(Z) = 0 and V(Z) = 1. Every normal variable with some μ and σ can be transformed into this Z. Therefore, once probabilities for Z are calculated, probabilities of any normal variable can be found.
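The standardization step, and the assembly-time answer worked out on the following slides, can be reproduced without a table by using the closed form Φ(z) = (1 + erf(z/√2))/2 for the standard normal cdf:

```python
from math import erf, sqrt

def phi(z):
    # Standard normal cdf via the error function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Assembly-time example: X ~ N(50, 10^2), find P(45 < X < 60)
mu, sigma = 50.0, 10.0
z_lo, z_hi = (45 - mu) / sigma, (60 - mu) / sigma
prob = phi(z_hi) - phi(z_lo)
print(round(z_lo, 1), round(z_hi, 1))  # -0.5 1.0
print(round(prob, 4))                  # ≈ 0.5328, matching the table lookup
```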

  37. Standard normal probabilities Copied from Walck, C (2007) Handbook on Statistical Distributions for experimentalists

  38. Standard normal table 1

  39. Standard normal table 2

  40. Standard normal table 3

  41. Finding Normal Probabilities • Example - continued P(45<X<60) = P((45 - 50)/10 < (X - μ)/σ < (60 - 50)/10) = P(-0.5 < Z < 1) To complete the calculation we need to compute the probability under the standard normal distribution

  42. Using the Standard Normal Table • Standard normal probabilities P(0<Z<z0) have been calculated and are provided in a table. The tabulated probabilities correspond to the area between Z = 0 and some Z = z0 > 0.

  43. Finding Normal Probabilities • Example - continued P(45<X<60) = P((45 - 50)/10 < (X - μ)/σ < (60 - 50)/10) = P(-.5 < Z < 1) We need to find the shaded area between z0 = -.5 and z0 = 1

  44. Finding Normal Probabilities • Example - continued P(45<X<60) = P(-.5<Z<1) = P(-.5<Z<0) + P(0<Z<1), where the table gives P(0<Z<1) = .3413

  45. Finding Normal Probabilities • The symmetry of the normal distribution makes it possible to calculate probabilities for negative values of Z using the table as follows: P(-z0<Z<0) = P(0<Z<z0)

  46. Finding Normal Probabilities • Example - continued By symmetry, P(-.5<Z<0) = P(0<Z<.5) = .1915

  47. Finding Normal Probabilities • Example - continued P(-.5<Z<1) = P(-.5<Z<0) + P(0<Z<1) = .1915 + .3413 = .5328

  48. Finding Normal Probabilities • Example - continued By symmetry, P(Z>-0.5) = P(Z<0.5) = 0.6915, so P(Z<-0.5) = 1 - P(Z>-0.5) = 1 - 0.6915 = 0.3085

  49. Finding Normal Probabilities • Example • The rate of return (X) on an investment is normally distributed with a mean of 10% and standard deviation of (i) 5%, (ii) 10%. • What is the probability of losing money? (i) P(X<0) = P(Z < (0 - 10)/5) = P(Z < -2) = P(Z > 2) = 0.5 - P(0<Z<2) = 0.5 - .4772 = .0228

  50. Finding Normal Probabilities • Example - continued (ii) P(X<0) = P(Z < (0 - 10)/10) = P(Z < -1) = P(Z > 1) = 0.5 - P(0<Z<1) = 0.5 - .3413 = .1587
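Both loss probabilities from the investment example can be checked against the closed-form normal cdf rather than the table:

```python
from math import erf, sqrt

def phi(z):
    # Standard normal cdf: Phi(z) = (1 + erf(z/sqrt(2)))/2
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# X ~ N(10, sigma^2) in percent; losing money means X < 0
mu = 10.0
for sigma in (5.0, 10.0):
    print(sigma, round(phi((0 - mu) / sigma), 4))  # ≈ .0228 and .1587
```

The printed values reproduce the .0228 and .1587 obtained above from the standard normal table.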
