
Parameter, Statistic and Random Samples


ldahlke


Presentation Transcript


  1. Parameter, Statistic and Random Samples • A parameter is a number that describes the population. It is a fixed number, but in practice we do not know its value. • A statistic is a function of the sample data, i.e., it is a quantity whose value can be calculated from the sample data. It is a random variable with a distribution function. Statistics are used to make inferences about unknown population parameters. • The random variables X1, X2,…, Xn are said to form a (simple) random sample of size n if the Xi’s are independent random variables and each Xi has the same probability distribution. We say that the Xi’s are iid. week1

  2. Example – Sample Mean and Variance • Suppose X1, X2,…, Xn is a random sample of size n from a population with mean μ and variance σ2. • The sample mean is defined as X̄ = (1/n) Σi Xi. • The sample variance is defined as S2 = (1/(n − 1)) Σi (Xi − X̄)2.
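As a quick check, the two definitions above can be computed directly and compared against Python's standard library (the sample values here are made up purely for illustration):

```python
import statistics

# Hypothetical sample data, for illustration only
sample = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5]
n = len(sample)

# Sample mean: xbar = (1/n) * sum of the Xi
xbar = sum(sample) / n

# Sample variance: s2 = (1/(n-1)) * sum of squared deviations from xbar
s2 = sum((x - xbar) ** 2 for x in sample) / (n - 1)

# statistics.mean and statistics.variance implement the same formulas
print(xbar, s2)
```

Note that statistics.variance divides by n − 1 (the sample variance defined above), while statistics.pvariance divides by n.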

  3. Goals of Statistics • Estimate unknown parameters μ and σ2. • Measure the errors of these estimates. • Test whether the sample gives evidence that the parameters are (or are not) equal to certain values.

  4. Sampling Distribution of a Statistic • The sampling distribution of a statistic is the distribution of values taken by the statistic in all possible samples of the same size from the same population. • The distribution function of a statistic is NOT the same as the distribution of the original population that generated the original sample. • The form of the theoretical sampling distribution of a statistic will depend upon the distribution of the observable random variables in the sample.
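The idea of a sampling distribution can be made concrete by simulation. The sketch below (the Exponential(1) population, sample size, and replication count are arbitrary choices) repeatedly draws samples of the same size and records the sample mean; the recorded means have mean ≈ μ and variance ≈ σ2/n even though the population itself is not normal:

```python
import random
import statistics

random.seed(0)
mu, sigma2, n = 1.0, 1.0, 25   # Exponential(1) population: mean 1, variance 1

# Draw many samples of the same size and record the statistic (the sample mean)
means = []
for _ in range(20000):
    sample = [random.expovariate(1.0) for _ in range(n)]
    means.append(sum(sample) / n)

# The sampling distribution of the mean is centred at mu with variance sigma2/n
print(statistics.mean(means))      # close to 1
print(statistics.variance(means))  # close to 1/25 = 0.04
```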

  5. Sampling from a Normal population • Often we assume the random sample X1, X2,…, Xn is from a normal population with unknown mean μ and variance σ2. • Suppose we are interested in estimating μ and testing whether it is equal to a certain value. For this we need to know the probability distribution of the estimator of μ.

  6. Claim • Suppose X1, X2,…, Xn are i.i.d normal random variables with unknown mean μ and variance σ2. Then X̄ ~ N(μ, σ2/n), i.e., (X̄ − μ)/(σ/√n) ~ N(0, 1). • Proof: …

  7. Recall - The Chi Square distribution • If Z ~ N(0,1) then X = Z2 has a Chi-Square distribution with parameter 1, i.e., X ~ χ2(1). • This can be proved using the change-of-variable theorem for univariate random variables. • The moment generating function of X is MX(t) = (1 − 2t)−1/2 for t < 1/2. • If X1 ~ χ2(n1), …, Xk ~ χ2(nk), all independent, then X1 + … + Xk ~ χ2(n1 + … + nk). • Proof: …
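A simulation sketch of this construction (the degrees of freedom and replication count are arbitrary choices): a sum of squared independent N(0,1) draws behaves like a χ2 variable, whose mean equals its degrees of freedom n and whose variance is 2n:

```python
import random
import statistics

random.seed(1)
df = 3

# Each draw is Z1^2 + ... + Zdf^2 with the Zi independent N(0,1),
# so by the additivity property it is a chi-square(df) draw
draws = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))
         for _ in range(50000)]

print(statistics.mean(draws))      # chi-square(3) has mean 3
print(statistics.variance(draws))  # and variance 2*3 = 6
```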

  8. Claim • Suppose X1, X2,…, Xn are i.i.d normal random variables with mean μ and variance σ2. Then Zi = (Xi − μ)/σ, i = 1, 2, …, n, are independent standard normal variables, and Σi Zi2 = Σi (Xi − μ)2/σ2 ~ χ2(n). • Proof: …

  9. t distribution • Suppose Z ~ N(0,1) independent of X ~ χ2(n). Then T = Z/√(X/n) has a t distribution with n degrees of freedom, T ~ t(n). • Proof: …
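The construction above can be checked by simulation (the degrees of freedom are an arbitrary choice; the chi-square draw is built from squared normals as on the previous slides):

```python
import random
import statistics

random.seed(2)
df = 5

def t_draw():
    z = random.gauss(0.0, 1.0)                               # Z ~ N(0,1)
    x = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))  # X ~ chi-square(df)
    return z / (x / df) ** 0.5                               # T = Z / sqrt(X/df)

draws = [t_draw() for _ in range(50000)]

# t(df) is symmetric about 0; for df > 2 its variance is df/(df-2) > 1,
# reflecting its heavier-than-normal tails
print(statistics.mean(draws))
print(statistics.variance(draws))  # ≈ 5/3 for df = 5
```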

  10. Claim • Suppose X1, X2,…, Xn are i.i.d normal random variables with mean μ and variance σ2. Then (X̄ − μ)/(S/√n) ~ t(n − 1). • Proof: …

  11. F distribution • Suppose X ~ χ2(n) independent of Y ~ χ2(m). Then F = (X/n)/(Y/m) has an F distribution with n and m degrees of freedom, F ~ F(n, m).

  12. Properties of the F distribution • The F-distribution is a right skewed distribution, i.e., … • Can use Table 7 on page 796 to find percentiles of the F-distribution. • Example…
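A small simulation of the F construction (degrees of freedom chosen arbitrarily), illustrating the right skew: the simulated mean exceeds the median, and for m > 2 the mean of F(n, m) is m/(m − 2):

```python
import random

random.seed(4)
dn, dm = 4, 10

def chi2(df):
    # Chi-square(df) draw as a sum of df squared independent N(0,1) draws
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))

# F = (X/n) / (Y/m) with X ~ chi-square(n) independent of Y ~ chi-square(m)
draws = sorted((chi2(dn) / dn) / (chi2(dm) / dm) for _ in range(50000))

mean = sum(draws) / len(draws)
median = draws[len(draws) // 2]
print(mean)           # ≈ 10/8 = 1.25 for F(4, 10)
print(mean > median)  # right skew: mean above median
```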

  13. The Central Limit Theorem • Let X1, X2,… be a sequence of i.i.d random variables with E(Xi) = μ < ∞ and Var(Xi) = σ2 < ∞. Let Sn = X1 + X2 + … + Xn. Then, for −∞ < x < ∞, limn→∞ P((Sn − nμ)/(σ√n) ≤ x) = Ф(x), where Z is a standard normal random variable and Ф(x) is the cdf of the standard normal distribution. • This is equivalent to saying that Zn = (Sn − nμ)/(σ√n) converges in distribution to Z ~ N(0,1). • Also, (X̄n − μ)/(σ/√n) = √n(X̄n − μ)/σ converges in distribution to Z ~ N(0,1).

  14. Example • Suppose X1, X2,… are i.i.d random variables and each has the Poisson(3) distribution. So E(Xi) = V(Xi) = 3. • The CLT says that (X̄n − 3)/√(3/n) converges in distribution to N(0, 1) as n → ∞.
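A simulation sketch of this example (the sample size, number of replications, and the Poisson sampler via Knuth's method are illustrative choices): the standardized mean of a Poisson(3) sample lands at or below 1 about Ф(1) ≈ 0.84 of the time, as the CLT predicts:

```python
import math
import random
from statistics import NormalDist

random.seed(3)
lam, n, reps = 3.0, 200, 3000

def poisson(lam):
    # Knuth's method: count uniform draws until their product falls below e^{-lam}
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

frac = 0
for _ in range(reps):
    xbar = sum(poisson(lam) for _ in range(n)) / n
    z = (xbar - lam) / math.sqrt(lam / n)   # standardized sample mean
    frac += (z <= 1.0)
frac /= reps

print(frac, NormalDist().cdf(1.0))  # both ≈ 0.84
```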

  15. Examples • A very common application of the CLT is the Normal approximation to the Binomial distribution. • Suppose X1, X2,… are i.i.d random variables and each has the Bernoulli(p) distribution. So E(Xi) = p and V(Xi) = p(1 − p). • The CLT says that (X̄n − p)/√(p(1 − p)/n) converges in distribution to N(0, 1) as n → ∞. • Let Yn = X1 + … + Xn; then Yn has a Binomial(n, p) distribution. So for large n, Yn is approximately N(np, np(1 − p)). • Suppose we flip a biased coin 1000 times and the probability of heads on any one toss is 0.6. Find the probability of getting at least 550 heads. • Suppose we toss a coin 100 times and observe 60 heads. Is the coin fair?
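The two numerical questions above can be worked with Python's standard library. The sketch below uses the normal approximation with a continuity correction and, for the first question, compares it against the exact binomial tail computed in log space:

```python
import math
from statistics import NormalDist

# Biased coin: 1000 tosses, P(heads) = 0.6; want P(at least 550 heads)
n, p = 1000, 0.6
mu = n * p                          # 600
sigma = math.sqrt(n * p * (1 - p))  # ~15.49

# Normal approximation with continuity correction: P(Y >= 550) ≈ P(N >= 549.5)
approx = 1 - NormalDist(mu, sigma).cdf(549.5)

# Exact binomial tail, summing pmf values computed in log space to avoid underflow
def log_pmf(k):
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

exact = sum(math.exp(log_pmf(k)) for k in range(550, n + 1))
print(approx, exact)   # both ≈ 0.999

# Fair-coin question: 100 tosses, 60 heads; standardize under H0: p = 0.5
z = (60 - 50) / math.sqrt(100 * 0.5 * 0.5)   # z = 2.0
p_two_sided = 2 * (1 - NormalDist().cdf(z))  # ≈ 0.0455
print(z, p_two_sided)
```

The two-sided p-value for 60 heads is just below 0.05, so at the 5% level the data give mild evidence against the coin being fair.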
