
Physics 114: Lecture 7 Uncertainties in Measurement


Presentation Transcript


  1. Physics 114: Lecture 7 Uncertainties in Measurement Dale E. Gary NJIT Physics Department

  2. Some Terms
  • Accuracy—How close the measurements are to the “true” value (note that we may not always know the true value).
  • Precision—How close repeated measurements are to each other; a measure of the spread of the data points.
  • One can make measurements that are highly accurate (their mean is close to the true value) even though they may not be very precise (large spread of measurements). Conversely, one can make very precise measurements that are not accurate (see the numerical sketch below).
  • Errors—Deviations of measurements from the “true” value. Error here does not mean a blunder! Also referred to as uncertainties.
  • Systematic Errors—Deviations from the “true” value that are very reproducible, generally due to some uncorrected effect of an instrument or measurement technique. An example is reading a scale slightly off the vertical, which may systematically give a too-high or too-low reading.
  • Statistical, or Random, Errors—Fluctuations in measurements that make them both too high and too low, due to how precisely the measurement can be made, and which are amenable to reduction by repeated measurements.
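  A brief MATLAB sketch of the accuracy/precision distinction (not from the original slides; the true value of 10 and the error sizes are invented for illustration): one simulated data set is accurate but imprecise, the other is precise but carries a systematic offset.

    % Sketch only: assumed true value and error sizes are made up for this illustration.
    true_val = 10;
    acc_imprecise  = true_val + 2.0*randn(1,100);          % accurate (mean ~10) but imprecise (large spread)
    prec_inaccurate = (true_val + 0.5) + 0.05*randn(1,100); % precise (small spread) but inaccurate (offset +0.5)
    fprintf('Accurate but imprecise:  mean = %.2f, std = %.2f\n', mean(acc_imprecise), std(acc_imprecise));
    fprintf('Precise but inaccurate:  mean = %.2f, std = %.2f\n', mean(prec_inaccurate), std(prec_inaccurate));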

  3. Parent and Sample Distributions
  • Imagine a process for manufacturing ball bearings. Although each ball bearing is nominally the same, any process is going to cause slight deviations in shape, size, or other measure. If we measure the weight, say, of an infinite number of such ball bearings, these weight measurements will spread into a distribution around some mean value. This hypothetical, infinite distribution is called the parent distribution. The parent distribution’s spread depends, obviously, on how precise the manufacturing process is.
  • We can never measure an infinite number of ball bearings. Instead, we measure a smaller subset of ball bearings, and from this sample we again find that our measurements spread into a distribution around the sample mean. This finite distribution is called the sample distribution.
  • In the limit of an infinite sample, of course, the sample distribution should become the parent distribution (assuming we have no systematic errors).

  4. Example Sample & Parent Dist.
  • At the left are the sample distributions for a series of 16 sets of 50 measurements each:
    x = -4:0.5:4;
    for i = 1:16; subplot(4,4,i); hist(randn(1,50),x); axis([-4,4,0,20]); end
  • At the right is the sum of these 16 sets (equivalent to 800 measurements). Apparently 800 is close to infinity, since the sample distribution now is quite close to the parent distribution (red line). A possible reconstruction of this right-hand panel is sketched below.
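  The slide does not include the code for the right-hand panel, so the following is only a plausible reconstruction. It assumes the same randn data and a unit Gaussian parent curve, scaled by the total count (800) and the bin width (0.5) so the curve overlays the histogram counts:

    % Sketch (not from the slide): combine 16 sets of 50 measurements and overlay the parent curve.
    x = -4:0.5:4;
    data = randn(1,800);                        % 16 x 50 = 800 measurements
    figure; hist(data,x); hold on;
    xc = -4:0.01:4;
    parent = 800*0.5*exp(-xc.^2/2)/sqrt(2*pi);  % Gaussian parent, scaled by N and bin width
    plot(xc, parent, 'r', 'LineWidth', 2); hold off;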

  5. What To Do When the “True” Value is Unknown—The Mean
  • When we do not know the “true” value that we are comparing our sample to, we can take the mean of the measurements as an approximation of the “true” value: $\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i$ (a numerical check is sketched below).
  • Of course, the mean of the parent population is the limit of this as the number of measurements becomes infinite: $\mu = \lim_{N\to\infty} \frac{1}{N}\sum_{i=1}^{N} x_i$.
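  A quick numerical check (a sketch, not from the slides) that the sample mean approaches the parent mean, which is 0 for randn, as the sample size N grows:

    % Sketch: the sample mean converges toward the parent mean (0 for randn) as N increases.
    for N = [10 100 1000 100000]
        fprintf('N = %6d:  sample mean = %+.4f\n', N, mean(randn(1,N)));
    end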

  6. Probability and Median
  • The spread of values about the mean in the parent population (that is, the histogram) forms a function called a probability density function (PDF). We will be using this term many, many times during the course.
  • Its connection to probability is as follows: if you take the PDF and normalize its area, so that the area under the curve (the integral) is unity, then the integral over a restricted range $x_1$ to $x_2$ is the probability that a given measurement will fall in that range: $P(x_1 \le x \le x_2) = \int_{x_1}^{x_2} p(x)\,dx$ (see the numerical sketch below).
  • Notice that we use P(x) for the probability, and p(x) for the probability density (PDF).
  • The median ($\mu_{1/2}$) is the point where the probability is equal (i.e. 1/2) on each side: $\int_{-\infty}^{\mu_{1/2}} p(x)\,dx = \int_{\mu_{1/2}}^{\infty} p(x)\,dx = \frac{1}{2}$.
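  As a numerical illustration (not in the original slides), the probability that a measurement drawn from a unit Gaussian PDF falls between x1 = -1 and x2 = +1 can be checked in MATLAB:

    % Sketch: integrate a normalized Gaussian PDF over a restricted range.
    p = @(x) exp(-x.^2/2)/sqrt(2*pi);   % unit Gaussian PDF (area already normalized to 1)
    P = integral(p, -1, 1);             % probability of falling within +/- 1 sigma
    fprintf('P(-1 <= x <= 1) = %.4f\n', P);   % ~0.6827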

  7. Most Probable Value (Mode)
  • Most probability density functions (PDFs) have a single peak. The value of x at which they peak is the most probable value, or mode. This is the same as the mean for symmetric PDFs, but they can be quite different for asymmetric ones.
  • The most probable value is called $\mu_{max}$, and obeys $p(\mu_{max}) \ge p(x \ne \mu_{max})$.
  • Examples of when to use median vs. mean:
  • For a set of measurements (sample distribution) that follows the Gaussian (Normal) distribution, the mean and median are basically the same, so long as the sample is large.
  • However, the median is often preferred over the mean, as an estimate of the true value, in the presence of outliers.
  • Say we have a set of measurements x = 190. + randn(1,100); You can check that the mean and median are nearly identical (see the sketch below). Now say there was something wrong with the 42nd measurement (x(42) = 300.;). Now the median is nearly unchanged, but the mean is much higher.
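  A short MATLAB sketch of that check, using the slide’s own numbers (the printed values will vary from run to run):

    % Sketch: the median is robust to a single bad measurement; the mean is not.
    x = 190. + randn(1,100);
    fprintf('Before outlier: mean = %.2f, median = %.2f\n', mean(x), median(x));
    x(42) = 300.;                       % something wrong with the 42nd measurement
    fprintf('After outlier:  mean = %.2f, median = %.2f\n', mean(x), median(x));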

  8. Deviations and RMS
  • If the parent distribution mean is $\mu$, the deviations from the mean can be written $d_i = x_i - \mu$.
  • The average of the deviations, by virtue of the definition of the mean, must vanish: $\lim_{N\to\infty} \frac{1}{N}\sum_{i=1}^{N} (x_i - \mu) = 0$.
  • Still, we may want to know the average absolute deviation, i.e. not considering the sign of the deviation, just the amount: $\alpha = \lim_{N\to\infty} \frac{1}{N}\sum_{i=1}^{N} |x_i - \mu|$.
  • For computational purposes, it is better to define the square of the deviations, called the variance: $\sigma^2 = \lim_{N\to\infty} \frac{1}{N}\sum_{i=1}^{N} (x_i - \mu)^2$.
  • Then the standard deviation (also called RMS, or root-mean-square deviation) is the square root of the variance, $\sigma$.
  • To calculate the variance of the sample distribution, use $s^2 = \frac{1}{N-1}\sum_{i=1}^{N} (x_i - \bar{x})^2$ (see the sketch below).
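  A hedged MATLAB sketch (not from the slides): MATLAB’s var and std use the N-1 (sample) normalization by default, so they implement the last formula directly.

    % Sketch: sample variance and standard deviation of a set of measurements.
    x = 190. + randn(1,100);            % same kind of sample as on the previous slide
    N = numel(x);
    s2 = sum((x - mean(x)).^2)/(N-1);   % sample variance, by the formula above
    fprintf('s^2 by formula = %.4f,  var(x) = %.4f,  std(x) = %.4f\n', s2, var(x), std(x));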
