Overview of Time and Frequency Metrology
Michael Lombardi, lombardi@nist.gov
NIST Boulder Laboratories
Types of Time and Frequency Information
• Date and Time-of-Day: records when an event happened
• Time Interval: the duration between two events
• Frequency: the rate of a repetitive event
Two units of measurement in the International System (SI) apply to time and frequency metrology
• Second (s): the standard unit for time interval; one of the 7 base SI units
• Hertz (Hz): the standard unit for frequency (s⁻¹, events per second); one of the 21 SI units derived from the base units
The relationship between frequency and time
We can measure frequency to get time interval, and vice versa, because the relationship between frequency and time interval is known. Frequency is the reciprocal of time interval:

f = 1 / T

where T is the period of the signal in seconds, and f is the frequency in hertz. We can also express this as f = s⁻¹ (the notation used to define the hertz in the SI).
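To make the reciprocal relationship concrete, here is a minimal Python sketch (the function names are ours, purely illustrative):

```python
def period_from_frequency(f_hz):
    """Return the period T (in seconds) of a signal of frequency f (in hertz): T = 1 / f."""
    return 1.0 / f_hz

def frequency_from_period(t_s):
    """Return the frequency f (in hertz) of a signal with period T (in seconds): f = 1 / T."""
    return 1.0 / t_s

print(period_from_frequency(5e6))   # 5 MHz -> 2e-07 s (200 ns)
print(frequency_from_period(1e-3))  # 1 ms period -> 1000.0 Hz (1 kHz)
```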
Units of Time Interval
• second (s)
• millisecond (ms), 10⁻³ s
• microsecond (µs), 10⁻⁶ s
• nanosecond (ns), 10⁻⁹ s
• picosecond (ps), 10⁻¹² s
• femtosecond (fs), 10⁻¹⁵ s
Units of Frequency
• hertz (Hz), 1 event per second
• kilohertz (kHz), 10³ Hz
• megahertz (MHz), 10⁶ Hz
• gigahertz (GHz), 10⁹ Hz
Period
The period is the reciprocal of the frequency, and vice versa. Period is expressed in units of time.
Wavelength
The wavelength is the length of one complete wave cycle. Wavelength is expressed in units of length. A convenient rule of thumb:

wavelength in meters = 300 / frequency in MHz
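The rule of thumb follows from wavelength = c / f, with c ≈ 3 × 10⁸ m/s. A quick sketch (the helper name is ours):

```python
def wavelength_m(frequency_mhz):
    """Approximate wavelength in meters for a frequency in MHz (lambda = c / f, c ~= 3e8 m/s)."""
    return 300.0 / frequency_mhz

print(wavelength_m(100.0))  # 100 MHz -> 3.0 m
print(wavelength_m(5.0))    # 5 MHz -> 60.0 m
```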
We Use a Wide Range of Frequencies
"Everyday" frequencies in time and frequency metrology
Clocks and Oscillators
An oscillator is a device that produces a periodic event that repeats at a nearly constant rate. This rate is called the resonance frequency. A clock is a device that counts cycles of a frequency and records units of time interval, such as seconds, minutes, hours, and days. Thus, a clock consists of a frequency source (the oscillator), a counter, and an output device. A good example is a wristwatch. Most wristwatches contain a quartz oscillator that generates 32768 cycles per second. After the watch counts 32768 cycles, it records that one second has elapsed. Since the best clocks contain the best oscillators, the evolution of timekeeping has been a continual quest to find better and better oscillators.
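The oscillator-plus-counter idea can be sketched in a few lines of Python (the class and names are illustrative, not from the slides):

```python
class SimpleClock:
    """A clock = frequency source (assumed 32768 Hz watch crystal) + counter + output."""

    CYCLES_PER_SECOND = 32768  # resonance frequency of a typical watch oscillator

    def __init__(self):
        self.cycle_count = 0      # cycles counted since the last whole second
        self.elapsed_seconds = 0  # the "output device": recorded time interval

    def tick(self, cycles=1):
        """Count oscillator cycles; every 32768 cycles records one elapsed second."""
        self.cycle_count += cycles
        while self.cycle_count >= self.CYCLES_PER_SECOND:
            self.cycle_count -= self.CYCLES_PER_SECOND
            self.elapsed_seconds += 1

clock = SimpleClock()
clock.tick(32768 * 90)        # feed 90 seconds' worth of cycles
print(clock.elapsed_seconds)  # 90
```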
Synchronization & Syntonization
• Synchronization is the process of setting two or more clocks to the same time.
• Syntonization is the process of setting two or more clocks to the same frequency.
Clocks and oscillators keep getting better and better
The performance of time and frequency standards has improved by 13 orders of magnitude in the past 700 years, and by about 9 orders of magnitude (a factor of a billion) in the past 100 years.
What is a Time Scale?
An agreed-upon system for keeping time, based on a common definition of the second. Seconds are then counted to form longer time intervals such as minutes, hours, days, and years. Time scales serve as a reference for time-of-day, time interval, and frequency.
Coordinated Universal Time (UTC)
UTC is an internationally recognized atomic time scale based on the SI definition of the second. UTC is computed by the International Bureau of Weights and Measures (BIPM) in France as an average of data collected from over 250 atomic oscillators located at about 60 national metrology institutes. UTC is a paper time scale, computed by the BIPM after the data is collected. The national metrology institutes maintain their own time scales that operate continuously in real time. For example, CENAMEP maintains UTC(CNMP) and NIST maintains UTC(NIST). The BIPM publishes a monthly document called the Circular T that shows the recent difference between UTC and UTC(k), the real-time UTC maintained by each national metrology institute. The Circular T allows each laboratory to see its time offset with respect to UTC, and with respect to every laboratory that contributes to UTC. This means that all contributing labs around the world are continuously compared to each other.
UTC is the Official Reference for Time-of-Day
Clocks synchronized to UTC display the same second (and normally the same minute) all over the world. However, since UTC is used internationally, it ignores local conventions such as time zones and daylight saving time (DST). The UTC hour refers to the hour at the Prime Meridian, which passes through Greenwich, England. California time, for example, differs from UTC by either 7 or 8 hours, depending upon whether or not DST is in effect.
UTC is the Official Reference for Time Interval
• Time interval is the duration between two events. In time and frequency metrology, it is normally expressed in seconds or sub-second units (milliseconds, microseconds, nanoseconds, picoseconds).
• Since UTC is based on the SI definition of the second, all time interval measurements are referenced to its one-second pulses. By counting the pulses, time is kept.
• Timing systems are synchronized to UTC by using an On-Time Marker (OTM), a pulse or signal that coincides as closely as possible with the arrival of the UTC second. The uncertainty of the OTM indicates the time interval between its arrival and the UTC second.
UTC is the Official Reference for Frequency
• UTC runs at an extremely stable rate, with an uncertainty measured in parts in 10¹⁵. Therefore, it serves as the international reference for all frequency measurements.
How is the SI second defined?
Pendulums and quartz oscillators were never used to define the second; we went directly from astronomical to atomic time.
• Before 1956, the second was defined based on the length of the mean solar day. Called the mean solar second.
• From 1956 to 1967, the second was defined as a fraction of the tropical year. Called the ephemeris second.
• Since 1967, the second has been defined based on oscillations of the cesium atom. Called the atomic second, or cesium second.
The change to the cesium second in 1967 officially began the era of atomic timekeeping. Prior to 1967, time was kept by astronomical observations.
SI Definition of the Second
The duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.
=> Based on the cesium frequency measured by Markowitz/Hall (USNO) & Essen/Parry (NPL), 1958.
=> Ratified in the SI in 1967.
Atomic Time Scales
All atomic time scales are currently based on the cesium definition of the second.
• International Atomic Time (TAI): TAI runs at the same frequency as UTC (this frequency is determined by the BIPM), but is not corrected for leap seconds. TAI is seldom used by the general public; it is an "internal" time scale used by the BIPM and national laboratories such as NIST.
• Coordinated Universal Time (UTC): UTC is TAI corrected for leap seconds so that it stays within 0.9 seconds of UT1.
Leap Seconds
An integer second added to atomic time (UTC) to keep it within 0.9 seconds of the most widely used astronomical time scale (UT1). Leap seconds usually occur on June 30th or December 31st. On average, about 7 have been needed every 10 years, suggesting that the long-term frequency offset of the Earth is about 2 × 10⁻⁸. However, the Earth both speeds up and slows down, making the occurrence of leap seconds cyclical. No leap seconds were needed from 1999 through 2004, but one is scheduled for December 31, 2005. The biggest reason that so many leap seconds have been needed is that the atomic second (cesium) was defined relative to the ephemeris second (which served as the SI second in 1958), and not the mean solar second.
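The 2 × 10⁻⁸ figure can be checked with simple arithmetic (a sketch; the year length used is approximate):

```python
leap_seconds = 7                   # roughly 7 leap seconds are needed...
interval_s = 10 * 365.25 * 86400   # ...per 10 years, expressed in SI seconds (~3.16e8 s)

# Fractional frequency offset of the Earth implied by the leap-second rate:
offset = leap_seconds / interval_s
print(f"{offset:.1e}")             # ~2.2e-08
```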
Implementation of Leap Seconds
When a leap second occurs, one minute has 61 seconds. This effectively stops UTC for one second so that UT1 can catch up. The sequence is:
23 hours, 59 minutes, 59 seconds
23 hours, 59 minutes, 60 seconds
0 hours, 0 minutes, 0 seconds
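That 61-second minute can be generated mechanically (an illustrative sketch only; real systems rely on their time-scale libraries to insert the :60 label):

```python
def leap_second_minute():
    """Yield the second labels of the 61-second UTC minute containing a positive leap second."""
    for s in range(59, 61):   # ...:59, then the inserted :60
        yield f"23:59:{s:02d}"
    yield "00:00:00"          # UTC resumes at midnight

print(list(leap_second_minute()))  # ['23:59:59', '23:59:60', '00:00:00']
```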
Four Parts of a Calibration
• Device Under Test (DUT): can be a tuning fork, a stopwatch, or a timer; can be a quartz, rubidium, or cesium oscillator
• Traceable Reference: a transfer standard such as WWV, WWVB, LORAN, or GPS, or any reference that provides a link back to the SI
• Calibration Method: the measurement system and procedure used to collect data
• Uncertainty Analysis: statistics and data reduction
Calibration
A comparison between a reference and a device under test (DUT) that is conducted by collecting measurement data. Calibration results should include a statement of measurement uncertainty, and should establish a traceability chain back to the International System of Units (SI).
Test Uncertainty Ratio (TUR)
• The performance ratio between the reference and the device under test.
• United States Mil Spec 45662A (now obsolete) required a 4:1 TUR.
• ISO/IEC 17025 requires a complete uncertainty analysis. However, if a 10:1 TUR is maintained, the uncertainty analysis becomes much easier, since you don't have to worry as much about the uncertainty of the reference (it is "lost in the noise").
Frequency Accuracy (Offset)
The degree of conformity of a measured value to its definition at a given point in time. Accuracy tells us how closely an oscillator produces its nominal (nameplate) frequency.
What else do they call it?
• Frequency Offset
• Frequency Error
• Frequency Bias
• Frequency Difference
• Relative Frequency
• Fractional Frequency
• Accuracy
Resolution
The smallest unit that a measurement can determine. For example, if a 10-digit frequency counter is used to measure a 1 MHz signal, the resolution is 0.001 Hz, or 1 mHz:

1 000 000.001 <== 10-digit counter display

The "single shot" resolution is determined by the quality of the measurement system, but more resolution can usually be obtained by averaging.
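The gain from averaging can be demonstrated numerically: readings quantized to 1 mHz, when the measurement noise exceeds the quantization step, average out to reveal an offset smaller than the single-shot resolution (a simulation sketch; the values are invented for illustration):

```python
import random

TRUE_FREQ = 1_000_000.0004   # Hz: offset smaller than the 1 mHz single-shot resolution
RESOLUTION = 0.001           # Hz: 10-digit counter measuring a 1 MHz signal

def one_reading():
    """Simulate one counter reading: measurement noise, then quantization to 1 mHz."""
    noisy = TRUE_FREQ + random.gauss(0, 0.002)
    return round(noisy / RESOLUTION) * RESOLUTION

readings = [one_reading() for _ in range(10_000)]
print(sum(readings) / len(readings))  # averages close to 1000000.0004
```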
Estimating Frequency Offset (accuracy) in the Frequency Domain (a measurement made with respect to frequency)

f(offset) = (fmeasured − fnominal) / fnominal

• fmeasured is the reading from an instrument, such as a frequency counter
• fnominal is the frequency labeled on the oscillator's output
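In code, the frequency-domain estimate is a one-line calculation (function and variable names are ours):

```python
def frequency_offset(f_measured, f_nominal):
    """Dimensionless (fractional) frequency offset: (f_measured - f_nominal) / f_nominal."""
    return (f_measured - f_nominal) / f_nominal

# Example: a counter reads 10 000 000.02 Hz from a nominal 10 MHz oscillator.
print(f"{frequency_offset(10_000_000.02, 10_000_000.0):.1e}")  # 2.0e-09
```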
Phase
• The position of a point in time (instant) on a waveform cycle. A complete cycle is defined as the interval required for the waveform to return to its arbitrary initial value. One cycle constitutes 360° of phase. One radian of phase equals approximately 57.3°. Phase can also express the relative displacement between two corresponding features (for example, peaks or zero crossings) of two waveforms having the same frequency.
• In time and frequency metrology, the phase difference is usually stated in units of time, rather than in units of phase angle. What we often call a phase plot might more properly be called a time plot, or a time difference plot, but the concept is the same. The time interval for 1° of phase is inversely proportional to the frequency. If the frequency of a signal is given by f, then the time tdeg (in seconds) corresponding to 1° of phase is:

tdeg = 1 / (360 f) = T / 360

Therefore, a 1° phase shift on a 5 MHz signal corresponds to a time shift of about 556 picoseconds. The same answer can be obtained by taking the period of 5 MHz (200 nanoseconds) and dividing by 360.
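The tdeg relation is easy to confirm numerically (a minimal sketch):

```python
def time_per_degree(f_hz):
    """Time interval (seconds) corresponding to 1 degree of phase: t = 1 / (360 f) = T / 360."""
    return 1.0 / (360.0 * f_hz)

print(time_per_degree(5e6))  # ~5.56e-10 s, i.e. ~556 ps for a 5 MHz signal
```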
Phase Comparisons
• Used to estimate frequency offset in the time domain.
• Phase comparisons measure the change in phase (or phase deviation) of the DUT signal relative to the reference during a calibration. When expressed in time units, this quantity is sometimes called Δt, spoken as "delta-t", which simply means the change in time.
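A phase comparison reduces to a frequency offset estimate by dividing the accumulated time change by the measurement duration. This sketch assumes Δt is taken as the time the DUT gains relative to the reference (so a fast DUT gives a positive offset); sign conventions vary with how the comparison is wired:

```python
def offset_from_phase(delta_t_s, measurement_period_s):
    """Fractional frequency offset estimated in the time domain.

    delta_t_s: time accumulated by the DUT relative to the reference (seconds;
               positive means the DUT runs fast, under the convention assumed here).
    measurement_period_s: duration of the phase comparison (seconds).
    """
    return delta_t_s / measurement_period_s

# Example: the DUT gains 1 microsecond against the reference over one day.
print(f"{offset_from_phase(1e-6, 86400):.2e}")  # ~1.16e-11
```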