Atomic clock

Classification: Clock
Powered: Yes
Fuel source: Electricity
Industry: Telecommunications, science
Application: TAI, satellite navigation

An atomic clock is a clock device that uses an electronic transition frequency in the microwave, optical, or ultraviolet region of the electromagnetic spectrum of atoms as a frequency standard for its timekeeping element. Atomic clocks are the most accurate time and frequency standards known, and are used as primary standards for international time distribution services, to control the wave frequency of television broadcasts, and in global navigation satellite systems such as GPS.

The principle of operation of an atomic clock is not based on nuclear physics, but rather on atomic physics; it uses the microwave signal that electrons in atoms emit when they change energy levels. Early atomic clocks were based on masers at room temperature. Currently, the most accurate atomic clocks first cool the atoms to near absolute zero temperature by slowing them with lasers and probing them in atomic fountains in a microwave-filled cavity. An example of this is the NIST-F1 atomic clock, one of the national primary time and frequency standards of the United States.

The accuracy of an atomic clock depends on two factors. The first is the temperature of the sample atoms: colder atoms move much more slowly, allowing longer probe times. The second is the frequency and intrinsic width of the electronic transition. Higher frequencies and narrower lines increase the precision.

National standards agencies in many countries maintain a network of atomic clocks which are intercompared and kept synchronized to an accuracy of 10⁻⁹ seconds per day (approximately 1 part in 10¹⁴). These clocks collectively define a continuous and stable time scale, International Atomic Time (TAI). For civil time, another time scale is disseminated: Coordinated Universal Time (UTC). UTC is derived from TAI but is kept approximately synchronised, by means of leap seconds, with UT1, which is based on the actual rotation of the Earth with respect to the Sun (solar time).

History

The idea of using atomic transitions to measure time was suggested by Lord Kelvin in 1879. Magnetic resonance, developed in the 1930s by Isidor Rabi, became the practical method for doing this. In 1945, Rabi first publicly suggested that atomic beam magnetic resonance might be used as the basis of a clock. The first atomic clock was an ammonia maser device built in 1949 at the U.S. National Bureau of Standards (NBS, now NIST). It was less accurate than existing quartz clocks, but served to demonstrate the concept. The first accurate atomic clock, a caesium standard based on a certain transition of the caesium-133 atom, was built by Louis Essen and Jack Parry in 1955 at the National Physical Laboratory in the UK. Calibration of the caesium standard atomic clock was carried out by the use of the astronomical time scale ephemeris time (ET). This led to the internationally agreed definition of the latest SI second being based on atomic time. Equality of the ET second with the (atomic clock) SI second has been verified to within 1 part in 10¹⁰. The SI second thus inherits the effect of decisions by the original designers of the ephemeris time scale, determining the length of the ET second.

Since the beginning of development in the 1950s, atomic clocks have been based on the hyperfine transitions in hydrogen-1, caesium-133, and rubidium-87. The first commercial atomic clock was the Atomichron, manufactured by the National Company. More than 50 were sold between 1956 and 1960. This bulky and expensive instrument was subsequently replaced by much smaller rack-mountable devices, such as the Hewlett-Packard model 5060 caesium frequency standard, released in 1964.

In the late 1990s four factors contributed to major advances in clocks:

  • Laser cooling and trapping of atoms
  • So-called high-finesse Fabry–Pérot cavities for narrow laser line widths
  • Precision laser spectroscopy
  • Convenient counting of optical frequencies using optical combs

In August 2004, NIST scientists demonstrated a chip-scale atomic clock. According to the researchers, the clock was believed to be one-hundredth the size of any other. It requires no more than 125 mW, making it suitable for battery-driven applications. This technology became available commercially in 2011. Ion trap experimental optical clocks are more precise than the current caesium standard.

As of March 2017, NASA planned to deploy the Deep Space Atomic Clock (DSAC), a miniaturized, ultra-precise mercury-ion atomic clock, into outer space. The DSAC is considered much more stable than other current navigational clocks.

    Mechanism

Since 1967, the International System of Units (SI) has defined the second as the duration of 9,192,631,770 cycles of radiation corresponding to the transition between two energy levels of the caesium-133 atom. In 1997, the International Committee for Weights and Measures (CIPM) added that the preceding definition refers to a caesium atom at rest at a temperature of 0 K.
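
As a minimal illustration of this definition (a sketch, not metrological practice), the exact transition frequency fixes the period of one cycle, and one second corresponds to counting exactly 9,192,631,770 of those cycles:

```python
# A minimal sketch of the SI definition quoted above: the caesium-133 hyperfine
# frequency is exact by definition, so the period of one cycle follows directly,
# and one second corresponds to exactly 9,192,631,770 such cycles.
CS133_HYPERFINE_HZ = 9_192_631_770   # Hz, exact by the definition of the SI second

period_s = 1 / CS133_HYPERFINE_HZ    # duration of one cycle, about 1.09e-10 s
cycles_in_one_second = 1.0 / period_s

print(f"one cycle lasts {period_s:.4e} s")
print(f"cycles counted in one second: {cycles_in_one_second:.0f}")
```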

    This definition makes the caesium oscillator the primary standard for time and frequency measurements, called the caesium standard. The definitions of other physical units, e.g., the volt and the metre, rely on the definition of the second.

    The actual time-reference of an atomic clock consists of an electronic oscillator operating at microwave frequency. The oscillator is arranged so that its frequency-determining components include an element that can be controlled by a feedback signal. The feedback signal keeps the oscillator tuned in resonance with the frequency of the electronic transition of caesium or rubidium.

The core of the atomic clock is a tunable microwave cavity containing a gas. In a hydrogen maser clock the gas emits microwaves (the gas mases) on a hyperfine transition, the field in the cavity oscillates, and the cavity is tuned for maximum microwave amplitude. Alternatively, in a caesium or rubidium clock, the beam or gas absorbs microwaves and the cavity contains an electronic amplifier to make it oscillate. For both types, the atoms in the gas are prepared in one electronic state before they enter the cavity. For the second type, the number of atoms that change electronic state is detected, and the cavity is tuned for a maximum of detected state changes.

    Most of the complexity of the clock lies in this adjustment process. The adjustment tries to correct for unwanted side-effects, such as frequencies from other electron transitions, temperature changes, and the spreading in frequencies caused by ensemble effects. One way of doing this is to sweep the microwave oscillator's frequency across a narrow range to generate a modulated signal at the detector. The detector's signal can then be demodulated to apply feedback to control long-term drift in the radio frequency. In this way, the quantum-mechanical properties of the atomic transition frequency of the caesium can be used to tune the microwave oscillator to the same frequency, except for a small amount of experimental error. When a clock is first turned on, it takes a while for the oscillator to stabilize. In practice, the feedback and monitoring mechanism is much more complex.
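
The sketch below illustrates this lock-in scheme in simplified form; the Lorentzian line shape, 100 Hz linewidth, loop gain and noise level are illustrative assumptions rather than the parameters of any particular clock:

```python
# A simplified sketch of the frequency-lock loop described above; all parameters
# are illustrative assumptions, not those of a real clock.
import random

F_ATOM = 9_192_631_770.0   # Hz, caesium transition frequency
LINEWIDTH = 100.0          # Hz, assumed width of the atomic resonance

def atomic_response(f_probe):
    """Fraction of atoms responding to a probe frequency (Lorentzian plus noise)."""
    x = (f_probe - F_ATOM) / (LINEWIDTH / 2)
    return 1.0 / (1.0 + x * x) + random.gauss(0.0, 0.001)

f_osc = F_ATOM + 30.0      # the oscillator starts 30 Hz off resonance
gain = 20.0                # loop gain (assumed)

for _ in range(200):
    low = atomic_response(f_osc - LINEWIDTH / 2)    # probe below the current frequency
    high = atomic_response(f_osc + LINEWIDTH / 2)   # probe above it (the modulation step)
    error = high - low                              # demodulated error: zero when centred
    f_osc += gain * error                           # feedback steers the oscillator

print(f"residual offset after locking: {f_osc - F_ATOM:+.3f} Hz")
```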

    A number of other atomic clock schemes are in use for other purposes. Rubidium standard clocks are prized for their low cost, small size (commercial standards are as small as 17 cm3) and short-term stability. They are used in many commercial, portable and aerospace applications. Hydrogen masers (often manufactured in Russia) have superior short-term stability compared to other standards, but lower long-term accuracy.

    Often, one standard is used to fix another. For example, some commercial applications use a rubidium standard periodically corrected by a global positioning system receiver (see GPS disciplined oscillator). This achieves excellent short-term accuracy, with long-term accuracy equal to (and traceable to) the U.S. national time standards.
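
A toy model of such a disciplined oscillator is sketched below; the drift rate and steering gain are assumed values chosen only to show how slow steering removes long-term drift while the local standard provides the short-term behaviour:

```python
# A toy model of a GPS-disciplined oscillator: the rubidium standard free-runs for
# short-term stability while gentle steering against GPS removes long-term drift.
# The drift rate and steering gain below are assumptions chosen for illustration.
RB_DRIFT_PER_STEP = 5e-13   # assumed fractional frequency drift added each step
STEER_GAIN = 0.1            # fraction of the measured offset corrected each step

frac_offset = 0.0           # fractional frequency offset of the disciplined output
for _ in range(1000):
    frac_offset += RB_DRIFT_PER_STEP        # local standard slowly drifts
    measured = frac_offset                  # offset measured against GPS (noise ignored)
    frac_offset -= STEER_GAIN * measured    # slow correction toward GPS time

print(f"steady-state fractional offset: {frac_offset:.1e}")   # ~ drift/gain = 5e-12
```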

    The lifetime of a standard is an important practical issue. Modern rubidium standard tubes last more than ten years, and can cost as little as US$50. Caesium reference tubes suitable for national standards currently last about seven years and cost about US$35,000. The long-term stability of hydrogen maser standards decreases because of changes in the cavity's properties over time.

    Modern clocks use magneto-optical traps to cool the atoms for improved precision.

    Physics package realisations

    A number of methods exist for utilizing hyperfine atomic transitions. These methods, with their respective benefits and drawbacks, have influenced the development of commercial devices and laboratory standards. By tradition, the hardware that is used to probe the atoms is called the physics package.

    Atomic beam standard

The atomic beam standard is a direct extension of the Stern–Gerlach atomic beam-splitting experiment. The atoms of choice are heated in an oven to create a gas, which is collimated into a beam. This beam, consisting of a mixture of atoms in two states, passes through a state-selector magnet A, where atoms in the wrong state are separated out of the beam. The remaining beam is exposed to an RF field at or near the transition frequency. The beam then passes through a drift region containing a weak, static, homogeneous magnetic field (produced by the C-field coil) before it is exposed to the RF field a second time. The two RF exposures, applied in the presence of the C-field, flip the state of the atoms with a probability that depends on how close the microwave frequency is to the atomic transition frequency. After the second RF exposure the beam passes through a second state-selector magnet B, where the atoms that did not change state (i.e. are still in the state selected by magnet A) are discarded. The number of atoms that survive magnet B is thus related to how well the microwaves match the atomic transition frequency. After the second state selector, an ionizer followed by a mass spectrometer detects the rate at which atoms arrive.
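
The probability that an atom flips state as a function of the microwave detuning traces out the Ramsey fringe pattern; the following simplified sketch assumes ideal π/2 interactions and an illustrative drift time:

```python
# Hedged sketch of the Ramsey fringes produced by the two RF interactions described
# above. For ideal pi/2 interactions separated by a free-drift time T, the flip
# probability is approximately cos^2(pi * (f - f0) * T). T is an assumed value.
import math

F0 = 9_192_631_770.0   # Hz, caesium transition frequency
T_DRIFT = 0.005        # s, assumed drift time between the two RF zones

def transition_probability(f_probe):
    """Central Ramsey fringe: maximal when the probe matches the atomic frequency."""
    return math.cos(math.pi * (f_probe - F0) * T_DRIFT) ** 2

for offset in (0.0, 25.0, 50.0, 100.0):   # Hz away from resonance
    print(f"{offset:6.1f} Hz off resonance -> P = {transition_probability(F0 + offset):.3f}")
# The central fringe width is about 1/(2*T_DRIFT) = 100 Hz here; slower atoms or a
# longer drift region (as in a fountain) give narrower fringes.
```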

Modern variants of this beam mechanism use optical pumping to transfer all atoms into the same state, rather than discarding half of them. Optical detection using scintillation can also be used.

    The most common isotope for beam devices is caesium (133Cs), but rubidium (87Rb) and thallium (205Tl) are examples of others used in early research.

For a beam device the frequency errors can be made very small, or can be predicted (such as the magnetic pulling caused by the C-field), so that a high degree of repeatability and stability is achieved. This is why an atomic beam can be used as a primary standard.

    Atomic gas cell standard

The atomic gas cell standard builds on a confined reference isotope (often an alkali metal such as rubidium (87Rb)) inside an RF cavity. The atoms are pumped into a common state using optical pumping; when the applied RF field is swept across the hyperfine transition, the gas absorbs the pumping light again, and a photodetector provides the response. The absorption peak steers the flywheel oscillator.

A typical rubidium gas cell uses a rubidium (87Rb) lamp heated to 108–110 °C and excited by an RF field to produce light, in which the D1 and D2 lines are the significant wavelengths. An 85Rb cell filters out the a component of the D1 and D2 lines so that only the b component pumps the 87Rb gas cell in the RF cavity.

Among the significant frequency-pulling mechanisms inherent to the gas cell are the wall shift, the buffer-gas shift, the cavity shift and the light shift. The wall shift occurs as the gas atoms collide with the wall of the glass container; it can be reduced by wall coatings and compensated by buffer gas. The buffer-gas shift arises from collisions between the reference atoms and buffer-gas atoms such as neon and argon; these shifts can be positive or negative. The cavity shift comes from the RF cavity, which can deform the resonance amplitude response; it depends on the cavity centre frequency and the resonator Q value. The light shift is an effect in which the frequency is pulled differently depending on the light intensity, which is often modulated by temperature changes of the rubidium lamp and filter cell.
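
These pulling mechanisms combine as fractional frequency offsets; the sketch below uses placeholder numbers purely to illustrate the bookkeeping:

```python
# Illustrative bookkeeping of the pulling mechanisms listed above; the numbers are
# placeholders, not measured data, and each term can have either sign.
shifts = {
    "wall shift": -2.0e-11,
    "buffer-gas shift": +1.5e-11,
    "cavity shift": -3.0e-12,
    "light shift": +5.0e-12,
}

total = sum(shifts.values())
print(f"total fractional frequency offset: {total:+.1e}")
# Temperature and ageing move these terms, which is why the gas cell is used as a
# secondary rather than a primary standard.
```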

There are thus many factors through which temperature and ageing can shift the frequency over time, which is why a gas cell standard is unfit to serve as a primary standard. It can, however, be a very inexpensive, low-power and small-size solution for a secondary standard, or wherever better stability than a crystal oscillator is needed without the full performance of a caesium beam standard. Rubidium gas standards have seen use in telecommunications systems and portable instruments.

    Active maser standard

The active maser standard is a development of the atomic beam standard in which the observation time is increased by using a bounce box. By controlling the beam intensity, a continuous self-sustained maser oscillation is achieved, which is tapped and used as a reference for a flywheel oscillator.

The active maser is sensitive to wall shift and cavity pulling. The wall shift is mitigated by using a PTFE (or other suitable) coating. The cavity pulling effect can be reduced by automatic cavity tuning. In addition, the magnetic field pulls the frequency.

    Although the long-term stability of the active maser is not as good as that of a caesium beam, it remains one of the most stable sources available. The inherent pulling effects make repeatability troublesome and prohibit its use as a primary standard, but it makes an excellent secondary standard. It is used as a low-noise flywheel standard for caesium beam standards.

    Fountain standard

    The fountain standard is a development from the beam standard where the beam is folded back to itself by the Earth's gravity, such that the first and second RF fields are applied during the atoms' upward and downward trips through the same RF cavity, essentially removing phase errors between the two cavities. The slow speed of the atoms also reduces black body temperature shifts. The length of the beam has the same practical limits on vacuum chamber size as a beam clock, but the laser-cooled atoms travel so much slower that the observation time increases about 100-fold (from roughly 10 ms to 1 s) and hence a much higher Q value is achieved in the Ramsey fringes. (The line width is reduced from about 50 Hz to about 1 Hz.)
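
A back-of-the-envelope sketch of the geometry described above, assuming a toss height of roughly one metre:

```python
# Back-of-the-envelope sketch of the fountain geometry described above; the toss
# height is an assumption, not the parameter of a specific clock.
import math

g = 9.81            # m/s^2
toss_height = 1.0   # m, assumed height of the atomic toss above the cavity

ramsey_time = 2 * math.sqrt(2 * toss_height / g)   # up-and-down flight time between passes
fringe_width = 1 / (2 * ramsey_time)               # approximate Ramsey fringe width, Hz

print(f"Ramsey time ~ {ramsey_time:.2f} s, fringe width ~ {fringe_width:.2f} Hz")
# Roughly 0.9 s and 0.6 Hz, versus ~10 ms and ~50 Hz for a thermal beam, consistent
# with the ~100-fold increase in observation time quoted above.
```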

    Caesium fountains have been implemented in many laboratories, but rubidium has even greater ability to provide stability in the fountain configuration.

    Ion trap standard

The ion trap standard covers a set of different approaches whose common property is that a cooled ion is confined in an electromagnetic trap. The hyperfine transition of the trapped ion is then tracked, in a manner similar to that of a gas cell standard.

Ion traps have been used with numerous ion species; 199Hg+ was an early candidate. Quantum logic spectroscopy of a single Al+ ion became the most precise approach in 2008. In 2010 an improved setup, using a Mg+ logic ion instead of Be+, was demonstrated.

    Power consumption

The power consumption of atomic clocks varies with their size. Atomic clocks on the scale of one chip require less than 30 milliwatts; primary frequency and time standards, such as the United States time standard atomic clocks NIST-F1 and NIST-F2, use far greater amounts of power.

    Evaluated accuracy

Evaluated accuracy (uB) reports for various primary frequency and time standards are published online by the International Bureau of Weights and Measures (BIPM). As of 2015, several frequency and time standards groups reported uB values in the 2 × 10−16 to 3 × 10−16 range.

In 2011, the NPL-CsF2 caesium fountain clock operated by the National Physical Laboratory (NPL), which serves as the United Kingdom's primary frequency and time standard, was improved with respect to the two largest sources of measurement uncertainty: distributed cavity phase and microwave lensing frequency shifts. This reduced the evaluated frequency uncertainty from uB = 4.1 × 10−16 to uB = 2.3 × 10−16, the lowest value for any primary national standard at the time. At this frequency uncertainty, the NPL-CsF2 is expected to neither gain nor lose a second in about 138 million years.
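
The 138-million-year figure follows directly from the fractional uncertainty; a quick check:

```python
# The arithmetic behind the 138-million-year figure quoted above: at a fractional
# uncertainty u_B the accumulated error reaches one second after about 1/u_B seconds.
u_B = 2.3e-16
seconds_per_year = 365.25 * 24 * 3600

years = 1 / u_B / seconds_per_year
print(f"~{years / 1e6:.0f} million years to gain or lose one second")   # ~138
```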

The NIST-F2 caesium fountain clock operated by the National Institute of Standards and Technology (NIST) was officially launched in April 2014 to serve as a new U.S. civilian frequency and time standard, alongside the NIST-F1 standard. The planned uB performance level of NIST-F2 is 1 × 10−16. "At this planned performance level the NIST-F2 clock will not lose a second in at least 300 million years." NIST-F2 was designed using lessons learned from NIST-F1. Its key advance over NIST-F1 is that the vertical flight tube is chilled inside a container of liquid nitrogen, at −193 °C (−315.4 °F). This cooling dramatically lowers the background radiation and thus reduces some of the very small measurement errors that must be corrected for in NIST-F1.

The first in-house accuracy evaluation of NIST-F2 reported a uB of 1.1 × 10−16. However, a published scientific criticism of that accuracy evaluation described problems in its treatment of distributed cavity phase shifts and of the microwave lensing frequency shift, which was treated significantly differently than in the majority of accurate fountain clock evaluations. The next NIST-F2 submission to the BIPM, in March 2015, reported a uB of 1.5 × 10−16 but did not address the standing criticism. Since then there have been neither subsequent reports to the BIPM from NIST-F2 nor a published updated accuracy evaluation.

At the request of the Italian standards organization, NIST fabricated many duplicate components for a second version of NIST-F2, known as IT-CsF2, to be operated by the Istituto Nazionale di Ricerca Metrologica (INRiM), NIST's counterpart in Turin, Italy. In the BIPM reports of evaluation of primary frequency standards for May, October and November 2016, the IT-CsF2 caesium fountain clock reported a uB of 1.7 × 10−16.

    Research

    Most research focuses on the often conflicting goals of making the clocks smaller, cheaper, more portable, more energy efficient, more accurate, more stable and more reliable. The Atomic Clock Ensemble in Space is an example of clock research.

    Secondary representations of the second

A list of frequencies recommended for secondary representations of the second has been maintained by the International Bureau of Weights and Measures (BIPM) since 2006 and is available online. The list contains the frequency values and the respective standard uncertainties for the rubidium microwave transition and for several optical transitions. These secondary frequency standards are accurate at the level of 10−18; however, the uncertainties given in the list are in the range 10−14 to 10−15, since they are limited by the link to the caesium primary standard that currently (2015) defines the second.

For context, a femtosecond (1 × 10−15 s) is to a second what a second is to about 31.71 million years, and an attosecond (1 × 10−18 s) is to a second what a second is to about 31.71 billion years.
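
The arithmetic behind this comparison, as a quick check:

```python
# The ratio 1 s / 1 fs equals the ratio X yr / 1 s for X of about 31.7 million years,
# and a thousand times that for an attosecond.
seconds_per_year = 365.25 * 24 * 3600

print(1e15 / seconds_per_year / 1e6)   # ~31.7: a femtosecond is to a second
                                       # as a second is to ~31.7 million years
print(1e18 / seconds_per_year / 1e9)   # ~31.7: likewise ~31.7 billion years for an attosecond
```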

21st-century experimental atomic clocks that provide non-caesium-based secondary representations of the second are becoming so precise that they are likely to be used as extremely sensitive detectors for quantities other than frequency and time. For example, the frequency of atomic clocks is altered slightly by gravity, magnetic fields, electric fields, force, motion, temperature and other phenomena. The experimental clocks continue to improve, and leadership in performance has shifted back and forth between various types of experimental clocks.

    Quantum clocks

    In March 2008, physicists at NIST described a quantum logic clock based on individual ions of beryllium and aluminium. This clock was compared to NIST's mercury ion clock. These were the most accurate clocks that had been constructed, with neither clock gaining nor losing time at a rate that would exceed a second in over a billion years. In February 2010, NIST physicists described a second, enhanced version of the quantum logic clock based on individual ions of magnesium and aluminium. Considered the world's most precise clock in 2010 with a fractional frequency inaccuracy of 8.6 × 10−18, it offers more than twice the precision of the original.

    The accuracy of experimental quantum clocks has since been superseded by experimental optical lattice clocks based on strontium-87 and ytterbium-171.

    Optical clocks

The theoretical move from microwaves as the atomic "escapement" for clocks to light in the optical range (harder to measure, but offering better performance) earned John L. Hall and Theodor W. Hänsch the Nobel Prize in Physics in 2005. One of the 2012 physics laureates, David J. Wineland, is a pioneer in exploiting the properties of a single ion held in a trap to develop clocks of the highest stability.

    New technologies, such as femtosecond frequency combs, optical lattices, and quantum information, have enabled prototypes of next-generation atomic clocks. These clocks are based on optical rather than microwave transitions. A major obstacle to developing an optical clock is the difficulty of directly measuring optical frequencies. This problem has been solved with the development of self-referenced mode-locked lasers, commonly referred to as femtosecond frequency combs. Before the demonstration of the frequency comb in 2000, terahertz techniques were needed to bridge the gap between radio and optical frequencies, and the systems for doing so were cumbersome and complicated. With the refinement of the frequency comb, these measurements have become much more accessible and numerous optical clock systems are now being developed around the world.
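
Conceptually, each comb line sits at f_n = f_ceo + n·f_rep, where the repetition rate f_rep and the carrier-envelope offset f_ceo are both radio frequencies, so an optical frequency can be determined by counting a radio-frequency beat note against the nearest comb line. The sketch below uses round illustrative values, not the parameters of any real comb:

```python
# Hedged sketch of optical-to-radio frequency division with a self-referenced comb.
# The values of f_rep, f_ceo and the optical line are illustrative assumptions.
C = 299_792_458.0                 # speed of light, m/s

f_rep = 250e6                     # Hz, comb repetition rate (assumed)
f_ceo = 20e6                      # Hz, carrier-envelope offset frequency (assumed)
f_optical = C / 578e-9            # Hz, roughly the 578 nm ytterbium line mentioned below

n = round((f_optical - f_ceo) / f_rep)    # index of the nearest comb line
f_beat = f_optical - (f_ceo + n * f_rep)  # radio-frequency beat note actually counted

print(f"comb line n = {n}, beat note = {f_beat / 1e6:.1f} MHz")
# Measuring f_rep, f_ceo and the beat note with radio-frequency counters determines
# the optical frequency.
```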

As in the radio range, absorption spectroscopy is used to stabilize an oscillator, in this case a laser. When the optical frequency is divided down to a countable radio frequency using a femtosecond comb, the bandwidth of the phase noise is also divided by that factor. Although the bandwidth of laser phase noise is generally greater than that of stable microwave sources, after division it is less.

    The two primary systems under consideration for use in optical frequency standards are:

  • single ions isolated in an ion trap and
  • neutral atoms trapped in an optical lattice.

These two techniques allow the atoms or ions to be highly isolated from external perturbations, thus producing an extremely stable frequency reference.

    Atomic systems under consideration include Al+, Hg+/2+, Hg, Sr, Sr+/2+, In+/3+, Mg, Ca, Ca+, Yb+/2+/3+ and Yb.

    The rare-earth element ytterbium (Yb) is valued not so much for its mechanical properties but for its complement of internal energy levels. "A particular transition in Yb atoms, at a wavelength of 578 nm, currently provides one of the world's most accurate optical atomic frequency standards," said Marianna Safronova. The estimated amount of uncertainty achieved corresponds to a Yb clock uncertainty of about one second over the lifetime of the universe so far, 15 billion years, according to scientists at the Joint Quantum Institute (JQI) and the University of Delaware in December 2012.

In 2013 optical lattice clocks (OLCs) were shown to be as good as or better than caesium fountain clocks. Two optical lattice clocks containing about 10 000 atoms of strontium-87 were able to stay in synchrony with each other at a precision of at least 1.5 × 10−16, which was as accurate as the experiment could measure. These clocks have been shown to keep pace with all three of the caesium fountain clocks at the Paris Observatory. There are two reasons for the possibly better precision: first, the frequency is measured using light, which has a much higher frequency than microwaves, and second, by using many atoms, any errors are averaged.

Using ytterbium-171 atoms, a new record for stability, with a precision of 1.6 × 10−18 over a 7-hour period, was published on 22 August 2013. At this stability, the two optical lattice clocks working independently from each other used by the NIST research team would differ by less than a second over the age of the universe (13.8 × 109 years); this was 10 times better than previous experiments. The clocks rely on 10 000 ytterbium atoms cooled to 10 microkelvin and trapped in an optical lattice. A laser at 578 nm excites the atoms between two of their energy levels. Having established the stability of the clocks, the researchers are studying external influences and evaluating the remaining systematic uncertainties, in the hope of bringing the clock's accuracy down to the level of its stability. An improved optical lattice clock was described in a 2014 Nature paper.

In 2015 JILA evaluated the absolute frequency uncertainty of its latest strontium-87 optical lattice clock at 2.1 × 10−18, which corresponds to a measurable gravitational time dilation for an elevation change of 2 cm (0.79 in) on Earth, which according to JILA/NIST Fellow Jun Ye is "getting really close to being useful for relativistic geodesy". At this frequency uncertainty, the JILA optical lattice clock is expected to neither gain nor lose a second in more than 15 billion years.
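
The quoted 2 cm sensitivity follows from the gravitational redshift relation Δf/f ≈ gΔh/c²; a quick check:

```python
# The arithmetic behind the 2 cm statement above: the fractional gravitational
# frequency shift between two heights differing by dh is approximately g*dh/c^2.
g = 9.81             # m/s^2
c = 299_792_458.0    # m/s
dh = 0.02            # m, a 2 cm elevation change

print(f"fractional shift: {g * dh / c**2:.1e}")   # ~2.2e-18, comparable to the 2.1e-18 uncertainty
```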

Optical clocks are currently (2015) still primarily research projects, less mature than the rubidium and caesium microwave standards that regularly deliver time to the International Bureau of Weights and Measures (BIPM) for establishing International Atomic Time (TAI). As the experimental optical clocks move beyond their microwave counterparts in accuracy and stability, they are in a position to replace the current standard for time, the caesium fountain clock. In the future this might lead to a redefinition of the caesium-microwave-based SI second, and new dissemination techniques at the highest level of accuracy will then be required to transfer clock signals, techniques that can be used in both shorter-range and longer-range (frequency) comparisons between better clocks and to explore their fundamental limitations without significantly compromising their performance.

    Clock comparison techniques

In June 2015 the National Physical Laboratory (NPL) in Teddington, UK; the French department of Time–Space Reference Systems at the Paris Observatory (LNE-SYRTE); the German national metrology institute, the Physikalisch-Technische Bundesanstalt (PTB), in Braunschweig; and Italy's Istituto Nazionale di Ricerca Metrologica (INRiM) in Turin started tests to improve the accuracy of current state-of-the-art satellite comparisons by a factor of 10, although the comparisons will still be limited to about 1 × 10−16. These four European labs develop and host a variety of experimental optical clocks that harness different elements in different experimental set-ups, and want to compare their optical clocks against each other and check whether they agree.

In a next phase these labs aim to transmit comparison signals in the visible spectrum through fibre-optic cables. This will allow their experimental optical clocks to be compared with an accuracy similar to the expected accuracies of the optical clocks themselves. Some of these labs have already established fibre-optic links, and tests have begun on sections between Paris and Teddington, and between Paris and Braunschweig. Fibre-optic links between experimental optical clocks also exist between the American NIST lab and its partner lab JILA, both in Boulder, Colorado, but these span much shorter distances than the European network and connect just two labs. According to Fritz Riehle, a physicist at PTB, "Europe is in a unique position as it has a high density of the best clocks in the world".

In August 2016 the French LNE-SYRTE in Paris and the German PTB in Braunschweig reported the comparison and agreement of two fully independent experimental strontium lattice optical clocks in Paris and Braunschweig at an uncertainty of 5 × 10−17, via a newly established phase-coherent frequency link connecting the two cities over 1,415 km (879 mi) of telecom fibre-optic cable. The fractional uncertainty of the whole link was assessed to be 2.5 × 10−19, making comparisons of even more accurate clocks possible.

    Applications

The development of atomic clocks has led to many scientific and technological advances, such as precise global and regional satellite navigation systems, and applications in the Internet, which depend critically on frequency and time standards. Atomic clocks are installed at the sites of time signal radio transmitters. They are used at some long-wave and medium-wave broadcasting stations to deliver a very precise carrier frequency. Atomic clocks are used in many scientific disciplines, such as long-baseline interferometry in radio astronomy.

    The Global Positioning System (GPS) operated by the US Air Force Space Command provides very accurate timing and frequency signals. A GPS receiver works by measuring the relative time delay of signals from a minimum of four, but usually more, GPS satellites, each of which has at least two onboard caesium and as many as two rubidium atomic clocks. The relative times are mathematically transformed into three absolute spatial coordinates and one absolute time coordinate. GPS Time (GPST) is a continuous time scale and theoretically accurate to about 14 ns. However, most receivers lose accuracy in the interpretation of the signals and are only accurate to 100 ns. The GPST is related to but differs from TAI (International Atomic Time) and UTC (Coordinated Universal Time). GPST remains at a constant offset with TAI (TAI – GPST = 19 seconds) and like TAI does not implement leap seconds. Periodic corrections are performed to the on-board clocks in the satellites to keep them synchronized with ground clocks. The GPS navigation message includes the difference between GPST and UTC. As of July 2015, GPST is 17 seconds ahead of UTC because of the leap second added to UTC on 30 June 2015. Receivers subtract this offset from GPS Time to calculate UTC and specific timezone values.
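
The offsets described above can be summarised in a small sketch; the function names are illustrative, not part of any GPS receiver API:

```python
# Hedged sketch of the time-scale offsets described above; helper names are
# illustrative only.
TAI_MINUS_GPST = 19.0   # seconds, fixed by design of GPS Time

def tai_from_gpst(t_gpst):
    """TAI from a GPS Time reading: TAI - GPST is a constant 19 s."""
    return t_gpst + TAI_MINUS_GPST

def utc_from_gpst(t_gpst, gpst_minus_utc):
    """UTC from a GPS Time reading, using the leap-second offset broadcast in the
    navigation message (17 s as of July 2015)."""
    return t_gpst - gpst_minus_utc

t = 1_120_000_000.0     # an arbitrary GPS Time value, in seconds
print(tai_from_gpst(t) - t, t - utc_from_gpst(t, 17.0))   # 19.0 17.0
```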

    The GLObal NAvigation Satellite System (GLONASS) operated by the Russian Aerospace Defence Forces provides an alternative to the Global Positioning System (GPS) system and is the second navigational system in operation with global coverage and of comparable precision. GLONASS Time (GLONASST) is generated by the GLONASS Central Synchroniser and is typically better than 1,000 ns. Unlike GPS, the GLONASS time scale implements leap seconds, like UTC.

The Galileo Global Navigation Satellite System is operated by the European GNSS Agency and the European Space Agency. Galileo started offering global Early Operational Capability (EOC) on 15 December 2016, becoming the third, and the first non-military-operated, global navigation satellite system; it is expected to reach Full Operational Capability (FOC) in 2019. Six planned extra satellites still need to be added to achieve Galileo's FOC coverage constellation goal. Galileo System Time (GST) is a continuous time scale generated on the ground at the Galileo Control Centre in Fucino, Italy, by the Precise Timing Facility, based on averages of different atomic clocks; it is maintained by the Galileo Central Segment and synchronised with TAI with a nominal offset below 50 ns. According to the European GNSS Agency, Galileo offers 30 ns timing accuracy. Each Galileo satellite has two passive hydrogen masers and two rubidium atomic clocks for onboard timing. The Galileo navigation message includes the differences between GST, UTC and GPST (to promote interoperability).

    System under construction

The BeiDou-2 satellite navigation system is under construction as of 2017 and still has to add its planned extra satellites to reach the full-scale global coverage constellation. BeiDou Time (BDT) is a continuous time scale starting at 1 January 2006 0:00:00 UTC and is synchronised with UTC to within 100 ns. BeiDou became operational in China in December 2011, with 10 satellites in use, and began offering services to customers in the Asia-Pacific region in December 2012. The BeiDou global navigation system should be finished by 2020.

    Time signal radio transmitters

A radio clock is a clock that automatically synchronizes itself by means of government radio time signals received by a radio receiver. Many retailers market radio clocks inaccurately as atomic clocks; although the radio signals they receive originate from atomic clocks, they are not atomic clocks themselves. Normal low-cost consumer-grade receivers rely solely on the amplitude-modulated time signals and use narrow-band receivers (with a 10 Hz bandwidth), small ferrite loopstick antennas and circuits with non-optimal digital signal processing delay; they can therefore only be expected to determine the beginning of a second with a practical uncertainty of about ±0.1 second. This is sufficient for radio-controlled low-cost consumer-grade clocks and watches, which use standard-quality quartz oscillators for timekeeping between daily synchronization attempts: they are most accurate immediately after a successful synchronization and become less accurate until the next one. Instrument-grade time receivers provide higher accuracy. Such devices incur a transit delay of approximately 1 ms for every 300 kilometres (186 mi) of distance from the radio transmitter. Many governments operate transmitters for time-keeping purposes.
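
The quoted 1 ms per 300 km figure is simply the speed-of-light propagation delay:

```python
# The transit-delay rule of thumb above, made explicit: radio time signals travel at
# roughly the speed of light, so the delay scales with distance from the transmitter.
C_KM_PER_S = 299_792.458

def transit_delay_ms(distance_km):
    """Propagation delay in milliseconds for a given distance from the transmitter."""
    return distance_km / C_KM_PER_S * 1000

print(f"{transit_delay_ms(300):.2f} ms at 300 km")   # ~1 ms, as stated above
```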
