The **second** (symbol: **s**; abbreviated **s** or **sec**) is the base unit of time in the International System of Units (SI). It is qualitatively defined as the *second* division of the hour by sixty, the first division by sixty being the minute. The SI definition of the second is "the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom". Seconds may be measured using a mechanical, electrical, or atomic clock.

SI prefixes are combined with the word *second* to denote subdivisions of the second, *e.g.*, the millisecond (one thousandth of a second), the microsecond (one millionth of a second), and the nanosecond (one billionth of a second). Though SI prefixes may also be used to form multiples of the second such as kilosecond (one thousand seconds), such units are rarely used in practice. The more common larger non-SI units of time are not formed by powers of ten; instead, the second is multiplied by 60 to form a minute, which is multiplied by 60 to form an hour, which is multiplied by 24 to form a day.

The second is also the base unit of time in other systems of measurement: the centimetre–gram–second, metre–kilogram–second, metre–tonne–second, and foot–pound–second systems of units.

## International second

Under the International System of Units (via the International Committee for Weights and Measures, or CIPM), since 1967 the second has been defined as the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom. In 1997 CIPM added that the periods would be defined for a caesium atom at rest, and approaching the theoretical temperature of absolute zero (0 K), and in 1999, it included corrections from ambient radiation. Absolute zero implies no movement, and therefore zero external radiation effects (i.e., zero local electric and magnetic fields).
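The defining count can be turned around to give the duration of a single cycle of the caesium radiation, a quick sanity check in Python:

```python
# The SI second counts 9 192 631 770 cycles of the caesium-133
# hyperfine transition; one cycle therefore lasts about 1.09e-10 s.
CAESIUM_HZ = 9_192_631_770  # exact by definition since 1967

period = 1 / CAESIUM_HZ  # duration of one cycle, in seconds
print(f"one cycle ≈ {period:.4e} s")
```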

The second thus defined is consistent with the ephemeris second, which was based on astronomical measurements. (See History below.) The realization of the standard second is described briefly in a special publication from the National Institute of Standards and Technology, and in detail by the National Research Council of Canada.

## Time units

1 international second is equal to:

- 1⁄60 minute (but see also leap second)
- 1⁄3,600 hour
- 1⁄86,400 day (IAU system of units)
- 1⁄31,557,600 Julian year (IAU system of units)
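The factors in the list above follow from multiplying up through the non-decimal units; a minimal check:

```python
# Conversion factors implied by the list above (IAU system).
SECONDS_PER_MINUTE = 60
SECONDS_PER_HOUR = 60 * SECONDS_PER_MINUTE           # 3,600
SECONDS_PER_DAY = 24 * SECONDS_PER_HOUR              # 86,400
SECONDS_PER_JULIAN_YEAR = 365.25 * SECONDS_PER_DAY   # 31,557,600

print(SECONDS_PER_HOUR, SECONDS_PER_DAY, SECONDS_PER_JULIAN_YEAR)
```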

## Frequency units

- 1 second = 1⁄(1 hertz); more generally, (period of wave in seconds) = 1⁄(frequency of wave in hertz), where (period of wave) × (wavenumber) = 1⁄(velocity of wave) in seconds per metre (SI) or in kayser-seconds (CGS).
- 1 second = 1⁄(1 becquerel)
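The reciprocal relation between period and frequency can be sketched directly (the 50 Hz example is illustrative, not from the text above):

```python
def period_seconds(frequency_hz: float) -> float:
    """Period of a wave in seconds, the reciprocal of its frequency in hertz."""
    return 1.0 / frequency_hz

# e.g. a 50 Hz wave (European mains frequency) has a 20 ms period
print(period_seconds(50.0))
```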

## Early civilizations

Early civilizations constructed divisions of the day, but none used the term *second*, and none was a precursor to the modern second:

- Hellenistic astronomers Hipparchus (*c.* 150 BC) and Ptolemy (*c.* AD 150) subdivided the day into sixty parts (the sexagesimal system). They also used a mean hour (1⁄24 day); simple fractions of an hour (1⁄4, 2⁄3, etc.); and time-degrees (1⁄360 day, equivalent to four modern minutes).
- The Babylonians subdivided the day sexagesimally, i.e. by 1⁄60, by 1⁄60 of that, by 1⁄60 of that, etc., to at least six places after the sexagesimal point, a precision equivalent to better than 2 microseconds. The Babylonians did not use the hour, but did use a double-hour lasting 120 modern minutes, a time-degree lasting four modern minutes, and a barleycorn lasting 3 1⁄3 modern seconds (the *helek* of the modern Hebrew calendar). They did not sexagesimally subdivide these smaller units of time, and no sexagesimal unit of the day was ever used as an independent unit of time.

## Based on subdivisions of the moon cycle

The Persian scholar al-Biruni used the term *second*, and defined the division of time between new moons of certain specific weeks as a number of days, hours, minutes, seconds, thirds, and fourths after noon Sunday. The medieval scholar Roger Bacon later defined the division of time between full moons as a number of hours, minutes, seconds, thirds, and fourths (*horae*, *minuta*, *secunda*, *tertia*, and *quarta*) after noon on specified calendar dates. The *third* (1⁄60 of a second) remains in some languages, for example Polish (*tercja*) and Turkish (*salise*).

## Based on mechanical clocks

The earliest clocks to display seconds appeared during the last half of the 16th century. The second became accurately measurable with the development of mechanical clocks keeping *mean time*, as opposed to the *apparent time* displayed by sundials. The earliest spring-driven timepiece with a hand that marked seconds is an unsigned clock depicting Orpheus in the Fremersdorf collection, dated between 1560 and 1570. During the third quarter of the 16th century, Taqi al-Din built a clock with marks every 1⁄5 minute. In 1579, Jost Bürgi built a clock for William of Hesse that marked seconds. In 1581, Tycho Brahe redesigned clocks that displayed minutes at his observatory so they also displayed seconds. However, these clocks were not yet accurate enough to keep seconds reliably. In 1587, Tycho complained that his four clocks disagreed by plus or minus four seconds.

In 1644, Marin Mersenne calculated that a pendulum with a length of 39.1 inches (0.994 m) would have a period at one standard gravity of precisely two seconds, one second for a swing forward and one second for the return swing, enabling such a pendulum to tick in precise seconds.
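Mersenne's result follows from the small-angle pendulum formula T = 2π√(L/g); plugging in his 0.994 m length recovers a period very close to two seconds:

```python
import math

# Small-angle pendulum period T = 2*pi*sqrt(L/g); with Mersenne's
# length of 0.994 m the period comes out very close to two seconds.
g = 9.80665     # standard gravity, m/s^2
length = 0.994  # pendulum length, metres

period = 2 * math.pi * math.sqrt(length / g)
print(f"{period:.3f} s")
```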

In 1670, London clockmaker William Clement added this seconds pendulum to the original pendulum clock of Christiaan Huygens. From 1670 to 1680, Clement made many improvements to his clock and introduced the longcase or grandfather clock to the public. This clock used an anchor escapement mechanism with a seconds pendulum to display seconds in a small subdial. This mechanism required less power and caused less friction than the older verge escapement and was accurate enough to measure seconds reliably as one-sixtieth of a minute. Within a few years, most British precision clockmakers were producing longcase clocks and other clockmakers soon followed. Thus the second could now be reliably measured.

In 1832, Gauss proposed using the second as the base unit of time in his millimetre–milligram–second system of units. The British Association for the Advancement of Science (BAAS) in 1862 stated that "All men of science are agreed to use the second of mean solar time as the unit of time." BAAS formally proposed the CGS system in 1874, although this system was gradually replaced over the next 70 years by MKS units. Both the CGS and MKS systems used the same second as their base unit of time. MKS was adopted internationally during the 1940s, defining the second as 1⁄86,400 of a mean solar day.

## Based on a fraction of a year

In 1956, the second was redefined in terms of a *year* (the period of the Earth's revolution around the Sun) *for a particular epoch* because, by then, it had become recognized that the Earth's rotation on its own axis was not sufficiently uniform as a standard of time. The Earth's motion was described in Newcomb's Tables of the Sun (1895), which provided a formula for estimating the motion of the Sun relative to the epoch 1900 based on astronomical observations made between 1750 and 1892.

The second was thus defined as:

the fraction 1⁄31,556,925.9747 of the tropical year for 1900 January 0 at 12 hours ephemeris time.

This definition was ratified by the Eleventh General Conference on Weights and Measures in 1960, which also established the International System of Units.
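As a sanity check, dividing the defining second count by 86,400 recovers the familiar length of the tropical year in mean solar days:

```python
# The 1956/1960 definition puts 31,556,925.9747 ephemeris seconds in
# the 1900 tropical year; dividing by 86,400 s/day gives ~365.2422 days.
TROPICAL_YEAR_1900_S = 31_556_925.9747

days = TROPICAL_YEAR_1900_S / 86_400
print(f"{days:.7f} days")
```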

The *tropical year* in the 1960 definition was not measured but calculated from a formula describing a mean tropical year that decreased linearly over time, hence the curious reference to a specific *instantaneous* tropical year. This was in conformity with the ephemeris time scale adopted by the IAU in 1952. This definition brings the observed positions of the celestial bodies into accord with Newtonian dynamical theories of their motion. Specifically, those tables used for most of the 20th century were Newcomb's Tables of the Sun (used from 1900 through 1983) and Brown's Tables of the Moon (used from 1923 through 1983).

Thus, the 1960 SI definition abandoned any explicit relationship between the scientific second and the length of a day, as most people understand the term.

## Based on cesium microwave atomic clock

With the development of the atomic clock in the early 1960s, it was decided to use atomic time as the basis of the definition of the second, rather than the revolution of the Earth around the Sun.

Following several years of work, Louis Essen from the National Physical Laboratory (Teddington, England) and William Markowitz from the United States Naval Observatory (USNO) determined the relationship between the hyperfine transition frequency of the caesium atom and the ephemeris second. Using a common-view measurement method based on the received signals from radio station WWV, they determined the orbital motion of the Moon about the Earth, from which the apparent motion of the Sun could be inferred, in terms of time as measured by an atomic clock. They found that the second of ephemeris time (ET) had the duration of 9,192,631,770 ± 20 cycles of the chosen caesium frequency. As a result, in 1967 the *Thirteenth General Conference on Weights and Measures* defined the SI second of atomic time as:

the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom.

This SI second, referred to atomic time, was later verified to be in agreement, within 1 part in 10^{10}, with the second of ephemeris time as determined from lunar observations. (Nevertheless, this SI second was already, when adopted, a little shorter than the then-current value of the second of mean solar time.)
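The ±20-cycle spread in the Essen–Markowitz measurement quoted above corresponds to a relative uncertainty of about 2 parts in 10⁹, a quick sketch:

```python
# Essen and Markowitz measured the ephemeris second as
# 9,192,631,770 ± 20 caesium cycles; the ±20 corresponds to a
# relative uncertainty of roughly 2 parts in 10^9.
frequency = 9_192_631_770
uncertainty = 20

relative = uncertainty / frequency
print(f"{relative:.1e}")
```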

During the 1970s it was realized that gravitational time dilation caused the second produced by each atomic clock to differ depending on its altitude. A uniform second was produced by correcting the output of each atomic clock to mean sea level (the rotating geoid), lengthening the second by about 1×10^{−10}. This correction was applied at the beginning of 1977 and formalized in 1980. In relativistic terms, the SI second is defined as the proper time on the rotating geoid.
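The altitude dependence follows from the weak-field red-shift formula Δf/f ≈ gΔh/c²; a rough illustration (the 1 km figure is an example, not from the text above):

```python
# Weak-field gravitational time dilation: a clock raised by dh runs
# fast by roughly g*dh/c^2 in fractional terms -- about 1.1e-13 per
# kilometre of altitude, hence the correction to the geoid.
g = 9.80665        # standard gravity, m/s^2
c = 299_792_458.0  # speed of light, m/s
dh = 1000.0        # height difference, metres (1 km, illustrative)

shift = g * dh / c**2
print(f"{shift:.2e} per km")
```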

The definition of the second was later refined at the 1997 meeting of the BIPM to include the statement

This definition refers to a caesium atom at rest at a temperature of 0 K.

The revised definition seems to imply that the ideal atomic clock contains a single caesium atom at rest emitting a single frequency. In practice, however, the definition means that high-precision realizations of the second should compensate for the effects of the ambient temperature (black-body radiation) within which atomic clocks operate, and extrapolate accordingly to the value of the second at a temperature of absolute zero.

## Proposed: based on optical atomic clock

Today, the atomic clock operating in the microwave region is challenged by atomic clocks operating in the optical region. To quote Ludlow *et al.*, “In recent years, optical atomic clocks have become increasingly competitive in performance with their microwave counterparts. The overall accuracy of single-trapped-ion-based optical standards closely approaches that of the state-of-the-art caesium fountain standards. Large ensembles of ultracold alkaline earth atoms have provided impressive clock stability for short averaging times, surpassing that of single-ion-based systems. So far, interrogation of neutral-atom-based optical standards has been carried out primarily in free space, unavoidably including atomic motional effects that typically limit the overall system accuracy. An alternative approach is to explore the ultranarrow optical transitions of atoms held in an optical lattice. The atoms are tightly localized so that Doppler and photon-recoil related effects on the transition frequency are eliminated.”

The Canadian National Research Council attaches a "relative uncertainty" of 2.5×10^{−11} (limited by day-to-day and device-to-device reproducibility) to its atomic clock based upon the ^{127}I_{2} molecule, and advocates using an ^{88}Sr ion trap instead (relative uncertainty due to linewidth of 2.2×10^{−15}); see magneto-optical trap and the National Physical Laboratory's "Trapped ion optical frequency standards". Such uncertainties rival that of the NIST-F1 caesium atomic clock in the microwave region, estimated as a few parts in 10^{16} averaged over a day.

## SI multiples

SI prefixes are commonly used to measure time less than a second, but rarely for multiples of a second (which is known as metric time). Instead, the non-SI units minutes, hours, days, Julian years, Julian centuries, and Julian millennia are used.

Thus a megasecond is 11 days, 13 hours, 46 minutes and 40 seconds, which is roughly of the order of a week. A kilosecond is 16 minutes, 40 seconds, or the length of a short break. A gigasecond is 31.7 years, so typical human lifespans are 2 to 3 gigaseconds.
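The breakdowns above are straightforward to reproduce with repeated division (`breakdown` is an illustrative helper, not a standard function):

```python
def breakdown(total_seconds: int) -> tuple[int, int, int, int]:
    """Split a count of seconds into days, hours, minutes, seconds."""
    minutes, seconds = divmod(total_seconds, 60)
    hours, minutes = divmod(minutes, 60)
    days, hours = divmod(hours, 24)
    return days, hours, minutes, seconds

print(breakdown(10**3))  # kilosecond: 16 min 40 s
print(breakdown(10**6))  # megasecond: 11 d 13 h 46 min 40 s
```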

## Other current definitions

For specialized purposes, a second may be used as a unit of time in time scales where the precise length differs slightly from the SI definition. One such time scale is UT1, a form of universal time. McCarthy and Seidelmann refrain from stating that the SI second is the legal standard for timekeeping throughout the world, saying only that "over the years UTC [which ticks SI seconds] has become either the basis for legal time of many countries, or accepted as the *de facto* basis for standard civil time".