1 m is equal to:
- SI units: 1000 mm; 1×10⁻³ km
- imperial/US units: 1.0936 yd; 3.2808 ft; 39.370 in
The metre, or meter (American spelling; from the Greek noun μέτρον, "measure"), is the base unit of length in the International System of Units (SI). The SI unit symbol is m. The metre is defined as the distance travelled by light in a vacuum in 1/299 792 458 of a second.
- History of definition
- Pendulum definition
- Meridional definition
- Prototype metre bar
- Wavelength definition
- Speed of light definition
- SI prefixed forms of metre
- Equivalents in other units
The metre was originally defined in 1793 as one ten-millionth of the distance from the equator to the North Pole. In 1799, it was redefined in terms of a prototype metre bar (the actual bar used was changed in 1889). In 1960, the metre was redefined in terms of a certain number of wavelengths of a certain emission line of krypton-86. In 1983, the current definition was adopted.
The imperial inch is defined as 0.0254 metres (2.54 centimetres or 25.4 millimetres). One metre is about 3 3⁄8 inches longer than a yard, i.e. about 39 3⁄8 inches.
Metre is the standard spelling of the metric unit for length in nearly all English-speaking nations except the United States and the Philippines, which use meter.
Measuring devices (such as ammeter, speedometer) are spelled "-meter" in all variants of English. The suffix "-meter" has the same Greek origin as the unit of length.
The etymological roots of metre can be traced to the Greek verb μετρέω (metreo) (to measure, count or compare) and noun μέτρον (metron) (a measure), which were used for physical measurement, for poetic metre and by extension for moderation or avoiding extremism (as in "be measured in your response"). This range of uses is also found in Latin (metior, mensura), French (mètre, mesure), English and other languages. The motto ΜΕΤΡΩ ΧΡΩ (metro chro) in the seal of the International Bureau of Weights and Measures (BIPM), which was a saying of the Greek statesman and philosopher Pittacus of Mytilene and may be translated as "Use measure!", thus calls for both measurement and moderation.
History of definition
In 1668 the English cleric and philosopher John Wilkins proposed in an essay a decimal-based unit of length, the universal measure or standard based on a pendulum with a one-second period. In 1670 Gabriel Mouton, Bishop of Lyon, also suggested a universal length standard with decimal multiples and divisions, to be based on a one-minute angle of the Earth's meridian arc or (as the Earth's circumference was not easy to measure) on a pendulum with a one-second period. In 1675, the Italian scientist Tito Livio Burattini, in his work Misura Universale, used the phrase metro cattolico ("universal measure"), derived from the Greek μέτρον καθολικόν (métron katholikón), to denote the standard unit of length derived from a pendulum. As a result of the French Revolution, the French Academy of Sciences charged a commission with determining a single scale for all measures. On 7 October 1790 that commission advised the adoption of a decimal system, and on 19 March 1791 advised the adoption of the term mètre ("measure"), a basic unit of length, which they defined as equal to one ten-millionth of the distance between the North Pole and the Equator. In 1793, the French National Convention adopted the proposal; this use of metre in English began at least as early as 1797.
In 1668, Wilkins proposed using Christopher Wren's suggestion of defining the metre using a pendulum with a length which produced a half-period of one second, known as a 'seconds pendulum'. Christiaan Huygens had observed that length to be 38 Rijnland inches or 39.26 English inches. This is the equivalent of what is now known to be 997 mm. No official action was taken regarding this suggestion.
In the 18th century, there were two approaches to the definition of the standard unit of length. One favoured Wilkins's approach: to define the metre in terms of the length of a pendulum which produced a half-period of one second. The other approach was to define the metre as one ten-millionth (1/10 000 000) of the length of a quadrant along the Earth's meridian; that is, the distance from the Equator to the North Pole. This means that the quadrant (one quarter of the Earth's circumference) would have been defined as exactly 10 000 000 metres (10 000 km) at that time, with the total circumference of the Earth defined as 40 000 000 metres (40 000 km). In 1791, the French Academy of Sciences selected the meridional definition over the pendular definition because the force of gravity varies slightly over the surface of the Earth, which affects the period of a pendulum.
To establish a universally accepted foundation for the definition of the metre, more accurate measurements of this meridian were needed. The French Academy of Sciences commissioned an expedition led by Jean Baptiste Joseph Delambre and Pierre Méchain, lasting from 1792 to 1799, which attempted to accurately measure the distance between a belfry in Dunkerque and Montjuïc castle in Barcelona to estimate the length of the meridian arc through Dunkerque. This portion of the meridian, assumed to be the same length as the Paris meridian, was to serve as the basis for the length of the half meridian connecting the North Pole with the Equator. The problem with this approach is that the exact shape of the Earth is not a simple mathematical shape, such as a sphere or oblate spheroid, at the level of precision required for defining a standard of length. The irregular, particular shape of the Earth smoothed to sea level is called the geoid, which literally means "Earth-shaped"; it is a mathematical model of the Earth's shape rather than its actual surface. Despite these issues, in 1793 France adopted this definition of the metre as its official unit of length based on provisional results from this expedition.
Prototype metre bar
In the 1870s and in light of modern precision, a series of international conferences was held to devise new metric standards. The Metre Convention (Convention du Mètre) of 1875 mandated the establishment of a permanent International Bureau of Weights and Measures (BIPM: Bureau International des Poids et Mesures) to be located in Sèvres, France. This new organisation was to construct and preserve a prototype metre bar, distribute national metric prototypes, and maintain comparisons between them and non-metric measurement standards. The organisation created such a bar in 1889 at the first General Conference on Weights and Measures (CGPM: Conférence Générale des Poids et Mesures), establishing the International Prototype Metre as the distance between two lines on a standard bar composed of an alloy of 90% platinum and 10% iridium, measured at the melting point of ice.
However, it was later determined that the first prototype metre bar was short by about 200 micrometres because of miscalculation of the flattening of the Earth, making the prototype about 0.02% shorter than the original proposed definition of the metre. Regardless, this length became the standard. The original international prototype of the metre is still kept at the BIPM under the conditions specified in 1889.
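The size of that shortfall can be checked with simple arithmetic. The sketch below uses a modern estimate of the meridional quadrant (about 10 001 965.7 m on the WGS84 ellipsoid), a figure not given in the text:

```python
# The 1791 definition intended: 1 metre = quadrant / 10,000,000.
# Modern estimate of the meridional quadrant (WGS84 ellipsoid), in metres:
MODERN_QUADRANT_M = 10_001_965.7

intended_metre = MODERN_QUADRANT_M / 10_000_000   # what the definition aimed at
shortfall_um = (intended_metre - 1.0) * 1e6       # prototype shortfall, micrometres
shortfall_pct = (intended_metre - 1.0) * 100      # as a percentage

print(f"{shortfall_um:.0f} um short")    # ≈ 197 um, i.e. "about 200 micrometres"
print(f"{shortfall_pct:.3f} % short")    # ≈ 0.020 %, matching the text
```

The result is consistent with both figures quoted above: roughly 200 micrometres, or about 0.02% of a metre.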
In 1893, the standard metre was first measured with an interferometer by Albert A. Michelson, the inventor of the device and an advocate of using some particular wavelength of light as a standard of length. By 1925, interferometry was in regular use at the BIPM. However, the International Prototype Metre remained the standard until 1960, when the eleventh CGPM defined the metre in the new International System of Units (SI) as equal to 1 650 763.73 wavelengths of the orange-red emission line in the electromagnetic spectrum of the krypton-86 atom in a vacuum.
Speed of light definition
To further reduce uncertainty, the 17th CGPM in 1983 replaced the definition of the metre with its current definition, thus fixing the length of the metre in terms of the second and the speed of light: the metre is the length of the path travelled by light in vacuum during a time interval of 1/299 792 458 of a second.
This definition fixed the speed of light in vacuum at exactly 299 792 458 metres per second (≈300 000 km/s). An intended by-product of the 17th CGPM's definition was that it enabled scientists to compare lasers accurately using frequency, resulting in wavelengths with one-fifth the uncertainty involved in the direct comparison of wavelengths, because interferometer errors were eliminated. To further facilitate reproducibility from lab to lab, the 17th CGPM also made the iodine-stabilised helium–neon laser "a recommended radiation" for realising the metre. For the purpose of delineating the metre, the BIPM currently considers the HeNe laser wavelength, λHeNe, to be 632.99121258 nm with an estimated relative standard uncertainty (U) of 2.1×10⁻¹¹. This uncertainty is currently one limiting factor in laboratory realisations of the metre, and it is several orders of magnitude poorer than that of the second, based upon the caesium fountain atomic clock (U = 5×10⁻¹⁶). Consequently, a realisation of the metre is usually delineated (not defined) today in labs as 1 579 800.762042(33) wavelengths of helium–neon laser light in a vacuum, the error stated being only that of frequency determination. This bracket notation expressing the error is explained in the article on measurement uncertainty.
Practical realisation of the metre is subject to uncertainties in characterising the medium, to various uncertainties of interferometry, and to uncertainties in measuring the frequency of the source. A commonly used medium is air, and the National Institute of Standards and Technology (NIST) has set up an online calculator to convert wavelengths in vacuum to wavelengths in air. As described by NIST, in air, the uncertainties in characterising the medium are dominated by errors in measuring temperature and pressure. Errors in the theoretical formulas used are secondary. By implementing a refractive index correction such as this, an approximate realisation of the metre can be implemented in air, for example, using the formulation of the metre as 1 579 800.762042(33) wavelengths of helium–neon laser light in vacuum, and converting the wavelengths in a vacuum to wavelengths in air. Of course, air is only one possible medium to use in a realisation of the metre, and any partial vacuum can be used, or some inert atmosphere like helium gas, provided the appropriate corrections for refractive index are implemented.
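A minimal sketch of such a refractive-index correction follows. The index of refraction used is an illustrative round value, roughly that of standard air at 633 nm, not a NIST-computed figure:

```python
# Realising the metre in air: take the vacuum HeNe wavelength implied by the
# delineation above, then correct it for the refractive index of air.
WAVELENGTHS_PER_METRE = 1_579_800.762042   # HeNe wavelengths per metre, in vacuum
N_AIR = 1.00027                            # illustrative refractive index of air at 633 nm

lambda_vac = 1.0 / WAVELENGTHS_PER_METRE   # vacuum wavelength, metres (≈ 632.99 nm)
lambda_air = lambda_vac / N_AIR            # the wavelength is shorter in air

# A length spanned by 1,000,000 fringes counted in air:
length_m = 1_000_000 * lambda_air
print(f"{lambda_vac * 1e9:.5f} nm in vacuum")   # ≈ 632.99121 nm
print(f"{length_m:.6f} m")                      # ≈ 0.632820 m
```

In practice the index of refraction is computed from measured temperature, pressure, and humidity (the dominant error sources named above), rather than taken as a constant.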
The metre is defined as the path length travelled by light in a given time, and practical laboratory length measurements in metres are determined by counting the number of wavelengths of laser light of one of the standard types that fit into the length, and converting the selected unit of wavelength to metres. Three major factors limit the accuracy attainable with laser interferometers for a length measurement:
- uncertainty in the vacuum wavelength of the source,
- uncertainty in the refractive index of the medium, and
- the least count resolution of the interferometer.
Of these, the last is peculiar to the interferometer itself. The conversion of a length in wavelengths to a length in metres is based upon the relation

λ = c / (n f)

which converts the unit of wavelength λ to metres using c, the speed of light in vacuum in m/s. Here n is the refractive index of the medium in which the measurement is made, and f is the measured frequency of the source. Although conversion from wavelengths to metres introduces an additional error in the overall length due to measurement error in determining the refractive index and the frequency, the measurement of frequency is one of the most accurate measurements available.
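This relation can be sketched numerically. The frequency used here is derived from the 632.99121258 nm HeNe vacuum wavelength quoted earlier, not an independently measured value:

```python
# Convert a length counted in wavelengths to metres via lambda = c / (n * f).
C = 299_792_458.0   # speed of light in vacuum, m/s (exact by definition)

def wavelengths_to_metres(n_wavelengths, f_hz, n_medium=1.0):
    """Length in metres spanned by n_wavelengths of light of frequency f_hz
    in a medium of refractive index n_medium."""
    lam = C / (n_medium * f_hz)   # wavelength in the medium, metres
    return n_wavelengths * lam

# Frequency corresponding to the 632.99121258 nm HeNe vacuum wavelength:
f_hene = C / 632.99121258e-9      # ≈ 4.74e14 Hz

# 1,579,800.762042 such wavelengths should span very nearly one metre:
length = wavelengths_to_metres(1_579_800.762042, f_hene)
print(length)   # ≈ 1.0 m
```

Note that the two figures are mutually consistent by construction: the delineation of the metre as 1 579 800.762042 wavelengths is simply the reciprocal of the vacuum wavelength.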
SI prefixed forms of metre
SI prefixes are often employed to denote decimal multiples and submultiples of the metre, as shown in the table below. As indicated in the table, some are commonly used, while others are not. Long distances are usually expressed in km, astronomical units (149.6 Gm), light-years (10 Pm), or parsecs (31 Pm), rather than in Mm, Gm, Tm, Pm, Em, Zm or Ym; "30 cm", "30 m", and "300 m" are more common than "3 dm", "3 dam", and "3 hm", respectively.
The terms micron and (occasionally) millimicron are often used instead of micrometre (μm) and nanometre (nm), but this practice is officially discouraged.
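The prefixed equivalences mentioned above amount to multiplying by standard powers of ten, which can be sketched as follows (the prefix factors are the standard SI values):

```python
import math

# SI prefix factors for the metre (a subset of the standard prefixes).
PREFIX = {
    "km": 1e3, "hm": 1e2, "dam": 1e1, "m": 1.0,
    "dm": 1e-1, "cm": 1e-2, "mm": 1e-3, "um": 1e-6, "nm": 1e-9,
}

def to_metres(value, unit):
    """Convert a value in a prefixed unit to metres."""
    return value * PREFIX[unit]

# The equivalent forms mentioned in the text (isclose avoids float artefacts):
print(math.isclose(to_metres(30, "cm"), to_metres(3, "dm")))    # True: 30 cm = 3 dm
print(math.isclose(to_metres(30, "m"),  to_metres(3, "dam")))   # True: 30 m = 3 dam
print(math.isclose(to_metres(300, "m"), to_metres(3, "hm")))    # True: 300 m = 3 hm
```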
Equivalents in other units
Within this table, "inch" and "yard" mean "international inch" and "international yard" respectively, though approximate conversions in the left column hold for both international and survey units. "≈" means "is approximately equal to"; "≡" means "is equal by definition" or "is exactly equal to".
One metre is exactly equivalent to 10 000/254 inches and to 10 000/9 144 yards.
A simple mnemonic aid exists to assist with conversion, as three "3"s: 1 metre is nearly equivalent to 3 feet 3 3⁄8 inches. This gives an overestimate of 0.125 mm. However, the practice of memorising such conversion formulas has been discouraged in favour of practice and visualisation of metric units.
The ancient Egyptian cubit was about 0.5 m (surviving rods are 523–529 mm). Scottish and English definitions of the ell (two cubits) were 941 mm (0.941 m) and 1 143 mm (1.143 m) respectively. The ancient Parisian toise (fathom) was slightly shorter than 2 m and was standardised at exactly 2 m in the mesures usuelles system, such that 1 m was exactly 1⁄2 toise. The Russian versta was 1.0668 km. The Swedish mil was 10.688 km, but was changed to 10 km when Sweden converted to metric units.