In physics, a moment is an expression involving the product of a distance and a physical quantity, and in this way it accounts for how the physical quantity is located or arranged. Moments are usually defined with respect to a fixed reference point; they deal with physical quantities as measured at some distance from that reference point. For example, the moment of force acting on an object, often called torque, is the product of the force and the distance from a reference point. In principle, any physical quantity can be multiplied by distance to produce a moment; commonly used quantities include forces, masses, and electric charge distributions.
Elaboration
In its most simple and basic form, a moment is the product of the distance to some point, raised to some power, and some physical quantity (such as force or electric charge) at that point:

$$\mu_n = r^n\,Q,$$

where $Q$ is the physical quantity and $r$ is the distance to the reference point. If the quantity is not concentrated at a single point, the moment is the integral of the quantity's density over space:

$$\mu_n = \int r^n \rho(\mathbf{r})\,d^3\mathbf{r},$$

where $\rho$ is the density of charge, mass, or whatever quantity is being considered. More complex forms take into account the angular relationships between the distance and the physical quantity, but the above equations capture the essential feature of a moment: the existence of an underlying $r^n \rho(\mathbf{r})$ or $r^n Q$ term.
Each value of n corresponds to a different moment: the 1st moment corresponds to n=1; the 2nd moment to n=2, etc. The 0th moment (n=0) is sometimes called the monopole moment; the 1st moment (n=1) is sometimes called the dipole moment, and the 2nd moment (n=2) is sometimes called the quadrupole moment, especially in the context of electric charge distributions.
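As a sketch of these definitions, the low-order moments of a one-dimensional set of point quantities can be computed directly; the function and example values below are hypothetical, not from the original text:

```python
def moment(points, n, origin=0.0):
    """n-th moment of point quantities about `origin`:
    the sum of q * (x - origin)**n over all points (x, q)."""
    return sum(q * (x - origin) ** n for x, q in points)

# Hypothetical example: an ideal dipole, charges +1 and -1
# at x = +1 and x = -1, taken about the origin.
dipole = [(+1.0, +1.0), (-1.0, -1.0)]

print(moment(dipole, 0))  # monopole (total charge) = 0.0
print(moment(dipole, 1))  # dipole moment = 2.0
print(moment(dipole, 2))  # quadrupole-type 2nd moment = 0.0
```

Note that the monopole and quadrupole moments of this symmetric pair vanish while the dipole moment does not, which is why such a configuration is called a dipole.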
Examples
Multipole moments
Assuming a density function that is finite and localized to a particular region, outside that region a 1/r potential may be expressed as a series of spherical harmonics:

$$\Phi(\mathbf{r}) = \int \frac{\rho(\mathbf{r}')}{|\mathbf{r} - \mathbf{r}'|}\, d^3\mathbf{r}' = \sum_{l=0}^{\infty} \sum_{m=-l}^{l} \frac{4\pi}{2l+1}\, q_{lm}\, \frac{Y_{lm}(\theta, \varphi)}{r^{l+1}}$$

The coefficients $q_{lm}$ are known as multipole moments, and take the form:

$$q_{lm} = \int (r')^l \rho(\mathbf{r}')\, Y^{*}_{lm}(\theta', \varphi')\, d^3\mathbf{r}'$$

where $\mathbf{r}' = (r', \theta', \varphi')$, expressed in spherical coordinates, is a variable of integration over the source distribution, and $Y^{*}_{lm}$ is the complex conjugate of the spherical harmonic. When $\rho$ is a charge density, the $q_{lm}$ are the electric multipole moments: $q_{00}$ is proportional to the total charge (the monopole moment), the $q_{1m}$ encode the components of the electric dipole moment, and so on.
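As an illustrative sketch (the function names and charge configuration here are hypothetical), the low-order multipole moments of a set of point charges can be computed from the explicit l = 0, 1 spherical harmonics, with the integral reducing to a sum over the charges:

```python
import cmath
import math

def Y(l, m, theta, phi):
    """Spherical harmonics for l = 0, 1 (explicit closed forms)."""
    if (l, m) == (0, 0):
        return 0.5 / math.sqrt(math.pi)
    if (l, m) == (1, 0):
        return math.sqrt(3.0 / (4.0 * math.pi)) * math.cos(theta)
    if (l, m) == (1, 1):
        return -math.sqrt(3.0 / (8.0 * math.pi)) * math.sin(theta) * cmath.exp(1j * phi)
    if (l, m) == (1, -1):
        return math.sqrt(3.0 / (8.0 * math.pi)) * math.sin(theta) * cmath.exp(-1j * phi)
    raise ValueError("only l <= 1 implemented in this sketch")

def q_lm(charges, l, m):
    """Multipole moment: sum of q * r**l * conj(Y_lm(theta, phi))
    over point charges given as (q, r, theta, phi) tuples."""
    return sum(q * r ** l * Y(l, m, theta, phi).conjugate()
               for q, r, theta, phi in charges)

# Two unit charges: +1 at z = +1 (theta = 0), -1 at z = -1 (theta = pi).
charges = [(+1.0, 1.0, 0.0, 0.0), (-1.0, 1.0, math.pi, 0.0)]

print(q_lm(charges, 0, 0))  # monopole: 0 (charges cancel)
print(q_lm(charges, 1, 0))  # dipole along z: 2*sqrt(3/(4*pi)) ~ 0.977
```

For this symmetric pair the monopole moment vanishes and the dipole term dominates the far-field potential, consistent with the expansion above.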
Applications of multipole moments
The multipole expansion applies to 1/r scalar potentials, examples of which include the electric potential and the gravitational potential. For these potentials, the expression can be used to approximate the strength of a field produced by a localized distribution of charges (or mass) by calculating the first few moments. For sufficiently large r, a reasonable approximation can be obtained from just the monopole and dipole moments. Higher fidelity can be achieved by calculating higher order moments. Extensions of the technique can be used to calculate interaction energies and intermolecular forces.
The technique can also be used to determine the properties of an unknown distribution: measurements of its multipole moments can be used to infer properties of the underlying distribution.
History
The concept of moment in physics is derived from the mathematical concept of moments. The principle of moments is derived from Archimedes' discovery of the operating principle of the lever: one applies a force, in his day most often human muscle, to an arm, a beam of some sort. Archimedes noted that the amount of force applied to the object, the moment of force, is defined as M = rF, where F is the applied force and r is the distance from the reference point to the point at which the force is applied. However, the historical evolution of the term 'moment' and its use in different branches of science, such as mathematics, physics and engineering, remains unclear.
Federico Commandino, in 1565, translated into Latin from Archimedes:
The center of gravity of each solid figure is that point within it, about which on all sides parts of equal moment stand.

This was apparently the first use of the word moment (Latin, momentorum) in the sense in which we now know it: a moment about a center of rotation.
The word moment was first used in Mechanics in its now rather old-fashioned sense of 'importance' or 'consequence,' and the moment of a force about an axis meant the importance of the force with respect to its power to generate in matter rotation about the axis... But the word 'moment' has also come to be used by analogy in a purely technical sense, in such expressions as the 'moment of a mass about an axis,' or 'the moment of an area with respect to a plane,' which require definition in each case. In those instances there is not always any corresponding physical idea, and such phrases stand, both historically and scientifically, on a different footing. - A. M. Worthington, 1920