Various types of stability may be discussed for the solutions of differential equations or difference equations describing dynamical systems. The most important type is that concerning the stability of solutions near to a point of equilibrium. This may be discussed by the theory of Lyapunov. In simple terms, if the solutions that start out near an equilibrium point x_e stay near x_e forever, then x_e is Lyapunov stable. More strongly, if x_e is Lyapunov stable and all solutions that start out near x_e converge to x_e, then x_e is asymptotically stable.
History
Lyapunov stability is named after Aleksandr Lyapunov, a Russian mathematician who published his book The General Problem of Stability of Motion in 1892. Lyapunov was the first to consider the modifications necessary in nonlinear systems to the linear theory of stability based on linearizing near a point of equilibrium. His work, initially published in Russian and then translated to French, received little attention for many years. Interest in it started suddenly during the Cold War period when the so-called "Second Method of Lyapunov" (see below) was found to be applicable to the stability of aerospace guidance systems which typically contain strong nonlinearities not treatable by other methods. A large number of publications appeared then and since in the control and systems literature. More recently the concept of the Lyapunov exponent (related to Lyapunov's First Method of discussing stability) has received wide interest in connection with chaos theory. Lyapunov stability methods have also been applied to finding equilibrium solutions in traffic assignment problems.
Definition for continuous-time systems
Consider an autonomous nonlinear dynamical system

ẋ = f(x(t)), x(0) = x₀,

where x(t) ∈ 𝒟 ⊆ ℝⁿ denotes the system state vector, 𝒟 an open set containing the origin, and f : 𝒟 → ℝⁿ a continuous function on 𝒟. Suppose f has an equilibrium at x_e, so that f(x_e) = 0.
- This equilibrium is said to be Lyapunov stable if, for every ϵ > 0, there exists a δ > 0 such that, if ∥x(0) − x_e∥ < δ, then for every t ≥ 0 we have ∥x(t) − x_e∥ < ϵ.
- The equilibrium of the above system is said to be asymptotically stable if it is Lyapunov stable and there exists δ > 0 such that if ∥x(0) − x_e∥ < δ, then lim_{t → ∞} ∥x(t) − x_e∥ = 0.
- The equilibrium of the above system is said to be exponentially stable if it is asymptotically stable and there exist α > 0, β > 0, δ > 0 such that if ∥x(0) − x_e∥ < δ, then ∥x(t) − x_e∥ ≤ α ∥x(0) − x_e∥ e^{−βt} for all t ≥ 0.
Conceptually, the meanings of the above terms are the following:
- Lyapunov stability of an equilibrium means that solutions starting "close enough" to the equilibrium (within a distance δ from it) remain "close enough" forever (within a distance ϵ from it). Note that this must be true for any ϵ that one may want to choose.
- Asymptotic stability means that solutions that start close enough not only remain close enough but also eventually converge to the equilibrium.
- Exponential stability means that solutions not only converge, but in fact converge faster than or at least as fast as the particular known rate α ∥x(0) − x_e∥ e^{−βt}.
The trajectory x is (locally) attractive if

∥y(t) − x(t)∥ → 0 as t → ∞

(where y(t) denotes the system output) for all trajectories that start close enough to x, and globally attractive if this property holds for all trajectories. That is, x is attractive if it belongs to the interior of its stable manifold. The trajectory is asymptotically stable if it is both attractive and stable. (There are examples showing that attractivity does not imply asymptotic stability. Such examples are easy to create using homoclinic connections.)
If the Jacobian of the dynamical system at an equilibrium happens to be a stability matrix (i.e., if the real part of each eigenvalue is strictly negative), then the equilibrium is asymptotically stable.
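This eigenvalue test is easy to check numerically. A minimal sketch, assuming NumPy and using a damped pendulum linearization as a hypothetical example system (neither appears in the text above):

```python
import numpy as np

def is_stability_matrix(J, tol=1e-12):
    """True if every eigenvalue of J has strictly negative real part."""
    return bool(np.all(np.linalg.eigvals(J).real < -tol))

# Damped pendulum: x1' = x2, x2' = -sin(x1) - x2 (illustrative system).
# Jacobian at the equilibrium (x1, x2) = (0, 0):
J = np.array([[0.0, 1.0],
              [-1.0, -1.0]])   # d/dx1 of -sin(x1) at 0 is -cos(0) = -1

print(is_stability_matrix(J))  # True: eigenvalues are (-1 ± i√3)/2
```

Both eigenvalues have real part −1/2, so the origin of the damped pendulum is asymptotically stable.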
Lyapunov's second method for stability
Lyapunov, in his original 1892 work, proposed two methods for demonstrating stability. The first method developed the solution in a series which was then proved convergent within limits. The second method, which is now referred to as the Lyapunov stability criterion, makes use of a Lyapunov function V(x) which has an analogy to the potential function of classical dynamics. It is introduced as follows for a system ẋ = f(x) having a point of equilibrium at x = 0. Consider a function V : ℝⁿ → ℝ such that

- V(x) = 0 if and only if x = 0,
- V(x) > 0 if and only if x ≠ 0,
- V̇(x) = (d/dt)V(x(t)) = ∇V · f(x) ≤ 0 for all values of x ≠ 0 (asymptotic stability requires the strict inequality V̇(x) < 0 for x ≠ 0).

Then V(x) is called a Lyapunov function candidate and the system is stable in the sense of Lyapunov. (Note that V(0) = 0 is required; otherwise, for example, V(x) = 1/(1 + ∥x∥) would "prove" that ẋ = x is locally stable.)
It is easier to visualize this method of analysis by thinking of a physical system (e.g. vibrating spring and mass) and considering the energy of such a system. If the system loses energy over time and the energy is never restored then eventually the system must grind to a stop and reach some final resting state. This final state is called the attractor. However, finding a function that gives the precise energy of a physical system can be difficult, and for abstract mathematical systems, economic systems or biological systems, the concept of energy may not be applicable.
Lyapunov's realization was that stability can be proven without requiring knowledge of the true physical energy, provided a Lyapunov function can be found to satisfy the above constraints.
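A Lyapunov function candidate can be checked symbolically. The sketch below assumes SymPy; the system ẋ = −x³ and the candidate V = x² are illustrative choices, not from the text:

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = -x**3      # hypothetical system: xdot = -x**3
V = x**2       # Lyapunov function candidate

# Along trajectories, Vdot = (dV/dx) * xdot by the chain rule
Vdot = sp.diff(V, x) * f

print(V.subs(x, 0) == 0)                 # True: V(0) = 0
print(sp.simplify(Vdot + 2*x**4) == 0)   # True: Vdot = -2 x^4
```

Since V > 0 for x ≠ 0 and V̇ = −2x⁴ < 0 for x ≠ 0, this V proves the equilibrium at the origin asymptotically stable.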
Definition for discrete-time systems
The definition for discrete-time systems is almost identical to that for continuous-time systems. The definition below provides this, using an alternate language commonly used in more mathematical texts.
Let (X, d) be a metric space and f : X → X a continuous function. A point x in X is said to be Lyapunov stable if, for each ϵ > 0, there is a δ > 0 such that for all y in X with d(x, y) < δ, we have d(fⁿ(x), fⁿ(y)) < ϵ for all n ∈ ℕ.
We say that x is asymptotically stable if it belongs to the interior of its stable set, i.e. if there is a δ > 0 such that d(fⁿ(x), fⁿ(y)) → 0 as n → ∞ whenever d(x, y) < δ.
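For a differentiable map f on the real line, a fixed point x* is asymptotically stable whenever |f′(x*)| < 1, since f is then locally a contraction. A sketch, using the logistic map as a hypothetical example (not from the text):

```python
# Logistic map f(x) = r x (1 - x); for r = 2.5 the fixed point is
# x* = 1 - 1/r = 0.6 (illustrative choice).
r = 2.5
f = lambda x: r * x * (1 - x)

x_star = 1 - 1/r                 # fixed point: f(x_star) = x_star
deriv = r * (1 - 2 * x_star)     # f'(x*) = r(1 - 2 x*) = -0.5

print(abs(deriv) < 1)            # True -> x* is asymptotically stable

# Iterating from a nearby start converges to x*:
x = x_star + 0.05
for _ in range(50):
    x = f(x)
print(abs(x - x_star) < 1e-9)    # True
```

Each iteration roughly halves the distance to x*, matching the contraction factor |f′(x*)| = 0.5.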
Stability for linear state space models
A linear state space model

ẋ = A x,

where A is a finite matrix, is asymptotically stable (in fact, exponentially stable) if all real parts of the eigenvalues of A are negative. This condition is equivalent to the following one:

AᵀM + MA

is negative definite for some positive definite matrix M = Mᵀ. (The corresponding Lyapunov function is V(x) = xᵀMx.)
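The matrix condition can be checked constructively by solving the Lyapunov equation AᵀM + MA = −Q for a chosen positive definite Q. A sketch assuming SciPy; the matrix A here is an arbitrary illustrative Hurwitz matrix:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stability matrix (eigenvalues -1 and -2):
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

Q = np.eye(2)  # any symmetric positive definite choice

# SciPy solves a X + X a^T = q, so pass A^T to get A^T M + M A = -Q
M = solve_continuous_lyapunov(A.T, -Q)

# M is symmetric positive definite iff A is a stability matrix
print(np.all(np.linalg.eigvalsh(M) > 0))   # True
print(np.allclose(A.T @ M + M @ A, -Q))    # True
```

With such an M, V(x) = xᵀMx satisfies V̇ = −xᵀQx < 0, proving exponential stability.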
Correspondingly, a time-discrete linear state space model

x_{t+1} = A x_t

is asymptotically stable (in fact, exponentially stable) if all the eigenvalues of A have a modulus smaller than one.
This latter condition has been generalized to switched systems: a linear switched discrete time system (ruled by a set of matrices {A₁, …, A_m})

x_{t+1} = A_{σ(t)} x_t,  σ : ℕ → {1, …, m},

is asymptotically stable (in fact, exponentially stable) if the joint spectral radius of the set {A₁, …, A_m} is smaller than one.
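The joint spectral radius of a set of matrices is hard to compute in general, but for a single matrix it reduces to the ordinary spectral radius, which is a one-line check. A sketch assuming NumPy, with an illustrative matrix:

```python
import numpy as np

# Hypothetical discrete-time system x_{t+1} = A x_t
A = np.array([[0.5, 0.2],
              [0.0, 0.8]])

spectral_radius = max(abs(np.linalg.eigvals(A)))
print(spectral_radius)        # ~0.8 (A is triangular, eigenvalues 0.5 and 0.8)
print(spectral_radius < 1)    # True -> exponentially stable
```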
Stability for systems with inputs
A system with inputs (or controls) has the form

ẋ = f(x(t), u(t)),

where the (generally time-dependent) input u(t) may be viewed as a control, external input, stimulus, disturbance, or forcing function. The study of such systems is the subject of control theory and applied in control engineering. For systems with inputs, one must quantify the effect of inputs on the stability of the system. The two main approaches to this analysis are BIBO stability (for linear systems) and input-to-state stability (ISS) (for nonlinear systems).
Example
Consider an equation where, compared to the Van der Pol oscillator equation, the friction term is changed:

ÿ + y − ε(ẏ³/3 − ẏ) = 0

The equilibrium is at the origin, y = ẏ = 0.
Here is a good example of an unsuccessful attempt to find a Lyapunov function that proves stability:
Let

x₁ = y, x₂ = ẏ

so that the corresponding system is

ẋ₁ = x₂
ẋ₂ = −x₁ + ε(x₂³/3 − x₂)
Let us choose as a Lyapunov function

V = ½(x₁² + x₂²)

which is clearly positive definite. Its derivative is

V̇ = x₁ẋ₁ + x₂ẋ₂ = x₁x₂ − x₁x₂ + ε(x₂⁴/3 − x₂²) = ε(x₂⁴/3 − x₂²)

It seems that if the parameter ε is positive, stability is asymptotic for x₂² < 3. But this is wrong: V̇ does not depend on x₁ and vanishes everywhere on the x₁-axis, so it is only negative semi-definite. The equilibrium is Lyapunov stable, but this V cannot establish asymptotic stability.
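The computation of V̇ above can be checked symbolically. A sketch assuming SymPy, reproducing the example's system and candidate:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
eps = sp.symbols('epsilon', positive=True)

# The modified Van der Pol system from the example
f1 = x2
f2 = -x1 + eps * (x2**3 / 3 - x2)

V = (x1**2 + x2**2) / 2
Vdot = sp.diff(V, x1) * f1 + sp.diff(V, x2) * f2

# The x1*x2 terms cancel, leaving Vdot = epsilon*(x2**4/3 - x2**2)
print(sp.simplify(Vdot - eps * (x2**4 / 3 - x2**2)) == 0)   # True

# Vdot vanishes identically on the x1-axis (x2 = 0), so it is only
# negative semi-definite: asymptotic stability cannot be concluded.
print(Vdot.subs(x2, 0))   # 0
```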
Barbalat's lemma and stability of time-varying systems
Assume that f is a function of time only.
Barbalat's lemma says:

If f(t) has a finite limit as t → ∞ and if ḟ(t) is uniformly continuous (or f̈(t) is bounded), then ḟ(t) → 0 as t → ∞.

Usually, it is difficult to analyze the asymptotic stability of time-varying systems because it is very difficult to find Lyapunov functions with a negative definite derivative.

We know that in the case of autonomous (time-invariant) systems, if V̇ is only negative semi-definite, it is still possible to determine the asymptotic behaviour by invoking invariant-set theorems. However, this flexibility is not available for time-varying systems. This is where Barbalat's lemma comes into the picture. It says that if V(x, t) satisfies the following conditions:

- V(x, t) is lower bounded,
- V̇(x, t) is negative semi-definite (NSD), and
- V̇(x, t) is uniformly continuous in time (satisfied if V̈ is finite),

then V̇(x, t) → 0 as t → ∞.
The following example is taken from page 125 of Slotine and Li's book Applied Nonlinear Control.
Consider a non-autonomous system

ė = −e + g·w(t)
ġ = −e·w(t)

This is non-autonomous because the input w is a function of time. Assume that the input w(t) is bounded.

Taking V = e² + g² gives

V̇ = 2e·ė + 2g·ġ = −2e² ≤ 0

This says that V(t) ≤ V(0) by the first two conditions, and hence e and g are bounded. But it does not say anything about the convergence of e to zero: V̇ is only negative semi-definite, and because the dynamics are non-autonomous, the invariant set theorem does not apply.
Using Barbalat's lemma:

V̈ = −4e·ė = −4e(−e + g·w(t)).

This is bounded because e, g and w are bounded. Hence V̇ is uniformly continuous, so by Barbalat's lemma V̇ → 0 as t → ∞, and therefore e → 0. This proves that the error converges.
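The conclusion can be illustrated numerically. A minimal forward-Euler sketch; the bounded input w(t) = sin t, the step size, and the horizon are illustrative choices, not from the text:

```python
import math

# Forward-Euler simulation of  e' = -e + g*w(t),  g' = -e*w(t)
dt, steps = 1e-3, 50_000     # integrate up to t = 50
e, g = 1.0, 1.0
V0 = e**2 + g**2
for n in range(steps):
    w = math.sin(n * dt)     # bounded input (illustrative choice)
    e, g = e + dt * (-e + g * w), g + dt * (-e * w)

print(e**2 + g**2 < V0)      # True: V never exceeds its initial value
print(abs(e))                # small: e(t) -> 0, as Barbalat's lemma predicts
```

Note that with this persistently exciting input the parameter g happens to converge as well; Barbalat's lemma alone only guarantees e → 0.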