In information theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy and the min-entropy. Entropies quantify the diversity, uncertainty, or randomness of a system. The Rényi entropy is named after Alfréd Rényi.
The Rényi entropy is important in ecology and statistics as an index of diversity. The Rényi entropy is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of α can be calculated explicitly by virtue of the fact that it is an automorphic function with respect to a particular subgroup of the modular group. In theoretical computer science, the min-entropy is used in the context of randomness extractors.
Definition
The Rényi entropy of order $\alpha$, where $\alpha \ge 0$ and $\alpha \neq 1$, is defined as
$$H_\alpha(X) = \frac{1}{1-\alpha}\log\left(\sum_{i=1}^{n} p_i^{\alpha}\right).$$
Here, $X$ is a discrete random variable with possible outcomes $1, 2, \dots, n$ and corresponding probabilities $p_i = \Pr(X = i)$ for $i = 1, \dots, n$; the logarithm is base 2.
Applications often exploit the following relation between the Rényi entropy and the p-norm of the vector of probabilities:
$$H_\alpha(X) = \frac{\alpha}{1-\alpha}\,\log\bigl(\lVert P \rVert_\alpha\bigr).$$
Here, the discrete probability distribution $P = (p_1, \dots, p_n)$ is interpreted as a vector in $\mathbb{R}^n$ with $p_i \ge 0$ and $\sum_{i=1}^{n} p_i = 1$, and $\lVert P \rVert_\alpha = \bigl(\sum_{i=1}^{n} p_i^{\alpha}\bigr)^{1/\alpha}$.
The Rényi entropy for any $\alpha \ge 0$ is Schur concave.
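As a concrete illustration, here is a minimal sketch in Python of the definition above and of its relation to the α-norm of the probability vector (the function name is illustrative, not taken from any particular library):

```python
import numpy as np

def renyi_entropy(p, alpha, base=2):
    """Renyi entropy H_alpha of a discrete distribution p (illustrative sketch)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # zero-probability outcomes contribute nothing
    if alpha == 1:                    # limiting case: Shannon entropy
        return -np.sum(p * np.log(p)) / np.log(base)
    if np.isinf(alpha):               # limiting case: min-entropy
        return -np.log(p.max()) / np.log(base)
    return np.log(np.sum(p ** alpha)) / ((1 - alpha) * np.log(base))

p = [0.5, 0.25, 0.125, 0.125]
for a in (0, 0.5, 1, 2, np.inf):
    print(a, renyi_entropy(p, a))

# Relation to the alpha-norm of the probability vector:
#   H_alpha(X) = alpha/(1-alpha) * log ||P||_alpha        (alpha != 1)
a = 2.0
norm = np.sum(np.asarray(p) ** a) ** (1 / a)
print(np.isclose(renyi_entropy(p, a), a / (1 - a) * np.log2(norm)))   # True
```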
Special cases of the Rényi entropy
As α approaches zero, the Rényi entropy increasingly weighs all possible events more equally, regardless of their probabilities. In the limit for α → 0, the Rényi entropy is just the logarithm of the size of the support of X. The limit for α → 1 is the Shannon entropy. As α approaches infinity, the Rényi entropy is increasingly determined by the events of highest probability.
Hartley or max-entropy
Provided the probabilities are nonzero, $H_0$ is the logarithm of the size of the alphabet of $X$, sometimes called the Hartley entropy of $X$:
$$H_0(X) = \log n.$$
Shannon entropy
The limiting value of $H_\alpha$ as $\alpha \to 1$ is the Shannon entropy:
$$H_1(X) = -\sum_{i=1}^{n} p_i \log p_i.$$
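One standard way to see this limit (a routine calculation added here for completeness) is L'Hôpital's rule applied to the definition, differentiating numerator and denominator with respect to $\alpha$ and using $\sum_i p_i = 1$:
$$\lim_{\alpha \to 1} H_\alpha(X)
= \lim_{\alpha \to 1} \frac{\log \sum_i p_i^{\alpha}}{1-\alpha}
= \lim_{\alpha \to 1} \frac{\sum_i p_i^{\alpha} \log p_i \,\big/\, \sum_i p_i^{\alpha}}{-1}
= -\sum_i p_i \log p_i = H_1(X).$$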
Collision entropy
Collision entropy, sometimes just called "Rényi entropy," refers to the case α = 2,
$$H_2(X) = -\log \sum_{i=1}^{n} p_i^2 = -\log \Pr(X = Y),$$
where X and Y are independent and identically distributed.
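A small numeric sanity check (illustrative Python, not from the source) that the collision entropy is minus the log of the probability that two independent draws from the same distribution collide:

```python
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])

# Collision probability Pr(X = Y) for independent X, Y ~ p is sum_i p_i^2.
collision_prob = np.sum(p ** 2)
H2 = -np.log2(collision_prob)
print(H2)                                  # ~1.54 bits

# Monte Carlo estimate of Pr(X = Y) for comparison.
rng = np.random.default_rng(0)
x = rng.choice(len(p), size=200_000, p=p)
y = rng.choice(len(p), size=200_000, p=p)
print(-np.log2(np.mean(x == y)))           # close to H2
```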
Min-entropy
In the limit as $\alpha \to \infty$, the Rényi entropy $H_\alpha$ converges to the min-entropy $H_\infty$:
$$H_\infty(X) = \min_i \bigl(-\log p_i\bigr) = -\log \max_i p_i.$$
Equivalently, the min-entropy $H_\infty(X)$ is the largest real number $b$ such that all events occur with probability at most $2^{-b}$.
The name min-entropy stems from the fact that it is the smallest entropy measure in the family of Rényi entropies. In this sense, it is the strongest way to measure the information content of a discrete random variable. In particular, the min-entropy is never larger than the Shannon entropy.
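A short sketch (illustrative Python, assuming base-2 logarithms as above) of the two equivalent characterizations and of the comparison with the Shannon entropy:

```python
import numpy as np

p = np.array([0.6, 0.2, 0.1, 0.1])

H_inf = -np.log2(p.max())          # min-entropy
H_1 = -np.sum(p * np.log2(p))      # Shannon entropy

# H_inf is the largest b such that every outcome has probability at most 2**(-b).
tol = 1e-12
print(np.all(p <= 2.0 ** (-H_inf) + tol))            # True
print(np.all(p <= 2.0 ** (-(H_inf + 0.01)) + tol))   # False: any larger b already fails
print(H_inf <= H_1)                                  # min-entropy never exceeds Shannon entropy
```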
The min-entropy has important applications for randomness extractors in theoretical computer science: Extractors are able to extract randomness from random sources that have a large min-entropy; merely having a large Shannon entropy does not suffice for this task.
Inequalities between different values of α
That $H_\alpha$ is non-increasing in $\alpha$ for any given distribution of probabilities $p_1, \dots, p_n$ can be proven by differentiation, as
$$\frac{dH_\alpha}{d\alpha} = -\frac{1}{(1-\alpha)^2} \sum_{i=1}^{n} z_i \log\frac{z_i}{p_i},$$
which is proportional to the Kullback–Leibler divergence (which is always non-negative), where $z_i = p_i^{\alpha} \big/ \sum_{j=1}^{n} p_j^{\alpha}$.
In particular cases inequalities can be proven also by Jensen's inequality:
$$\log n = H_0 \ge H_1 \ge H_2 \ge H_\infty.$$
For values of $\alpha > 1$, inequalities in the other direction also hold. In particular, we have
$$H_2 \le 2 H_\infty.$$
On the other hand, the Shannon entropy $H_1$ can be arbitrarily high for a random variable $X$ that has a given min-entropy.
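A quick numerical illustration of these relations (illustrative Python; a check on one random distribution, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
p = rng.dirichlet(np.ones(8))      # a random probability vector on 8 outcomes

def H(p, a):
    if a == 1:
        return -np.sum(p * np.log2(p))
    if np.isinf(a):
        return -np.log2(p.max())
    return np.log2(np.sum(p ** a)) / (1 - a)

alphas = [0, 0.5, 1, 2, 4, np.inf]
values = [H(p, a) for a in alphas]
print(values)

# H_alpha is non-increasing in alpha ...
print(all(x >= y - 1e-12 for x, y in zip(values, values[1:])))   # True

# ... and H_2 <= 2 * H_inf.
print(H(p, 2) <= 2 * H(p, np.inf) + 1e-12)                       # True
```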
Rényi divergence
As well as the absolute Rényi entropies, Rényi also defined a spectrum of divergence measures generalising the Kullback–Leibler divergence.
The Rényi divergence of order α or alpha-divergence of a distribution P from a distribution Q is defined to be
$$D_\alpha(P \,\|\, Q) = \frac{1}{\alpha-1}\log\left(\sum_{i=1}^{n} \frac{p_i^{\alpha}}{q_i^{\alpha-1}}\right)$$
when 0 < α < ∞ and α ≠ 1. We can define the Rényi divergence for the special values α = 0, 1, ∞ by taking a limit, and in particular the limit α → 1 gives the Kullback–Leibler divergence.
Some special cases:
- $D_0(P \,\|\, Q) = -\log Q(\{i : p_i > 0\})$: minus the log probability under $Q$ that $p_i > 0$;
- $D_{1/2}(P \,\|\, Q) = -2 \log \sum_i \sqrt{p_i q_i}$: minus twice the logarithm of the Bhattacharyya coefficient;
- $D_1(P \,\|\, Q) = \sum_i p_i \log \frac{p_i}{q_i}$: the Kullback–Leibler divergence;
- $D_2(P \,\|\, Q) = \log \Bigl\langle \frac{p_i}{q_i} \Bigr\rangle$: the log of the expected ratio of the probabilities;
- $D_\infty(P \,\|\, Q) = \log \sup_i \frac{p_i}{q_i}$: the log of the maximum ratio of the probabilities.
The Rényi divergence is indeed a divergence, meaning simply that $D_\alpha(P \,\|\, Q)$ is greater than or equal to zero, and zero only when P = Q.
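A minimal sketch of the divergence of order α (illustrative Python; the function name is not from any standard library), with the α → 1 limit checked against the Kullback–Leibler divergence and the non-negativity spot-checked:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(P || Q) for discrete distributions with q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    p, q = p[mask], q[mask]
    if alpha == 1:                              # Kullback-Leibler divergence (the limit)
        return np.sum(p * np.log2(p / q))
    if np.isinf(alpha):                         # log of the maximum probability ratio
        return np.log2(np.max(p / q))
    return np.log2(np.sum(p ** alpha / q ** (alpha - 1))) / (alpha - 1)

p = [0.5, 0.25, 0.25]
q = [0.4, 0.4, 0.2]

print(renyi_divergence(p, q, 0.999))   # close to the KL divergence, since alpha -> 1
print(renyi_divergence(p, q, 1))       # KL divergence itself
print(all(renyi_divergence(p, q, a) >= 0 for a in (0.5, 1, 2, np.inf)))   # True
```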
Why α = 1 is special
The value α = 1, which gives the Shannon entropy and the Kullback–Leibler divergence, is special because it is only at α = 1 that the chain rule of conditional probability holds exactly:
$$H(A, X) = H(A) + H(X \mid A)$$
for the absolute entropies, and
$$D_{\mathrm{KL}}\bigl(p(x \mid a)\,p(a) \,\|\, m(x, a)\bigr) = D_{\mathrm{KL}}\bigl(p(a) \,\|\, m(a)\bigr) + \mathbb{E}_{p(a)}\bigl[D_{\mathrm{KL}}\bigl(p(x \mid a) \,\|\, m(x \mid a)\bigr)\bigr]$$
for the relative entropies.
The latter in particular means that if we seek a distribution p(x,a) which minimizes the divergence from some underlying prior measure m(x,a), and we acquire new information which only affects the distribution of a, then the distribution of p(x|a) remains m(x|a), unchanged.
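The following sketch illustrates the first of these points numerically (illustrative Python; it defines the "conditional" Rényi entropy naively as the p(a)-weighted average of per-outcome entropies, one common but non-canonical convention, purely to show that the Shannon chain rule does not carry over to α = 2):

```python
import numpy as np

# Joint distribution p(a, x) on a 2x2 alphabet (rows: a, columns: x), chosen so A and X are dependent.
joint = np.array([[0.4, 0.1],
                  [0.2, 0.3]])

def H(p, a):
    p = p[p > 0]
    if a == 1:
        return -np.sum(p * np.log2(p))
    return np.log2(np.sum(p ** a)) / (1 - a)

p_a = joint.sum(axis=1)                               # marginal of A

def cond(alpha):
    """Naive 'conditional' Renyi entropy: p(a)-weighted average of H_alpha(X | A = a)."""
    return sum(p_a[i] * H(joint[i] / p_a[i], alpha) for i in range(len(p_a)))

for alpha in (1, 2):
    lhs = H(joint.ravel(), alpha)                     # H_alpha(A, X)
    rhs = H(p_a, alpha) + cond(alpha)                 # H_alpha(A) + "H_alpha(X | A)"
    print(alpha, np.isclose(lhs, rhs))                # True for alpha = 1, False for alpha = 2
```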
The other Rényi divergences satisfy the criteria of being positive and continuous; being invariant under 1-to-1 co-ordinate transformations; and of combining additively when A and X are independent, so that if p(A,X) = p(A)p(X), then
$$H_\alpha(A, X) = H_\alpha(A) + H_\alpha(X)$$
and
$$D_\alpha\bigl(P(A)\,P(X) \,\|\, Q(A)\,Q(X)\bigr) = D_\alpha\bigl(P(A) \,\|\, Q(A)\bigr) + D_\alpha\bigl(P(X) \,\|\, Q(X)\bigr).$$
The stronger properties of the α = 1 quantities, which allow the definition of conditional information and mutual information from communication theory, may be very important in other applications, or entirely unimportant, depending on those applications' requirements.
Exponential families
The Rényi entropies and divergences for an exponential family admit simple expressions
$$H_\alpha\bigl(p_F(x;\theta)\bigr) = \frac{1}{1-\alpha}\Bigl(F(\alpha\theta) - \alpha F(\theta) + \log E_p\bigl[e^{(\alpha-1)k(x)}\bigr]\Bigr)$$
and
$$D_\alpha(p : q) = \frac{J_{F,\alpha}(\theta : \theta')}{1-\alpha},$$
where
$$J_{F,\alpha}(\theta : \theta') = \alpha F(\theta) + (1-\alpha) F(\theta') - F\bigl(\alpha\theta + (1-\alpha)\theta'\bigr)$$
is a Jensen difference divergence.
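As a concrete check of the divergence expression (illustrative Python; it uses the Bernoulli family written in natural parameters, with log-normalizer F(θ) = log(1 + e^θ) and zero carrier term, which is an assumption made only for this example):

```python
import numpy as np

# Bernoulli as an exponential family: p(x; theta) = exp(x*theta - F(theta)), x in {0, 1},
# with log-normalizer F(theta) = log(1 + exp(theta)).
F = lambda t: np.log1p(np.exp(t))

def renyi_div_direct(theta, theta_p, alpha):
    """D_alpha(p:q) computed directly from the definition on the two-point alphabet."""
    p = np.array([1 - 1 / (1 + np.exp(-theta)), 1 / (1 + np.exp(-theta))])
    q = np.array([1 - 1 / (1 + np.exp(-theta_p)), 1 / (1 + np.exp(-theta_p))])
    return np.log(np.sum(p ** alpha * q ** (1 - alpha))) / (alpha - 1)

def renyi_div_expfam(theta, theta_p, alpha):
    """D_alpha(p:q) from the closed form J_{F,alpha}(theta:theta') / (1 - alpha)."""
    J = alpha * F(theta) + (1 - alpha) * F(theta_p) - F(alpha * theta + (1 - alpha) * theta_p)
    return J / (1 - alpha)

theta, theta_p = 0.3, -1.2
for alpha in (0.5, 2.0, 3.0):
    print(np.isclose(renyi_div_direct(theta, theta_p, alpha),
                     renyi_div_expfam(theta, theta_p, alpha)))   # True for each alpha
```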
Physical meaning
The Rényi entropy in quantum physics is not considered to be an observable, due to its nonlinear dependence on the density matrix. The Shannon entropy shares this nonlinear dependence. Recently, Ansari and Nazarov showed a correspondence that reveals the physical meaning of the Rényi entropy flow in time. Their proposal is similar to the fluctuation-dissipation theorem in spirit and allows the measurement of quantum entropy using the full counting statistics (FCS) of energy transfers.