In information theory, dual total correlation (Han 1978), excess entropy (Olbrich 2008), or binding information (Abdallah and Plumbley 2010) is one of the two known non-negative generalizations of mutual information. While total correlation is bounded above by the sum of the entropies of the n elements, the dual total correlation is bounded above by the joint entropy of the n elements. Although well behaved, dual total correlation has received much less attention than the total correlation. A measure known as "TSE-complexity" defines a continuum between the total correlation and dual total correlation (Ay 2001).
Definition
For a set of n random variables $\{X_1, \ldots, X_n\}$, the dual total correlation $D(X_1, \ldots, X_n)$ is given by

$$D(X_1, \ldots, X_n) = H(X_1, \ldots, X_n) - \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n),$$

where $H(X_1, \ldots, X_n)$ is the joint entropy of the variable set $\{X_1, \ldots, X_n\}$ and $H(X_i \mid \cdots)$ is the conditional entropy of variable $X_i$ given the remaining variables.
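As an illustration of the definition, the following minimal sketch (an assumption-level example, not part of the original article; the helper names `entropy` and `dual_total_correlation` are hypothetical) computes the dual total correlation of a discrete joint distribution stored as an n-dimensional probability array:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array, ignoring zero cells."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def dual_total_correlation(joint):
    """D(X_1,...,X_n) = H(X_1,...,X_n) - sum_i H(X_i | all other variables)."""
    joint = np.asarray(joint, dtype=float)
    h_joint = entropy(joint)
    d = h_joint
    for i in range(joint.ndim):
        # H(X_i | rest) = H(joint) - H(rest); summing over axis i marginalizes X_i out.
        d -= h_joint - entropy(joint.sum(axis=i))
    return d

# Example: X1, X2 independent fair bits and X3 = X1 XOR X2.
joint = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        joint[x1, x2, x1 ^ x2] = 0.25
print(dual_total_correlation(joint))  # 2.0 bits: every H(X_i | rest) is zero
```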
Normalized
The dual total correlation normalized to lie between 0 and 1 is simply the dual total correlation divided by its maximum value, the joint entropy:

$$ND(X_1, \ldots, X_n) = \frac{D(X_1, \ldots, X_n)}{H(X_1, \ldots, X_n)}.$$
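Continuing the hypothetical sketch above, the normalized quantity just divides by the joint entropy:

```python
def normalized_dual_total_correlation(joint):
    """Dual total correlation divided by its maximum value, the joint entropy."""
    h_joint = entropy(np.asarray(joint, dtype=float))
    return dual_total_correlation(joint) / h_joint if h_joint > 0 else 0.0

print(normalized_dual_total_correlation(joint))  # 1.0 for the XOR example above
```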
Bounds
Dual total correlation is non-negative and bounded above by the joint entropy:

$$0 \leq D(X_1, \ldots, X_n) \leq H(X_1, \ldots, X_n).$$

Second, dual total correlation has a close relationship with total correlation $C(X_1, \ldots, X_n)$; in particular,

$$\frac{C(X_1, \ldots, X_n)}{n-1} \leq D(X_1, \ldots, X_n) \leq (n-1)\, C(X_1, \ldots, X_n).$$
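A small numerical check of these bounds, again using the hypothetical helpers and the three-bit XOR distribution from the sketch above (an illustrative choice, not from the article):

```python
def total_correlation(joint):
    """C(X_1,...,X_n) = sum_i H(X_i) - H(X_1,...,X_n)."""
    joint = np.asarray(joint, dtype=float)
    n = joint.ndim
    marginal_sum = sum(
        entropy(joint.sum(axis=tuple(j for j in range(n) if j != i)))
        for i in range(n)
    )
    return marginal_sum - entropy(joint)

C = total_correlation(joint)        # 1.0 bit for the XOR example
D = dual_total_correlation(joint)   # 2.0 bits
H = entropy(joint)                  # 2.0 bits
n = joint.ndim
assert 0 <= D <= H + 1e-9
assert C / (n - 1) - 1e-9 <= D <= (n - 1) * C + 1e-9
```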
Relation to other quantities
In measure theoretic terms, by the definition of dual total correlation:

$$D(X_1, \ldots, X_n) = \mu\left(\bigcup_{i=1}^{n} \left\{\tilde X_i \cap \left(\tilde X_1 \cup \cdots \cup \tilde X_{i-1} \cup \tilde X_{i+1} \cup \cdots \cup \tilde X_n\right)\right\}\right),$$

which is equal to the union of the pairwise mutual informations:

$$D(X_1, \ldots, X_n) = \mu\left(\bigcup_{i \neq j} \left\{\tilde X_i \cap \tilde X_j\right\}\right).$$
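As a concrete worked example (not in the original text), take two independent fair bits $X_1, X_2$ and $X_3 = X_1 \oplus X_2$. The only nonzero atoms of the information diagram lying in at least two of the sets $\tilde X_i$ are the three pairwise-only regions, $I(X_i; X_j \mid X_k) = 1$ bit each, and the triple intersection, $I(X_1; X_2; X_3) = -1$ bit, so

$$D(X_1, X_2, X_3) = \mu\!\left(\bigcup_{i \neq j} \tilde X_i \cap \tilde X_j\right) = 3 \cdot (1\ \text{bit}) + (-1\ \text{bit}) = 2\ \text{bits},$$

which agrees with $H(X_1, X_2, X_3) - \sum_i H(X_i \mid \text{rest}) = 2 - 0 = 2$ bits.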
History
Han (1978) originally defined the dual total correlation as

$$D(X_1, \ldots, X_n) \equiv \left[\sum_{i=1}^{n} H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n)\right] - (n-1)\, H(X_1, \ldots, X_n).$$
However, Abdallah and Plumbley (2010) showed its equivalence to the easier-to-understand form of the joint entropy minus the sum of conditional entropies via the following:

$$\begin{aligned}
D(X_1, \ldots, X_n) &= \left[\sum_{i=1}^{n} H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n)\right] - (n-1)\, H(X_1, \ldots, X_n) \\
&= \left[\sum_{i=1}^{n} \bigl(H(X_1, \ldots, X_n) - H(X_i \mid X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n)\bigr)\right] - (n-1)\, H(X_1, \ldots, X_n) \\
&= H(X_1, \ldots, X_n) - \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n).
\end{aligned}$$