The entropic vector or entropic function is a concept arising in information theory. Shannon's entropy measures, together with their associated identities and inequalities (both constrained and unconstrained), have received a great deal of attention ever since Shannon introduced the concept of information entropy. Many such inequalities and identities have been found and are available in standard information theory texts. More recent research has focused on finding, and characterizing, all possible identities and inequalities (both constrained and unconstrained) on such entropies. The entropic vector provides the basic framework for such a study.
Definition
Let X_1, X_2, ..., X_n be n jointly distributed random variables, with n ≥ 1. For a nonempty set α ⊆ {1, ..., n}, write X_α for the joint random variable (X_i)_{i∈α} and H(X_α) for its Shannon entropy.
A vector h in ℝ^(2^n − 1), with one component h_α for each nonempty subset α of {1, ..., n}, is called an entropic vector of order n if there exists a tuple of random variables (X_1, ..., X_n) such that h_α = H(X_α) for every nonempty α. The set of all entropic vectors of order n is denoted Γn*.
All the properties of entropic functions can be transposed to entropic vectors:
Given a deterministic random variable X, we have H(X) = 0, so the all-zero vector is entropic; more generally every component satisfies h_α ≥ 0, since entropy is nonnegative.
Given two sets α ⊆ β, we have h_α ≤ h_β, since adjoining random variables cannot decrease the joint entropy (monotonicity).
Given two sets α and β, we have h_α∪β + h_α∩β ≤ h_α + h_β, with the convention h_∅ = 0 (submodularity).
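For concreteness, an entropic vector can be computed directly from this definition whenever the joint distribution is finite. Below is a minimal sketch in Python, assuming the joint distribution is given as a dictionary mapping outcome tuples to probabilities; the helper names entropy, marginal and entropic_vector are illustrative.

```python
import itertools
import math

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, subset):
    """Marginal pmf of the coordinates listed in `subset`, from a joint pmf
    given as {(x_1, ..., x_n): probability}."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in subset)
        out[key] = out.get(key, 0.0) + p
    return out

def entropic_vector(joint, n):
    """h_alpha = H(X_alpha) for every nonempty subset alpha of {0, ..., n-1},
    returned as a dictionary keyed by sorted index tuples."""
    return {alpha: entropy(marginal(joint, alpha))
            for k in range(1, n + 1)
            for alpha in itertools.combinations(range(n), k)}
```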
Example
Let X, Y be two independent random variables with discrete uniform distribution over the set {0, 1}.
It follows that H(X) = H(Y) = 1 bit and, since X and Y are independent, H(X, Y) = H(X) + H(Y) = 2 bits.
The entropic vector is thus h = (H(X), H(Y), H(X, Y)) = (1, 1, 2).
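This can be checked numerically with the sketch above (reusing the hypothetical entropic_vector helper):

```python
# Two independent fair bits: P(X = x, Y = y) = 1/4 for all (x, y) in {0,1}^2.
joint = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(entropic_vector(joint, 2))
# {(0,): 1.0, (1,): 1.0, (0, 1): 2.0}, i.e. the vector (1, 1, 2)
```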
The Shannon inequality and Γn
The entropy satisfies the properties of nonnegativity, monotonicity and submodularity listed above.
The Shannon inequality is the submodularity condition h_α + h_β ≥ h_α∪β + h_α∩β; equivalently, the conditional mutual information I(X_α; X_β | X_α∩β) is nonnegative.
The set of vectors satisfying all nonnegative linear combinations of these inequalities is the region called Γn. Γn is a polyhedral cone and contains every entropic vector, so Γn* ⊆ Γn.
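Since every Shannon-type inequality follows from the finitely many elemental ones (nonnegativity of the conditional entropies H(X_i | X_rest) and of the conditional mutual informations I(X_i; X_j | X_K)), membership in Γn can be tested mechanically. A sketch, assuming the vector is encoded as a dictionary keyed by sorted index tuples as above; the function name is illustrative:

```python
import itertools

def in_gamma_n(h, n, tol=1e-9):
    """Test membership in the Shannon cone Gamma_n via the elemental
    Shannon inequalities.  By convention the empty set has entropy 0."""
    H = lambda s: h[tuple(sorted(s))] if s else 0.0
    full = set(range(n))
    # H(X_i | X_rest) >= 0 for every i
    for i in range(n):
        if H(full) - H(full - {i}) < -tol:
            return False
    # I(X_i; X_j | X_K) >= 0 for all i != j and every K avoiding i, j
    for i, j in itertools.combinations(range(n), 2):
        rest = sorted(full - {i, j})
        for k in range(len(rest) + 1):
            for K in itertools.combinations(rest, k):
                K = set(K)
                if H(K | {i}) + H(K | {j}) - H(K | {i, j}) - H(K) < -tol:
                    return False
    return True
```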
The region Γn* is not closed in general, but its closure is a convex cone, and the closure of Γn* equals Γn

if and only if n ∈ {1, 2, 3}.
The case n ≥ 4 is considerably harder: there, the closure of Γ4* is strictly smaller than Γ4.
The most important results for the characterization of Γn* when n ≥ 4 are the non-Shannon-type inequalities: linear inequalities satisfied by every entropic vector that do not follow from the Shannon inequalities.
The Matus theorem
In 1998 Zhang and Yeung proved a new non-Shannon-type inequality: for any four random variables A, B, C, D,

2I(C;D) ≤ I(A;B) + I(A;C,D) + 3I(C;D|A) + I(C;D|B)
and in 2007 Matus proved that the closure of Γn* is not a polyhedral cone for n ≥ 4, so no finite set of linear inequalities can characterize it completely.
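Each mutual information in the Zhang and Yeung inequality can be rewritten in terms of joint entropies, e.g. I(C;D) = H(C) + H(D) − H(C,D) and I(C;D|A) = H(A,C) + H(A,D) − H(A,C,D) − H(A), so the inequality can be evaluated on any entropy function. A sketch, assuming H is supplied as a function taking a string of variable names; the name and encoding are illustrative:

```python
def zhang_yeung_holds(H, tol=1e-9):
    """Evaluate the Zhang-Yeung inequality
        2 I(C;D) <= I(A;B) + I(A;C,D) + 3 I(C;D|A) + I(C;D|B)
    given an entropy function H on subsets of {A, B, C, D}, called as e.g.
    H('ACD') for the joint entropy H(A, C, D)."""
    I_CD    = H('C') + H('D') - H('CD')              # I(C;D)
    I_AB    = H('A') + H('B') - H('AB')              # I(A;B)
    I_A_CD  = H('A') + H('CD') - H('ACD')            # I(A;C,D)
    I_CD_gA = H('AC') + H('AD') - H('ACD') - H('A')  # I(C;D|A)
    I_CD_gB = H('BC') + H('BD') - H('BCD') - H('B')  # I(C;D|B)
    return 2 * I_CD <= I_AB + I_A_CD + 3 * I_CD_gA + I_CD_gB + tol
```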
Group-characterizable vectors and quasi-uniform distributions
One way to characterize Γn* is by means of group theory. A vector h of order n is called group-characterizable if there exists a finite group G and subgroups G_1, ..., G_n of G
such that, for every nonempty α ⊆ {1, ..., n},

h_α = log2( |G| / |G_α| ),  where G_α = ∩_{i∈α} G_i.
Definition: A joint distribution of (X_1, ..., X_n) is called quasi-uniform if, for every subset α, the marginal distribution of X_α is uniform over its support.
Theorem: Every group-characterizable vector is the entropic vector of a quasi-uniform distribution, and the closure of the convex cone generated by the group-characterizable vectors equals the closure of Γn* (Chan and Yeung).
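As an illustration, the vector (1, 1, 2) from the example above is group-characterizable: take G = Z2 × Z2 with its two coordinate subgroups. The sketch below represents groups as plain sets of elements (the group structure is assumed, not checked), and the function name is illustrative:

```python
import itertools
import math

def group_characterizable_vector(G, subgroups):
    """h_alpha = log2(|G| / |G_alpha|), with G_alpha the intersection of the
    subgroups G_i for i in alpha, for every nonempty alpha."""
    n = len(subgroups)
    h = {}
    for k in range(1, n + 1):
        for alpha in itertools.combinations(range(n), k):
            inter = set(G)
            for i in alpha:
                inter &= subgroups[i]
            h[alpha] = math.log2(len(G) / len(inter))
    return h

G = {(a, b) for a in (0, 1) for b in (0, 1)}   # Z2 x Z2
G1 = {(0, 0), (0, 1)}                          # first coordinate zero
G2 = {(0, 0), (1, 0)}                          # second coordinate zero
print(group_characterizable_vector(G, [G1, G2]))
# {(0,): 1.0, (1,): 1.0, (0, 1): 2.0} -- the vector (1, 1, 2) again
```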
Open problem
Given a vector h ∈ Γn, decide whether h is entropic, i.e., whether h lies in the closure of Γn*. No explicit characterization of this closure is known for n ≥ 4, and the problem remains open.