Entropic vector
The entropic vector or entropic function is a concept arising in information theory. Shannon's information entropy measures, together with their associated identities and inequalities (both constrained and unconstrained), have received a great deal of attention ever since Shannon introduced the concept of information entropy. Many such inequalities and identities have been found and are available in standard information theory texts. More recent research has focused on finding and characterizing all possible identities and inequalities on such entropies. The entropic vector provides the basic framework for this study.


Definition

Let $X_1, X_2, \ldots, X_n$ be random variables, with $n \in \mathbb{N}$.

A vector $h \in \mathbb{R}^{2^n - 1}$ is an entropic vector of order $n$ if and only if there exists a tuple of random variables $X = (X_1, X_2, \ldots, X_n)$ whose associated vector $h_X$, defined by $h_X(I) = H(X_I) = H(X_{i_1}, X_{i_2}, \ldots, X_{i_k})$ for every nonempty $I = \{i_1, i_2, \ldots, i_k\} \subseteq \{1, \ldots, n\}$, satisfies $h = h_X$. The set of all entropic vectors of order $n$ is denoted by $\Gamma_n^*$.
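
To make the definition concrete, here is a minimal sketch in Python that computes $h_X(I)$ for every nonempty $I$ directly from a joint probability table. The helper names `joint_entropy` and `entropic_vector` are illustrative, not part of any standard library.

```python
from itertools import combinations
import math

def joint_entropy(pmf, indices):
    """Shannon entropy (in bits) of the marginal of pmf on the given indices.

    pmf maps outcome tuples (x_1, ..., x_n) to probabilities.
    """
    marginal = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in indices)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marginal.values() if p > 0)

def entropic_vector(pmf, n):
    """Return h_X as a dict mapping each nonempty I of {0, ..., n-1} to H(X_I)."""
    return {I: joint_entropy(pmf, I)
            for k in range(1, n + 1)
            for I in combinations(range(n), k)}
```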

All the properties of entropic functions carry over to entropic vectors:

$H : \mathcal{P}_n \to \mathbb{R}_+$ is continuous, where $\mathcal{P}_n$ denotes the set of probability distributions on $n$ outcomes.

Given a deterministic random variable $X$, we have $H(X) = 0$.

Given $\alpha \in \mathbb{R}_+$, there exists a random variable $X$ such that $H(X) = \alpha$.

Given a probability distribution $P$ on $[n] = \{1, \ldots, n\}$, we have $H(P) \le \log_2 n$.

Example

Let $X, Y$ be two independent random variables with discrete uniform distribution over the set $\{0, 1\}$. Then

$H(X) = H(Y) = 1, \qquad I(X;Y) = 0.$

It follows that

$H(X,Y) = H(X) + H(Y) - I(X;Y) = 2.$

The entropic vector is thus

$v = (1, 1, 2)^T \in \Gamma_2^*.$
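
A quick numerical check of this example, using the illustrative `entropic_vector` helper sketched above:

```python
# Two independent fair bits: P(x, y) = 1/4 for every (x, y) in {0,1}^2.
pmf = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

print(entropic_vector(pmf, 2))
# {(0,): 1.0, (1,): 1.0, (0, 1): 2.0}, i.e. the vector (1, 1, 2)
```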

The Shannon inequality and $\Gamma_n$

The entropy satisfies the following properties:

1) $H(\emptyset) = 0$

2) $\alpha \subseteq \beta \implies H(X_\alpha) \le H(X_\beta)$ (monotonicity)

The Shannon inequality is

3) $H(X_\alpha) + H(X_\beta) \ge H(X_{\alpha \cup \beta}) + H(X_{\alpha \cap \beta})$

The region of vectors in $\mathbb{R}^{2^n - 1}$ that satisfy all nonnegative linear combinations of these inequalities is called $\Gamma_n$; the inequalities themselves are known as Shannon-type inequalities.
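
Membership in $\Gamma_n$ can be tested by brute force over all pairs of subsets. The sketch below is illustrative (the name `is_in_gamma_n` is hypothetical); it checks properties 1) through 3) for a vector given as a map from nonempty subsets to reals.

```python
from itertools import combinations

def is_in_gamma_n(h, n, tol=1e-9):
    """Check h against the Shannon-type inequalities.

    h maps each nonempty frozenset of {0, ..., n-1} to a real number;
    H(emptyset) = 0 is applied implicitly.
    """
    def H(s):
        return h[s] if s else 0.0

    subsets = [frozenset(c) for k in range(n + 1)
               for c in combinations(range(n), k)]
    for a in subsets:
        for b in subsets:
            # Monotonicity: a subset of b implies H(a) <= H(b).
            if a <= b and H(a) > H(b) + tol:
                return False
            # Submodularity: H(a) + H(b) >= H(a | b) + H(a & b).
            if H(a) + H(b) < H(a | b) + H(a & b) - tol:
                return False
    return True
```

For example, `is_in_gamma_n({frozenset({0}): 1.0, frozenset({1}): 1.0, frozenset({0, 1}): 2.0}, 2)` returns `True` for the vector $(1, 1, 2)$ above.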

The region $\Gamma_n^*$ has been studied in detail. Its closure coincides with the region cut out by the Shannon-type inequalities, i.e.

$\overline{\Gamma_n^*} = \Gamma_n,$

if and only if $n \in \{1, 2, 3\}$.

The case $n \ge 4$ is harder: the number of inequalities given by the monotonicity and submodularity properties increases with $n$. Nevertheless, the relationships among entropic vectors and polymatroids remain an object of study in information theory, and there are other ways to characterize them.

The most important results for the characterization of $\Gamma_n^*$ concern not the set itself but its topological closure $\overline{\Gamma_n^*}$, which is known to be a convex cone. Another interesting characterization is that $\overline{\Gamma_n^*} = \Gamma_n$ for $n \le 3$, where $\Gamma_n$ is the set of vectors satisfying the Shannon-type inequalities; in other words, for at most three random variables the closure of the set of entropic vectors is completely characterized by Shannon-type inequalities. For $n = 4$ this property fails, notably in connection with Ingleton's inequality.

$\Gamma_n^* \subsetneq \overline{\Gamma_n^*} \subsetneq \Gamma_n \quad \text{for } n \ge 4.$

The Matúš theorem

In 1998, Zhang and Yeung proved a new non-Shannon-type inequality:

$2I(X_3;X_4) \le I(X_1;X_2) + I(X_1;X_3,X_4) + 3I(X_3;X_4 \mid X_1) + I(X_3;X_4 \mid X_2)$
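
Every term above expands into joint entropies via $I(A;B \mid C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)$, so the inequality can be checked mechanically against any candidate entropic vector. The sketch below is illustrative (the name `zhang_yeung_holds` is hypothetical).

```python
def zhang_yeung_holds(H, tol=1e-9):
    """Check 2I(X3;X4) <= I(X1;X2) + I(X1;X3,X4) + 3I(X3;X4|X1) + I(X3;X4|X2).

    H maps each nonempty frozenset of {1, 2, 3, 4} to a joint entropy.
    """
    def I(a, b, c=frozenset()):
        # Conditional mutual information computed from joint entropies.
        a, b, c = frozenset(a), frozenset(b), frozenset(c)
        return H[a | c] + H[b | c] - H[a | b | c] - (H[c] if c else 0.0)

    lhs = 2 * I({3}, {4})
    rhs = I({1}, {2}) + I({1}, {3, 4}) + 3 * I({3}, {4}, {1}) + I({3}, {4}, {2})
    return lhs <= rhs + tol
```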

and in 2007, Matúš proved that $\overline{\Gamma_4^*}$ is not polyhedral.

Group-characterizable vectors and quasi-uniform distributions

One way to characterize $\Gamma_n^*$ is by looking at some special distributions.

Definition: A vector $h \in \mathbb{R}^{2^n - 1}$ is group-characterizable if there exist a finite group $G$ and subgroups $G_1, G_2, \ldots, G_n$ such that for every nonempty $\alpha \subseteq \{1, \ldots, n\}$

$h(\alpha) = \log_2 \frac{|G|}{|G_\alpha|}, \qquad \text{where } G_\alpha = \bigcap_{i \in \alpha} G_i,$

with the convention $h(\emptyset) = 0$.

Definition: $\gamma_n$ is the set of all group-characterizable vectors of order $n$. These vectors allow a better description of the set $\Gamma_n^*$, via the following theorem.

Theorem: $\gamma_n \subseteq \Gamma_n^*$
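
As a worked instance, taking $G = \mathbb{Z}_2 \times \mathbb{Z}_2$ with its two coordinate subgroups recovers the vector $(1, 1, 2)$ from the earlier example. The snippet below is an illustrative sketch (the name `group_char_vector` is hypothetical).

```python
from itertools import combinations
import math

def group_char_vector(G, subgroups):
    """h(alpha) = log2(|G| / |G_alpha|), with G_alpha the intersection of G_i, i in alpha."""
    n = len(subgroups)
    h = {}
    for k in range(1, n + 1):
        for alpha in combinations(range(n), k):
            G_alpha = set(G)
            for i in alpha:
                G_alpha &= set(subgroups[i])
            h[alpha] = math.log2(len(G) / len(G_alpha))
    return h

# G = Z_2 x Z_2; G1 fixes the first coordinate, G2 fixes the second.
G = [(a, b) for a in (0, 1) for b in (0, 1)]
G1 = [(0, 0), (0, 1)]   # subgroup {0} x Z_2
G2 = [(0, 0), (1, 0)]   # subgroup Z_2 x {0}
print(group_char_vector(G, [G1, G2]))  # {(0,): 1.0, (1,): 1.0, (0, 1): 2.0}
```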

Open problem

Given a vector $v \in \mathbb{R}^{2^n - 1}$, is it possible to decide whether there exist $n$ random variables whose joint entropies are given by $v$? For $n = 2, 3$ the problem has been solved, but for $n \ge 4$ it remains open. Defining the set of all such vectors $v \in \mathbb{R}^{2^n - 1}$ that can be constructed from a set of $n$ random variables as $\Gamma_n^*$, a complete characterization of this region remains an open problem.
