Directed information

Directed information, I(X^n → Y^n), is an information-theoretic measure that quantifies the amount of information flowing from the process X^n to the process Y^n, where X^n denotes the vector (X_1, X_2, ..., X_n) and Y^n denotes (Y_1, Y_2, ..., Y_n). The term "directed information" was coined by James Massey, who defined it as

I(X^n → Y^n) = ∑_{i=1}^{n} I(X^i; Y_i | Y^{i-1}),

where I(X^i; Y_i | Y^{i-1}) is the conditional mutual information between the input prefix X^i = (X_1, ..., X_i) and the current output Y_i, given the past outputs Y^{i-1} = (Y_1, ..., Y_{i-1}).

Note that for n = 1, directed information reduces to the mutual information I(X; Y). Directed information has many applications in problems where causality plays an important role, such as the capacity of channels with feedback, the capacity of discrete memoryless networks with feedback, gambling with causal side information, and compression with causal side information.
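As a concrete illustration (a sketch not taken from the article), the defining sum of conditional mutual informations can be evaluated directly from a joint distribution over (X_1, ..., X_n, Y_1, ..., Y_n). The function names and the toy noiseless "copy" channel below are illustrative choices, not standard library APIs.

```python
from math import log2

def marginal(p, idxs):
    """Marginal pmf of the coordinates in idxs, from a joint pmf dict p."""
    m = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in idxs)
        m[key] = m.get(key, 0.0) + prob
    return m

def cond_mutual_info(p, A, B, C):
    """I(A; B | C) in bits; A, B, C are lists of coordinate indices."""
    pabc = marginal(p, A + B + C)
    pac = marginal(p, A + C)
    pbc = marginal(p, B + C)
    pc = marginal(p, C)  # C empty -> {(): 1.0}, so conditioning is a no-op
    total = 0.0
    for outcome, prob in p.items():
        a = tuple(outcome[i] for i in A)
        b = tuple(outcome[i] for i in B)
        c = tuple(outcome[i] for i in C)
        total += prob * log2(pabc[a + b + c] * pc[c] / (pac[a + c] * pbc[b + c]))
    return total

def directed_information(p, n):
    """I(X^n -> Y^n) = sum_i I(X^i; Y_i | Y^{i-1}).

    p maps tuples (x1, ..., xn, y1, ..., yn) to probabilities.
    """
    total = 0.0
    for i in range(1, n + 1):
        A = list(range(i))               # X^i = (X_1, ..., X_i)
        B = [n + i - 1]                  # Y_i
        C = list(range(n, n + i - 1))    # Y^{i-1} = (Y_1, ..., Y_{i-1})
        total += cond_mutual_info(p, A, B, C)
    return total

# Toy example: Y_i = X_i over a noiseless binary copy channel, n = 2,
# with X_1, X_2 i.i.d. fair bits. Each output reveals one fresh bit.
p2 = {(x1, x2, x1, x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}
print(directed_information(p2, 2))  # 2.0 bits

# For n = 1 the directed information reduces to the mutual information:
p1 = {(x, x): 0.5 for x in (0, 1)}
print(directed_information(p1, 1))  # 1.0 bit = I(X; Y) for the copy channel
```

For the copy channel every term I(X^i; Y_i | Y^{i-1}) equals H(Y_i | Y^{i-1}) = 1 bit, so the n = 2 total is 2 bits, and the n = 1 case matches the mutual information of a noiseless binary channel.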
