Information source (mathematics)

In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution.
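For concreteness, the sketch below models a simple information source: an i.i.d. (and therefore stationary) sequence of symbols drawn from a small finite alphabet Γ. The alphabet {a, b, c}, the probabilities, and the Python names are illustrative assumptions, not part of the definition.

    import random

    # A minimal sketch of an information source: an i.i.d. (hence stationary)
    # sequence of symbols drawn from a finite alphabet Gamma with a fixed
    # distribution. Alphabet and probabilities here are illustrative.
    GAMMA = ["a", "b", "c"]
    PROBS = [0.5, 0.3, 0.2]

    def source(n, seed=0):
        """Emit the first n symbols X_0, ..., X_{n-1} of the source."""
        rng = random.Random(seed)
        return [rng.choices(GAMMA, weights=PROBS)[0] for _ in range(n)]

    print(source(10))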

The uncertainty, or entropy rate, of an information source is defined as

H\{\mathbf{X}\} = \lim_{n \to \infty} H(X_n \mid X_0, X_1, \ldots, X_{n-1})

where

X_0, X_1, \ldots, X_n

is the sequence of random variables defining the information source, and

H(X_n \mid X_0, X_1, \ldots, X_{n-1})

is the conditional entropy of X_n given the preceding variables X_0, X_1, \ldots, X_{n-1}. Equivalently, one has

H\{\mathbf{X}\} = \lim_{n \to \infty} \frac{H(X_0, X_1, \ldots, X_{n-1}, X_n)}{n + 1}.
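As a hedged illustration of these limits, the sketch below computes the entropy rate of a stationary two-state Markov source, for which the conditional entropy H(X_n | X_0, ..., X_{n-1}) reduces to H(X_n | X_{n-1}) and the limit equals the stationary average of the per-row transition entropies. The transition matrix and the use of NumPy are assumptions chosen for illustration.

    import numpy as np

    # A minimal sketch for a stationary two-state Markov source. The
    # transition matrix P below is illustrative.
    P = np.array([[0.9, 0.1],
                  [0.4, 0.6]])

    # Stationary distribution pi: the eigenvector of P^T with eigenvalue 1,
    # normalized to sum to 1.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.isclose(evals, 1.0)][:, 0])
    pi = pi / pi.sum()

    def row_entropy(row):
        """Shannon entropy (in bits) of one row of the transition matrix."""
        return -sum(p * np.log2(p) for p in row if p > 0)

    # Entropy rate H{X} = sum_i pi_i * H(row i of P), in bits per symbol.
    entropy_rate = sum(pi[i] * row_entropy(P[i]) for i in range(len(pi)))
    print(entropy_rate)

For an i.i.d. source the same computation collapses further: every conditional entropy equals the entropy of a single symbol, so both formulas above give that value directly.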
