In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in a stochastic process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of the process X_k divided by n, as n tends to infinity:

H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)
when the limit exists. An alternative, related quantity is the conditional entropy of the latest member of the process given all of the preceding ones:

H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_1)
For strongly stationary stochastic processes, the two limits coincide: H(X) = H'(X).
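The following is a minimal numerical sketch (not part of the original text) comparing the two quantities above, H(X_1, ..., X_n)/n and H(X_n | X_{n-1}, ..., X_1), by exact enumeration of all length-n sample paths. A two-state Markov chain is used only because it is the simplest non-trivial example; the transition matrix P, the initial distribution mu0, and the helper name joint_entropy are illustrative assumptions.

```python
import itertools
import numpy as np

P = np.array([[0.9, 0.1],          # assumed transition matrix (illustrative)
              [0.4, 0.6]])
mu0 = np.array([0.5, 0.5])         # assumed (non-stationary) initial distribution

def joint_entropy(n):
    """Entropy (in bits) of the joint distribution of X_1, ..., X_n."""
    h = 0.0
    for path in itertools.product(range(2), repeat=n):
        p = mu0[path[0]]
        for a, b in zip(path, path[1:]):
            p *= P[a, b]                 # probability of this sample path
        if p > 0:
            h -= p * np.log2(p)
    return h

for n in (2, 4, 8, 12):
    Hn = joint_entropy(n)
    # Chain rule: H(X_n | X_{n-1}, ..., X_1) = H(X_1..X_n) - H(X_1..X_{n-1})
    cond = Hn - joint_entropy(n - 1)
    print(f"n={n:2d}  H(X_1..X_n)/n = {Hn/n:.4f}   H(X_n|past) = {cond:.4f}")
```

As n grows, both printed columns approach the same value, the entropy rate of the chain, regardless of the initial distribution chosen above.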
Entropy rates for Markov chains
Since a stochastic process defined by a Markov chain that is irreducible, aperiodic and positive recurrent has a stationary distribution, the entropy rate is independent of the initial distribution.
For example, for such a Markov chain Y_k defined on a countable number of states, given the transition matrix P_{ij}, H(Y) is given by:

H(Y) = -\sum_{ij} \mu_i P_{ij} \log P_{ij}

where μ_i is the asymptotic (stationary) distribution of the chain.
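A small sketch of this formula (an assumed implementation, not from the original text): the stationary distribution μ is obtained as the left eigenvector of P for eigenvalue 1, and the entropy rate is the μ-weighted average of the row entropies of P. The function name markov_entropy_rate and the example matrix are illustrative.

```python
import numpy as np

def markov_entropy_rate(P):
    """Entropy rate in bits of a Markov chain with transition matrix P."""
    evals, evecs = np.linalg.eig(P.T)                 # left eigenvectors of P
    mu = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    mu = mu / mu.sum()                                # normalize to a distribution
    logP = np.log2(np.where(P > 0, P, 1.0))           # zero entries contribute 0
    return -np.sum(mu[:, None] * P * logP)            # -sum_ij mu_i P_ij log P_ij

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])                            # illustrative transition matrix
print(markov_entropy_rate(P))
```

This presupposes the conditions in the paragraph above (irreducible, aperiodic, positive recurrent), which guarantee that the eigenvector for eigenvalue 1 is the unique stationary distribution.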
A simple consequence of the definition of the entropy rate is that an i.i.d. stochastic process has an entropy rate equal to the entropy of any individual member of the process.
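To spell out the step (a standard calculation, not given explicitly above): independence makes the joint entropy additive, and identical distribution makes every term equal, so

H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n) = \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} H(X_i) = \lim_{n \to \infty} \frac{n H(X_1)}{n} = H(X_1).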