In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors.
Formal statement
Pinsker's inequality states that, if $P$ and $Q$ are two probability distributions on a measurable space $(X, \Sigma)$, then

\[ \delta(P, Q) \le \sqrt{\tfrac{1}{2} D_{\mathrm{KL}}(P \parallel Q)}, \]

where

\[ \delta(P, Q) = \sup \{\, |P(A) - Q(A)| : A \in \Sigma \,\} \]

is the total variation distance (or statistical distance) between $P$ and $Q$, and

\[ D_{\mathrm{KL}}(P \parallel Q) = \operatorname{E}_P\!\left( \log \frac{\mathrm{d}P}{\mathrm{d}Q} \right) = \int_X \log \frac{\mathrm{d}P}{\mathrm{d}Q} \, \mathrm{d}P \]

is the Kullback–Leibler divergence in nats. When the sample space $X$ is a countable set, the Kullback–Leibler divergence takes the form

\[ D_{\mathrm{KL}}(P \parallel Q) = \sum_{i \in X} P(i) \log \frac{P(i)}{Q(i)} . \]
Note that in terms of the total variation norm $\|P - Q\|$ of the signed measure $P - Q$, Pinsker's inequality differs from the one given above by a factor of two:

\[ \|P - Q\| \le \sqrt{2\, D_{\mathrm{KL}}(P \parallel Q)} . \]
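The inequality can be checked numerically on a countable sample space. The sketch below, using hypothetical distributions on a three-point space, computes the total variation distance as half the $\ell_1$ distance and the Kullback–Leibler divergence in nats, then verifies the bound:

```python
import math

# Two hypothetical probability distributions on a 3-point sample space.
P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]

# Total variation distance: half the L1 distance on a countable space.
tv = 0.5 * sum(abs(p - q) for p, q in zip(P, Q))

# Kullback-Leibler divergence in nats (natural logarithm);
# terms with P(i) = 0 contribute zero by convention.
kl = sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0)

# Pinsker's inequality: delta(P, Q) <= sqrt(D_KL(P||Q) / 2).
print(tv, math.sqrt(kl / 2))
assert tv <= math.sqrt(kl / 2)
```

Here `tv = 0.1`, while the Pinsker bound evaluates to roughly `0.112`, so the inequality holds with little slack for these nearby distributions.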
The proof of Pinsker's inequality uses the partition inequality for f-divergences.
History
Pinsker first proved the inequality with a worse constant. The inequality in the above form was proved independently by Kullback, Csiszár, and Kemperman.
Inverse problem
A precise inverse of the inequality cannot hold: for every $\varepsilon > 0$, there exist distributions $P$ and $Q$ with $\delta(P, Q) \le \varepsilon$ but $D_{\mathrm{KL}}(P \parallel Q) = \infty$, since the Kullback–Leibler divergence is infinite whenever $Q$ assigns zero probability to an event of positive $P$-probability.
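A concrete instance (an illustration chosen here, not taken from the original text) is a Bernoulli distribution compared against a point mass:

```latex
% Take X = {0, 1}, P = Bernoulli(eps), and Q the point mass at 0.
\[
  P(1) = \varepsilon, \quad P(0) = 1 - \varepsilon, \qquad
  Q(0) = 1, \quad Q(1) = 0 .
\]
% The total variation distance is small, but the divergence is infinite,
% because P puts mass eps on the point where Q vanishes.
\[
  \delta(P, Q) = |P(1) - Q(1)| = \varepsilon,
  \qquad
  D_{\mathrm{KL}}(P \parallel Q)
    \;\ge\; \varepsilon \log \frac{\varepsilon}{Q(1)} = +\infty .
\]
```

Thus no function of $\delta(P, Q)$ alone can upper-bound $D_{\mathrm{KL}}(P \parallel Q)$.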