Generalized relative entropy ($\epsilon$-relative entropy) is a measure of dissimilarity between two quantum states. It is a "one-shot" analogue of quantum relative entropy and shares many properties of the latter quantity.
In the study of quantum information theory, we typically assume that information processing tasks are repeated multiple times, independently. The corresponding information-theoretic notions are therefore defined in the asymptotic limit. The quintessential entropy measure, von Neumann entropy, is one such notion. In contrast, the study of one-shot quantum information theory is concerned with information processing when a task is conducted only once. New entropic measures emerge in this scenario, as traditional notions cease to give a precise characterization of resource requirements.
In the asymptotic scenario, relative entropy acts as a parent quantity for other measures besides being an important measure itself. Similarly, $\epsilon$-relative entropy functions as a parent quantity for other measures in the one-shot scenario.
Definition
To motivate the definition of the $\epsilon$-relative entropy $D^{\epsilon}(\rho\|\sigma)$, imagine an experimenter performing a hypothesis test: handed a single copy of a state promised to be either $\rho$ or $\sigma$, the experimenter applies a two-outcome measurement $\{Q, I-Q\}$ with $0 \le Q \le I$ and guesses $\rho$ on the outcome associated with $Q$. The test is required to identify $\rho$ with probability at least $\epsilon$, i.e. $\operatorname{Tr}(Q\rho) \ge \epsilon$, and subject to this constraint one asks how small the probability $\operatorname{Tr}(Q\sigma)$ of mistaking $\sigma$ for $\rho$ can be made.
For $\epsilon \in (0,1)$, the $\epsilon$-relative entropy between two quantum states $\rho$ and $\sigma$ is defined as
$$D^{\epsilon}(\rho\|\sigma) \;=\; -\log \frac{1}{\epsilon} \min\Big\{ \operatorname{Tr}(Q\sigma) \;:\; 0 \le Q \le I,\ \operatorname{Tr}(Q\rho) \ge \epsilon \Big\}.$$
From the definition, it is clear that $D^{\epsilon}(\rho\|\sigma) \ge 0$: the operator $Q = \epsilon I$ is always feasible and gives $\frac{1}{\epsilon}\operatorname{Tr}(Q\sigma) = 1$, so the minimum in the definition is at most $\epsilon$.
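When $\rho$ and $\sigma$ commute they can be diagonalized simultaneously, and the optimization above reduces to a linear program over classical tests $0 \le t_i \le 1$, solvable exactly by the Neyman–Pearson rule. The following sketch computes $D^{\epsilon}$ in this commuting case; the function name and the representation of states as probability vectors are illustrative choices, not part of the definition.

```python
import math

def eps_relative_entropy(p, q, eps):
    """D^eps(p||q) (base-2 logarithm) for commuting states, represented by
    probability vectors p and q. Solves the linear program
        min { sum_i q_i t_i : sum_i p_i t_i >= eps, 0 <= t_i <= 1 }
    by accepting outcomes in order of decreasing likelihood ratio p_i/q_i,
    fractionally on the boundary (the Neyman-Pearson rule).
    Assumes q_i > 0 wherever p_i > 0; otherwise the value may be infinite."""
    order = sorted(range(len(p)),
                   key=lambda i: p[i] / q[i] if q[i] > 0 else math.inf,
                   reverse=True)
    need, cost = eps, 0.0
    for i in order:
        if need <= 0:
            break
        if p[i] == 0:
            continue
        t = min(1.0, need / p[i])   # accept this outcome, fractionally if needed
        cost += t * q[i]
        need -= t * p[i]
    return -math.log2(cost / eps)
```

For example, for $p=(1,0)$, $q=(\tfrac12,\tfrac12)$ and $\epsilon=\tfrac12$ the optimal test accepts only the first outcome with weight $\tfrac12$, giving $D^{1/2}=1$ bit.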
Relationship to the trace distance
Suppose the trace distance between two density operators $\rho$ and $\sigma$ is
$$\delta = \tfrac{1}{2}\|\rho-\sigma\|_1 = \max_{0 \le Q \le I} \operatorname{Tr}\big(Q(\rho-\sigma)\big).$$
For $0 < \epsilon < 1$, the following inequality holds:
$$\log\frac{1}{1-(1-\epsilon)\delta} \;\le\; D^{\epsilon}(\rho\|\sigma) \;\le\; \log\frac{1}{1-\delta/\epsilon}, \qquad (a)$$
where the upper bound is understood to be $+\infty$ when $\delta \ge \epsilon$.
In particular, this implies the following analogue of the Pinsker inequality:
$$D^{\epsilon}(\rho\|\sigma) \;\ge\; (1-\epsilon)\,\delta. \qquad (b)$$
Furthermore, the proposition implies that for any $\epsilon \in (0,1)$, $D^{\epsilon}(\rho\|\sigma) = 0$ if and only if $\rho = \sigma$: if $\rho = \sigma$ then $\delta = 0$ and the upper bound in (a) vanishes, while conversely $D^{\epsilon}(\rho\|\sigma) = 0$ forces $\delta = 0$ by (b).
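As a concrete check of these bounds (a worked example, not from the original text), take the commuting qubit states $\rho = \operatorname{diag}(3/4,\,1/4)$ and $\sigma = \operatorname{diag}(1/2,\,1/2)$ with $\epsilon = 1/2$, so that $\delta = 1/4$. The optimal test accepts the first outcome with weight $t = 2/3$, so that $\operatorname{Tr}(Q\rho) = \tfrac{3}{4}\cdot\tfrac{2}{3} = \tfrac12 = \epsilon$ and $\min \operatorname{Tr}(Q\sigma) = \tfrac12\cdot\tfrac23 = \tfrac13$:

```latex
D^{1/2}(\rho\|\sigma) = -\log_2\frac{1/3}{1/2} = \log_2\frac{3}{2} \approx 0.585,
\qquad
\log_2\frac{1}{1-\frac12\cdot\frac14} = \log_2\frac{8}{7} \approx 0.193,
\qquad
\log_2\frac{1}{1-\frac{1/4}{1/2}} = \log_2 2 = 1,
```

so indeed $0.193 \le 0.585 \le 1$, consistent with the sandwich (a).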
Proof of inequality (a)
Upper bound: The trace distance can be written as
$$\delta = \max_{0 \le Q \le I} \operatorname{Tr}\big(Q(\rho-\sigma)\big).$$
This maximum is achieved when $Q$ is the orthogonal projector onto the positive eigenspace of $\rho - \sigma$. Consequently, $\operatorname{Tr}\big(Q(\rho-\sigma)\big) \le \delta$ for every operator $0 \le Q \le I$,
so that if $\operatorname{Tr}(Q\rho) \ge \epsilon$, then
$$\operatorname{Tr}(Q\sigma) \ge \operatorname{Tr}(Q\rho) - \delta \ge \epsilon - \delta.$$
From the definition of the $\epsilon$-relative entropy,
$$2^{-D^{\epsilon}(\rho\|\sigma)} = \frac{1}{\epsilon}\min \operatorname{Tr}(Q\sigma) \ge \frac{\epsilon-\delta}{\epsilon} = 1 - \frac{\delta}{\epsilon},$$
and hence $D^{\epsilon}(\rho\|\sigma) \le \log\frac{1}{1-\delta/\epsilon}$.
Lower bound: Let $\rho - \sigma = P - N$ be the decomposition of $\rho - \sigma$ into its positive part $P$ and negative part $N$, and let $\Pi$ be the orthogonal projector onto the support of $P$,
where we set $p := \operatorname{Tr}(\Pi\rho)$.
This means
$$\operatorname{Tr}\big(\Pi(\rho-\sigma)\big) = \operatorname{Tr}(P) = \delta,$$
and thus $\operatorname{Tr}(\Pi\sigma) = p - \delta$; in particular $p \ge \delta$.
Moreover, a feasible test can be built from $\Pi$ in both possible cases. If $p \ge \epsilon$, the operator $Q = (\epsilon/p)\Pi$ satisfies $0 \le Q \le I$ and $\operatorname{Tr}(Q\rho) = \epsilon$, and
$$\frac{1}{\epsilon}\operatorname{Tr}(Q\sigma) = \frac{p-\delta}{p} = 1 - \frac{\delta}{p} \le 1 - \delta.$$
If $p < \epsilon$, take instead $Q = \Pi + c\,(I-\Pi)$ with $c = \frac{\epsilon - p}{1-p}$, so that again $\operatorname{Tr}(Q\rho) = \epsilon$; a direct computation gives
$$\frac{1}{\epsilon}\operatorname{Tr}(Q\sigma) = 1 - \frac{(1-\epsilon)\,\delta}{\epsilon\,(1-p)} \le 1 - \frac{(1-\epsilon)\,\delta}{\epsilon}.$$
Using $\epsilon \le 1$, both cases yield $\frac{1}{\epsilon}\operatorname{Tr}(Q\sigma) \le 1 - (1-\epsilon)\delta$.
Hence
$$D^{\epsilon}(\rho\|\sigma) \ge \log\frac{1}{1-(1-\epsilon)\delta}.$$
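As a numerical sanity check of the lower-bound construction, restricted to commuting states (where the projector $\Pi$ becomes the indicator of the outcomes with $p_i > q_i$), the sketch below verifies feasibility and the bound $\frac{1}{\epsilon}\operatorname{Tr}(Q\sigma) \le 1 - (1-\epsilon)\delta$ on random examples. The helper name `lower_bound_test` is illustrative.

```python
import random

def lower_bound_test(p, q, eps):
    """Classical analogue of the lower-bound construction: start from the
    indicator of the outcomes where p_i > q_i (the 'positive eigenspace'),
    then rescale it, or pad it with a constant, so that sum_i p_i t_i = eps."""
    S = {i for i in range(len(p)) if p[i] > q[i]}
    pm = sum(p[i] for i in S)              # p = Tr(Pi rho)
    delta = sum(p[i] - q[i] for i in S)    # trace distance
    if pm >= eps:
        t = [eps / pm if i in S else 0.0 for i in range(len(p))]
    else:
        c = (eps - pm) / (1.0 - pm)
        t = [1.0 if i in S else c for i in range(len(p))]
    return t, delta

random.seed(0)
norm = lambda v: [x / sum(v) for x in v]
for _ in range(1000):
    p = norm([random.random() for _ in range(4)])
    q = norm([random.random() for _ in range(4)])
    eps = random.uniform(0.05, 0.95)
    t, delta = lower_bound_test(p, q, eps)
    assert all(0.0 <= ti <= 1.0 for ti in t)                       # valid test
    assert abs(sum(pi * ti for pi, ti in zip(p, t)) - eps) < 1e-9  # feasible
    err = sum(qi * ti for qi, ti in zip(q, t))
    assert err / eps <= 1.0 - (1.0 - eps) * delta + 1e-9           # proof's bound
```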
Proof of inequality (b)
To derive this Pinsker-like inequality, observe that $\log\frac{1}{1-x} \ge x$ for all $x \in [0,1)$, since $2^{-x} \ge 1-x$. Applying this to the lower bound in (a) with $x = (1-\epsilon)\delta$ gives $D^{\epsilon}(\rho\|\sigma) \ge (1-\epsilon)\delta$, which is inequality (b).
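The elementary inequality invoked here can be verified in one line: since $2^{-x}$ is convex, it lies above its tangent at $x = 0$,

```latex
2^{-x} \;\ge\; 1 - x\ln 2 \;\ge\; 1 - x \qquad (0 \le x < 1),
```

and taking base-2 logarithms of $1 - x \le 2^{-x}$ gives $\log\frac{1}{1-x} \ge x$.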
Alternative proof of the Data Processing inequality
A fundamental property of von Neumann entropy is strong subadditivity. Let $S(\sigma) = -\operatorname{Tr}(\sigma\log\sigma)$ denote the von Neumann entropy of the state $\sigma$, and let $\rho_{ABC}$ be a density operator on the tensor product Hilbert space $\mathcal{H}_A \otimes \mathcal{H}_B \otimes \mathcal{H}_C$. Strong subadditivity states that
$$S(\rho_{ABC}) + S(\rho_B) \le S(\rho_{AB}) + S(\rho_{BC}),$$
where $\rho_{AB}$, $\rho_{BC}$, $\rho_B$ denote the reduced density operators of $\rho_{ABC}$. Strong subadditivity is equivalent to the data processing inequality for the quantum relative entropy $D(\rho\|\sigma) = \operatorname{Tr}\big(\rho(\log\rho - \log\sigma)\big)$:
$$D(\rho\|\sigma) \ge D\big(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\big)$$
for every CPTP map $\mathcal{E}$. The $\epsilon$-relative entropy provides an alternative route to this inequality.
It is readily seen that the $\epsilon$-relative entropy obeys monotonicity under data processing:
$$D^{\epsilon}(\rho\|\sigma) \ge D^{\epsilon}\big(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\big)$$
for any CPTP map $\mathcal{E}$. Indeed, if $Q$ is feasible for the pair $(\mathcal{E}(\rho), \mathcal{E}(\sigma))$, then the adjoint map gives an operator $\mathcal{E}^{\dagger}(Q)$ with $0 \le \mathcal{E}^{\dagger}(Q) \le I$ (because $\mathcal{E}^{\dagger}$ is positive and unital) and $\operatorname{Tr}\big(\mathcal{E}^{\dagger}(Q)\rho\big) = \operatorname{Tr}\big(Q\,\mathcal{E}(\rho)\big) \ge \epsilon$, so $\mathcal{E}^{\dagger}(Q)$ is feasible for $(\rho,\sigma)$ and achieves the same value $\operatorname{Tr}\big(\mathcal{E}^{\dagger}(Q)\sigma\big) = \operatorname{Tr}\big(Q\,\mathcal{E}(\sigma)\big)$.
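In the commuting (classical) case this argument can be checked numerically: a CPTP map becomes a column-stochastic matrix $W$, and the adjoint $\mathcal{E}^{\dagger}$ becomes the pullback $t \mapsto W^{T}t$ of a test on the output. The sketch below (function names are illustrative) confirms that the pullback is again a valid test and preserves both acceptance probabilities, which is exactly why the minimum defining $D^{\epsilon}$ can only grow under the channel.

```python
import random

def apply_channel(W, p):
    """Column-stochastic matrix W (W[y][x] = Pr(y|x)) acting on a probability vector p."""
    return [sum(W[y][x] * p[x] for x in range(len(p))) for y in range(len(W))]

def pullback(W, t):
    """Classical adjoint (positive, unital) map: t'(x) = sum_y W[y][x] t(y)."""
    return [sum(W[y][x] * t[y] for y in range(len(W))) for x in range(len(W[0]))]

random.seed(1)
norm = lambda v: [x / sum(v) for x in v]
n = 4
for _ in range(500):
    p = norm([random.random() for _ in range(n)])
    q = norm([random.random() for _ in range(n)])
    cols = [norm([random.random() for _ in range(n)]) for _ in range(n)]
    W = [[cols[x][y] for x in range(n)] for y in range(n)]   # random channel
    t = [random.random() for _ in range(n)]                  # a test on the output
    tp = pullback(W, t)
    assert all(0.0 <= x <= 1.0 + 1e-12 for x in tp)          # pullback is a test
    # acceptance probability on p and value on q are both preserved
    assert abs(sum(a * b for a, b in zip(t, apply_channel(W, p))) -
               sum(a * b for a, b in zip(tp, p))) < 1e-9
    assert abs(sum(a * b for a, b in zip(t, apply_channel(W, q))) -
               sum(a * b for a, b in zip(tp, q))) < 1e-9
```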
By the quantum analogue of the Stein lemma, for every $0 < \epsilon < 1$,
$$D(\rho\|\sigma) = \lim_{n\to\infty} \frac{1}{n}\, D^{\epsilon}\big(\rho^{\otimes n}\,\|\,\sigma^{\otimes n}\big) = \lim_{n\to\infty} \frac{1}{n}\left(-\log\frac{1}{\epsilon}\min \operatorname{Tr}\big(Q\sigma^{\otimes n}\big)\right),$$
where the minimum is taken over all operators $0 \le Q \le I$ on $\mathcal{H}^{\otimes n}$ such that $\operatorname{Tr}\big(Q\rho^{\otimes n}\big) \ge \epsilon$.
Applying the data processing inequality for the $\epsilon$-relative entropy to the states $\rho^{\otimes n}$ and $\sigma^{\otimes n}$ with the CPTP map $\mathcal{E}^{\otimes n}$, we obtain
$$D^{\epsilon}\big(\rho^{\otimes n}\,\|\,\sigma^{\otimes n}\big) \ge D^{\epsilon}\big(\mathcal{E}(\rho)^{\otimes n}\,\|\,\mathcal{E}(\sigma)^{\otimes n}\big).$$
Dividing by $n$ and taking the limit $n\to\infty$ on both sides, the quantum Stein lemma yields the data processing inequality for the quantum relative entropy,
$$D(\rho\|\sigma) \ge D\big(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\big).$$
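For commuting states the convergence in the Stein lemma can be observed directly. For a pair of Bernoulli distributions the likelihood ratio on $n$ copies depends only on the number of "heads", so the Neyman–Pearson optimization reduces to $n+1$ aggregated outcomes. This is a sketch under that classical restriction; the function name is an illustrative choice.

```python
import math

def d_eps_rate(p, q, n, eps):
    """(1/n) * D^eps(P^n || Q^n) for P = Bernoulli(p), Q = Bernoulli(q),
    with 0 < q < 1. Outcomes with k heads share the likelihood ratio
    (p/q)^k ((1-p)/(1-q))^(n-k), so the Neyman-Pearson test only needs
    the n+1 aggregated masses."""
    masses = [(math.comb(n, k) * p**k * (1 - p)**(n - k),
               math.comb(n, k) * q**k * (1 - q)**(n - k)) for k in range(n + 1)]
    masses.sort(key=lambda m: m[0] / m[1], reverse=True)  # largest ratio first
    need, cost = eps, 0.0
    for mp, mq in masses:
        t = min(1.0, need / mp)
        cost += t * mq
        need -= t * mp
        if need <= 0:
            break
    return -math.log2(cost / eps) / n

kl = 0.7 * math.log2(0.7 / 0.5) + 0.3 * math.log2(0.3 / 0.5)  # D(P||Q) ~ 0.119
for n in (1, 10, 40):
    print(n, d_eps_rate(0.7, 0.5, n, 0.5))  # approaches kl as n grows
```

At $n = 1$ the value is simply $\log_2(0.7/0.5)$, well above $D(P\|Q)$; as $n$ grows, the normalized one-shot quantity approaches the relative entropy, as the Stein lemma predicts.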