Lecture on quantum contextuality
Quantum contextuality is a foundational concept in quantum theory. It means that in any theory that attempts to reproduce quantum mechanics deterministically, the result of measuring a quantum observable must depend on the experimental context in which the measurement is performed, in particular on which compatible (commuting) observables are measured alongside it.
Contents
- Algorithmic approach to quantum contextuality by Prof. Dagomir Kaszlikowski
- Gleason's theorem
- Kochen and Specker
- Graph theory and optimization
- References
Algorithmic approach to quantum contextuality by Prof. Dagomir Kaszlikowski
Gleason's theorem
Andrew Gleason proved a theorem demonstrating that quantum contextuality can exist only in Hilbert spaces of dimension greater than two. A related point had already been made by Niels Bohr, whose paper notes that EPR-like paradoxes occur in quantum systems without any need for entangled or composite systems.
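For reference, a standard statement of the theorem follows; the notation (a frame function f and a density operator ρ) is supplied here for clarity and is not fixed by the lecture itself.

```latex
% Standard statement of Gleason's theorem for dimension >= 3.
% The amsthm theorem environment is assumed for typesetting.
\begin{theorem}[Gleason, 1957]
Let $\mathcal{H}$ be a separable Hilbert space with $\dim\mathcal{H} \ge 3$,
and let $f$ assign to every projector $P$ on $\mathcal{H}$ a number
$f(P) \in [0,1]$ such that $f(\mathbb{I}) = 1$ for the identity $\mathbb{I}$
and $f\bigl(\sum_i P_i\bigr) = \sum_i f(P_i)$ for every countable family of
mutually orthogonal projectors $\{P_i\}$. Then there exists a density
operator $\rho$ such that
\[
  f(P) = \operatorname{Tr}(\rho P) \qquad \text{for every projector } P .
\]
\end{theorem}
```

The link to contextuality is that a deterministic noncontextual value assignment would be exactly such an f taking only the values 0 and 1; since Tr(ρP) varies continuously as a rank-one projector P is rotated, no two-valued f can exist, so in dimension three or more the values of observables cannot all be fixed independently of context.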
Kochen and Specker
Later, Simon B. Kochen and Ernst Specker, and independently John Bell, constructed proofs that quantum mechanics is contextual for systems of dimension three and greater. In the same paper, Kochen and Specker also constructed an explicitly noncontextual hidden-variable model for the two-dimensional qubit case, thereby completing the characterisation of the dimensions in which quantum systems can exhibit contextual behaviour.
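The original Kochen-Specker construction uses 117 directions in dimension three; a much shorter state-independent argument exists in dimension four, the Mermin-Peres magic square. The sketch below checks its algebra in Python with NumPy; the square itself is the standard one from the later literature, not the construction in the Kochen-Specker paper.

```python
import numpy as np

# Pauli matrices and the 2x2 identity.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
kron = np.kron

# Mermin-Peres magic square: nine two-qubit observables arranged so
# that the three in each row and each column mutually commute.
square = [
    [kron(I2, Z), kron(Z, I2), kron(Z, Z)],
    [kron(X, I2), kron(I2, X), kron(X, X)],
    [kron(X, Z),  kron(Z, X),  kron(Y, Y)],
]
I4 = np.eye(4, dtype=complex)

# Sanity check: observables sharing a row or column commute, so each
# row and each column is a jointly measurable context.
assert np.allclose(square[2][0] @ square[2][1], square[2][1] @ square[2][0])

for r in range(3):
    prod = square[r][0] @ square[r][1] @ square[r][2]
    print(f"row {r} multiplies to +I: {np.allclose(prod, I4)}")

for c in range(3):
    prod = square[0][c] @ square[1][c] @ square[2][c]
    sign = "+" if np.allclose(prod, I4) else "-"
    print(f"col {c} multiplies to {sign}I")
```

Every row multiplies to +I, and so do the first two columns, but the last column multiplies to -I. A noncontextual assignment of a fixed value of +1 or -1 to each of the nine observables would make the product of all nine values equal +1 when computed row by row and -1 when computed column by column, which is impossible; hence no such assignment exists.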
Graph theory and optimization
Adán Cabello, Simone Severini, and Andreas Winter introduced a general graph-theoretic framework for studying the contextuality of different physical theories. This framework made it possible to show that quantum contextuality is closely related to the Lovász number, an important parameter in optimization and information theory. Using similar techniques, Mark Howard, Joel Wallman, Victor Veitch, and Joseph Emerson then showed that the Lovász number plays a key role in determining the power of quantum computing.
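As a concrete illustration of this connection, the Lovász number of the 5-cycle (the exclusivity graph of the KCBS contextuality inequality) equals √5, which is exactly the quantum maximum of that inequality. Below is a minimal sketch that computes it from the standard semidefinite programming formulation; the use of Python with the cvxpy library is a choice made here for illustration and is not taken from the Cabello-Severini-Winter paper.

```python
import cvxpy as cp
import numpy as np

# Lovasz theta of the 5-cycle C5, the exclusivity graph of the
# KCBS contextuality inequality. Standard SDP formulation:
#   maximize   sum over i, j of X[i, j]
#   subject to X positive semidefinite, trace(X) = 1,
#              X[i, j] = 0 whenever (i, j) is an edge of the graph.
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0, cp.trace(X) == 1]
constraints += [X[i, j] == 0 for i, j in edges]

problem = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
problem.solve()

print(f"Lovasz theta of C5 ~ {problem.value:.4f}")  # ~ 2.2361
print(f"sqrt(5)            = {np.sqrt(5):.4f}")     # quantum KCBS maximum
```

The classical (noncontextual) bound of the KCBS inequality is 2, given by the independence number of the same graph, so the gap between the independence number and the Lovász number is what separates classical from quantum behaviour in this framework.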