This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
The experimental violation of Bell inequalities using spacelike separated measurements precludes the explanation of quantum correlations through causal influences propagating at subluminal speed. Yet, it is always possible, in principle, to explain such experimental violations through models based on hidden influences propagating at a finite speed v>c, provided v is large enough. Here, we show that for any finite speed v>c, such models predict correlations that can be exploited for faster-than-light communication.
TBA
It is my contention that non-commutative geometry is really "ordinary geometry" carried out in a non-commutative logic. I will sketch a specific project, relating groupoid C*-algebras to toposes, by means of which I hope to detect the nature of this non-commutative logic.
Quantum theory can be thought of as a noncommutative generalization of Bayesian probability theory, but for the analogy to be convincing, it should be possible to describe inferences among quantum systems in a manner that is independent of the causal relationship between those systems.
We use the mathematical language of sheaf theory to give a unified treatment of non-locality and contextuality, which generalizes the familiar probability tables used in non-locality theory to cover Kochen-Specker configurations and more. We show that contextuality, and non-locality as a special case, correspond exactly to *obstructions to the existence of global sections*.
Classical constraints come in various forms: first and second class, irreducible and reducible, regular and irregular, all of which will be illustrated. These distinctions can lead to severe complications when the constraints are quantized. A further complication is whether one should quantize first and reduce second, or vice versa, which may conflict with the axiom that canonical quantization requires Cartesian coordinates. Most constraint-quantization procedures (e.g., Dirac, BRST, Faddeev) run into difficulties with some of these issues and may lead to erroneous results.
A family of probability distributions (i.e., a statistical model) is said to be sufficient for another if there exists a transition matrix transforming the probability distributions of the former into the probability distributions of the latter. The Blackwell-Sherman-Stein Theorem provides necessary and sufficient conditions for one statistical model to be sufficient for another by comparing their "information values" in a game-theoretic framework. In this talk, I will extend some of these ideas to the quantum case.
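As a minimal numerical sketch of the sufficiency condition above (with hypothetical distributions chosen purely for illustration), a stochastic matrix T maps each distribution of one model onto a corresponding distribution of a coarser model:

```python
import numpy as np

# Model A: two hypothetical distributions over 3 outcomes
p1 = np.array([0.5, 0.3, 0.2])
p2 = np.array([0.1, 0.6, 0.3])

# A transition matrix (each row is a conditional distribution, rows sum to 1)
# mapping 3 outcomes to 2 outcomes.
T = np.array([[0.9, 0.1],
              [0.2, 0.8],
              [0.5, 0.5]])

# Model B is obtained by applying T to each distribution of model A,
# so model A is sufficient for model B in the sense above.
q1 = p1 @ T
q2 = p2 @ T

print(q1, q1.sum())  # [0.61 0.39] 1.0 -- still a valid distribution
print(q2, q2.sum())  # [0.36 0.64] 1.0
```

The same post-processing T is applied to every member of the family; that uniformity is what makes it a sufficiency (garbling) relation rather than a per-distribution coincidence.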
Wheeler's delayed-choice (WDC) experiment is one of the "standard experiments in foundations". It addresses the puzzle of a photon simultaneously behaving as wave and particle. The Bohr-Einstein debate on wave-particle duality prompted the introduction of Bohr's principle of complementarity: "... the study of complementary phenomena demands mutually exclusive experimental arrangements". In the WDC experiment, the mutually exclusive setups correspond to the presence or absence of a second beamsplitter in a Mach-Zehnder interferometer (MZI). The choice of setup determines the observed behaviour.
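The two mutually exclusive setups can be sketched with a toy amplitude calculation (a sketch only, assuming ideal lossless 50/50 beamsplitters and a relative phase phi between the two arms):

```python
import numpy as np

# Ideal 50/50 beamsplitter as a 2x2 unitary on the two path modes
BS = np.array([[1, 1j],
               [1j, 1]]) / np.sqrt(2)

def detector_probs(phi, second_bs):
    """Detection probabilities at the two MZI output ports."""
    psi = np.array([1, 0], dtype=complex)        # photon enters one input port
    psi = BS @ psi                               # first beamsplitter
    psi = np.diag([1, np.exp(1j * phi)]) @ psi   # relative phase between arms
    if second_bs:
        psi = BS @ psi                           # second beamsplitter present
    return np.abs(psi) ** 2

# Second beamsplitter in place: phase-dependent interference ("wave" behaviour)
print(detector_probs(0.0, True))    # -> [0., 1.]
# Second beamsplitter absent: 50/50 for any phase ("particle" behaviour)
print(detector_probs(0.0, False))   # -> [0.5, 0.5]
```

With the second beamsplitter inserted, the output probabilities oscillate with phi; with it removed, each detector registers which path was taken, independent of phi.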
This is a very informal talk about some of the issues associated with the notion of "macroscopic realism" (MR) and its relation to quantum mechanics (QM).