This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
The status of the quantum state is perhaps the most controversial issue in the foundations of quantum theory. Is it an epistemic state (representing knowledge, information, or belief) or an ontic state (a direct reflection of reality)? In the ontological models framework, quantum states correspond to probability measures over more fundamental states of reality. The quantum state is then ontic if every pair of pure states corresponds to a pair of measures that do not overlap, and is otherwise epistemic.
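As a toy illustration of this criterion (the numbers below are illustrative only and not drawn from any particular ontological model), one can represent two pure states by probability distributions over a small ontic state space and compute their classical overlap; a nonzero overlap is the signature of an epistemic interpretation:

```python
import numpy as np

# Toy ontological model over 4 ontic states (illustrative numbers only).
# Each pure quantum state corresponds to a probability measure over the
# ontic state space Lambda.
mu_psi = np.array([0.5, 0.5, 0.0, 0.0])   # measure for |psi>
mu_phi = np.array([0.0, 0.5, 0.5, 0.0])   # measure for |phi>

# Classical overlap of the two measures: if it vanishes for every pair of
# distinct pure states the model is psi-ontic; any nonzero overlap makes
# the quantum state epistemic in this framework.
overlap = np.minimum(mu_psi, mu_phi).sum()
print(overlap)  # 0.5 -> these two measures overlap (epistemic-style model)
```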
If a wave function does not describe microscopic reality, then what does? Reformulating quantum mechanics in path-integral terms leads to a notion of ``precluded event'' and thence to the proposal that quantal reality differs from classical reality in the same way as a set of worldlines differs from a single worldline. One can then ask, for example, which sets of electron trajectories correspond to a hydrogen atom in its ground state and how they differ from those of an excited state.
The purpose of this talk is twofold: first, following Spekkens, to motivate noncontextuality as a natural principle one might expect to hold in nature; and second, to introduce operational noncontextuality inequalities motivated by a contextuality scenario first considered by Ernst Specker. These inequalities do not rely on the assumption of outcome determinism, which is implicit in the usual Kochen-Specker (KS) inequalities. We argue that they are the appropriate generalization of KS inequalities, serving as a test for the possibility of noncontextual explanations of experimental data.
It is not unnatural to expect that difficulties lying at the foundations of quantum mechanics can only be resolved by literally going back and rethinking quantum theory from first principles (namely, the principles of logic). In this talk, I will present a first-order quantum logic which generalizes the propositional quantum logic originated by Birkhoff and von Neumann, as well as the standard classical predicate logic used in the development of virtually all of modern mathematics.
On the face of it, quantum physics is nothing like classical physics. Despite its oddity, work in the foundations of quantum theory has provided some palatable ways of understanding this strange quantum realm. Most of our best theories take that story to include the existence of a very non-classical entity: the wave function. Here I offer an alternative which combines elements of Bohmian mechanics and the many-worlds interpretation to form a theory in which there is no wave function.
We present a method for determining the maximum possible violation of any linear Bell inequality permitted by quantum mechanics. Essentially, this amounts to a constrained optimization problem over an observable's eigenvalues, but the problem can be reformulated so as to be analytically tractable. This opens the door to an arbitrarily precise characterization of quantum correlations, including allowing for non-random marginal expectation values. Such a characterization is critical when contrasting quantum mechanics with superficially similar general probabilistic theories.
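As a concrete, standard instance of such an eigenvalue problem (the specific observable choice below is the textbook CHSH setting, not necessarily the construction used in the talk), the maximal quantum violation of the CHSH inequality is the largest eigenvalue of the associated Bell operator:

```python
import numpy as np

# Standard qubit observables for the CHSH scenario.
X = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli X
Z = np.array([[1.0, 0.0], [0.0, -1.0]])  # Pauli Z
A0, A1 = Z, X
B0 = (Z + X) / np.sqrt(2)
B1 = (Z - X) / np.sqrt(2)

# CHSH Bell operator; its largest eigenvalue is the maximal quantum
# violation, Tsirelson's bound 2*sqrt(2).
bell = (np.kron(A0, B0) + np.kron(A0, B1)
        + np.kron(A1, B0) - np.kron(A1, B1))
max_violation = np.linalg.eigvalsh(bell).max()
print(round(max_violation, 6))  # 2.828427, i.e. 2*sqrt(2)
```

The classical (local hidden variable) bound for the same expression is 2, so the eigenvalue computation directly exhibits the quantum excess.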
The standard formulation of quantum mechanics is operationally asymmetric with respect to time reversal: in the language of compositions of tests, tests in the past can influence the outcomes of tests in the future, but not the other way around. The question of whether this represents a fundamental asymmetry or is merely an artifact of the formulation is not a new one, but even though various arguments in favor of an inherent symmetry have been made, no complete time-symmetric formulation expressed in rigorous terms has been given.

Although it arguably became consequential only with the study of quantum cosmology, the question ``Why do we observe a classical world?'' has been one of the biggest preoccupations of quantum foundations. In the consistent histories formalism, the question is shifted to an analysis of the telltale sign of quantum mechanics: superposition of states. There, histories of the system which ``decohere'', i.e. fall out of superposition or have negligible interference, can be assigned classical probabilities.

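A minimal numerical sketch of this consistency test (a toy qubit example of my own, not taken from the talk): histories are sequences of projectors with class operators such as C = P_f U P_a, and interference between two histories is measured by the off-diagonal decoherence functional D(α, β) = Tr[C_α ρ C_β†]:

```python
import numpy as np

# Two-time histories of a qubit: outcome a in the z-basis at t1, Hadamard
# evolution, then outcome f in the z-basis at t2.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard unitary
rho = np.full((2, 2), 0.5)               # initial state |+><+|, a superposition
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]  # z-basis projectors

def D(a, f, b, g):
    """Decoherence functional D(alpha, beta) = Tr[C_a rho C_b^dagger]."""
    Ca = P[f] @ H @ P[a]
    Cb = P[g] @ H @ P[b]
    return np.trace(Ca @ rho @ Cb.conj().T)

# Histories (a=0, f=0) and (a=1, f=0) share the final outcome but differ
# at t1; a nonzero off-diagonal entry means they interfere, so this family
# of histories does NOT decohere and cannot carry classical probabilities.
print(D(0, 0, 1, 0).real)  # 0.25
```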
In systems described by Ising-like Hamiltonians, such as spin lattices, the Bell inequality can be strongly violated. Surprisingly, these systems are both local and non-superdeterministic. They are local because (1) they include only local, near-neighbor interactions, (2) they accordingly satisfy the Clauser-Horne factorability condition, and (3) they can violate the Bell inequality even in dynamic Bell experiments. Starting from this result we construct an elementary model of such violations.

Coalgebras
are a flexible tool commonly used in computer science to model abstract devices
and systems. Coalgebraic models also come with a natural notion of logics
for the systems being modelled. In this talk we will introduce coalgebras
and aim to illustrate their usefulness for modelling physical systems.
Extending earlier work of Abramsky, we will show how a weakening of the usual morphisms for coalgebras provides the flexibility to model quantum systems in an easy-to-motivate manner.