This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
This talk touches on three questions regarding the ontological status of quantum states, using the ontological models
framework: it is assumed that a physical system has some underlying ontic state, and that quantum states correspond to probability distributions over these ontic states.
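For concreteness, the correspondence just described is standardly written as follows (a sketch of the usual ontological-models formalism; the symbols \mu_\psi and \xi are the conventional notation, not taken from the talk):

```latex
% Preparing the quantum state |psi> induces a probability
% distribution over ontic states lambda in an ontic state space Lambda:
P(\lambda \mid \psi) = \mu_\psi(\lambda), \qquad
\int_\Lambda \mu_\psi(\lambda)\, d\lambda = 1 .

% A measurement M responds to the ontic state via response functions
% xi_M(k|lambda); reproducing the quantum statistics requires
\Pr(k \mid \psi, M)
  = \int_\Lambda \xi_M(k \mid \lambda)\, \mu_\psi(\lambda)\, d\lambda
  = \operatorname{Tr}\!\bigl[ E_k \, |\psi\rangle\langle\psi| \bigr],
% where {E_k} is the POVM associated with the measurement M.
```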
The last decade has seen a wave of characterizations of quantum theory using the formalism of generalized probability theory.
After a brief motivation of this question, the presentation is divided into two parts. We first introduce the principle of quantum information causality, which bounds the amount of quantum information that a transmitted quantum system can communicate as a function of its Hilbert-space dimension, independently of any quantum resources previously shared by the communicating parties.
The Church-Turing thesis is one of the pillars of computer science; it postulates that every classical system has computational power equivalent to that of the so-called Turing machine. While this thesis is crucial for our understanding of computing devices, its implications in other scientific fields have hardly been explored. What if we consider the Church-Turing thesis as a law of nature?
I will present a new approach to the information-theoretic foundations of quantum theory that does not rely on probability theory, spectral theory, or Hilbert spaces. Direct nonlinear generalisations of quantum kinematics and dynamics are constructed using quantum information geometric structures over algebraic states of W*-algebras (quantum relative entropies and Poisson structure). In particular, unitary evolutions are generalised to nonlinear Hamiltonian flows, while Lüders' rules are generalised to constrained relative entropy maximisations.
Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behavior, yet that wave behavior disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, e.g., by Englert and by Jaeger, Shimony, and Vaidman, which upper bounds the sum of the squared interference visibility and the squared path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated.
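For reference, the duality relation alluded to above, in Englert's notation with V the fringe visibility and D the path distinguishability, takes the standard textbook form (quoted here for orientation, not from the talk itself):

```latex
% Englert's wave-particle duality relation: perfect which-path
% information (D = 1) forces vanishing fringe visibility (V = 0),
% and vice versa.
\mathcal{D}^2 + \mathcal{V}^2 \le 1 ,
% with equality when the combined particle-plus-detector state is pure.
```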
In quantum theory, the problem of estimating the decoherence of a quantum channel from classical data gained in measurements has been studied for some time. Applications of these developments include security criteria for quantum key distribution and tests of decoherence models. In this talk, I will present some ideas for how to interpret the same classical data so as to make statements about decoherence in cases where nature is not necessarily described by quantum theory. This is work in progress with several collaborators.
The role of measurement-induced disturbance in weak measurements is of central importance for the interpretation of the weak value. Uncontrolled disturbance can interfere with the postselection process and make the weak value dependent on the details of the measurement process. A priori, it is not clear what the correct notion of disturbance should be in the context of weak measurements. Here we develop the concept of a generalized weak measurement for classical and quantum mechanics. The two cases appear remarkably similar, but we point out some important differences.
A persistent mystery of quantum theory is whether it admits an interpretation that is realist, self-consistent, model-independent, and unextravagant in the sense of featuring neither multiple worlds nor pilot waves. In this talk, I will present a new interpretation of quantum theory -- called the minimal modal interpretation (MMI) -- that aims to meet these conditions while also hewing closely to the basic structure of the theory in its widely accepted form.
Weak measurement is increasingly acknowledged as an important theoretical and experimental tool. Weak values, the results of weak measurements, are often used to understand seemingly paradoxical quantum behavior. Until now, however, it was not known how to perform a weak non-local measurement of a general operator. Such a procedure is necessary if we are to take the associated "weak values" seriously as physical quantities. We propose a novel scheme for performing non-local weak measurements based on the principle of quantum erasure.
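For orientation, the weak value of an observable A, for a system preselected in |ψ⟩ and postselected in |φ⟩, is standardly defined as (the usual Aharonov-Albert-Vaidman notation, not taken from this abstract):

```latex
% The weak value is in general complex, and can lie far outside
% the spectrum of A when the overlap <phi|psi> is small.
A_w = \frac{\langle \phi | \hat{A} | \psi \rangle}{\langle \phi | \psi \rangle} .
```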