This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
The Church-Turing thesis is one of the pillars of computer science; it postulates that every classical system has computational power equivalent to that of the Turing machine. While this thesis is crucial for our understanding of computing devices, its implications in other scientific fields have hardly been explored. What if we consider the Church-Turing thesis as a law of nature?
Pure states and pure transformations play a crucial role in most of the recent reconstructions of quantum theory. In the framework of general probabilistic theories, purity is defined in terms of probabilistic mixtures and bears an intuitive interpretation of "maximal knowledge" of the state of the system or of the evolution undergone by it. On the other hand, many quantum features do not need the probabilistic structure of the theory.
We present a first-principles approach to a probabilistic description of nature based on two guiding principles: spacetime locality and operationalism. No notion of time or metric is assumed, nor any specific physical model. Remarkably, the emerging framework converges with the recently proposed positive formalism of quantum theory, obtained constructively from known quantum physics. However, it also seems to embrace classical physics.
I will present a new approach to information-theoretic foundations of quantum theory that does not rely on probability theory, spectral theory, or Hilbert spaces. The direct nonlinear generalisations of quantum kinematics and dynamics are constructed using quantum information geometric structures over algebraic states of W*-algebras (quantum relative entropies and Poisson structure). In particular, unitary evolutions are generalised to nonlinear Hamiltonian flows, while Lüders' rules are generalised to constrained relative entropy maximisations.
Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behavior, yet that wave behavior disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, e.g., by Englert and Jaeger, Shimony, and Vaidman, which upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated.
In quantum theory, the problem of how to estimate the decoherence of a quantum channel from classical data gained in measurements has been studied for some time. Applications of these developments include security criteria for quantum key distribution and tests of decoherence models. In this talk, I will present some ideas for how to interpret the same classical data to make statements about decoherence in cases where nature is not necessarily described by quantum theory. This is work in progress in collaboration with many people.
The role of measurement-induced disturbance in weak measurements is of central importance for the interpretation of the weak value. Uncontrolled disturbance can interfere with the postselection process and make the weak value dependent on the details of the measurement process. Here we develop the concept of a generalized weak measurement for classical and quantum mechanics. The two cases appear remarkably similar, but we point out some important differences. A priori it is not clear what the correct notion of disturbance should be in the context of weak measurements.
A persistent mystery of quantum theory is whether it admits an interpretation that is realist, self-consistent, model-independent, and unextravagant in the sense of featuring neither multiple worlds nor pilot waves. In this talk, I will present a new interpretation of quantum theory -- called the minimal modal interpretation (MMI) -- that aims to meet these conditions while also hewing closely to the basic structure of the theory in its widely accepted form.
Weak measurement is increasingly acknowledged as an important theoretical and experimental tool. Weak values, the results of weak measurements, are often used to understand seemingly paradoxical quantum behavior. Until now, however, it was not known how to perform a weak non-local measurement of a general operator. Such a procedure is necessary if we are to take the associated 'weak values' seriously as a physical quantity. We propose a novel scheme for performing non-local weak measurement which is based on the principle of quantum erasure.