This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
The talk first offers a brief assessment of realist and nonrealist understandings of quantum theory, in relation to the role of probability and statistics there, from the perspective of quantum information theory, in part in view of several recent developments in that field in the work of M. G. D'Ariano and L. Hardy, among others. It then argues that what most essentially defines quantum theory, both quantum mechanics and quantum field theory, including as concerns realism or the lack thereof and the role of probability and statistics, is a new (vs.
The Reeh-Schlieder theorem says, roughly, that, in any reasonable quantum field theory, for any bounded region of spacetime R, any state can be approximated arbitrarily closely by operating on the vacuum state (or any state of bounded energy) with operators formed by smearing polynomials in the field operators with functions having support in R.
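Schematically, and with notation not fixed in the abstract (a local algebra \(\mathcal{A}(R)\), vacuum vector \(\Omega\), and Hilbert space \(\mathcal{H}\) are assumed here), the statement reads:

```latex
% Reeh--Schlieder, schematic statement: for any bounded spacetime region R,
% acting on the vacuum with operators localized in R yields a dense set of states.
\overline{\operatorname{span}}\,\{\, A\,\Omega \;:\; A \in \mathcal{A}(R) \,\} \;=\; \mathcal{H},
\qquad \mathcal{A}(R) \text{ generated by smeared fields } \phi(f),\ \operatorname{supp} f \subseteq R .
```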
In this talk we will discuss how C*-algebras can be identified and characterised in terms of certain cosheaves with self-action. Our motivation for this study is to give a rigorous mathematical derivation of the axioms of quantum theory. In particular, many of the standard axioms for C*-algebras have no clear physical or operational meaning, but by establishing an equivalence of categories between C*-algebras and cosheaves with self-action, we believe that these axioms can acquire a clear operational meaning.
The existence of observables that are incompatible, i.e. not jointly measurable, is a characteristic feature of quantum mechanics and lies at the root of a number of nonclassical phenomena, such as uncertainty relations, wave–particle duality, Bell-inequality violations, and contextuality.
However, no intuitive criterion is available for determining the compatibility of even two (generalized) observables, despite the overarching importance of this problem and intensive efforts of many researchers over more than 80 years.
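As a concrete illustration of how compatibility can fail already for two qubit observables, the following sketch uses a standard textbook special case rather than anything from the talk: noisy X and Z measurements with sharpness parameters eta_x and eta_z admit a joint observable exactly when eta_x² + eta_z² ≤ 1, which can be checked by testing positivity of the candidate joint effects.

```python
import itertools
import math

def min_joint_eigenvalue(eta_x, eta_z):
    """Smallest eigenvalue among the four candidate joint effects

        G_ab = (I + a*eta_x*X + b*eta_z*Z) / 4,   a, b = +/-1,

    whose marginals are the noisy X and Z qubit observables.
    All four effects are positive (i.e. the pair is jointly
    measurable) exactly when eta_x**2 + eta_z**2 <= 1.
    """
    worst = float("inf")
    for a, b in itertools.product((+1, -1), repeat=2):
        # Eigenvalues of (I + v . sigma)/4 are (1 +/- |v|)/4.
        norm_v = math.hypot(a * eta_x, b * eta_z)
        worst = min(worst, (1 - norm_v) / 4)
    return worst

# Sharp X and Z are incompatible; sufficiently noisy versions are compatible.
sharp_ok = min_joint_eigenvalue(1.0, 1.0) >= 0   # False: not jointly measurable
noisy_ok = min_joint_eigenvalue(0.7, 0.7) >= 0   # True: 0.49 + 0.49 <= 1
```

Positivity of this particular family of joint effects settles the question only for this special pair; for two arbitrary POVMs no comparably simple closed-form test is known, which is precisely the point of the abstract.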
In this talk, I will explore a timeless interpretation of quantum mechanics of closed systems, formulated solely in terms of path integrals over non-relativistic timeless configuration space. What prompts a fresh look at the foundational problems in this context is the advent of multiple gravitational models in which Lorentz symmetry is only emergent. In this setting, I propose a new understanding of records as certain relations between two configurations: the recorded one and the record-holding one.
In this talk we will discuss the relation between the incompatibility of quantum measurements and quantum nonlocality. We show that any set of measurements that is not jointly measurable (i.e. incompatible) can be used to demonstrate EPR steering, a form of quantum nonlocality. This implies that EPR steering and non-joint-measurability can be viewed as equivalent. Moreover, we discuss the connection between Bell nonlocality and joint measurability, and give evidence that the two notions are inequivalent.
Distinguishing between classical and non-classical models of nature requires a good notion of classicality. I will argue that noncontextuality is a good candidate for this notion. Until now, certain theoretical and experimental roadblocks have stood in the way of a test of noncontextuality free of unattainable experimental idealizations. I will present solutions to these roadblocks, as well as the results of an experimental test.
A clear ontology is one prerequisite for avoiding the measurement problem in quantum mechanics. Such an ontology is provided, for instance, by Bohmian mechanics. In the non-relativistic regime, Bohmian mechanics is a theory about particles whose motion is governed by a velocity field. The latter is generated by a wave function.
Using quantum control in foundational experiments opens new theoretical and experimental possibilities. We show how, for example, quantum-controlled devices can reverse the temporal order of detection. We consider probing wave–particle duality in quantum-controlled and entanglement-assisted delayed-choice experiments. We then discuss other situations where quantum control may be useful, and finally demonstrate how the techniques we developed apply to studying the consistency of classically reasonable requirements.
It is well known - to those who know it - that noise and randomness can enhance signal resolution. I'll present an easy-to-follow example from digital audio that illustrates the way in which adding noise ("dither") prior to measurement enhances the accuracy with which we are able to distinguish the features of the sound or image. I will then explore the way in which the environmental interactions prior to measurement ordinarily characterized as environment-induced decoherence may play a similar role.
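A minimal, self-contained sketch of the dither effect described above, with illustrative numbers chosen here rather than taken from the talk: a signal smaller than one quantization step is invisible to plain rounding, but survives in the average once uniform noise is added before quantization.

```python
import random

random.seed(0)
true_value = 0.3      # a "signal" smaller than one quantization step
n = 100_000

# Without dither, rounding always snaps 0.3 to 0: the signal is lost.
undithered = sum(round(true_value) for _ in range(n)) / n

# With dither, uniform noise in [-0.5, 0.5] is added before rounding;
# each individual sample is still coarse (0 or 1), but the average of
# many samples converges to the true value.
dithered = sum(round(true_value + random.uniform(-0.5, 0.5))
               for _ in range(n)) / n
```

Here `undithered` stays exactly 0.0 while `dithered` lands near 0.3: the pre-measurement noise reveals a feature below the resolution of any single measurement, the analogue of the role proposed for decoherence.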