This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
A fundamental question in trying to understand the world -- be it classical or quantum -- is why things happen. We seek a causal account of events, and merely noting correlations between them does not provide a satisfactory answer. Classical statistics offers a better alternative: the framework of causal models has proved itself a powerful tool for studying causal relations in a range of disciplines. We aim to adapt this formalism to allow for quantum variables and, in the process, discover a new perspective on how causality differs in the quantum world.
In this work we develop a formalism for describing localised quanta for a real-valued Klein-Gordon field in a one-dimensional box [0, R]. We quantise the field using non-stationary local modes which, at some arbitrarily chosen initial time, are completely localised within the left or the right side of the box. In this concrete set-up we directly face the problems inherent to a notion of local field excitations, usually thought of as elementary particles.
Consider discrete physics with a minimal time step taken to be tau. A time series of positions q, q', q'', ... has two classical observables: position (q) and velocity ((q'-q)/tau). They do not commute, for observing position does not force the clock to tick, but observing velocity does force the clock to tick. Thus if VQ denotes first observe position, then observe velocity, and QV denotes first observe velocity, then observe position, we have VQ = ((q'-q)/tau) q while QV = q' ((q'-q)/tau), so that QV - VQ = (q'-q)^2/tau.
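A minimal numerical sketch (not from the talk) of this ordering asymmetry, assuming the stated rule that observing velocity advances the clock while observing position does not:

```python
# Evaluate the ordered products VQ and QV on a discrete time series,
# where a velocity observation ticks the clock but a position one does not.
tau = 0.5
series = [1.0, 1.7, 2.6]  # q, q', q'' -- an arbitrary illustrative trajectory

def observe(series, t, ops):
    """Apply observations right-to-left (operator convention); return the product."""
    prod = 1.0
    for op in reversed(ops):  # rightmost symbol acts first
        if op == "Q":
            prod *= series[t]                      # position: no clock tick
        elif op == "V":
            prod *= (series[t + 1] - series[t]) / tau
            t += 1                                 # velocity observation ticks the clock
    return prod

VQ = observe(series, 0, "VQ")  # position first, then velocity
QV = observe(series, 0, "QV")  # velocity first, then position

q, qp = series[0], series[1]
# The two orderings differ by exactly (q' - q)^2 / tau.
assert abs((QV - VQ) - (qp - q) ** 2 / tau) < 1e-12
```

Here QV picks up the position q' after the clock has ticked, while VQ uses the earlier position q, which is the whole source of the non-commutativity.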
The talk will begin with an outline of how the ordinary notions of quantum theory translate into the category of C*-algebras, where there are several possible choices of morphisms. The second half will relate this to a category of convex sets used as state spaces. Alfsen and Shultz have characterized the convex sets arising as state spaces of C*-algebras, and this result can be applied to obtain a categorical equivalence between C*-algebras and their state spaces, which is a generalization of the equivalence between the Schroedinger and Heisenberg pictures.
There is now a remarkable mathematical theory of causation. But applying this theory to a Bell scenario implies the Bell inequalities, which are violated in experiment. We alleviate this tension by translating the basic definitions of the theory into the framework of generalised probabilistic theories. We find that a surprising number of results carry over: the d-separation criterion for conditional independence (the no-signalling principle on steroids), and even certain quantitative limits on correlations.
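A toy illustration (not from the abstract) of the d-separation criterion in the simplest classical case: in a causal chain A -> B -> C, conditioning on B renders A and C independent. The numbers below are arbitrary.

```python
import itertools

# Classical causal chain A -> B -> C over binary variables.
# d-separation predicts A and C are independent given B; verify from the joint.
pA = {0: 0.3, 1: 0.7}
pB_given_A = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.4, 1: 0.6}}   # pB_given_A[a][b]
pC_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.25, 1: 0.75}}  # pC_given_B[b][c]

joint = {(a, b, c): pA[a] * pB_given_A[a][b] * pC_given_B[b][c]
         for a, b, c in itertools.product((0, 1), repeat=3)}

def cond_c_given(a, b):
    """P(C=1 | A=a, B=b) computed directly from the joint distribution."""
    num = joint[(a, b, 1)]
    den = joint[(a, b, 0)] + joint[(a, b, 1)]
    return num / den

# Conditioned on B, the value of A is irrelevant: P(C | A, B) = P(C | B).
for b in (0, 1):
    assert abs(cond_c_given(0, b) - cond_c_given(1, b)) < 1e-12
```

The interest of the abstract's result is that this criterion survives the move from classical joint distributions to generalised probabilistic theories.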
We introduce a new way of quantifying the degrees of incompatibility of two observables in a probabilistic physical theory and, based on this, a global measure of the degree of incompatibility inherent in such theories. This opens up a flexible way of comparing probabilistic theories with respect to the nonclassical feature of incompatibility. We show that quantum theory contains observables that are as incompatible as any probabilistic physical theory can have.
Since the 1909 work of Carathéodory, an axiomatic approach to thermodynamics has gained ground which highlights the role of the binary relation of adiabatic accessibility between equilibrium states. A feature of Carathéodory's system is that its version of the second law contains an ambiguity about the nature of irreversible adiabatic processes, making it weaker than the traditional Kelvin-Planck statement of the law.
There has recently been much interest in finding simple principles that explain the particular sets of experimental probabilities that are possible with quantum mechanics in Bell-type experiments. In the quantum gravity community, similar questions have been raised about whether a certain generalisation of quantum mechanics allows more than standard quantum mechanics in this regard. We now bring these two strands of work together to see what can be learned on both sides.
Central to quantum theory, the wavefunction is a complex distribution associated with a quantum system. Despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition. Rather, physicists come to a working understanding of it through its use in calculating measurement outcome probabilities via the Born Rule. Tomographic methods can reconstruct the wavefunction from measured probabilities.
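A toy sketch (not from the abstract) of the tomographic idea in the simplest case, a single qubit: Born-rule probabilities for measurements along the X, Y and Z axes fix the expectation values, which in turn fix the density matrix rho = (I + <X>X + <Y>Y + <Z>Z)/2.

```python
# Illustrative single-qubit state tomography: reconstruct the density matrix
# from the probabilities of the +1 outcome along each of the three Pauli axes.
def rho_from_probs(px, py, pz):
    """Each p is Prob(+1) along that axis; the expectation value is 2p - 1."""
    x, y, z = 2 * px - 1, 2 * py - 1, 2 * pz - 1
    return [[(1 + z) / 2,      (x - 1j * y) / 2],
            [(x + 1j * y) / 2, (1 - z) / 2]]

# Example: the |+> state gives Prob(+1) = 1 along X and 1/2 along Y and Z,
# and is reconstructed as the matrix with all entries equal to 1/2.
rho = rho_from_probs(1.0, 0.5, 0.5)
assert abs(rho[0][0] - 0.5) < 1e-12 and abs(rho[0][1] - 0.5) < 1e-12
```

Full wavefunction tomography for larger systems follows the same logic with informationally complete measurement sets, but the qubit case already shows how measured probabilities pin down the state.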
The status of the quantum state is perhaps the most controversial issue in the foundations of quantum theory. Is it an epistemic state (representing knowledge, information, or belief) or an ontic state (a direct reflection of reality)? In the ontological models framework, quantum states correspond to probability measures over more fundamental states of reality. The quantum state is then ontic if every pair of pure states corresponds to a pair of measures that do not overlap, and is otherwise epistemic.
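The ontic/epistemic distinction can be made concrete with a hypothetical finite ontological model (the numbers below are purely illustrative): each quantum state is a probability distribution over ontic states, and the question is whether distinct pure states ever share support.

```python
# Classical overlap of two probability distributions over a finite set of
# ontic states: zero overlap for every pair of pure states means psi-ontic.
def overlap(p, q):
    """Sum of pointwise minima: total probability weight the two share."""
    return sum(min(pi, qi) for pi, qi in zip(p, q))

# Two preparations whose distributions share support: epistemic-style overlap.
mu_psi = [0.5, 0.5, 0.0]
mu_phi = [0.0, 0.5, 0.5]
assert overlap(mu_psi, mu_phi) == 0.5

# Distributions with disjoint support: the model treats this pair ontically.
nu_psi = [1.0, 0.0, 0.0]
nu_phi = [0.0, 1.0, 0.0]
assert overlap(nu_psi, nu_phi) == 0.0
```

In the first case a single ontic state (the middle one) is compatible with both preparations, so the quantum state cannot be read off from the underlying reality; in the second case it always can.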