This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
Bell inequalities are important tools in contrasting classical and quantum behaviors. To date, most Bell inequalities are linear combinations of statistical correlations between remote parties. However, finding the classical and quantum mechanical (Tsirelson) bounds for a given Bell inequality in a general scenario is a difficult task that rarely leads to closed-form solutions. Here we introduce a new class of Bell inequalities, based on products of correlators, that alleviates these issues. Each such Bell inequality is associated with a non-cooperative coordination game.
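For reference (standard background, not part of this talk's results), the best-known linear Bell inequality of the kind the abstract mentions is the CHSH inequality, built from correlators of two-outcome measurements at two remote parties:

```latex
% CHSH: a linear combination of two-party correlators,
% where A_x and B_y are ±1-valued observables of the two parties
S = \langle A_0 B_0 \rangle + \langle A_0 B_1 \rangle
  + \langle A_1 B_0 \rangle - \langle A_1 B_1 \rangle ,
\qquad
|S| \le 2 \ \text{(classical)}, \qquad |S| \le 2\sqrt{2} \ \text{(Tsirelson)} .
```

The gap between the two bounds is what makes such inequalities useful for contrasting classical and quantum behavior.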
The no-signalling principle, preventing superluminal communication and the consequent logical paradoxes, is typically formulated within the information-theoretic framework in terms of admissible correlations in composite systems. In my talk, I will present its complementary incarnation associated with dynamics of single systems subject to invasive measurements. The 'dynamical no-signalling principle' applies to any theory with well defined rules of calculating detection statistics in spacetime.
Randomness is a valuable resource in both classical and quantum networks, and we wish to generate desired probability distributions as cheaply as possible. If we are allowed to change the distribution slightly, within some tolerance level, we can sometimes greatly reduce the cardinality of the randomness or the dimension of the entanglement. By studying statistical inequalities, we show how to upper-bound the amount of randomness required for any given classical network and tolerance level. We also present a problem we encounter when compressing the randomness in a quantum network.
We investigate the emergence of classicality and objectivity in arbitrary physical theories. First we provide an explicit example of a theory where there are no objective states. Then we characterize classical states of generic theories, and show how classical physics emerges through a decoherence process, which always exists in causal theories as long as there are classical states. We apply these results to the study of the emergence of objectivity, here recast as a multiplayer game.
Canonical quantization is not well suited to quantizing gravity, while affine quantization is. For those unfamiliar with affine quantization, the talk will include a primer. This procedure is then applied to three nonrenormalizable field-theoretical problems of increasing difficulty, the last one being general relativity itself.
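As a brief pointer ahead of the promised primer (standard background, not a claim of this talk): affine quantization replaces the canonical pair (Q, P) by the affine pair (Q, D), where D is the dilation variable, restricting Q to a half-line:

```latex
% Affine kinematical variables: dilation D = (PQ + QP)/2, with Q > 0,
% satisfying the affine commutation relation
[\hat{Q}, \hat{D}] = i\hbar\, \hat{Q}
% in place of the canonical commutator [Q, P] = i\hbar used in canonical quantization.
```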
The need for a time-shift invariant formulation of quantum theory arises from fundamental symmetry principles as well as heuristic cosmological considerations. Such a description then leaves open the question of how to reconcile global invariance with the local perception of change. By introducing relative time observables, we are able to make rigorous the Page-Wootters conditional probability formalism to show how local Heisenberg evolution is compatible with global invariance.
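A sketch of the standard Page-Wootters construction the abstract builds on (standard background; the notation here is illustrative, not taken from the talk): a global clock-plus-system state satisfies a time-shift-invariance constraint, and conditioning on the clock reading recovers ordinary local evolution:

```latex
% Global state annihilated by the total Hamiltonian (time-shift invariance):
(\hat{H}_C + \hat{H}_S)\,|\Psi\rangle = 0
% Conditioning on the clock C reading t yields a system state obeying
% the usual Schrodinger (equivalently, Heisenberg) evolution:
|\psi_S(t)\rangle \propto {}_C\langle t|\Psi\rangle ,
\qquad
i\hbar\,\partial_t\, |\psi_S(t)\rangle = \hat{H}_S\, |\psi_S(t)\rangle .
```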
We derive Born’s rule and the density-operator formalism for quantum systems with Hilbert spaces of finite dimension. Our extension of Gleason’s theorem only relies upon the consistent assignment of probabilities to the outcomes of projective measurements and their classical mixtures. This assumption is significantly weaker than those required for existing Gleason-type theorems valid in dimension two.
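For context, the Born rule that the talk derives assigns probabilities to the outcomes of projective measurements as follows (standard textbook form, stated here only as background):

```latex
% Born rule: probability of outcome i of a projective measurement {Pi_i}
% on a system in the state given by density operator rho:
p(i) = \mathrm{Tr}(\rho\, \Pi_i),
\qquad
\Pi_i^2 = \Pi_i, \qquad \sum_i \Pi_i = \mathbb{1} .
```

Gleason-type theorems show that, under suitable assumptions, this trace formula is the only consistent way to assign such probabilities.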
The theory of relativity associates a proper time with each moving object via its spacetime trajectory. In quantum theory, on the other hand, such trajectories are forbidden. I will discuss an operational approach to exploring this conflict, considering the average time measured by a quantum clock in the weak-field, low-velocity limit. Examining the role of the clock's state of motion, one finds that all "good" quantum clocks experience the time dilation prescribed by general relativity for the most classical states of motion.
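The weak-field, low-velocity regime the abstract refers to is the one in which proper time along a trajectory takes its standard first-order form (textbook background, not a result of the talk):

```latex
% Proper time along a trajectory in the weak-field, low-velocity limit,
% with Phi the Newtonian gravitational potential and v the velocity:
d\tau \approx \left(1 + \frac{\Phi}{c^2} - \frac{v^2}{2c^2}\right) dt .
```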
A common criticism directed against many-worlds theories is that, being deterministic, they cannot make sense of probability. I argue that, on the contrary, deterministic theories with branching provide us with the only known coherent definition of objective probability. I illustrate this argument with a toy many-worlds theory known as Kent's universe, and discuss its limitations when applied to the usual Many-Worlds interpretation of quantum mechanics.
Epistemic interpretations of quantum theory maintain that quantum states only represent incomplete information about the physical states of the world. A major motivation for this view is the promise to provide a reasonable account of state update under measurement by asserting that it is simply a natural feature of updating incomplete statistical information. Here we demonstrate that all known epistemic ontological models of quantum theory in dimension d ≥ 3, including those designed to evade the conclusion of the PBR theorem, cannot represent state update correctly.