This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
Weak values are quantities accessed through quantum experiments involving weak measurements and post-selection. It has been shown that ‘anomalous’ weak values (those lying beyond the eigenvalue range of the corresponding operator) defy classical explanation in the sense of requiring contextuality [M. F. Pusey, Phys. Rev. Lett. 113, 200401, arXiv:1409.1535]. We elaborate on and extend that result in several directions. Firstly, the original theorem requires certain perfect correlations that can never be realised in any actual experiment.
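For orientation, the standard definition (textbook material, not the talk's new result): given a pre-selected state $|\psi\rangle$ and a post-selected state $|\phi\rangle$, the weak value of an observable $\hat{A}$ is

```latex
A_w \;=\; \frac{\langle \phi | \hat{A} | \psi \rangle}{\langle \phi | \psi \rangle},
```

and it is called anomalous when it lies outside the interval $[\lambda_{\min}, \lambda_{\max}]$ spanned by the eigenvalues of $\hat{A}$.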
In the framework of ontological models, the features of quantum theory that emerge as inherently nonclassical always involve properties that are fine-tuned, i.e. properties that hold at the operational level but break at the ontological level (they only hold for fine-tuned values of the ontic parameters).
The famous Schur-Weyl duality, arising from tensor-power representations of the unitary group, is a big useful hammer in the quantum information toolbox. This is especially the case for problems which have full unitary invariance, say, estimating the spectrum of a quantum state from a few copies. Many problems in quantum computing have a smaller symmetry group: the Clifford group. This talk will show how to decompose tensor-power representations of the Clifford group through a Schur-Weyl-type construction. Our results are also relevant for the theory of Howe duality between symplectic and orthogonal groups.
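As a reminder, the classical statement being generalised here (standard textbook material): under the commuting actions of $U(d)$ and the symmetric group $S_n$, the tensor power decomposes as

```latex
(\mathbb{C}^d)^{\otimes n} \;\cong\; \bigoplus_{\substack{\lambda \vdash n \\ \ell(\lambda) \le d}} V_\lambda \otimes S_\lambda ,
```

where the sum runs over partitions $\lambda$ of $n$ with at most $d$ rows, $V_\lambda$ carries an irreducible representation of $U(d)$, and $S_\lambda$ one of $S_n$. The talk concerns an analogous decomposition when $U(d)$ is replaced by the Clifford group.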
Bell inequalities are important tools in contrasting classical and quantum behaviors. To date, most Bell inequalities are linear combinations of statistical correlations between remote parties. Nevertheless, finding the classical and quantum mechanical (Tsirelson) bounds for a given Bell inequality in a general scenario is a difficult task which rarely leads to closed-form solutions. Here we introduce a new class of Bell inequalities based on products of correlators that alleviate these issues. Each such Bell inequality is associated with a non-cooperative coordination game.
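As a toy illustration of how a classical bound for a correlator Bell expression can be computed (using the standard CHSH inequality and a simple product of correlators, not the specific product inequalities of the talk), one can brute-force all local deterministic strategies:

```python
from itertools import product

# A local deterministic strategy assigns Alice an output a[x] in {-1, +1}
# for each setting x in {0, 1}, and likewise b[y] for Bob.
# The correlator is then E(x, y) = a[x] * b[y].

def chsh_value(a, b):
    # Linear CHSH combination: E(0,0) + E(0,1) + E(1,0) - E(1,1).
    E = lambda x, y: a[x] * b[y]
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

def product_value(a, b):
    # A toy product of correlators: E(0,0) * E(1,1).
    E = lambda x, y: a[x] * b[y]
    return E(0, 0) * E(1, 1)

strategies = list(product([-1, 1], repeat=2))
chsh_bound = max(chsh_value(a, b) for a in strategies for b in strategies)
prod_bound = max(product_value(a, b) for a in strategies for b in strategies)
print(chsh_bound)  # 2  (classical CHSH bound)
print(prod_bound)  # 1  (each correlator is +/-1 deterministically)
```

The enumeration works because the classical bound of any Bell expression is attained on the extreme points of the local polytope, i.e. on deterministic strategies.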
The no-signalling principle, preventing superluminal communication and the consequent logical paradoxes, is typically formulated within the information-theoretic framework in terms of admissible correlations in composite systems. In my talk, I will present its complementary incarnation associated with dynamics of single systems subject to invasive measurements. The 'dynamical no-signalling principle' applies to any theory with well defined rules of calculating detection statistics in spacetime.
Randomness is a valuable resource in both classical and quantum networks and we wish to generate desired probability distributions as cheaply as possible. If we are allowed to slightly change the distribution under some tolerance level, we can sometimes greatly reduce the cardinality of the randomness or the dimension of the entanglement. By studying statistical inequalities, we show how to upper bound the amount of randomness required for any given classical network and tolerance level. We also present a problem we encounter when compressing the randomness in a quantum network.
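A minimal sketch of the classical flavour of this question, under the assumption that closeness is measured in total-variation (TV) distance (an illustration only, not the talk's construction): if we keep the $m$ largest probabilities and renormalise, the TV distance to the original distribution equals the dropped tail mass, so the smallest admissible support size is easy to compute:

```python
def min_support_size(p, eps):
    """Smallest number of outcomes needed to approximate the
    distribution p within total-variation distance eps, by keeping
    the largest probabilities and renormalising.  The TV distance
    incurred equals exactly the dropped tail mass."""
    ps = sorted(p, reverse=True)
    kept_mass, m = 0.0, 0
    for q in ps:
        # Remaining (dropped) tail mass if we stop now:
        if 1.0 - kept_mass <= eps + 1e-12:
            break
        kept_mass += q
        m += 1
    return m

print(min_support_size([0.4, 0.3, 0.2, 0.05, 0.05], 0.1))  # 3
print(min_support_size([0.25] * 4, 0.25))                  # 3
```

The equality between tail mass and TV distance follows by splitting the TV sum into kept outcomes (where renormalisation adds exactly the tail mass) and dropped outcomes (which contribute the tail mass again), giving $\tfrac{1}{2}(t + t) = t$.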
We investigate the emergence of classicality and objectivity in arbitrary physical theories. First we provide an explicit example of a theory where there are no objective states. Then we characterize classical states of generic theories, and show how classical physics emerges through a decoherence process, which always exists in causal theories as long as there are classical states. We apply these results to the study of the emergence of objectivity, here recast as a multiplayer game.
Canonical quantization is not well suited to quantize gravity, while affine quantization is. For those unfamiliar with affine quantization, the talk will include a primer. This procedure is then applied to three nonrenormalizable field-theoretical problems of increasing difficulty, the last one being general relativity itself.
The need for a time-shift invariant formulation of quantum theory arises from fundamental symmetry principles as well as heuristic cosmological considerations. Such a description then leaves open the question of how to reconcile global invariance with the perception of change, locally. By introducing relative time observables, we are able to make rigorous the Page-Wootters conditional probability formalism to show how local Heisenberg evolution is compatible with global invariance.
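Schematically, the standard Page-Wootters prescription being made rigorous here (with the precise operator-theoretic details as in the talk): for a global state $|\Psi\rangle\rangle$ annihilated by the total Hamiltonian, conditioning a system measurement $\Pi_a$ on the clock $C$ reading $t$ gives

```latex
p(a \mid t) \;=\; \frac{\langle\langle \Psi |\, |t\rangle\langle t|_C \otimes \Pi_a \,| \Psi \rangle\rangle}{\langle\langle \Psi |\, |t\rangle\langle t|_C \otimes \mathbb{1} \,| \Psi \rangle\rangle},
```

which reproduces ordinary time evolution of the system even though the global state is stationary.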
We derive Born’s rule and the density-operator formalism for quantum systems with Hilbert spaces of finite dimension. Our extension of Gleason’s theorem only relies upon the consistent assignment of probabilities to the outcomes of projective measurements and their classical mixtures. This assumption is significantly weaker than those required for existing Gleason-type theorems valid in dimension two.
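For context, the shape of the conclusion of such a Gleason-type theorem (standard statement): any assignment $v$ of probabilities to projectors that is normalised on every projective decomposition, $\sum_i \Pi_i = \mathbb{1} \Rightarrow \sum_i v(\Pi_i) = 1$, and consistent under classical mixing, must take the Born form

```latex
v(\Pi) \;=\; \operatorname{Tr}(\rho\, \Pi)
```

for some density operator $\rho$, thereby recovering both Born's rule and the density-operator formalism.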