This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
What can we say about the spectra of a collection of microscopic variables when only their coarse-grained sums are experimentally accessible? In this paper, using the tools and methodology from the study of quantum nonlocality, we develop a mathematical theory of the macroscopic fluctuations generated by ensembles of independent microscopic discrete systems. We provide algorithms to decide which multivariate Gaussian distributions can be approximated by sums of finitely-valued random vectors.
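The flavor of the question can be illustrated numerically with a central-limit sketch (the microscopic model below is my own illustrative choice, not the paper's algorithm): sums of independent, finitely-valued random vectors fluctuate like a multivariate Gaussian whose covariance is that of a single microscopic system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative microscopic model (an assumption, not from the paper): each
# system is a two-component vector taking the four values (+-1, +-1), with
# correlated components; only the coarse-grained sum of n copies is observed.
values = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
probs = np.array([0.4, 0.1, 0.1, 0.4])  # mean zero, correlation 0.6

n, runs = 1000, 2000
sums = np.stack([
    values[rng.choice(4, size=n, p=probs)].sum(axis=0) / np.sqrt(n)
    for _ in range(runs)
])

# By the central limit theorem, the rescaled sums approach a bivariate
# Gaussian with the single-system covariance [[1.0, 0.6], [0.6, 1.0]].
emp_cov = np.cov(sums.T)
print(np.round(emp_cov, 2))
```

The paper's question is the finer converse: deciding exactly which Gaussians can arise this way, rather than observing that some do.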
Computational complexity theory is a branch of computer science dedicated to classifying computational problems in terms of their difficulty. While computability theory tells us what we can compute in principle, complexity theory tells us what is feasible in practice. In this chapter I argue that the science of quantum computing illuminates the foundations of complexity theory by emphasising that its fundamental concepts are not model-independent. However, this does not, as some have suggested, force us to radically revise the foundations of the theory.
Entropy is an important information measure. A complete understanding of entropy flow will have applications in quantum thermodynamics and beyond; for example, it may help to identify the sources of fidelity loss in quantum communications and methods to prevent or control them. Because entropy is nonlinear in the density matrix, its evaluation for quantum systems requires the simultaneous evolution of more than one density matrix.
When utilized appropriately, the path-integral offers an alternative to the ordinary quantum formalism of state-vectors, self-adjoint operators, and external observers -- an alternative that seems closer to the underlying reality and more in tune with quantum gravity. The basic dynamical relationships are then expressed, not by a propagator, but by the quantum measure, a set-function $\mu$ that assigns to every (suitably regular) set $E$ of histories its generalized measure $\mu(E)$.
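To make the notion concrete (assuming the standard Sorkin-style definition, which the abstract does not spell out): a quantum measure need not be additive, but it satisfies a weaker quadratic sum rule for mutually disjoint sets of histories $A$, $B$, $C$:

```latex
\mu(A \sqcup B \sqcup C)
  = \mu(A \sqcup B) + \mu(A \sqcup C) + \mu(B \sqcup C)
  - \mu(A) - \mu(B) - \mu(C).
```

Classical (Kolmogorov) measures satisfy this trivially, since for them already $\mu(A \sqcup B) = \mu(A) + \mu(B)$; the quantum measure keeps the quadratic rule while allowing pairwise interference between histories to violate additivity.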
In this talk I will discuss how we might go about performing a Bell experiment in which humans are used to decide the settings at each end. The radical possibility we wish to investigate is that, when humans are used to decide the settings (rather than various types of random number generators), we might then expect to see a violation of Quantum Theory in agreement with the relevant Bell inequality. Such a result, while very unlikely, would be tremendously significant for our understanding of the world (and I will discuss some interpretations).
The fact that concepts such as space, time, mass, and energy are considered foundational in physics has conveniently served a set of higher-level physical theories.
However, it keeps us from gaining a deeper understanding of these concepts, which could in turn help us build a theory based on truly foundational ones.
In this talk I introduce an alternative description of physical reality, based on the simple foundational concept that there exist things that influence one another.
The study of thermodynamics in the quantum regime has in recent years experienced something of a renaissance in our community. This excitement is fueled both by the fundamental nature of the subject and by the potential for heat machines designed with quantum advantages. Here, I will suggest the study of quantum thermodynamics restricted to a Gaussian regime, with two primary goals in mind.
Hayden and van Dam showed that, starting with a separable state in Alice and Bob’s state space and a shared entangled state in a common bipartite resource space, it is possible, using only local unitary operations, to produce an entangled pair in the state space while perturbing the shared entangled state by an amount that can be made arbitrarily small as the dimension of the resource space grows. They referred to this as “embezzling entanglement”, since numerically it “appears” that the resource state was returned exactly.
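A small numerical sketch of why the return is only approximate (assuming the standard van Dam–Hayden resource state, with Schmidt coefficients proportional to $1/j$, and the standard fact that the best fidelity local unitaries can achieve between two pure bipartite states is the overlap of their sorted Schmidt vectors):

```python
import numpy as np

def embezzle_fidelity(n: int) -> float:
    """Best local-unitary fidelity for embezzling one Bell pair from |mu_n>."""
    # Schmidt spectrum (probabilities) of the van Dam-Hayden state |mu_n>,
    # proportional to 1/j for j = 1..n.
    j = np.arange(1, n + 1)
    p = (1.0 / j) / np.sum(1.0 / j)

    # Target: |mu_n> tensor a Bell pair -> spectrum {p_j/2, p_j/2}.
    target = np.sort(np.concatenate([p / 2, p / 2]))[::-1]
    # Start: |mu_n> tensor |00> -> spectrum {p_j}, padded with zeros.
    start = np.sort(np.concatenate([p, np.zeros(n)]))[::-1]

    # Maximal overlap achievable with local unitaries alone: the inner
    # product of the two sorted Schmidt vectors.
    return float(np.sum(np.sqrt(start * target)))

for n in (10, 100, 10_000):
    print(n, round(embezzle_fidelity(n), 4))
```

The fidelity climbs toward 1 as the resource dimension grows but never reaches it, which is exactly the sense in which the resource state only “appears” to be returned.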
Anomalies are a ubiquitous phenomenon in quantum mechanics whereby a classical
symmetry is irrevocably violated by quantization. Anomalies not only constrain the
space of classical theories that are consistent with quantum mechanics but are
responsible for rich, surprising and experimentally tested physical phenomena.
In this talk I will give a non-technical, bird's eye introduction to anomalies.
Seminal work of Steve Lack showed that universal algebraic theories (PROPs) may be composed to produce more sophisticated theories. I’ll apply this method to construct an axiomatic version of the theory of a pair of complementary observables, starting from the theory of monoids. How far can we get with this? Quite far! We’ll get a large chunk of finite-dimensional quantum theory this way -- but the fact that quantum systems have non-trivial dynamics means that it’s not (always) possible to present the resulting theory as a composite PROP in Lack’s sense. If time permits, I’ll also discuss ho