This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
A picture can be used to represent an experiment. In this talk we will consider such pictures and show how to turn them into pictures representing calculations (in the style of Penrose's diagrammatic tensor notation). In particular, we will consider circuits described probabilistically. A circuit represents an experiment where we act on various systems with boxes, these boxes being connected by the passage of systems between them. We will make two assumptions concerning such circuits.
I will review some recent advances along the line of deriving quantum field theory from pure quantum information processing. The general idea is that there is only Quantum Theory (without quantization rules), and the whole of Physics---including space-time and relativity---is emergent from the processing. And, since Quantum Theory itself is built from purely informational principles, the whole of Physics must be reformulated in information-theoretic terms.
The second law of thermodynamics tells us that physics imposes a fundamental constraint on the efficiency of all thermal machines.
Here I will address the question of whether size imposes further constraints on thermal machines: is there a minimum size below which no machine can run, and when machines are small, can they still be efficient? I will present a simple model which shows that there is no size limitation and no limit on the efficiency of thermal machines, and that this leads to a unified view of small refrigerators, pumps and engines.
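For reference (a standard result, not stated in the abstract), the efficiency bound imposed by the second law is the Carnot limit, which the machines discussed here can approach regardless of size:

```latex
% Heat engine between a hot bath at T_h and a cold bath at T_c:
\eta \;=\; \frac{W}{Q_h} \;\le\; \eta_{\mathrm{Carnot}} \;=\; 1 - \frac{T_c}{T_h},
\qquad
% and for a refrigerator the coefficient of performance obeys
\mathrm{COP} \;=\; \frac{Q_c}{W} \;\le\; \frac{T_c}{T_h - T_c}.
```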
Nonlocality is the most striking feature of quantum mechanics. It might even be considered its defining feature, and understanding it may be the most important step towards understanding the whole theory. Yet for a long time it was impossible to pinpoint the reason behind the exact amount of nonlocality allowed by quantum mechanics, as expressed by the Tsirelson bound. Recently, information causality has been shown to be a principle from which this bound can be derived.
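To make the bound concrete (standard CHSH notation, not taken from the abstract): for a two-party correlation experiment with settings $a, a'$ and $b, b'$ and correlators $E(\cdot,\cdot)$, the quantity

```latex
S \;=\; E(a,b) + E(a,b') + E(a',b) - E(a',b')
```

satisfies $|S| \le 2$ for any local hidden-variable model, $|S| \le 2\sqrt{2}$ in quantum mechanics (the Tsirelson bound), while no-signalling alone would permit $|S| \le 4$. Information causality singles out $2\sqrt{2}$ within this gap.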
Landauer's erasure principle states that there is an inherent work cost associated with all irreversible operations, like the erasure of the data stored in a system. The necessary work is determined by our uncertainty: the more we know about the system, the less it costs to erase it.
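The standard quantitative statement (a reminder, not part of the abstract): erasing one bit at temperature $T$ dissipates at least

```latex
W \;\ge\; k_B T \ln 2 \quad \text{per bit,}
% and, with side information O about the system S, the cost is governed
% by the conditional entropy:
\qquad
W \;\ge\; k_B T \ln 2 \;\cdot\; H(S|O),
```

which captures the claim that the more an observer $O$ knows about the system $S$, the smaller $H(S|O)$ and hence the cheaper the erasure.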
I revisit an example of stronger-than-quantum correlations that was discovered by Ernst Specker in 1960. The example was introduced as a parable wherein an over-protective seer sets a simple prediction task to his daughter's suitors. The challenge cannot be met because the seer asks the suitors for a noncontextual assignment of values but measures a system for which the statistics are inconsistent with such an assignment. I will show how by generalizing these sorts of correlations, one is led naturally to some well-known proofs of nonlocality and contextuality, and to some new ones.
In this talk we quickly review the basics of the modal "toy model" of quantum theory described by Schumacher in his September 22 colloquium at PI. We then consider how the theory addresses more general open systems. Because the modal theory has a more primitive mathematical structure than actual quantum mechanics, it lacks density operators, positive operator-valued measures, and completely positive maps.
The question of the existence of gravitational stress-energy in general relativity has exercised investigators in the field since the very inception of the theory. Folklore has it that no adequate definition of a localized gravitational stress-energetic quantity can be given. Most arguments to that effect invoke one version or another of the Principle of Equivalence. I argue that not only are such arguments of necessity vague and hand-waving but, worse, are beside the point and do not address the heart of the issue.
The uncertainty principle bounds the uncertainties about the outcomes of two incompatible measurements, such as position and momentum, on a particle. It implies that one cannot predict the outcomes for both possible choices of measurement to arbitrary precision, even if information about the preparation of the particle is available in a classical memory. However, if the particle is prepared entangled with a quantum memory, it is possible to predict the outcomes for both measurement choices precisely. I will explain a recent extension of the uncertainty principle to incorporate this case.
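Assuming the "recent extension" refers to the entropic uncertainty relation in the presence of quantum memory (Berta et al.), the inequality reads:

```latex
H(X|B) + H(Z|B) \;\ge\; \log_2 \frac{1}{c} + H(A|B),
\qquad
c \;=\; \max_{x,z} \bigl| \langle \psi_x | \phi_z \rangle \bigr|^2 ,
```

where $X$ and $Z$ are the two measurement choices on particle $A$, $c$ measures their incompatibility, and $H(A|B)$ is the conditional von Neumann entropy given the quantum memory $B$. When $A$ and $B$ are entangled, $H(A|B)$ can be negative, lowering the bound and allowing both outcomes to be predicted precisely.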