This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
In quantum information, we frequently consider (for instance, whenever we talk about entanglement) a composite system consisting of two separated subsystems. A standard axiom of quantum mechanics states that a composite system can be modeled as the tensor product of the Hilbert spaces of the two subsystems. However, there is another, less restrictive way to model a composite system, which is used in quantum field theory: we can require only that the algebras of observables for each subsystem commute within some larger algebra.
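As a finite-dimensional illustration of the tensor-product picture (a sketch of textbook material, not taken from the talk): observables of the form A ⊗ I and I ⊗ B always commute, while two observables acting on the same tensor factor generally do not.

```python
# Sketch: observables on different tensor factors commute; observables on the
# same factor need not. Pure-Python 2x2 matrices to keep this self-contained.

def kron(A, B):
    """Kronecker (tensor) product of square matrices A and B."""
    n, m = len(A), len(B)
    return [[A[i // m][j // m] * B[i % m][j % m] for j in range(n * m)]
            for i in range(n * m)]

def matmul(A, B):
    """Product of square matrices of equal size."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]   # Pauli X
Z = [[1, 0], [0, -1]]  # Pauli Z

A = kron(X, I)  # Alice's observable acts on the first subsystem
B = kron(I, Z)  # Bob's observable acts on the second subsystem

assert matmul(A, B) == matmul(B, A)                    # [A, B] = 0
assert matmul(kron(Z, I), A) != matmul(A, kron(Z, I))  # same factor: X, Z don't commute
```

In the tensor-product axiom the commutation of A ⊗ I with I ⊗ B is automatic; the less restrictive algebraic approach takes such commutation itself as the defining requirement.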
Bell's theorem shows that our intuitive understanding of causation must be overturned in light of quantum correlations. Nevertheless, quantum mechanics does not permit signalling and hence a notion of cause remains. Understanding this notion is not only important at a fundamental level, but also for technological applications such as key distribution and randomness expansion. It has recently been shown that a useful way to determine which classical causal structures give rise to a given set of correlations is to use entropy vectors.
In this talk, I will outline the current state of the art in the study of the reality of the quantum state. The main theme will be that, although you cannot derive the reality of the quantum state in an ontological model without additional assumptions, you can place constraints on the amount of overlap between probability measures that begin to make psi-epistemic theories look implausible.
The ideas of no-signalling, nonlocality, Bell inequalities, and quantum correlations can all be understood as implications of a presumed causal structure. In particular, the causal structure of the Bell scenario implies the Bell inequalities whenever the shared resource is presumed to act like a classical hidden random variable. If the shared resource in the scenario is a quantum system, however, then the quantum causal structure can give rise to a larger set of correlations, including probability distributions which violate Bell inequalities up to Tsirelson's bound.
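A concrete sketch of the last point (standard textbook material, not specific to the talk): for the singlet state, the correlation between spin measurements along coplanar angles a and b is E(a, b) = -cos(a - b), and a suitable choice of settings saturates Tsirelson's bound |S| = 2√2, beyond the classical (Bell) bound of 2.

```python
import math

# Singlet-state correlation for measurement directions at angles a, b in a
# single plane: E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

a0, a1 = 0.0, math.pi / 2           # Alice's two settings
b0, b1 = math.pi / 4, -math.pi / 4  # Bob's two settings

# CHSH combination: classically |S| <= 2, quantum mechanically |S| <= 2*sqrt(2).
S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
print(abs(S))  # 2*sqrt(2) ≈ 2.828..., saturating Tsirelson's bound
```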
In this talk I will go over the recent paper by Daniela Frauchiger and Renato Renner, "Single-world interpretations of quantum theory cannot be self-consistent" (arXiv:1604.07422).
The paper introduces an extended Wigner's friend thought experiment, which makes use of Hardy's paradox to show that agents will necessarily reach contradictory conclusions - unless they take into account that they themselves may be in a superposition, and that their subjective experience of observing an outcome is not the whole story.
Certain superposition states of the 1-D infinite square well have transient zeros at locations other than the nodes of the eigenstates that comprise them. It is shown that if an infinite potential barrier is suddenly raised at some or all of these zeros, the well can be split into multiple adjacent infinite square wells without affecting the wavefunction.
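A minimal numerical check of such a transient zero (my own sketch, in units ħ = m = 1 on a unit-width well): the equal-weight superposition of the n = 1 and n = 2 eigenstates vanishes at x = 2/3 at t = 0, even though x = 2/3 is not a node of either eigenstate, and the zero disappears at later times.

```python
import math, cmath

# Unnormalised superposition of the n = 1 and n = 2 eigenstates of the
# unit-width infinite square well; with hbar = m = 1, E_n = n^2 * pi^2 / 2.
def psi(x, t):
    E1, E2 = math.pi**2 / 2, 4 * math.pi**2 / 2
    return (math.sin(math.pi * x) * cmath.exp(-1j * E1 * t)
            + math.sin(2 * math.pi * x) * cmath.exp(-1j * E2 * t))

x0 = 2 / 3
assert abs(math.sin(math.pi * x0)) > 0.1      # not a node of the n = 1 state
assert abs(math.sin(2 * math.pi * x0)) > 0.1  # not a node of the n = 2 state
assert abs(psi(x0, 0.0)) < 1e-12              # yet the superposition vanishes at t = 0
assert abs(psi(x0, 0.1)) > 0.1                # and the zero is only transient
```

The zero at t = 0 follows from sin(2πx) = 2 sin(πx) cos(πx): the two terms cancel exactly where cos(πx) = -1/2, i.e. at x = 2/3, until the relative phase between the eigenstates evolves away from unity.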
Most physicists take it for granted that the experimental violation of Bell's inequality provides evidence that it is not possible to completely describe the state of a physical system in terms of purely local information when this system is entangled with some other system. We disagree. Provided we appropriately redefine what the information-theoretic state of a quantum system is, it becomes possible to recover the whole from the description of its parts.
For a spin 1/2 (a qubit), Hamiltonian evolution is equivalent to an elliptic rotation of the (Bloch) spin vector in 3D space. In contrast, measurement alters the state norm, so it cannot be described as such a rotation. Nevertheless, extending the 3D spin vector to a 4D "spacetime" representation allows weak measurements to be interpreted as hyperbolic (boost) rotations. The combined Hamiltonian and measurement dynamics in continuous weak measurement trajectories are then equivalent to (stochastic) Lorentz transformations.
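A small self-contained check of the boost claim (a sketch under the standard assumption that the weak measurement back-action is a Kraus operator M = diag(e^{r/2}, e^{-r/2}), with r parametrising strength and outcome): the update of the Bloch coordinate z coincides with a hyperbolic rotation of rapidity r acting on the pair (1, z), i.e. the relativistic velocity-addition law with tanh r.

```python
import math

# Weak z-measurement back-action on the Bloch coordinate z, from the Kraus
# operator M = diag(e^{r/2}, e^{-r/2}) acting on populations (1+z)/2, (1-z)/2.
def measured_z(z, r):
    a2, b2 = (1 + z) / 2, (1 - z) / 2
    num = math.exp(r) * a2 - math.exp(-r) * b2
    den = math.exp(r) * a2 + math.exp(-r) * b2
    return num / den

# Hyperbolic (boost) rotation of rapidity r acting on the pair (1, z):
# the relativistic velocity-addition law with velocity tanh(r).
def boosted_z(z, r):
    return (z * math.cosh(r) + math.sinh(r)) / (math.cosh(r) + z * math.sinh(r))

for z in (-0.9, 0.0, 0.5):
    for r in (0.1, 1.0):
        assert abs(measured_z(z, r) - boosted_z(z, r)) < 1e-12
```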