This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, in which probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of the probability calculus and rationality is among the cornerstones of modern subjective probability theory. De Finetti maintained that rationality requires that an agent’s degrees of belief be coherent.
Instrumentalism about the quantum state is the view that this mathematical object does not serve to represent a component of (not directly observable) reality, but is rather a device solely for making predictions about the results of experiments. One honest way to be such an instrumentalist is (a) to take an ensemble view (i.e., frequentism about quantum probabilities), whereby the state represents predictions for measurement results on ensembles of systems, but not on individual systems, and (b) to assign some specific level to the quantum/classical cut.
I will present recent work [1] on preparation by measurement of Greenberger–Horne–Zeilinger (GHZ) states in circuit quantum electrodynamics. In particular, for the 3-qubit case, when employing a nonlinear filter on the recorded homodyne signal, the selected states are found to exhibit values of the Bell–Mermin operator exceeding 2 under realistic conditions. I will discuss the potential of the dispersive readout to demonstrate a violation of the Mermin bound, and present a measurement scheme avoiding the necessity for full detector tomography.
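As background, the value 2 quoted above is the local-realist bound on the standard 3-qubit Mermin combination XXX − XYY − YXY − YYX, while the ideal GHZ state attains 4. A minimal numerical sketch of that ideal expectation value (standard operator choice, not necessarily the exact convention of [1]):

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron3(a, b, c):
    """Three-fold tensor product a (x) b (x) c."""
    return np.kron(np.kron(a, b), c)

# GHZ state (|000> + |111>)/sqrt(2) in the computational basis
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

# Mermin operator M = XXX - XYY - YXY - YYX
M = kron3(X, X, X) - kron3(X, Y, Y) - kron3(Y, X, Y) - kron3(Y, Y, X)

# Ideal expectation value: 4 (local realism bounds it by 2)
val = np.real(ghz.conj() @ M @ ghz)
```

Any experimentally prepared state with a Mermin value above 2, as in the filtered homodyne data, therefore rules out a local-realist description.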
We present a first-principles implementation of {\em spatial} scale invariance as a local gauge symmetry in geometrodynamics using the method of best matching. In addition to the 3-metric, the proposed scale-invariant theory also contains a 3-vector potential A_k as a dynamical variable. Although some of the mathematics is similar to that of Weyl's ingenious, but physically questionable, theory, the equations of motion of this new theory are second order in time derivatives. It is tempting to try to interpret the vector potential A_k as the electromagnetic field.
Betting (or gambling) is a useful tool for studying decision-making in the face of [classical] uncertainty. We would like to understand how a quantum "agent" would act when faced with uncertainty about its [quantum] environment. I will present a preliminary construction of a theory of quantum gambling, motivated by roulette and quantum optics. I'll begin by reviewing classical gambling and the Kelly Criterion for optimal betting. Then I'll demonstrate a quantum optical version of roulette, and discuss some of the challenges and pitfalls in designing such analogues.
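For the classical part, the Kelly Criterion can be stated in one line: for a bet paying net odds b with win probability p, the bankroll fraction maximizing long-run logarithmic growth is f* = (bp − q)/b, where q = 1 − p. A minimal sketch (function name hypothetical):

```python
def kelly_fraction(p, b):
    """Kelly Criterion: optimal fraction of bankroll to stake.

    p -- probability of winning the bet
    b -- net odds received on a win (stake 1, gain b)
    Returns f* = (b*p - q)/b with q = 1 - p; a negative value
    means the bet has negative edge and should be skipped.
    """
    q = 1.0 - p
    return (b * p - q) / b

# Example: 60% win probability at even odds -> stake 20% of bankroll
f = kelly_fraction(0.6, 1.0)
```

The quantum-gambling construction in the talk asks what plays the role of p and b when the "wheel" is a quantum-optical system.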
Violation of local realism can be probed by theory-independent tests, such as Bell inequality experiments. There, a common assumption is the existence of perfect classical reference frames, which allow for the specification of measurement settings with arbitrary precision. However, if the reference frames are ``bounded'', only limited precision can be attained. We expect, then, that the finiteness of the reference frames limits the observability of genuine quantum features.
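For reference, with ideal (unboundeded-precision) settings the singlet-state correlations E(a, b) = −cos(a − b) violate the CHSH form of Bell's inequality up to the Tsirelson value 2√2. A quick numerical check (the angles below are the standard CHSH settings, chosen here for illustration):

```python
import math

def E(a, b):
    """Singlet-state correlation for analyzer angles a and b."""
    return -math.cos(a - b)

# Standard CHSH measurement settings (radians)
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination; local realism bounds |S| by 2
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
# S = 2*sqrt(2) for these settings
```

Imprecision in the settings, as induced by bounded reference frames, degrades S toward (and eventually below) the classical bound of 2.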
Although most realistic approaches to quantum theory are based on classical particles, QFT reveals that classical fields are a much closer analog. And unlike quantum fields, classical fields can be extrapolated to curved spacetime without conceptual difficulty. These facts make it tempting to reconsider whether quantum theory might be reformulated on an underlying classical field structure.
Researchers in quantum foundations claim (D'Ariano, Fuchs, ...):
Quantum = probability theory + x
and hence:
x = Quantum - probability theory
Guided by the metaphorical analogy:
probability theory / x = flesh / bones
we introduce a notion of quantum measurement within x which, when fleshed out with Hilbert spaces, yields the orthodox quantum-mechanical probability calculus.
We know the mathematical laws of quantum mechanics, but as yet we are not so sure why those laws should be inevitable. In the simpler but related setting of classical inference, we also know the laws (of probability). With better understanding of quantum mechanics as the eventual goal, Kevin Knuth and I have been probing the foundations of inference. The world we wish to infer is a partially ordered set ('poset') of states, which may, as is often supposed, be exclusive, but need not be (e.g. A might be a requirement for B).
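To illustrate the non-exclusive case, a poset of states can be encoded as a "requires" relation and checked against the partial-order axioms. A minimal sketch (names and encoding hypothetical):

```python
from itertools import product

# Two non-exclusive states: A is a requirement for B, i.e. A <= B.
states = {"A", "B"}
leq = {("A", "A"), ("B", "B"), ("A", "B")}

# Partial-order axioms: reflexivity, antisymmetry, transitivity.
reflexive = all((s, s) in leq for s in states)
antisymmetric = all(not ((a, b) in leq and (b, a) in leq and a != b)
                    for a, b in product(states, repeat=2))
transitive = all((a, c) in leq
                 for a, b in product(states, repeat=2) if (a, b) in leq
                 for c in states if (b, c) in leq)
```

In an exclusive world, distinct states would be incomparable; here the order relation itself carries the inferential structure.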
Eugene Wigner and Hermann Weyl led the way in applying the theory of group representations to the newly formulated theory of quantum mechanics starting in 1927. My talk will focus, first, on two aspects of this early work. Physicists had long exploited symmetries as a way of simplifying problems within classical physics.