This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
A standard canonical quantization of general relativity yields a time-independent Schrödinger equation whose solutions are static wavefunctions on configuration space. Naively, this contradicts the real world, where things do change. Broadly speaking, the problem of how to reconcile a theory that contains no concept of time with a changing world is called 'the problem of time'.
The essential ingredients of a quantum theory are usually a Hilbert space of states and an algebra of operators encoding observables. The mathematical operations available with these structures translate fairly well into physical operations (preparation, measurement etc.) in a non-relativistic world. This correspondence weakens in quantum field theory, where the direct operational meaning of the observable algebra structure (encoded usually through commutators) is lost.
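As a toy illustration of an observable algebra encoded through commutators (my example, not the speaker's), the Pauli matrices on the qubit Hilbert space C^2 form the simplest nontrivial case:

```python
import numpy as np

# Pauli matrices: a toy observable algebra on a qubit Hilbert space (C^2).
# Their commutation relations encode the algebraic structure of spin observables.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def commutator(A, B):
    """The commutator [A, B] = AB - BA."""
    return A @ B - B @ A

# The algebra closes: [sx, sy] = 2i sz (and cyclic permutations).
print(np.allclose(commutator(sx, sy), 2j * sz))  # True
```

In the non-relativistic setting, each such operator corresponds directly to a measurable quantity; the abstract's point is that this direct operational reading of the commutator structure is what weakens in quantum field theory.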
Both classical probability theory and quantum theory lend themselves to a Bayesian interpretation where probabilities represent degrees of belief, and where the various rules for combining and updating probabilities are but algorithms for plausible reasoning in the face of uncertainty. I elucidate the differences and commonalities of these two theories, and argue that they are in fact the only two algorithms to satisfy certain basic consistency requirements.
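The "algorithm for plausible reasoning" in the classical case is Bayes' rule. A minimal sketch, with a hypothetical two-hypothesis coin example of my own choosing:

```python
# Bayesian updating: degrees of belief revised by one observation.
# Hypothetical example: a coin is either fair (P(heads)=0.5) or
# biased toward heads (P(heads)=0.8), with a uniform prior.

def bayes_update(prior, likelihoods):
    """Return the posterior after one datum.

    prior: dict hypothesis -> prior probability
    likelihoods: dict hypothesis -> P(datum | hypothesis)
    """
    unnormalized = {h: prior[h] * likelihoods[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

prior = {"fair": 0.5, "biased": 0.5}
# Observe one head: P(head | fair) = 0.5, P(head | biased) = 0.8.
posterior = bayes_update(prior, {"fair": 0.5, "biased": 0.8})
print(posterior)  # belief shifts toward "biased" (0.8/1.3 ~ 0.615)
```

The quantum rules replace this multiplication of probabilities with composition of amplitudes; the abstract's claim is that these are the only two updating schemes consistent with certain basic requirements.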
Lee Smolin has argued that one of the barriers to understanding time in a quantum world is our tendency to spatialize time. The question is whether there is anything in physics that could lead us to mathematically characterize time so that it is not just another funny spatial dimension. I will explore the possibility (already considered by Smolin and others) that time may be distinguished from space by what I will call a measure of Booleanity.
Quantum entanglement has two remarkable properties. First, according to Bell's theorem, the statistical correlations between entangled quantum systems are inconsistent with any theory of local hidden variables. Second, entanglement is monogamous -- that is, to the degree that A and B are entangled with each other, they cannot be entangled with any other system. It turns out that these properties are intimately related.
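Monogamy can be seen in a minimal three-qubit sketch (my illustration, not the speaker's): put A and B in a Bell state with C in a product state, and measure entanglement by the negativity of the partial transpose.

```python
import numpy as np

# A and B share the Bell state |Phi+> = (|00> + |11>)/sqrt(2); C is in |0>.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_AB = np.outer(phi, phi)

def negativity(rho, dA=2, dB=2):
    """Sum of |negative eigenvalues| of the partial transpose over
    the second subsystem; zero for separable states."""
    r = rho.reshape(dA, dB, dA, dB).transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)
    evals = np.linalg.eigvalsh(r)
    return abs(evals[evals < 0].sum())

# A is maximally entangled with B ...
print(round(negativity(rho_AB), 6))  # 0.5 (maximal for two qubits)

# ... so A cannot be entangled with C: rho_AC is the product (I/2) (x) |0><0|.
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)  # = I/2
rho_C = np.array([[1.0, 0.0], [0.0, 0.0]])
rho_AC = np.kron(rho_A, rho_C)
print(round(negativity(rho_AC), 6))  # 0.0
```

Maximal entanglement with B leaves A's reduced state maximally mixed, so A has no correlations left to share with C.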
We all know that the EPR argument fails, and we can all provide proofs of one sort or another that it can't work. But in spite of this, there's something curiously tempting about the reasoning, and the temptation sometimes leads to needless perplexity about other issues. This paper will do two things. It will offer a diagnosis of where the EPR argument goes wrong that shows why we should be suspicious long before we get to Bell-type results, and then use the thought behind this diagnosis to suggest an orientation toward thinking about quantum states.
This paper critically examines the view of quantum mechanics that emerged shortly after the theory's introduction and has been widespread ever since. Although N. Bohr, P. A. M. Dirac, and W. Heisenberg advanced this view earlier, it is best exemplified by J. von Neumann's argument in Mathematical Foundations of Quantum Mechanics (1932) that the transformation of 'a [quantum] state ... under the action of an energy operator ... is purely causal,' while, 'on the other hand, the state ... which may measure a [given] quantity ...
Conventional quantum mechanics answers this question by specifying the required mathematical properties of wavefunctions and invoking the Born postulate. The ontological question remains unanswered. There is one exception to this: a variation of the Feynman chessboard model allows a classical stochastic process to assemble a wavefunction, based solely on the geometry of spacetime paths. A direct comparison of how a related process assembles a probability density function (PDF) reveals both how and why PDFs and wavefunctions differ from the perspective of an underlying kinetic theory.
Many statistics problems involve predicting the joint strategy that will be chosen by the players in a noncooperative game. Conventional game theory predicts that the joint strategy will satisfy an "equilibrium concept". The relative probabilities of the joint strategies satisfying the equilibrium concept are not given, and all joint strategies that do not satisfy it are given probability zero. As an alternative, I view the prediction problem as one of statistical inference, where the "data" includes the details of the noncooperative game.
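An equilibrium concept acts as a set-membership predicate on joint strategies: conventional game theory spreads probability only over the members and assigns zero elsewhere. A minimal sketch for pure Nash equilibrium in a 2x2 game (using the standard Prisoner's Dilemma payoffs as a hypothetical example, not taken from the abstract):

```python
from itertools import product

# Payoffs (row player, column player) for the Prisoner's Dilemma.
# Strategies: 0 = cooperate, 1 = defect.
payoff = {
    (0, 0): (3, 3),
    (0, 1): (0, 5),
    (1, 0): (5, 0),
    (1, 1): (1, 1),
}

def pure_nash_equilibria(payoff, strategies=(0, 1)):
    """Joint strategies where neither player gains by deviating unilaterally."""
    eq = []
    for s1, s2 in product(strategies, strategies):
        best1 = all(payoff[(s1, s2)][0] >= payoff[(d, s2)][0] for d in strategies)
        best2 = all(payoff[(s1, s2)][1] >= payoff[(s1, d)][1] for d in strategies)
        if best1 and best2:
            eq.append((s1, s2))
    return eq

print(pure_nash_equilibria(payoff))  # [(1, 1)]: mutual defection
```

The equilibrium set here is a single point, and the conventional prediction assigns it all the probability mass; the statistical-inference view proposed in the abstract would instead produce a distribution over all joint strategies conditioned on the game's details.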