Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events using video cameras installed in its lecture theatres. Perimeter now has 7 formal presentation spaces for its many scientific conferences, seminars, workshops, and educational outreach activities, all with advanced audio-visual capabilities. Recordings of events in these spaces are available on demand from this Video Library and on the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded seminars from relevant bodies in physics. This resource has been partially modelled after Cornell University's arXiv.org.
In accordance with Betteridge's Law of Headlines, the answer to the question in the title is "no." I will argue that the usual norms of Bayesian inference lead to the conclusion that quantum states are features of physical reality. The argument will involve both existing $\psi$-ontology results and an extension of them that avoids the use of the Cartesian Product Assumption. Because the usual norms of Bayesian inference lead to the conclusion that the quantum state is real, rejecting that conclusion requires abandoning virtually all of Bayesian information theory. This, I will argue, is unwarranted.
The progression of theories suggested for our world, from ego- to geo- to helio-centric models to universe and multiverse theories and beyond, shows one tendency: the size of the described worlds increases, with humans being expelled from their center to ever more remote and random locations. If pushed too far, a potential theory of everything (TOE) becomes more a theory of nothing (TON). Indeed, such theories have already been developed. I show that including observer localization into such theories is necessary and sufficient to avoid this problem.
In this talk I compare the normative concept of probability at the heart of QBism with the notion of probability implied by the use of Solomonoff induction in Markus Mueller's preprint arXiv:1712.01816.
Algorithmic information theory (AIT) delivers an objective quantification of simplicity-qua-compressibility, which was employed by Solomonoff (1964) to specify a gold standard of inductive inference. Or so runs the conventional account, which I will challenge in my talk.
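The core AIT idea that "simple" means "compressible" can be illustrated with a small sketch (not from the talk itself): true Kolmogorov complexity is uncomputable, but the length of a standard compressor's output is a crude, computable upper bound on it, and it cleanly separates patterned from pseudo-random data.

```python
import random
import zlib

def compressed_size(s: bytes) -> int:
    # Length of the zlib-compressed representation: a crude, computable
    # stand-in for the (uncomputable) Kolmogorov complexity of s.
    return len(zlib.compress(s, 9))

regular = b"01" * 500  # highly patterned 1000-byte string
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # pseudo-random bytes

# The patterned string compresses far better, i.e. it counts as "simpler".
print(compressed_size(regular) < compressed_size(noisy))  # prints True
```

Solomonoff's induction formalizes the same intuition by weighting hypotheses (programs) by $2^{-\ell}$, where $\ell$ is program length, so shorter descriptions receive higher prior probability.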
In this talk we describe a general procedure for associating a minimal informationally-complete quantum measurement (or MIC) with a probabilistic representation of quantum theory. Towards this, we make use of the idea that the Born Rule is a consistency criterion among subjectively assigned probabilities rather than a tool to set purely physically mandated probabilities.
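As a concrete illustration of the consistency-criterion reading (a standard QBist example, not necessarily the general form discussed in the talk): when the MIC is a symmetric informationally complete POVM (SIC) in dimension $d$, the Born Rule can be rewritten entirely in probabilistic terms as

$$ Q(j) \;=\; \sum_{i=1}^{d^2} \left[ (d+1)\,P(i) - \frac{1}{d} \right] P(j\,|\,i), $$

where $P(i)$ are the probabilities an agent assigns to the outcomes of a counterfactual SIC measurement and $P(j\,|\,i)$ are conditional probabilities for a subsequent measurement. Read this way, the Born Rule constrains how an agent's probability assignments must mesh with one another, rather than dictating probabilities fixed by physics alone.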
In his famous thought experiment, Wigner assigns an entangled state to the composite quantum system made up of Wigner's friend and her observed system. While the two of them give different accounts of the process, Wigner and his friend can each in principle verify their respective state assignments by performing an appropriate measurement. As manifested through a click in a detector or a specific position of a pointer, the outcomes of these measurements can be regarded as reflecting directly observable "facts".