Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events using video cameras installed in our lecture theatres. Perimeter now has seven formal presentation spaces for its many scientific conferences, seminars, workshops, and educational outreach activities, all with advanced audio-visual capabilities. Recordings of events in these areas are available on demand from this Video Library and from the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded seminars from relevant bodies in physics, modelled in part on Cornell University's arXiv.org.
The evidence for the big bang is now overwhelming. However, the basic question of what caused the bang remains open. One possibility is that time somehow 'emerged,' placing the universe in an inflationary state. Another, perhaps more conservative, possibility is that the big bang was a violent event in a pre-existing universe. I will describe model calculations employing the AdS/CFT correspondence which show how this is possible, and which point to a new explanation for the origin of large-scale structure in the universe.
Quantum Field Theory I course taught by Volodya Miransky of the University of Western Ontario
In recent work with Bob Coecke and others, we have developed a categorical axiomatization of quantum mechanics. This analyzes the main structural features of quantum mechanics into simple and general elements, which admit an elegant diagrammatic representation. This enables an illuminating and effective analysis of quantum information protocols and computational structures.
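One structural element central to this diagrammatic calculus is the compact-closed ("cup/cap") structure, whose defining "yanking" (snake) equations can be checked concretely in linear algebra. The following is a minimal illustrative sketch, not taken from the talk: morphisms are matrices, sequential composition is the matrix product, and parallel composition is the Kronecker product.

```python
import numpy as np

# Morphisms as matrices: sequential composition = matrix product,
# parallel (tensor) composition = Kronecker product.
id2 = np.eye(2, dtype=complex)

# Unit ("cup"): C -> C^2 (x) C^2, the unnormalized Bell state |00> + |11>.
cup = np.zeros((4, 1), dtype=complex)
cup[0, 0] = 1
cup[3, 0] = 1

# Counit ("cap"): C^2 (x) C^2 -> C, the adjoint of the cup.
cap = cup.conj().T

# Yanking (snake) equations: bending a wire up and back down is the identity.
snake_left = np.kron(cap, id2) @ np.kron(id2, cup)   # (cap (x) id) . (id (x) cup)
snake_right = np.kron(id2, cap) @ np.kron(cup, id2)  # (id (x) cap) . (cup (x) id)

assert np.allclose(snake_left, id2)
assert np.allclose(snake_right, id2)
```

In the diagrammatic language these identities say that an S-shaped wire may be straightened, which is the algebraic backbone of graphical reasoning about protocols such as teleportation.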
In recent years there has been a growing awareness that studies on quantum foundations have close relationships with other fields such as probability and information theory. In this talk I give another example of how such interdisciplinary work can be fruitful, by applying some of the lessons from quantum mechanics, in particular from Bell's theorem, to a debate on the philosophical foundations of decision theory.
I'll sketch a proposal for unifying classical and quantum probability, arguing first for the need to recognize a measure over phase space as a component of classical theories (indeed, of any theory satisfying certain constraints and capable of generating predictions for open systems) and then showing how to use that measure to define objective chances. Time permitting, I'll briefly address questions about the nature and interpretation of the measure.
As is well known, time-energy uncertainty generically manifests itself in the short-time behavior of a system weakly coupled to a bath, through the energy non-conservation of the interaction term (H_I does not commute with H_0). Similarly, the monotonic evolution of the system density operator toward its equilibrium value, which is a universal property of quantum dynamical semigroups (Spohn's theorem), e.g., systems with Lindbladian evolution, is in general violated at short (non-Markovian) timescales.
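The monotonicity that holds at the Markovian level can be seen in a small numerical sketch (illustrative only, not from the talk): a qubit under an amplitude-damping Lindblad equation, where the trace distance to the fixed point decreases monotonically. The abstract's point is that such monotonicity can fail at short, non-Markovian timescales, which this purely Lindbladian toy model does not capture.

```python
import numpy as np

# Amplitude-damping Lindblad equation for a qubit, H = 0:
#   d(rho)/dt = gamma * (L rho L+ - (1/2){L+ L, rho}),  L = sigma_minus.
sm = np.array([[0, 1], [0, 0]], dtype=complex)  # sigma_minus = |0><1|
sp = sm.conj().T
gamma = 1.0

def lindblad_rhs(rho):
    return gamma * (sm @ rho @ sp - 0.5 * (sp @ sm @ rho + rho @ sp @ sm))

def trace_distance(a, b):
    # (1/2) * sum of |eigenvalues| of the Hermitian difference
    return 0.5 * np.sum(np.abs(np.linalg.eigvalsh(a - b)))

rho_ss = np.array([[1, 0], [0, 0]], dtype=complex)  # fixed point: ground state
rho = 0.5 * np.ones((2, 2), dtype=complex)          # initial state: |+><+|

dt, steps = 0.001, 5000
dists = []
for _ in range(steps):
    dists.append(trace_distance(rho, rho_ss))
    rho = rho + dt * lindblad_rhs(rho)  # simple Euler integration

# Monotonic approach to equilibrium, as guaranteed for dynamical semigroups
assert all(d2 <= d1 + 1e-12 for d1, d2 in zip(dists, dists[1:]))
```

A non-Markovian master equation (e.g., with a time-dependent, temporarily negative damping rate) would violate this monotonicity at short times, which is the regime the abstract is concerned with.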
In reference [1], R. D. Sorkin investigated a formulation of quantum mechanics as a generalized measure theory. Quantum mechanics computes probabilities from the absolute squares of complex amplitudes, and the resulting interference violates the (Kolmogorov) sum rule expressing the additivity of probabilities of mutually exclusive events. However, there is a higher-order sum rule that quantum mechanics does obey, involving the probabilities of three mutually exclusive possibilities.
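Both sum rules can be checked directly from the amplitude rule. In this illustrative sketch (assumptions: three mutually exclusive alternatives with random complex amplitudes, probabilities given by the squared modulus of the summed amplitude), the pairwise interference term I_2 is generically nonzero, while the third-order term I_3 vanishes identically:

```python
import numpy as np

rng = np.random.default_rng(0)

def P(*amps):
    # Probability of the union of mutually exclusive alternatives:
    # squared modulus of the summed complex amplitude.
    return abs(sum(amps)) ** 2

# Random complex amplitudes for three exclusive alternatives A, B, C.
a, b, c = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# Second-order (Kolmogorov) sum rule is violated: interference.
I2 = P(a, b) - P(a) - P(b)

# Third-order sum rule holds identically in quantum mechanics.
I3 = (P(a, b, c) - P(a, b) - P(a, c) - P(b, c)
      + P(a) + P(b) + P(c))

assert abs(I2) > 1e-8   # generically nonzero
assert abs(I3) < 1e-9   # always zero for amplitude-squared measures
```

Expanding the squared moduli shows why: the cross terms in |a+b+c|^2 are exactly cancelled by those in the pairwise terms, so I_3 = 0 term by term.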
It has sometimes been suggested, though usually informally, that the psychological arrow can be reduced to the thermodynamic arrow through the information-processing properties of the brain. In this talk we demonstrate that this particular suggestion cannot succeed: insofar as information processing (at least in the sense of a classical computer) has an arrow of time, it is not governed by the thermodynamic arrow.
I will discuss fine tuning in modified gravity models that can account for today’s dark energy. I will introduce some models where the underlying cosmological constant may be Planck scale but starts as a redundant coupling which can be eliminated by a field redefinition. The observed vacuum energy arises when the redundancy is explicitly broken. I’ll give a recipe for constructing models that realize this mechanism and satisfy all solar system constraints on gravity, including one based on Gauss-Bonnet gravity which provides a technically natural explanation for dark energy.