This series covers all areas of research at Perimeter Institute, as well as topics outside of PI's scope.
Hawking's black hole information paradox is one of the great thought experiments in physics. It points to a breakdown of some central principle of physics, though which one breaks down is still in dispute. It has led to the discovery of ideas that seem to be key to unifying quantum mechanics and gravity, namely the holographic principle and gauge/gravity duality. I review this subject, and discuss ongoing work and future directions.
I will summarize current observational constraints in cosmology, with emphasis on what we have learned about the properties of the primordial density perturbations. I will describe future directions, including observations of high-redshift neutral hydrogen through its 21 cm line.
The Great Plague of London, which claimed the lives of one fifth of London's population in 1665, is one of the most famous epidemics of all time. We have recently digitized the mortality records for London during the Great Plague, yielding weekly data for each of the 130 parishes. I will describe the temporal and spatial dynamics of the plague, and discuss our efforts to estimate the transmissibility of the infectious agent. I will also briefly describe other projects in progress inspired by disease-specific mortality records for London over the past 650 years.
There are two notions that play a central role in the mathematical theory of computation. One is that of a computable problem, i.e., of a problem that can, in principle, be solved by an (idealized) computer. It is known that there exist problems that 'have answers', but for which those answers are not computable. The other is that of the difficulty of a computation, i.e., of the number of (idealized) steps actually required to carry out that computation.
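The second notion, difficulty measured in idealized steps, can be made concrete with a toy example of my own (not from the abstract): two algorithms solving the same problem, membership in a sorted list, instrumented to count how many elementary steps each takes.

```python
# Toy illustration: the "difficulty" of a computation as a count of
# idealized steps. Two algorithms solve the same membership problem,
# but require very different numbers of steps.

def linear_search_steps(xs, target):
    """Scan left to right; one step per element examined."""
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            return True, steps
    return False, steps

def binary_search_steps(xs, target):
    """Halve the search range; one step per halving (xs must be sorted)."""
    steps, lo, hi = 0, 0, len(xs) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return True, steps
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, steps

xs = list(range(1024))
print(linear_search_steps(xs, 1023))   # (True, 1024)
print(binary_search_steps(xs, 1023))   # (True, 11)
```

Both algorithms compute the same answer; the theory of computational complexity is concerned with how the step count grows with the size of the input (here, linearly versus logarithmically).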
At a very basic level, physics is about what we can say about propositions like 'A has a value in S' (or 'A is in S' for short), where A is some physical quantity like energy, position, momentum etc. of a physical system, and S is some subset of the real line. In classical physics, given a state of the system, every proposition of the form 'A is in S' is either true or false, and thus classical physics is realist in the sense that there is a 'way things are'. In contrast to that, quantum theory only delivers a probability of 'A is in S' being true.
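The standard way quantum theory assigns that probability is via the spectral projector for S: Prob('A is in S') = ⟨ψ|P_S|ψ⟩. A minimal sketch, using a qubit example of my own choosing (Pauli-Z as the quantity A, measured on the state |+⟩):

```python
import numpy as np

# Sketch: the quantum probability of a proposition "A is in S" is
# <psi| P_S |psi>, where P_S projects onto the eigenspaces of A with
# eigenvalues in S.

# Quantity A: Pauli-Z, with eigenvalue +1 on |0> and -1 on |1>.
psi = np.array([1.0, 1.0]) / np.sqrt(2)   # the state |+>
P_plus = np.array([[1.0, 0.0],            # projector onto "A is in {+1}"
                   [0.0, 0.0]])

prob = float(psi.conj() @ P_plus @ psi)
print(prob)   # 0.5 -- the proposition is neither true nor false
```

For an eigenstate of A the probability collapses to 0 or 1 and the classical true/false picture is recovered; for a generic state, as here, only the probability is defined.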
It is common to assert that the discovery of quantum theory overthrew our classical conception of nature. But what, precisely, was overthrown? Providing a rigorous answer to this question is of practical concern, as it helps to identify quantum technologies that outperform their classical counterparts. It is also of significance for modern physics, where progress may be slowed by poor physical intuitions, and where the ability to apply quantum theory in a new realm, or to move beyond quantum theory, necessitates a deep understanding of the principles upon which it is based.
The history of human knowledge is often highlighted by our efforts to explore beyond our apparent horizon. In this talk, I will describe how this challenge has now evolved into our quest to understand the physics at/beyond the cosmological horizon, some twenty orders of magnitude above Columbus' original goal.
A convergence of climate, resource, technological, and economic stresses gravely threatens the future of humankind. Scientists have a special role in humankind's response, because only rigorous science can help us understand the complexities and potential consequences of these stresses. Diminishing the threat they pose will require profound social, institutional, and technological changes -- changes that will be opposed by powerful status-quo special interests.
The Achilles' heel of quantum information processors is the fragility of quantum states and processes. Without a method to control the imperfection and imprecision of quantum devices, the probability that a quantum computation succeeds will decrease exponentially in the number of gates it requires. In the last ten years, building on the discovery of quantum error correction, accuracy threshold theorems were proved showing that errors can be controlled using a reasonable amount of resources, as long as the error rate is smaller than a certain threshold.
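The two scaling behaviors in this abstract, exponential decay of the success probability without error correction, and doubly-exponential suppression of the logical error rate below threshold, can be sketched with toy numbers of my own (the rates and the concatenation formula are illustrative assumptions, not figures from the talk):

```python
# Toy numbers, for illustration only. Without error correction, an
# n-gate computation with per-gate error rate p succeeds with
# probability (1 - p)**n. With a concatenated code below threshold,
# the logical error rate is suppressed roughly as
# p_th * (p / p_th) ** (2 ** k) after k levels of concatenation.

p = 1e-4      # assumed physical error rate per gate
p_th = 1e-2   # assumed accuracy threshold

def success_uncorrected(n):
    """Probability an uncorrected n-gate computation has no error."""
    return (1 - p) ** n

def logical_error_concatenated(k):
    """Rough logical error rate per gate after k concatenation levels."""
    return p_th * (p / p_th) ** (2 ** k)

print(success_uncorrected(100_000))    # ~ 4.5e-5: exponentially small
print(logical_error_concatenated(3))   # ~ 1e-18: doubly-exponential gain
```

The point of the threshold theorems is that the resources needed for this suppression (extra qubits and gates) grow only polylogarithmically with the desired accuracy, which is what makes the overhead "reasonable".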