Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events such as talks from top scientists, using video cameras installed in our lecture theatres. Perimeter now has seven formal presentation spaces for its many scientific conferences, seminars, workshops and educational outreach activities, all with advanced audio-visual technical capabilities.
Recordings of events in these areas are all available on demand from this Video Library and on the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded seminars from relevant bodies in physics, partially modelled after Cornell University's arXiv.org.
Accessible to anyone with an internet connection, this free library reflects Perimeter's aim to share the power and wonder of science.
I propose [1] to use the residual anyons of overscreened Kondo physics for quantum computation. A superconducting proximity gap of Δ<TK can be utilized to isolate the anyon from the continuum of excitations and stabilize the non-trivial fixed point. We use the dynamical large-N technique [2] and bosonization to show that the residual entropy survives in a superconductor and suggest a charge Kondo setup for isolating and detecting the Majorana fermion in the two-channel Kondo impurity.
Motivated by puzzles in quantum gravity and AdS/CFT, Lenny Susskind posed the following question: supposing one had the technological ability to distinguish a macroscopic superposition of two given states |v> and |w> from an incoherent mixture of those states, would one also have the technological ability to map |v> to |w> and vice versa? More precisely, how does the quantum circuit complexity of the one task relate to the quantum circuit complexity of the other? Here we resolve Susskind's question -- showing that the two complexities are essentially identical, even for approximate versions of these tasks.
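The single-qubit case gives a toy illustration of the two tasks being compared. The sketch below is our own, not the construction from the talk: it takes |v> = |0>, |w> = |1>, and shows that one Hadamard gate suffices to distinguish the coherent superposition from the 50/50 mixture, while one X gate suffices to swap the states, so in this trivial case the two circuit complexities coincide.

```python
import numpy as np

# Toy single-qubit illustration (our own example, not the general construction):
# |v> = |0>, |w> = |1>.
v = np.array([1, 0], dtype=complex)
w = np.array([0, 1], dtype=complex)

# Coherent superposition |+> = (|v> + |w>)/sqrt(2) vs incoherent 50/50 mixture.
plus = (v + w) / np.sqrt(2)
rho_sup = np.outer(plus, plus.conj())
rho_mix = 0.5 * (np.outer(v, v.conj()) + np.outer(w, w.conj()))

# Distinguishing circuit: a single Hadamard, then a computational-basis measurement.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
p_sup = (H @ rho_sup @ H.conj().T)[0, 0].real  # P(outcome 0) on the superposition
p_mix = (H @ rho_mix @ H.conj().T)[0, 0].real  # P(outcome 0) on the mixture

# Swapping circuit: a single X (NOT) gate maps |v> <-> |w>.
X = np.array([[0, 1], [1, 0]], dtype=complex)
assert np.allclose(X @ v, w) and np.allclose(X @ w, v)
```

Measuring after the Hadamard yields outcome 0 with probability 1 on the superposition but only 1/2 on the mixture, so the distinguisher and the swapper each cost one gate here; the talk's result is that this equivalence of complexities holds in general.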
Last fall, a team at Google announced the first-ever demonstration of "quantum computational supremacy"---that is, a clear quantum speedup over a classical computer for some task---using a 53-qubit programmable superconducting chip called Sycamore. In addition to engineering, Google's accomplishment built on a decade of research in quantum computing theory. This talk will discuss questions like: what exactly was the contrived computational problem that Google solved? How does one verify the outputs using a classical computer? And how confident are we that the problem really is classically hard?
We propose a unified manner of understanding two important phenomena: color confinement in large-N gauge theory, and Bose-Einstein condensation (BEC). We do this by clarifying the relation between the standard criteria, based on the off-diagonal long range order (ODLRO) for the BEC and the Polyakov loop for gauge theory: the constant offset of the distribution of the Polyakov line phase corresponds to ODLRO.
Meta-learning involves learning mathematical devices using problem instances as training data. In this talk, we first describe recent meta-learning approaches involving the learning of objects such as initial weights, parameterized losses, hyper-parameter search strategies, and samplers. We then discuss learned optimizers in further detail and their application to optimizing variational circuits. This talk also covers some lessons learned starting a spin-off from academia.
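The core meta-learning loop can be sketched in a few lines. The example below is a minimal illustration of our own, not the methods from the talk: the "learned object" is just a single hyper-parameter, the inner-loop learning rate, which an outer loop tunes by finite-difference gradient descent on the average post-training loss across a distribution of tasks.

```python
import numpy as np

# Minimal meta-learning sketch (illustrative only; all names here are ours).
# Inner task: minimize f(x) = (x - target)^2 with a few gradient steps.
def inner_loss_after_training(lr, target, steps=5, x0=0.0):
    x = x0
    for _ in range(steps):
        x -= lr * 2 * (x - target)      # one gradient step on f
    return (x - target) ** 2            # loss left after inner training

# Outer ("meta") loop: tune lr by finite-difference gradient descent on the
# average post-training loss over a distribution of tasks (random targets).
rng = np.random.default_rng(0)
tasks = rng.uniform(-2, 2, size=20)
lr, meta_lr, eps = 0.05, 0.05, 1e-4
for _ in range(200):
    g = np.mean([(inner_loss_after_training(lr + eps, t)
                  - inner_loss_after_training(lr - eps, t)) / (2 * eps)
                 for t in tasks])
    lr -= meta_lr * g
```

The same pattern, with the scalar learning rate replaced by initial weights, a parameterized loss, or a full learned optimizer, underlies the approaches described in the talk; here the meta-learned rate drives the average inner loss far below what the initial choice achieves.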
We propose a model for combining the Standard Model (SM) with gravity. It relies on a non-minimal coupling of the Higgs field to the Ricci scalar and on the Palatini formulation of gravity. Without introducing any new degrees of freedom in addition to those of the SM and the graviton, this scenario achieves two goals. First, it generates the electroweak symmetry breaking by a non-perturbative gravitational effect. In this way, it not only addresses the hierarchy problem but also opens up the possibility to calculate the Higgs mass.
Cosmologists wish to explain how our universe, in all its complexity, could ever have come about. This is the problem of initial conditions, and the first step towards its solution is the assessment of the universe’s entropy today. It is widely agreed that the entropy of vacuum energy, given by the Bekenstein bound, makes up the bulk of the current entropy budget, dominating over that of gravity and over thermal motions of the cosmic radiation background.