Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events such as talks from top scientists, using video cameras installed in our lecture theatres. Perimeter now has seven formal presentation spaces for its many scientific conferences, seminars, workshops, and educational outreach activities, all with advanced audio-visual capabilities.
Recordings of events in these areas are all available on demand from this Video Library and on the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded seminars from relevant bodies in physics, partially modelled after Cornell University's arXiv.org.
Accessible to anyone with an internet connection, this free library is part of Perimeter's aim to share the power and wonder of science.
Inferring a quantum system's state from repeated measurements is critical for verifying theories and designing quantum hardware. It's also surprisingly easy to do wrong, as illustrated by maximum likelihood estimation (MLE), the current state of the art. I'll explain why MLE yields unreliable and rank-deficient estimates, why you shouldn't be a quantum frequentist, and why we need a different approach. I'll show how operational divergences -- well-motivated metrics designed to evaluate estimates -- follow from quantum strictly proper scoring rules.
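To see how rank-deficient estimates can arise, here is a minimal single-qubit sketch (my illustration, not the speaker's method): linear inversion of finite-sample Pauli measurements typically yields a Bloch vector outside the unit ball, and the crude constrained fix of projecting back onto physical states pins the estimate to the boundary, i.e. a rank-1 (rank-deficient) density matrix.

```python
import numpy as np

rng = np.random.default_rng(1)

# True state: the pure state |0>, Bloch vector (0, 0, 1)
r_true = np.array([0.0, 0.0, 1.0])
N = 100  # shots per measurement basis

# Estimate each Pauli expectation value from finite counts
r_hat = []
for r in r_true:
    p_up = (1 + r) / 2                  # probability of the +1 outcome
    ups = rng.binomial(N, p_up)         # simulated counts
    r_hat.append(2 * ups / N - 1)       # empirical expectation value
r_hat = np.array(r_hat)

# Linear inversion gives |r_hat| > 1: an unphysical "state"
print(np.linalg.norm(r_hat))

# Projecting onto the Bloch ball (a stand-in for the constrained
# maximum-likelihood fit) lands exactly on the boundary, so the
# estimated density matrix has a zero eigenvalue: rank-deficient.
r_mle = r_hat / max(1.0, np.linalg.norm(r_hat))
print(np.linalg.norm(r_mle))
```

Because the true state is pure, the z-estimate is exactly 1, so any statistical noise in x and y pushes the linear-inversion estimate outside the physical set, and the constrained estimate never reports a full-rank state.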
TBA
TBA
In stochastic treatments of the EPRB set-up, it is equivalent to impose Bell's inequalities, a local causality condition, or a certain "non-contextual hidden variables" condition. But these conditions are violated by quantum mechanics. On the other hand, it is possible to view quantum mechanics as part of "quantum measure theory", a generalization of probability measure theory that allows pairwise interference between histories whilst banning higher-order interference. In this setting, it may be possible to find quantum analogues of the three stochastic conditions.
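The interference hierarchy mentioned above can be stated concretely (this is Sorkin's sum-rule formulation; the notation is mine, not the abstract's): for a quantum measure $\mu$ on disjoint sets of histories, the second-order interference term may be nonzero, while the third-order term vanishes:

```latex
I_2(A,B) = \mu(A \sqcup B) - \mu(A) - \mu(B) \neq 0 \quad \text{(allowed)},
\qquad
I_3(A,B,C) = \mu(A \sqcup B \sqcup C) - \mu(A \sqcup B) - \mu(B \sqcup C)
           - \mu(A \sqcup C) + \mu(A) + \mu(B) + \mu(C) = 0.
```

Classical probability measures satisfy $I_2 = 0$; quantum mechanics sits at the next level of the hierarchy, where $I_2$ can be nonzero but $I_3$ always vanishes.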
TBA
Quantum information theory has two equivalent mathematical conjectures concerning quantum channels, which are also equivalent to other important conjectures concerning entanglement. In this talk I will explain these conjectures and present recent results.
It is a fundamental property of quantum mechanics that non-orthogonal pure states cannot be distinguished with certainty, which leads to the following problem: given a state picked at random from some ensemble, what is the maximum probability of correctly determining which state we actually have? I will discuss two recently obtained analytic lower bounds on this optimal probability. An interesting case to which these bounds can be applied is that of ensembles consisting of states that are themselves picked at random.
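For the simplest case of two pure states, the optimal probability in question is given by the Helstrom bound. The following sketch (my worked example, not taken from the talk) computes it for two pure states with arbitrary priors:

```python
import numpy as np

def helstrom_success(psi, phi, p=0.5):
    """Optimal success probability for distinguishing the pure states
    |psi> and |phi>, given with prior probabilities p and 1 - p
    (the Helstrom bound)."""
    overlap = abs(np.vdot(psi, phi)) ** 2
    return 0.5 * (1 + np.sqrt(1 - 4 * p * (1 - p) * overlap))

# Example: distinguish |0> from |+> with equal priors
psi = np.array([1.0, 0.0])
phi = np.array([1.0, 1.0]) / np.sqrt(2)
print(helstrom_success(psi, phi))  # ~ 0.854
```

As sanity checks, identical states give success probability 1/2 (pure guessing at equal priors), and orthogonal states give 1 (perfect distinguishability).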
Ancillary state construction is a necessary component of quantum computing.
The Everett (many-worlds) interpretation has made great progress over the past 20-30 years, largely due to the role of decoherence in providing a solution to the preferred basis problem. This makes it a serious candidate for a realist solution to the measurement problem. A remaining objection to the Everett interpretation (and one that is often considered fatal) is that the interpretation cannot make adequate sense of quantum probabilities.
Using results from models of the atmosphere/ocean/sediment carbon cycle, Prof. Archer will examine the impacts of fossil-fuel CO2 release, including its effect on climate many thousands of years into the future, rather than just the few centuries commonly claimed. He will explain how aspects of the Earth system, such as the growth or melting of the great ice sheets, the thawing of permafrost, and the release of methane from the methane hydrate deposits in the deep ocean, take thousands of years to respond to a change in climate.
Check back for details on the next lecture in Perimeter's Public Lectures Series