Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events using video cameras installed in our lecture theatres. Perimeter now has seven formal presentation spaces for its many scientific conferences, seminars, workshops, and educational outreach activities, all with advanced audio-visual capabilities. Recordings of events in these areas are all available on demand from this Video Library and on the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded seminars from relevant bodies in physics. This resource has been partially modelled after Cornell University's arXiv.org.
With recent advances in experimental physics, macroscopic objects, which are typically well described by classical physics, can now be isolated so well from their environment that their quantum uncertainties can be studied quantitatively. In the research field called “optomechanics”, the mechanical motion of masses ranging from picograms to kilograms is being prepared into nearly pure quantum states and observed on time scales ranging from nanoseconds to milliseconds.
The Ryu-Takayanagi formula relates the entanglement entropy in a conformal field theory to the area of a minimal surface in its holographic dual. I will show that this relation can be inverted to reconstruct the bulk stress-energy tensor near the boundary of the bulk spacetime from the entanglement on the boundary. I will also show that the positivity and monotonicity of the relative entropy between the reduced density matrices of an excited state and of the ground state of the CFT, evaluated on small spherical domains, translate to energy conditions in the bulk.
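For reference, the Ryu-Takayanagi relation mentioned above is conventionally written as (with $\gamma_A$ denoting the minimal bulk surface anchored on, and homologous to, the boundary region $A$, and $G_N$ the bulk Newton constant):

```latex
% Ryu-Takayanagi formula: entanglement entropy of boundary region A
% equals the area of the minimal bulk surface gamma_A over 4 G_N.
\begin{equation}
  S(A) \;=\; \frac{\operatorname{Area}(\gamma_A)}{4 G_N}
\end{equation}
```

The inversion described in the abstract runs this dictionary in reverse: from the entanglement entropies $S(A)$ of small boundary regions, one reconstructs the near-boundary bulk geometry and hence the stress-energy tensor.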
Renormalization is a principled coarse-graining of space-time. It shows us how the small-scale details of a system may become irrelevant when looking at larger scales and lower energies. Coarse-graining is also crucial, however, for biological and cultural systems that lack a natural spatial arrangement. I introduce the notions of coarse-graining and equivalence classes, and give a brief history of attempts to tame the problem of simplifying and "averaging" objects as varied as algorithms and languages.