Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events using video cameras installed in our lecture theatres. Perimeter now has seven formal presentation spaces for its many scientific conferences, seminars, workshops, and educational outreach activities, all with advanced audio-visual capabilities. Recordings of events in these areas are all available on demand from this Video Library and from the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded physics seminars, modelled in part after Cornell University's arXiv.org.
The PAMELA satellite-borne experiment was launched from the Baikonur cosmodrome on 15 June 2006 and has been collecting data since July 2006. The instrument is composed of a silicon-microstrip magnetic spectrometer, a time-of-flight system, a silicon-tungsten electromagnetic calorimeter, an anticoincidence system, a shower tail counter scintillator, and a neutron detector. The primary scientific goal is the measurement of the antiproton and positron energy spectra in order to search for exotic sources, such as dark matter particle annihilations.
Both classical probability theory and quantum theory lend themselves to a Bayesian interpretation where probabilities represent degrees of belief, and where the various rules for combining and updating probabilities are but algorithms for plausible reasoning in the face of uncertainty. I elucidate the differences and commonalities of these two theories, and argue that they are in fact the only two algorithms to satisfy certain basic consistency requirements.
The growth of matter perturbations in the presence of dark energy with small fluctuations depends on the speed of sound of these fluctuations and the comoving scale. The growth index can differ from the value that it takes in the limit of no dark energy perturbations by an amount comparable to the accuracy of future observations. This may contribute to a better characterization of the dark energy properties.
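For context, a standard parameterization of the growth of matter perturbations (the Linder growth-index form, a common convention rather than anything specific to this talk) writes the growth rate as

```latex
f(a) \;\equiv\; \frac{d\ln\delta_m}{d\ln a} \;\simeq\; \Omega_m(a)^{\gamma},
```

where $\delta_m$ is the matter density contrast and $\Omega_m(a)$ the matter density parameter. In $\Lambda$CDM, with dark energy perturbations neglected, $\gamma \approx 0.55$; clustering dark energy with a small sound speed can shift $\gamma$ on scales inside its sound horizon, which is the effect whose observability the abstract discusses.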
The warped geometry present in Randall-Sundrum (RS) models provides an elegant means by which to generate stable scale hierarchies. Given the famous hierarchy problem of the Standard Model, and the relatively small number of known mechanisms that may solve it, the RS model has deservedly received considerable attention. However, the construction of a completely realistic RS model remains difficult and requires a number of modifications beyond the minimal framework.
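As background (standard RS results, not specific claims of this talk), the warped metric and the induced scale hierarchy take the form

```latex
ds^2 \;=\; e^{-2k|y|}\,\eta_{\mu\nu}\,dx^{\mu}dx^{\nu} \;+\; dy^2,
\qquad
\Lambda_{\rm IR} \;=\; e^{-k\pi r_c}\,M_{\rm Pl},
```

where $k$ is the AdS curvature scale, $y \in [0, \pi r_c]$ the extra-dimensional coordinate, and $r_c$ the compactification radius. A modest value $k r_c \approx 11\text{--}12$ gives $e^{-k\pi r_c} \sim 10^{-16}$, redshifting the Planck scale down to the TeV scale on the infrared brane without large input hierarchies.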
If primordial black holes are produced at the end of inflation, they should quickly decay via Hawking radiation. For the most part the radiation signature of these black holes will be wiped out, as the universe is still radiation dominated when they disappear. The exception would be a stochastic background of gravitational waves. I present an algorithm by which the spectrum of radiation can be calculated, and discuss the dependence on the initial energy density and the number of relativistic species.
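The relevant scales can be estimated from the standard Hawking formulas (this is an order-of-magnitude sketch of why such black holes decay early, not the spectrum algorithm presented in the talk; the greybody-factor and species counting the talk describes are omitted):

```python
import math

# CODATA-level constants (SI units)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
kB = 1.381e-23     # Boltzmann constant, J/K

def hawking_temperature(M):
    """Hawking temperature T_H = hbar c^3 / (8 pi G M k_B), in kelvin.

    Smaller black holes are hotter: T_H scales as 1/M.
    """
    return hbar * c**3 / (8 * math.pi * G * M * kB)

def evaporation_time(M):
    """Evaporation lifetime t ~ 5120 pi G^2 M^3 / (hbar c^4), in seconds.

    Photons-only estimate; extra relativistic species shorten the lifetime,
    which is one of the dependences the abstract mentions.
    """
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)
```

For example, a 1 kg black hole has a Hawking temperature of roughly 1.2e23 K and evaporates in about 8e-17 s, well inside the radiation-dominated era, which is why only the gravitational-wave component of the emission survives.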
As the LHC era approaches, all sorts of ideas about physics beyond the Standard Model are being explored. It remains possible that a strongly coupled chiral theory could appear at the TeV scale. When it comes to strongly coupled theories, the lattice is still the most reliable and straightforward regularization method. But defining a chiral gauge theory on the lattice is a formidable problem in its own right. In this talk, I will present some of the most recent theoretical developments in attempts to tackle this problem, and explain some general theorems we proved for generic chiral gauge theories on the lattice.
The handling of the constraints on initial data is a major issue in most canonical formulations of general relativity. Since the 1960s, unconstrained initial data for GR living on null hypersurfaces has been known, but no canonical formulation based on these data was developed, due to conceptual and technical difficulties. I will explain how these difficulties have been overcome and outline the resulting canonical framework.
Lee Smolin has argued that one of the barriers to understanding time in a quantum world is our tendency to spatialize time. The question is whether there is anything in physics that could lead us to characterize time mathematically so that it is not just another funny spatial dimension. I will explore the possibility (already considered by Smolin and others) that time may be distinguished from space by what I will call a measure of Booleanity.