This series consists of weekly discussion sessions on foundations of quantum theory and quantum information theory. The sessions start with an informal exposition of an interesting topic, research result or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
What do a fractional quantum Hall liquid and Kitaev's proposals for topological quantum computation have in common? It turns out that both are physical systems exhibiting degenerate ground states with properties seemingly different from ordinary (Landau-type) vacua, such as the ground states of a Heisenberg magnet. For example, these (topologically quantum ordered) states cannot be characterized by (local) order parameters such as magnetization. How does one characterize this new order?
Wigner-Dirac relativistic quantum theory is applied to decay laws of an unstable particle in different reference frames. It is shown that decay slows down from the point of view of the moving observer, as expected. However, small deviations from Einstein's time dilation formula are also found. The origin of these deviations is discussed, as well as possibilities for their experimental detection.
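For background, the standard special-relativistic prediction against which such deviations would be measured is simple to state: a particle with rest-frame lifetime tau, moving at speed v, appears to decay with the dilated lifetime gamma*tau. A minimal sketch (illustrating only the textbook Einstein formula, not the Wigner-Dirac corrections discussed in the talk):

```python
import math

def survival_probability(t, tau, v=0.0):
    """Exponential decay law with Einstein time dilation: in a frame where
    the particle moves with speed v (in units of c), the rest-frame
    lifetime tau is stretched by the Lorentz factor gamma."""
    gamma = 1.0 / math.sqrt(1.0 - v**2)
    return math.exp(-t / (gamma * tau))

# A muon-like particle (tau = 2.2 microseconds) observed in flight at
# v = 0.99c survives noticeably longer than one at rest.
t = 5e-6
print(survival_probability(t, 2.2e-6))        # decayed mostly at rest
print(survival_probability(t, 2.2e-6, 0.99))  # larger survival when moving
```

The deviations discussed in the abstract would appear as small corrections to the gamma factor in this formula.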
The Hamiltonian of traditionally adopted detector models features off-diagonal elements between the vacuum and the one-particle states of the field to be detected. We argue that reasonably good detectors, when written in terms of fundamental fields, have a more trivial response on the vacuum. In particular, the model configuration "detector in its ground state + vacuum of the field" generally corresponds to a stable bound state of the underlying theory (e.g.
The talk concerns a generalization of the concept of a minimum uncertainty state to the finite-dimensional case. Instead of considering the product of the variances of two complementary observables, we consider an uncertainty relation involving the quadratic Rényi entropies summed over a full set of mutually unbiased bases (MUBs).
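For context (background, not part of the abstract), the best-known relation of this kind rests on an identity usually attributed to Ivanovic: for a state $\rho$ in dimension $d$ with outcome probabilities $p(i|b)$ in the $b$-th of a complete set of $d+1$ MUBs,

```latex
\sum_{b=1}^{d+1} \sum_{i=1}^{d} p(i|b)^2 \;=\; \operatorname{Tr}\rho^2 + 1 \;\le\; 2 ,
```

so that, writing $H_2(b) = -\log_2 \sum_i p(i|b)^2$ for the quadratic (collision) Rényi entropy, convexity of $x \mapsto 2^{-x}$ gives the bound

```latex
\sum_{b=1}^{d+1} H_2(b) \;\ge\; (d+1)\,\log_2\!\frac{d+1}{2} .
```

States saturating such a bound play the role of minimum uncertainty states in the finite-dimensional setting.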
Coin flipping by telephone (Blum \'81) is one of the most basic cryptographic tasks of two-party secure computation. In a quantum setting, it is possible to realize (weak) coin flipping with information theoretic security. Quantum coin flipping has been a longstanding open problem, and its solution uses an innovative formalism developed by Alexei Kitaev for mapping quantum games into convex optimization problems.
Decoherence attempts to explain the emergent classical behaviour of a quantum system interacting with its quantum environment. In order to formalize this mechanism we introduce the idea that the information preserved in an open quantum evolution (or channel) can be characterized in terms of observables of the initial system. We use this approach to show that information which is broadcast into many parts of the environment can be encoded in a single observable. This
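A minimal numerical sketch of the broadcasting idea (illustrative only, not the formalism introduced in the talk): CNOT gates from a system qubit onto environment qubits copy the Z-basis information of the system into every part of the environment, while the complementary X-basis information is not broadcast.

```python
import numpy as np

def cnot(n, control, target):
    """CNOT on n qubits as a 2^n x 2^n permutation matrix
    (qubit 0 is the most significant bit)."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        U[j, i] = 1.0
    return U

n = 3  # one system qubit + two environment qubits
# System in (|0> + |1>)/sqrt(2), environment in |00>.
psi = np.kron(np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, 0, 0, 0]))
U = cnot(n, 0, 2) @ cnot(n, 0, 1)
out = U @ psi  # GHZ state: Z information copied to both environment qubits

# The Z observable on each environment qubit now reproduces the system's
# initial Z statistics (<Z> = 0 for the |+> state).
Z, I = np.diag([1.0, -1.0]), np.eye(2)
Z_env1 = np.kron(np.kron(I, Z), I)
rho = np.outer(out, out)
print(np.trace(rho @ Z_env1))  # approximately 0, matching the system's <Z>
```

In this toy model the single observable carrying the broadcast information is simply Z on the system; the X information, by contrast, ends up only in global correlations and cannot be read from any one fragment.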
I should like to show how particular mathematical properties can limit our metaphysical choices, by discussing old and new theorems within the statistical-model framework of Mielnik, Foulis & Randall, and Holevo, and what these theorems have to say about possible metaphysical models of quantum mechanics.
The "frequency comb" defined by the eigenmodes of an optical resonator is a naturally large set of exquisitely well-defined quantum systems, as exemplified by the broadband mode-locked lasers that have redefined time/frequency metrology and ultra-precise measurements in recent years. High coherence can therefore be expected in the quantum version of the frequency comb, in which nonlinear interactions couple different cavity modes, as can be modeled by different forms of graph states.
We construct a simple translationally invariant, nearest-neighbor Hamiltonian on a chain of 10-dimensional qudits that makes it possible to realize universal quantum computing without any external control during the computational process, requiring only initial product state preparation. Both the quantum circuit and its input are encoded in an initial canonical basis state of the qudit chain. The computational process is then carried out by the autonomous Hamiltonian time evolution.
The renormalization group (RG) is one of the conceptual pillars of statistical mechanics and quantum field theory, and a key theoretical element in the modern formulation of critical phenomena and phase transitions. RG transformations are also the basis of numerical approaches to the study of low energy properties and emergent phenomena in quantum many-body systems. In this colloquium I will introduce the notion of "entanglement renormalization" and use it to define a coarse-graining transformation for quantum systems on a lattice [G. Vidal, Phys. Rev. Lett.
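The flavor of such a coarse-graining step can be conveyed with a toy sketch (illustrative only, and omitting the disentanglers that distinguish entanglement renormalization from plain truncation): map a pair of lattice sites to one effective site via an isometry built from the leading eigenvectors of the pair's reduced density matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

d, chi = 2, 2  # physical site dimension and kept (coarse) dimension
# Random normalized state on two pairs of sites, written as a
# (left pair) x (right pair) matrix of amplitudes.
psi = rng.normal(size=(d * d, d * d))
psi /= np.linalg.norm(psi)

# Reduced density matrix of the left pair, and an isometry w that keeps
# its chi dominant eigenvectors (w w^T = identity on the coarse space).
rho_pair = psi @ psi.T
evals, evecs = np.linalg.eigh(rho_pair)
w = evecs[:, -chi:].T

# One coarse-graining step: each pair of sites becomes one effective site.
coarse = w @ psi @ w.T
print(1.0 - np.linalg.norm(coarse) ** 2)  # weight discarded by the truncation
```

In entanglement renormalization proper, unitary disentanglers act across block boundaries before the isometries, removing short-range entanglement so that far less weight is discarded at each step.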