This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. Each session starts with an informal exposition of an interesting topic, research result, or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
Quantum computation by single-qubit measurements was proposed by Raussendorf and Briegel [PRL 86, 5188] as a potential scheme for implementing quantum computers. It also offers an unusual means of describing unitary transformations. To better understand which measurement-based procedures perform unitary operations, we may consider the following problem: under what circumstances can a measurement-based procedure for a unitary U be found, given a similar procedure for U that relies on post-selection?
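As a minimal illustration of the post-selection question (a toy example, not taken from the talk): the standard one-qubit teleportation gadget implements a Hadamard via a single X-basis measurement. Post-selecting on outcome 0 yields H|psi> directly, while the deterministic measurement-based version needs an outcome-dependent Pauli correction. A NumPy sketch, assuming real input amplitudes so global phases do not arise:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                  # Pauli X
CZ = np.diag([1.0, 1, 1, -1])                   # controlled-Z
plus = np.array([1, 1]) / np.sqrt(2)            # |+> ancilla

def gadget(psi, m):
    """Entangle psi with |+> via CZ, measure qubit 1 in the X basis
    with outcome m, and return the normalised state left on qubit 2."""
    state = (CZ @ np.kron(psi, plus)).reshape(2, 2)
    bra = np.array([1, 1 - 2 * m]) / np.sqrt(2)  # <+| for m=0, <-| for m=1
    out = bra @ state                            # contract away qubit 1
    return out / np.linalg.norm(out)

psi = np.array([0.6, 0.8])                       # arbitrary real input
assert np.allclose(gadget(psi, 0), H @ psi)      # post-selected: H directly
assert np.allclose(X @ gadget(psi, 1), H @ psi)  # deterministic: correct by X^m
```

The deterministic procedure succeeds for either outcome only because the required correction is a Pauli that can be commuted through later operations, which is one reason converting post-selected procedures into deterministic ones is nontrivial in general.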
By storing quantum information in the degenerate ground state of a Hamiltonian, one hopes to make it robust against noise processes. We will examine this situation, with particular emphasis on the toric code in 2D, and show how adversarial effects, whether perturbations to the Hamiltonian or interactions with an environment, destroy the stored information extremely quickly.
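For reference (standard material, not spelled out in the abstract), the 2D toric code stores information in the fourfold-degenerate ground space of

```latex
H \;=\; -\sum_{v} A_v \;-\; \sum_{p} B_p,
\qquad
A_v = \prod_{e \ni v} X_e,
\qquad
B_p = \prod_{e \in \partial p} Z_e,
```

where $A_v$ acts on the four edges meeting vertex $v$ and $B_p$ on the four edges bounding plaquette $p$; all terms commute, and logical errors correspond to non-contractible loops of Pauli operators on the torus.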
We analyze how quantum complexity places bounds on the simulation of quantum systems. While methods such as Density Functional Theory (DFT) and the Density Matrix Renormalization Group (DMRG) work very well in practice, essentially nothing is known about their formal requirements. In this talk, we consider these methods from a quantum complexity perspective. First, we discuss DFT, which encapsulates the difficulty of solving the Schrödinger equation in a universal functional, and show that this functional cannot be efficiently computed unless several complexity classes collapse.
We study the possibility of a self-correcting quantum memory based on stabilizer codes with geometrically local stabilizer generators. We prove that the distance of such stabilizer codes in D dimensions is bounded by O(L^{D-1}), where L is the linear size of the D-dimensional lattice. In addition, we prove that in D=1 and D=2, the energy barrier separating different logical states is upper-bounded by a constant independent of L. This shows that in such systems there is no natural energy dissipation mechanism which prevents errors from accumulating.
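As a sanity check on the bound (an illustration, not part of the result's proof): for the 2D toric code on an $L \times L$ torus,

```latex
d \;\le\; O\!\left(L^{D-1}\right) \;=\; O(L) \quad \text{for } D = 2,
```

which the toric code saturates, since its distance is exactly $L$. Its energy barrier is likewise constant: a logical error can be built up by dragging a single pair of anyons around a non-contractible loop, paying only an $O(1)$ energy cost at every intermediate step.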
The rise of quantum information science has been paralleled by the development of a vigorous research program aimed at obtaining an informational characterization, or reconstruction, of the quantum formalism. This program works within a broad framework for stochastic theories that encompasses quantum and classical theory, but also a wide variety of other theories that can serve as foils to them.
I would like to provide a short, possibly elementary, introduction to the problem of computing string amplitudes at higher genus for superstrings. Essentially, I will recall the mathematical problem of defining the path integral measure (which has a well-defined algebraic-geometry realization for bosonic strings) and the solution proposed by D'Hoker and Phong for the genus-2 case. Their main results are the chirally split form of the measure and its explicit expression at genus two.
In this talk I will give an introduction to the simulation of quantum many-body systems using so-called tensor networks. After a brief historical review, I will introduce the basics of tensor network representations of quantum states and explain some recent developments. In particular, in the last part of my talk I will focus on recent results obtained in the simulation of 2-dimensional quantum lattice systems of infinite size.
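As a small, self-contained illustration of the basic object involved (a toy sketch, not the 2D infinite-size methods of the talk): any state vector can be decomposed into a matrix-product state by successive singular value decompositions, truncating each bond to a chosen dimension chi. For a 4-qubit GHZ state, bond dimension 2 already represents the state exactly:

```python
import numpy as np

def to_mps(psi, n, chi):
    """Split an n-qubit state vector into MPS tensors of shape
    (bond_left, 2, bond_right) via repeated truncated SVDs."""
    tensors, rest = [], psi.reshape(1, -1)
    for _ in range(n - 1):
        rest = rest.reshape(rest.shape[0] * 2, -1)
        U, s, Vh = np.linalg.svd(rest, full_matrices=False)
        k = min(chi, len(s))                      # truncate the bond
        tensors.append(U[:, :k].reshape(-1, 2, k))
        rest = s[:k, None] * Vh[:k]
    tensors.append(rest.reshape(-1, 2, 1))
    return tensors

def to_vector(tensors):
    """Contract the MPS back into a full state vector."""
    out = tensors[0]
    for t in tensors[1:]:
        out = np.tensordot(out, t, axes=([-1], [0]))
    return out.reshape(-1)

ghz = np.zeros(16)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)      # (|0000> + |1111>)/sqrt(2)
mps = to_mps(ghz, 4, chi=2)            # Schmidt rank 2 at every cut
assert np.allclose(to_vector(mps), ghz)
```

The point of the representation is that the number of parameters grows linearly in n for fixed chi, rather than exponentially, which is what makes large-system simulation feasible when the entanglement across cuts stays bounded.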
A quantum channel models a physical process in which noise is added to a quantum system via interaction with its environment. Protecting quantum systems from such noise can be viewed as an extension of the classical communication problem introduced by Shannon sixty years ago. A fundamental quantity of interest is the quantum capacity of a given channel, which measures the amount of quantum information that can be protected, in the limit of many transmissions over the channel.
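For reference (the standard expression, not spelled out in the abstract), the quantum capacity is given by the regularised coherent information (Lloyd, Shor, Devetak):

```latex
Q(\mathcal{N}) \;=\; \lim_{n \to \infty} \frac{1}{n}
  \max_{\rho} I_c\!\left(\rho, \mathcal{N}^{\otimes n}\right),
\qquad
I_c(\rho, \mathcal{N}) \;=\; S\!\left(\mathcal{N}(\rho)\right) - S\!\left(\mathcal{N}^c(\rho)\right),
```

where $S$ is the von Neumann entropy and $\mathcal{N}^c$ is the complementary channel to the environment; the regularisation over $n$ is what makes the capacity hard to compute in general.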
I will report on efforts to implement a new method for simulating concatenated quantum error correction, in which many levels of concatenation are simulated together explicitly. That is, the approach involves a Monte Carlo simulation of a noisy circuit involving many thousands of qubits, rather than the tens of qubits of previous approaches. The new approach allows the threshold and resource usage of concatenated quantum error correction to be determined more accurately than before.
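As a toy version of the idea (an illustration under simplified assumptions, not the simulator described in the talk): a Monte Carlo estimate of the logical failure rate of a 3-qubit repetition code under independent bit flips, the kind of quantity such simulations extract for far larger circuits:

```python
import random

def logical_error_rate(p, trials=100_000, seed=1):
    """Monte Carlo estimate of the logical failure rate of the 3-qubit
    bit-flip repetition code: each qubit flips independently with
    probability p, and majority-vote decoding fails on >= 2 flips."""
    rng = random.Random(seed)
    failures = sum(
        sum(rng.random() < p for _ in range(3)) >= 2
        for _ in range(trials)
    )
    return failures / trials

# Below the threshold p = 1/2, encoding helps: the exact failure
# rate is 3p^2 - 2p^3 (= 0.028 at p = 0.1, versus a bare rate of 0.1).
print(logical_error_rate(0.1))
```

Concatenating the code corresponds to feeding the logical rate back in as the physical rate, so failure probabilities fall doubly exponentially in the number of levels below threshold; direct simulation replaces that idealised recursion with an explicit noisy circuit.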
One of the quintessential features of quantum information is its exclusivity, the inability of strong quantum correlations to be shared by many physical systems. Likewise, complementarity has a similar status in quantum mechanics as the sine qua non of quantum phenomena. We show that this is no coincidence, and that the central role of exclusivity in quantum information theory stems from the phenomenon of complementarity.