This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. Each session starts with an informal exposition of an interesting topic, research result, or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
Probabilistic protocols in quantum information are an attempt to improve performance by occasionally reporting a better result than could be expected from a deterministic protocol. Here we show that probabilistic protocols can never improve performance beyond the quantum limits on the corresponding deterministic protocol. To illustrate this result we examine three common probabilistic protocols: probabilistic amplification, weak value amplification, and probabilistic metrology.
A fundamental question in complexity theory is how much resource is needed to solve k independent instances of a problem compared to the resource required to solve a single instance. Suppose that solving one instance of a problem with probability of correctness p requires c units of some resource in a given model of computation. A direct sum theorem states that computing k independent instances of the problem requires k times the resource needed to compute a single instance.
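In symbols, writing cost_p(f) for the resource needed to compute f with correctness probability p, and f^(k) for k independent instances (this notation is my own shorthand, not from the abstract), a direct sum theorem asserts:

```latex
\mathrm{cost}_p\!\left(f^{(k)}\right) \;\geq\; k \cdot \mathrm{cost}_p(f) \;=\; k\,c
```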
Quantum many-body problems are notoriously hard. This is partly because the Hilbert space grows exponentially with the particle number N. While exact solutions are often considered intractable, numerous approaches have been proposed that rely on approximations.
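As a rough illustration of this scaling (the local dimension d = 2, appropriate for spin-1/2 particles, and the particle counts below are illustrative assumptions, not from the abstract):

```python
# Hilbert-space dimension of N particles with local dimension d grows as d**N.
# For spin-1/2 particles, d = 2, so even N = 40 already exceeds 10**12 states.

def hilbert_dim(n_particles, local_dim=2):
    """Dimension of the many-body Hilbert space for n_particles sites."""
    return local_dim ** n_particles

for n in (10, 20, 30, 40):
    print(n, hilbert_dim(n))
```

Storing even one dense state vector at N = 40 would require terabytes of memory, which is why approximate methods are unavoidable at large N.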
Problems in computer science are often classified by the scaling of the runtimes of the algorithms that solve them. Easy problems are efficiently solvable, but in physics we often encounter problems that take too long to solve on a classical computer. Here we look at one such problem in the context of quantum error correction. We will further show that no efficient algorithm for this problem is likely to exist.
We show that in certain generic circumstances the state of light of an optical cavity traversed by beams of atoms is naturally driven towards a non-thermal metastable state. This state can be such that successive pairs of unentangled particles sent through the cavity will reliably emerge significantly entangled, thus providing a renewable source of quantum entanglement. Significant for possible experimental realizations is the fact that this entangling fixed-point state of the cavity can be reached largely independently of the initial state in which the cavity was prepared.
Matchgates are a restricted set of two-qubit gates known to be classically simulable when acting on nearest-neighbor qubits on a path, but universal for quantum computation when the gates can also act on more distant qubits. In this talk, I will address the power of matchgates when they can act on pairs of qubits according to the edges of arbitrary graphs. Specifically, we show that matchgates are universal on any connected graph other than a path or a cycle, and that they are classically simulable on a cycle.
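For concreteness, a matchgate G(A, B) is conventionally defined as a two-qubit gate that acts as A on the even-parity subspace spanned by |00> and |11>, and as B on the odd-parity subspace spanned by |01> and |10>, subject to det A = det B. A minimal sketch of this parametrization (the particular A and B below are illustrative choices, not from the talk):

```python
import numpy as np

def matchgate(A, B):
    """Assemble the two-qubit matchgate G(A, B) in the basis |00>,|01>,|10>,|11>.

    A acts on the even-parity subspace {|00>, |11>},
    B acts on the odd-parity subspace {|01>, |10>},
    and the matchgate condition det A = det B must hold.
    """
    assert np.isclose(np.linalg.det(A), np.linalg.det(B))
    G = np.zeros((4, 4), dtype=complex)
    G[0, 0], G[0, 3] = A[0, 0], A[0, 1]   # even-parity block
    G[3, 0], G[3, 3] = A[1, 0], A[1, 1]
    G[1, 1], G[1, 2] = B[0, 0], B[0, 1]   # odd-parity block
    G[2, 1], G[2, 2] = B[1, 0], B[1, 1]
    return G

# Example: a rotation on the even-parity subspace, identity on the odd one.
theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # det A = 1
B = np.eye(2)                                     # det B = 1
G = matchgate(A, B)
```

When A and B are unitary with equal determinants, G is unitary, which is what makes circuits of such gates well-defined quantum operations.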
The Bose-Hubbard model is a system of interacting bosons that live on the vertices of a graph. The particles can move between adjacent vertices and experience a repulsive on-site interaction. The Hamiltonian is determined by a choice of graph that specifies the geometry in which the particles move and interact. We prove that approximating the ground energy of the Bose-Hubbard model on a graph at fixed particle number is QMA-complete. In our QMA-hardness proof, we encode the history of an n-qubit computation in the subspace with at most one particle per site (i.e., hard-core bosons).
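A minimal numerical sketch of this setup, assuming the standard Bose-Hubbard form H = -t * sum over edges (i,j) of (a_i^dag a_j + h.c.) + (U/2) * sum over sites i of n_i(n_i - 1) at fixed particle number (the triangle graph and the values of t and U below are illustrative assumptions, not taken from the talk):

```python
from itertools import combinations_with_replacement
import numpy as np

def bose_hubbard(edges, n_sites, n_particles, t=1.0, U=2.0):
    """Dense Bose-Hubbard Hamiltonian in the fixed-particle-number sector."""
    # Basis: occupation tuples (n_0, ..., n_{V-1}) summing to n_particles.
    basis = []
    for sites in combinations_with_replacement(range(n_sites), n_particles):
        occ = [0] * n_sites
        for s in sites:
            occ[s] += 1
        basis.append(tuple(occ))
    index = {occ: k for k, occ in enumerate(basis)}

    H = np.zeros((len(basis), len(basis)))
    for k, occ in enumerate(basis):
        # On-site repulsion: (U/2) * n_i * (n_i - 1).
        H[k, k] = 0.5 * U * sum(n * (n - 1) for n in occ)
        # Hopping: a_dst^dag a_src with bosonic amplitude sqrt(n_src*(n_dst+1)).
        for i, j in edges:
            for src, dst in ((j, i), (i, j)):
                if occ[src] > 0:
                    new = list(occ)
                    new[src] -= 1
                    new[dst] += 1
                    amp = np.sqrt(occ[src] * (occ[dst] + 1))
                    H[index[tuple(new)], k] += -t * amp
    return H, basis

# Two bosons on a triangle graph.
H, basis = bose_hubbard(edges=[(0, 1), (1, 2), (0, 2)],
                        n_sites=3, n_particles=2)
ground_energy = np.linalg.eigvalsh(H)[0]
```

The basis here has size C(n_sites + n_particles - 1, n_particles), so this brute-force construction is only viable for small instances; the QMA-completeness result concerns approximating the ground energy as the graph grows.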
Although various pieces of indirect evidence about the nature of dark matter have been collected, its direct detection has eluded experimental searches despite extensive effort. If the mass of dark matter is below 1 MeV, it is essentially imperceptible to conventional detection methods because negligible energy is transferred to nuclei during collisions. Here I propose directly detecting dark matter through the quantum decoherence it causes rather than its classical effects such as recoil or ionization.
I discuss a technique - the quantum adversary upper bound - that uses the structure of quantum algorithms to gain insight into the quantum query complexity of Boolean functions. Using this bound, I show that there must exist an algorithm for a certain Boolean formula that uses a constant number of queries. Since the method is non-constructive, it does not give information about the form of the algorithm. After describing the technique and applying it to a class of functions, I will outline quantum algorithms that match the non-constructive bound.