This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. Each session starts with an informal exposition of an interesting topic, research result, or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
The out-of-time-ordered correlator (OTOC) and entanglement are two physically motivated and widely used probes of the ``scrambling'' of quantum information, which has drawn great interest recently in quantum gravity and many-body physics. By proving upper and lower bounds for OTOC saturation on graphs with bounded degree and a lower bound for entanglement on general graphs, we show that the time scales of scrambling as given by the growth of OTOC and entanglement entropy can be asymptotically separated in a random quantum circuit model defined on graphs with a tight bottleneck.
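For reference, the OTOC mentioned above is conventionally defined as follows (a standard convention, stated here for unitary local operators $W$ and $V$; notation is illustrative, not necessarily that of the talk):

```latex
% Out-of-time-ordered correlator for local unitaries W, V, with
% W(t) = e^{iHt} W e^{-iHt} the Heisenberg-evolved operator and
% \langle \cdot \rangle_\beta a thermal expectation value:
F(t) = \langle W^\dagger(t)\, V^\dagger\, W(t)\, V \rangle_\beta ,
% and the related squared-commutator scrambling diagnostic,
% which for unitary W, V satisfies
C(t) = \langle [W(t), V]^\dagger [W(t), V] \rangle_\beta
     = 2\bigl(1 - \operatorname{Re} F(t)\bigr).
```

The decay of $F(t)$ (equivalently, the growth of $C(t)$) is one of the two scrambling time scales the abstract compares; the other is the growth of entanglement entropy.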
We study approximate quantum low-density parity-check (QLDPC) codes, which are approximate quantum error-correcting codes specified as the ground space of a frustration-free local Hamiltonian, whose terms do not necessarily commute. Such codes generalize stabilizer QLDPC codes, which are exact quantum error-correcting codes with sparse, low-weight stabilizer generators (i.e. each stabilizer generator acts on a few qubits, and each qubit participates in a few stabilizer generators).
One of the central problems in the study of quantum resource theories is to provide a given resource with an operational meaning, characterizing physical tasks relevant to information processing in which the resource can give an explicit advantage over all resourceless states. We show that this can always be accomplished for all convex resource theories. We establish in particular that any resource state enables an advantage in a channel discrimination task, allowing for a strictly greater success probability than any state without the given resource.
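A common quantitative form of results of this type (stated here as background; this is the standard generalized-robustness formulation, which may differ in detail from the talk's exact statement) is:

```latex
% Generalized robustness of a state \rho relative to the convex set F
% of resourceless ("free") states:
R_F(\rho) = \min \Bigl\{ s \ge 0 \;:\; \frac{\rho + s\,\tau}{1+s} \in F
            \ \text{for some state } \tau \Bigr\},
% Operational characterization: maximizing over discrimination tasks,
% the advantage of \rho over every free state is exactly
\max_{\text{tasks}} \;
  \frac{p_{\mathrm{succ}}(\rho)}{\max_{\sigma \in F} p_{\mathrm{succ}}(\sigma)}
  \;=\; 1 + R_F(\rho).
```

In particular, $R_F(\rho) > 0$ for every resourceful state in a convex theory, which is exactly the strict advantage the abstract describes.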
How violently do two quantum operators disagree? Different subfields of physics feature different notions of incompatibility: i) In quantum information theory, uncertainty relations are cast in terms of entropies. These entropic uncertainty relations constrain measurement outcomes. ii) Condensed matter and high-energy physics feature interacting quantum many-body systems, such as spin chains. A local perturbation, such as a Pauli operator on one side of a chain, spreads through many-body entanglement.
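The prototypical entropic uncertainty relation referred to in (i) is the Maassen–Uffink bound (stated here as standard background, not as the talk's result):

```latex
% Maassen–Uffink entropic uncertainty relation for two observables
% with eigenbases \{|x\rangle\} and \{|z\rangle\}; H is the Shannon
% entropy of the measurement-outcome distribution, and c the maximal
% overlap between the two bases:
H(X) + H(Z) \;\ge\; \log_2 \frac{1}{c},
\qquad
c = \max_{x,z} \, |\langle x | z \rangle|^2 .
```

The smaller the overlap $c$, the more incompatible the two measurements, and the stronger the joint constraint on their outcome entropies.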
Optimally encoding classical information in a quantum system is one of the oldest and most fundamental challenges of quantum information theory. Holevo’s bound places a hard upper limit on such encodings, while the Holevo-Schumacher-Westmoreland (HSW) theorem addresses the question of how many classical messages can be “packed” into a given quantum system. In this article, we use Sen’s recent quantum joint typicality results to prove a one-shot multiparty quantum packing lemma generalizing the HSW theorem.
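For context, Holevo's bound mentioned above takes the following standard form (background, not the article's new result):

```latex
% Holevo bound: for an ensemble \{p_x, \rho_x\} encoding a classical
% random variable X into a quantum system B, the accessible
% information of any measurement on B obeys
I_{\mathrm{acc}}(X : B) \;\le\; \chi
  = S\!\Bigl(\sum_x p_x \rho_x\Bigr) - \sum_x p_x\, S(\rho_x),
% where S is the von Neumann entropy.
```

The HSW theorem shows this quantity is asymptotically achievable for classical communication over a quantum channel, which is the "packing" statement the one-shot multiparty lemma generalizes.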
In this talk, I will discuss some interesting connections between Hamiltonian complexity, error correction, and quantum circuits. First, motivated by the Quantum PCP Conjecture, I will describe a construction of a family of local Hamiltonians where the complexity of ground states — even when subject to large amounts of noise — is superpolynomial (under plausible complexity assumptions). The construction is simple, making use of the well-known Feynman-Kitaev circuit Hamiltonian construction.
In the usual paradigm of quantum error correction, the information to be protected can be encoded in a system of abstract qubits or modes. But how does this work for physical information, which cannot be described in this way? Just as direction information cannot be conveyed using a sequence of words if the parties involved do not share a reference frame, physical quantum information cannot be conveyed using a sequence of qubits or modes without a shared reference frame. Covariant quantum error correction is a procedure for protecting such physical information against noise in such a way
The Leggett-Garg (LG) inequalities were introduced, as a temporal parallel of the Bell inequalities, to test macroscopic realism -- the view that a macroscopic system evolving in time possesses definite properties which can be determined without disturbing the future or past state. The talk will begin with a review of the LG framework. Unlike the Bell inequalities, the original LG inequalities are only a necessary condition for macrorealism, and are therefore not a decisive test.
We study correlations in fermionic systems with long-range interactions in thermal equilibrium. We prove an upper bound on the correlation decay between anticommuting operators based on long-range Lieb-Robinson type bounds. Our result shows that correlations between such operators in fermionic long-range systems of spatial dimension $D$, with at most two-site interactions decaying algebraically with the distance with an exponent $\alpha \geq 2\,D$, decay at least algebraically with an exponent arbitrarily close to $\alpha$.
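Schematically, the statement has the following form (illustrative only; the constant $C$ and the precise exponent $\alpha'$ are as in the talk's result):

```latex
% For anticommuting operators A, B supported on regions separated by
% distance d, in a fermionic system of spatial dimension D with two-site
% interactions decaying as d^{-\alpha}, \alpha \ge 2D, the thermal
% connected correlator obeys, for every \alpha' < \alpha,
\bigl| \langle A B \rangle_\beta
       - \langle A \rangle_\beta \langle B \rangle_\beta \bigr|
  \;\le\; \frac{C}{d^{\,\alpha'}} .
```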
The precise relationship between post-selected classical and post-selected quantum computation is an open problem in complexity theory. Post-selection has proven to be a useful tool in uncovering some of the differences between quantum and classical theories, in foundations and elsewhere. This is no less true in the area of computational complexity -- quantum computations augmented with post-selection are thought to be vastly more powerful than their classical counterparts. However, the precise reasons why this might be