This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. The sessions start with an informal exposition of an interesting topic, research result or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
In this talk, I will outline a quantum generalization of causal networks that are used to analyze complex probabilistic inference problems involving large numbers of correlated random variables. I will review the framework of classical causal networks and the graph theoretical constructions that are abstracted from them, including entailed conditional independence, d-separation and Markov equivalence.
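Since d-separation is a purely graph-theoretic criterion, it can be checked mechanically. The sketch below is a generic illustration (not code from the talk) of the standard ancestral-moral-graph test, representing a DAG as a dict mapping each node to its set of parents:

```python
from collections import deque

def ancestors(dag, nodes):
    """All nodes in `nodes` plus their ancestors in the DAG.
    dag: dict mapping each node to the set of its parents."""
    seen, stack = set(), list(nodes)
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(dag.get(n, ()))
    return seen

def d_separated(dag, xs, ys, zs):
    """True iff xs and ys are d-separated given zs, via the
    ancestral-moral-graph criterion."""
    keep = ancestors(dag, set(xs) | set(ys) | set(zs))
    # Moralize: undirected parent-child edges, plus edges between co-parents.
    adj = {n: set() for n in keep}
    for child in keep:
        ps = [p for p in dag.get(child, ()) if p in keep]
        for p in ps:
            adj[p].add(child); adj[child].add(p)
        for i in range(len(ps)):
            for j in range(i + 1, len(ps)):
                adj[ps[i]].add(ps[j]); adj[ps[j]].add(ps[i])
    # Delete the conditioning nodes, then test connectivity with BFS.
    blocked = set(zs)
    reached = set(xs) - blocked
    frontier = deque(reached)
    while frontier:
        n = frontier.popleft()
        for m in adj[n] - blocked:
            if m not in reached:
                reached.add(m); frontier.append(m)
    return reached.isdisjoint(ys)
```

For the collider A → C ← B, for instance, A and B are d-separated given nothing, but conditioning on the collider C opens the path, as the talk's review of entailed conditional independence will explain.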
The modern view of representing a quantum observable as a semispectral measure as opposed to the traditional approach of using only spectral measures has added a great deal to our understanding of the mathematical structures and conceptual foundations of quantum mechanics.
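A minimal example of the distinction: the qubit "trine" POVM consists of three effects that sum to the identity yet are not projections, so it is a semispectral measure with no spectral counterpart. A quick numerical check (an illustrative sketch, not material from the talk):

```python
import math

def outer(v):
    """Rank-one matrix |v><v| for a real 2-vector v."""
    return [[v[i] * v[j] for j in range(2)] for i in range(2)]

# Trine POVM: three unsharp effects E_k = (2/3)|psi_k><psi_k|,
# with |psi_k> real qubit states at 120-degree angles.
effects = []
for k in range(3):
    th = 2 * math.pi * k / 3
    v = (math.cos(th), math.sin(th))
    effects.append([[2 / 3 * x for x in row] for row in outer(v)])

# The effects sum to the identity (a valid semispectral measure) ...
total = [[sum(E[i][j] for E in effects) for j in range(2)] for i in range(2)]

# ... but each effect fails E^2 = E, so none is a projection.
E = effects[1]
E2 = [[sum(E[i][k] * E[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
```

Here each effect has eigenvalues 2/3 and 0, which is exactly what a spectral (projection-valued) measure cannot accommodate.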
Adiabatic quantum computation is not only a possibly more robust alternative to standard quantum computation: since it considers a continuous-time evolution of the system, it also provides a natural bridge towards studying the dynamics of interacting many-particle quantum systems, quantum phase transitions and other issues in fundamental physics. After a brief review of adiabatic quantum computation, I will present our recent results on the dynamics of entanglement and fidelity for the search and Deutsch algorithms, including several variations and optimizations.
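To get a feel for the setup, the adiabatic search problem can be simulated exactly in the two-dimensional subspace spanned by the marked state and the uniform superposition over the remaining items. The sketch below (an illustration assuming a simple linear interpolation schedule, not the talk's own code or its optimized schedules) integrates the Schrödinger equation with RK4:

```python
import math

def adiabatic_grover(N, T, steps=20000):
    """Simulate adiabatic search over N items for total time T, in the
    2D invariant subspace {|m>, |m_perp>}. H(s) interpolates linearly
    between I - |psi0><psi0| and I - |m><m|. Returns the final
    probability of measuring the marked item."""
    a = 1.0 / math.sqrt(N)          # <m|psi0>
    b = math.sqrt((N - 1) / N)      # <m_perp|psi0>

    def H(s):
        # Matrix elements of H(s) = (1-s)(I - |psi0><psi0|) + s(I - |m><m|)
        h00 = (1 - s) * (1 - a * a)
        h01 = (1 - s) * (-a * b)
        h11 = (1 - s) * (1 - b * b) + s
        return h00, h01, h11

    def deriv(s, psi):
        h00, h01, h11 = H(s)
        p0, p1 = psi
        # Right-hand side of i d(psi)/dt = H psi, i.e. -i H psi
        return (-1j * (h00 * p0 + h01 * p1),
                -1j * (h01 * p0 + h11 * p1))

    psi = (complex(a), complex(b))   # start in the ground state |psi0>
    dt, t = T / steps, 0.0
    for _ in range(steps):
        k1 = deriv(t / T, psi)
        k2 = deriv((t + dt / 2) / T, tuple(p + dt / 2 * k for p, k in zip(psi, k1)))
        k3 = deriv((t + dt / 2) / T, tuple(p + dt / 2 * k for p, k in zip(psi, k2)))
        k4 = deriv((t + dt) / T, tuple(p + dt * k for p, k in zip(psi, k3)))
        psi = tuple(p + dt / 6 * (c1 + 2 * c2 + 2 * c3 + c4)
                    for p, c1, c2, c3, c4 in zip(psi, k1, k2, k3, k4))
        t += dt
    return abs(psi[0]) ** 2
```

Running slowly enough relative to the minimum gap (which scales as 1/sqrt(N) at s = 1/2) keeps the system in the instantaneous ground state and finds the marked item with high probability; running much faster than the adiabatic timescale does not.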
Clifton, Bub, and Halvorson claim to be able to derive quantum mechanics from information-theoretic axioms. However, their derivation relies on the auxiliary assumption that the relevant probabilities for measurement outcomes can be represented by the observables (self-adjoint operators) and states of a C*-algebra. There are legitimate probability theories that are not so representable --- in particular, the nonlocal boxes of Popescu and Rohrlich.
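The Popescu-Rohrlich box is easy to state concretely: outcomes a, b for settings x, y are uniformly random subject to a XOR b = x AND y, which drives the CHSH combination to its algebraic maximum of 4, beyond both the classical bound of 2 and the quantum (Tsirelson) bound of 2*sqrt(2) ≈ 2.83. A small check (illustrative only):

```python
def pr_box(a, b, x, y):
    """PR-box distribution p(a,b|x,y): uniform over the two outcome
    pairs satisfying a XOR b = x AND y, zero otherwise."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def chsh(box):
    """CHSH value S = sum over settings of (-1)^(xy) * E(x,y),
    where E(x,y) = sum over outcomes of (-1)^(a XOR b) * p(a,b|x,y)."""
    total = 0.0
    for x in (0, 1):
        for y in (0, 1):
            corr = sum((-1) ** (a ^ b) * box(a, b, x, y)
                       for a in (0, 1) for b in (0, 1))
            total += (-1) ** (x * y) * corr
    return total
```

A deterministic local box, say one that always outputs a = b = 0, reaches only the classical bound of 2, so the PR box is a legitimate non-signaling probability theory strictly outside the quantum set.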
It is somewhat surprising, but problems in quantum computing lead to problems in algebraic graph theory. I will discuss some instances that I am familiar with, and note a common thread.
This talk is concerned with the noise-insensitive transmission of quantum information. For this purpose, the sender incorporates redundancy by mapping a given initial quantum state to a messenger state on a higher-dimensional Hilbert space. This encoding scheme allows the receiver to recover part of the initial information if the messenger system is corrupted by interaction with its environment. Our noise model for the transmission leaves part of the quantum information unchanged; that is, we assume the presence of a noiseless subsystem or of a decoherence-free subspace.
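A standard small example of such a decoherence-free subspace (illustrative only, and not necessarily the talk's noise model): under collective dephasing exp(-i*theta*(Z1+Z2)) on two qubits, the span of |01> and |10> is left exactly invariant, since both basis states have total Z eigenvalue zero:

```python
import cmath

def collective_dephasing(state, theta):
    """Apply exp(-i*theta*(Z1+Z2)) to a two-qubit state given as
    amplitudes (a00, a01, a10, a11) in the computational basis."""
    # Z1+Z2 eigenvalues on |00>, |01>, |10>, |11> are 2, 0, 0, -2.
    phases = [cmath.exp(-1j * theta * z) for z in (2, 0, 0, -2)]
    return tuple(p * a for p, a in zip(phases, state))
```

Any state supported on |01> and |10> passes through unchanged for every theta, so one logical qubit encoded there survives this noise perfectly, whereas a state with support on |00> or |11> picks up relative phases.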
We will look at the axioms of quantum mechanics as expressed, for example, in the book by M. A. Nielsen and I. L. Chuang ("Quantum Computation and Quantum Information"). We then take a critical look at these axioms, raising several questions as we go. In particular, we will look at the possible informational completeness property of the family of operators that we measure. We will propose physical solutions based on the results of quantum mechanics on phase space and the measurement of quantum particles by quantum mechanical means.
Inspired by the notion that the differences between quantum theory and classical physics are best expressed in terms of information theory, Hardy (2001) and Clifton, Bub, and Halvorson (2003) have constructed frameworks general enough to embrace both quantum and classical physics, within which one can invoke principles that distinguish the classical from the quantum.
Entanglement entropy is currently of interest in several areas of physics, such as condensed matter, field theory, and quantum information. One of the most interesting properties of the entanglement entropy is its scaling behavior, especially close to phase transitions. It was believed that for dimensions higher than one the entropy scales like the surface area of the subsystem. We will describe a recent result for free fermions at zero temperature, where the entropy in fact scales faster. This problem will be related to a mathematical conjecture due to H. Widom (1982).
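In standard notation, the contrast at stake can be summarized as follows (a summary of well-known scaling forms, with model-dependent constants C and C'; the one-dimensional formula is the conformal result for free fermions, central charge c = 1):

```latex
% Area law expected for a region of linear size L in d > 1 dimensions:
S(L) \sim C \, L^{d-1},
% whereas gapless free fermions with a finite Fermi surface obey
S(L) \sim C' \, L^{d-1} \ln L,
% and in one dimension a block of length \ell satisfies
S(\ell) = \tfrac{1}{3} \ln \ell + \mathrm{const}.
```

The logarithmic enhancement over the area law is the "scales faster" behavior described in the abstract, and the coefficient in higher dimensions is where Widom's conjecture enters.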
While modern theories lavishly invoke extra spatial dimensions within models that seek to unify relativity theory and quantum mechanics, none seems to consider the possibility that a yet-unfamiliar aspect of time may do the work. I introduce the notion of Becoming and then consider its consequences for physical theory. Becoming portrays a possible aspect of time that is "curled" very much like the extra spatial dimensions in superstring theories.