
Defining the axioms of quantum theory and foil theory tests

The mathematical formulation of quantum theory, now over a century old, is complex and abstract. One of the aims of quantum foundations is to derive a more streamlined mathematical formalism of quantum theory from simple axioms, or statements of truth. In other words, could a 19th-century mathematician have used logic to discover quantum theory, without any of the experimental evidence that has accumulated since? Perimeter faculty member Lucien Hardy kickstarted this research program in 2001, before coming to Perimeter, and many of the key works in quantum foundations have been produced by Perimeter researchers continuing that program.

One influential set of papers, by Perimeter postdoctoral researcher Giulio Chiribella and collaborators, proposed a new central axiom known as the purification postulate. It implies that every physical process has a reversible realization: every physical process can be regarded as arising from a reversible interaction of the system with an environment that is eventually discarded. This requirement directly implies many quantum features, including no-cloning and quantum teleportation. This axiomatization suggests that quantum theory is the only pure and reversible theory of information, and the idea has been highly influential in the field.
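The purification postulate can be made concrete with a textbook construction (a Stinespring-style dilation; the sketch below is illustrative and not code from the papers, and `gamma`, `U`, and the function names are our own choices). It realizes an irreversible qubit channel, amplitude damping, as a reversible unitary on system plus environment, with the environment discarded at the end:

```python
import numpy as np

# Amplitude-damping channel: with probability gamma, the excited
# state |1> decays to |0>. (Illustrative example channel.)
gamma = 0.3
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)

def channel_kraus(rho):
    """The irreversible description: a sum over Kraus operators."""
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# The same process as a reversible (unitary) interaction with an
# environment qubit. Basis is |system, env> with index 2*s + e.
c, s = np.sqrt(1 - gamma), np.sqrt(gamma)
U = np.array([
    [1, 0,  0, 0],   # |00> -> |00>
    [0, c,  s, 0],   # decay amplitude moves the excitation into the env
    [0, -s, c, 0],
    [0, 0,  0, 1],   # |11> -> |11>
], dtype=complex)

def channel_dilated(rho):
    """Couple to an environment in |0>, evolve unitarily, discard the env."""
    env0 = np.array([[1, 0], [0, 0]], dtype=complex)
    joint = U @ np.kron(rho, env0) @ U.conj().T
    # Partial trace over the environment (axes e and e').
    return np.einsum('aibi->ab', joint.reshape(2, 2, 2, 2))

rho = np.array([[0.6, 0.2], [0.2, 0.4]], dtype=complex)
print(np.allclose(channel_kraus(rho), channel_dilated(rho)))  # True
```

The two descriptions agree on every input state, which is exactly what the postulate demands: the apparent irreversibility comes only from ignoring the environment.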

The axiomatization program seeks to define the boundaries and properties of quantum theory from within a landscape of possible theories. To do so, some researchers develop concrete alternatives to quantum theory, known as foil theories, against which quantum theory can be tested. Foil theories that share many features with quantum theory but differ just slightly are particularly effective, because they demonstrate the sort of work axioms must do to pick out quantum theory precisely, as opposed to the foil. 

Two of the most prominent foil theories originated at Perimeter. One, developed by Perimeter faculty member Rob Spekkens, is known as the Spekkens toy theory. It posits a world that is classical, not quantum, but places limits on how much any agent within that world can know about it. The results of this toy theory end up looking very nearly quantum in many respects. Another influential foil theory, developed by Perimeter postdoctoral researcher Jonathan Barrett, is known as Boxworld. Boxworld features a particular type of state, known as the Popescu-Rohrlich box, which produces correlations more strongly non-local than the rules of quantum theory permit.
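A Popescu-Rohrlich box is simple to state. The sketch below (illustrative; the function names are our own) defines the PR-box correlations and evaluates the CHSH expression, which classical theories cap at 2 and quantum theory at 2√2, while Boxworld reaches the algebraic maximum of 4:

```python
from math import sqrt

def pr_box(a, b, x, y):
    """Popescu-Rohrlich box: P(a, b | x, y) = 1/2 when a XOR b = x AND y,
    and 0 otherwise. Non-signalling, yet stronger than anything quantum."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def correlator(p, x, y):
    # E(x, y) = sum over a, b of (-1)^(a XOR b) * P(a, b | x, y)
    return sum((-1) ** (a ^ b) * p(a, b, x, y)
               for a in (0, 1) for b in (0, 1))

def chsh(p):
    return (correlator(p, 0, 0) + correlator(p, 0, 1)
            + correlator(p, 1, 0) - correlator(p, 1, 1))

print(chsh(pr_box))   # 4.0: the algebraic maximum
print(2 * sqrt(2))    # ~2.828: Tsirelson's bound, the quantum maximum
# Local classical theories cannot exceed 2.
```

The gap between 2√2 and 4 is precisely the sort of thing an axiom must explain: why nature stops at the quantum bound when the non-signalling rules alone would allow more.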

Through foil theory tests and the refinement of axioms, quantum foundations researchers are moving closer to understanding quantum theory – its limits and properties, and what that implies for the world we live in.
 

Related Papers:

Lucien Hardy, “Quantum theory from five reasonable axioms.” arXiv:quant-ph/0101012.

Giulio Chiribella, Giacomo Mauro D’Ariano, and Paolo Perinotti, “Probabilistic theories with purification.” Phys. Rev. A 81, 062348 (2010).

Giulio Chiribella, Giacomo Mauro D’Ariano, and Paolo Perinotti, “Informational derivation of quantum theory.” Phys. Rev. A 84, 012311 (2011).

Lluís Masanes and Markus P. Müller, “A derivation of quantum theory from physical requirements.” New J. Phys. 13, 063001 (2011).

Robert Spekkens, “Evidence for the epistemic view of quantum states: a toy theory.” Phys. Rev. A 75, 032110 (2007).

Jonathan Barrett, “Information processing in generalized probabilistic theories.” Phys. Rev. A 75, 032304 (2007).

Indefinite Causality

One of the most-studied aspects of causation in quantum foundations today originated in the work of Perimeter faculty member Lucien Hardy. Known as indefinite causality, it grew out of the quest to unite general relativity with quantum mechanics. A complete theory of nature must satisfy the lessons of both.

In general relativity, causal structures are dynamical: spacetime tells you which events come first and which come later. But spacetime isn’t fixed. A mass might move one way and not another, and that could affect the order of events as the system evolves in time.

In quantum theory, meanwhile, dynamical quantities are subject to indefiniteness. For example, a particle can be in a superposition of two locations at once.

A complete theory of quantum gravity, Hardy realized, must somehow combine the dynamical and the indefinite: it would have an indefinite causal structure. This presents a deep conceptual challenge to our usual ways of thinking about physics. What comes before and what comes after would no longer be definite. We wouldn’t be able to describe the world at a given moment and see how it evolves over time.

Starting in 2005, Hardy wrote a series of papers developing a general operational framework within which theories with indefinite causal structure can be formulated. 

One possible outgrowth of this framework is a quantum equivalence principle, in which causation remains definite locally (which is why we still perceive cause and effect) even if, on the grand scale, it is indefinite. This idea is just one possible interpretation of the implications of indefinite causality.

In 2009, Hardy proposed ways in which a ‘quantum gravity computer’ might take advantage of his indefinite causal structure theory. These papers inspired other research groups internationally and led to a proposed “quantum switch”: a way of preparing a superposition of different causal orders. The study of indefinite causal structure and its applications has become a major area of research within quantum foundations internationally.
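The quantum switch admits a compact simulation. In the sketch below (an illustration of the standard construction; the function names are our own), a control qubit prepared in the superposition |+⟩ places the two orderings B∘A and A∘B in superposition. Measuring the control in the ± basis then applies (BA ± AB)/2 to the target, which distinguishes commuting from anticommuting operations in a single run, a well-known application of the switch:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli Z

def switch_outcome_probs(A, B, psi):
    """Quantum switch with control in |+>: the joint state is
    (|0>(B A)|psi> + |1>(A B)|psi>) / sqrt(2). Projecting the control
    onto |+> or |-> applies (BA + AB)/2 or (BA - AB)/2 to the target;
    return the probabilities of the two control outcomes."""
    plus = (B @ A + A @ B) @ psi / 2
    minus = (B @ A - A @ B) @ psi / 2
    return np.vdot(plus, plus).real, np.vdot(minus, minus).real

psi = np.array([1, 0], dtype=complex)

# X and Z anticommute, so the '-' outcome occurs with certainty.
print(switch_outcome_probs(X, Z, psi))   # (0.0, 1.0)
# X commutes with itself, so the '+' outcome occurs with certainty.
print(switch_outcome_probs(X, X, psi))   # (1.0, 0.0)
```

No fixed ordering of A and B can achieve this in one use of each operation; the advantage comes entirely from the superposition of causal orders.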
 

Related Papers:

L. Hardy, “Probability Theories with Dynamic Causal Structure: A New Framework for Quantum Gravity,” arXiv:gr-qc/0509120.

L. Hardy, “Towards Quantum Gravity: A Framework for Probabilistic Theories with Non-Fixed Causal Structure,” J. Phys. A 40, 3081 (2007), arXiv:gr-qc/0608043.

L. Hardy, “Quantum gravity computers: On the theory of computation with indefinite causal structure,” in “Quantum Reality, Relativistic Causality, and Closing the Epistemic Circle,” Springer (2009), arXiv:quant-ph/0701019.

L. Hardy, “Formalism Locality in Quantum Theory and Quantum Gravity,” in “Philosophy of Quantum Information and Entanglement,” eds. A. Bokulich and G. Jaeger, CUP (2010), arXiv:0804.0054.

L. Hardy, “Implementation of the quantum equivalence principle,” arXiv:1903.01289.

Quantum Causal Inference

Causal inference is a well-established subfield of machine learning that seeks to determine cause-effect relationships from statistical data. It is used to answer questions in fields like medical treatment (what if we change the dosage of a medicine?) and policy (why do greenhouse gases change ocean salinity?). It often involves either finding, or ruling out, possible ‘hidden variables’ – unexpected causes that might be missed in the data. 

Perimeter faculty member Robert Spekkens and his graduate student Christopher Wood were among the first to realize the value of causal inference approaches in quantum physics, and their work gave rise to the new field of quantum causal inference.

Like causal inference researchers, quantum theorists often grapple with hidden variables. Quantum theory predicts strange correlations that are inexplicable in a classical world (like Einstein’s ‘spooky action at a distance’). With colleagues Podolsky and Rosen, Einstein argued that there must be some classical hidden variable causing these correlations. In 1964, John Bell showed that no such local hidden variable account can reproduce quantum theory’s predictions. That result was later confirmed experimentally, and those experiments earned the 2022 Nobel Prize in Physics.

Spekkens and Wood demonstrated how to conceptualize Bell’s theorem from the perspective of the field of causal inference. They showed that the standard (classical) notion of a causal model cannot provide a satisfactory explanation of the correlations predicted by quantum theory in a Bell experiment. This new perspective has become increasingly prominent, suggesting that there may be analogs of Bell’s theorem in other causal structures, distinct from the one considered by Bell.
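The obstruction can be seen by brute force. In a classical causal model of a Bell experiment, the hidden common cause assigns each measurement setting a definite outcome; the sketch below (illustrative, with our own names) enumerates every such deterministic strategy and confirms that the CHSH quantity never exceeds Bell's bound of 2, whereas quantum theory predicts values up to 2√2:

```python
from itertools import product
from math import sqrt

def corr(a, b):
    """Deterministic correlator: outcomes a, b in {0, 1} mapped to +/-1."""
    return (-1) ** a * (-1) ** b

# A hidden common cause fixes outcomes a0, a1 for Alice's two settings
# and b0, b1 for Bob's. Enumerate all 16 deterministic strategies.
best = max(
    abs(corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1))
    for a0, a1, b0, b1 in product((0, 1), repeat=4)
)
print(best)          # 2: the best any classical causal model can do
print(2 * sqrt(2))   # ~2.828: attainable with entangled quantum states
```

Mixing strategies (randomizing over the hidden cause) cannot help, since a convex combination never exceeds its best component; this is why the observed violations force a revision of the classical notion of a causal model.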

Led by that insight, Spekkens and research scientist Elie Wolfe launched a research program to develop new quantum causal models that can provide satisfactory explanations for Bell’s inequality violations.

They also developed algorithms for testing causal compatibility for arbitrary causal structures. As part of this effort, they developed the ‘inflation technique’ for causal inference, which has been highly influential in both classical and quantum circles. The inflation technique has become the gold standard for quantum researchers trying to witness quantum effects in different causal structures.

Because of its potential applications in fields across machine learning and data science, as well as its significance for understanding the very nature of quantum mechanics, quantum causal inference is an outstanding example of the synergy between fundamental research and technology innovation.

Related Papers:

Christopher Wood and Robert Spekkens, “The lesson of causal discovery algorithms for quantum correlations: causal explanations of Bell-inequality violations require fine-tuning.” New Journal of Physics 17, 033002 (2015).

Joe Henson, Raymond Lal, and Matthew F. Pusey, “Theory-independent limits on correlations from generalized Bayesian networks.” New Journal of Physics 16, 113043 (2014).

Elie Wolfe, Robert Spekkens, and Tobias Fritz, “The inflation technique for causal inference with latent variables.” Journal of Causal Inference 7 (2), 20170020 (2019).

M. Navascués and E. Wolfe, “The inflation technique solves completely the classical inference problem.” arXiv:1707.06476.

E. Wolfe, A. Pozas-Kerstjens, M. Grinberg, D. Rosset, A. Acín, and M. Navascués, “Quantum inflation: A general approach to quantum causal compatibility.” arXiv:1909.10519.
