**Larissa Albantakis, **University of Wisconsin-Madison

*Being vs. Happening: information from the intrinsic perspective of the system itself*

When applied to a physical system, the two main, established notions of information, Shannon Information and Algorithmic Information, explicitly neglect the mechanistic structure of the system under evaluation. Shannon information treats the system as a channel and quantifies correlations between the system’s inputs and outputs, or between its past and future states. Algorithmic information quantifies the length of the shortest program capable of reproducing the system’s outputs or dynamics. The goal in both cases is to predict the system’s behavior from the perspective of an extrinsic investigator. From the intrinsic perspective of the system itself, however, information must be physically instantiated to be causally relevant. For every ‘bit’, there must be some mechanism that is in one of two (or several) possible states, and which state it is in must matter to other mechanisms. In other words, the state must be “a difference that makes a difference” and implementation matters. By examining the informational and causal properties of artificial organisms (“animats”) controlled by small, adaptive neural networks (Markov Brains), I will discuss necessary requirements for intrinsic information, autonomy, and agency.
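To make the contrast between the two extrinsic notions concrete, here is a minimal illustrative sketch (my own, not from the talk): the empirical Shannon entropy of a sequence depends only on symbol frequencies, whereas compressed length, a crude computable proxy for algorithmic information, is sensitive to structure.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy_bits(s: str) -> float:
    """Empirical Shannon entropy of s, in bits per symbol."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

def compressed_length_bytes(s: str) -> int:
    """zlib-compressed length: a crude, computable proxy for
    algorithmic information (true Kolmogorov complexity is uncomputable)."""
    return len(zlib.compress(s.encode()))

# Two strings with identical symbol frequencies: a periodic one and a
# pseudo-random shuffle of it.  Their empirical Shannon entropies agree
# (1 bit per symbol), but the periodic string is far more compressible.
periodic = "01" * 500
random.seed(0)
shuffled = "".join(random.sample(periodic, len(periodic)))

print(shannon_entropy_bits(periodic), shannon_entropy_bits(shuffled))
print(compressed_length_bytes(periodic), compressed_length_bytes(shuffled))
```

Neither quantity, of course, captures the intrinsic perspective the abstract argues for; both are computed by an external investigator.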

**Ämin Baumeler, **Institute for Quantum Optics and Quantum Information, Vienna

*When Causality Is Relaxed: Classical Correlations, Computation, and Time Travel*

Following Stefan Wolf’s talk, we address the doubts expressed about fundamental space-time causality. Usually it is assumed that causal structures represent a definite partial ordering of events. By relaxing that notion, one risks problems of a logical nature. Yet, as we show, there exists a logically consistent world beyond the causal one, even in the classical realm where quantum theory is not invoked. We explore the classical correlations within, and the computational limits of, that world. It turns out that relaxing causality in this fashion does not allow for efficient computation of NP-hard problems. These results are related to closed time-like curves: Contrary to previous models of time travel, which necessitate quantum theory and violate the NP-hardness assumption, we obtain a computationally tame model for classical and reversible time travel where freedom of choice is unrestricted.

**Časlav Brukner, **Institute for Quantum Optics and Quantum Information, Vienna

*A no-go theorem for observer-independent facts*

In his famous thought experiment, Wigner assigns an entangled state to the composite quantum system made up of Wigner's friend and her observed system. While the two of them have different accounts of the process, both Wigner and his friend can in principle verify their respective state assignments by performing an appropriate measurement. As manifested through a click in a detector or a specific position of the pointer, the outcomes of these measurements can be regarded as reflecting directly observable "facts". Reviewing arXiv:1507.05255, I will derive a no-go theorem for observer-independent facts, which would be common to both Wigner and the friend. I will then analyze this result in the context of a newly derived theorem in arXiv:1604.07422, where Frauchiger and Renner prove that "single-world interpretations of quantum theory cannot be self-consistent". It is argued that "self-consistency" has the same implications as the assumption that observational statements of different observers can be compared in a single (and hence observer-independent) theoretical framework. The latter, however, may not be possible if the statements are to be understood as relational, in the sense that their determinacy is relative to an observer.

**Giulio Chiribella, **University of Oxford

*Quantum speedup in testing causal hypotheses*

An important ingredient of the scientific method is the ability to test alternative hypotheses on the causal relations among a given set of variables. In the classical world, this task can be achieved with a variety of statistical, information-theoretic, and computational techniques. In this talk I will address the extension from the classical scenario to the quantum scenario and, more generally, to general probabilistic theories. After introducing the basic hypothesis testing framework, I will focus on a concrete example, where the task is to identify the causal intermediary of a given variable, under the promise that the causal intermediary belongs to a given set of candidate variables. In this problem, I will show that quantum physics offers an exponential advantage over the best classical strategies, with a doubling of the exponential decay of the error probability. The source of the advantage can be found in the combination of two quantum features: the complementarity between information on the causal structure and other properties of the cause-effect relation, and the ability to perform multiple tests in a quantum superposition. An interesting possibility is that one of the "hidden principles" of quantum theory could concern our ability to test alternative causal hypotheses.

**John DeBrota, **University of Massachusetts Boston

*Symmetric Informationally Complete Measurements Identify the Essential Difference between Classical and Quantum.*

In this talk we describe a general procedure for associating a minimal informationally-complete quantum measurement (or MIC) with a probabilistic representation of quantum theory. Towards this, we make use of the idea that the Born Rule is a consistency criterion among subjectively assigned probabilities rather than a tool to set purely physically mandated probabilities. In our setting, the difference between quantum theory and classical statistical physics is the way their physical assumptions augment bare probability theory: Classical statistical physics corresponds to a trivial augmentation, while quantum theory makes crucial use of the Born Rule. We prove that the representation of the Born Rule obtained from a symmetric informationally-complete measurement (or SIC) minimizes the distinction between the two theories in at least two senses, one functional, the other geometric. Our results suggest that this representation supplies a natural vantage point from which to identify their essential differences, and, perhaps thereby, a set of physical postulates reflecting the quantum nature of the world.
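For reference, the SIC representation of the Born Rule alluded to here is standard in the QBism literature (not specific to this abstract): given a SIC measurement with $d^2$ outcomes $i$ on a $d$-dimensional system, the Born Rule takes the form of a slight deformation of the classical law of total probability,

$$
q(j) \;=\; \sum_{i=1}^{d^2} \left[ (d+1)\, p(i) - \frac{1}{d} \right] r(j\mid i),
$$

where $p(i)$ are the probabilities assigned to the SIC outcomes, $r(j\mid i)$ are conditional probabilities for a subsequent measurement, and replacing the bracketed term by $p(i)$ recovers the classical rule, making the "trivial augmentation" of classical statistical physics explicit.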

**Gemma De las Cuevas, **University of Innsbruck

*On the concepts of universality in physics and computer science*

A central fact in computer science is that there are universal machines, that is, machines that can run any other program. Recently, a somewhat similar notion of universality has been discovered in physics, by which some spin models can simulate all other models. In this work we shed light on the relation between the two concepts of universality.

**Matthew Evans, **Massachusetts Institute of Technology

*Gravitational Waves: Discoveries and Future Detectors*

Two years ago the Laser Interferometer Gravitational-wave Observatory (LIGO) announced the first direct detection of gravitational waves: minute distortions in space-time caused by cataclysmic events far away in the universe. Very recently, the merger of a binary neutron star system was detected by both of the Advanced LIGO detectors and the Advanced Virgo detector in Italy, triggering a massive follow-up campaign by ground- and space-based telescopes. A counterpart to the gravitational-wave source was located, and transient emission was detected from gamma rays to radio. I will talk about the sources of the signals we detected, the physics behind the detectors, and prospects for the future of this emerging field.

**Lucien Hardy, **Perimeter Institute

*Using humans to switch the settings in a Bell experiment*

I discuss how we might go about performing a Bell experiment in which humans are used to decide the settings at each end. To get a sufficiently high rate of switching at both ends, I suggest an experiment over a distance of about 100 km with 100 people at each end wearing EEG headsets, with the signals from these headsets being used to switch the settings. The radical possibility we wish to investigate is that, when humans are used to decide the settings (rather than various types of random number generators), we might then expect to see a violation of quantum theory, with the measured correlations instead satisfying the relevant Bell inequality. Such a result, while very unlikely, would be tremendously significant for our understanding of the world (and I will discuss some interpretations). Possible radical implications aside, performing an experiment like this would push the development of new technologies. The biggest challenge would be to achieve sufficiently high rates at which a human-induced switch occurs at each end, before a signal carrying the new value of the setting could be communicated to the other end, while, at the same time, a photon pair is detected. It looks like an experiment like this, while challenging, is just about feasible with current technologies.

**Stephan Hartmann, **Munich Center for Mathematical Philosophy

*Argumentation, Conditionals, and the Use of Information Theoretic Concepts in Bayesianism*

In this talk, I show how information theoretic concepts can be used to extend the scope of traditional Bayesianism. I will focus on the learning of indicative conditionals (“If A, then B”) and a Bayesian account of argumentation. We will see that there are also interesting connections to research done in the psychology of reasoning. The talk is partly based on the paper “Bayesian Argumentation and the Value of Logical Validity” (with Ben Eva, forthcoming in *Psychological Review*, http://philsci-archive.pitt.edu/14491/).

**Dominic Horsman, **University of Grenoble

*When does a physical system compute: Is physics more or less than computation?*

Landauer's famous dictum that 'information is physical' has been enthusiastically taken on by a range of communities, with researchers in areas from quantum and unconventional computing to biology, psychology, and economics adopting the language of information processing. However, this rush to make all science about computing runs the risk of collapsing into triviality: if every physical process is computing, then to say that something performs computation gives no meaningful information about it, leaving computational language devoid of content. In this talk I will give an introduction to Abstraction/Representation Theory, a framework for representing both computing and physical science that allows us to draw a meaningful distinction between them. The use of AR theory - with its commuting-diagrammatic framework and associated algebra of representation - allows us to take significant steps towards giving a formal language and framework for the processes of science. I will show how AR theory represents this process (including the potential for automation), and the insights it gives into the usage and limits of computation as a formal process language for, and description of, physical sciences.

**Marcus Hutter, **Australian National University

*Observer Localization in Multiverse Theories*

The progression of theories suggested for our world, from ego- to geo- to helio-centric models to universe and multiverse theories and beyond, shows one tendency: The size of the described worlds increases, with humans being expelled from their center to ever more remote and random locations. If pushed too far, a potential theory of everything (TOE) is actually more a theory of nothing (TON). Indeed, such theories have already been developed. I show that including observer localization into such theories is necessary and sufficient to avoid this problem. I develop a quantitative recipe to identify TOEs and distinguish them from TONs and theories in-between. This precisely shows what the problem is with some recently suggested universal TOEs.

**Dominik Janzing, **Max Planck Institute

*Causal inference rules for algorithmic dependences and why they reproduce the arrow of time*

The causal Markov condition relates statistical dependences to causality. Its relevance is meanwhile widely appreciated in machine learning, statistics, and physics. I describe the *algorithmic* causal Markov condition relating algorithmic dependences to causality, which can be used for inferring causal relations among single objects without referring to statistics. The underlying postulate "no algorithmic dependence without causal relation" extends Reichenbach's Principle to a probability-free setting. I argue that a related postulate called "algorithmic independence of initial state and dynamics" reproduces the non-decrease of entropy according to the thermodynamic arrow of time.
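Since Kolmogorov complexity is uncomputable, practical tests of algorithmic dependence use real compressors. A toy sketch of this idea (my own illustration, not Janzing's construction) is the normalized compression distance of Cilibrasi and Vitányi, which approximates algorithmic information distance between single objects:

```python
import zlib

def clen(b: bytes) -> int:
    """Compressed length: a computable upper bound on Kolmogorov complexity."""
    return len(zlib.compress(b, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a computable stand-in for the
    (uncomputable) normalized algorithmic information distance."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

text = b"the quick brown fox jumps over the lazy dog " * 20
related = text.replace(b"fox", b"cat")   # shares most structure with text
unrelated = bytes(range(256)) * 4        # shares essentially no structure

print(ncd(text, related))    # smaller: strong algorithmic dependence
print(ncd(text, unrelated))  # larger: essentially independent objects
```

On Janzing's postulate, a small distance between two single objects calls for a causal explanation (one caused the other, or they share a common cause), with no statistics or repeated sampling involved.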

**Markus Mueller, **Perimeter Institute & Institute for Quantum Optics and Quantum Information, Vienna

*From observers to physics via algorithmic information theory*

**Remark to last week's participants:** This will be a condensed version of last week's talks. I will drop many details (in particular on the relation to quantum theory) and also drop the introductory slides to algorithmic probability (for this, see Marcus Hutter's introductory talk on Tuesday afternoon, April 10).

Motivated by the conceptual puzzles of quantum theory and related areas of physics, I describe a rigorous and minimal “proof of principle” theory in which observers are fundamental and in which the physical world is a (provably) emergent phenomenon. This is a reversal of the standard view, which holds that physical theories ought to describe the objective evolution of a unique external world, with observers or agents as derived concepts that play no fundamental role whatsoever. Using insights from algorithmic information theory (AIT), I show that this approach makes it possible to address several foundational puzzles that are difficult to tackle via standard approaches. These include the measurement and Boltzmann brain problems, and problems related to the computer simulation of observers. Without assuming the existence of an external world from the outset, the resulting theory actually predicts that there is one as a consequence of AIT: in particular, a world with simple, computable, probabilistic laws on which different observers typically (but not always) agree. This approach represents a consistent but highly unfamiliar picture of the world, leading to a new perspective from which to approach some questions in the foundations of physics.

**Wayne Myrvold, **University of Western Ontario

*Can quantum states be understood as Bayesian states of belief?*

In accordance with Betteridge's Law of Headlines, the answer to the question in the title is "no." I will argue that the usual norms of Bayesian inference lead to the conclusion that quantum states are features of physical reality. The argument will involve both existing $\psi$-ontology results and an extension of them that avoids the use of the Cartesian Product Assumption. Since the usual norms of Bayesian inference lead to the conclusion that the quantum state is real, rejecting that conclusion requires abandoning virtually all of Bayesian information theory. This, I will argue, is unwarranted.

**Rüdiger Schack, **Royal Holloway University of London

*Normative probability in quantum mechanics*

In this talk I compare the normative concept of probability at the heart of QBism with the notion of probability implied by the use of Solomonoff induction in Markus Mueller's preprint arXiv:1712.01816.

**Tom Sterkenburg, **Munich Center for Mathematical Philosophy

*Algorithmic information theory: a critical perspective*

Algorithmic information theory (AIT) delivers an objective quantification of simplicity-qua-compressibility, which was employed by Solomonoff (1964) to specify a gold standard of inductive inference. Or so runs the conventional account, which I will challenge in my talk.

**Stefan Wolf, **Università della Svizzera italiana

*The Logic of Physical Law*

Landauer's principle claims that "Information is Physical." Its conceptual antipode, Wheeler's "It from Bit," has long been popular among computer scientists in the form of the Church-Turing hypothesis: All natural processes can be simulated by a universal Turing machine. Switching back and forth between the two paradigms, motivated by quantum-physical Bell correlations and the doubts they raise about fundamental space-time causality, we look for an intrinsic, physical notion of randomness and find one, namely complexity, around the second law of thermodynamics. Bell correlations combined with Kolmogorov complexity in the role of randomness imply an all-or-nothing nature of the Church-Turing hypothesis: Either beyond-Turing computations are physically impossible, or they can be carried out by "devices" as simple as individual photons. This latter result demonstrates in an exemplary way the fruitful interplay between physical and informational-computational principles.