Algorithmic Information, Induction and Observers in Physics

Friday Apr 13, 2018

Landauer's famous dictum that 'information is physical' has been enthusiastically taken on by a range of communities, with researchers in areas from quantum and unconventional computing to biology, psychology, and economics adopting the language of information processing. However, this rush to make all science about computing runs the risk of collapsing into triviality: if every physical process is computing, then to say that something performs computation gives no meaningful information about it, leaving computational language devoid of content.

Friday Apr 13, 2018

In this talk, I show how information-theoretic concepts can be used to extend the scope of traditional Bayesianism. I will focus on the learning of indicative conditionals (“If A, then B”) and a Bayesian account of argumentation. We will see that there are also interesting connections to research done in the psychology of reasoning. The talk is partly based on the paper “Bayesian Argumentation and the Value of Logical Validity” (with Ben Eva, forthcoming in Psychological Review, http://philsci-archive.pitt.edu/14491/).
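As a toy illustration of the information-theoretic machinery involved, one common way to model learning an indicative conditional is to move from the prior to the closest distribution (in Kullback-Leibler divergence) that satisfies a constraint such as $P(B \mid A) = q$. The sketch below is only a generic numerical illustration of that idea, not the specific model of the paper cited above; the prior probabilities and the target value q = 0.95 are invented for the example.

```python
# Minimal sketch: "learning" the conditional "If A, then B" as a KL projection.
# The prior over the four atoms and the target q are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

prior = np.array([0.30, 0.30, 0.20, 0.20])  # P(A&B), P(A&~B), P(~A&B), P(~A&~B)
q = 0.95                                    # assumed post-learning value of P(B|A)

def kl(p, r):
    """Kullback-Leibler divergence D(p || r)."""
    return float(np.sum(p * np.log(p / r)))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},            # normalisation
    {"type": "eq", "fun": lambda p: p[0] / (p[0] + p[1]) - q},  # P(B|A) = q
]
res = minimize(kl, prior, args=(prior,), method="SLSQP",
               bounds=[(1e-9, 1.0)] * 4, constraints=constraints)
posterior = res.x

print("posterior:", posterior.round(4))
print("P(A) before:", round(prior[:2].sum(), 4), " after:", round(posterior[:2].sum(), 4))
```

Comparing $P(A)$ before and after the update already illustrates one question discussed in this literature: whether, and by how much, learning the conditional should change one's confidence in the antecedent.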

Thursday Apr 12, 2018

When applied to a physical system, the two main established notions of information, Shannon information and algorithmic information, explicitly neglect the mechanistic structure of the system under evaluation. Shannon information treats the system as a channel and quantifies correlations between the system’s inputs and outputs, or between its past and future states. Algorithmic information quantifies the length of the shortest program capable of reproducing the system’s outputs or dynamics.
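As a toy contrast between the two notions applied to a concrete system, the sketch below generates a noisy alternating binary time series and computes (i) the empirical Shannon mutual information between a state and its successor and (ii) the length of a zlib-compressed encoding of the whole trajectory, a crude upper bound on its algorithmic information. Both estimators are standard off-the-shelf approximations chosen purely for illustration, and neither looks at the system's mechanistic structure, which is exactly the limitation described above.

```python
# Toy contrast: Shannon information (past/future correlations) vs. a
# compression-based upper bound on algorithmic information, for the same
# illustrative dynamics. All parameters are arbitrary.
import random
import zlib
from collections import Counter
from math import log2

random.seed(0)

# Noisy alternation: x_{t+1} = 1 - x_t, flipped back with probability 0.1.
xs = [0]
for _ in range(9999):
    nxt = 1 - xs[-1]
    if random.random() < 0.1:
        nxt = 1 - nxt
    xs.append(nxt)

# (i) Shannon: empirical mutual information I(X_t ; X_{t+1}) in bits.
pairs = Counter(zip(xs, xs[1:]))
n = sum(pairs.values())
px = Counter(x for x, _ in pairs.elements())
py = Counter(y for _, y in pairs.elements())
mi = sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
         for (x, y), c in pairs.items())
print(f"empirical I(X_t ; X_t+1) = {mi:.3f} bits")

# (ii) Algorithmic: compressed length as an upper bound on K(trajectory).
raw = bytes(xs)
print("raw:", len(raw), "bytes; zlib-compressed:", len(zlib.compress(raw, 9)), "bytes")
```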

Thursday Apr 12, 2018

The causal Markov condition relates statistical dependences to causality. Its relevance is by now widely appreciated in machine learning, statistics, and physics. I describe the *algorithmic* causal Markov condition, which relates algorithmic dependences to causality and can be used to infer causal relations among single objects without referring to statistics. The underlying postulate, "no algorithmic dependence without causal relation", extends Reichenbach's Principle to a probability-free setting.
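For orientation, the algorithmic notion of dependence can be made precise as follows: the algorithmic mutual information of two finite objects (strings) $x$ and $y$ is standardly defined, up to an additive constant, as $I(x:y) = K(x) + K(y) - K(x,y)$, where $K$ denotes (prefix) Kolmogorov complexity. The postulate quoted above can then be read as: if $I(x:y)$ is significantly greater than zero, then $x$ and $y$ are causally connected, i.e. one causes the other or they share a common cause. This is one standard rendering of the algorithmic extension of Reichenbach's Principle; the talk gives the precise statement.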

Thursday Apr 12, 2018

A remark for last week's participants: this will be a condensed version of last week's talks. I will drop many details (in particular on the relation to quantum theory) and also skip the introductory slides on algorithmic probability (for these, see Marcus Hutter's introductory talk on Tuesday afternoon, April 10).

Thursday Apr 12, 2018

The progression of theories suggested for our world, from ego- to geo- to helio-centric models to universe and multiverse theories and beyond, shows one tendency: The size of the described worlds increases, with humans being expelled from their center to ever more remote and random locations. If pushed too far, a potential theory of everything (TOE) is actually more a theory of nothing (TON). Indeed, such theories have already been developed. I show that including observer localization into such theories is necessary and sufficient to avoid this problem.
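Schematically, the trade-off pointed to here can be written as a two-part description length: a candidate theory $T$ is scored, for a given observer $o$, by something like $\mathrm{Cost}(T,o) = K(T) + K(o \mid T)$, the bits needed to specify the world the theory generates plus the bits needed to locate the observer within it. A "theory of nothing" makes the first term tiny while letting the second term blow up; demanding that the sum be small is what forces observer localization back into the theory. This two-part form is only an illustrative gloss on the abstract, not a quotation of the speaker's formalism.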

Wednesday Apr 11, 2018

In accordance with Betteridge's Law of Headlines, the answer to the question in the title is "no." I will argue that the usual norms of Bayesian inference lead to the conclusion that quantum states are features of physical reality. The argument will involve both existing $\psi$-ontology results and an extension of them that avoids the use of the Cartesian Product Assumption. Since the usual norms of Bayesian inference lead to the conclusion that the quantum state is real, rejecting that conclusion requires abandoning virtually all of Bayesian information theory. This, I will argue, is unwarranted.

Wednesday Apr 11, 2018

The progression of theories suggested for our world, from ego- to geo- to helio-centric models to universe and multiverse theories and beyond, shows one tendency: The size of the described worlds increases, with humans being expelled from their center to ever more remote and random locations. If pushed too far, a potential theory of everything (TOE) is actually more a theory of nothing (TON). Indeed, such theories have already been developed. I show that including observer localization into such theories is necessary and sufficient to avoid this problem.

Wednesday Apr 11, 2018

In this talk I compare the normative concept of probability at the heart of QBism with the notion of probability implied by the use of Solomonoff induction in Markus Mueller's preprint arXiv:1712.01816.
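For reference, the algorithmic (Solomonoff) prior in question assigns to every finite bit string $x$ the a priori probability $M(x) = \sum_{p\,:\,U(p)=x*} 2^{-\ell(p)}$, where the sum ranges over the (minimal) programs $p$ that make a fixed universal monotone machine $U$ output a string beginning with $x$, and $\ell(p)$ is the length of $p$ in bits; prediction then proceeds via the conditional $M(xb)/M(x)$ for the next bit $b$. This standard definition is stated here only to fix what "probability implied by the use of Solomonoff induction" refers to; how it squares with QBism's normative reading of probability is the subject of the talk.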

Tuesday Apr 10, 2018

Algorithmic information theory (AIT) delivers an objective quantification of simplicity-qua-compressibility, which was employed by Solomonoff (1964) to specify a gold standard of inductive inference. Or so runs the conventional account, which I will challenge in my talk.
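To fix terms: the simplicity measure at issue is (prefix) Kolmogorov complexity, $K(x) = \min\{\ell(p) : U(p) = x\}$, the length in bits of the shortest program that makes a fixed universal machine $U$ output $x$; a string is simple exactly insofar as it is compressible. On the conventional account, Solomonoff-style induction then weights hypotheses roughly in proportion to $2^{-K}$, so that shorter descriptions receive higher prior probability. This gloss is included only as background; the talk disputes the conventional account, not the definitions.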
