Skyrms's Higher Order Degrees of Belief


Shivaram Lingamneni
March 13

1 Logical structure

Disclaimers: I don't agree with Skyrms, I don't understand many of his points, and I may not understand his overall goals.

By his own admission, Skyrms is performing two fairly distinct tasks:

1. Defending higher-order probabilities against accusations of inconsistency, illegitimacy, and triviality (and also irrelevance)
2. Developing (sketches of) formal systems that include higher-order probabilities

The relationship between the two efforts is unclear. Are properties (robustness, elegance) of the formal systems intended to buttress the original case for having higher-order probabilities at all? It seems likely (e.g., page 118).

2 Apologetic

2.1 The Cantor's theorem paradox

Higher-order beliefs appear to require the construction of a type hierarchy, or perhaps an untyped cumulative hierarchy, as in set theory. The alternative is outright inconsistency (pg. 110). Good and Savage seem to have raised objections along these lines.

2.2 Miller's paradox

David Miller claims the following inconsistency arises in higher-order formal systems (with a single undifferentiated probability operator):

1. Pr(¬E) = Pr(E | Pr(E) = Pr(¬E))
2. Pr(E | Pr(E) = Pr(¬E)) = 1/2

Together these give Pr(¬E) = 1/2 for arbitrary E. This is bad. Also, if we take E to be a tautology ⊤, it immediately contradicts the Kolmogorov axioms, which set P(⊤) = 1.

Skyrms resolves the paradox by blocking the introduction of premise 1 when P(E) ≠ 1/2. His objection is that there is an equivocation on the de dicto / de re distinction. Personally I don't fully understand Skyrms's argument, but here's a possibly equivalent one: there is an equivocation on the meaning of "|", or "given that". In premise 2, "|" may be read as "after restricting to cases in which Pr(E) = Pr(¬E)", i.e., the standard definition of a conditional probability P(A | B) as the probability of A relativized to the subset of the space in which B occurs. All situations where Pr(E) = Pr(¬E) must have Pr(E) = .5, so if we consider only those cases it seems we are compelled to reach a final probability of Pr(E) = .5, and thus to believe premise 2. However, for premise 1 to be plausible, we must read "|" as logical consequence under the assumption that Pr(E) = Pr(¬E). Under a reading where "|" performs restriction/relativization, as before, premise 1 is false; if our overall estimate of Pr(¬E) is influenced by cases where Pr(E) = Pr(¬E) does not hold, then it might be greater or less than .5. (Imagine 10 equally likely possible worlds; in 9 of them, Pr(E) = .9, but in the last one, Pr(E) = Pr(¬E) = .5. Then if we aggregate over all the worlds, we get Pr(¬E) = .14, but Pr(E | Pr(E) = Pr(¬E)) = .5.)
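To make the two readings concrete, here is a minimal sketch (my own, in Python; not from Skyrms or the paper) of the ten-world example above:

```python
# Ten equally likely worlds: in nine of them Pr(E) = .9; in the last,
# Pr(E) = Pr(not-E) = .5.
worlds = [0.9] * 9 + [0.5]  # Pr(E) in each world

# "Restriction" reading: aggregate Pr(not-E) over all worlds.
print(sum(1 - p for p in worlds) / len(worlds))  # ~0.14

# The reading that makes premise 2 true: restrict to worlds where
# Pr(E) = Pr(not-E), then aggregate.
matching = [p for p in worlds if p == 1 - p]
print(sum(1 - p for p in matching) / len(matching))  # 0.5
```

Premise 1 equates these two quantities, which plainly come apart.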

2.3 De Finetti's objection

De Finetti holds that higher-order probabilities are only attitudes or emotions, and therefore not events that can have probabilities. Skyrms replies that he prefers Ramsey's pragmatism, in which beliefs have causal properties, since they are propensities to act. Skyrms relates (but does not identify) this objection with another, that higher-order probabilities are trivial (i.e., always 0 or 1). In practice, we seem not to know our own minds; our beliefs are legitimate objects of uncertainty. Skyrms sees both of these replies as related to a dichotomy between positivism and pragmatism.

2.4 Savage's objection

Savage objects that higher-order probabilities induce an infinite hierarchy (which is bad) with an infinite structure of consistency conditions (even worse); the conditions take the form P_i(B) = E(P_{i+1}(B)), i.e., your degree of belief is the expected value of your higher-order degree of belief. Savage suggests that it might be preferable to model uncertain degrees of belief via (for example) a convex set of probability measures. Skyrms seems to reply that higher-order beliefs are themselves beliefs: modeling uncertainty in first-order beliefs (e.g., Savage's convex sets) does not solve the problem of modeling the agent's own awareness of that uncertainty (Skyrms's higher-order system). Skyrms affirms that first-order belief should be the expected value or mean of the corresponding second-order belief, but notes that the second-order belief can additionally exhibit low or high variance, corresponding to the agent's sureness.
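As a toy illustration of this last point (mine, not Skyrms's), take a second-order belief to be a distribution over candidate values of the first-order credence P(B); the consistency condition fixes the mean, while the variance registers sureness. The candidate values and weights below are hypothetical:

```python
def summarize(values, weights):
    """Mean and variance of a discrete second-order distribution over
    candidate values of P(B)."""
    mean = sum(v * w for v, w in zip(values, weights))  # P_i(B) = E(P_{i+1}(B))
    var = sum(w * (v - mean) ** 2 for v, w in zip(values, weights))
    return mean, var

values = [0.2, 0.5, 0.8]                     # candidate first-order credences
print(summarize(values, [0.1, 0.8, 0.1]))    # ~(0.5, 0.018): a sure agent
print(summarize(values, [0.45, 0.1, 0.45]))  # ~(0.5, 0.081): unsure, same mean
```

Both agents satisfy Savage's condition with the same first-order belief; only the second-order distribution distinguishes them.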

3 Formal systems

3.1 Jeffrey conditionalization

Skyrms first discusses what conditionalization is and why it's a good thing. I'm guessing we all more or less agree with him there. He then gives a definition (pg. 122) of probability kinematics, that is to say, Jeffrey conditionalization over a partition of mutually exclusive events {p_i : i = 1…n}: for two probability measures (or distributions) Pr_i and Pr_f, the update is by probability kinematics on the partition iff Pr_f(q | p_i) = Pr_i(q | p_i) for every proposition q and every element p_i. The thrust of this is that Pr_i and Pr_f agree on all conditional probabilities over the elements of the partition. Thus, coherence implies that Pr_f's unconditional probabilities may be interpreted as Jeffrey updates from Pr_i's unconditional probabilities and the shared conditional probabilities.

(Aside: what's the foundation for supposing that we can describe all or most knowledge in such a framework? Do the unchanging conditional probabilities reflect a fixed underlying model of causality, e.g., physics?)

3.2 Higher-order belief

Skyrms now makes the case that it is epistemologically necessary to be able to work with uncertainty about one's first-order beliefs (pg. 122): the assumption that every observation can be interpreted as conferring certainty to some observational proposition leads to an unacceptable "epistemology of the given". Skyrms appears to be anticipating the following objection: his system has a first-order "I see a green cloth" and a second-order "but I'm not sure about that". Why not fold these into a single first-order observation "I had an indistinct visual sensation of a green cloth", and model our uncertainty by lowering the conditional probabilities associated with the observation? Skyrms seems to be holding that the idealizing assumptions here (that we can perfectly describe our indistinct visual sensation) are too steep.

Aside: later on (pg. 126), he alludes to the desirability of isolating purely observational probabilities, claims that Hartry Field has succeeded in doing this¹, and uses this to support the case for higher-order probability: we need it to account for weighting of observations according to our sureness of them. So this doesn't contradict his earlier position, but it seems to involve a comparable idealizing assumption to the one he rejected: how can we extract the pure observational content of an experience?

¹ Field's contribution appears to be a technical reparametrization of Jeffrey conditionalization; he explicitly disclaims having philosophically justified that his new parameter isolates purely observational data.
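Before moving on, here is a minimal sketch of the update rule from section 3.1 (my own formulation of the standard Jeffrey update, not Skyrms's notation): hold the conditional probabilities fixed and reweight by the new partition masses, Pr_f(q) = Σ_j Pr_i(q | p_j) Pr_f(p_j). The propositions and numbers are illustrative:

```python
def jeffrey_update(prior_joint, new_cell_probs):
    """prior_joint[j][q] = Pr_i(q and p_j); new_cell_probs[j] = Pr_f(p_j).
    Returns Pr_f(q) for each q, holding Pr_i(q | p_j) constant."""
    posterior = {}
    for j, cell in prior_joint.items():
        cell_mass = sum(cell.values())  # Pr_i(p_j)
        for q, mass in cell.items():
            # (mass / cell_mass) is Pr_i(q | p_j), the shared conditional.
            posterior[q] = posterior.get(q, 0.0) + (mass / cell_mass) * new_cell_probs[j]
    return posterior

# The cloth looked green: shift Pr(green) from .3 to .7.
prior = {"green": {"sold": 0.24, "unsold": 0.06},
         "not_green": {"sold": 0.14, "unsold": 0.56}}
print(jeffrey_update(prior, {"green": 0.7, "not_green": 0.3}))
# ~{'sold': 0.62, 'unsold': 0.38}
```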

3.3 Skyrms shows us the money: the sufficiency and Miller conditions

At this point, Skyrms constructs a formal system which incorporates first-order and second-order probabilities. He then states/proves an equivalence: conditionalization on first-order beliefs coincides with probability kinematics on a partition {p_j} iff, for q any first-order proposition and all elements p_j of the partition:

PR(q | p_j ∧ ⋀_i pr(p_i) = a_i) = PR(q | p_j)

This is the sufficiency condition, and it asserts that your second-order beliefs are causally inert: your beliefs about your beliefs do not affect interactions between real-world propositions. This seems to correspond exactly with the defining assumption of probability kinematics, that the conditional probabilities remain constant. Furthermore, it is natural to also require the generalized Miller condition: for all elements p_j of the partition,

PR(p_j | ⋀_i pr(p_i) = a_i) = a_j

which asserts explicitly that your final probability for a first-order proposition must be exactly your first-order probability for it.

Question: what does it mean to condition on second-order probabilities? Is this the operation of introspecting, realizing that you have certain beliefs, then updating your other beliefs based on them? Skyrms's definition of probability kinematics avoids relating these ideas to an explicitly specified process of belief update.

Skyrms observes in passing that if the first-order probabilities are not causally inert, the sufficiency condition fails. These are worth a look:

Exchangeable sequence

Skyrms points out that sufficiency fails when pr is interpreted as objective probability ("propensity probability", "chance"), essentially since chances have real-world consequences. His example is as follows: consider two coins, C_a with pr(H) = .25 and C_b with pr(H) = .75. With probability .5, pick C_a and generate an infinite sequence of flips; with probability .5, pick C_b and do likewise. Given the events {H_i : i = 1, 2, 3, …}, where H_i denotes a result of heads on the ith flip, let PR describe the overall probability of the event (i.e., incorporating the mixing over the coins), then let pr describe the objective probability of the event (i.e., .75 or .25 depending on what coin we're using). Now we may compute:

PR(H_2 | pr(H_1) = .75) = PR(H_2 | C_b) = .75

i.e., since the coin's chance is constant across all flips, and

PR(H_2 | H_1 ∧ pr(H_1) = .75) = PR(H_2 | H_1 ∧ C_b) = .75

since knowing which coin it is screens off all evidence from actual flips. (Skyrms points out that this is the reverse of the sufficiency condition, which says that actual events should screen off evidence of the form pr(A) = j.) But, by contrast:

PR(H_2 | H_1) = PR(H_1 ∧ H_2) / PR(H_1)
             = [PR(H_1 ∧ H_2 | C_a) PR(C_a) + PR(H_1 ∧ H_2 | C_b) PR(C_b)] / [PR(H_1 | C_a) PR(C_a) + PR(H_1 | C_b) PR(C_b)]
             = (.0625 × .5 + .5625 × .5) / (.25 × .5 + .75 × .5) = .3125 / .5 = .625

rather than .75 as before. (Skyrms has 9/16 = .5625, but I think .625 is right; the argument stands either way.)
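A brute-force check of this computation (my own encoding of the two-coin mixture; it agrees with the .625 reading):

```python
from itertools import product

chance = {"Ca": 0.25, "Cb": 0.75}  # pr(H) for each coin
pick = {"Ca": 0.5, "Cb": 0.5}      # mixing probabilities

def PR(event):
    """Overall probability of an event over outcomes (coin, flip1, flip2)."""
    total = 0.0
    for coin, f1, f2 in product(chance, [True, False], [True, False]):
        p = pick[coin]
        p *= chance[coin] if f1 else 1 - chance[coin]
        p *= chance[coin] if f2 else 1 - chance[coin]
        if event(coin, f1, f2):
            total += p
    return total

h1 = lambda c, f1, f2: f1
h1_and_h2 = lambda c, f1, f2: f1 and f2
cb = lambda c, f1, f2: c == "Cb"
h2_and_cb = lambda c, f1, f2: f2 and c == "Cb"

print(PR(h2_and_cb) / PR(cb))  # PR(H_2 | pr(H_1) = .75) = 0.75
print(PR(h1_and_h2) / PR(h1))  # PR(H_2 | H_1) = 0.625
```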

Black Bart

Skyrms also points out that mental states can cause objective physical phenomena, via involuntary physical reactions; thus, higher-order probabilities can have first-order consequences, violating the sufficiency condition. Skyrms's example is an event K, "Black Bart will try to kill me", and an event S, "I will sweat when I see Black Bart". Skyrms points out that

PR(S | K) < PR(S | K ∧ pr(K) = .99)

i.e., my credence in S is influenced not by K itself (since I have no direct access to Black Bart's intentions), but by pr(K) (since my subjective estimate of his intentions is what causes me to sweat). The agent here is clearly highly idealized; he has involuntary reactions but also precise introspection over them. One might suspect this example involves some kind of fallacious reification?
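Here is a toy joint model of the Black Bart case (my construction with stipulated numbers; Skyrms gives no explicit model): S is driven by the agent's credence pr(K) rather than by K itself, so conditioning on pr(K) = .99 raises the probability of sweating even given K.

```python
from itertools import product

p_high = 0.5                    # P(pr(K) = .99); otherwise pr(K) = .5
p_k = {True: 0.99, False: 0.5}  # P(K | credence state)
p_s = {True: 0.9, False: 0.2}   # P(S | credence state): sweat tracks credence

def PR(event):
    """Sum the joint probability over all (K, credence state, S) outcomes."""
    total = 0.0
    for high, k, s in product([True, False], repeat=3):
        p = p_high if high else 1 - p_high
        p *= p_k[high] if k else 1 - p_k[high]
        p *= p_s[high] if s else 1 - p_s[high]
        if event(k, high, s):
            total += p
    return total

pr_s_given_k = PR(lambda k, h, s: s and k) / PR(lambda k, h, s: k)
pr_s_given_k_high = PR(lambda k, h, s: s and k and h) / PR(lambda k, h, s: k and h)
print(pr_s_given_k, pr_s_given_k_high)  # ~0.665 < 0.9
```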

3.4 The sunrise problem

On pages 129 and 130, Skyrms gives an exposition of the sunrise problem: estimating the underlying parameter of a Bernoulli random variable given a vector of past observations (e.g., the probability that the sun will rise tomorrow, given that it has risen every day so far). I'm not sure what the significance of this is; it looks like a standard Bayesian analysis and does not appear to refer to higher-order probabilities.

4 Information theory

4.1 Jaynes implies Jeffrey (and possibly the reverse)

The paper's later pages relate E. T. Jaynes's understanding of inference to Jeffrey conditionalization. Informally, Jaynes gives the following principle: given new evidence, update your distribution in a way that minimizes the information gain.

In his 1957 paper, Jaynes describes the general process of finding the maximum entropy estimate of a distribution given some observations (or statistics computed from the observations). He characterizes (or motivates) it as the "least biased estimate possible on the given information", or "maximally noncommittal with regard to missing information". Let f be a probability density. Given the following constraints:

1. f(x) ≥ 0 (Kolmogorov)
2. ∫ f(x) dx = 1 (Kolmogorov)
3. For i = 1…n, ∫ f(x) r_i(x) dx = α_i (f conforms to n measured statistics; e.g., if x is real-valued and r_1(x) = x, then α_1 = E[x], the mean or expected value)

we wish to maximize the Shannon entropy of f:

h(f) = −∫ f(x) ln(f(x)) dx

Note in passing that over a discrete space, this would be:

h(f) = −Σ_x f(x) ln f(x)

It can be proven that the maximizing f must have the form

f(x) = exp[λ_0 + Σ_{i=1}^{n} λ_i r_i(x)]

where the λ_i are constant Lagrange multipliers. This is the so-called maximum entropy distribution (sometimes "Gibbs distribution"), and is the "proper exponential form" that Skyrms alludes to at the very end of the paper. This distribution arises in statistical mechanics, where it's used to maximize the entropy of statistical systems like gases; Jaynes's insight was to apply it to other domains.

That's the principle applied to the estimation of distributions from scratch, but how do we apply it to the problem of updating an existing distribution? The information gain between two distributions P and Q is measured by the Kullback-Leibler divergence D(P‖Q), a.k.a. relative entropy. In the discrete case:

D(P‖Q) = Σ_i P(i) ln(P(i)/Q(i))

Compare with the discrete case of Shannon entropy above. In the continuous real-valued case:

D(P‖Q) = ∫ p(x) ln(p(x)/q(x)) dx

and in the case where the random variable is part of an arbitrary measure space, and p and q are density functions absolutely continuous w.r.t. an underlying measure λ (I think this is the right hypothesis):

D(P‖Q) = ∫ p(x) ln(p(x)/q(x)) dλ(x)

and when P and Q are absolutely continuous w.r.t. each other, and dP/dQ is the Radon-Nikodym derivative of P with respect to Q:

D(P‖Q) = ∫ ln(dP/dQ) dP

So Jaynes says that if your initial distribution/model/measure is Q and you have new evidence ("statistics"), you should move to the P that conforms to those statistics while minimizing D(P‖Q).
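A discrete sanity check (my own construction, a brute-force grid search rather than a proof) that among distributions with prescribed partition masses, the Jeffrey update minimizes D(P‖Q) relative to the prior Q:

```python
import math

def kl(p, q):
    """Discrete Kullback-Leibler divergence D(P || Q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

q = [0.24, 0.06, 0.14, 0.56]  # prior over four atoms
cells = [(0, 1), (2, 3)]      # a two-cell partition
target = [0.7, 0.3]           # prescribed new masses for the cells

# Jeffrey update: rescale within each cell to hit the target masses.
jeffrey = list(q)
for cell, t in zip(cells, target):
    mass = sum(q[i] for i in cell)
    for i in cell:
        jeffrey[i] = q[i] * t / mass

# Grid search over distributions with the same cell masses.
grid = [i / 100 for i in range(1, 100)]
best = min(kl([x * target[0], (1 - x) * target[0],
               y * target[1], (1 - y) * target[1]], q)
           for x in grid for y in grid)
print(kl(jeffrey, q), best)  # the Jeffrey update attains the grid minimum
```

The two printed values agree (up to floating-point noise), since the Jeffrey point happens to lie on the grid and D(·‖Q) is convex.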

So what Skyrms seems to be proving is that:

1. If you update your density function f_1 to f_2 in a way that minimizes relative entropy while preserving statistics over a set p_1, p_2, …, then f_1 and f_2 must satisfy a generalization of probability kinematics, i.e., constancy of probabilities conditioned on the p_i.
2. In the discrete case, if you update your density function f_1 to f_2 according to probability kinematics in the above sense, then f_2 will be the distribution minimizing relative entropy. (This result may or may not extend to the continuous/uncountable case.)

4.2 But what does this mean?

One of Skyrms's side claims is that the sufficiency condition (inertness of second-order probabilities) corresponds to the information-theoretic notion of sufficient statistic (one that loses no information; for example, when sampling from a normal distribution, the sample mean is a sufficient statistic for the population mean). This seems non-obvious and a significant insight. However, the precise formulation and justification of the correspondence are unclear to me at this time.

Furthermore, Skyrms asserts that counterinstances to the sufficiency condition are counterinstances to both probability kinematics and the minimum relative information principle. So this seems like a very useful criterion; we can try to identify classes of situations, such as the exchangeable sequence and Black Bart examples, where Jaynesian inference doesn't apply. On the other hand, he also acknowledges (pg. 135) that the sufficiency condition and generalized Miller condition are not adequate to recover the maximum entropy estimate (i.e., the exponential formula described above). So the systems in this paper appear to be proper weakenings of Jaynesian inference.

But what does this have to do with higher-order probabilities? The fact that Jeffrey conditionalization and Jaynesian maximum entropy inference are in some sense equivalent is surely important. But does it bolster the case for higher-order probabilities? Only, I think, inasmuch as it shows their applicability to an additional setting (arbitrary measure spaces): since the sufficiency condition can be generalized there, so can Jeffrey conditionalization, and Skyrms's proof shows that it accords with another accepted technique, namely Jaynesian inference. But this does not seem like independent confirmation of the validity of Skyrms's higher-order system. As in the discrete case, the sufficiency condition was tailor-made to agree with Jeffrey conditionalization.

Throughout the paper, Skyrms seems to be saying, "Let's add second-order probabilities to our system, but let's have them do as little as possible. Then they will agree with existing systems that don't have second-order probabilities." Whether you believe this will probably depend on how much you believe his original contention: that we need higher-order probability for underlying epistemological reasons.²

² Skyrms cites Ramsey on this point: Ramsey thinks of personal probabilities as theoretical parts of an imperfect but useful psychological model (pg. 115). But has Skyrms shown that higher-order probabilities are a useful part of the useful model? As far as I can determine, Black Bart is the only time they're actually used, and this is exactly the case the system fails to describe.

5 Sources

Cover, T. and Thomas, J. Elements of Information Theory.
Field, H. "A Note on Jeffrey Conditionalization."
Jaynes, E. T. "Information Theory and Statistical Mechanics."
Skyrms, B. "Higher Order Degrees of Belief."
Wikipedia.
