The problem of disjunctive causal factors. Following Hitchcock: fix K and do everything within a single cell K (which we don't mention).


Example 1: Skidding (the problem of disjunctive causal factors)

Y = car skids; X = car travels at 40 mph.

Eells/Cartwright approach: compare

(*) Pr(Y / X&K) and Pr(Y / ~X&K)

within each cell or context K. Following Hitchcock: fix K and do everything within a single cell K (which we don't mention).

The comparison (*) is unproblematic if X is a binary variable with just two values (X and ~X). But suppose ~X is (materially) equivalent to a disjunction: suppose the other possibilities are X1 = 0 mph and X2 = 50 mph, so ~X ≡ X1 ∨ X2. Suppose we have the following probabilities for a skid:

Pr(Y / X1) = 0
Pr(Y / X) = 0.1
Pr(Y / X2) = 0.2

Then to decide whether Pr(Y / X) > Pr(Y / ~X), we must compute

Pr(Y / ~X) = [Pr(Y / X1) Pr(X1) + Pr(Y / X2) Pr(X2)] / [Pr(X1) + Pr(X2)]

This is a weighted average of 0 and 0.2. Why should the causal relevance of X to Y depend upon Pr(X1) and Pr(X2), the probabilities of these different ways in which X might be false? The comparison to ~X does not seem to be the right comparison.
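To make vivid how the verdict depends on Pr(X1) and Pr(X2), here is a minimal sketch in Python using the handout's skid probabilities; the particular mixes of slow and fast drivers are hypothetical:

```python
def pr_y_given_not_x(pr_x1, pr_x2, pr_y_x1=0.0, pr_y_x2=0.2):
    """Pr(Y/~X): weighted average of Pr(Y/X1) and Pr(Y/X2)."""
    return (pr_y_x1 * pr_x1 + pr_y_x2 * pr_x2) / (pr_x1 + pr_x2)

pr_y_x = 0.1  # Pr(Y/X): skidding at 40 mph

# If most non-40-mph cars are at 0 mph (X1), X raises Pr(skid)...
print(pr_y_x > pr_y_given_not_x(pr_x1=0.8, pr_x2=0.1))  # True
# ...but if most are at 50 mph (X2), the very same X lowers it.
print(pr_y_x > pr_y_given_not_x(pr_x1=0.1, pr_x2=0.8))  # False
```

Same skid probabilities, opposite verdicts: all that changed is the mix of ways X could have been false.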

Example 2: Drug dosage

Y = recovery; X0 = placebo; X1 = moderate dose of curit; X2 = large dose of curit.

Pr(Y / X0) = 0.2
Pr(Y / X1) = 0.4
Pr(Y / X2) = 0.9

Is X1 causally positive for Y? (Again, assume a fixed cell K.)

Eells: only if Pr(Y / X1) > Pr(Y / ~X1) within each cell. But

Pr(Y / ~X1) = (1/α) [Pr(Y / X0) Pr(X0) + Pr(Y / X2) Pr(X2)]
            = (1/α) [(0.2) Pr(X0) + (0.9) Pr(X2)]

where α = Pr(X0) + Pr(X2). The verdict about causal relevance will thus depend upon the proportion of subjects in the trial given the placebo and the proportion given the high dose! That seems crazy.
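A sketch of the same point with the drug numbers (Python; the trial-composition weights are hypothetical):

```python
def pr_y_given_not_x1(pr_x0, pr_x2, pr_y_x0=0.2, pr_y_x2=0.9):
    """Pr(Y/~X1) = (1/alpha)[Pr(Y/X0)Pr(X0) + Pr(Y/X2)Pr(X2)]."""
    alpha = pr_x0 + pr_x2
    return (pr_y_x0 * pr_x0 + pr_y_x2 * pr_x2) / alpha

pr_y_x1 = 0.4  # Pr(Y/X1): recovery on a moderate dose

# Mostly-placebo trial: the moderate dose comes out causally positive...
print(pr_y_x1 > pr_y_given_not_x1(pr_x0=0.9, pr_x2=0.1))  # True
# ...mostly-high-dose trial: the same dose comes out causally negative.
print(pr_y_x1 > pr_y_given_not_x1(pr_x0=0.1, pr_x2=0.9))  # False
```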

Example 3: Acid and base

A = drink acid; B = drink base; Y = death.

Make the causal factor explicitly conjunctive: A&~B. Suppose that we've fixed the causal context K. To compare Pr(Y / A&~B) and Pr(Y / ~(A&~B)), use:

Pr(Y / A&B) = 0.1
Pr(Y / A&~B) = 0.6
Pr(Y / ~A&B) = 0.9
Pr(Y / ~A&~B) = 0.4

~(A&~B) ≡ (A&B) ∨ (~A&B) ∨ (~A&~B)

Once again, we have to look at a weighted average of the probability of Y given these disjuncts. It might well turn out that A&~B is NOT causally positive for death, if Pr(~A&B) is sufficiently high.
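And once more for the conjunctive case, where ~(A&~B) covers three disjuncts (Python; the weights on the disjuncts are hypothetical):

```python
PROBS = {'A&B': 0.1, '~A&B': 0.9, '~A&~B': 0.4}  # Pr(Y / disjunct)

def pr_y_given_negation(w_ab, w_nab, w_nanb):
    """Pr(Y/~(A&~B)): weighted average over the three disjuncts."""
    total = w_ab + w_nab + w_nanb
    return (PROBS['A&B'] * w_ab + PROBS['~A&B'] * w_nab
            + PROBS['~A&~B'] * w_nanb) / total

pr_y_anb = 0.6  # Pr(Y / A&~B)

# If drinking base alone (~A&B) is improbable, A&~B raises Pr(death)...
print(pr_y_anb > pr_y_given_negation(0.4, 0.1, 0.5))  # True
# ...if ~A&B is probable enough, A&~B comes out NOT causally positive.
print(pr_y_anb > pr_y_given_negation(0.1, 0.8, 0.1))  # False
```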

Summary of Solutions

1. Eells: taken care of by contexts

a) Simple case: Different individuals in the population have determinate dispositions to fall into one of the disjunctive cases, should the conjunctive factor not apply.

Ex. 3:
K1 = circus performers (meant to take both, but lost the antidote): K1 & ~(A&~B) ⊃ (A&B)
K2 = suicides (meant to kill themselves; if they had not swallowed acid and no base, they would have done the reverse): K2 & ~(A&~B) ⊃ (~A&B)
K3 = the clumsy chemists (swallowed acid by mistake; if they had not, they would have swallowed neither): K3 & ~(A&~B) ⊃ (~A&~B)

These dispositions are part of the background context and, in conjunction with the negation of the conjunctive factor, locate the individual in different cells.

Upshot: A&~B is causally positive for Y (death) only if this is the case within each cell, by context unanimity. The dispositions are already factored into the cells. So if both K2 & (A&~B) and K2 & ~(A&~B) have positive probability, then we say A&~B is causally mixed for Y: it lowers the probability of death within K2.

Question: How does this sound? (Look at the other examples as well.)

b) General case: Different individuals have probabilistic dispositions to join one of the disjunctive classes should the conjunctive factor not apply. Same strategy: make each such disposition part of a distinct background context (with some mopping up).

2. Humphreys: the neutral state

a) Binary case

If neither X nor ~X is a non-trivial disjunction, then the measure of causal relevance of X to Y is just Pr(Y / X) − Pr(Y / ~X) {within a context}.

Metaphysical interpretation: a change from ~X to X (i.e., the occurrence of X) produces a (deterministic) change in propensity (chance, single-case probability) from Pr(Y / ~X) to Pr(Y / X). If the change is positive, X is a contributing cause; if negative, a counteracting cause.

**On Humphreys' account, there is no big difference between deterministic and probabilistic causation. The change in propensity is a deterministic change to the system. Whether Y happens or not is just chance, and essentially unimportant for figuring out the contribution of X. So the analysis is just the same as for deterministic causation, except that what is being deterministically brought about is a change in propensity, rather than a change in some (observable) property Y.**

There is no further explanation of, or causation of, Y: it either happens or it does not, by chance.

Example: A mouse exposed to carcinogenic chemicals X1 and X2 develops a tumor (Y). Each of X1 and X2 contributes to raise the propensity of Y.
Woodward: there are three possibilities: X1 alone caused the tumor, X2 alone caused the tumor, or both X1 and X2 caused the tumor together.
Humphreys: in fact, only the third possibility is real: both are causes, because both raise the propensity. Then either the mouse develops a tumor or it does not.

b) General case

What is the general form for the nonoccurrence of a causal factor when that factor is not binary but (e.g.) quantitative? (Or, more generally, when the negation is disjunctive.) Given Humphreys' sharp metaphysical picture of probabilistic causation (which, via the analogy to deterministic causation, has some real advantages), Eells' approach to the problem of disjunctive factors will NOT be available. We want to know whether the propensity was raised! This suggests two possibilities:

1) Go to the right cell. Compare Pr(Y / X) with Pr(Y / Ki & ~X) for the right Ki, that is, whichever one applies. This is perhaps fine for token causation; it might even be a kind of solution common to the binary case and the neutral case that Humphreys adopts. It doesn't work for type causation.

2) Neutral state. Compare Pr(Y / X) with the neutral state, since ~X does not make sense.

Examples:

Ex. 1: Skidding. The neutral state is a speed of 0 mph. If Y = skidding and X = 40 mph, then Pr(Y / X) = 0.1 exceeds Pr(Y / 0 mph) = 0, so X is a contributing cause of Y. (Likewise for any speed over 30 mph.)

Ex. 2: Drug dose. The neutral state is the placebo: no drug. If Y = quick recovery and X1 = moderate dose, then Pr(Y / X1) = 0.4 > Pr(Y / X0) = 0.2, so X1 is a contributing cause of Y.

Ex. 3: Drinking acid. The neutral state here is presumably drinking neither chemical. If Y = death and A = drinking acid, then Pr(Y / A) > Pr(Y / neither), so A is a contributing cause. (See pp. 44-5.)

Neutral state

Case i): absolute value of a variable. The neutral state is the level of the variable at which the corresponding property is completely absent. In general it is highly specific to each factor. Examples: temperature (absolute 0), velocity (0), current (0).

Multiple causal factors: temperature and potential difference are both relevant factors for electron emission. The neutral state for temperature is absolute 0; the neutral state for potential difference is zero difference between internal and external potential. In assessing whether any given factor is a contributing or counteracting cause, keep all other factors at the neutral level? Or at any rate fixed.

Case ii): change in value of a variable. When the factor is a change in the value of a variable, the neutral state is just no change.

Arguments for comparison with the neutral state

A. No other candidate is reasonable.

a) Average value: Pr(Y / X) − Pr(Y).
Objection 1: This gives the same verdict as Pr(Y / X) − Pr(Y / ~X) only in the binary case; otherwise the two are not equivalent.
Objection 2: Example 2 above (drug dosage). If each group is 1/3 of the total, you get the result that a moderate dose of curit is a counteracting cause, which seems wrong. The measure should not be sensitive to the proportions of the contrast class in the population. (Note: Eells' solution satisfies this criterion as well.) What's important is that the moderate dose raises the propensity of recovery.
Objection 3: Nothing guarantees that the average value will be causally neutral. The average level of smoking is 1 pack/day, but it's surely not correct to say that smoking half a pack/day is a counteracting cause of lung cancer!
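Objection 2 can be checked directly (Python, with the handout's Example 2 numbers and equal thirds):

```python
pr_y_x0, pr_y_x1, pr_y_x2 = 0.2, 0.4, 0.9  # placebo, moderate, large dose

# With each group 1/3 of the trial, the unconditional Pr(Y):
pr_y = (pr_y_x0 + pr_y_x1 + pr_y_x2) / 3   # = 0.5

# Pr(Y/X1) - Pr(Y) is negative, so the "average value" comparison calls a
# moderate dose a counteracting cause of recovery, even though it doubles
# the placebo recovery rate.
print(round(pr_y_x1 - pr_y, 2))  # -0.1
```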

b) Normal or natural value. Same problem as Objection 3 above: nothing guarantees that the normal or natural value is not loaded.
Ex.: The normal level of radiation is not neutral with respect to cancer.
Ex.: Suppose once a week is the normal level of sexual activity. It does not follow that sexual activity once a month is a counteracting cause of pregnancy. [For Humphreys, the neutral state here would be complete abstinence.]

c) Negation of the value: Pr(Y / X) − Pr(Y / ~X).
Objections:
- highly underdetermined;
- automatically makes the absence of a contributing cause into a counteracting cause (since the average probability of the effect will lie between these two);
- but that is not plausible: syphilis is positive for paresis, but the absence of syphilis is not counteracting (you can't lower a probability below zero).

3. Hitchcock: ternary relation

Binary account: C causes E iff Pr(E / C) > Pr(E / ~C) within each context or cell.

Drug example: on the standard theory, the causal relevance of a moderate dose (C1) to recovery (E) will depend upon Pr(E / ~C1).
- This gives the wrong answer (the moderate dose prevents recovery) if each group is of equal size.
- The causal relevance of C1 should not depend upon the group proportions at all.

Proposal: C1 is not a positive or negative cause of E simpliciter, but only relative to some alternative (here, C0 or C2). In general, C1 is a positive cause relative to C0 but a negative cause relative to C2.

B is a positive cause of A relative to an alternative B′ iff Pr(A / B) > Pr(A / B′); similarly for negative or neutral causes.

Note 1: This is still within a single cell. It can be combined with context unanimity to make it cell-independent.
Note 2: B′ must be an alternative; that is, B and B′ must be mutually exclusive.

Underlying idea: The primary object of interest is not such pairwise relative causal claims, or even a single binary causal claim. Rather, the primary object of interest is the response function f(x) = Pr(E / X = x), where X is the variable (e.g., drug dosage), x is a particular value, and E is the response of interest. This function contains "all that one could need to know"; the question of causation is rejected. Relative causal claims convey information about this response function.
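A minimal sketch of the ternary proposal (Python; the response function uses Example 2's dosage levels as the variable's values):

```python
# Response function f(x) = Pr(E / X = x) for the drug example.
response = {'placebo': 0.2, 'moderate': 0.4, 'large': 0.9}

def positive_relative_to(b, b_prime, f=response):
    """B is a positive cause of E relative to alternative B'
    iff Pr(E/B) > Pr(E/B')."""
    return f[b] > f[b_prime]

# A moderate dose is a positive cause relative to the placebo...
print(positive_relative_to('moderate', 'placebo'))  # True
# ...but a negative cause relative to the large dose.
print(positive_relative_to('moderate', 'large'))    # False
```

Note that no group proportions appear anywhere: the verdicts depend only on the response function and the chosen pair of alternatives.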

Application to two other problems

1. Singular vs. general causation

Sherlock Holmes example: negative type-level causal relevance, but positive token-level relevance.
Proposal: In fact, there are both negative and positive claims to be made at the token and type levels, and they don't really come apart. The argument for separate types of causation disappears.
Question: Does causation itself, of both types, disappear? It dissolves in favour of the probability relations.

2. Cause and contrast

Contrastive stress seems to play an unavoidable role in causal claims, which leads Dretske to regard "Susan's STEALING the bicycle" and "Susan's stealing the BICYCLE" as distinct events, or "event allomorphs" (since one causes her arrest and the other does not). This seems a crazy conclusion: should these not refer to the same event?
Proposal: In fact, there are contrastive causal claims to be made, one of which is true and one false:
"Susan's stealing the bicycle, rather than buying it, caused her arrest." [T]
"Susan's stealing the bicycle, rather than the skis, caused her arrest." [F]
These can be captured with distinct inequalities between conditional probabilities, and there is no need to introduce event allomorphs.

Application to counterfactual analysis

Example: "If Uncle Schlomo had not smoked 2 packs a day, he would not have got lung cancer."
Objection: This depends on the nearest possible world being one in which he does not smoke at all. But the nearest might be one where he smokes 3 packs and still gets lung cancer.
Analysis: There may be a disjunction of closest possible worlds. Rather than arbitrarily determining which is closer, make the comparison between our world and each disjunct, using "rather than" claims.
Lewis's response: take standards of similarity from intuitions about counterfactuals, not the reverse. The closer world is the one where Schlomo does not smoke.
Further objection: this violates Lewis's standards of similarity. [But this further objection does not respect Lewis's point about taking intuitions about counterfactuals as primary.]

Reconciliation with binary intuitions

We have the binary intuitions because:
- in many cases, the causal variable is binary;
- in other cases, differences among alternatives are irrelevant, and they may be collapsed into one (C1 promotes E relative to any other alternative C0, C2, ...);
- context makes one alternative salient.

Question: The real point is not just grammatical. It seems to be that binary causation is not a fundamental part of our ontology; the response function is, since it has all the information we need.
1. Is this correct? (Remember, everything is done within a homogeneous context, so we don't get problems of spurious causation.)
2. Don't we still need binary causation to determine the homogeneous contexts within which the values of the response function are defined?