Logical and Probabilistic Models of Belief Change


1 Logical and Probabilistic Models of Belief Change Eric Pacuit Department of Philosophy University of Maryland, College Park pacuit.org August 7, 2017 Eric Pacuit 1

2 Plan Day 1 Introduction to belief revision, AGM, possible worlds models, Bayesian models (time permitting) Day 2 Possible worlds models (continued), Bayesian models (continued), Updating probabilities Day 3 The value of learning, Lottery Paradox, Preface Paradox, Review Paradox, Iterated belief revision, Context shifts, Becoming aware Day 4 The value of learning, Lottery Paradox, Preface Paradox, Review Paradox, Iterated belief revision, Context shifts, Becoming aware (continued) Day 5 Interactive epistemology (Agreement Theorems), Convergence Theorems Eric Pacuit 2

3 pacuit.org/nls2017/belchange/ Eric Pacuit 3

4 Plan for today Introduction to belief revision AGM Possible worlds model Bayesian models (time permitting) Eric Pacuit 4

5 Belief Change Computer science: updating databases (Doyle 1979 and Fagin et al. 1983) Philosophy (epistemology/philosophy of science): scientific theory change and revisions of probability assignments; belief change (Levi 1977, 1980, Harper 1977) and its rationality. Eric Pacuit 5

6 AGM Carlos Alchourrón, Peter Gärdenfors, and David Makinson. C. Alchourrón, P. Gärdenfors and D. Makinson. On the logic of theory change: Partial meet contraction and revision functions. Journal of Symbolic Logic, 50(2), pp. 510-530, 1985. Eric Pacuit 6

7 Classics I. Levi. Subjunctives, Dispositions and Chances. Synthese 34, 1977. W. Spohn. Ordinal conditional functions: A dynamic theory of epistemic states. In W.L. Harper and B. Skyrms, eds., Causation in Decision, Belief Change and Statistics, vol. 2, pp. 105-134, 1988. W. Harper. Rational Conceptual Change. PSA 1976. Eric Pacuit 7

8 Belief Change Consider the following beliefs of a rational agent: p1 All European swans are white. p2 The bird caught in the trap is a swan. p3 The bird caught in the trap comes from Sweden. p4 Sweden is part of Europe. Thus, the agent believes: q The bird caught in the trap is white. Now suppose the rational agent (for example, you) learns that the bird caught in the trap is black (¬q). There are several logically consistent ways to incorporate ¬q! Eric Pacuit 8

10 Belief Change Consider the following beliefs of a rational agent: p1 All European swans are white. p2 The bird caught in the trap is a swan. p3 The bird caught in the trap comes from Sweden. p4 Sweden is part of Europe. Thus, the agent believes: q The bird caught in the trap is white. Question: How should the agent incorporate ¬q into his belief state to obtain a consistent belief state? There are several logically consistent ways to incorporate ¬q! Eric Pacuit 8

11 Belief Change Consider the following beliefs of a rational agent: p1 All European swans are white. p2 The bird caught in the trap is a swan. p3 The bird caught in the trap comes from Sweden. p4 Sweden is part of Europe. Thus, the agent believes: q The bird caught in the trap is white. Problem: Logical considerations alone are insufficient to answer this question! Why?? There are several logically consistent ways to incorporate ¬q! Eric Pacuit 8

13 Belief Change What extralogical factors serve to determine what beliefs to give up and what beliefs to retain? Eric Pacuit 9

14 Belief Change p1 All European swans are white. p2 The bird caught in the trap is a swan. p3 The bird caught in the trap comes from Sweden. p4 Sweden is part of Europe. Thus, the agent believes: q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

15 Belief Change p1 All European swans are white. p2 The bird caught in the trap is a swan. p3 The bird caught in the trap comes from Sweden. p4 Sweden is part of Europe. Thus, the agent believes: (given up) q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

16 Belief Change (given up) p1 All European swans are white. (given up) p2 The bird caught in the trap is a swan. (given up) p3 The bird caught in the trap comes from Sweden. (given up) p4 Sweden is part of Europe. Thus, the agent believes: (given up) q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

17 Belief Change p1 All European swans are white. (given up) p2 The bird caught in the trap is a swan. (given up) p3 The bird caught in the trap comes from Sweden. (given up) p4 Sweden is part of Europe. Thus, the agent believes: (given up) q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

18 Belief Change (given up) p1 All European swans are white. p2 The bird caught in the trap is a swan. (given up) p3 The bird caught in the trap comes from Sweden. (given up) p4 Sweden is part of Europe. Thus, the agent believes: (given up) q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

19 Belief Change (given up) p1 All European swans are white. p2 The bird caught in the trap is a swan. p3 The bird caught in the trap comes from Sweden. p4 Sweden is part of Europe. Thus, the agent believes: (given up) q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

20 Belief Change p1 All European swans are white. (given up) p2 The bird caught in the trap is a swan. p3 The bird caught in the trap comes from Sweden. p4 Sweden is part of Europe. Thus, the agent believes: (given up) q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

21 Belief Change p1 All European swans are white. p2 The bird caught in the trap is a swan. (given up) p3 The bird caught in the trap comes from Sweden. p4 Sweden is part of Europe. Thus, the agent believes: (given up) q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

22 Belief Change p1 All European swans are white. p2 The bird caught in the trap is a swan. p3 The bird caught in the trap comes from Sweden. (given up) p4 Sweden is part of Europe. Thus, the agent believes: (given up) q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

23 Belief Change p1 All European swans are white. p2 The bird caught in the trap is a swan. p3 The bird caught in the trap comes from Sweden. p4 Sweden is part of Europe. Thus, the agent believes: (given up) q The bird caught in the trap is white. New belief: ¬q The bird caught in the trap is not white. Eric Pacuit 9

24 Belief Change - Guiding Principles 1. When accepting a new piece of information, an agent should aim at a minimal change of his old beliefs. 2. If there are different ways to effect a belief change, the agent should give up those beliefs which are least entrenched. Eric Pacuit 10

25 Textbooks S. O. Hansson. A Textbook of Belief Dynamics. Theory Change and Database Updating. Dordrecht: Kluwer Academic Publishers, 1999. P. Gärdenfors. Knowledge in Flux. Modeling the Dynamics of Epistemic States. The MIT Press, 1988. H. Rott. Change, Choice and Inference: A Study of Belief Revision and Nonmonotonic Reasoning. Oxford University Press, 2001. Eric Pacuit 11

26 Epistemic States Belief sets (Ellis's belief systems) Possible worlds models (Doyle's truth maintenance systems) Spohn's generalized possible worlds model Bayesian models Generalized Bayesian models (Johnson-Laird's mental models)... Eric Pacuit 12

27 Epistemic States Belief sets (Ellis's belief systems) Possible worlds models (Doyle's truth maintenance systems) Spohn's generalized possible worlds model Bayesian models Generalized Bayesian models (Johnson-Laird's mental models)... J. Halpern. Reasoning about Uncertainty. The MIT Press, 2003. Eric Pacuit 12

28 AGM Language of Beliefs in AGM: propositional logic: atomic propositions p, q, r,... connectives: negation (¬), conjunction (∧), disjunction (∨), implication (→), and equivalence (↔). Eric Pacuit 13

29 AGM Language of Beliefs in AGM: propositional logic: atomic propositions p, q, r,... connectives: negation (¬), conjunction (∧), disjunction (∨), implication (→), and equivalence (↔). Rationality constraints: 1. Belief sets should be consistent 2. Belief sets should be closed under logical consequence Eric Pacuit 13

30 Classical Consequence For any set A of sentences, Cn(A) is the set of logical consequences of A. Cn : ℘(L) → ℘(L) satisfying the following three conditions: A ⊆ Cn(A) (inclusion); if A ⊆ B, then Cn(A) ⊆ Cn(B) (monotony); Cn(A) = Cn(Cn(A)) (idempotence). If p can be derived from A by classical propositional logic, then p ∈ Cn(A). Write A ⊢ p when p ∈ Cn(A). Eric Pacuit 14
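
To make the consequence operator concrete, here is a minimal sketch (my own illustration, not from the lecture) of the entailment test A ⊢ p for a finite propositional language, checked by brute force over truth tables. The tuple-based formula encoding ('not', 'and', 'or', 'imp') is an assumption of this sketch.

    # A |- p checked by brute-force truth tables over a finite set of atoms.
    # Formulas: atoms are strings; compound formulas are tuples such as
    # ('imp', 'p', 'q') for p -> q.  (Encoding chosen for this sketch only.)
    from itertools import product

    def atoms(fml):
        if isinstance(fml, str):
            return {fml}
        _, *args = fml
        return set().union(*(atoms(a) for a in args))

    def holds(fml, val):
        if isinstance(fml, str):
            return val[fml]
        op, *args = fml
        if op == 'not':
            return not holds(args[0], val)
        if op == 'and':
            return holds(args[0], val) and holds(args[1], val)
        if op == 'or':
            return holds(args[0], val) or holds(args[1], val)
        if op == 'imp':
            return (not holds(args[0], val)) or holds(args[1], val)
        raise ValueError(op)

    def entails(A, p):
        """A |- p: every valuation making all of A true makes p true."""
        props = sorted(set().union(*(atoms(f) for f in list(A) + [p])))
        for bits in product([True, False], repeat=len(props)):
            val = dict(zip(props, bits))
            if all(holds(f, val) for f in A) and not holds(p, val):
                return False
        return True

    # Example: {p, p -> q} |- q
    assert entails({'p', ('imp', 'p', 'q')}, 'q')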

31 K is a belief set just in case K = Cn(K). Eric Pacuit 15

32 K is a belief set just in case K = Cn(K). Logical omniscience; explicit vs. implicit beliefs; A belief set is not what you actually believe, but what you are committed to believe (Levi 1991). Eric Pacuit 15

33 K is a belief set just in case K = Cn(K). Logical omniscience; explicit vs. implicit beliefs; A belief set is not what you actually believe, but what you are committed to believe (Levi 1991). Belief bases vs. belief sets. B1 = {p, p → q}, B2 = {p, q}. Eric Pacuit 15

34 K is a belief set just in case K = Cn(K). Logical omniscience; explicit vs. implicit beliefs; A belief set is not what you actually believe, but what you are committed to believe (Levi 1991). Belief bases vs. belief sets. B1 = {p, p → q}, B2 = {p, q}. Cn(B1) = Cn(B2). Eric Pacuit 15

35 K is a belief set just in case K = Cn(K). Logical omniscience; explicit vs. implicit beliefs; A belief set is not what you actually believe, but what you are committed to believe (Levi 1991). Belief bases vs. belief sets. B1 = {p, p → q}, B2 = {p, q}. Cn(B1) = Cn(B2). What happens when we receive the evidence that ¬p? Eric Pacuit 15
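
A quick check of this point with the entails() sketch above (reading the incoming evidence as ¬p, as on the slide): the two bases generate the same consequences, yet they behave differently once p is struck from the base.

    # B1 and B2 generate the same belief set ...
    B1 = {'p', ('imp', 'p', 'q')}
    B2 = {'p', 'q'}
    print(entails(B1, 'q'), entails(B2, 'q'))                  # True True
    # ... but removing p from the bases leaves different residues:
    print(entails(B1 - {'p'}, 'q'), entails(B2 - {'p'}, 'q'))  # False True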

36 [Figure: a possible-worlds diagram with four worlds labelled by their valuations of p and q; the picture is repeated, with different worlds highlighted, across slides 36-42.] Eric Pacuit 16

43 Change operator: ∗ : B × L → B. K ∗ ϕ: K is the initial set of beliefs, ϕ the new evidence. Eric Pacuit 17

44 Revision operator: ∗ : B × L → B. K ∗ ϕ: K is the initial set of beliefs, ϕ the new evidence. Eric Pacuit 17

46 Belief change operator: ∗ : B × L → B. K ∗ ϕ: K is the initial set of beliefs, ϕ the new evidence. Eric Pacuit 17

47 Belief Change Minimal change: The criterion of informational economy demands that as few beliefs as possible be given up so that the change is in some sense a minimal change of K to accommodate for A. (Gärdenfors 1988, p. 53). Keep the most entrenched beliefs:...beliefs are only given up when there are no less entrenched candidates... If one of two beliefs must be retracted in order to accommodate some new fact, the less entrenched belief will be relinquished, while the more entrenched persists. (Boutilier 1996). Eric Pacuit 18

48 Belief Change Suppose that K is the agent's current set of beliefs. If you give priority to the new information ϕ, then there are three belief change operations: Eric Pacuit 19

49 Belief Change Suppose that K is the agent's current set of beliefs. If you give priority to the new information ϕ, then there are three belief change operations: 1. Expansion: K + ϕ; ϕ is added to K. Eric Pacuit 19

50 Belief Change Suppose that K is the agent's current set of beliefs. If you give priority to the new information ϕ, then there are three belief change operations: 1. Expansion: K + ϕ; ϕ is added to K. 2. Contraction: K ∸ ϕ; ϕ is removed from K. Eric Pacuit 19

51 Belief Change Suppose that K is the agent's current set of beliefs. If you give priority to the new information ϕ, then there are three belief change operations: 1. Expansion: K + ϕ; ϕ is added to K. 2. Contraction: K ∸ ϕ; ϕ is removed from K. 3. Revision: K ∗ ϕ; ϕ is added and other things are removed. Eric Pacuit 19

52 Expansion Postulates (E1) K + α is deductively closed (E2) α ∈ K + α (E3) K ⊆ K + α (E4) If α ∈ K, then K + α = K (E5) If K ⊆ K′, then K + α ⊆ K′ + α (Minimality) For all belief sets K and all sentences α, K + α is the smallest belief set that satisfies (E1), (E2), and (E3). Eric Pacuit 20

53 Expansion Theorem Let + be a function on belief sets and formulas. Then, + satisfies minimality if and only if K + α = Cn(K ∪ {α}). Eric Pacuit 21

54 [Figure: the four-world diagram over p and q again, repeated across the builds of slides 54-58.] Eric Pacuit 22

59 Contraction Postulates (C1) K ∸ α is deductively closed (C2) K ∸ α ⊆ K (C3) If α ∉ K or ⊢ α, then K ∸ α = K (C4) If ⊬ α, then α ∉ K ∸ α (C5) If ⊢ α ↔ β, then K ∸ α = K ∸ β (C6) K ⊆ Cn((K ∸ α) ∪ {α}) Eric Pacuit 23

60 Definition. An operator is a withdrawal if and only if it satisfies (C1-C5). Eric Pacuit 24

61 Definition. An operator is a withdrawal if and only if it satisfies (C1-C5). The following is a withdrawal: K ∸ α = Cn(∅) if α ∈ K, and K ∸ α = K if α ∉ K. Eric Pacuit 24

62 Definition. An operator is a withdrawal if and only if it satisfies (C1-C5). The following is a withdrawal: K ∸ α = Cn(∅) if α ∈ K, and K ∸ α = K if α ∉ K. Minimal Information Loss (Recovery): K ⊆ Cn((K ∸ α) ∪ {α}) Eric Pacuit 24

63 Let K be a belief set and ϕ a formula. K ⊥ ϕ is the remainder set of K. A ∈ K ⊥ ϕ iff 1. A ⊆ K 2. ϕ ∉ Cn(A) 3. There is no B such that A ⊂ B ⊆ K and ϕ ∉ Cn(B). Eric Pacuit 25
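
The following sketch (my own illustration) computes remainder sets for a finite belief base, using the entails() helper from the earlier sketch. For a logically closed belief set K the remainders are infinite, so this finite-base version is only meant to convey the idea.

    # All maximal subsets of a finite base K that do not entail phi (K ⊥ phi).
    from itertools import combinations

    def remainders(K, phi):
        K = list(K)
        found = []
        # enumerate candidate subsets from largest to smallest
        for size in range(len(K), -1, -1):
            for subset in combinations(K, size):
                S = set(subset)
                if entails(S, phi):
                    continue
                # S is maximal iff it is not inside a larger non-entailing set
                if not any(S < T for T in found):
                    found.append(S)
        return found

    K = {'p', ('imp', 'p', 'q'), 'q'}
    print(remainders(K, 'q'))   # two remainders: {'p'} and {('imp', 'p', 'q')}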

64 K ⊥ α = {K} iff α ∉ Cn(K). K ⊥ α = ∅ iff α ∈ Cn(∅). If K′ ⊆ K and α ∉ Cn(K′), then there is some T such that K′ ⊆ T ∈ K ⊥ α. Eric Pacuit 26

65 A selection function γ for K is a function on K ⊥ α such that: If K ⊥ α ≠ ∅, then γ(K ⊥ α) ⊆ K ⊥ α and γ(K ⊥ α) ≠ ∅. If K ⊥ α = ∅, then γ(K ⊥ α) = {K}. Eric Pacuit 27

66 Let K be a set of formulas. A function ∸ is a partial meet contraction for K if there is a selection function γ for K such that for all formulas α: K ∸ α = ⋂ γ(K ⊥ α) Eric Pacuit 28

67 Let K be a set of formulas. A function ∸ is a partial meet contraction for K if there is a selection function γ for K such that for all formulas α: K ∸ α = ⋂ γ(K ⊥ α). If γ selects exactly one element of K ⊥ α: maxichoice contraction. If γ selects the entire set K ⊥ α: full meet contraction. Eric Pacuit 28
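
Here is a sketch of partial meet contraction over a finite base, parameterised by a selection function γ; maxichoice and full meet are the two extreme choices. Again this is my own finite-base illustration built on remainders() above, not the AGM construction over logically closed sets.

    def partial_meet_contraction(K, phi, gamma):
        rems = remainders(K, phi)
        if not rems:                      # phi cannot be removed: gamma returns {K}
            return set(K)
        return set.intersection(*gamma(rems))

    maxichoice = lambda rems: [rems[0]]   # select exactly one remainder
    full_meet = lambda rems: rems         # select all of them

    K = {'p', ('imp', 'p', 'q'), 'q'}
    print(partial_meet_contraction(K, 'q', full_meet))    # empty: the remainders share nothing
    print(partial_meet_contraction(K, 'q', maxichoice))   # one of the two remainders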

68 [Figure: the four-world diagram over p and q again, repeated across the builds of slides 68-77.] Eric Pacuit 29

78 Levi Identity K ∗ ϕ = (K ∸ ¬ϕ) + ϕ Eric Pacuit 30
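
A sketch of revision via the Levi identity in the finite-base setting above: contract by ¬ϕ, then add ϕ (expansion is plain set union here, with closure left implicit).

    def revise(K, phi, gamma):
        """K * phi = (K contracted by not-phi) + phi, on a finite base."""
        contracted = partial_meet_contraction(K, ('not', phi), gamma)
        return contracted | {phi}

    K = {'p', ('imp', 'p', 'q'), 'q'}
    print(revise(K, ('not', 'q'), full_meet))   # accept not-q after giving up enough of K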

79 Let K be a set of formulas. A function ∸ is a partial meet contraction for K if there is a selection function γ for K such that for all formulas α: K ∸ α = ⋂ γ(K ⊥ α). If γ selects exactly one element of K ⊥ α: maxichoice contraction. If γ selects the entire set K ⊥ α: full meet contraction. Then, K ∗ α = Cn(⋂ γ(K ⊥ ¬α) ∪ {α}) Eric Pacuit 31

80 [Figure: the four-world diagram over p and q again, repeated across the builds of slides 80-84.] Eric Pacuit 32

85 AGM Postulates (AGM1) K ∗ ϕ is deductively closed (AGM2) ϕ ∈ K ∗ ϕ (AGM3) K ∗ ϕ ⊆ Cn(K ∪ {ϕ}) (AGM4) If ¬ϕ ∉ K then K ∗ ϕ = Cn(K ∪ {ϕ}) (AGM5) K ∗ ϕ is inconsistent only if ϕ is inconsistent (AGM6) If ϕ and ψ are logically equivalent then K ∗ ϕ = K ∗ ψ (AGM7) K ∗ (ϕ ∧ ψ) ⊆ Cn(K ∗ ϕ ∪ {ψ}) (AGM8) If ¬ψ ∉ K ∗ ϕ then Cn(K ∗ ϕ ∪ {ψ}) ⊆ K ∗ (ϕ ∧ ψ) Eric Pacuit 33

86 Theorem (AGM 1985). Let K be a belief set and let ∗ be a function on L. Then: The function ∗ is a partial meet revision for K if and only if it satisfies the postulates AGM1 - AGM6. The function ∗ is a transitively relational partial meet revision for K if and only if it satisfies AGM1 - AGM8. Eric Pacuit 34

87 There is a transitive relation ⊑ on K ⊥ α such that γ(K ⊥ α) = {K′ ∈ K ⊥ α | K″ ⊑ K′ for all K″ ∈ K ⊥ α}. K ∸ α = ⋂{K′ ∈ K ⊥ α | K′ is ⊑-maximal} if α ∉ Cn(∅), and K ∸ α = K otherwise. Eric Pacuit 35
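
A sketch of a relational selection function for the finite-base setting: given a transitive "at least as good as" relation on remainders, γ keeps the best ones. The size-comparison relation below is only an example ordering, not anything mandated by AGM.

    def relational_gamma(at_least_as_good):
        """Selection function picking the remainders that are best under the relation."""
        def gamma(rems):
            return [R for R in rems if all(at_least_as_good(R, S) for S in rems)]
        return gamma

    # Example ordering: prefer remainders that retain more formulas.
    gamma = relational_gamma(lambda R, S: len(R) >= len(S))
    K = {'p', ('imp', 'p', 'q'), 'q'}
    print(partial_meet_contraction(K, 'q', gamma))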

88 Evaluating the AGM postulates Eric Pacuit 36

89 Counterexample to AGM 2 (Success) ϕ ∈ K ∗ ϕ Eric Pacuit 37

90 Counterexample to AGM 2 (Success) ϕ ∈ K ∗ ϕ You are walking down a street and see someone holding a sign reading The World will End Tomorrow, but you don't add this to your beliefs. Eric Pacuit 37

91 Counterexample to AGM 2 (Success) ϕ ∈ K ∗ ϕ You are walking down a street and see someone holding a sign reading The World will End Tomorrow, but you don't add this to your beliefs. Is this a counterexample to AGM 2? Eric Pacuit 37

92 Counterexample to AGM 2 (Success) ϕ ∈ K ∗ ϕ You are walking down a street and see someone holding a sign reading The World will End Tomorrow, but you don't add this to your beliefs. Is this a counterexample to AGM 2? No Eric Pacuit 37

93 Counterexample to AGM 2 (Success) ϕ ∈ K ∗ ϕ You are walking down a street and see someone holding a sign reading The World will End Tomorrow, but you don't add this to your beliefs. Is this a counterexample to AGM 2? No Two people, Ann and Bob, are reliable sources of information on whether The Netherlands will win the world cup. They are equally reliable. Eric Pacuit 37

94 Counterexample to AGM 2 (Success) ϕ ∈ K ∗ ϕ You are walking down a street and see someone holding a sign reading The World will End Tomorrow, but you don't add this to your beliefs. Is this a counterexample to AGM 2? No Two people, Ann and Bob, are reliable sources of information on whether The Netherlands will win the world cup. They are equally reliable. AGM assumes that the most recent evidence that you received takes precedence. Ann says yes and a little bit later, Bob says no. Eric Pacuit 37

95 Counterexample to AGM 2 (Success) ϕ ∈ K ∗ ϕ You are walking down a street and see someone holding a sign reading The World will End Tomorrow, but you don't add this to your beliefs. Is this a counterexample to AGM 2? No Two people, Ann and Bob, are reliable sources of information on whether The Netherlands will win the world cup. They are equally reliable. AGM assumes that the most recent evidence that you received takes precedence. Ann says yes and a little bit later, Bob says no. Why should the, possibly arbitrary, order in which you receive the information give more weight to Bob's announcement? Eric Pacuit 37

96 Counterexample to AGM 2 (Success) ϕ ∈ K ∗ ϕ You are walking down a street and see someone holding a sign reading The World will End Tomorrow, but you don't add this to your beliefs. Is this a counterexample to AGM 2? No Two people, Ann and Bob, are reliable sources of information on whether The Netherlands will win the world cup. They are equally reliable. AGM assumes that the most recent evidence that you received takes precedence. Ann says yes and a little bit later, Bob says no. Why should the, possibly arbitrary, order in which you receive the information give more weight to Bob's announcement? Is this a counterexample to AGM 2? Eric Pacuit 37

97 Counterexample to AGM 2 (Success) ϕ ∈ K ∗ ϕ You are walking down a street and see someone holding a sign reading The World will End Tomorrow, but you don't add this to your beliefs. Is this a counterexample to AGM 2? No Two people, Ann and Bob, are reliable sources of information on whether The Netherlands will win the world cup. They are equally reliable. AGM assumes that the most recent evidence that you received takes precedence. Ann says yes and a little bit later, Bob says no. Why should the, possibly arbitrary, order in which you receive the information give more weight to Bob's announcement? Is this a counterexample to AGM 2? No Eric Pacuit 37

98 Rott's Counterexample AGM 7: K ∗ (ϕ ∧ ψ) ⊆ Cn(K ∗ ϕ ∪ {ψ}) AGM 8: if ¬ψ ∉ K ∗ ϕ then Cn(K ∗ ϕ ∪ {ψ}) ⊆ K ∗ (ϕ ∧ ψ) So, if ψ ∈ Cn({ϕ}), then K ∗ ϕ = Cn(K ∗ ϕ ∪ {ψ}) Eric Pacuit 38

99 Rott's Counterexample There is an appointment to be made in a philosophy department. The position is a metaphysics position, and there are three main candidates: Andrew, Becker and Cortez. 1. Andrew is clearly the best metaphysician, but is weak in logic. 2. Becker is a very good metaphysician, also good in logic. 3. Cortez is a brilliant logician, but weak in metaphysics. Scenario 1: Paul is told by the dean that the chosen candidate is either Andrew or Becker. Since Andrew is clearly the better metaphysician of the two, Paul concludes that the winning candidate will be Andrew. A ∈ K ∗ (A ∨ B) Eric Pacuit 38

100 Rott's Counterexample There is an appointment to be made in a philosophy department. The position is a metaphysics position, and there are three main candidates: Andrew, Becker and Cortez. 1. Andrew is clearly the best metaphysician, but is weak in logic. 2. Becker is a very good metaphysician, also good in logic. 3. Cortez is a brilliant logician, but weak in metaphysics. Scenario 1: Paul is told by the dean that the chosen candidate is either Andrew or Becker. Since Andrew is clearly the better metaphysician of the two, Paul concludes that the winning candidate will be Andrew. A, ¬B, ¬C ∈ K ∗ (A ∨ B) Eric Pacuit 38

101 Rott's Counterexample 1. Andrew is clearly the best metaphysician, but is weak in logic. 2. Becker is a very good metaphysician, also good in logic. 3. Cortez is a brilliant logician, but weak in metaphysics. Scenario 2: Paul is told by the dean that the chosen candidate is either Andrew, Becker or Cortez. Knowing that Cortez is a splendid logician, but that he can hardly be called a metaphysician, Paul comes to realize that his background assumption that expertise in the field advertised is the decisive criterion for the appointment cannot be upheld. Apparently, competence in logic is regarded as a considerable asset by the selection committee. Paul concludes Becker will be hired. ¬A, B, ¬C ∈ K ∗ (A ∨ B ∨ C) Eric Pacuit 38

103 Rott's Counterexample Data: A, ¬B, ¬C ∈ K ∗ (A ∨ B) and ¬A, B, ¬C ∈ K ∗ (A ∨ B ∨ C) Theory: (AGM7) K ∗ (ϕ ∧ ψ) ⊆ Cn(K ∗ ϕ ∪ {ψ}) (AGM8) If ¬ψ ∉ K ∗ ϕ then Cn(K ∗ ϕ ∪ {ψ}) ⊆ K ∗ (ϕ ∧ ψ) So, if ¬ψ ∉ K ∗ ϕ, then K ∗ ϕ ⊆ K ∗ (ϕ ∧ ψ) Eric Pacuit 38

104 Rott's Counterexample Problem: ¬A, B ∈ K ∗ (A ∨ B ∨ C). ¬(A ∨ B) ∉ K ∗ (A ∨ B ∨ C). K ∗ (A ∨ B ∨ C) ⊆ K ∗ ((A ∨ B ∨ C) ∧ (A ∨ B)) = K ∗ (A ∨ B). But A, ¬B ∈ K ∗ (A ∨ B). Eric Pacuit 38

105 ...Rott seems to take the point about meta-information to explain why the example conflicts with the theoretical principles, Eric Pacuit 39

106 ...Rott seems to take the point about meta-information to explain why the example conflicts with the theoretical principles, whereas I want to conclude that it shows why the example does not conflict with the theoretical principles, since I take the relevance of the meta-information to show that the conditions for applying the principles in question are not met by the example. Eric Pacuit 39

107 ...Rott seems to take the point about meta-information to explain why the example conflicts with the theoretical principles, whereas I want to conclude that it shows why the example does not conflict with the theoretical principles, since I take the relevance of the meta-information to show that the conditions for applying the principles in question are not met by the example... I think proper attention to the relation between concrete examples and the abstract models will allow us to reconcile some of the beautiful properties [of the abstract theory of belief revision] with the complexity of concrete reasoning. (Stalnaker, p. 204) Robert Stalnaker. Iterated Belief Revision. Erkenntnis 70, pp. 189-209, 2009. Eric Pacuit 39

108 Counterexamples to Recovery (C6) K ⊆ Cn((K ∸ α) ∪ {α}) While reading a book about Cleopatra I learned that she had both a son and a daughter. I therefore believe both that Cleopatra had a son (s) and Cleopatra had a daughter (d). Eric Pacuit 40

109 Counterexamples to Recovery (C6) K ⊆ Cn((K ∸ α) ∪ {α}) While reading a book about Cleopatra I learned that she had both a son and a daughter. I therefore believe both that Cleopatra had a son (s) and Cleopatra had a daughter (d). Later I learn from a well-informed friend that the book in question is just a historical novel. I accordingly contract my belief that Cleopatra had a child (s ∨ d). Eric Pacuit 40

110 Counterexamples to Recovery (C6) K ⊆ Cn((K ∸ α) ∪ {α}) While reading a book about Cleopatra I learned that she had both a son and a daughter. I therefore believe both that Cleopatra had a son (s) and Cleopatra had a daughter (d). Later I learn from a well-informed friend that the book in question is just a historical novel. I accordingly contract my belief that Cleopatra had a child (s ∨ d). However, shortly thereafter I learn from a reliable source that in fact Cleopatra had a child. I thereby reintroduce s ∨ d to my collection of beliefs without also returning either s or d. S. Hansson. A Textbook of Belief Dynamics, Theory Change and Database Updating. Kluwer Academic Publishers, 1999. Eric Pacuit 40

111 Evaluating counterexamples... information about how I learn some of the things I learn, about the sources of my information, or about what I believe about what I believe and don't believe. If the story we tell in an example makes certain information about any of these things relevant, then it needs to be included in a proper model of the story, if it is to play the right role in the evaluation of the abstract principles of the model. Robert Stalnaker. Iterated Belief Revision. Erkenntnis 70, pp. 189-209, 2009. Eric Pacuit 41

112 Belief Revision: The Semantic View A. Grove. Two modelings for theory change. Journal of Philosophical Logic, 17, pp. 157-170, 1988. EP. Dynamic Epistemic Logic II: Logics of information change. Philosophy Compass, Vol. 8, Iss. 9, 2013. Eric Pacuit 42

113 [Figure: nested regions of states around a distinguished state w.] The set of states, with a distinguished state denoting the actual world. The agent's (hard) information (i.e., the states consistent with what the agent knows). Eric Pacuit 43

115 The agent's (hard) information (i.e., the states consistent with what the agent knows). The agent's beliefs (soft information: the states consistent with what the agent believes). Eric Pacuit 43

116 The states consistent with what the agent knows, with a distinguished state (the actual world). Each state is associated with a propositional valuation for the underlying propositional language. Eric Pacuit 44

117 The agent's beliefs (soft information: the states consistent with what the agent believes). The agent's contingency plan: when the stronger beliefs fail, go with the weaker ones. Eric Pacuit 44

120 Sphere Models Let W be a set of states. A set F ⊆ ℘(W) is called a system of spheres provided: For each S, S′ ∈ F, either S ⊆ S′ or S′ ⊆ S. For any P ⊆ W there is a smallest S ∈ F (according to the subset relation) such that S ∩ P ≠ ∅. The spheres are non-empty, ∅ ∉ F, and cover the entire information cell: ⋃F = W. Eric Pacuit 45

121 Let F be a system of spheres on W: for w, v ∈ W, let w ≤_F v iff for all S ∈ F, if v ∈ S then w ∈ S. Then ≤_F is reflexive, transitive, and well-founded. w ≤_F v means that no matter what the agent learns in the future, as long as world v is still consistent with his beliefs and w is still epistemically possible, then w is also consistent with his beliefs. Eric Pacuit 46
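
A small sketch of this derived ordering (illustration only): given a nested family of spheres, w ≤_F v holds exactly when every sphere containing v also contains w.

    # Derive the plausibility order <=_F from a system of spheres.
    def leq_from_spheres(spheres, w, v):
        return all(w in S for S in spheres if v in S)

    W = {1, 2, 3, 4}
    spheres = [{1}, {1, 2}, {1, 2, 3, 4}]              # nested and covering W
    assert leq_from_spheres(spheres, 1, 3)             # 1 is at least as plausible as 3
    assert not leq_from_spheres(spheres, 3, 1)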

122 Belief Revision via Plausibility W = {w₁, w₂, w₃}. w₁ ⪯ w₂ and w₂ ⪯ w₁ (w₁ and w₂ are equi-plausible). w₁ ≺ w₃ (w₁ ⪯ w₃ and not w₃ ⪯ w₁). w₂ ≺ w₃ (w₂ ⪯ w₃ and not w₃ ⪯ w₂). Min_⪯([w_i]) = {w₁, w₂}. Eric Pacuit 47

125 Belief Revision via Plausibility [Figure: a plausibility order over regions of worlds, with the ϕ-worlds marked.] Belief: Bϕ iff Min_⪯(W) ⊆ [[ϕ]]_M. Eric Pacuit 47

126 Belief Revision via Plausibility [Figure: the plausibility order over regions A-E of worlds, with the ϕ-worlds marked.] Conditional Belief: B^ϕ ψ iff Min_⪯([[ϕ]]_M) ⊆ [[ψ]]_M. Conservative Upgrade: information from a trusted source (↑ϕ): A ≺_i C ≺_i D ≺_i B ≺_i E. Eric Pacuit 47

128 Plausibility Models Epistemic Models: M = ⟨W, {∼_i}_{i∈A}, V⟩. Truth: M, w ⊨ ϕ is defined as follows: M, w ⊨ p iff w ∈ V(p) (with p ∈ At); M, w ⊨ ¬ϕ iff M, w ⊭ ϕ; M, w ⊨ ϕ ∧ ψ iff M, w ⊨ ϕ and M, w ⊨ ψ; M, w ⊨ K_i ϕ iff for each v ∈ W, if w ∼_i v, then M, v ⊨ ϕ. Eric Pacuit 48

129 Plausibility Models Epistemic-Plausibility Models: M = ⟨W, {∼_i}_{i∈A}, {⪯_i}_{i∈A}, V⟩. Truth: M, w ⊨ ϕ is defined as before (the p, ¬, ∧ and K_i clauses are unchanged). Eric Pacuit 48

130 Plausibility Models Epistemic-Plausibility Models: M = ⟨W, {∼_i}_{i∈A}, {⪯_i}_{i∈A}, V⟩. Plausibility Relation: ⪯_i ⊆ W × W, where w ⪯_i v means w is at least as plausible as v. Assumptions: 1. ⪯_i is reflexive and transitive (and well-founded); 2. plausibility implies possibility: if w ⪯_i v then w ∼_i v; 3. locally connected: if w ∼_i v then either w ⪯_i v or v ⪯_i w. Eric Pacuit 48

131 Plausibility Models Epistemic-Plausibility Models: M = ⟨W, {∼_i}_{i∈A}, {⪯_i}_{i∈A}, V⟩. Truth: the clauses above, plus: M, w ⊨ B_i ϕ iff for each v ∈ Min_{⪯_i}([w]_i), M, v ⊨ ϕ, where [w]_i = {v | w ∼_i v} is the agent's information cell. Eric Pacuit 48
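
Here is a small sketch (all names and numbers are my own toy example) of these clauses on a one-cell model: knowledge quantifies over the whole information cell, belief only over its most plausible worlds.

    # A toy information cell with a plausibility order: world 1 is most plausible.
    V = {1: {'p'}, 2: {'p', 'q'}, 3: set()}       # atoms true at each world
    cell = {1, 2, 3}                              # the agent's information cell [w]_i
    leq = lambda w, v: w <= v                     # 1 at least as plausible as 2, 2 as 3

    def knows(prop):
        """K_i: true at every world of the information cell."""
        return cell <= prop

    def believes(prop):
        """B_i: true at every most plausible world of the cell."""
        best = {w for w in cell if all(leq(w, v) for v in cell)}
        return best <= prop

    p_worlds = {w for w in cell if 'p' in V[w]}
    print(knows(p_worlds), believes(p_worlds))    # False True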

132 Example [Figure: a two-agent plausibility model with four worlds w₁-w₄ valued over H₁/T₁ and H₂/T₂ (two coin tosses) and plausibility arrows for agents a and b; it is used to evaluate beliefs and conditional beliefs at w₁, such as B^{T₁}_a H₂ and B^{T₁}_b T₂.] Eric Pacuit 49

136 Grades of Doxastic Strength [Figure: worlds v₀, v₁, w, v₂ ordered by plausibility.] Suppose that w is the current state. Knowledge (KP), Belief (BP), Safe Belief ([⪯]P), Strong Belief (BˢP). Eric Pacuit 50

137 Grades of Doxastic Strength [Figure: the same order.] Suppose that w is the current state. Knowledge (KP), Belief (BP), Robust Belief ([⪯]P), Strong Belief (BˢP). Eric Pacuit 50

139 Grades of Doxastic Strength [Figure: the same order, now marking at which worlds P holds.] Suppose that w is the current state. Belief (BP), Robust Belief ([⪯]P), Strong Belief (BˢP), Knowledge (KP). Eric Pacuit 50

142 Is Bϕ → B^ψ ϕ valid? Eric Pacuit 51

143 Is Bϕ → B^ψ ϕ valid? Is B^α ϕ → B^{α ∧ β} ϕ valid? Eric Pacuit 51

144 Is Bϕ → B^ψ ϕ valid? Is B^α ϕ → B^{α ∧ β} ϕ valid? Is Bϕ → (B^ψ ϕ ∨ B^{¬ψ} ϕ) valid? Eric Pacuit 51

145 Is Bϕ → B^ψ ϕ valid? Is B^α ϕ → B^{α ∧ β} ϕ valid? Is Bϕ → (B^ψ ϕ ∨ B^{¬ψ} ϕ) valid? Exercise: Prove that B, B^ϕ and Bˢ are definable in the language with K and [⪯] modalities. Eric Pacuit 51

146 M, w ⊨ B^ϕ ψ iff for each v ∈ Min_⪯([w] ∩ [[ϕ]]), M, v ⊨ ψ, where [[ϕ]] = {w | M, w ⊨ ϕ} and [w] = {v | w ∼ v}. Eric Pacuit 52

147 M, w ⊨ B^ϕ ψ iff for each v ∈ Min_⪯([w] ∩ [[ϕ]]), M, v ⊨ ψ, where [[ϕ]] = {w | M, w ⊨ ϕ} and [w] = {v | w ∼ v}. Core Logical Principles: 1. B^ϕ ϕ 2. B^ϕ ψ → B^ϕ (ψ ∨ χ) 3. (B^ϕ ψ₁ ∧ B^ϕ ψ₂) → B^ϕ (ψ₁ ∧ ψ₂) 4. (B^{ϕ₁} ψ ∧ B^{ϕ₂} ψ) → B^{ϕ₁ ∨ ϕ₂} ψ 5. (B^ϕ ψ ∧ B^ψ ϕ) → (B^ϕ χ → B^ψ χ) J. Burgess. Quick completeness proofs for some logics of conditionals. Notre Dame Journal of Formal Logic 22, pp. 76-84, 1981. Eric Pacuit 52
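
A sketch of the conditional-belief clause on a finite order (illustration only): take the most plausible ϕ-worlds of the information cell and check that they are all ψ-worlds.

    def min_plaus(worlds, leq):
        """Most plausible worlds of a set under a (connected) preorder leq."""
        return {w for w in worlds if all(leq(w, v) for v in worlds)}

    def cond_belief(cell, leq, phi_worlds, psi_worlds):
        """B^phi psi: the best phi-worlds in the cell are all psi-worlds."""
        return min_plaus(cell & phi_worlds, leq) <= psi_worlds

    # w1 and w2 equi-plausible, both more plausible than w3 (as in the earlier example)
    order = {(1, 1), (2, 2), (3, 3), (1, 2), (2, 1), (1, 3), (2, 3)}
    leq = lambda w, v: (w, v) in order
    print(cond_belief({1, 2, 3}, leq, phi_worlds={3}, psi_worlds={3}))   # True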

148 Types of Beliefs: Logical Characterizations M, w ⊨ K_i ϕ iff M, w ⊨ B^ψ_i ϕ for all ψ. i knows ϕ iff i continues to believe ϕ given any new information. Eric Pacuit 53

149 Types of Beliefs: Logical Characterizations M, w ⊨ K_i ϕ iff M, w ⊨ B^ψ_i ϕ for all ψ. i knows ϕ iff i continues to believe ϕ given any new information. M, w ⊨ [⪯_i]ϕ iff M, w ⊨ B^ψ_i ϕ for all ψ with M, w ⊨ ψ. i robustly believes ϕ iff i continues to believe ϕ given any true formula. Eric Pacuit 53

150 Types of Beliefs: Logical Characterizations M, w ⊨ K_i ϕ iff M, w ⊨ B^ψ_i ϕ for all ψ. i knows ϕ iff i continues to believe ϕ given any new information. M, w ⊨ [⪯_i]ϕ iff M, w ⊨ B^ψ_i ϕ for all ψ with M, w ⊨ ψ. i robustly believes ϕ iff i continues to believe ϕ given any true formula. M, w ⊨ Bˢ_i ϕ iff M, w ⊨ B_i ϕ and M, w ⊨ B^ψ_i ϕ for all ψ with M, w ⊭ K_i(ψ → ¬ϕ). i strongly believes ϕ iff i believes ϕ and continues to believe ϕ given any evidence (truthful or not) that is not known to contradict ϕ. Eric Pacuit 53

151 Dynamic Epistemic Logic The key idea of dynamic epistemic logic is that we can represent changes in agents' epistemic states by transforming models. Eric Pacuit 54

152 Dynamic Epistemic Logic The key idea of dynamic epistemic logic is that we can represent changes in agents' epistemic states by transforming models. In the simplest case, we model an agent's acquisition of knowledge by the elimination of possibilities from an initial epistemic model. Eric Pacuit 54

153 Finding out that ϕ: M = ⟨W, {∼_i}_{i∈A}, {⪯_i}_{i∈A}, V⟩ ⟹ find out that ϕ ⟹ M′ = ⟨W′, {∼′_i}_{i∈A}, {⪯′_i}_{i∈A}, V′⟩ with W′ ⊆ W. Eric Pacuit 55

154 Example: College Park and Amsterdam Recall the College Park agent who doesn't know whether it's raining in Amsterdam, whose epistemic state is represented by the model: [Figure: two worlds, w₁ where r is true and w₂ where r is false, that agent b cannot distinguish.] Eric Pacuit 56

155 Example: College Park and Amsterdam Recall the College Park agent who doesn't know whether it's raining in Amsterdam, whose epistemic state is represented by the model: [Figure: the same two-world model.] What happens when the Amsterdam agent calls the College Park agent on the phone and says, It's raining in Amsterdam? Eric Pacuit 56

156 Example: College Park and Amsterdam Recall the College Park agent who doesn't know whether it's raining in Amsterdam, whose epistemic state is represented by the model: [Figure: the same two-world model.] What happens when the Amsterdam agent calls the College Park agent on the phone and says, It's raining in Amsterdam? We model the change in b's epistemic state by eliminating all epistemic possibilities in which it's not raining in Amsterdam. Eric Pacuit 56

157 Example: College Park and Amsterdam Recall the College Park agent who doesn't know whether it's raining in Amsterdam, whose epistemic state is represented by the model: [Figure: only the world w₁, where r is true, remains.] What happens when the Amsterdam agent calls the College Park agent on the phone and says, It's raining in Amsterdam? We model the change in b's epistemic state by eliminating all epistemic possibilities in which it's not raining in Amsterdam. Eric Pacuit 57

158 Model Update We can easily give a formal definition that captures the idea of knowledge acquisition as the elimination of possibilities. Eric Pacuit 58

159 Model Update We can easily give a formal definition that captures the idea of knowledge acquisition as the elimination of possibilities. Given M = ⟨W, {R_a | a ∈ Agt}, V⟩, the updated model M|ϕ is obtained by deleting from M all worlds in which ϕ was false. Eric Pacuit 58

160 Model Update We can easily give a formal definition that captures the idea of knowledge acquisition as the elimination of possibilities. Given M = ⟨W, {R_a | a ∈ Agt}, V⟩, the updated model M|ϕ is obtained by deleting from M all worlds in which ϕ was false. Formally, M|ϕ = ⟨W|ϕ, {R_a|ϕ | a ∈ Agt}, V|ϕ⟩ is the model s.th.: W|ϕ = {v ∈ W | M, v ⊨ ϕ}; Eric Pacuit 58

161 Model Update We can easily give a formal definition that captures the idea of knowledge acquisition as the elimination of possibilities. Given M = ⟨W, {R_a | a ∈ Agt}, V⟩, the updated model M|ϕ is obtained by deleting from M all worlds in which ϕ was false. Formally, M|ϕ = ⟨W|ϕ, {R_a|ϕ | a ∈ Agt}, V|ϕ⟩ is the model s.th.: W|ϕ = {v ∈ W | M, v ⊨ ϕ}; R_a|ϕ is the restriction of R_a to W|ϕ; Eric Pacuit 58

162 Model Update We can easily give a formal definition that captures the idea of knowledge acquisition as the elimination of possibilities. Given M = ⟨W, {R_a | a ∈ Agt}, V⟩, the updated model M|ϕ is obtained by deleting from M all worlds in which ϕ was false. Formally, M|ϕ = ⟨W|ϕ, {R_a|ϕ | a ∈ Agt}, V|ϕ⟩ is the model s.th.: W|ϕ = {v ∈ W | M, v ⊨ ϕ}; R_a|ϕ is the restriction of R_a to W|ϕ; V|ϕ(p) is the intersection of V(p) and W|ϕ. Eric Pacuit 58

163 Model Update We can easily give a formal definition that captures the idea of knowledge acquisition as the elimination of possibilities. Given M = ⟨W, {R_a | a ∈ Agt}, V⟩, the updated model M|ϕ is obtained by deleting from M all worlds in which ϕ was false. Formally, M|ϕ = ⟨W|ϕ, {R_a|ϕ | a ∈ Agt}, V|ϕ⟩ is the model s.th.: W|ϕ = {v ∈ W | M, v ⊨ ϕ}; R_a|ϕ is the restriction of R_a to W|ϕ; V|ϕ(p) is the intersection of V(p) and W|ϕ. In the single-agent case, this models the agent learning ϕ. In the multi-agent case, this models all agents publicly learning ϕ. Eric Pacuit 58
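
A sketch of this update operation on a finite model (the triple-of-dicts representation is my own choice): restrict the worlds, the accessibility relations, and the valuation to the ϕ-worlds.

    # Public announcement / update: eliminate the worlds where phi is false.
    def update(model, phi_worlds):
        W, R, V = model
        Wp = W & phi_worlds
        Rp = {a: {(w, v) for (w, v) in R[a] if w in Wp and v in Wp} for a in R}
        Vp = {p: V[p] & Wp for p in V}
        return (Wp, Rp, Vp)

    # The College Park example: world 1 is the raining world, b cannot tell 1 from 2.
    W = {1, 2}
    R = {'b': {(1, 1), (1, 2), (2, 1), (2, 2)}}
    V = {'r': {1}}
    print(update((W, R, V), V['r']))   # only world 1 survives; b now knows r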

164 Public Announcement Logic One of the big ideas of dynamic epistemic logic is to add to our formal language operators that can describe the kinds of model updates that we just saw for the College Park and Amsterdam example. Eric Pacuit 59

165 Public Announcement Logic One of the big ideas of dynamic epistemic logic is to add to our formal language operators that can describe the kinds of model updates that we just saw for the College Park and Amsterdam example. The language of Public Announcement Logic (PAL) is given by: ϕ ::= p | ¬ϕ | (ϕ ∧ ϕ) | K_a ϕ | [!ϕ]ϕ Eric Pacuit 59

166 Public Announcement Logic One of the big ideas of dynamic epistemic logic is to add to our formal language operators that can describe the kinds of model updates that we just saw for the College Park and Amsterdam example. The language of Public Announcement Logic (PAL) is given by: ϕ ::= p | ¬ϕ | (ϕ ∧ ϕ) | K_a ϕ | [!ϕ]ϕ Read [!ϕ]ψ as after (every) true announcement of ϕ, ψ. Eric Pacuit 59

167 Public Announcement Logic One of the big ideas of dynamic epistemic logic is to add to our formal language operators that can describe the kinds of model updates that we just saw for the College Park and Amsterdam example. The language of Public Announcement Logic (PAL) is given by: ϕ ::= p | ¬ϕ | (ϕ ∧ ϕ) | K_a ϕ | [!ϕ]ϕ Read [!ϕ]ψ as after (every) true announcement of ϕ, ψ. Read ⟨!ϕ⟩ψ := ¬[!ϕ]¬ψ as after a true announcement of ϕ, ψ. Eric Pacuit 59

168 Public Announcement Logic The truth clause for the dynamic operator [!ϕ] is: M, w ⊨ [!ϕ]ψ iff M, w ⊨ ϕ implies M|ϕ, w ⊨ ψ. So if ϕ is false, [!ϕ]ψ is vacuously true. Here is the ⟨!ϕ⟩ clause: M, w ⊨ ⟨!ϕ⟩ψ iff M, w ⊨ ϕ and M|ϕ, w ⊨ ψ. Main Idea: we evaluate [!ϕ]ψ and ⟨!ϕ⟩ψ not by looking at other worlds in the same model, but rather by looking at a new model. Eric Pacuit 60

169 Public Announcement Logic Suppose M = ⟨W, {∼_i}_{i∈A}, {⪯_i}_{i∈A}, V⟩ is a multi-agent Kripke model. M, w ⊨ [!ψ]ϕ iff M, w ⊨ ψ implies M^ψ, w ⊨ ϕ, where M^ψ = ⟨W′, {∼′_i}_{i∈A}, {⪯′_i}_{i∈A}, V′⟩ with W′ = W ∩ {w | M, w ⊨ ψ}; for each i, ∼′_i = ∼_i ∩ (W′ × W′); for each i, ⪯′_i = ⪯_i ∩ (W′ × W′); for all p ∈ At, V′(p) = V(p) ∩ W′. Eric Pacuit 61

170 Public Announcement Logic [!ψ]p ↔ (ψ → p) Eric Pacuit 62

171 Public Announcement Logic [!ψ]p ↔ (ψ → p) [!ψ]¬ϕ ↔ (ψ → ¬[!ψ]ϕ) Eric Pacuit 62

172 Public Announcement Logic [!ψ]p ↔ (ψ → p) [!ψ]¬ϕ ↔ (ψ → ¬[!ψ]ϕ) [!ψ](ϕ ∧ χ) ↔ ([!ψ]ϕ ∧ [!ψ]χ) Eric Pacuit 62

173 Public Announcement Logic [!ψ]p ↔ (ψ → p) [!ψ]¬ϕ ↔ (ψ → ¬[!ψ]ϕ) [!ψ](ϕ ∧ χ) ↔ ([!ψ]ϕ ∧ [!ψ]χ) [!ψ][!ϕ]χ ↔ [!(ψ ∧ [!ψ]ϕ)]χ Eric Pacuit 62

174 Public Announcement Logic [!ψ]p ↔ (ψ → p) [!ψ]¬ϕ ↔ (ψ → ¬[!ψ]ϕ) [!ψ](ϕ ∧ χ) ↔ ([!ψ]ϕ ∧ [!ψ]χ) [!ψ][!ϕ]χ ↔ [!(ψ ∧ [!ψ]ϕ)]χ [!ψ]K_i ϕ ↔ (ψ → K_i(ψ → [!ψ]ϕ)) Eric Pacuit 62

175 Public Announcement Logic [!ψ]p ↔ (ψ → p) [!ψ]¬ϕ ↔ (ψ → ¬[!ψ]ϕ) [!ψ](ϕ ∧ χ) ↔ ([!ψ]ϕ ∧ [!ψ]χ) [!ψ][!ϕ]χ ↔ [!(ψ ∧ [!ψ]ϕ)]χ [!ψ]K_i ϕ ↔ (ψ → K_i(ψ → [!ψ]ϕ)) Theorem Every formula of Public Announcement Logic is equivalent to a formula of Epistemic Logic. Eric Pacuit 62

176 Public Announcement vs. Conditional Belief Are [!ϕ]Bψ and B^ϕ ψ different? Eric Pacuit 63

177 Public Announcement vs. Conditional Belief Are [!ϕ]Bψ and B^ϕ ψ different? Yes! Eric Pacuit 63

178 Public Announcement vs. Conditional Belief Are [!ϕ]Bψ and B^ϕ ψ different? Yes! [Figure: a two-agent model with three worlds w₁, w₂, w₃ valued over p and q; evaluating formulas of the forms B₁B₂q, B^p₁ B₂q and [!p]B₁B₂q at w₁ shows that conditional belief and belief after announcement can come apart.] Eric Pacuit 63

182 Public Announcement vs. Conditional Belief Are [!ϕ]Bψ and B^ϕ ψ different? Yes! [Figure: the same three-world model.] More generally, B^p_i(p ∧ ¬K_i p) is satisfiable but [!p]B_i(p ∧ ¬K_i p) is not. Eric Pacuit 63

183 The Logic of Public Observation [!ϕ]Kψ ↔ (ϕ → K(ϕ → [!ϕ]ψ)) Eric Pacuit 64

184 The Logic of Public Observation [!ϕ]Kψ ↔ (ϕ → K(ϕ → [!ϕ]ψ)) [!ϕ][⪯]ψ ↔ (ϕ → [⪯](ϕ → [!ϕ]ψ)) Eric Pacuit 64

185 The Logic of Public Observation [!ϕ]Kψ ↔ (ϕ → K(ϕ → [!ϕ]ψ)) [!ϕ][⪯]ψ ↔ (ϕ → [⪯](ϕ → [!ϕ]ψ)) Belief: [!ϕ]Bψ ↔ (ϕ → B(ϕ → [!ϕ]ψ)) Eric Pacuit 64

186 The Logic of Public Observation [!ϕ]Kψ ↔ (ϕ → K(ϕ → [!ϕ]ψ)) [!ϕ][⪯]ψ ↔ (ϕ → [⪯](ϕ → [!ϕ]ψ)) Belief: [!ϕ]Bψ ↔ (ϕ → B(ϕ → [!ϕ]ψ)) [!ϕ]Bψ ↔ (ϕ → B^ϕ [!ϕ]ψ) Eric Pacuit 64

187 The Logic of Public Observation [!ϕ]Kψ ↔ (ϕ → K(ϕ → [!ϕ]ψ)) [!ϕ][⪯]ψ ↔ (ϕ → [⪯](ϕ → [!ϕ]ψ)) Belief: [!ϕ]Bψ ↔ (ϕ → B(ϕ → [!ϕ]ψ)) [!ϕ]Bψ ↔ (ϕ → B^ϕ [!ϕ]ψ) [!ϕ]B^α ψ ↔ (ϕ → B^{ϕ ∧ [!ϕ]α} [!ϕ]ψ) Eric Pacuit 64

188 Belief Revision via Plausibility [Figure: a plausibility order over regions A-E of worlds, with the ϕ-worlds marked.] Incorporate the new information ϕ (!ϕ): A ≺_i B. Conservative Upgrade: information from a trusted source (↑ϕ): A ≺_i C ≺_i D ≺_i B ≺_i E. Radical Upgrade: information from a strongly trusted source (⇑ϕ): A ≺_i B ≺_i C ≺_i D ≺_i E. Eric Pacuit 65

191 Belief Revision via Plausibility [Figure: the same order.] Public Announcement: information from an infallible source (!ϕ): A ≺_i B. Conservative Upgrade: information from a trusted source (↑ϕ): A ≺_i C ≺_i D ≺_i B ≺_i E. Radical Upgrade: information from a strongly trusted source (⇑ϕ): A ≺_i B ≺_i C ≺_i D ≺_i E. Eric Pacuit 65
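
A sketch of the two soft upgrades on a finite ranking (lower rank = more plausible). The initial ranking below is a hypothetical one chosen so that the outputs match the orderings listed on the slide, with ϕ true exactly at a and b.

    # Conservative vs. radical upgrade on a plausibility ranking.
    def radical_upgrade(rank, phi_worlds):
        """All phi-worlds become better than all non-phi-worlds; order inside each zone kept."""
        offset = max(rank.values()) + 1
        return {w: r if w in phi_worlds else r + offset for w, r in rank.items()}

    def conservative_upgrade(rank, phi_worlds):
        """Only the best phi-worlds move to the top; everything else stays in place."""
        best = min(rank[w] for w in phi_worlds)
        best_phi = {w for w in phi_worlds if rank[w] == best}
        return {w: 0 if w in best_phi else r + 1 for w, r in rank.items()}

    rank = {'c': 0, 'd': 1, 'a': 2, 'b': 3, 'e': 4}    # hypothetical initial order
    print(conservative_upgrade(rank, {'a', 'b'}))      # induced order: a < c < d < b < e
    print(radical_upgrade(rank, {'a', 'b'}))           # induced order: a < b < c < d < e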

194 K₀ ⟹ find out that ϕ ⟹ K_t = K₀ ∗ ϕ. M₀ ⟹ find out that ϕ ⟹ M_t = M₀|ϕ. Eric Pacuit 66


More information

Base Revision in Description Logics - Preliminary Results

Base Revision in Description Logics - Preliminary Results Base Revision in Description Logics - Preliminary Results Márcio Moretto Ribeiro and Renata Wassermann Institute of Mathematics and Statistics University of São Paulo Brazil, {marciomr,renata}@ime.usp.br

More information

To every formula scheme there corresponds a property of R. This relationship helps one to understand the logic being studied.

To every formula scheme there corresponds a property of R. This relationship helps one to understand the logic being studied. Modal Logic (2) There appeared to be a correspondence between the validity of Φ Φ and the property that the accessibility relation R is reflexive. The connection between them is that both relied on the

More information

Preference and its Dynamics

Preference and its Dynamics Department of Philosophy,Tsinghua University 28 August, 2012, EASLLC Table of contents 1 Introduction 2 Betterness model and dynamics 3 Priorities and dynamics 4 Relating betterness and priority dynamics

More information

COMP219: Artificial Intelligence. Lecture 19: Logic for KR

COMP219: Artificial Intelligence. Lecture 19: Logic for KR COMP219: Artificial Intelligence Lecture 19: Logic for KR 1 Overview Last time Expert Systems and Ontologies Today Logic as a knowledge representation scheme Propositional Logic Syntax Semantics Proof

More information

From Onions to Broccoli: Generalizing Lewis s counterfactual logic

From Onions to Broccoli: Generalizing Lewis s counterfactual logic From Onions to Broccoli: Generalizing Lewis s counterfactual logic Patrick Girard Stanford University May 18, 2005 Abstract We present a generalization of Segerberg s onion semantics for belief revision,

More information

Conflict-Based Belief Revision Operators in Possibilistic Logic

Conflict-Based Belief Revision Operators in Possibilistic Logic Conflict-Based Belief Revision Operators in Possibilistic Logic Author Qi, Guilin, Wang, Kewen Published 2012 Conference Title Proceedings of the Twenty-Sixth AAAI Conference on Artificial Intelligence

More information

Indicative conditionals

Indicative conditionals Indicative conditionals PHIL 43916 November 14, 2012 1. Three types of conditionals... 1 2. Material conditionals... 1 3. Indicatives and possible worlds... 4 4. Conditionals and adverbs of quantification...

More information

COMP219: Artificial Intelligence. Lecture 19: Logic for KR

COMP219: Artificial Intelligence. Lecture 19: Logic for KR COMP219: Artificial Intelligence Lecture 19: Logic for KR 1 Overview Last time Expert Systems and Ontologies Today Logic as a knowledge representation scheme Propositional Logic Syntax Semantics Proof

More information

Reversed Squares of Opposition in PAL and DEL

Reversed Squares of Opposition in PAL and DEL Center for Logic and Analytical Philosophy, University of Leuven lorenz.demey@hiw.kuleuven.be SQUARE 2010, Corsica Goal of the talk public announcement logic (PAL), and dynamic epistemic logic (DEL) in

More information

Prime Forms and Minimal Change in Propositional Belief Bases

Prime Forms and Minimal Change in Propositional Belief Bases Annals of Mathematics and Artificial Intelligence manuscript No. (will be inserted by the editor) Prime Forms and Minimal Change in Propositional Belief Bases J. Marchi G. Bittencourt L. Perrussel Received:

More information

Modal Logic. Introductory Lecture. Eric Pacuit. University of Maryland, College Park ai.stanford.edu/ epacuit. January 31, 2012.

Modal Logic. Introductory Lecture. Eric Pacuit. University of Maryland, College Park ai.stanford.edu/ epacuit. January 31, 2012. Modal Logic Introductory Lecture Eric Pacuit University of Maryland, College Park ai.stanford.edu/ epacuit January 31, 2012 Modal Logic 1/45 Setting the Stage Modern Modal Logic began with C.I. Lewis dissatisfaction

More information

Update As Evidence: Belief Expansion

Update As Evidence: Belief Expansion Update As Evidence: Belief Expansion Roman Kuznets and Thomas Studer Institut für Informatik und angewandte Mathematik Universität Bern {kuznets, tstuder}@iam.unibe.ch http://www.iam.unibe.ch/ltg Abstract.

More information

Lecture 1: The Humean Thesis on Belief

Lecture 1: The Humean Thesis on Belief Lecture 1: The Humean Thesis on Belief Hannes Leitgeb LMU Munich October 2014 What do a perfectly rational agent s beliefs and degrees of belief have to be like in order for them to cohere with each other?

More information

Formal Epistemology: Lecture Notes. Horacio Arló-Costa Carnegie Mellon University

Formal Epistemology: Lecture Notes. Horacio Arló-Costa Carnegie Mellon University Formal Epistemology: Lecture Notes Horacio Arló-Costa Carnegie Mellon University hcosta@andrew.cmu.edu Logical preliminaries Let L 0 be a language containing a complete set of Boolean connectives, including

More information

Comment on Leitgeb s Stability Theory of Belief

Comment on Leitgeb s Stability Theory of Belief Comment on Leitgeb s Stability Theory of Belief Hanti Lin Kevin T. Kelly Carnegie Mellon University {hantil, kk3n}@andrew.cmu.edu Hannes Leitgeb s stability theory of belief provides three synchronic constraints

More information

Logics for Belief as Maximally Plausible Possibility

Logics for Belief as Maximally Plausible Possibility Logics for Belief as Maximally Plausible Possibility Giacomo Bonanno Department of Economics, University of California, Davis, USA gfbonanno@ucdavis.edu Abstract We consider a basic logic with two primitive

More information

Foundations of Artificial Intelligence

Foundations of Artificial Intelligence Foundations of Artificial Intelligence 7. Propositional Logic Rational Thinking, Logic, Resolution Wolfram Burgard, Maren Bennewitz, and Marco Ragni Albert-Ludwigs-Universität Freiburg Contents 1 Agents

More information

Human interpretation and reasoning about conditionals

Human interpretation and reasoning about conditionals Human interpretation and reasoning about conditionals Niki Pfeifer 1 Munich Center for Mathematical Philosophy Language and Cognition Ludwig-Maximilians-Universität München www.users.sbg.ac.at/~pfeifern/

More information

Foundations of Artificial Intelligence

Foundations of Artificial Intelligence Foundations of Artificial Intelligence 7. Propositional Logic Rational Thinking, Logic, Resolution Joschka Boedecker and Wolfram Burgard and Bernhard Nebel Albert-Ludwigs-Universität Freiburg May 17, 2016

More information

Partial Meet Revision and Contraction in Logic Programs

Partial Meet Revision and Contraction in Logic Programs Partial Meet Revision and Contraction in Logic Programs Sebastian Binnewies and Zhiqiang Zhuang and Kewen Wang School of Information and Communication Technology Griffith University, QLD, Australia {s.binnewies;

More information

Measuring the Good and the Bad in Inconsistent Information

Measuring the Good and the Bad in Inconsistent Information Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence Measuring the Good and the Bad in Inconsistent Information John Grant Department of Computer Science, University

More information

Base Belief Change and Optimized Recovery 1

Base Belief Change and Optimized Recovery 1 Base Belief Change and Optimized Recovery 1 Frances Johnson 2 and Stuart C. Shapiro University at Buffalo, CSE Dept., Buffalo, NY, USA flj shapiro@cse.buffalo.edu Abstract. Optimized Recovery (OR) adds

More information

Chapter 1: The Logic of Compound Statements. January 7, 2008

Chapter 1: The Logic of Compound Statements. January 7, 2008 Chapter 1: The Logic of Compound Statements January 7, 2008 Outline 1 1.1 Logical Form and Logical Equivalence 2 1.2 Conditional Statements 3 1.3 Valid and Invalid Arguments Central notion of deductive

More information

Explanations, Belief Revision and Defeasible Reasoning

Explanations, Belief Revision and Defeasible Reasoning Explanations, Belief Revision and Defeasible Reasoning Marcelo A. Falappa a Gabriele Kern-Isberner b Guillermo R. Simari a a Artificial Intelligence Research and Development Laboratory Department of Computer

More information

Epistemic Informativeness

Epistemic Informativeness Epistemic Informativeness Yanjing Wang and Jie Fan Abstract In this paper, we introduce and formalize the concept of epistemic informativeness (EI) of statements: the set of new propositions that an agent

More information

Handout: Proof of the completeness theorem

Handout: Proof of the completeness theorem MATH 457 Introduction to Mathematical Logic Spring 2016 Dr. Jason Rute Handout: Proof of the completeness theorem Gödel s Compactness Theorem 1930. For a set Γ of wffs and a wff ϕ, we have the following.

More information

Hempel s Logic of Confirmation

Hempel s Logic of Confirmation Hempel s Logic of Confirmation Franz Huber, California Institute of Technology May 21, 2007 Contents 1 Hempel s Conditions of Adequacy 3 2 Carnap s Analysis of Hempel s Conditions 4 3 Conflicting Concepts

More information

A Contraction Core for Horn Belief Change: Preliminary Report

A Contraction Core for Horn Belief Change: Preliminary Report A Contraction Core for Horn Belief Change: Preliminary Report Richard Booth University of Luxembourg Luxembourg richard.booth@uni.lu Thomas Meyer Meraka Institute, CSIR South Africa tommie.meyer@meraka.org.za

More information

For True Conditionalizers Weisberg s Paradox is a False Alarm

For True Conditionalizers Weisberg s Paradox is a False Alarm For True Conditionalizers Weisberg s Paradox is a False Alarm Franz Huber Abstract: Weisberg (2009) introduces a phenomenon he terms perceptual undermining He argues that it poses a problem for Jeffrey

More information

A Qualitative Theory of Dynamic Interactive Belief Revision

A Qualitative Theory of Dynamic Interactive Belief Revision A Qualitative Theory of Dynamic Interactive Belief Revision Alexandru Baltag 1 Sonja Smets 2,3 1 Computing Laboratory Oxford University Oxford OX1 3QD, United Kingdom 2 Center for Logic and Philosophy

More information

For True Conditionalizers Weisberg s Paradox is a False Alarm

For True Conditionalizers Weisberg s Paradox is a False Alarm For True Conditionalizers Weisberg s Paradox is a False Alarm Franz Huber Department of Philosophy University of Toronto franz.huber@utoronto.ca http://huber.blogs.chass.utoronto.ca/ July 7, 2014; final

More information

Levels of Knowledge and Belief Computational Social Choice Seminar

Levels of Knowledge and Belief Computational Social Choice Seminar Levels of Knowledge and Belief Computational Social Choice Seminar Eric Pacuit Tilburg University ai.stanford.edu/~epacuit November 13, 2009 Eric Pacuit 1 Introduction and Motivation Informal Definition:

More information

Reasoning with Probabilities. Eric Pacuit Joshua Sack. Outline. Basic probability logic. Probabilistic Epistemic Logic.

Reasoning with Probabilities. Eric Pacuit Joshua Sack. Outline. Basic probability logic. Probabilistic Epistemic Logic. Reasoning with July 28, 2009 Plan for the Course Day 1: Introduction and Background Day 2: s Day 3: Dynamic s Day 4: Reasoning with Day 5: Conclusions and General Issues Probability language Let Φ be a

More information

Iterated Belief Change: A Transition System Approach

Iterated Belief Change: A Transition System Approach Iterated Belief Change: A Transition System Approach Aaron Hunter and James P. Delgrande School of Computing Science Simon Fraser University Burnaby, BC, Canada {amhunter, jim}@cs.sfu.ca Abstract We use

More information

Symbolic Logic 3. For an inference to be deductively valid it is impossible for the conclusion to be false if the premises are true.

Symbolic Logic 3. For an inference to be deductively valid it is impossible for the conclusion to be false if the premises are true. Symbolic Logic 3 Testing deductive validity with truth tables For an inference to be deductively valid it is impossible for the conclusion to be false if the premises are true. So, given that truth tables

More information

Agent Communication and Belief Change

Agent Communication and Belief Change Agent Communication and Belief Change Satoshi Tojo JAIST November 30, 2014 Satoshi Tojo (JAIST) Agent Communication and Belief Change November 30, 2014 1 / 34 .1 Introduction.2 Introspective Agent.3 I

More information

Towards Tractable Inference for Resource-Bounded Agents

Towards Tractable Inference for Resource-Bounded Agents Towards Tractable Inference for Resource-Bounded Agents Toryn Q. Klassen Sheila A. McIlraith Hector J. Levesque Department of Computer Science University of Toronto Toronto, Ontario, Canada {toryn,sheila,hector}@cs.toronto.edu

More information

In Defense of Jeffrey Conditionalization

In Defense of Jeffrey Conditionalization In Defense of Jeffrey Conditionalization Franz Huber Department of Philosophy University of Toronto Please do not cite! December 31, 2013 Contents 1 Introduction 2 2 Weisberg s Paradox 3 3 Jeffrey Conditionalization

More information

First-Degree Entailment

First-Degree Entailment March 5, 2013 Relevance Logics Relevance logics are non-classical logics that try to avoid the paradoxes of material and strict implication: p (q p) p (p q) (p q) (q r) (p p) q p (q q) p (q q) Counterintuitive?

More information

09 Modal Logic II. CS 3234: Logic and Formal Systems. October 14, Martin Henz and Aquinas Hobor

09 Modal Logic II. CS 3234: Logic and Formal Systems. October 14, Martin Henz and Aquinas Hobor Martin Henz and Aquinas Hobor October 14, 2010 Generated on Thursday 14 th October, 2010, 11:40 1 Review of Modal Logic 2 3 4 Motivation Syntax and Semantics Valid Formulas wrt Modalities Correspondence

More information

Modal Logics. Most applications of modal logic require a refined version of basic modal logic.

Modal Logics. Most applications of modal logic require a refined version of basic modal logic. Modal Logics Most applications of modal logic require a refined version of basic modal logic. Definition. A set L of formulas of basic modal logic is called a (normal) modal logic if the following closure

More information

Propositional Logic: Logical Agents (Part I)

Propositional Logic: Logical Agents (Part I) Propositional Logic: Logical Agents (Part I) This lecture topic: Propositional Logic (two lectures) Chapter 7.1-7.4 (this lecture, Part I) Chapter 7.5 (next lecture, Part II) Next lecture topic: First-order

More information

Overview. Knowledge-Based Agents. Introduction. COMP219: Artificial Intelligence. Lecture 19: Logic for KR

Overview. Knowledge-Based Agents. Introduction. COMP219: Artificial Intelligence. Lecture 19: Logic for KR COMP219: Artificial Intelligence Lecture 19: Logic for KR Last time Expert Systems and Ontologies oday Logic as a knowledge representation scheme Propositional Logic Syntax Semantics Proof theory Natural

More information

PLAYING WITH KNOWLEDGE AND BELIEF

PLAYING WITH KNOWLEDGE AND BELIEF PLAYING WITH KNOWLEDGE AND BELIEF Virginie Fiutek PLAYING WITH KNOWLEDGE AND BELIEF ILLC Dissertation Series DS-2013-02 For further information about ILLC-publications, please contact Institute for Logic,

More information

Dynamic Logics of Knowledge and Access

Dynamic Logics of Knowledge and Access Dynamic Logics of Knowledge and Access Tomohiro Hoshi (thoshi@stanford.edu) Department of Philosophy Stanford University Eric Pacuit (e.j.pacuit@uvt.nl) Tilburg Center for Logic and Philosophy of Science

More information

Logic: Propositional Logic (Part I)

Logic: Propositional Logic (Part I) Logic: Propositional Logic (Part I) Alessandro Artale Free University of Bozen-Bolzano Faculty of Computer Science http://www.inf.unibz.it/ artale Descrete Mathematics and Logic BSc course Thanks to Prof.

More information

Dalal s Revision without Hamming Distance

Dalal s Revision without Hamming Distance Dalal s Revision without Hamming Distance Pilar Pozos-Parra 1, Weiru Liu 2, and Laurent Perrussel 3 1 University of Tabasco, Mexico pilar.pozos@ujat.mx 2 Queen s University Belfast, UK w.liu@qub.ac.uk

More information

Logical Agents. Knowledge based agents. Knowledge based agents. Knowledge based agents. The Wumpus World. Knowledge Bases 10/20/14

Logical Agents. Knowledge based agents. Knowledge based agents. Knowledge based agents. The Wumpus World. Knowledge Bases 10/20/14 0/0/4 Knowledge based agents Logical Agents Agents need to be able to: Store information about their environment Update and reason about that information Russell and Norvig, chapter 7 Knowledge based agents

More information

A Semantic Approach for Iterated Revision in Possibilistic Logic

A Semantic Approach for Iterated Revision in Possibilistic Logic Proceedings of the Twenty-Third AAAI Conference on Artificial Intelligence (2008) A Semantic Approach for Iterated Revision in Possibilistic Logic Guilin Qi AIFB Universität Karlsruhe D-76128 Karlsruhe,

More information

Propositional Logic: Logical Agents (Part I)

Propositional Logic: Logical Agents (Part I) Propositional Logic: Logical Agents (Part I) First Lecture Today (Tue 21 Jun) Read Chapters 1 and 2 Second Lecture Today (Tue 21 Jun) Read Chapter 7.1-7.4 Next Lecture (Thu 23 Jun) Read Chapters 7.5 (optional:

More information

Non-monotonic Logic I

Non-monotonic Logic I Non-monotonic Logic I Bridges between classical and non-monotonic consequences Michal Peliš 1 Common reasoning monotonicity Γ ϕ Γ ϕ can fail caused by: background knowledge, implicit facts, presuppositions,

More information

Logics of Rational Agency Lecture 3

Logics of Rational Agency Lecture 3 Logics of Rational Agency Lecture 3 Eric Pacuit Tilburg Institute for Logic and Philosophy of Science Tilburg Univeristy ai.stanford.edu/~epacuit July 29, 2009 Eric Pacuit: LORI, Lecture 3 1 Plan for the

More information

Comp487/587 - Boolean Formulas

Comp487/587 - Boolean Formulas Comp487/587 - Boolean Formulas 1 Logic and SAT 1.1 What is a Boolean Formula Logic is a way through which we can analyze and reason about simple or complicated events. In particular, we are interested

More information

Modal Logic. UIT2206: The Importance of Being Formal. Martin Henz. March 19, 2014

Modal Logic. UIT2206: The Importance of Being Formal. Martin Henz. March 19, 2014 Modal Logic UIT2206: The Importance of Being Formal Martin Henz March 19, 2014 1 Motivation The source of meaning of formulas in the previous chapters were models. Once a particular model is chosen, say

More information

7. Propositional Logic. Wolfram Burgard and Bernhard Nebel

7. Propositional Logic. Wolfram Burgard and Bernhard Nebel Foundations of AI 7. Propositional Logic Rational Thinking, Logic, Resolution Wolfram Burgard and Bernhard Nebel Contents Agents that think rationally The wumpus world Propositional logic: syntax and semantics

More information

The Muddy Children:A logic for public announcement

The Muddy Children:A logic for public announcement The Muddy Children: Jesse Technical University of Eindhoven February 10, 2007 The Muddy Children: Outline 1 2 3 4 The Muddy Children: Quincy Prescott Baba: At least one of you is muddy. Quincy: I don t

More information

Logic Databases (Knowledge Bases)

Logic Databases (Knowledge Bases) 491 Knowledge Representation Logic Databases (Knowledge Bases) Marek Sergot Department of Computing Imperial College, London January 2011 v1.0e Database/knowledge base as model We have a language L in

More information

SEMANTICAL CONSIDERATIONS ON NONMONOTONIC LOGIC. Robert C. Moore Artificial Intelligence Center SRI International, Menlo Park, CA 94025

SEMANTICAL CONSIDERATIONS ON NONMONOTONIC LOGIC. Robert C. Moore Artificial Intelligence Center SRI International, Menlo Park, CA 94025 SEMANTICAL CONSIDERATIONS ON NONMONOTONIC LOGIC Robert C. Moore Artificial Intelligence Center SRI International, Menlo Park, CA 94025 ABSTRACT Commonsense reasoning is "nonmonotonic" in the sense that

More information

CS206 Lecture 21. Modal Logic. Plan for Lecture 21. Possible World Semantics

CS206 Lecture 21. Modal Logic. Plan for Lecture 21. Possible World Semantics CS206 Lecture 21 Modal Logic G. Sivakumar Computer Science Department IIT Bombay siva@iitb.ac.in http://www.cse.iitb.ac.in/ siva Page 1 of 17 Thu, Mar 13, 2003 Plan for Lecture 21 Modal Logic Possible

More information