Outline. Part 1: Motivation Part 2: Probabilistic Databases Part 3: Weighted Model Counting Part 4: Lifted Inference for WFOMC


1 Outline Part 1: Motivation Part 2: Probabilistic Databases Part 3: Weighted Model Counting Part 4: Lifted Inference for WFOMC Part 5: Completeness of Lifted Inference Part 6: Query Compilation Part 7: Symmetric Lifted Inference Complexity Part 8: Open-World Probabilistic Databases Part 9: Discussion & Conclusions

4 Informal: Defining Lifted Inference
Exploit symmetries, reason at the first-order level, reason about groups of objects, scalable inference, high-level probabilistic reasoning, etc. [Poole 03, etc.]
A formal definition: domain-lifted inference [VdB 11, Jaeger 12]
Inference runs in time polynomial in the number of objects in the domain: polynomial in #people, #webpages, #cards; not polynomial in #predicates, #formulas, #logical variables. Related to data complexity in databases.
Alternative in this tutorial: Lifted inference = Query Plan = FO Compilation

8 Asymmetric WFOMC Rules
Preprocess Q (omitted from this talk; see [Suciu 11]), then apply these rules (some have preconditions):
Negation: WMC(¬Δ) = Z − WMC(Δ), where the normalization constant Z is easy to compute.
Independent join / union:
WMC(Δ1 ∧ Δ2) = WMC(Δ1) · WMC(Δ2)
WMC(Δ1 ∨ Δ2) = Z − (Z1 − WMC(Δ1)) · (Z2 − WMC(Δ2))
Independent project:
WMC(∃z Δ) = Z − Π_{C ∈ Domain} (Z_C − WMC(Δ[C/z]))
WMC(∀z Δ) = Π_{C ∈ Domain} WMC(Δ[C/z])
Inclusion/exclusion:
WMC(Δ1 ∧ Δ2) = WMC(Δ1) + WMC(Δ2) − WMC(Δ1 ∨ Δ2)
WMC(Δ1 ∨ Δ2) = WMC(Δ1) + WMC(Δ2) − WMC(Δ1 ∧ Δ2)
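The rules above can be sanity-checked by brute-force model counting on a tiny propositional example; a minimal Python sketch with unit weights (so WMC = model count) and toy formulas Δ1 = A ∨ B, Δ2 = C of our own choosing:

```python
from itertools import product

def mc(f, n):
    # brute-force model count of a Boolean function f over n variables
    return sum(1 for bits in product([0, 1], repeat=n) if f(bits))

# Delta1 = A ∨ B over {A, B}; Delta2 = C over {C}: independent supports
mc1 = mc(lambda b: b[0] or b[1], 2)   # 3 models
mc2 = mc(lambda b: b[0], 1)           # 1 model
Z1, Z2, Z = 2**2, 2**1, 2**3

# Negation: MC(¬Delta1) = Z1 − MC(Delta1)
assert mc(lambda b: not (b[0] or b[1]), 2) == Z1 - mc1
# Independent join: MC(Delta1 ∧ Delta2) = MC(Delta1) · MC(Delta2)
assert mc(lambda b: (b[0] or b[1]) and b[2], 3) == mc1 * mc2
# Independent union: MC(Delta1 ∨ Delta2) = Z − (Z1 − MC1)(Z2 − MC2)
assert mc(lambda b: b[0] or b[1] or b[2], 3) == Z - (Z1 - mc1) * (Z2 - mc2)
print(mc1, mc2)  # 3 1
```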

10 Symmetric WFOMC Rules [VdB 11]
Simplification of independent project: if Δ[C1/z], Δ[C2/z], … are independent (symmetric weights):
WMC(∃z Δ) = Z − (Z_C1 − WMC(Δ[C1/z]))^|Domain|
WMC(∀z Δ) = WMC(Δ[C1/z])^|Domain|
A powerful new inference rule: atom counting. Only possible with symmetric weights. Intuition: remove unary relations. The workhorse of symmetric WFOMC.
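The exponentiation form of independent project can be verified by brute force on a small sentence; the sketch below uses unit weights and the sentence Δ = ∀z (Stress(z) ⇒ Smokes(z)), with domain size n = 4 as our own toy choice:

```python
from itertools import product

# Check WMC(∀z Δ) = WMC(Δ[C1/z])^|Domain| for Δ = Stress(z) ⇒ Smokes(z)
n = 4
count = sum(1 for stress in product([0, 1], repeat=n)
              for smokes in product([0, 1], repeat=n)
              if all((not st) or sm for st, sm in zip(stress, smokes)))
per_constant = 3          # models of Stress(C1) ⇒ Smokes(C1)
assert count == per_constant ** n
print(count)  # 81
```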

16 WFOMC Inference: Example
FO model counting: w(R) = w(¬R) = 1 for every predicate R. Apply the inference rules backwards, step by step.
4. Δ = Stress(Alice) ⇒ Smokes(Alice), i.e. ¬(Stress(Alice) ∧ ¬Smokes(Alice)), Domain = {Alice}: 3 models
WMC(Δ) = Z − WMC(Stress(Alice) ∧ ¬Smokes(Alice)) = Z − WMC(Stress(Alice)) · WMC(¬Smokes(Alice)) = 4 − 1 · 1 = 3
3. Δ = ∀x (Stress(x) ⇒ Smokes(x)), Domain = {n people}: 3^n models

24 WFOMC Inference: Example
3. Δ = ∀x (Stress(x) ⇒ Smokes(x)), Domain = {n people}: 3^n models
2. Δ = ∀y (ParentOf(y) ∧ Female ⇒ MotherOf(y)), D = {n people}
If Female = true: Δ = ∀y (ParentOf(y) ⇒ MotherOf(y)): 3^n models
If Female = false: Δ = true: 4^n models
WMC(Δ) = WMC(¬Female ∨ ∀y (ParentOf(y) ⇒ MotherOf(y)))
= 2 · 2^n · 2^n − (2 − 1) · (2^n · 2^n − WMC(∀y (ParentOf(y) ⇒ MotherOf(y))))
= 2 · 4^n − (4^n − 3^n) = 3^n + 4^n models
1. Δ = ∀x ∀y (ParentOf(x,y) ∧ Female(x) ⇒ MotherOf(x,y)), D = {n people}: (3^n + 4^n)^n models
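Step 2 of the derivation can be confirmed by enumerating all interpretations for a small domain; a brute-force sketch with a toy domain size n = 3 of our own choosing:

```python
from itertools import product

# Δ = ∀y (ParentOf(y) ∧ Female ⇒ MotherOf(y)) over {Female} ∪
# {ParentOf(y), MotherOf(y) : y}; expected count 3^n + 4^n
n = 3
count = 0
for female in (0, 1):
    for parent in product([0, 1], repeat=n):
        for mother in product([0, 1], repeat=n):
            if all((not (p and female)) or m for p, m in zip(parent, mother)):
                count += 1
assert count == 3**n + 4**n
print(count)  # 91
```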

39 Atom Counting: Example
Δ = ∀x,y (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)), Domain = {n people}
If we know precisely who smokes, and there are k smokers?
Database: Smokes(Alice) = 1, Smokes(Bob) = 0, Smokes(Charlie) = 0, Smokes(Dave) = 1, Smokes(Eve) = 0, ...
(The slide's diagram partitions people into k smokers and n−k non-smokers and marks which Friends edges are forced.) Friends(x,y) is forced false exactly when x is a smoker and y a non-smoker, so k(n−k) atoms are fixed and the rest are free: 2^(n² − k(n−k)) models.
If we only know that there are k smokers: C(n,k) · 2^(n² − k(n−k)) models.
In total: Σ_{k=0..n} C(n,k) · 2^(n² − k(n−k)) models.
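The atom-counting formula can be checked against brute-force enumeration for a small domain; a sketch with toy domain size n = 3:

```python
from itertools import product
from math import comb

# Δ = ∀x,y (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)): brute force vs.
# the atom-counting sum Σ_k C(n,k) · 2^(n² − k(n−k))
n = 3
brute = 0
for smokes in product([0, 1], repeat=n):
    for friends in product([0, 1], repeat=n * n):
        brute += all((not (smokes[x] and friends[x * n + y])) or smokes[y]
                     for x in range(n) for y in range(n))
lifted = sum(comb(n, k) * 2 ** (n * n - k * (n - k)) for k in range(n + 1))
assert brute == lifted
print(brute)  # 1792
```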

44 Augment Rules with Logical Rewritings [Suciu 11]
1. Remove constants (shattering)
Δ = ∀x (Friend(Alice, x) ∨ Friend(x, Bob))
F1(x) = Friend(Alice, x), F2(x) = Friend(x, Bob), F3 = Friend(Alice, Alice), F4 = Friend(Alice, Bob), F5 = Friend(Bob, Bob)
Δ = ∀x ∉ {Alice, Bob} (F1(x) ∨ F2(x)) ∧ (F3 ∨ F4) ∧ (F4 ∨ F5)
2. Rank variables (= occur in the same order in each atom)
Δ = (Friend(x,y) ∨ Enemy(x,y)) ∧ (Friend(x,y) ∨ Enemy(y,x))   ← wrong order in Enemy(y,x)
F1(u,v) = Friend(u,v), u < v; F2(u) = Friend(u,u); F3(u,v) = Friend(v,u), v < u
E1(u,v) = Enemy(u,v), u < v; E2(u) = Enemy(u,u); E3(u,v) = Enemy(v,u), v < u
Δ = (F1(x,y) ∨ E1(x,y)) ∧ (F1(x,y) ∨ E3(x,y)) ∧ (F2(x) ∨ E2(x)) ∧ (F3(x,y) ∨ E3(x,y)) ∧ (F3(x,y) ∨ E1(x,y))
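Shattering must preserve the set of models; this can be checked by enumeration on a small domain (the 3-element domain below and all variable names are our own toy choices):

```python
from itertools import product

# Δ = ∀x (Friend(Alice,x) ∨ Friend(x,Bob)) vs. its shattered form
dom = ["Alice", "Bob", "Carol"]
idx = {(a, b): i for i, (a, b) in enumerate(product(dom, dom))}
orig_count = shat_count = 0
for bits in product([0, 1], repeat=len(idx)):
    F = lambda a, b: bits[idx[(a, b)]]
    orig = all(F("Alice", x) or F(x, "Bob") for x in dom)
    shat = (all(F("Alice", x) or F(x, "Bob")
                for x in dom if x not in ("Alice", "Bob"))    # generic part
            and (F("Alice", "Alice") or F("Alice", "Bob"))    # F3 ∨ F4
            and (F("Alice", "Bob") or F("Bob", "Bob")))       # F4 ∨ F5
    orig_count += orig
    shat_count += shat
assert orig_count == shat_count
print(orig_count)  # 240
```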

45 Augment Rules with Logical Rewritings
3. Perform resolution [Gribkoff 14]
Δ = ∀x ∀y (R(x) ∨ S(x,y)) ∧ ∀x ∀y (¬S(x,y) ∨ T(y))   — the rules get stuck
Resolution on S(x,y) yields ∀x ∀y (R(x) ∨ T(y)). Add the resolvent:
Δ = ∀x ∀y (R(x) ∨ S(x,y)) ∧ ∀x ∀y (¬S(x,y) ∨ T(y)) ∧ ∀x ∀y (R(x) ∨ T(y))
Now apply inclusion/exclusion!
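Because the resolvent is entailed, adding it cannot change the model count; a brute-force check on a toy domain of size 2 (our own choice):

```python
from itertools import product

# Δ = ∀x,y (R(x) ∨ S(x,y)) ∧ ∀x,y (¬S(x,y) ∨ T(y)); resolvent ∀x,y (R(x) ∨ T(y))
n = 2
counts = []
for with_resolvent in (False, True):
    c = 0
    for R in product([0, 1], repeat=n):
        for T in product([0, 1], repeat=n):
            for S in product([0, 1], repeat=n * n):
                ok = all((R[x] or S[x * n + y]) and ((not S[x * n + y]) or T[y])
                         for x in range(n) for y in range(n))
                if with_resolvent:
                    ok = ok and all(R[x] or T[y] for x in range(n) for y in range(n))
                c += ok
    counts.append(c)
assert counts[0] == counts[1]   # same models with or without the resolvent
print(counts[0])
```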

46 Augment Rules with Logical Rewritings
4. Skolemization [VdB 14]
Δ = ∀p ∃c Card(p,c) — but the inference rules assume one type of quantifier!
Mixed ∀/∃ arises in encodings of MLNs with quantifiers and of probabilistic programs:
Datalog: smokes(x) :- friends(x,y), smokes(y).
FOL: Δ = ∀x, Smokes(x) ⇐ ∃y, Friends(x,y) ∧ Smokes(y)
Skolemization takes a mixed ∀/∃ input and outputs only ∀. BUT: we cannot introduce Skolem constants or functions, as in ∀p, Card(p, S(p))!

52 Skolemization: Example [VdB 14]
Δ = ∀p ∃c Card(p,c)
Skolemization: Δ′ = ∀p ∀c (Card(p,c) ⇒ S(p)), with Skolem predicate S and weights w(S) = 1, w(¬S) = −1.
Consider one position p:
∃c Card(p,c) = true: then S(p) must be true. Also a model of Δ; weight × 1.
∃c Card(p,c) = false: S(p) = true is not a model of Δ, weight × 1; S(p) = false is not a model of Δ, weight × −1. The extra models cancel out.
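The cancellation argument can be verified numerically: the weighted count of the Skolemized theory equals the model count of the original sentence. A sketch with toy sizes P positions and C cards (our own choice):

```python
from itertools import product

# Weighted count of ∀p ∀c (Card(p,c) ⇒ S(p)) with w(S)=1, w(¬S)=−1
# must equal the model count of ∀p ∃c Card(p,c), i.e. (2^C − 1)^P
P, C = 2, 2
total = 0
for card in product([0, 1], repeat=P * C):
    for s in product([0, 1], repeat=P):
        if all((not card[p * C + c]) or s[p] for p in range(P) for c in range(C)):
            weight = 1
            for sp in s:
                weight *= 1 if sp else -1   # w(S) = 1, w(¬S) = −1
            total += weight
assert total == (2**C - 1)**P
print(total)  # 9
```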

58 First-Order Knowledge Compilation [VdB 11, 13]
Markov Logic: 3.14 Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)
Weight function: w(Smokes) = w(¬Smokes) = 1, w(Friends) = w(¬Friends) = 1, w(F) = exp(3.14), w(¬F) = 1
FOL sentence: ∀x,y, F(x,y) ⇔ [Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)]
Compile into a first-order d-DNNF circuit. Given the domain {Alice, Bob, Charlie}: Z = WFOMC.
Evaluation in time polynomial in domain size: domain-lifted!
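For this MLN the partition function Z can be computed two ways on a tiny domain: brute force over all ground worlds, and a lifted closed form obtained by counting over the number of smokers k. The closed form below is our own derivation (a smoker/non-smoker pair contributes 1 + e^w when summing over Friends, every other pair contributes 2e^w), not a formula stated on the slide:

```python
from itertools import product
from math import comb, exp

n, w = 2, 3.14
ew = exp(w)

# brute force over ground Smokes and Friends assignments
Z = 0.0
for smokes in product([0, 1], repeat=n):
    for friends in product([0, 1], repeat=n * n):
        weight = 1.0
        for x in range(n):
            for y in range(n):
                sat = (not (smokes[x] and friends[x * n + y])) or smokes[y]
                weight *= ew if sat else 1.0
        Z += weight

# lifted closed form via atom counting on the number of smokers k
Z_lifted = sum(comb(n, k) * (1 + ew) ** (k * (n - k)) * (2 * ew) ** (n * n - k * (n - k))
               for k in range(n + 1))
assert abs(Z - Z_lifted) < 1e-6 * Z
print(Z)
```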

59 Negation Normal Form [Darwiche 01]

60 Decomposable NNF [Darwiche 01]

61 Deterministic Decomposable NNF [Darwiche 01]

63 Deterministic Decomposable NNF: supports Weighted Model Counting, and much more! [Darwiche 01]

64 First-Order NNF [VdB 13]

65 First-Order Decomposability Decomposable [VdB 13]

67 First-Order Determinism Deterministic [VdB 13]

68 First-Order NNF = Query Plan [VdB 13]

71 Deterministic Decomposable FO NNF: Weighted Model Counting [VdB 13]
(Pr(belgian) × Pr(likes) + Pr(¬belgian))^|People|

72 Symmetric WFOMC on FO NNF Complexity polynomial in domain size! Polynomial in NNF size for bounded depth. [VdB 13]

73 How to do first-order knowledge compilation?

79 Deterministic Decomposable FO NNF [VdB 13]
Δ = ∀x,y ∈ People, (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y))
The compiled circuit is deterministic and decomposable.

80 Compilation Rules [VdB 13]
Standard rules: Shannon decomposition (DPLL), detect decomposability, etc.
Plus a first-order Shannon decomposition: case analysis on a first-order atom of Δ.

81 Playing Cards Revisited... Let us automate this:
Relational model:
∀p ∃c Card(p,c)
∀c ∃p Card(p,c)
∀p ∀c ∀c′ (Card(p,c) ∧ Card(p,c′) ⇒ c = c′)
Lifted probabilistic inference algorithm

84 Why not do propositional WMC? [VdB 15]
Reduce to propositional model counting:
Δ = (Card(A,p1) ∨ … ∨ Card(2,p1)) ∧ (Card(A,p2) ∨ … ∨ Card(2,p2)) ∧ …       every position holds a card
∧ (Card(A,p1) ∨ … ∨ Card(A,p52)) ∧ … ∧ (Card(K,p1) ∨ … ∨ Card(K,p52))       every card is somewhere
∧ (¬Card(A,p1) ∨ ¬Card(A,p2)) ∧ (¬Card(A,p1) ∨ ¬Card(A,p3)) ∧ …             a card is in at most one position
What will happen?
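The grounded theory's models are exactly the perfect matchings between cards and positions, so the count is n!. A brute-force sketch for a toy deck of n = 3 cards and positions (our own downscaling of the 52-card deck):

```python
from itertools import product
from math import factorial

n = 3  # tiny deck: n cards, n positions
count = 0
for bits in product([0, 1], repeat=n * n):
    card = lambda c, p: bits[c * n + p]
    ok = (all(any(card(c, p) for c in range(n)) for p in range(n))       # every position holds a card
          and all(any(card(c, p) for p in range(n)) for c in range(n))   # every card is somewhere
          and all(not (card(c, p) and card(c, q))                        # a card is in ≤ 1 position
                  for c in range(n) for p in range(n) for q in range(p + 1, n)))
    count += ok
assert count == factorial(n)   # models = perfect matchings
print(count)  # 6
```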

90 Deck of Cards, Graphically [VdB 15]
A bipartite graph: cards A, 2, …, K on one side, positions p1, …, p52 on the other; an edge such as Card(K,p52) pairs a card with a position. One model = one perfect matching. Model counting: how many perfect matchings?

93 Deck of Cards, Graphically [VdB 15]
What if I set w(Card(K,p52)) = 0? That deletes the edge. What if I can set any asymmetric weight function?

94 Observations [VdB 15]
An asymmetric weight function can remove any edge, so it can encode any bipartite graph. Counting models = counting perfect matchings, a #P-complete problem!
All non-lifted WMC solvers efficiently handle asymmetric weights, yet no such solver does the cards problem efficiently.
Later: the power of lifted vs. ground inference, and complexities.
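As a concrete instance of edge removal: with one edge deleted, the perfect matchings are those avoiding that edge, numbering n! − (n−1)!. A small sketch (the mapping of Card(K,p52) to edge (0,0) is our own toy encoding):

```python
from itertools import permutations
from math import factorial

# complete bipartite graph minus one fixed edge, e.g. w(Card(K,p52)) = 0,
# modeled here as forbidding edge (card 0, position 0)
n = 4
count = sum(1 for perm in permutations(range(n)) if perm[0] != 0)
assert count == factorial(n) - factorial(n - 1)
print(count)  # 18
```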

104 Playing Cards Revisited [VdB 15]
∀p ∃c Card(p,c)
∀c ∃p Card(p,c)
∀p ∀c ∀c′ (Card(p,c) ∧ Card(p,c′) ⇒ c = c′)
Skolemization:
∀p ∀c (Card(p,c) ⇒ S1(p)),  w(S1) = 1, w(¬S1) = −1
∀c ∀p (Card(p,c) ⇒ S2(c)),  w(S2) = 1, w(¬S2) = −1
∀p ∀c ∀c′ (Card(p,c) ∧ Card(p,c′) ⇒ c = c′)
Atom counting (removes the unary S1 and S2):
∀p ∀c ∀c′ (Card(p,c) ∧ Card(p,c′) ⇒ c = c′)
Independent project (∀-rule) on p:
∀c ∀c′ (Card(c) ∧ Card(c′) ⇒ c = c′)
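The Skolemized weighted theory should count exactly the n! deck configurations of the original mixed-quantifier theory; a brute-force check for a toy deck of n = 2 (our own downscaling):

```python
from itertools import product
from math import factorial

# WFOMC of the Skolemized card theory, with w(¬S1) = w(¬S2) = −1
n = 2
total = 0
for card in product([0, 1], repeat=n * n):
    for s1 in product([0, 1], repeat=n):
        for s2 in product([0, 1], repeat=n):
            C = lambda p, c: card[p * n + c]
            ok = (all((not C(p, c)) or s1[p] for p in range(n) for c in range(n))
                  and all((not C(p, c)) or s2[c] for p in range(n) for c in range(n))
                  and all(not (C(p, c) and C(p, d))            # position holds ≤ 1 card
                          for p in range(n) for c in range(n) for d in range(c + 1, n)))
            if ok:
                wgt = 1
                for b in s1 + s2:
                    wgt *= 1 if b else -1
                total += wgt
assert total == factorial(n)   # extra Skolem models cancel out
print(total)  # 2
```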

105 Playing Cards Revisited... Let us automate this: Lifted probabilistic inference algorithm Computed in time polynomial in n [VdB 15]

106 Summary: Lifted Inference
By definition: PTIME data complexity. Also: FO compilation = query plan. However, this only works for liftable queries, and relies on preprocessing based on logical rewriting.
The rules are deceptively simple: the only surprising ones are inclusion/exclusion and atom counting. All rules are captured by a query plan or a first-order NNF circuit.


More information

Symmetric Weighted First-Order Model Counting

Symmetric Weighted First-Order Model Counting Symmetric Weighted First-Order Model Counting ABSTRACT Paul Beame University of Washington beame@cs.washington.edu Eric Gribkoff University of Washington eagribko@cs.washington.edu The FO Model Counting

More information

Predicate Logic: Introduction and Translations

Predicate Logic: Introduction and Translations Predicate Logic: Introduction and Translations Alice Gao Lecture 10 Based on work by J. Buss, L. Kari, A. Lubiw, B. Bonakdarpour, D. Maftuleac, C. Roberts, R. Trefler, and P. Van Beek 1/29 Outline Predicate

More information

Compiling Knowledge into Decomposable Negation Normal Form

Compiling Knowledge into Decomposable Negation Normal Form Compiling Knowledge into Decomposable Negation Normal Form Adnan Darwiche Cognitive Systems Laboratory Department of Computer Science University of California Los Angeles, CA 90024 darwiche@cs. ucla. edu

More information

Fast Lifted MAP Inference via Partitioning

Fast Lifted MAP Inference via Partitioning Fast Lifted MAP Inference via Partitioning Somdeb Sarkhel The University of Texas at Dallas Parag Singla I.I.T. Delhi Abstract Vibhav Gogate The University of Texas at Dallas Recently, there has been growing

More information

Propositional Reasoning

Propositional Reasoning Propositional Reasoning CS 440 / ECE 448 Introduction to Artificial Intelligence Instructor: Eyal Amir Grad TAs: Wen Pu, Yonatan Bisk Undergrad TAs: Sam Johnson, Nikhil Johri Spring 2010 Intro to AI (CS

More information

! Predicates! Variables! Quantifiers. ! Universal Quantifier! Existential Quantifier. ! Negating Quantifiers. ! De Morgan s Laws for Quantifiers

! Predicates! Variables! Quantifiers. ! Universal Quantifier! Existential Quantifier. ! Negating Quantifiers. ! De Morgan s Laws for Quantifiers Sec$on Summary (K. Rosen notes for Ch. 1.4, 1.5 corrected and extended by A.Borgida)! Predicates! Variables! Quantifiers! Universal Quantifier! Existential Quantifier! Negating Quantifiers! De Morgan s

More information

173 Logic and Prolog 2013 With Answers and FFQ

173 Logic and Prolog 2013 With Answers and FFQ 173 Logic and Prolog 2013 With Answers and FFQ Please write your name on the bluebook. You may have two sides of handwritten notes. There are 75 points (minutes) total. Stay cool and please write neatly.

More information

LOGIC PROPOSITIONAL REASONING

LOGIC PROPOSITIONAL REASONING LOGIC PROPOSITIONAL REASONING WS 2017/2018 (342.208) Armin Biere Martina Seidl biere@jku.at martina.seidl@jku.at Institute for Formal Models and Verification Johannes Kepler Universität Linz Version 2018.1

More information

Encoding formulas with partially constrained weights in a possibilistic-like many-sorted propositional logic

Encoding formulas with partially constrained weights in a possibilistic-like many-sorted propositional logic Encoding formulas with partially constrained weights in a possibilistic-like many-sorted propositional logic Salem Benferhat CRIL-CNRS, Université d Artois rue Jean Souvraz 62307 Lens Cedex France benferhat@criluniv-artoisfr

More information

Logic. Propositional Logic: Syntax

Logic. Propositional Logic: Syntax Logic Propositional Logic: Syntax Logic is a tool for formalizing reasoning. There are lots of different logics: probabilistic logic: for reasoning about probability temporal logic: for reasoning about

More information

Knowledge Compilation of Logic Programs Using Approximation Fixpoint Theory

Knowledge Compilation of Logic Programs Using Approximation Fixpoint Theory Under consideration for publication in Theory and Practice of Logic Programming 1 Knowledge Compilation of Logic Programs Using Approximation Fixpoint Theory Bart Bogaerts and Guy Van den Broeck Department

More information

Schema Refinement: Other Dependencies and Higher Normal Forms

Schema Refinement: Other Dependencies and Higher Normal Forms Schema Refinement: Other Dependencies and Higher Normal Forms Spring 2018 School of Computer Science University of Waterloo Databases CS348 (University of Waterloo) Higher Normal Forms 1 / 14 Outline 1

More information

Combining logic with probability Motivation

Combining logic with probability Motivation Combining logic with probability Motivation is a powerful language to represent complex relational information Probability is the standard way to represent uncertainty in knowledge Combining the two would

More information

Do not start until you are given the green signal

Do not start until you are given the green signal SOLUTIONS CSE 311 Winter 2011: Midterm Exam (closed book, closed notes except for 1-page summary) Total: 100 points, 5 questions. Time: 50 minutes Instructions: 1. Write your name and student ID on the

More information

Computing Query Probability with Incidence Algebras Technical Report UW-CSE University of Washington

Computing Query Probability with Incidence Algebras Technical Report UW-CSE University of Washington Computing Query Probability with Incidence Algebras Technical Report UW-CSE-10-03-02 University of Washington Nilesh Dalvi, Karl Schnaitter and Dan Suciu Revised: August 24, 2010 Abstract We describe an

More information

Structure Learning with Hidden Data in Relational Domains

Structure Learning with Hidden Data in Relational Domains Structure Learning with Hidden Data in Relational Domains Tushar Khot, Sriraam Natarajan, Kristian Kersting +, Jude Shavlik University of Wisconsin-Madison, USA Wake Forest University School of Medicine,

More information

Packet #1: Logic & Proofs. Applied Discrete Mathematics

Packet #1: Logic & Proofs. Applied Discrete Mathematics Packet #1: Logic & Proofs Applied Discrete Mathematics Table of Contents Course Objectives Page 2 Propositional Calculus Information Pages 3-13 Course Objectives At the conclusion of this course, you should

More information

On the Compilation of Stratified Belief Bases under Linear and Possibilistic Logic Policies

On the Compilation of Stratified Belief Bases under Linear and Possibilistic Logic Policies Salem Benferhat CRIL-CNRS, Université d Artois rue Jean Souvraz 62307 Lens Cedex. France. benferhat@cril.univ-artois.fr On the Compilation of Stratified Belief Bases under Linear and Possibilistic Logic

More information

Query Evaluation on Probabilistic Databases. CSE 544: Wednesday, May 24, 2006

Query Evaluation on Probabilistic Databases. CSE 544: Wednesday, May 24, 2006 Query Evaluation on Probabilistic Databases CSE 544: Wednesday, May 24, 2006 Problem Setting Queries: Tables: Review A(x,y) :- Review(x,y), Movie(x,z), z > 1991 name rating p Movie Monkey Love good.5 title

More information

arxiv: v2 [cs.ai] 29 Jul 2014

arxiv: v2 [cs.ai] 29 Jul 2014 Understanding the Complexity of Lifted Inference and Asymmetric Weighted Model Counting arxiv:1405.3250v2 [cs.ai] 29 Jul 2014 Eric Gribkoff University of Washington eagribko@cs.uw.edu Abstract In this

More information

Thinking of Nested Quantification

Thinking of Nested Quantification Section 1.5 Section Summary Nested Quantifiers Order of Quantifiers Translating from Nested Quantifiers into English Translating Mathematical Statements into Statements involving Nested Quantifiers. Translating

More information

Logic: The Big Picture

Logic: The Big Picture Logic: The Big Picture A typical logic is described in terms of syntax: what are the legitimate formulas semantics: under what circumstances is a formula true proof theory/ axiomatization: rules for proving

More information

University of California, San Diego Department of Computer Science and Engineering CSE 20. Solutions to Midterm Exam Winter 2018

University of California, San Diego Department of Computer Science and Engineering CSE 20. Solutions to Midterm Exam Winter 2018 University of California, San Diego Department of Computer Science and Engineering CSE 20 Solutions to Midterm Exam Winter 2018 Problem 1 (30 points) a. The boolean function f(p, q) = (q p) ( q p) is specified

More information

Propositional Logic: Evaluating the Formulas

Propositional Logic: Evaluating the Formulas Institute for Formal Models and Verification Johannes Kepler University Linz VL Logik (LVA-Nr. 342208) Winter Semester 2015/2016 Propositional Logic: Evaluating the Formulas Version 2015.2 Armin Biere

More information

CS280, Spring 2004: Final

CS280, Spring 2004: Final CS280, Spring 2004: Final 1. [4 points] Which of the following relations on {0, 1, 2, 3} is an equivalence relation. (If it is, explain why. If it isn t, explain why not.) Just saying Yes or No with no

More information

Independencies. Undirected Graphical Models 2: Independencies. Independencies (Markov networks) Independencies (Bayesian Networks)

Independencies. Undirected Graphical Models 2: Independencies. Independencies (Markov networks) Independencies (Bayesian Networks) (Bayesian Networks) Undirected Graphical Models 2: Use d-separation to read off independencies in a Bayesian network Takes a bit of effort! 1 2 (Markov networks) Use separation to determine independencies

More information

Algebraic Model Counting

Algebraic Model Counting Algebraic Model Counting Angelika Kimmig a,, Guy Van den Broeck b, Luc De Raedt a a Department of Computer Science, KU Leuven, Celestijnenlaan 200a box 2402, 3001 Heverlee, Belgium. b UCLA Computer Science

More information

An Integer Polynomial Programming Based Framework for Lifted MAP Inference

An Integer Polynomial Programming Based Framework for Lifted MAP Inference An Integer Polynomial Programming Based Framework for Lifted MAP Inference Somdeb Sarkhel, Deepak Venugopal Computer Science Department The University of Texas at Dallas {sxs104721,dxv02}@utdallas.edu

More information

Propositional Logic: Syntax

Propositional Logic: Syntax Logic Logic is a tool for formalizing reasoning. There are lots of different logics: probabilistic logic: for reasoning about probability temporal logic: for reasoning about time (and programs) epistemic

More information

Examples: P: it is not the case that P. P Q: P or Q P Q: P implies Q (if P then Q) Typical formula:

Examples: P: it is not the case that P. P Q: P or Q P Q: P implies Q (if P then Q) Typical formula: Logic: The Big Picture Logic is a tool for formalizing reasoning. There are lots of different logics: probabilistic logic: for reasoning about probability temporal logic: for reasoning about time (and

More information

3.17 Semantic Tableaux for First-Order Logic

3.17 Semantic Tableaux for First-Order Logic 3.17 Semantic Tableaux for First-Order Logic There are two ways to extend the tableau calculus to quantified formulas: using ground instantiation using free variables Tableaux with Ground Instantiation

More information

Logic. Propositional Logic: Syntax. Wffs

Logic. Propositional Logic: Syntax. Wffs Logic Propositional Logic: Syntax Logic is a tool for formalizing reasoning. There are lots of different logics: probabilistic logic: for reasoning about probability temporal logic: for reasoning about

More information

CMPSCI 601: Tarski s Truth Definition Lecture 15. where

CMPSCI 601: Tarski s Truth Definition Lecture 15. where @ CMPSCI 601: Tarski s Truth Definition Lecture 15! "$#&%(') *+,-!".#/%0'!12 43 5 6 7 8:9 4; 9 9 < = 9 = or 5 6?>A@B!9 2 D for all C @B 9 CFE where ) CGE @B-HI LJKK MKK )HG if H ; C if H @ 1 > > > Fitch

More information

Learning Goals of CS245 Logic and Computation

Learning Goals of CS245 Logic and Computation Learning Goals of CS245 Logic and Computation Alice Gao April 27, 2018 Contents 1 Propositional Logic 2 2 Predicate Logic 4 3 Program Verification 6 4 Undecidability 7 1 1 Propositional Logic Introduction

More information

Model Counting of Query Expressions: Limitations of Propositional Methods

Model Counting of Query Expressions: Limitations of Propositional Methods Model Counting of Query Expressions: Limitations of Propositional Methods Paul Beame University of Washington beame@cs.washington.edu Jerry Li MIT jerryzli@csail.mit.edu Dan Suciu University of Washington

More information

Propositional Logic: Semantics

Propositional Logic: Semantics Propositional Logic: Semantics Alice Gao Lecture 4, September 19, 2017 Semantics 1/56 Announcements Semantics 2/56 The roadmap of propositional logic Semantics 3/56 FCC spectrum auction an application

More information

LING 501, Fall 2004: Quantification

LING 501, Fall 2004: Quantification LING 501, Fall 2004: Quantification The universal quantifier Ax is conjunctive and the existential quantifier Ex is disjunctive Suppose the domain of quantification (DQ) is {a, b}. Then: (1) Ax Px Pa &

More information

Lecture 5 : Proofs DRAFT

Lecture 5 : Proofs DRAFT CS/Math 240: Introduction to Discrete Mathematics 2/3/2011 Lecture 5 : Proofs Instructor: Dieter van Melkebeek Scribe: Dalibor Zelený DRAFT Up until now, we have been introducing mathematical notation

More information

Lifted First-Order Probabilistic Inference

Lifted First-Order Probabilistic Inference Lifted First-Order Probabilistic Inference Rodrigo de Salvo Braz SRI International based on work with Eyal Amir and Dan Roth Outline n Motivation Brief background Lifted Inference example n First-order

More information

Bound and Free Variables. Theorems and Proofs. More valid formulas involving quantifiers:

Bound and Free Variables. Theorems and Proofs. More valid formulas involving quantifiers: Bound and Free Variables More valid formulas involving quantifiers: xp(x) x P(x) Replacing P by P, we get: x P(x) x P(x) Therefore x P(x) xp(x) Similarly, we have xp(x) x P(x) x P(x) xp(x) i(i 2 > i) is

More information

OPPA European Social Fund Prague & EU: We invest in your future.

OPPA European Social Fund Prague & EU: We invest in your future. OPPA European Social Fund Prague & EU: We invest in your future. AE4M33RZN, Fuzzy logic: Introduction, Fuzzy operators Radomír Černoch 19/11/2012 Faculty of Electrical Engineering, CTU in Prague 1 / 44

More information

AE4M33RZN, Fuzzy logic: Introduction, Fuzzy operators

AE4M33RZN, Fuzzy logic: Introduction, Fuzzy operators AE4M33RZN, Fuzzy logic: Introduction, Fuzzy operators Radomír Černoch Faculty of Electrical Engineering, CTU in Prague 2/11/2015 Description logics A description logic is a decideable fragment of first

More information

Logic. Logic is a discipline that studies the principles and methods used in correct reasoning. It includes:

Logic. Logic is a discipline that studies the principles and methods used in correct reasoning. It includes: Logic Logic is a discipline that studies the principles and methods used in correct reasoning It includes: A formal language for expressing statements. An inference mechanism (a collection of rules) to

More information

On the Role of Canonicity in Bottom-up Knowledge Compilation

On the Role of Canonicity in Bottom-up Knowledge Compilation On the Role of Canonicity in Bottom-up Knowledge Compilation Guy Van den Broeck and Adnan Darwiche Computer Science Department University of California, Los Angeles {guyvdb,darwiche}@cs.ucla.edu Abstract

More information

26 : Spectral GMs. Lecturer: Eric P. Xing Scribes: Guillermo A Cidre, Abelino Jimenez G.

26 : Spectral GMs. Lecturer: Eric P. Xing Scribes: Guillermo A Cidre, Abelino Jimenez G. 10-708: Probabilistic Graphical Models, Spring 2015 26 : Spectral GMs Lecturer: Eric P. Xing Scribes: Guillermo A Cidre, Abelino Jimenez G. 1 Introduction A common task in machine learning is to work with

More information

Verification using Satisfiability Checking, Predicate Abstraction, and Craig Interpolation. Himanshu Jain THESIS ORAL TALK

Verification using Satisfiability Checking, Predicate Abstraction, and Craig Interpolation. Himanshu Jain THESIS ORAL TALK Verification using Satisfiability Checking, Predicate Abstraction, and Craig Interpolation Himanshu Jain THESIS ORAL TALK 1 Computer Systems are Pervasive Computer Systems = Software + Hardware Software/Hardware

More information

Theory of Computer Science

Theory of Computer Science Theory of Computer Science B4. Predicate Logic II Malte Helmert University of Basel March 9, 2016 Free and Bound Variables Free and Bound Variables: Motivation Question: Consider a signature with variable

More information

Solving PP PP -Complete Problems Using Knowledge Compilation

Solving PP PP -Complete Problems Using Knowledge Compilation Solving PP PP -Complete Problems Using Knowledge Compilation Umut Oztok and Arthur Choi and Adnan Darwiche Computer Science Department University of California, Los Angeles {umut,aychoi,darwiche}@cs.ucla.edu

More information

Deliberative Agents Knowledge Representation I. Deliberative Agents

Deliberative Agents Knowledge Representation I. Deliberative Agents Deliberative Agents Knowledge Representation I Vasant Honavar Bioinformatics and Computational Biology Program Center for Computational Intelligence, Learning, & Discovery honavar@cs.iastate.edu www.cs.iastate.edu/~honavar/

More information

Tractable Lineages on Treelike Instances: Limits and Extensions

Tractable Lineages on Treelike Instances: Limits and Extensions Tractable Lineages on Treelike Instances: Limits and Extensions Antoine Amarilli 1, Pierre Bourhis 2, Pierre enellart 1,3 June 29th, 2016 1 Télécom ParisTech 2 CNR CRItAL 3 National University of ingapore

More information

A Differential Approach to Inference in Bayesian Networks

A Differential Approach to Inference in Bayesian Networks A Differential Approach to Inference in Bayesian Networks Adnan Darwiche Computer Science Department University of California Los Angeles, Ca 90095 darwiche@cs.ucla.edu We present a new approach to inference

More information

Database Theory VU , SS Complexity of Query Evaluation. Reinhard Pichler

Database Theory VU , SS Complexity of Query Evaluation. Reinhard Pichler Database Theory Database Theory VU 181.140, SS 2018 5. Complexity of Query Evaluation Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität Wien 17 April, 2018 Pichler

More information

Class Assignment Strategies

Class Assignment Strategies Class Assignment Strategies ì Team- A'ack: Team a'ack ì Individualis2c: Search for possible ì Poli2cal: look at others and make decision based on who is winning, who is loosing, and conversa;on ì Emo2on

More information

Some Review Problems for Exam 1: Solutions

Some Review Problems for Exam 1: Solutions Math 3355 Fall 2018 Some Review Problems for Exam 1: Solutions Here is my quick review of proof techniques. I will focus exclusively on propositions of the form p q, or more properly, x P (x) Q(x) or x

More information

Symmetry Breaking for Relational Weighted Model Finding

Symmetry Breaking for Relational Weighted Model Finding Symmetry Breaking for Relational Weighted Model Finding Tim Kopp Parag Singla Henry Kautz WL4AI July 27, 2015 Outline Weighted logics. Symmetry-breaking in SAT. Symmetry-breaking for weighted logics. Weighted

More information

Scaling-up Importance Sampling for Markov Logic Networks

Scaling-up Importance Sampling for Markov Logic Networks Scaling-up Importance Sampling for Markov Logic Networks Deepak Venugopal Department of Computer Science University of Texas at Dallas dxv021000@utdallas.edu Vibhav Gogate Department of Computer Science

More information

How to Compute Probability in MLN: An Tutorial in Examples

How to Compute Probability in MLN: An Tutorial in Examples How to Compute Probability in MLN: An Tutorial in Examples Guangchun Cheng March 8, 2012 1 Preliminary This article illustrates how to compute the probability in Markov logic network (MLN) using examples.

More information

Motivation. CS389L: Automated Logical Reasoning. Lecture 10: Overview of First-Order Theories. Signature and Axioms of First-Order Theory

Motivation. CS389L: Automated Logical Reasoning. Lecture 10: Overview of First-Order Theories. Signature and Axioms of First-Order Theory Motivation CS389L: Automated Logical Reasoning Lecture 10: Overview of First-Order Theories Işıl Dillig Last few lectures: Full first-order logic In FOL, functions/predicates are uninterpreted (i.e., structure

More information

Computing Query Probability with Incidence Algebras

Computing Query Probability with Incidence Algebras Computing Query Probability with Incidence Algebras Nilesh Dalvi Yahoo Research Santa Clara, CA, USA ndalvi@yahoo-inc.com Karl Schnaitter UC Santa Cruz Santa Cruz, CA, USA karlsch@soe.ucsc.edu Dan Suciu

More information

(Yet another) decision procedure for Equality Logic

(Yet another) decision procedure for Equality Logic (Yet another) decision procedure for Equality Logic Ofer Strichman and Orly Meir echnion echnion 1 Equality Logic 0 0 0 1 0 1 E : (x 1 = x 2 Æ (x 2 x 3 Ç x 1 x 3 )) Domain: x 1,x 2,x 3 2 N he satisfiability

More information

Logic. Definition [1] A logic is a formal language that comes with rules for deducing the truth of one proposition from the truth of another.

Logic. Definition [1] A logic is a formal language that comes with rules for deducing the truth of one proposition from the truth of another. Math 0413 Appendix A.0 Logic Definition [1] A logic is a formal language that comes with rules for deducing the truth of one proposition from the truth of another. This type of logic is called propositional.

More information

Foundation of proofs. Jim Hefferon.

Foundation of proofs. Jim Hefferon. Foundation of proofs Jim Hefferon http://joshua.smcvt.edu/proofs The need to prove In Mathematics we prove things To a person with a mathematical turn of mind, the base angles of an isoceles triangle are

More information

Basing Decisions on Sentences in Decision Diagrams

Basing Decisions on Sentences in Decision Diagrams Proceedings of the Twenty-Sixth AAAI Conference on Artificial Intelligence Basing Decisions on Sentences in Decision Diagrams Yexiang Xue Department of Computer Science Cornell University yexiang@cs.cornell.edu

More information

Hashing-Based Approximate Probabilistic Inference in Hybrid Domains: An Abridged Report

Hashing-Based Approximate Probabilistic Inference in Hybrid Domains: An Abridged Report Hashing-Based Approximate Probabilistic Inference in Hybrid Domains: An Abridged Report Vaishak Belle KU Leuven vaishak@cs.kuleuven.be Guy Van den Broeck University of California, Los Angeles guyvdb@cs.ucla.edu

More information

Quantifiers. Leonardo de Moura Microsoft Research

Quantifiers. Leonardo de Moura Microsoft Research Quantifiers Leonardo de Moura Microsoft Research Satisfiability a > b + 2, a = 2c + 10, c + b 1000 SAT a = 0, b = 3, c = 5 Model 0 > 3 + 2, 0 = 2 5 + 10, 5 + ( 3) 1000 Quantifiers x y x > 0 f x, y = 0

More information

Definite Logic Programs

Definite Logic Programs Chapter 2 Definite Logic Programs 2.1 Definite Clauses The idea of logic programming is to use a computer for drawing conclusions from declarative descriptions. Such descriptions called logic programs

More information

Introduction to Isabelle/HOL

Introduction to Isabelle/HOL Introduction to Isabelle/HOL 1 Notes on Isabelle/HOL Notation In Isabelle/HOL: [ A 1 ;A 2 ; ;A n ]G can be read as if A 1 and A 2 and and A n then G 3 Note: -Px (P x) stands for P (x) (P(x)) -P(x, y) can

More information

A. Propositional Logic

A. Propositional Logic CmSc 175 Discrete Mathematics A. Propositional Logic 1. Statements (Propositions ): Statements are sentences that claim certain things. Can be either true or false, but not both. Propositional logic deals

More information

Modal Dependence Logic

Modal Dependence Logic Modal Dependence Logic Jouko Väänänen Institute for Logic, Language and Computation Universiteit van Amsterdam Plantage Muidergracht 24 1018 TV Amsterdam, The Netherlands J.A.Vaananen@uva.nl Abstract We

More information