Reasoning with Bayesian Networks


1 Reasoning with Bayesian Networks. Lecture 1: Probability Calculus, Bayesian Networks. NICTA and ANU

2 Overview of the Course: probability calculus, Bayesian networks; inference by variable elimination, factor elimination, conditioning; models for graph decomposition; most likely instantiations; complexity of probabilistic inference; compiling Bayesian networks; inference with local structure; applications

3 Probability Calculus: Probabilities of Worlds. Worlds are modeled with propositional variables. A world is a complete instantiation of the variables. The probabilities of all worlds add up to 1. An event is a set of worlds.

5 Probability Calculus: Probabilities of Events. Pr(α) ≝ Σ_{ω ∈ α} Pr(ω). Pr(Earthquake) = Pr(ω1) + Pr(ω2) + Pr(ω3) + Pr(ω4) = .1; Pr(¬Burglary) = .8; Pr(Alarm) = .2442

7 Propositional Sentences as Events. Pr((Earthquake ∨ Burglary) ∧ Alarm) = ? Interpret a world ω as a model (satisfying assignment) for a propositional sentence α: Pr(α) ≝ Σ_{ω |= α} Pr(ω)
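The summation over satisfying worlds can be checked on a small explicit joint distribution. The world probabilities below are hypothetical values chosen to be consistent with the numbers quoted on these slides (Pr(Earthquake) = .1, Pr(¬Burglary) = .8, Pr(Alarm) = .2442); worlds are triples (Earthquake, Burglary, Alarm).

```python
# Hypothetical joint distribution over worlds (Earthquake, Burglary, Alarm),
# chosen to match the marginals quoted on the slides.
JOINT = {
    (True,  True,  True):  0.0190,
    (True,  True,  False): 0.0010,
    (True,  False, True):  0.0560,
    (True,  False, False): 0.0240,
    (False, True,  True):  0.1620,
    (False, True,  False): 0.0180,
    (False, False, True):  0.0072,
    (False, False, False): 0.7128,
}

def pr(event):
    """Pr(alpha): sum of Pr(omega) over the worlds omega satisfying the event."""
    return sum(p for world, p in JOINT.items() if event(world))

p_earthquake = pr(lambda w: w[0])        # should come out near .1
p_not_burglary = pr(lambda w: not w[1])  # near .8
p_alarm = pr(lambda w: w[2])             # near .2442
```

An event here is represented by its indicator function on worlds, so any propositional sentence over the three variables can be queried the same way.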

8 Probability Calculus. Suppose we're given evidence β: that Alarm = true. We need to update Pr(·) to Pr(· | β).

14 Probability Calculus: Bayes Conditioning. Pr(β | β) = 1 and Pr(¬β | β) = 0. Partition the worlds into those |= β and those |= ¬β. Pr(ω | β) = 0 for all ω |= ¬β; relative beliefs stay put for the worlds ω |= β. Since Σ_{ω |= β} Pr(ω) = Pr(β) and Σ_{ω |= β} Pr(ω | β) = 1, the above imply that we normalize by the constant Pr(β): Pr(ω | β) ≝ 0 if ω |= ¬β, and Pr(ω)/Pr(β) if ω |= β


16 Probability Calculus: Bayes Conditioning. Pr(α | β) = Σ_{ω |= α} Pr(ω | β)
= Σ_{ω |= α, ω |= β} Pr(ω | β) + Σ_{ω |= α, ω |= ¬β} Pr(ω | β)
= Σ_{ω |= α ∧ β} Pr(ω | β)
= Σ_{ω |= α ∧ β} Pr(ω)/Pr(β)
= (1/Pr(β)) Σ_{ω |= α ∧ β} Pr(ω)
= Pr(α ∧ β)/Pr(β)

20 Bayes Conditioning: Examples. Pr(Burglary) = .2; Pr(Burglary | Alarm) ≈ .741; Pr(Burglary | Alarm ∧ Earthquake) ≈ .253; Pr(Burglary | Alarm ∧ ¬Earthquake) ≈ .957
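A minimal sketch of Bayes conditioning over explicit worlds, using a hypothetical joint distribution over (Earthquake, Burglary, Alarm) chosen to match the numbers quoted on these slides:

```python
# Hypothetical joint over worlds (Earthquake, Burglary, Alarm),
# consistent with the marginals quoted on the slides.
JOINT = {
    (True,  True,  True):  0.0190,
    (True,  True,  False): 0.0010,
    (True,  False, True):  0.0560,
    (True,  False, False): 0.0240,
    (False, True,  True):  0.1620,
    (False, True,  False): 0.0180,
    (False, False, True):  0.0072,
    (False, False, False): 0.7128,
}

def condition(joint, evidence):
    """Bayes conditioning: zero out worlds falsifying the evidence,
    renormalize the surviving worlds by Pr(evidence)."""
    z = sum(p for w, p in joint.items() if evidence(w))
    return {w: (p / z if evidence(w) else 0.0) for w, p in joint.items()}

def pr(joint, event):
    return sum(p for w, p in joint.items() if event(w))

burglary = lambda w: w[1]
p1 = pr(condition(JOINT, lambda w: w[2]), burglary)               # Pr(B | A)
p2 = pr(condition(JOINT, lambda w: w[2] and w[0]), burglary)      # Pr(B | A, E)
p3 = pr(condition(JOINT, lambda w: w[2] and not w[0]), burglary)  # Pr(B | A, ~E)
```

With these world probabilities, the three queries reproduce the .741, .253, and .957 figures from the slides.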

24 Probability Calculus: Independence. Pr(Earthquake) = .1 and Pr(Earthquake | Burglary) = .1. Event α is independent of event β if Pr(α | β) = Pr(α) or Pr(β) = 0, or equivalently Pr(α ∧ β) = Pr(α) Pr(β). Independence is symmetric.

26 Conditional Independence. Independence is a dynamic notion: Burglary is independent of Earthquake, but not after accepting the evidence Alarm: Pr(Burglary | Alarm) ≈ .741 while Pr(Burglary | Alarm ∧ Earthquake) ≈ .253. α is independent of β given γ if Pr(α ∧ β | γ) = Pr(α | γ) Pr(β | γ) or Pr(γ) = 0.
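The dynamic nature of independence can be checked directly on the hypothetical joint over (Earthquake, Burglary, Alarm) used earlier: Burglary and Earthquake are independent a priori, but become dependent once Alarm is observed.

```python
# Hypothetical joint over (Earthquake, Burglary, Alarm) matching the slides.
JOINT = {
    (True,  True,  True):  0.0190,
    (True,  True,  False): 0.0010,
    (True,  False, True):  0.0560,
    (True,  False, False): 0.0240,
    (False, True,  True):  0.1620,
    (False, True,  False): 0.0180,
    (False, False, True):  0.0072,
    (False, False, False): 0.7128,
}

def pr(event, given=lambda w: True):
    """Pr(event | given), by restricting to the evidence and renormalizing."""
    z = sum(p for w, p in JOINT.items() if given(w))
    return sum(p for w, p in JOINT.items() if given(w) and event(w)) / z

burglary = lambda w: w[1]
p_b = pr(burglary)                                        # prior belief
p_b_given_e = pr(burglary, given=lambda w: w[0])          # same: independent
p_b_given_a = pr(burglary, given=lambda w: w[2])          # after Alarm
p_b_given_ae = pr(burglary, given=lambda w: w[2] and w[0])
```

Here Pr(B | E) equals Pr(B), yet Pr(B | A ∧ E) differs sharply from Pr(B | A): learning Earthquake is irrelevant on its own but highly relevant once Alarm is known.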

27 Variable Independence. For disjoint sets of variables X, Y, and Z, I_Pr(X, Z, Y) means: X is independent of Y given Z in Pr, i.e., x is independent of y given z for all instantiations x, y, z.

28 Chain Rule. Pr(α1 ∧ α2 ∧ … ∧ αn) = Pr(α1 | α2 ∧ … ∧ αn) Pr(α2 | α3 ∧ … ∧ αn) … Pr(αn)
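The chain rule can be verified numerically on the hypothetical joint over (Earthquake, Burglary, Alarm) used earlier, decomposing Pr(E ∧ B ∧ A) as Pr(E | B ∧ A) · Pr(B | A) · Pr(A):

```python
# Hypothetical joint over (Earthquake, Burglary, Alarm) matching the slides.
JOINT = {
    (True,  True,  True):  0.0190,
    (True,  True,  False): 0.0010,
    (True,  False, True):  0.0560,
    (True,  False, False): 0.0240,
    (False, True,  True):  0.1620,
    (False, True,  False): 0.0180,
    (False, False, True):  0.0072,
    (False, False, False): 0.7128,
}

def pr(event, given=lambda w: True):
    z = sum(p for w, p in JOINT.items() if given(w))
    return sum(p for w, p in JOINT.items() if given(w) and event(w)) / z

# Left-hand side: probability of the conjunction E ∧ B ∧ A.
lhs = pr(lambda w: w[0] and w[1] and w[2])
# Right-hand side: telescoping product of conditionals, per the chain rule.
rhs = (pr(lambda w: w[0], given=lambda w: w[1] and w[2])
       * pr(lambda w: w[1], given=lambda w: w[2])
       * pr(lambda w: w[2]))
```

The two sides agree (up to floating-point error), as the normalizing constants in the product telescope away.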

29 Case Analysis. Pr(α) = Σ_{i=1}^{n} Pr(α ∧ βi), or equivalently Pr(α) = Σ_{i=1}^{n} Pr(α | βi) Pr(βi), where the events β1, …, βn are mutually exclusive and collectively exhaustive

30 Case Analysis. Special case of n = 2: Pr(α) = Pr(α ∧ β) + Pr(α ∧ ¬β), or equivalently Pr(α) = Pr(α | β) Pr(β) + Pr(α | ¬β) Pr(¬β)

31 Bayes Rule. Pr(α | β) = Pr(β | α) Pr(α) / Pr(β)

34 Bayes Rule: Example. A patient tested positive for a disease. D: patient has the disease; T: test comes out positive. We would like to know Pr(D | T). We know Pr(D) = 1/1000, Pr(T | ¬D) = 2/100 (false positive rate), and Pr(¬T | D) = 5/100 (false negative rate). Using Bayes rule: Pr(D | T) = Pr(T | D) Pr(D) / Pr(T)

36 Bayes Rule: Computing Pr(T) by case analysis. Pr(T) = Pr(T | D) Pr(D) + Pr(T | ¬D) Pr(¬D) = (.95)(.001) + (.02)(.999) = .02093, so Pr(D | T) = (.95)(.001)/.02093 ≈ 4.5%
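The disease-test computation, as a short script (the .95 sensitivity is 1 minus the stated false negative rate):

```python
# Bayes rule on the disease-test example from the slides.
p_d = 1 / 1000               # Pr(D): prior probability of disease
p_t_given_not_d = 2 / 100    # Pr(T | ~D): false positive rate
p_not_t_given_d = 5 / 100    # Pr(~T | D): false negative rate

p_t_given_d = 1 - p_not_t_given_d  # Pr(T | D) = 0.95

# Case analysis: Pr(T) = Pr(T | D) Pr(D) + Pr(T | ~D) Pr(~D)
p_t = p_t_given_d * p_d + p_t_given_not_d * (1 - p_d)

# Bayes rule: Pr(D | T) = Pr(T | D) Pr(D) / Pr(T)
p_d_given_t = p_t_given_d * p_d / p_t
```

Despite the fairly accurate test, the posterior is only about 4.5%, because the disease is rare: the prior Pr(D) dominates the result.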

37 Probability Calculus: Soft Evidence. So far we've considered hard evidence: evidence that some event has occurred. Soft evidence: a neighbor with a hearing problem calls to tell us they heard the alarm. This may not confirm Alarm, but it can still increase our belief in Alarm. There are two main methods to specify the strength of soft evidence.

39 Soft Evidence: All Things Considered. "After receiving my neighbor's call, my belief in Alarm is now .85." Soft evidence as a constraint Pr′(β) = q: not a statement about the strength of the evidence per se, but the result of its integration with our initial beliefs. Scale beliefs in worlds |= β so they add up to q; scale beliefs in worlds |= ¬β so they add up to 1 − q.

42 Soft Evidence: All Things Considered. For worlds: Pr′(ω) ≝ (q/Pr(β)) Pr(ω) if ω |= β, and ((1 − q)/Pr(¬β)) Pr(ω) if ω |= ¬β. For events (Jeffrey's rule): Pr′(α) = q Pr(α | β) + (1 − q) Pr(α | ¬β). Bayes conditioning is the special case q = 1.
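A sketch of Jeffrey's rule on the hypothetical joint over (Earthquake, Burglary, Alarm) used earlier, with q = .85 as on the slide: scaling the worlds gives exactly the same answer as the event-level formula.

```python
# Hypothetical joint over (Earthquake, Burglary, Alarm) matching the slides.
JOINT = {
    (True,  True,  True):  0.0190,
    (True,  True,  False): 0.0010,
    (True,  False, True):  0.0560,
    (True,  False, False): 0.0240,
    (False, True,  True):  0.1620,
    (False, True,  False): 0.0180,
    (False, False, True):  0.0072,
    (False, False, False): 0.7128,
}
q = 0.85  # new belief in Alarm after the neighbor's call

def pr(joint, event, given=lambda w: True):
    z = sum(p for w, p in joint.items() if given(w))
    return sum(p for w, p in joint.items() if given(w) and event(w)) / z

alarm = lambda w: w[2]
p_beta = pr(JOINT, alarm)

# Scale worlds |= beta to total q, worlds |= ~beta to total 1 - q.
POSTERIOR = {w: p * (q / p_beta if w[2] else (1 - q) / (1 - p_beta))
             for w, p in JOINT.items()}

burglary = lambda w: w[1]
via_worlds = pr(POSTERIOR, burglary)
# Jeffrey's rule at the level of events:
via_jeffrey = (q * pr(JOINT, burglary, given=alarm)
               + (1 - q) * pr(JOINT, burglary, given=lambda w: not w[2]))
```

Setting q = 1 in the same code recovers ordinary Bayes conditioning on Alarm.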

47 Soft Evidence: Nothing Else Considered. We would like to declare the strength of evidence independently of current beliefs. This needs the notion of the odds of an event β: O(β) ≝ Pr(β)/Pr(¬β). Declare evidence on β by a Bayes factor k = O′(β)/O(β): "my neighbor's call increases the odds of Alarm by a factor of 4." Compute Pr′(β) from k and fall back on Jeffrey's rule to update beliefs.
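Converting a Bayes factor into the constraint Pr′(β) = q needed by Jeffrey's rule; Pr(Alarm) = .2442 is taken from the slides, and k = 4 is the neighbor's factor.

```python
p_beta = 0.2442   # current belief Pr(Alarm), from the slides
k = 4             # Bayes factor: the call quadruples the odds of Alarm

odds = p_beta / (1 - p_beta)   # O(beta) = Pr(beta) / Pr(~beta)
new_odds = k * odds            # O'(beta) = k * O(beta)
q = new_odds / (1 + new_odds)  # Pr'(beta), recovered from the new odds
```

The resulting q (about .564 here) is then plugged into Jeffrey's rule from the previous slides to update beliefs in all other events.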

48 Probabilistic Reasoning. (Next: capturing independence graphically, conditional probability tables, graphoid axioms, d-separation.) Basic task: belief updating in the face of hard and soft evidence. This can be done in principle using the definitions we've seen, but it is inefficient: it requires joint probability tables, which are exponential in size. There is also a modeling difficulty: specifying joint probabilities is tedious, and beliefs (e.g., independencies) held by human experts are not easily enforced.

52 Capturing Independence Graphically. Evidence R could increase Pr(A) and Pr(C), but not if we know A: C is independent of R given A.

53 Capturing Independence Graphically. Parents(V): variables with an edge to V. Descendants(V): variables to which there is a directed path from V. Nondescendants(V): all variables except V, Parents(V), and Descendants(V).

55 Markovian Assumptions. Every variable is conditionally independent of its nondescendants given its parents: I(V, Parents(V), Nondescendants(V)). Given the direct causes of a variable, our beliefs in that variable are no longer influenced by any other variable, except possibly by its effects.

56 Capturing Independence Graphically. Markov(G) for the example network: I(C, A, {B, E, R}), I(R, E, {A, B, C}), I(A, {B, E}, R), I(B, ∅, {E, R}), I(E, ∅, B)

57 Conditional Probability Tables. A DAG G together with Markov(G) declares a set of independencies, but it does not define Pr. Pr is uniquely defined if a conditional probability table (CPT) is provided for each variable X. A CPT specifies Pr(x | u) for every value x of variable X and every instantiation u of its parents U.

58 Conditional Probability Tables. For the example network: Pr(c | a), Pr(r | e), Pr(a | b, e), Pr(e), Pr(b)


60 Conditional Probability Tables. Half of the probabilities are redundant, since each CPT row sums to 1; only 10 probabilities are needed for the whole DAG.

61 Bayesian Network. A pair (G, Θ), where G is a DAG over variables Z, called the network structure, and Θ is a set of CPTs, one for each variable in Z, called the network parametrization.

62 Bayesian Network: Notation. Θ_{X|U}: the CPT for X and its parents U. XU: a network family. θ_{x|u} = Pr(x | u): a network parameter. Σ_x θ_{x|u} = 1 for every parent instantiation u.

63 Bayesian Network: Notation. A network instantiation, e.g., a, b, c, d, e. θ_{x|u} ∼ z: the instantiations xu and z are compatible. θ_a, θ_{b|a}, θ_{c|a}, θ_{d|b,c}, θ_{e|c} are all ∼ the instantiation z above.

64 Bayesian Network: Chain Rule. The independence constraints imposed by the network structure (DAG) and the numeric constraints imposed by the network parametrization (CPTs) are satisfied by a unique distribution Pr: Pr(z) ≝ Π_{θ_{x|u} ∼ z} θ_{x|u} (chain rule). A Bayesian network is thus an implicit representation of this probability distribution.

65 Chain Rule Example. Pr(a, b, c, d, e) = θ_a θ_{b|a} θ_{c|a} θ_{d|b,c} θ_{e|c} = (.6)(.2)(.2)(.9)(.1) = .00216
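The chain-rule product on this slide, computed directly (the five parameter values are the ones quoted on the slide):

```python
import math

# theta_a, theta_{b|a}, theta_{c|a}, theta_{d|b,c}, theta_{e|c}
params = [0.6, 0.2, 0.2, 0.9, 0.1]

# Chain rule: the probability of a full network instantiation is the
# product of the network parameters compatible with it.
p = math.prod(params)
```

This is the whole inference primitive for a complete instantiation: no joint table is ever materialized, only one parameter per variable is looked up.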

66 Compactness of Bayesian Networks. Let d be the maximum variable domain size and k the maximum number of parents. The size of any CPT is O(d^{k+1}), and the total number of network parameters is O(n d^{k+1}) for an n-variable network. This is compact as long as k is relatively small, compared with a joint probability table of up to d^n entries.

67 Properties of the Distribution. The probability distribution Pr represented by a Bayesian network (G, Θ) is guaranteed to satisfy every independence assumption in Markov(G): I_Pr(X, Parents(X), Nondescendants(X)). It may satisfy more; further independencies can be derived using the graphoid axioms.

68 Graphoid Axioms: Symmetry. I_Pr(X, Z, Y) ⇔ I_Pr(Y, Z, X). If learning y does not influence our belief in x, then learning x does not influence our belief in y either.

69 Graphoid Axioms: Decomposition. I_Pr(X, Z, Y ∪ W) ⇒ I_Pr(X, Z, Y) and I_Pr(X, Z, W). If learning yw does not influence our belief in x, then learning y alone, or w alone, does not influence our belief in x either: if some information is irrelevant, then any part of it is also irrelevant. The reverse does not hold in general: two pieces of information may each be irrelevant on their own, yet their combination may be relevant.

70 Graphoid Axioms: Decomposition. I_Pr(X, Z, Y ∪ W) ⇒ I_Pr(X, Z, Y) and I_Pr(X, Z, W). In particular, decomposition lets us strengthen Markov(G) to I_Pr(X, Parents(X), W) for any W ⊆ Nondescendants(X). This is useful for proving the chain rule for Bayesian networks from the chain rule of probability calculus.

71 Graphoid Axioms: Weak Union. I_Pr(X, Z, Y ∪ W) ⇒ I_Pr(X, Z ∪ Y, W). If the information yw is not relevant to our belief in x, then the partial information y will not make the rest of the information, w, relevant. In particular, weak union lets us strengthen Markov(G) to I_Pr(X, Parents(X) ∪ W, Nondescendants(X) \ W) for any W ⊆ Nondescendants(X).

72 Graphoid Axioms: Contraction. I_Pr(X, Z, Y) and I_Pr(X, Z ∪ Y, W) ⇒ I_Pr(X, Z, Y ∪ W). If, after learning the irrelevant information y, the information w is found to be irrelevant to our belief in x, then the combined information yw must have been irrelevant to start with.

73 Graphoid Axioms: Intersection. I_Pr(X, Z ∪ W, Y) and I_Pr(X, Z ∪ Y, W) ⇒ I_Pr(X, Z, Y ∪ W), for strictly positive distributions Pr. If information w is irrelevant given y, and information y is irrelevant given w, then their combination is irrelevant to start with. This is not true in general when there are zero probabilities (determinism).

74 Graphical Test of Independence. The graphoid axioms produce independencies beyond Markov(G), but deriving independencies from the axioms is nontrivial; we need an efficient method. d-separation is a linear-time graphical test.

75 d-separation. dsep_G(X, Z, Y): every path between X and Y is blocked by Z. Soundness: dsep_G(X, Z, Y) implies I_Pr(X, Z, Y) for every Pr induced by G. d-separation also enjoys weak completeness (discussed later).

77 d-separation: Paths and Valves (figures)

78 d-separation: Path Blocking. A path is blocked if any valve on it is closed. A sequential valve (→ W →) is closed iff W ∈ Z. A divergent valve (← W →) is closed iff W ∈ Z. A convergent valve (→ W ←) is closed iff W ∉ Z and none of Descendants(W) appears in Z.

80 d-separation: Examples (figures)

81 d-separation: Complexity. We need not enumerate the paths between X and Y: dsep_G(X, Z, Y) holds iff X and Y are disconnected in the new DAG G′ obtained by pruning G as follows: repeatedly delete any leaf node W ∉ X ∪ Y ∪ Z, then delete all edges outgoing from nodes in Z. The test is linear in the size of G.
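A sketch of the pruning-based test described on this slide, under the assumption that the DAG is given as a set of directed edges (u, v); the helper name dsep is hypothetical.

```python
def dsep(edges, X, Z, Y):
    """d-separation by pruning: repeatedly delete leaf nodes outside
    X ∪ Y ∪ Z, delete edges leaving Z, then check whether X and Y are
    disconnected in the resulting graph (ignoring edge directions)."""
    edges = set(edges)
    X, Y, Z = set(X), set(Y), set(Z)
    keep = X | Y | Z
    nodes = {n for e in edges for n in e} | keep

    # Repeatedly delete leaf nodes (no outgoing edges) not in X ∪ Y ∪ Z.
    changed = True
    while changed:
        changed = False
        for n in list(nodes):
            if n not in keep and all(u != n for u, _ in edges):
                nodes.discard(n)
                edges = {(u, v) for u, v in edges if n not in (u, v)}
                changed = True

    # Delete all edges outgoing from nodes in Z.
    edges = {(u, v) for u, v in edges if u not in Z}

    # X and Y are d-separated iff no undirected path connects them.
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, frontier = set(X), list(X)
    while frontier:
        for m in adj[frontier.pop()]:
            if m not in seen:
                seen.add(m)
                frontier.append(m)
    return not (seen & Y)
```

On the chain A → B → C this reports that A and C are d-separated by {B} but not by ∅; on the collider A → C ← B it reports the reverse, matching the valve rules above.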

82 d-separation: Soundness. dsep_G(X, Z, Y) implies I_Pr(X, Z, Y) for every Pr induced by G. This is proved by showing that every independence claimed by d-separation can be derived from the graphoid axioms.

83 d-separation: Completeness? Does I_Pr(X, Z, Y) imply dsep_G(X, Z, Y)? No. Consider the chain X → Y → Z: X is not d-separated from Z (given ∅). However, if θ_{y|x} = θ_{y|¬x} then X and Y are independent, and so are X and Z. This is not surprising, as d-separation is based on the structure alone.

84 d-separation: Weak Completeness. For every DAG G, there exists a parametrization Θ such that dsep_G(X, Z, Y) holds iff I_Pr(X, Z, Y), where Pr is induced by the Bayesian network (G, Θ). This implies that one cannot improve on d-separation as a graphical test.

85 Summary. Probabilities of worlds and events; Bayes conditioning; independence; chain rule; case analysis; Bayes rule; two methods for soft evidence. Capturing independence graphically; Markovian assumptions; conditional probability tables; Bayesian networks; the chain rule for Bayesian networks; graphoid axioms; d-separation.


More information

Directed Graphical Models

Directed Graphical Models CS 2750: Machine Learning Directed Graphical Models Prof. Adriana Kovashka University of Pittsburgh March 28, 2017 Graphical Models If no assumption of independence is made, must estimate an exponential

More information

Probabilistic representation and reasoning

Probabilistic representation and reasoning Probabilistic representation and reasoning Applied artificial intelligence (EDA132) Lecture 09 2017-02-15 Elin A. Topp Material based on course book, chapter 13, 14.1-3 1 Show time! Two boxes of chocolates,

More information

Bayesian Belief Network

Bayesian Belief Network Bayesian Belief Network a! b) = a) b) toothache, catch, cavity, Weather = cloudy) = = Weather = cloudy) toothache, catch, cavity) The decomposition of large probabilistic domains into weakly connected

More information

CS 188: Artificial Intelligence Spring Announcements

CS 188: Artificial Intelligence Spring Announcements CS 188: Artificial Intelligence Spring 2011 Lecture 16: Bayes Nets IV Inference 3/28/2011 Pieter Abbeel UC Berkeley Many slides over this course adapted from Dan Klein, Stuart Russell, Andrew Moore Announcements

More information

Bayes Networks. CS540 Bryan R Gibson University of Wisconsin-Madison. Slides adapted from those used by Prof. Jerry Zhu, CS540-1

Bayes Networks. CS540 Bryan R Gibson University of Wisconsin-Madison. Slides adapted from those used by Prof. Jerry Zhu, CS540-1 Bayes Networks CS540 Bryan R Gibson University of Wisconsin-Madison Slides adapted from those used by Prof. Jerry Zhu, CS540-1 1 / 59 Outline Joint Probability: great for inference, terrible to obtain

More information

Probabilistic Graphical Models. Rudolf Kruse, Alexander Dockhorn Bayesian Networks 153

Probabilistic Graphical Models. Rudolf Kruse, Alexander Dockhorn Bayesian Networks 153 Probabilistic Graphical Models Rudolf Kruse, Alexander Dockhorn Bayesian Networks 153 The Big Objective(s) In a wide variety of application fields two main problems need to be addressed over and over:

More information

Probability. CS 3793/5233 Artificial Intelligence Probability 1

Probability. CS 3793/5233 Artificial Intelligence Probability 1 CS 3793/5233 Artificial Intelligence 1 Motivation Motivation Random Variables Semantics Dice Example Joint Dist. Ex. Axioms Agents don t have complete knowledge about the world. Agents need to make decisions

More information

Fusion in simple models

Fusion in simple models Fusion in simple models Peter Antal antal@mit.bme.hu A.I. February 8, 2018 1 Basic concepts of probability theory Joint distribution Conditional probability Bayes rule Chain rule Marginalization General

More information

Our learning goals for Bayesian Nets

Our learning goals for Bayesian Nets Our learning goals for Bayesian Nets Probability background: probability spaces, conditional probability, Bayes Theorem, subspaces, random variables, joint distribution. The concept of conditional independence

More information

Uncertainty. Logic and Uncertainty. Russell & Norvig. Readings: Chapter 13. One problem with logical-agent approaches: C:145 Artificial

Uncertainty. Logic and Uncertainty. Russell & Norvig. Readings: Chapter 13. One problem with logical-agent approaches: C:145 Artificial C:145 Artificial Intelligence@ Uncertainty Readings: Chapter 13 Russell & Norvig. Artificial Intelligence p.1/43 Logic and Uncertainty One problem with logical-agent approaches: Agents almost never have

More information

Introduction to Artificial Intelligence. Unit # 11

Introduction to Artificial Intelligence. Unit # 11 Introduction to Artificial Intelligence Unit # 11 1 Course Outline Overview of Artificial Intelligence State Space Representation Search Techniques Machine Learning Logic Probabilistic Reasoning/Bayesian

More information

Directed and Undirected Graphical Models

Directed and Undirected Graphical Models Directed and Undirected Davide Bacciu Dipartimento di Informatica Università di Pisa bacciu@di.unipi.it Machine Learning: Neural Networks and Advanced Models (AA2) Last Lecture Refresher Lecture Plan Directed

More information

Learning Bayesian Networks (part 1) Goals for the lecture

Learning Bayesian Networks (part 1) Goals for the lecture Learning Bayesian Networks (part 1) Mark Craven and David Page Computer Scices 760 Spring 2018 www.biostat.wisc.edu/~craven/cs760/ Some ohe slides in these lectures have been adapted/borrowed from materials

More information

Y. Xiang, Inference with Uncertain Knowledge 1

Y. Xiang, Inference with Uncertain Knowledge 1 Inference with Uncertain Knowledge Objectives Why must agent use uncertain knowledge? Fundamentals of Bayesian probability Inference with full joint distributions Inference with Bayes rule Bayesian networks

More information

Probabilistic representation and reasoning

Probabilistic representation and reasoning Probabilistic representation and reasoning Applied artificial intelligence (EDAF70) Lecture 04 2019-02-01 Elin A. Topp Material based on course book, chapter 13, 14.1-3 1 Show time! Two boxes of chocolates,

More information

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2016

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2016 Bayesian Networks: Construction, Inference, Learning and Causal Interpretation Volker Tresp Summer 2016 1 Introduction So far we were mostly concerned with supervised learning: we predicted one or several

More information

Causality in Econometrics (3)

Causality in Econometrics (3) Graphical Causal Models References Causality in Econometrics (3) Alessio Moneta Max Planck Institute of Economics Jena moneta@econ.mpg.de 26 April 2011 GSBC Lecture Friedrich-Schiller-Universität Jena

More information

COMP5211 Lecture Note on Reasoning under Uncertainty

COMP5211 Lecture Note on Reasoning under Uncertainty COMP5211 Lecture Note on Reasoning under Uncertainty Fangzhen Lin Department of Computer Science and Engineering Hong Kong University of Science and Technology Fangzhen Lin (HKUST) Uncertainty 1 / 33 Uncertainty

More information

Bayesian Networks and Decision Graphs

Bayesian Networks and Decision Graphs Bayesian Networks and Decision Graphs A 3-week course at Reykjavik University Finn V. Jensen & Uffe Kjærulff ({fvj,uk}@cs.aau.dk) Group of Machine Intelligence Department of Computer Science, Aalborg University

More information

Learning Bayesian Network Parameters under Equivalence Constraints

Learning Bayesian Network Parameters under Equivalence Constraints Learning Bayesian Network Parameters under Equivalence Constraints Tiansheng Yao 1,, Arthur Choi, Adnan Darwiche Computer Science Department University of California, Los Angeles Los Angeles, CA 90095

More information

Recall from last time: Conditional probabilities. Lecture 2: Belief (Bayesian) networks. Bayes ball. Example (continued) Example: Inference problem

Recall from last time: Conditional probabilities. Lecture 2: Belief (Bayesian) networks. Bayes ball. Example (continued) Example: Inference problem Recall from last time: Conditional probabilities Our probabilistic models will compute and manipulate conditional probabilities. Given two random variables X, Y, we denote by Lecture 2: Belief (Bayesian)

More information

Bayesian Network. Outline. Bayesian Network. Syntax Semantics Exact inference by enumeration Exact inference by variable elimination

Bayesian Network. Outline. Bayesian Network. Syntax Semantics Exact inference by enumeration Exact inference by variable elimination Outline Syntax Semantics Exact inference by enumeration Exact inference by variable elimination s A simple, graphical notation for conditional independence assertions and hence for compact specication

More information

STATISTICAL METHODS IN AI/ML Vibhav Gogate The University of Texas at Dallas. Propositional Logic and Probability Theory: Review

STATISTICAL METHODS IN AI/ML Vibhav Gogate The University of Texas at Dallas. Propositional Logic and Probability Theory: Review STATISTICAL METHODS IN AI/ML Vibhav Gogate The University of Texas at Dallas Propositional Logic and Probability Theory: Review Logic Logics are formal languages for representing information such that

More information

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14 CS 70 Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14 Introduction One of the key properties of coin flips is independence: if you flip a fair coin ten times and get ten

More information

CS839: Probabilistic Graphical Models. Lecture 2: Directed Graphical Models. Theo Rekatsinas

CS839: Probabilistic Graphical Models. Lecture 2: Directed Graphical Models. Theo Rekatsinas CS839: Probabilistic Graphical Models Lecture 2: Directed Graphical Models Theo Rekatsinas 1 Questions Questions? Waiting list Questions on other logistics 2 Section 1 1. Intro to Bayes Nets 3 Section

More information

Bayesian Networks (Part II)

Bayesian Networks (Part II) 10-601 Introduction to Machine Learning Machine Learning Department School of Computer Science Carnegie Mellon University Bayesian Networks (Part II) Graphical Model Readings: Murphy 10 10.2.1 Bishop 8.1,

More information

Lecture 8: December 17, 2003

Lecture 8: December 17, 2003 Computational Genomics Fall Semester, 2003 Lecture 8: December 17, 2003 Lecturer: Irit Gat-Viks Scribe: Tal Peled and David Burstein 8.1 Exploiting Independence Property 8.1.1 Introduction Our goal is

More information

Review: Bayesian learning and inference

Review: Bayesian learning and inference Review: Bayesian learning and inference Suppose the agent has to make decisions about the value of an unobserved query variable X based on the values of an observed evidence variable E Inference problem:

More information

Bayesian networks. Soleymani. CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2018

Bayesian networks. Soleymani. CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2018 Bayesian networks CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2018 Soleymani Slides have been adopted from Klein and Abdeel, CS188, UC Berkeley. Outline Probability

More information

Bayesian Networks Introduction to Machine Learning. Matt Gormley Lecture 24 April 9, 2018

Bayesian Networks Introduction to Machine Learning. Matt Gormley Lecture 24 April 9, 2018 10-601 Introduction to Machine Learning Machine Learning Department School of Computer Science Carnegie Mellon University Bayesian Networks Matt Gormley Lecture 24 April 9, 2018 1 Homework 7: HMMs Reminders

More information

Capturing Independence Graphically; Undirected Graphs

Capturing Independence Graphically; Undirected Graphs Capturing Independence Graphically; Undirected Graphs COMPSCI 276, Spring 2017 Set 2: Rina Dechter (Reading: Pearl chapters 3, Darwiche chapter 4) 1 Outline Graphical models: The constraint network, Probabilistic

More information

CS 188: Artificial Intelligence. Bayes Nets

CS 188: Artificial Intelligence. Bayes Nets CS 188: Artificial Intelligence Probabilistic Inference: Enumeration, Variable Elimination, Sampling Pieter Abbeel UC Berkeley Many slides over this course adapted from Dan Klein, Stuart Russell, Andrew

More information

This lecture. Reading. Conditional Independence Bayesian (Belief) Networks: Syntax and semantics. Chapter CS151, Spring 2004

This lecture. Reading. Conditional Independence Bayesian (Belief) Networks: Syntax and semantics. Chapter CS151, Spring 2004 This lecture Conditional Independence Bayesian (Belief) Networks: Syntax and semantics Reading Chapter 14.1-14.2 Propositions and Random Variables Letting A refer to a proposition which may either be true

More information

Bayesian Networks aka belief networks, probabilistic networks. Bayesian Networks aka belief networks, probabilistic networks. An Example Bayes Net

Bayesian Networks aka belief networks, probabilistic networks. Bayesian Networks aka belief networks, probabilistic networks. An Example Bayes Net Bayesian Networks aka belief networks, probabilistic networks A BN over variables {X 1, X 2,, X n } consists of: a DAG whose nodes are the variables a set of PTs (Pr(X i Parents(X i ) ) for each X i P(a)

More information

CSC384: Intro to Artificial Intelligence Reasoning under Uncertainty-II

CSC384: Intro to Artificial Intelligence Reasoning under Uncertainty-II CSC384: Intro to Artificial Intelligence Reasoning under Uncertainty-II 1 Bayes Rule Example Disease {malaria, cold, flu}; Symptom = fever Must compute Pr(D fever) to prescribe treatment Why not assess

More information

CSE 473: Artificial Intelligence Autumn 2011

CSE 473: Artificial Intelligence Autumn 2011 CSE 473: Artificial Intelligence Autumn 2011 Bayesian Networks Luke Zettlemoyer Many slides over the course adapted from either Dan Klein, Stuart Russell or Andrew Moore 1 Outline Probabilistic models

More information

EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS

EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS Lecture 16, 6/1/2005 University of Washington, Department of Electrical Engineering Spring 2005 Instructor: Professor Jeff A. Bilmes Uncertainty & Bayesian Networks

More information

1. When applied to an affected person, the test comes up positive in 90% of cases, and negative in 10% (these are called false negatives ).

1. When applied to an affected person, the test comes up positive in 90% of cases, and negative in 10% (these are called false negatives ). CS 70 Discrete Mathematics for CS Spring 2006 Vazirani Lecture 8 Conditional Probability A pharmaceutical company is marketing a new test for a certain medical condition. According to clinical trials,

More information

CS 484 Data Mining. Classification 7. Some slides are from Professor Padhraic Smyth at UC Irvine

CS 484 Data Mining. Classification 7. Some slides are from Professor Padhraic Smyth at UC Irvine CS 484 Data Mining Classification 7 Some slides are from Professor Padhraic Smyth at UC Irvine Bayesian Belief networks Conditional independence assumption of Naïve Bayes classifier is too strong. Allows

More information

Bayesian Networks Representation

Bayesian Networks Representation Bayesian Networks Representation Machine Learning 10701/15781 Carlos Guestrin Carnegie Mellon University March 19 th, 2007 Handwriting recognition Character recognition, e.g., kernel SVMs a c z rr r r

More information

Bayesian belief networks. Inference.

Bayesian belief networks. Inference. Lecture 13 Bayesian belief networks. Inference. Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square Midterm exam Monday, March 17, 2003 In class Closed book Material covered by Wednesday, March 12 Last

More information

Formalizing Probability. Choosing the Sample Space. Probability Measures

Formalizing Probability. Choosing the Sample Space. Probability Measures Formalizing Probability Choosing the Sample Space What do we assign probability to? Intuitively, we assign them to possible events (things that might happen, outcomes of an experiment) Formally, we take

More information

Probabilistic Graphical Networks: Definitions and Basic Results

Probabilistic Graphical Networks: Definitions and Basic Results This document gives a cursory overview of Probabilistic Graphical Networks. The material has been gleaned from different sources. I make no claim to original authorship of this material. Bayesian Graphical

More information

Uncertainty. 22c:145 Artificial Intelligence. Problem of Logic Agents. Foundations of Probability. Axioms of Probability

Uncertainty. 22c:145 Artificial Intelligence. Problem of Logic Agents. Foundations of Probability. Axioms of Probability Problem of Logic Agents 22c:145 Artificial Intelligence Uncertainty Reading: Ch 13. Russell & Norvig Logic-agents almost never have access to the whole truth about their environments. A rational agent

More information

Informatics 2D Reasoning and Agents Semester 2,

Informatics 2D Reasoning and Agents Semester 2, Informatics 2D Reasoning and Agents Semester 2, 2017 2018 Alex Lascarides alex@inf.ed.ac.uk Lecture 23 Probabilistic Reasoning with Bayesian Networks 15th March 2018 Informatics UoE Informatics 2D 1 Where

More information

Lecture 8: Bayesian Networks

Lecture 8: Bayesian Networks Lecture 8: Bayesian Networks Bayesian Networks Inference in Bayesian Networks COMP-652 and ECSE 608, Lecture 8 - January 31, 2017 1 Bayes nets P(E) E=1 E=0 0.005 0.995 E B P(B) B=1 B=0 0.01 0.99 E=0 E=1

More information

Uncertainty and Belief Networks. Introduction to Artificial Intelligence CS 151 Lecture 1 continued Ok, Lecture 2!

Uncertainty and Belief Networks. Introduction to Artificial Intelligence CS 151 Lecture 1 continued Ok, Lecture 2! Uncertainty and Belief Networks Introduction to Artificial Intelligence CS 151 Lecture 1 continued Ok, Lecture 2! This lecture Conditional Independence Bayesian (Belief) Networks: Syntax and semantics

More information

Introduction to Artificial Intelligence Belief networks

Introduction to Artificial Intelligence Belief networks Introduction to Artificial Intelligence Belief networks Chapter 15.1 2 Dieter Fox Based on AIMA Slides c S. Russell and P. Norvig, 1998 Chapter 15.1 2 0-0 Outline Bayesian networks: syntax and semantics

More information

Prof. Dr. Lars Schmidt-Thieme, L. B. Marinho, K. Buza Information Systems and Machine Learning Lab (ISMLL), University of Hildesheim, Germany, Course

Prof. Dr. Lars Schmidt-Thieme, L. B. Marinho, K. Buza Information Systems and Machine Learning Lab (ISMLL), University of Hildesheim, Germany, Course Course on Bayesian Networks, winter term 2007 0/31 Bayesian Networks Bayesian Networks I. Bayesian Networks / 1. Probabilistic Independence and Separation in Graphs Prof. Dr. Lars Schmidt-Thieme, L. B.

More information

Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10

Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10 EECS 70 Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10 Introduction to Basic Discrete Probability In the last note we considered the probabilistic experiment where we flipped

More information

Markov properties for undirected graphs

Markov properties for undirected graphs Graphical Models, Lecture 2, Michaelmas Term 2011 October 12, 2011 Formal definition Fundamental properties Random variables X and Y are conditionally independent given the random variable Z if L(X Y,

More information

CS 188: Artificial Intelligence Fall 2008

CS 188: Artificial Intelligence Fall 2008 CS 188: Artificial Intelligence Fall 2008 Lecture 14: Bayes Nets 10/14/2008 Dan Klein UC Berkeley 1 1 Announcements Midterm 10/21! One page note sheet Review sessions Friday and Sunday (similar) OHs on

More information

CS Lecture 3. More Bayesian Networks

CS Lecture 3. More Bayesian Networks CS 6347 Lecture 3 More Bayesian Networks Recap Last time: Complexity challenges Representing distributions Computing probabilities/doing inference Introduction to Bayesian networks Today: D-separation,

More information

Announcements. CS 188: Artificial Intelligence Fall Causality? Example: Traffic. Topology Limits Distributions. Example: Reverse Traffic

Announcements. CS 188: Artificial Intelligence Fall Causality? Example: Traffic. Topology Limits Distributions. Example: Reverse Traffic CS 188: Artificial Intelligence Fall 2008 Lecture 16: Bayes Nets III 10/23/2008 Announcements Midterms graded, up on glookup, back Tuesday W4 also graded, back in sections / box Past homeworks in return

More information

Bayesian Approaches Data Mining Selected Technique

Bayesian Approaches Data Mining Selected Technique Bayesian Approaches Data Mining Selected Technique Henry Xiao xiao@cs.queensu.ca School of Computing Queen s University Henry Xiao CISC 873 Data Mining p. 1/17 Probabilistic Bases Review the fundamentals

More information

A Brief Introduction to Graphical Models. Presenter: Yijuan Lu November 12,2004

A Brief Introduction to Graphical Models. Presenter: Yijuan Lu November 12,2004 A Brief Introduction to Graphical Models Presenter: Yijuan Lu November 12,2004 References Introduction to Graphical Models, Kevin Murphy, Technical Report, May 2001 Learning in Graphical Models, Michael

More information

Mathematical Foundations of Computer Science Lecture Outline October 18, 2018

Mathematical Foundations of Computer Science Lecture Outline October 18, 2018 Mathematical Foundations of Computer Science Lecture Outline October 18, 2018 The Total Probability Theorem. Consider events E and F. Consider a sample point ω E. Observe that ω belongs to either F or

More information

Modeling and reasoning with uncertainty

Modeling and reasoning with uncertainty CS 2710 Foundations of AI Lecture 18 Modeling and reasoning with uncertainty Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square KB systems. Medical example. We want to build a KB system for the diagnosis

More information

Lecture 5: Bayesian Network

Lecture 5: Bayesian Network Lecture 5: Bayesian Network Topics of this lecture What is a Bayesian network? A simple example Formal definition of BN A slightly difficult example Learning of BN An example of learning Important topics

More information

Announcements. CS 188: Artificial Intelligence Spring Probability recap. Outline. Bayes Nets: Big Picture. Graphical Model Notation

Announcements. CS 188: Artificial Intelligence Spring Probability recap. Outline. Bayes Nets: Big Picture. Graphical Model Notation CS 188: Artificial Intelligence Spring 2010 Lecture 15: Bayes Nets II Independence 3/9/2010 Pieter Abbeel UC Berkeley Many slides over the course adapted from Dan Klein, Stuart Russell, Andrew Moore Current

More information