Artificial Intelligence: Bayesian Networks
Lecture 11: Bayesian Networks
Stephan Dreiseitl (Hagenberg/SE/IM), FH Hagenberg Software Engineering & Interactive Media

Overview
- Representation of uncertain knowledge
- Constructing Bayesian networks
- Using Bayesian networks for inference
- Algorithmic aspects of inference
A simple Bayesian network example

Rain is the parent of both Worms and Umbrellas:

P(Rain, Worms, Umbrellas) = P(Worms | Rain) P(Umbrellas | Rain) P(Rain)

With conditional independence, we need only the right-hand side to represent the joint distribution.

Intuitively: a graphical representation of influence. Mathematically: a graphical representation of conditional independence assertions.
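As a sanity check, the factorization above can be multiplied out directly. The numeric probabilities below are illustrative assumptions, not values from the lecture:

```python
# Joint distribution of the Rain -> {Worms, Umbrellas} network from its
# three factors. The numeric values are illustrative assumptions only.
P_rain = 0.3
P_worms = {True: 0.8, False: 0.1}       # P(worms | Rain)
P_umbrellas = {True: 0.9, False: 0.05}  # P(umbrellas | Rain)

def bern(p, value):
    """Probability of a Boolean outcome under a Bernoulli parameter p."""
    return p if value else 1 - p

def joint(rain, worms, umbrellas):
    """P(Rain, Worms, Umbrellas) = P(Worms|Rain) P(Umbrellas|Rain) P(Rain)."""
    return (bern(P_worms[rain], worms)
            * bern(P_umbrellas[rain], umbrellas)
            * bern(P_rain, rain))

# Sanity check: the eight joint probabilities sum to 1.
total = sum(joint(r, w, u)
            for r in (True, False) for w in (True, False) for u in (True, False))
print(round(total, 10))  # 1.0
```

Three numbers per node suffice here, instead of the seven needed for an unconstrained joint distribution over three Boolean variables.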
A more complicated example

Burglary and Earthquake are parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls.

Burglary: P(b)        Earthquake: P(e)

Alarm:
  B  E  P(a | B, E)
  T  T  0.95
  T  F  0.94
  F  T  0.24
  F  F

MaryCalls:          JohnCalls:
  A  P(m | A)         A  P(j | A)
  T  0.7              T  0.9
  F  0.01             F  0.05

Definition of Bayesian networks

A Bayesian network is a directed acyclic graph with
- random variables as nodes,
- links that specify "directly influences" relationships,
- probability distributions P(X_i | parents(X_i)) for each node X_i.

The graph structure asserts conditional independencies, e.g.

P(MaryCalls | JohnCalls, Alarm, Earthquake, Burglary) = P(MaryCalls | Alarm)
Bayesian networks as joint probabilities

P(X_1, ..., X_n) = ∏_{i=1}^n P(X_i | X_1, ..., X_{i-1})
                 = ∏_{i=1}^n P(X_i | parents(X_i))

for parents(X_i) ⊆ {X_1, ..., X_{i-1}}.

Burglary example:

P(b, ¬e, a, ¬m, j) = P(b) P(¬e) P(a | b, ¬e) P(¬m | a) P(j | a)

Conditional independencies in networks
- Use the graphical structure to visualize conditional dependencies and independencies.
- Nodes are dependent if there is information flow between them (along at least one path).
- Nodes are independent if information flow is blocked (along all possible paths).
- Distinguish situations with and without evidence (instantiated variables).
Conditional independencies in networks

No evidence: information flow along a path is blocked iff there is a head-to-head node (blocker) on the path.

No blockers between A and B (e.g. a chain A → C → B or a common cause A ← C → B): information flows.
Blocker C between A and B (head-to-head: A → C ← B): information flow is blocked.

Evidence blocks information flow, except at blockers (or their descendants), where it opens information flow.

Information flow between A and B blocked by evidence: instantiating a node on a chain or common-cause path between A and B blocks the flow.
Conditional independencies in networks

Information flow between A and B unblocked by evidence: instantiating a blocker (or one of its descendants) on the path opens the flow.

A node is conditionally independent of its non-descendants, given its parents. For a node X with parents P_1, P_2, children C_1, C_2, the children's other parents A, B, and an unrelated node D:

P(X | P_1, P_2, A, B, D) = P(X | P_1, P_2)
Cond. independencies in networks (cont.)

A node is conditionally independent of all other nodes in the network, given its Markov blanket: its parents, children, and children's parents. With parents P_1, P_2, children C_1, C_2, children's parents A, B, and a further node D:

P(X | P_1, P_2, C_1, C_2, A, B, D) = P(X | P_1, P_2, C_1, C_2, A, B)

Noisy OR

For a Boolean node X with n Boolean parents, the conditional probability table has 2^n entries. The noisy-OR assumption reduces this number to n: assume that each parent may be inhibited independently.

Example: Flu, Malaria, and Cold are parents of Fever.
Noisy OR (cont.)

Need only specify the first three entries of the table; the remaining rows follow by multiplying inhibition probabilities:

  Flu  Malaria  Cold  P(¬fever | ...)
  T    F        F     0.2
  F    T        F     0.1
  F    F        T     0.6
  F    F        F     1.0
  F    T        T     0.1 · 0.6 = 0.06
  T    F        T     0.2 · 0.6 = 0.12
  T    T        F     0.2 · 0.1 = 0.02
  T    T        T     0.2 · 0.1 · 0.6 = 0.012

Building an example network

"When I go home at night, I want to know if my family is home before I try the doors (perhaps the most convenient door to enter is double locked when nobody is home). Now, often when my wife leaves the house she turns on an outdoor light. However, she sometimes turns on this light if she is expecting a guest. Also, we have a dog. When nobody is home, the dog is put in the back yard. The same is true if the dog has bowel trouble. Finally, if the dog is in the back yard, I will probably hear her barking, but sometimes I can be confused by other dogs barking."

F. Jensen, An Introduction to Bayesian Networks, UCL Press
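The derived rows follow mechanically from the three inhibition probabilities; a minimal sketch of the noisy-OR rule, using the values from the table:

```python
# Noisy-OR: with independently inhibited parents, P(¬fever | parents) is the
# product of the inhibition probabilities of the parents that are true.
# Inhibition values taken from the table above.
inhibit = {"flu": 0.2, "malaria": 0.1, "cold": 0.6}

def p_no_fever(flu, malaria, cold):
    """P(¬fever) given the truth values of the three parents."""
    active = {"flu": flu, "malaria": malaria, "cold": cold}
    prob = 1.0
    for parent, is_true in active.items():
        if is_true:
            prob *= inhibit[parent]
    return prob

print(round(p_no_fever(False, True, True), 4))  # 0.06
print(round(p_no_fever(True, False, True), 4))  # 0.12
print(round(p_no_fever(True, True, True), 4))   # 0.012
```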
Building an example network (cont.)

Relevant entities: Boolean random variables FamilyOut, LightsOn, HearDogBark.

Causal structure: FamilyOut has a direct influence on both LightsOn and HearDogBark, so LightsOn and HearDogBark are conditionally independent given FamilyOut.

The numbers in the conditional probability tables are derived from previous experience, or from subjective belief:

P(familyout) = 0.2
P(lightson | familyout) = 0.99
P(lightson | ¬familyout) = 0.1

We run into a problem with P(heardogbark | familyout): the dog may be out because of bowel problems, and the barking may come from other dogs. The network structure needs to be updated to reflect this.
Building an example network (cont.)

Introduce a mediating variable DogOut to model the uncertainty from bowel problems and from hearing other dogs bark. The structure becomes: FamilyOut → LightsOn, FamilyOut → DogOut, BowelProblems → DogOut, DogOut → HearDogBark.

Need: P(DogOut | FamilyOut, BowelProblems) and P(HearDogBark | DogOut).

Obtain the following additional probability tables:

FamilyOut: P(f) = 0.2
BowelProblems: P(b) = 0.05

LightsOn:           DogOut:                HearDogBark:
  F  P(l | F)         F  B  P(d | F, B)      D  P(h | D)
  T  0.99             T  T  0.99             T  0.6
  F  0.1              T  F  0.88             F  0.25
                      F  T  0.96
                      F  F  0.2
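The finished network is small enough to encode directly; a sketch of its CPTs and the resulting joint distribution as plain Python dictionaries (values taken from the tables above):

```python
# The complete FamilyOut network (CPT values from the tables above).
# Variables: f=FamilyOut, b=BowelProblems, l=LightsOn, d=DogOut, h=HearDogBark.
P_f, P_b = 0.2, 0.05
P_l = {True: 0.99, False: 0.1}                    # P(l | F)
P_d = {(True, True): 0.99, (True, False): 0.88,
       (False, True): 0.96, (False, False): 0.2}  # P(d | F, B)
P_h = {True: 0.6, False: 0.25}                    # P(h | D)

def bern(p, value):
    return p if value else 1 - p

def joint(f, b, l, d, h):
    """P(f,b,l,d,h) = P(f) P(b) P(l|f) P(d|f,b) P(h|d)."""
    return (bern(P_f, f) * bern(P_b, b) * bern(P_l[f], l)
            * bern(P_d[(f, b)], d) * bern(P_h[d], h))

# Family at home, no bowel problems, dog in, lights off, barking heard:
print(round(joint(False, False, False, False, True), 4))  # 0.1368
```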
Inference in Bayesian networks

Given events (instantiated variables) e and no information on the hidden variables H, calculate the distribution of the query variable Q. Algorithmically, calculate P(Q | e) by marginalizing over H:

P(Q | e) = α P(Q, e) = α Σ_h P(Q, e, h)

with h ranging over all possible value combinations of H. Distinguish between causal, diagnostic, and intercausal reasoning.

Types of inference

Causal reasoning: the query variable is downstream of the events, e.g.

P(heardogbark | familyout) = 0.56

Diagnostic reasoning: the query variable is upstream of the events, e.g.

P(familyout | heardogbark) ≈ 0.30

Explaining away (intercausal reasoning): knowing the effect and one possible cause reduces the probability of the other possible causes:

P(familyout | heardogbark) ≈ 0.30, but P(familyout | bowelproblems, heardogbark) ≈ 0.20
P(bowelproblems | heardogbark) ≈ 0.08, but P(bowelproblems | familyout, heardogbark) ≈ 0.05
Algorithmic aspects of inference

Calculating the full joint distribution is computationally expensive. Several alternatives for inference in Bayesian networks:

Exact inference
- by enumeration
- by variable elimination

Stochastic inference (Monte Carlo methods)
- by sampling from the joint distribution
- by rejection sampling
- by likelihood weighting
- by Markov chain Monte Carlo methods

Inference by enumeration

FamilyOut example (d′ ∈ {d, ¬d}, b′ ∈ {b, ¬b}):

P(F | l, h) = α P(F, l, h) = α Σ_{d′} Σ_{b′} P(F, l, h, d′, b′)

P(f | l, h) = α Σ_{d′} Σ_{b′} P(f) P(b′) P(l | f) P(d′ | f, b′) P(h | d′)
            = α P(f) P(l | f) Σ_{d′} P(h | d′) Σ_{b′} P(b′) P(d′ | f, b′)
            = α · 0.111
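The enumeration above can be carried out in a few lines; a sketch using the FamilyOut CPTs from the earlier slides:

```python
from itertools import product

# Inference by enumeration for P(F | l, h) in the FamilyOut network:
# sum the joint over the hidden variables B and D, then normalize.
# CPT values from the slides.
P_f, P_b = 0.2, 0.05
P_l = {True: 0.99, False: 0.1}
P_d = {(True, True): 0.99, (True, False): 0.88,
       (False, True): 0.96, (False, False): 0.2}
P_h = {True: 0.6, False: 0.25}

def bern(p, value):
    return p if value else 1 - p

def joint(f, b, l, d, h):
    return (bern(P_f, f) * bern(P_b, b) * bern(P_l[f], l)
            * bern(P_d[(f, b)], d) * bern(P_h[d], h))

# Unnormalized P(F, l, h), marginalizing over the hidden variables:
unnorm = {f: sum(joint(f, b, True, d, True)
                 for b, d in product((True, False), repeat=2))
          for f in (True, False)}
alpha = 1 / sum(unnorm.values())
print(round(alpha * unnorm[True], 3), round(alpha * unnorm[False], 3))  # 0.806 0.194
```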
Inference by enumeration (cont.)

Similarly, P(¬f | l, h) = α · 0.0267. From P(f | l, h) + P(¬f | l, h) = 1, normalization yields

P(F | l, h) = α (0.111, 0.0267) = (0.806, 0.194)

Burglary example:

P(B | j, m) = α P(B) Σ_{e′} P(e′) Σ_{a′} P(a′ | B, e′) P(j | a′) P(m | a′)

The last two factors P(j | a′) P(m | a′) do not depend on e′, but have to be evaluated twice (for e and ¬e).

Variable elimination

Eliminate repetitive calculations by summing from the inside out and storing intermediate results (cf. dynamic programming).

Burglary example, different query:

P(J | b) = α P(b) Σ_{e′} P(e′) Σ_{a′} P(a′ | b, e′) P(J | a′) Σ_{m′} P(m′ | a′)

where the last sum Σ_{m′} P(m′ | a′) = 1.

Fact: any variable that is not an ancestor of the query or evidence variables is irrelevant and can be dropped.
Sampling from the joint distribution

Straightforward if there is no evidence in the network:
- Sample each variable in topological order.
- For nodes without parents, sample from their prior distribution; for nodes with parents, sample from the conditional distribution given the already-sampled parent values.

With N_S(x_1, ..., x_n) the number of times the specific realization (x_1, ..., x_n) is generated in N sampling experiments, we obtain

lim_{N→∞} N_S(x_1, ..., x_n) / N = P(x_1, ..., x_n)

Example: joint distribution sampling

(CPTs of the FamilyOut network as given above.)

What is the probability that the family is at home, the dog has no bowel problems and isn't out, the light is off, and a dog's barking can be heard?
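The topological sampling procedure can be sketched as follows (CPT values from the FamilyOut tables; the sample count is an arbitrary choice):

```python
import random

# Prior (ancestral) sampling: sample each node in topological order,
# conditioning on the already-sampled parent values. CPTs from the slides.
P_f, P_b = 0.2, 0.05
P_l = {True: 0.99, False: 0.1}
P_d = {(True, True): 0.99, (True, False): 0.88,
       (False, True): 0.96, (False, False): 0.2}
P_h = {True: 0.6, False: 0.25}

def sample_once(rng):
    f = rng.random() < P_f
    b = rng.random() < P_b
    l = rng.random() < P_l[f]
    d = rng.random() < P_d[(f, b)]
    h = rng.random() < P_h[d]
    return (f, b, l, d, h)

rng = random.Random(0)
N = 100_000
hits = sum(sample_once(rng) == (False, False, False, False, True) for _ in range(N))
print(hits / N)  # converges to P(¬f, ¬b, ¬l, ¬d, h) = 0.1368
```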
Example: joint distribution sampling

FamilyOut example: generate samples from the network by first sampling the FamilyOut and BowelProblems variables, then sampling all other variables in turn, given the sampled parent values.

Compare the relative frequency N_S(¬f, ¬b, ¬l, ¬d, h) / N with

P(¬f, ¬b, ¬l, ¬d, h) = 0.8 · 0.95 · 0.9 · 0.8 · 0.25 = 0.1368

Advantage of sampling: it is easy to generate estimates for other probabilities. The standard error of the estimates drops as 1/√N. For example,

N_S(¬d) / N estimates P(¬d) = 0.6325
N_S(f, h) / N_S(h) estimates P(f | h) ≈ 0.30

The last example is a form of rejection sampling.
Rejection sampling in Bayesian networks

Method to approximate conditional probabilities P(X | e) of variables X, given evidence e:

P(X | e) ≈ N_S(X, e) / N_S(e)

Rejection sampling: take only those samples into account that are consistent with the evidence.

Problem with rejection sampling: the number of samples consistent with the evidence drops exponentially with the number of evidence variables; the method is therefore unusable for real-life networks.

Likelihood weighting

Fix the evidence and sample all other variables. This overcomes the rejection-sampling shortcoming by generating only samples consistent with the evidence.

Problem: consider a situation with P(X = x) = 0.9 but a very small P(E = e | X = x). Then 90% of the samples will have X = x (and fixed E = e), although this combination is very unlikely.

Solution: weight each sample by the product of the conditional probabilities of the evidence variables, given their parents.
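The rejection-sampling scheme described above can be sketched for the FamilyOut network; this estimates the diagnostic query P(F | h) by discarding samples that disagree with the evidence (CPT values from the slides; the sample size is an arbitrary choice):

```python
import random

# Rejection sampling for P(F | h): draw prior samples, discard those that
# are inconsistent with the evidence h, and count F among the rest.
P_f, P_b = 0.2, 0.05
P_d = {(True, True): 0.99, (True, False): 0.88,
       (False, True): 0.96, (False, False): 0.2}
P_h = {True: 0.6, False: 0.25}

def sample_once(rng):
    # LightsOn is neither query nor evidence here, so it is not sampled.
    f = rng.random() < P_f
    b = rng.random() < P_b
    d = rng.random() < P_d[(f, b)]
    h = rng.random() < P_h[d]
    return f, h

rng = random.Random(1)
kept = agree = 0
for _ in range(200_000):
    f, h = sample_once(rng)
    if h:              # keep only samples consistent with the evidence
        kept += 1
        agree += f
print(agree / kept)  # ≈ 0.30: only about 38% of the samples are kept
```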
Example: likelihood weighting

FamilyOut example: calculate P(F | l, d). Iterate the following:
- Sample all non-evidence variables, given the evidence variables, obtaining, e.g., (¬f, ¬b, h).
- Calculate the weighting factor, e.g. P(l | ¬f) · P(d | ¬f, ¬b) = 0.1 · 0.2 = 0.02.

Finally, sum and normalize the weighting factors for the samples containing (f, l, d) and (¬f, l, d):

P(f | l, d) ≈ Σ w(f, l, d) / (Σ w(f, l, d) + Σ w(¬f, l, d))

Example: likelihood weighting (cont.)

Running this procedure yields counts N_S(f, l, d) and N_S(¬f, l, d) with summed weights Σ w(f, l, d) and Σ w(¬f, l, d); normalizing gives an estimate of P(f | l, d). Correct value: P(f | l, d) ≈ 0.902.

Disadvantage of likelihood weighting: with many evidence variables, most samples will have very small weights, and the few samples with larger weights dominate the estimate.
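A sketch of likelihood weighting for the query P(F | l, d) above (CPT values from the slides; the sample size is an arbitrary choice):

```python
import random

# Likelihood weighting for P(F | l, d): fix the evidence LightsOn = true and
# DogOut = true, sample the remaining variables, and weight each sample by
# the likelihood of the evidence given its sampled parents.
P_f, P_b = 0.2, 0.05
P_l = {True: 0.99, False: 0.1}
P_d = {(True, True): 0.99, (True, False): 0.88,
       (False, True): 0.96, (False, False): 0.2}

rng = random.Random(2)
w_true = w_false = 0.0
for _ in range(200_000):
    f = rng.random() < P_f
    b = rng.random() < P_b
    # Weight = P(l | f) * P(d | f, b). HearDogBark is downstream of the
    # evidence and irrelevant to the query, so sampling it is skipped here.
    w = P_l[f] * P_d[(f, b)]
    if f:
        w_true += w
    else:
        w_false += w
print(w_true / (w_true + w_false))  # ≈ 0.90 (exact value 0.902)
```

Note that every iteration contributes: no sample is rejected, but samples with implausible evidence likelihoods simply carry small weights.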
Markov chains

A sequence of discrete random variables X_0, X_1, ... is called a Markov chain with state space S iff

P(X_n = x_n | X_0 = x_0, ..., X_{n-1} = x_{n-1}) = P(X_n = x_n | X_{n-1} = x_{n-1})

for all x_0, ..., x_n ∈ S. Thus, X_n is conditionally independent of all variables before it, given X_{n-1}. Specify a state transition matrix P with

P_ij = P(X_n = x_j | X_{n-1} = x_i)

Markov chain Monte Carlo methods

We want to obtain samples from a given distribution P_d(X) that is hard to sample from with other methods.

Idea: construct a Markov chain that, for an arbitrary initial state x_0, converges towards the stationary (equilibrium) distribution P_d(X). Then successive realizations x_n, x_{n+1}, ... are sampled according to P_d (but are not independent!).

It is often not clear when convergence of the chain has taken place; therefore, discard the initial portion of the chain (the burn-in phase).
Markov chain example

Let S = {1, 2} with a 2 × 2 state transition matrix P. Simulate the chain for 1000 steps and plot N_S(1)/N and N_S(2)/N for N = 1, ..., 1000, with starting state 1 (left plot) and 2 (right plot); both runs converge to the same stationary distribution.

MCMC for Bayesian networks

Given evidence e and non-evidence variables X, use Markov chains to sample from the distribution P(X | e). Obtain a sequence of states x_0, x_1, ..., and discard the initial portion. After convergence, the samples x_k, x_{k+1}, ... have the desired distribution P(X | e).

There are many variants of Markov chain Monte Carlo algorithms; we consider only Gibbs sampling.
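A sketch of such a simulation; the transition probabilities below are illustrative assumptions (not the lecture's matrix entries), chosen so that the stationary distribution is (2/3, 1/3):

```python
import random

# Simulating a two-state Markov chain. The transition probabilities are
# assumed values; for them, detailed balance gives pi = (2/3, 1/3).
P = {1: {1: 0.9, 2: 0.1},   # P[i][j] = P(X_n = j | X_{n-1} = i)
     2: {1: 0.2, 2: 0.8}}

def visit_frequency(start, steps, rng):
    """Fraction of steps spent in state 1 after `steps` transitions."""
    x, visits = start, 0
    for _ in range(steps):
        x = 1 if rng.random() < P[x][1] else 2
        visits += (x == 1)
    return visits / steps

rng = random.Random(3)
for start in (1, 2):
    print(start, visit_frequency(start, 100_000, rng))
# both runs approach the stationary probability of state 1, i.e. 2/3,
# regardless of the starting state
```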
Gibbs sampling

Fix the evidence variables to e, and assign arbitrary values to the non-evidence variables X. Recall: the Markov blanket of a variable consists of its parents, children, and children's parents.

Iterate the following:
- Pick an arbitrary variable X_i from X.
- Sample from P(X_i | MarkovBlanket(X_i)).
- The new state is the old state, with the new value of X_i.

Gibbs sampling (cont.)

Calculating P(X_i | MarkovBlanket(X_i)):

P(x_i | MarkovBlanket(X_i)) = α P(x_i | parents(X_i)) ∏_{Y_j ∈ children(X_i)} P(y_j | parents(Y_j))

With this, calculate P(x_i | MarkovBlanket(X_i)) and P(¬x_i | MarkovBlanket(X_i)), and normalize to obtain P(X_i | MarkovBlanket(X_i)). Sample from this distribution for the next value of X_i, and thus the next state of the Markov chain.
Bayesian network MCMC example

FamilyOut example: calculate P(F | l, d).
- Start with arbitrary non-evidence settings (f, b, h).
- Pick F, sample from P(F | l, d, b), obtain a new value of F.
- Pick B, sample from P(B | f, d), obtain a new value of B.
- Pick H, sample from P(H | d), obtain a new value of H.

Iterate the last three steps many times, discard the burn-in portion, and count the fraction of remaining states with F = true. This yields an estimate of P(f | l, d) (correct value ≈ 0.902).

Comparison of inference algorithms
- Inference by enumeration is computationally prohibitive.
- Variable elimination removes all irrelevant variables.
- Direct sampling from the joint distribution is easy when no evidence is present.
- Use rejection sampling and likelihood weighting for more efficient calculations.
- Markov chain Monte Carlo methods are the most efficient for large networks, calculating new states based on old states, but lose the independence of samples.
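The three resampling steps can be written out directly; a minimal Gibbs sampler for P(F | l, d) using the network's CPTs (iteration and burn-in counts are arbitrary choices):

```python
import random

# Gibbs sampling for P(F | l, d): the evidence is fixed, and each
# non-evidence variable is resampled from its conditional given its
# Markov blanket. CPT values from the slides.
P_f, P_b = 0.2, 0.05
P_l = {True: 0.99, False: 0.1}
P_d = {(True, True): 0.99, (True, False): 0.88,
       (False, True): 0.96, (False, False): 0.2}
P_h = {True: 0.6, False: 0.25}

def gibbs(steps, burn_in, rng):
    f = b = h = True          # arbitrary initial non-evidence settings
    hits = kept = 0
    for step in range(steps):
        # P(f | mb) ∝ P(f) P(l=T | f) P(d=T | f, b), evidence l = d = true
        pf = P_f * P_l[True] * P_d[(True, b)]
        pnf = (1 - P_f) * P_l[False] * P_d[(False, b)]
        f = rng.random() < pf / (pf + pnf)
        # P(b | mb) ∝ P(b) P(d=T | f, b)
        pb = P_b * P_d[(f, True)]
        pnb = (1 - P_b) * P_d[(f, False)]
        b = rng.random() < pb / (pb + pnb)
        # P(h | mb) = P(h | d=T): H depends only on its parent D
        h = rng.random() < P_h[True]
        if step >= burn_in:   # discard the burn-in portion
            hits += f
            kept += 1
    return hits / kept

print(gibbs(120_000, 20_000, random.Random(4)))  # ≈ 0.90 (exact value 0.902)
```

Successive states differ in at most one variable, so consecutive samples are correlated; the estimate still converges to the correct posterior.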
Summary
- Bayesian networks are graphical representations of causal influence among random variables.
- The network structure graphically specifies conditional independence assumptions.
- We need the conditional distributions of the nodes, given their parents.
- Use noisy OR to reduce the number of parameters in the tables.
- Reasoning types in Bayesian networks: causal, diagnostic, and explaining away.
- There are exact and approximate inference algorithms.
More informationIntelligent Systems: Reasoning and Recognition. Reasoning with Bayesian Networks
Intelligent Systems: Reasoning and Recognition James L. Crowley ENSIMAG 2 / MoSIG M1 Second Semester 2016/2017 Lesson 13 24 march 2017 Reasoning with Bayesian Networks Naïve Bayesian Systems...2 Example
More informationUncertainty. Logic and Uncertainty. Russell & Norvig. Readings: Chapter 13. One problem with logical-agent approaches: C:145 Artificial
C:145 Artificial Intelligence@ Uncertainty Readings: Chapter 13 Russell & Norvig. Artificial Intelligence p.1/43 Logic and Uncertainty One problem with logical-agent approaches: Agents almost never have
More informationCS 188: Artificial Intelligence Spring Announcements
CS 188: Artificial Intelligence Spring 2011 Lecture 14: Bayes Nets II Independence 3/9/2011 Pieter Abbeel UC Berkeley Many slides over the course adapted from Dan Klein, Stuart Russell, Andrew Moore Announcements
More informationUncertainty. Introduction to Artificial Intelligence CS 151 Lecture 2 April 1, CS151, Spring 2004
Uncertainty Introduction to Artificial Intelligence CS 151 Lecture 2 April 1, 2004 Administration PA 1 will be handed out today. There will be a MATLAB tutorial tomorrow, Friday, April 2 in AP&M 4882 at
More informationArtificial Intelligence Bayes Nets: Independence
Artificial Intelligence Bayes Nets: Independence Instructors: David Suter and Qince Li Course Delivered @ Harbin Institute of Technology [Many slides adapted from those created by Dan Klein and Pieter
More informationInformatics 2D Reasoning and Agents Semester 2,
Informatics 2D Reasoning and Agents Semester 2, 2018 2019 Alex Lascarides alex@inf.ed.ac.uk Lecture 25 Approximate Inference in Bayesian Networks 19th March 2019 Informatics UoE Informatics 2D 1 Where
More informationCS Belief networks. Chapter
CS 580 1 Belief networks Chapter 15.1 2 CS 580 2 Outline Conditional independence Bayesian networks: syntax and semantics Exact inference Approximate inference CS 580 3 Independence Two random variables
More informationProbabilistic representation and reasoning
Probabilistic representation and reasoning Applied artificial intelligence (EDAF70) Lecture 04 2019-02-01 Elin A. Topp Material based on course book, chapter 13, 14.1-3 1 Show time! Two boxes of chocolates,
More informationCSEP 573: Artificial Intelligence
CSEP 573: Artificial Intelligence Bayesian Networks: Inference Ali Farhadi Many slides over the course adapted from either Luke Zettlemoyer, Pieter Abbeel, Dan Klein, Stuart Russell or Andrew Moore 1 Outline
More informationIntroduction to Artificial Intelligence (AI)
Introduction to Artificial Intelligence (AI) Computer Science cpsc502, Lecture 9 Oct, 11, 2011 Slide credit Approx. Inference : S. Thrun, P, Norvig, D. Klein CPSC 502, Lecture 9 Slide 1 Today Oct 11 Bayesian
More informationY. Xiang, Inference with Uncertain Knowledge 1
Inference with Uncertain Knowledge Objectives Why must agent use uncertain knowledge? Fundamentals of Bayesian probability Inference with full joint distributions Inference with Bayes rule Bayesian networks
More informationCS 188: Artificial Intelligence Fall 2009
CS 188: Artificial Intelligence Fall 2009 Lecture 14: Bayes Nets 10/13/2009 Dan Klein UC Berkeley Announcements Assignments P3 due yesterday W2 due Thursday W1 returned in front (after lecture) Midterm
More informationUncertainty and Belief Networks. Introduction to Artificial Intelligence CS 151 Lecture 1 continued Ok, Lecture 2!
Uncertainty and Belief Networks Introduction to Artificial Intelligence CS 151 Lecture 1 continued Ok, Lecture 2! This lecture Conditional Independence Bayesian (Belief) Networks: Syntax and semantics
More informationCOMP9414: Artificial Intelligence Reasoning Under Uncertainty
COMP9414, Monday 16 April, 2012 Reasoning Under Uncertainty 2 COMP9414: Artificial Intelligence Reasoning Under Uncertainty Overview Problems with Logical Approach What Do the Numbers Mean? Wayne Wobcke
More informationqfundamental Assumption qbeyond the Bayes net Chain rule conditional independence assumptions,
Bayes Nets: Conditional Independence qfundamental Assumption qbeyond the Bayes net Chain rule conditional independence assumptions, oadditional conditional independencies, that can be read off the graph.
More informationBayesian networks. Chapter Chapter Outline. Syntax Semantics Parameterized distributions. Chapter
Bayesian networks Chapter 14.1 3 Chapter 14.1 3 1 Outline Syntax Semantics Parameterized distributions Chapter 14.1 3 2 Bayesian networks A simple, graphical notation for conditional independence assertions
More informationBayesian belief networks
CS 2001 Lecture 1 Bayesian belief networks Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square 4-8845 Milos research interests Artificial Intelligence Planning, reasoning and optimization in the presence
More informationDefining Things in Terms of Joint Probability Distribution. Today s Lecture. Lecture 17: Uncertainty 2. Victor R. Lesser
Lecture 17: Uncertainty 2 Victor R. Lesser CMPSCI 683 Fall 2010 Today s Lecture How belief networks can be a Knowledge Base for probabilistic knowledge. How to construct a belief network. How to answer
More informationRecall from last time. Lecture 3: Conditional independence and graph structure. Example: A Bayesian (belief) network.
ecall from last time Lecture 3: onditional independence and graph structure onditional independencies implied by a belief network Independence maps (I-maps) Factorization theorem The Bayes ball algorithm
More informationAnother look at Bayesian. inference
Another look at Bayesian A general scenario: - Query variables: X inference - Evidence (observed) variables and their values: E = e - Unobserved variables: Y Inference problem: answer questions about the
More informationIntro to AI: Lecture 8. Volker Sorge. Introduction. A Bayesian Network. Inference in. Bayesian Networks. Bayesian Networks.
Specifying Probability Distributions Specifying a probability for every atomic event is impractical We have already seen it can be easier to specify probability distributions by using (conditional) independence
More informationUncertainty. 22c:145 Artificial Intelligence. Problem of Logic Agents. Foundations of Probability. Axioms of Probability
Problem of Logic Agents 22c:145 Artificial Intelligence Uncertainty Reading: Ch 13. Russell & Norvig Logic-agents almost never have access to the whole truth about their environments. A rational agent
More informationCS 188: Artificial Intelligence Spring Announcements
CS 188: Artificial Intelligence Spring 2011 Lecture 16: Bayes Nets IV Inference 3/28/2011 Pieter Abbeel UC Berkeley Many slides over this course adapted from Dan Klein, Stuart Russell, Andrew Moore Announcements
More informationAn Introduction to Bayesian Networks: Representation and Approximate Inference
An Introduction to Bayesian Networks: Representation and Approximate Inference Marek Grześ Department of Computer Science University of York Graphical Models Reading Group May 7, 2009 Data and Probabilities
More informationBayesian networks. Chapter AIMA2e Slides, Stuart Russell and Peter Norvig, Completed by Kazim Fouladi, Fall 2008 Chapter 14.
Bayesian networks Chapter 14.1 3 AIMA2e Slides, Stuart Russell and Peter Norvig, Completed by Kazim Fouladi, Fall 2008 Chapter 14.1 3 1 Outline Syntax Semantics Parameterized distributions AIMA2e Slides,
More informationIntroduction to Bayesian Learning
Course Information Introduction Introduction to Bayesian Learning Davide Bacciu Dipartimento di Informatica Università di Pisa bacciu@di.unipi.it Apprendimento Automatico: Fondamenti - A.A. 2016/2017 Outline
More informationProbabilistic representation and reasoning
Probabilistic representation and reasoning Applied artificial intelligence (EDA132) Lecture 09 2017-02-15 Elin A. Topp Material based on course book, chapter 13, 14.1-3 1 Show time! Two boxes of chocolates,
More informationThis lecture. Reading. Conditional Independence Bayesian (Belief) Networks: Syntax and semantics. Chapter CS151, Spring 2004
This lecture Conditional Independence Bayesian (Belief) Networks: Syntax and semantics Reading Chapter 14.1-14.2 Propositions and Random Variables Letting A refer to a proposition which may either be true
More informationCS 5522: Artificial Intelligence II
CS 5522: Artificial Intelligence II Bayes Nets: Independence Instructor: Alan Ritter Ohio State University [These slides were adapted from CS188 Intro to AI at UC Berkeley. All materials available at http://ai.berkeley.edu.]
More informationGraphical Models - Part II
Graphical Models - Part II Bishop PRML Ch. 8 Alireza Ghane Outline Probabilistic Models Bayesian Networks Markov Random Fields Inference Graphical Models Alireza Ghane / Greg Mori 1 Outline Probabilistic
More informationArtificial Intelligence Methods. Inference in Bayesian networks
Artificial Intelligence Methods Inference in Bayesian networks In which we explain how to build network models to reason under uncertainty according to the laws of probability theory. Dr. Igor rajkovski
More informationBayes Nets: Independence
Bayes Nets: Independence [These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.] Bayes Nets A Bayes
More informationIntroduction to Bayesian Networks
Introduction to Bayesian Networks Ying Wu Electrical Engineering and Computer Science Northwestern University Evanston, IL 60208 http://www.eecs.northwestern.edu/~yingwu 1/23 Outline Basic Concepts Bayesian
More informationLecture 8. Probabilistic Reasoning CS 486/686 May 25, 2006
Lecture 8 Probabilistic Reasoning CS 486/686 May 25, 2006 Outline Review probabilistic inference, independence and conditional independence Bayesian networks What are they What do they mean How do we create
More informationProbabilistic Models
Bayes Nets 1 Probabilistic Models Models describe how (a portion of) the world works Models are always simplifications May not account for every variable May not account for all interactions between variables
More informationBayesian Networks Introduction to Machine Learning. Matt Gormley Lecture 24 April 9, 2018
10-601 Introduction to Machine Learning Machine Learning Department School of Computer Science Carnegie Mellon University Bayesian Networks Matt Gormley Lecture 24 April 9, 2018 1 Homework 7: HMMs Reminders
More informationReasoning Under Uncertainty
Reasoning Under Uncertainty Introduction Representing uncertain knowledge: logic and probability (a reminder!) Probabilistic inference using the joint probability distribution Bayesian networks The Importance
More informationAnnouncements. CS 188: Artificial Intelligence Spring Probability recap. Outline. Bayes Nets: Big Picture. Graphical Model Notation
CS 188: Artificial Intelligence Spring 2010 Lecture 15: Bayes Nets II Independence 3/9/2010 Pieter Abbeel UC Berkeley Many slides over the course adapted from Dan Klein, Stuart Russell, Andrew Moore Current
More informationProbabilistic Graphical Networks: Definitions and Basic Results
This document gives a cursory overview of Probabilistic Graphical Networks. The material has been gleaned from different sources. I make no claim to original authorship of this material. Bayesian Graphical
More informationBayes Nets: Sampling
Bayes Nets: Sampling [These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.] Approximate Inference:
More informationCS 5522: Artificial Intelligence II
CS 5522: Artificial Intelligence II Bayes Nets Instructor: Alan Ritter Ohio State University [These slides were adapted from CS188 Intro to AI at UC Berkeley. All materials available at http://ai.berkeley.edu.]
More informationBayes Net Representation. CS 188: Artificial Intelligence. Approximate Inference: Sampling. Variable Elimination. Sampling.
188: Artificial Intelligence Bayes Nets: ampling Bayes Net epresentation A directed, acyclic graph, one node per random variable A conditional probability table (PT) for each node A collection of distributions
More information