Directed Graphical Models or Bayesian Networks
1 Directed Graphical Models or Bayesian Networks. Le Song. Machine Learning II: Advanced Topics, CSE 8803ML, Spring 2012
2 Bayesian Networks
- One of the most exciting recent advancements in statistical AI
- Compact representation for exponentially-large probability distributions
- Fast marginalization algorithms that exploit conditional independencies
- Difference from undirected graphical models!
[Figure: example network with nodes Flu, Allergy, Sinus, Nose, Headache]
3 Handwriting recognition
4 Handwriting recognition [Figure: chain model over character variables $X_1 - X_2 - X_3 - X_4 - X_5$]
5 Summary: basic concepts for R.V.
- Outcome: assign values $x_1, \dots, x_n$ to $X_1, \dots, X_n$
- Conditional probability: $P(X, Y) = P(X)\,P(Y \mid X)$
- Bayes rule: $P(Y \mid X) = \frac{P(X \mid Y)\,P(Y)}{P(X)}$
- Chain rule: $P(X_1, \dots, X_n) = P(X_1)\,P(X_2 \mid X_1) \cdots P(X_n \mid X_1, \dots, X_{n-1})$
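To make these identities concrete, here is a minimal Python sketch (the joint table is hypothetical, not from the lecture) that checks the product rule and Bayes rule numerically:

```python
# Toy joint distribution P(X, Y) over binary X, Y (hypothetical numbers).
P = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def p_x(x):             # marginal P(X = x)
    return sum(p for (xi, _), p in P.items() if xi == x)

def p_y(y):             # marginal P(Y = y)
    return sum(p for (_, yi), p in P.items() if yi == y)

def p_y_given_x(y, x):  # conditional P(Y = y | X = x)
    return P[(x, y)] / p_x(x)

def p_x_given_y(x, y):  # conditional P(X = x | Y = y)
    return P[(x, y)] / p_y(y)

# Product rule: P(X, Y) = P(X) P(Y | X)
assert abs(P[(1, 1)] - p_x(1) * p_y_given_x(1, 1)) < 1e-12

# Bayes rule: P(Y | X) = P(X | Y) P(Y) / P(X)
assert abs(p_y_given_x(1, 0) - p_x_given_y(0, 1) * p_y(1) / p_x(0)) < 1e-12
```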
6 Summary: conditional independence
- X is independent of Y given Z if $P(X = x \mid Y = y, Z = z) = P(X = x \mid Z = z)$ for all $x \in Val(X)$, $y \in Val(Y)$, $z \in Val(Z)$
- Shorthand: $(X \perp Y \mid Z)$; for $(X \perp Y \mid \emptyset)$, write $(X \perp Y)$
- Proposition: $(X \perp Y \mid Z)$ if and only if $P(X, Y \mid Z) = P(X \mid Z)\,P(Y \mid Z)$
7 Representation of Bayesian Networks
- Consider $P(X_i)$: assign a probability to each $x_i \in Val(X_i)$. Assuming $|Val(X_i)| = k$, how many independent parameters? $k - 1$
- Consider $P(X_1, \dots, X_n)$: how many independent parameters if $|Val(X_i)| = k$? $k^n - 1$
- Bayesian networks can represent the same joint probability with fewer parameters
8 Simple Bayesian Networks
9 What if variables are independent?
- Is it enough to have $(X_i \perp X_j)$ for all $i, j$? Not enough!!!
- Must assume that $(X \perp Y)$ for all subsets $X, Y$ of $\{X_1, \dots, X_n\}$, e.g. $(\{X_1, X_3\} \perp \{X_2, X_4\})$, $(X_1 \perp \{X_7, X_8, X_9\})$, ...
- Can then write $P(X_1, \dots, X_n) = \prod_{i=1}^n P(X_i)$: a Bayesian network with no edges ($X_1\ X_2\ X_3 \dots X_n$)
- How many independent parameters now? $n(k - 1)$ (see the sketch below)
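A quick sketch of the parameter-count comparison just made (full joint table vs. the edgeless network), for a few illustrative values of n and k:

```python
def full_joint_params(n, k):
    """Independent parameters of an explicit joint table over n k-ary variables."""
    return k ** n - 1

def edgeless_bn_params(n, k):
    """Independent parameters of the edgeless BN: (k - 1) per variable."""
    return n * (k - 1)

for n, k in [(5, 2), (10, 2), (20, 3)]:
    print(n, k, full_joint_params(n, k), edgeless_bn_params(n, k))
# (20, 3): 3486784400 parameters for the full table vs. 40 for the edgeless BN
```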
10 Conditional parameterization, two nodes
- Grade (G) is determined by Intelligence (I); network: I → G
- Parameters: a table $P(I)$ over $Val(I) = \{VH, H\}$ and a conditional table $P(G \mid I)$ over $Val(G) = \{A, B\}$, one row per value of I
- Joint by the product rule: $P(I = VH, G = B) = P(I = VH)\,P(G = B \mid I = VH)$
11 Conditional parameterization, three nodes
- Grade (G) and SAT score (S) are both determined by Intelligence (I); network: G ← I → S, with $(G \perp S \mid I)$, so $P(S \mid G, I) = P(S \mid I)$
- Parameters: $P(I)$, $P(G \mid I)$, $P(S \mid I)$
- $P(I, G, S) = P(I)\,P(G \mid I)\,P(S \mid I)$. Why? The chain rule gives $P(I, G, S) = P(I)\,P(G \mid I)\,P(S \mid G, I)$; using the conditional independence, we get $P(I)\,P(G \mid I)\,P(S \mid I)$
12 The naïve Bayes model
- Class variable: C; evidence variables: $X_1, \dots, X_n$
- Assume that $(X \perp Y \mid C)$ for all subsets $X, Y$ of $\{X_1, \dots, X_n\}$
- $P(C, X_1, \dots, X_n) = P(C) \prod_i P(X_i \mid C)$. Why? The chain rule gives $P(C, X_1, \dots, X_n) = P(C)\,P(X_1 \mid C)\,P(X_2 \mid C, X_1) \cdots P(X_n \mid C, X_1, \dots, X_{n-1})$
- $P(X_2 \mid C, X_1) = P(X_2 \mid C)$ since $(X_2 \perp X_1 \mid C)$; ...; $P(X_n \mid C, X_1, \dots, X_{n-1}) = P(X_n \mid C)$ since $(X_n \perp \{X_1, \dots, X_{n-1}\} \mid C)$
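The factorization above is easy to compute with. A minimal sketch with hypothetical CPT numbers (binary class C and three binary evidence variables), which also checks that the factored joint sums to 1:

```python
import itertools

P_C = {0: 0.6, 1: 0.4}                 # P(C), hypothetical
P_Xi_given_C = [{0: 0.2, 1: 0.7},      # P(X_i = 1 | C) for i = 1, 2, 3
                {0: 0.5, 1: 0.9},
                {0: 0.1, 1: 0.3}]

def joint(c, xs):
    """P(C, X_1, ..., X_n) = P(C) * prod_i P(X_i | C)."""
    p = P_C[c]
    for i, x in enumerate(xs):
        p1 = P_Xi_given_C[i][c]
        p *= p1 if x == 1 else 1.0 - p1
    return p

# Sanity check: the factored joint is a proper distribution.
total = sum(joint(c, xs) for c in (0, 1)
            for xs in itertools.product((0, 1), repeat=3))
assert abs(total - 1.0) < 1e-12
```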
13 More Complicated Bayesian Networks and an Example
14 Causal Structure
- We will learn the semantics of Bayesian networks (BNs) and relate them to the independence assumptions encoded by the graph
- Suppose we know the following: the flu (F) causes sinus inflammation (S); allergies (A) cause sinus inflammation; sinus inflammation causes a runny nose (N); sinus inflammation causes headaches (H)
- How are these variables connected? [Figure: F → S ← A, with S → N and S → H]
15 Possible queries
- Probabilistic inference, e.g. $P(A = t \mid H = t, N = f)$
- Most probable explanation: $\max_{F,A,S} P(F, A, S \mid H = t, N = t)$
- Active data collection: what is the next best variable to observe?
16 Car starts BN
- 18 binary variables; joint $P(A, F, L, \dots, Spark)$ [Figure: car-starting network; node labels include Alt (alternator), FanBelt, BatteryAge, Starts]
- Inference: $P(BatteryAge \mid Starts = f)$ requires summing $P(A, F, L, \dots)$ over $2^{16}$ terms. Why is the BN so fast? It uses the sparse graph structure
- For the HailFinder BN in JavaBayes: more than $3^{54} \approx 5.8 \times 10^{25}$ terms
17 Factored joint distribution (preview)
[Figure: the flu network with CPTs attached to the nodes: $P(F)$, $P(A)$, $P(S \mid F, A)$, $P(N \mid S)$, $P(H \mid S)$]
$P(F, A, S, H, N) = P(F)\,P(A)\,P(S \mid F, A)\,P(H \mid S)\,P(N \mid S)$
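As a sketch, this factored joint can be coded directly from the five CPTs; the numeric entries below are hypothetical (the slide gives the factorization, not the numbers):

```python
import itertools

P_F = 0.1                                          # P(F = t), hypothetical
P_A = 0.3                                          # P(A = t), hypothetical
P_S = {(True, True): 0.9, (True, False): 0.7,      # P(S = t | F, A)
       (False, True): 0.6, (False, False): 0.05}
P_N = {True: 0.8, False: 0.1}                      # P(N = t | S)
P_H = {True: 0.7, False: 0.2}                      # P(H = t | S)

def bern(p_true, value):
    """Probability of a binary value given P(value = t)."""
    return p_true if value else 1.0 - p_true

def joint(f, a, s, h, n):
    """P(F, A, S, H, N) = P(F) P(A) P(S | F, A) P(H | S) P(N | S)."""
    return (bern(P_F, f) * bern(P_A, a) * bern(P_S[(f, a)], s)
            * bern(P_H[s], h) * bern(P_N[s], n))

# Sanity check: the factored joint sums to 1 over all 2^5 assignments.
total = sum(joint(*v) for v in itertools.product((True, False), repeat=5))
assert abs(total - 1.0) < 1e-12
```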
18 Number of parameters
- $P(F)$ and $P(A)$: 1 parameter each; $P(S \mid F, A)$: 4 (one per parent configuration tt, tf, ft, ff); $P(N \mid S)$ and $P(H \mid S)$: 2 each
- The Bayesian network has $1 + 1 + 4 + 2 + 2 = 10$ parameters
- The full probability table $P(F, A, S, H, N)$ explicitly has $2^5 - 1 = 31$ parameters
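The per-node counting generalizes: a k-ary node with m k-ary parents needs $(k-1) \cdot k^m$ independent parameters. A small sketch:

```python
def bn_param_count(parents, k=2):
    """Independent parameters of a BN over k-ary variables:
    each node contributes (k - 1) parameters per parent configuration."""
    return sum((k - 1) * k ** len(pa) for pa in parents.values())

# Flu network: F and A are roots, S has parents {F, A}, N and H have {S}.
flu_net = {"F": [], "A": [], "S": ["F", "A"], "N": ["S"], "H": ["S"]}
print(bn_param_count(flu_net))   # 1 + 1 + 4 + 2 + 2 = 10
print(2 ** 5 - 1)                # full joint table: 31
```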
19 Key: independence assumptions
- $(H \not\perp N)$, but $(H \perp N \mid S)$; $(A \not\perp N)$, but $(A \perp N \mid S)$
- Knowing Sinus separates the symptom variables from each other
20 Marginal independence
- Flu and Allergy are (marginally) independent: $(F \perp A)$, i.e. $P(F, A) = P(F)\,P(A)$
- More generally, $(X \perp Y)$ for all subsets $X, Y$ of $\{X_1, \dots, X_n\}$, so $P(X_1, \dots, X_n) = \prod_i P(X_i)$
- Example tables: $P(Flu = t) = 0.1$, $P(Flu = f) = 0.9$; $P(Al = t) = 0.3$, $P(Al = f) = 0.7$; each joint entry is the product of the marginals, e.g. $P(Flu = t, Al = t) = 0.1 \times 0.3 = 0.03$
21 Conditional independence
- Flu and Headache are not (marginally) independent: $(F \not\perp H)$, i.e. $P(F \mid H) \neq P(F)$ and $P(F, H) \neq P(F)\,P(H)$
- Flu and Headache are independent given Sinus infection: $(F \perp H \mid S)$, i.e. $P(F \mid H, S) = P(F \mid S)$ and $P(F, H \mid S) = P(F \mid S)\,P(H \mid S)$
- More generally, if $X_1, X_2, \dots, X_n$ are independent of each other given C: $P(X_1, X_2, \dots, X_n \mid C) = P(X_1 \mid C)\,P(X_2, \dots, X_n \mid C) = \prod_i P(X_i \mid C)$
22 The conditional independence assumption
- Local Markov assumption: a variable X is independent of its non-descendants given its parents, and only its parents: $(X \perp NonDescendants_X \mid Pa_X)$
- Flu: $Pa_{Flu} = \emptyset$, $NonDescendants_{Flu} = \{A\}$, giving $(F \perp A)$
- Nose: $Pa_{Nose} = \{S\}$, $NonDescendants_{Nose} = \{F, A, H\}$, giving $(N \perp \{F, A, H\} \mid S)$
- Sinus: $Pa_{Sinus} = \{F, A\}$, $NonDescendants_{Sinus} = \emptyset$. No assumption about S??? The local Markov assumption makes no non-trivial assertion here
23 Explaining away
- Local Markov assumption: $(X \perp NonDescendants_X \mid Pa_X)$. Note that it does not imply $(F \perp A \mid S)$
- XOR example: $P(A = t) = 0.5$, $P(F = t) = 0.5$, $S = F\ \mathrm{XOR}\ A$. Then $S = t, A = t \Rightarrow F = f$, i.e. $P(F = t \mid S = t, A = t) = 0$
- Another example: $P(F = t) = 0.2$, $P(F = t \mid S = t) = 0.5$, $P(F = t \mid S = t, A = t) = 0.3$, so $P(F = t) \neq P(F = t \mid S = t, A = t) \neq P(F = t \mid S = t)$
- Knowing $A = t$ lowers the probability of $F = t$ ($A = t$ explains away $F = t$). The direction depends on $P(S \mid F, A)$: it could instead be that $P(F = t \mid S = t, A = t) \geq P(F = t \mid S = t)$
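Explaining away is easy to verify numerically by brute-force enumeration. A sketch with hypothetical CPTs for the F → S ← A fragment (the numbers are chosen for illustration, not taken from the slide):

```python
import itertools

P_F, P_A = 0.2, 0.3                               # priors P(F = t), P(A = t)
P_S = {(True, True): 0.95, (True, False): 0.8,    # P(S = t | F, A)
       (False, True): 0.6, (False, False): 0.05}

def joint(f, a, s):
    pf = P_F if f else 1 - P_F
    pa = P_A if a else 1 - P_A
    ps = P_S[(f, a)] if s else 1 - P_S[(f, a)]
    return pf * pa * ps

def prob_f_true(evidence):
    """P(F = t | evidence) by enumeration over (F, A, S)."""
    num = den = 0.0
    for f, a, s in itertools.product((True, False), repeat=3):
        world = {"F": f, "A": a, "S": s}
        if all(world[k] == v for k, v in evidence.items()):
            p = joint(f, a, s)
            den += p
            if f:
                num += p
    return num / den

p1 = prob_f_true({"S": True})               # ~0.496
p2 = prob_f_true({"S": True, "A": True})    # ~0.284
assert p2 < p1  # observing A = t lowers P(F = t | S = t): A explains away
```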
24 Why can we decompose the joint distribution?
- Chain rule and local Markov assumption: pick a topological ordering of the nodes, interpret the BN using the chain rule in that order, then use the conditional independence assumptions
- Chain rule: $P(N, H, S, A, F) = P(F)\,P(A \mid F)\,P(S \mid F, A)\,P(H \mid S, F, A)\,P(N \mid S, F, A, H)$
- Local Markov: $(A \perp F)$, $(H \perp \{F, A\} \mid S)$, $(N \perp \{F, A, H\} \mid S)$
- Hence $P(N, H, S, A, F) = P(F)\,P(A)\,P(S \mid F, A)\,P(H \mid S)\,P(N \mid S)$
25 General Bayesian Networks
26 A general Bayes net
- Set of random variables $X_1, \dots, X_n$
- Directed acyclic graph (DAG): undirected loops are OK, but no directed cycles
- Local Markov assumptions: a variable X is independent of its non-descendants given its parents, and only its parents: $(X \perp NonDescendants_X \mid Pa_X)$
- Conditional probability tables (CPTs), $P(X_i \mid Pa_{X_i})$, for each $X_i$
- Joint distribution: $P(X_1, \dots, X_n) = \prod_i P(X_i \mid Pa_{X_i})$
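Putting the definition together, here is a minimal generic sketch: a BN over binary variables represented as parent lists plus CPTs, with the joint computed as the product of local conditionals (the network and numbers are the hypothetical flu example from earlier):

```python
import itertools

parents = {"F": (), "A": (), "S": ("F", "A"), "N": ("S",), "H": ("S",)}
cpt = {  # P(X = t | parent assignment), hypothetical entries
    "F": {(): 0.1},
    "A": {(): 0.3},
    "S": {(True, True): 0.9, (True, False): 0.7,
          (False, True): 0.6, (False, False): 0.05},
    "N": {(True,): 0.8, (False,): 0.1},
    "H": {(True,): 0.7, (False,): 0.2},
}

def joint(assignment):
    """P(X_1, ..., X_n) = prod_i P(X_i | Pa_{X_i})."""
    p = 1.0
    for var, pa in parents.items():
        p_true = cpt[var][tuple(assignment[u] for u in pa)]
        p *= p_true if assignment[var] else 1.0 - p_true
    return p

# Sanity check: the factored joint sums to 1.
names = list(parents)
total = sum(joint(dict(zip(names, vals)))
            for vals in itertools.product((True, False), repeat=len(names)))
assert abs(total - 1.0) < 1e-12
```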
27 Question?
- What distributions can be represented by a Bayesian network?
- What Bayesian networks can represent a given distribution?
- What are the independence assumptions encoded by a BN, in addition to the local Markov assumption?
- Example (chain A → B → C → D): local Markov gives $(A \perp C \mid B)$ and $(D \perp \{A, B\} \mid C)$; a derived independence is $(A \perp D \mid B)$
28 Conditional independence in the problem
- World, data, reality: the true distribution P contains conditional independence assertions $I(P)$
- Bayesian networks: the graph G encodes local independence assumptions $I_\ell(G)$
- Key representational assumption: $I_\ell(G) \subseteq I(P)$
29 The representation theorem: true conditional independence => BN factorization
- The BN encodes local conditional independence assumptions $I_\ell(G)$
- If the local conditional independencies in the BN are a subset of the conditional independencies in P, $I_\ell(G) \subseteq I(P)$,
- then the joint probability P can be written as $P(X_1, \dots, X_n) = \prod_i P(X_i \mid Pa_{X_i})$
30 The representation theorem: BN factorization => true conditional independence
- The BN encodes local conditional independence assumptions $I_\ell(G)$
- If the joint probability P can be written as $P(X_1, \dots, X_n) = \prod_i P(X_i \mid Pa_{X_i})$,
- then the local conditional independencies in the BN are a subset of the conditional independencies in P: $I_\ell(G) \subseteq I(P)$
31 Example: naïve Bayes, true conditional independence => BN factorization
- Independence assumptions: the $X_i$ are independent of each other given C, i.e. $(X \perp Y \mid C)$ for all subsets $X, Y$ of $\{X_1, \dots, X_n\}$, e.g. $(\{X_1, X_3\} \perp \{X_2, X_4\} \mid C)$. This is the same as the local conditional independence
- To prove: $P(C, X_1, \dots, X_n) = P(C) \prod_i P(X_i \mid C)$
- Use the chain rule and the local Markov property: $P(C, X_1, \dots, X_n) = P(C)\,P(X_1 \mid C)\,P(X_2 \mid C, X_1) \cdots P(X_n \mid C, X_1, \dots, X_{n-1})$, where $P(X_2 \mid C, X_1) = P(X_2 \mid C)$ since $(X_2 \perp X_1 \mid C)$, ..., and $P(X_n \mid C, X_1, \dots, X_{n-1}) = P(X_n \mid C)$ since $(X_n \perp \{X_1, \dots, X_{n-1}\} \mid C)$
32 Example: naïve Bayes, BN factorization => true conditional independence
- Assume $P(C, X_1, \dots, X_n) = P(C) \prod_i P(X_i \mid C)$
- Prove the independence assumptions: the $X_i$ are independent of each other given C, i.e. $(X \perp Y \mid C)$ for all subsets $X, Y$ of $\{X_1, \dots, X_n\}$
- E.g. for n = 4: $P(X_1, X_2 \mid C) = \frac{P(X_1, X_2, C)}{P(C)} = \frac{\sum_{x_3, x_4} P(X_1, X_2, x_3, x_4, C)}{P(C)} = \frac{P(C)\,P(X_1 \mid C)\,P(X_2 \mid C) \sum_{x_3} P(x_3 \mid C) \sum_{x_4} P(x_4 \mid C)}{P(C)} = P(X_1 \mid C)\,P(X_2 \mid C)$
33 How about the general case?
- Every P has at least one BN structure G; such a BN is an I-map of P
- If the local conditional independencies in the BN are a subset of those in P, $I_\ell(G) \subseteq I(P)$, then the joint probability can be written as $P(X_1, \dots, X_n) = \prod_i P(X_i \mid Pa_{X_i})$: P factorizes according to the BN
- Conversely, if the joint probability can be written as $P(X_1, \dots, X_n) = \prod_i P(X_i \mid Pa_{X_i})$, then $I_\ell(G) \subseteq I(P)$: we can read independencies of P from the BN structure G
34 Proof from I-map to factorization
- Topological ordering of $X_1, X_2, \dots, X_n$: number the variables such that every parent has a lower number than its children ($X_i \to X_j \Rightarrow i < j$) and every variable has a lower number than all of its descendants
- Use the chain rule: $P(X_1, \dots, X_n) = P(X_1)\,P(X_2 \mid X_1) \cdots P(X_n \mid X_1, \dots, X_{n-1})$
- For each factor $P(X_i \mid X_1, \dots, X_{i-1})$: $Pa_{X_i} \subseteq \{X_1, \dots, X_{i-1}\}$, and there is no descendant of $X_i$ in $\{X_1, \dots, X_{i-1}\}$, so by the local Markov assumption $(X \perp NonDescendants_X \mid Pa_X)$ we get $P(X_i \mid X_1, \dots, X_{i-1}) = P(X_i \mid Pa_{X_i})$
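The topological ordering the proof relies on can be computed with Kahn's algorithm; a sketch:

```python
from collections import deque

def topological_order(parents):
    """Order the nodes of a DAG (given as parent lists) so that every
    parent appears before all of its children."""
    children = {v: [] for v in parents}
    indegree = {v: len(pa) for v, pa in parents.items()}
    for v, pa in parents.items():
        for u in pa:
            children[u].append(v)
    queue = deque(v for v, d in indegree.items() if d == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in children[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                queue.append(v)
    if len(order) != len(parents):
        raise ValueError("graph has a directed cycle")
    return order

print(topological_order({"F": [], "A": [], "S": ["F", "A"],
                         "N": ["S"], "H": ["S"]}))
# e.g. ['F', 'A', 'S', 'N', 'H']
```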
35 Adding edges doesn't hurt
- Let G be an I-map for P; any DAG G' that includes the edges of G (and possibly more) is also an I-map of P. G' is strictly more expressive than G
- If G is an I-map for P, then adding edges still results in an I-map
- Example: add the edge A → N and parameterize $P(N \mid S, A)$ with $P(N \mid S, A = t) = P(N \mid S, A = f) = P(N \mid S)$, so that $P(N \mid S, A) = P(N \mid S)$
36 Minimal I-maps
- G is a minimal I-map for P if deleting any edge from G makes it no longer an I-map
- E.g. for P with $(A \perp B)$ but not $(A \perp B \mid C)$, the network A → C ← B is a minimal I-map: if we remove an edge, it is no longer an I-map
- Obtaining a minimal I-map given a set of variables and conditional independence assertions (see the sketch after this list):
- Choose an ordering on the variables $X_1, \dots, X_n$. For i = 1 to n: add $X_i$ to the network; define the parents of $X_i$, $Pa_{X_i}$, as the minimal subset of $\{X_1, \dots, X_{i-1}\}$ such that the local Markov assumption holds; define/learn the CPT $P(X_i \mid Pa_{X_i})$
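A direct sketch of this procedure, assuming access to a hypothetical independence oracle is_independent(X, Y, Z) that answers whether $(X \perp Y \mid Z)$ holds in P (in practice this would come from domain knowledge or statistical tests). The brute-force search over parent subsets is exponential and only meant to mirror the slide's description:

```python
from itertools import chain, combinations

def minimal_imap(variables, is_independent):
    """For each X_i (in the given ordering), pick a smallest subset Pa of its
    predecessors with X_i independent of the remaining predecessors given Pa."""
    parents = {}
    for i, x in enumerate(variables):
        preds = variables[:i]
        candidates = chain.from_iterable(
            combinations(preds, r) for r in range(len(preds) + 1))
        for pa in candidates:  # increasing size: a minimal set is found first
            rest = [v for v in preds if v not in pa]
            if not rest or is_independent([x], rest, list(pa)):
                parents[x] = list(pa)
                break
    return parents
```

Note that the result depends on the chosen variable ordering: different orderings can yield different minimal I-maps for the same P.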
37 Read conditional independence from BN structure G
38 Conditional independence encoded in a BN
- Local Markov assumption: $(X \perp NonDescendants_X \mid Pa_X)$
- There are other, derived conditional independencies. E.g. explaining away: in A → S ← F, local Markov gives $(A \perp F)$ but not $(A \perp F \mid S)$
- Chain A → B → C → D: local Markov gives $(A \perp C \mid B)$ and $(D \perp \{A, B\} \mid C)$; a derived independence is $(A \perp D \mid B)$
- What other derived conditional independencies are there?
39 3-node cases
- Causal effect (Allergy → Sinus → Headache): $(A \not\perp H)$, but $(A \perp H \mid S)$
- Common cause (Nose ← Sinus → Headache): $(N \not\perp H)$, but $(N \perp H \mid S)$
- Common effect, v-structure (Allergy → Sinus ← Flu): $(A \perp F)$, but $(A \not\perp F \mid S)$
40 How about other derived relations?
[Figure: an example DAG over variables A through K, with a list of queried independencies and their answers, including $(F \perp \{B, E, G, J\})$?, $(\{A, C, F, I\} \perp \{B, E, G, J\})$?, $(E \perp F \mid K)$?, $(E \perp F \mid \{K, I\})$?, $(F \perp G \mid D)$?, $(F \perp G \mid H)$?, $(F \perp G \mid \{H, K\})$?, $(F \perp G \mid \{H, A\})$?]
41 Active trails
A trail $X_1 - X_2 - \cdots - X_k$ is an active trail when variables $O \subseteq \{X_1, \dots, X_n\}$ are observed if for each consecutive triplet in the trail:
- $X_{i-1} \to X_i \to X_{i+1}$ and $X_i$ is not observed ($X_i \notin O$)
- $X_{i-1} \leftarrow X_i \leftarrow X_{i+1}$ and $X_i$ is not observed ($X_i \notin O$)
- $X_{i-1} \leftarrow X_i \to X_{i+1}$ and $X_i$ is not observed ($X_i \notin O$)
- $X_{i-1} \to X_i \leftarrow X_{i+1}$ and $X_i$, or one of its descendants, is observed (v-structure)
42 Active trails and conditional independence
- Variables $X_i$ and $X_j$ are independent given $Z \subseteq \{X_1, \dots, X_n\}$ if there is no active trail between $X_i$ and $X_j$ when the variables in Z are observed: $(X_i \perp X_j \mid Z)$
- We say that $X_i$ and $X_j$ are d-separated given Z (dependency separation)
- [Figure: an example DAG over variables A through K; e.g. is $(F \perp G \mid K)$?]
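The triplet rules from the previous slide translate directly into a reachability procedure; this sketch follows the standard reachable-nodes ("Bayes ball" style) traversal described by Koller & Friedman, with the graph given as parent lists:

```python
def reachable(parents, x, Z):
    """Nodes reachable from x along some active trail given observed set Z."""
    children = {v: [] for v in parents}
    for v, pa in parents.items():
        for u in pa:
            children[u].append(v)
    Z = set(Z)
    anc_Z = set()                     # Z and its ancestors (v-structure rule)
    stack = list(Z)
    while stack:
        v = stack.pop()
        if v not in anc_Z:
            anc_Z.add(v)
            stack.extend(parents[v])
    visited, result = set(), set()
    stack = [(x, "up")]               # "up": arrived from a child (or start)
    while stack:
        v, d = stack.pop()
        if (v, d) in visited:
            continue
        visited.add((v, d))
        if v not in Z:
            result.add(v)
        if d == "up" and v not in Z:  # continue to parents and children
            stack += [(u, "up") for u in parents[v]]
            stack += [(u, "down") for u in children[v]]
        elif d == "down":
            if v not in Z:            # causal chain continues downward
                stack += [(u, "down") for u in children[v]]
            if v in anc_Z:            # v-structure activated by observation
                stack += [(u, "up") for u in parents[v]]
    return result

def d_separated(parents, x, y, Z):
    return y not in reachable(parents, x, Z)

flu = {"F": [], "A": [], "S": ["F", "A"], "N": ["S"], "H": ["S"]}
print(d_separated(flu, "N", "H", ["S"]))   # True: common cause blocked
print(d_separated(flu, "F", "A", ["S"]))   # False: v-structure activated
```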
43 Soundness of d-separation
- Given a BN with structure G, the set of conditional independence assertions obtained by d-separation is $I(G) = \{(X \perp Y \mid Z) : \text{d-sep}_G(X; Y \mid Z)\}$
- Soundness of d-separation: if P factorizes over G, then $I(G) \subseteq I(P)$ (not only $I_\ell(G) \subseteq I(P)$)
- Interpretation: d-separation only captures true conditional independencies
- For most P's that factorize over G, $I(G) = I(P)$ (G is a P-map)
44 Bayesian Networks are not enough
45 Inexistence of P-maps, example 1
- XOR: $A = B\ \mathrm{XOR}\ C$. Then $(A \perp B)$ but not $(A \perp B \mid C)$; $(B \perp C)$ but not $(B \perp C \mid A)$; $(C \perp A)$ but not $(C \perp A \mid B)$
- A minimal I-map: A → C ← B; removing an edge makes it no longer an I-map
- It is not a P-map: we cannot read $(B \perp C)$ or $(A \perp C)$ from the structure
46 Inexistence of P-maps, example 2
- Swinging couples of variables $(X_1, Y_1)$ and $(X_2, Y_2)$: $(X_1 \perp X_2 \mid \{Y_1, Y_2\})$ and $(Y_1 \perp Y_2 \mid \{X_1, X_2\})$
- No Bayesian network is a P-map for this distribution [Figure: four-node loop over $X_1, Y_1, X_2, Y_2$]
- Need undirected graphical models!