Intelligent Systems: Probabilistic Inference in Undirected and Directed Graphical Models

1 Intelligent Systems: Probabilistic Inference in Undirected and Directed Graphical Models. Carsten Rother, 04/02/2014.

2 Reminder: Random Forests. 1) Get a d-dimensional feature, depending on some parameters θ. 2-dimensional example: we want to classify the white pixel. Feature: the color (e.g. red channel) of the green pixel and the color (e.g. red channel) of the red pixel. Parameters: 4D (2 offset vectors). We will visualize it in this way:

3 Reminder: Random Forests. (Figure: input depth image → body part labelling of each pixel → clustering → body joint hypotheses, shown in front, side and top views.)

4 Reminder: Decision Tree - Train Time. Input: all training points (input data in feature space; each point has a class label). Split the training set at each node: the set of all labelled (training data) points, here 35 red and 23 blue. (Remember, the feature space is also optimized via θ.) Measure p(c) at each leaf; it could be 3 red and 1 blue, i.e. p(red) = 0.75, p(blue) = 0.25.

5 Random Forests - Training of features (illustration). Image labeling (2 classes, red and blue). What does it mean to optimize over θ? pos + (θ_1, θ_2), pos + (θ_3, θ_4). For each pixel the same feature test (at one split node) is done; one has to define what happens with feature tests that reach outside the image. Goal during training: separate red pixels (class 1) from blue pixels (class 2). Feature: value x_1: the value of the green color channel (could also be red or blue) if you look θ_1 pixels right and θ_2 pixels up; value x_2: the value of the green color channel (could also be red or blue) if you look θ_3 pixels right and θ_4 pixels down. Goal: find a θ that best separates the data. (Figure: one choice of θ, another choice of θ.)

6 Roadmap for this lecture. Directed Graphical Models: definition and example; probabilistic inference. Undirected Graphical Models: probabilistic inference: message passing; probabilistic inference: sampling. Probabilistic programming languages. Wrap-Up.

7 Slide credits: J. Fürnkranz, TU Darmstadt; Dimitri Schlesinger.

8 Reminder: Graphical models to capture structured problems. Write a probability distribution as a graphical model: directed graphical model (also called Bayesian network), undirected graphical model (also called Markov random field), factor graphs (which we will use predominantly). Basic idea: a visualization to represent a family of distributions. The key concept is conditional independence. You can also convert between the representations. References: Pattern Recognition and Machine Learning [Bishop 08, chapter 8]; several lectures at the Machine Learning Summer School 2009 (see video lectures).

9 When to use what model? Undirected graphical model (also called Markov random field): the individual unknown variables all have the same meaning. Factor graph (which we will use predominantly): same as the undirected graphical model, but represents the distribution in more detail. Directed graphical model (also called Bayesian network): the unknown variables have different meanings. (Figure: example graphs with variables d_i; x_1, x_2; z; "label only left image".)

10 Reminder: What to infer? MAP inference (maximum a posteriori state): x* = argmax_x P(x) = argmin_x E(x). Probabilistic inference, the so-called marginals: P(x_i = k) = Σ_{x: x_i = k} P(x_1, …, x_i = k, …, x_n). This can be used to make a maximum marginal decision: x_i* = argmax_{x_i} P(x_i).

11 Reminder: MAP versus Marginal - visually. (Figure: input image; ground truth labeling; MAP solution x*, where each pixel has a 0,1 label; marginals P(x_i), where each pixel has a probability between 0 and 1.)

12 Reminder: MAP versus Marginal - Making Decisions. (Figure: P(x|z) over the space of all solutions x, sorted by pixel difference, for an input image z.) Which solution x would you choose?

13 Reminder: How to make a decision. Assume the model P(x|z) is known. Question: which solution x* should we give out? Answer: choose the x* which minimizes the Bayesian risk: x* = argmin_x Σ_{x'} P(x'|z) C(x, x'). C(x_1, x_2) is called the loss function (or cost function) for comparing two results x_1, x_2.

14 Reminder: MAP versus Marginals - Making Decisions. (Figure: guessed max-marginal and MAP solutions on the curve P(x|z) over the space of all solutions x, sorted by pixel difference.) MAP: global 0-1 loss, C(x, x') = 0 if x = x', otherwise 1. Max-marginal: pixel-wise 0-1 loss, C(x, x') = Σ_i [x_i ≠ x'_i] (Hamming loss).
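
To make the two decision rules concrete, here is a minimal Python sketch (not from the slides; the posterior values are made up) that brute-forces the Bayes-optimal decision for a toy posterior over 3 binary pixels, once under the global 0-1 loss (giving the MAP solution) and once under the pixel-wise Hamming loss (giving the max-marginal solution):

```python
import itertools

# Toy posterior P(x|z) over 3 binary pixels (made-up numbers summing to 1).
posterior = {
    (0, 0, 0): 0.30, (1, 1, 1): 0.25,
    (1, 1, 0): 0.15, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (0, 0, 1): 0.05,
    (0, 1, 0): 0.03, (1, 0, 1): 0.02,
}

def bayes_decision(loss):
    # x* = argmin_x sum_{x'} P(x'|z) * C(x, x')
    candidates = list(itertools.product([0, 1], repeat=3))
    return min(candidates,
               key=lambda x: sum(p * loss(x, xp) for xp, p in posterior.items()))

global_01 = lambda x, xp: 0 if x == xp else 1               # -> MAP decision
hamming = lambda x, xp: sum(a != b for a, b in zip(x, xp))  # -> max-marginal decision

print("MAP decision:         ", bayes_decision(global_01))  # (0, 0, 0)
print("Max-marginal decision:", bayes_decision(hamming))    # (0, 1, 0)
```

The two decisions differ here: (0,0,0) is the single most probable labeling, while the per-pixel marginals favor (0,1,0). This is exactly the gap the slide illustrates.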

15 Reminder: What to infer? MAP inference (maximum a posteriori state): x* = argmax_x P(x) = argmin_x E(x). Probabilistic inference, the so-called marginals: P(x_i = k) = Σ_{x: x_i = k} P(x_1, …, x_i = k, …, x_n). This can be used to make a maximum marginal decision: x_i* = argmax_{x_i} P(x_i). So far we did only MAP estimation in factor graphs / undirected graphical models. Today we do probabilistic inference in directed graphical models and in factor graphs / undirected graphical models. Comment: MAP inference in directed graphical models can be done, but it is rarely done.

16 Directed Graphical Model: Intro. All variables that are linked by an arrow have a causal, directed relationship. Since the variables have a meaning, it makes sense to look at conditional independence (in factor graphs / undirected models this does not give any insights). Example nodes: eat sweets, hole in tooth, car is broken, toothache. Comment: the nodes do not have to be discrete variables; they can also be continuous, as long as the distribution is well defined.

17 Directed Graphical Models: Intro. Rewrite a distribution using the product rule: p(x_1, x_2, x_3) = p(x_1, x_2 | x_3) p(x_3) = p(x_1 | x_2, x_3) p(x_2 | x_3) p(x_3). Any joint distribution can be written as a product of conditionals: p(x_1, …, x_n) = ∏_{i=1}^n p(x_i | x_{i+1}, …, x_n). The conditional p(x_i | x_{i+1}, …, x_n) is visualized with arrows from x_{i+1}, …, x_n into x_i.

18 Directed Graphical Models: Examples. p(x_1, x_2, x_3) = p(x_1 | x_2, x_3) p(x_2 | x_3) p(x_3); p(x_1, x_2, x_3) = p(x_1 | x_2) p(x_2 | x_3) p(x_3); p(x_1, x_2, x_3) = p(x_1 | x_2, x_3) p(x_2) p(x_3). As with undirected graphical models, the absence of links is the interesting aspect!

19 Definition: Directed Graphical Model. Given a directed graph G = (V, E), where V is the set of nodes and E the set of directed edges, a directed graphical model defines the family of distributions: p(x_1, …, x_n) = ∏_{i=1}^n p(x_i | parents(x_i)). The set parents(x_i) is a subset of the variables {x_{i+1}, …, x_n} and is defined via the graph; p(x_i | x_{i+1}, …, x_n) is visualized as before. Comment: compared to factor graphs / undirected graphical models there is no partition function f: P(x) = (1/f) ∏_{F∈𝓕} ψ_F(x_{N(F)}) where f = Σ_x ∏_{F∈𝓕} ψ_F(x_{N(F)}).

20 Running Example. Situation: I am at work. John calls to say that the alarm in my house went off. Mary, who is my neighbor, did not call me. The alarm is usually set off by burglars, but sometimes also by a minor earthquake. Can we construct the directed graphical model? Step 1: identify the variables that can have different states: JohnCalls(J), MaryCalls(M), AlarmOn(A), BurglarInHouse(B), EarthquakeHappening(E).

21 Step 2: Method to construct the Directed Graphical Model. 1) Choose an ordering of the variables: x_n, …, x_1. 2) For i = n down to 1: A. add node x_i to the network; B. select its parents such that p(x_i | x_{i+1}, …, x_n) = p(x_i | parents(x_i)).

22 Example. Let us select the ordering: MaryCalls(M), JohnCalls(J), AlarmOn(A), BurglarInHouse(B), EarthquakeHappening(E). Build the model step by step: add p(M), node MaryCalls(M). Joint (so far): P(M, J, A, B, E) = p(M).

23 Example. Ordering as above. Is p(J | M) = p(J)? No. If Mary calls, then John is also likely to call. Nodes: MaryCalls(M), JohnCalls(J). Joint (so far): P(M, J, A, B, E) = p(M).

24 Example. Ordering as above. Is p(J | M) = p(J)? No. If Mary calls, then John is also likely to call. Factors: p(M), p(J | M). Joint (so far): P(M, J, A, B, E) = p(J | M) p(M).

25 Example. Ordering as above. Is p(A | J, M) = p(A)? Is p(A | J, M) = p(A | J)? Is p(A | J, M) = p(A | M)? No. If Mary calls or John calls, the probability that the alarm is on is higher. Nodes: MaryCalls(M), JohnCalls(J), AlarmOn(A). Joint (so far): P(M, J, A, B, E) = p(J | M) p(M).

26 Example. Ordering as above. Same questions; the answer remains No, so A keeps both parents. Factors: p(M), p(J | M), p(A | J, M). Joint (so far): P(M, J, A, B, E) = p(A | J, M) p(J | M) p(M).

27 Example. Ordering as above. Is p(B | A, J, M) = p(B)? No. The chance that there is a burglar in the house depends on whether the alarm is on and whether John or Mary calls. Node BurglarInHouse(B) is added. Joint (so far): P(M, J, A, B, E) = p(A | J, M) p(J | M) p(M).

28 Example. Ordering as above. Is p(B | A, J, M) = p(B | A)? Yes. If we know that the alarm has gone off, then the information that Mary or John called is not relevant! Factor: p(B | A). Joint (so far): P(M, J, A, B, E) = p(B | A) p(A | J, M) p(J | M) p(M).

29 Example. Ordering as above. Is p(E | A, J, M, B) = p(E)? No. If Mary calls or John calls or the alarm is on, then the chances of an earthquake are higher. Node EarthquakeHappening(E) is added. Joint (so far): P(M, J, A, B, E) = p(B | A) p(A | J, M) p(J | M) p(M).

30 Example. Ordering as above. Is p(E | A, J, M, B) = p(E | A, B)? Yes. If we know that the alarm has gone off, the information that Mary or John called is not relevant. Factor: p(E | A, B). Joint (final): P(M, J, A, B, E) = p(E | A, B) p(B | A) p(A | J, M) p(J | M) p(M).

31 Example: Optimal Order. Was that the best order to get a directed graphical model with as few links as possible? The optimal order is: BurglarInHouse(B), EarthquakeHappening(E), AlarmOn(A), MaryCalls(M), JohnCalls(J). How to find that order? It is the temporal order in which things happened! Joint: P(M, J, A, B, E) = p(J | A) p(M | A) p(A | B, E) p(E) p(B).

32 Example: Optimal Order. Associate probabilities, i.e. a conditional probability table per node (the values shown are the classic Russell & Norvig alarm-network numbers; the slide itself only confirms P(M = 1 | A = 1) = 0.7 and P(J = 1 | A = 1) = 0.9 via the calculation on slide 34): P(B = 1) = 0.001, P(B = 0) = 0.999. P(E = 1) = 0.002, P(E = 0) = 0.998. P(A = 1 | B, E): B,E = True,True: 0.95; True,False: 0.94; False,True: 0.29; False,False: 0.001 (P(A = 0 | B, E) is one minus these). P(M = 1 | A): A = True: 0.70; A = False: 0.01. P(J = 1 | A): A = True: 0.90; A = False: 0.05.

33 Worst case scenario.

34 Running Example. What is the probability that John calls and Mary calls and the alarm is on, but there is no burglar and no earthquake? p(J=1, M=1, A=1, B=0, E=0) = p(B=0) p(E=0) p(A=1 | B=0, E=0) p(J=1 | A=1) p(M=1 | A=1) = 0.999 · 0.998 · 0.001 · 0.9 · 0.7 ≈ 0.00063. What is the probability that John calls and Mary calls and the alarm is on? p(J=1, M=1, A=1) = Σ_{B,E} p(J=1, M=1, A=1, B, E) = Σ_{B,E} p(B) p(E) p(A=1 | B, E) p(J=1 | A=1) p(M=1 | A=1) = 0.7 · 0.9 · (0.0000019 + 0.0009381 + 0.0005794 + 0.0009970) ≈ 0.0016.
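
The same two queries as a short enumeration sketch in Python; the conditional probability tables are the assumed textbook values from slide 32 (only the factors 0.7 and 0.9 are confirmed by the slide itself):

```python
from itertools import product

# CPTs of the alarm network (values assumed, see slide 32).
p_B = {1: 0.001, 0: 0.999}
p_E = {1: 0.002, 0: 0.998}
p_A = {(1, 1): 0.95, (1, 0): 0.94, (0, 1): 0.29, (0, 0): 0.001}  # P(A=1 | B, E)
p_J = {1: 0.90, 0: 0.05}                                         # P(J=1 | A)
p_M = {1: 0.70, 0: 0.01}                                         # P(M=1 | A)

def joint(J, M, A, B, E):
    # P(J,M,A,B,E) = P(B) P(E) P(A|B,E) P(J|A) P(M|A)
    pa = p_A[(B, E)] if A == 1 else 1.0 - p_A[(B, E)]
    pj = p_J[A] if J == 1 else 1.0 - p_J[A]
    pm = p_M[A] if M == 1 else 1.0 - p_M[A]
    return p_B[B] * p_E[E] * pa * pj * pm

# Query 1: all five variables observed.
print(joint(1, 1, 1, 0, 0))                                      # ~0.00063

# Query 2: marginalize out B and E by brute-force enumeration.
print(sum(joint(1, 1, 1, B, E) for B, E in product([0, 1], repeat=2)))  # ~0.0016
```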

35 Fixing Marginal. Smoker(s): P(s = 1) = 0.5, P(s = 0) = 0.5. Cancer(c): P(c = 1 | s = 1) = 0.2, P(c = 0 | s = 1) = 0.8; P(c = 1 | s = 0) = 0.1, P(c = 0 | s = 0) = 0.9. Joint: p(c, s) = p(c | s) p(s). Marginals: p(s = 0) = 0.5, p(s = 1) = 0.5; p(c = 0) = p(c = 0 | s = 0) p(s = 0) + p(c = 0 | s = 1) p(s = 1) = 0.85; p(c = 1) = 0.15.

36 Fixing Marginal. Same model; now fix the marginal of Cancer(c). We know that the person has cancer; what is the probability that he smokes? Wrong solution: just update the table for p(c | s) but leave p(s) unchanged; we want to change p(c), not p(c | s)! Correct solution: recompute all probabilities under the condition c = 1: p(s = 0 | c = 1) = p(s=0, c=1) / p(c=1) = p(c=1 | s=0) p(s=0) / [p(c=1 | s=0) p(s=0) + p(c=1 | s=1) p(s=1)] = 0.05/0.15 = 1/3; p(s = 1 | c = 1) = p(s=1, c=1) / p(c=1) = p(c=1 | s=1) p(s=1) / [p(c=1 | s=0) p(s=0) + p(c=1 | s=1) p(s=1)] = 0.1/0.15 = 2/3.
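
The same Bayes-rule computation in a few lines (a sketch using the table values above):

```python
p_s = {0: 0.5, 1: 0.5}            # prior P(s)
p_c1 = {0: 0.1, 1: 0.2}           # P(c = 1 | s)

# Bayes' rule: P(s | c=1) = P(c=1 | s) P(s) / P(c=1)
evidence = sum(p_c1[s] * p_s[s] for s in (0, 1))             # P(c=1) = 0.15
posterior = {s: p_c1[s] * p_s[s] / evidence for s in (0, 1)}
print(posterior)                                             # {0: 1/3, 1: 2/3}
```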

37 Fixing Marginal. We know that the person has cancer; what is the probability that he smokes? Updated tables: Smoker(s): P(s = 1) = 2/3, P(s = 0) = 1/3. Cancer(c): P(c = 1 | s) = 1, P(c = 0 | s) = 0 for both s = True and s = False.

38 Fixing Marginal. Original model; now fix the marginal of Smoker(s). We know that the person smokes; what is the probability that he has cancer? Two solutions (same outcome): 1) Just update p(s) to p(s=0) = 0, p(s=1) = 1; then you get a new p(s, c) = p(c | s) p(s). Now the marginal: p(c=0) = p(c=0, s=0) + p(c=0, s=1) = p(c=0 | s=0) p(s=0) + p(c=0 | s=1) p(s=1) = 0.8; p(c=1) = p(c=1, s=0) + p(c=1, s=1) = p(c=1 | s=0) p(s=0) + p(c=1 | s=1) p(s=1) = 0.2. 2) Update p(c) under the condition that s = 1: p'(c=0) = p(c=0, s=1) / p(s=1) = p(c=0 | s=1) p(s=1) / p(s=1) = p(c=0 | s=1) = 0.8; p'(c=1) = p(c=1, s=1) / p(s=1) = p(c=1 | s=1) p(s=1) / p(s=1) = p(c=1 | s=1) = 0.2.

39 Fixing Marginal. We know that the person smokes; what is the probability that he has cancer? Updated tables: Smoker(s): P(s = 1) = 1, P(s = 0) = 0. Cancer(c): s = True: P(c = 1 | s) = 0.2, P(c = 0 | s) = 0.8; s = False: irrelevant.

40 Fixing Marginal - another example. Nodes: Smoker(s), Healthy environment(h), Cancer(c). Joint: p(c, s, h) = p(c | s, h) p(s) p(h).

41 Fixing Marginal - another example. If we fix the marginal of s, then p(h) does not change but p(c) changes: p(c, h | s) = p(c, s, h) / p(s) = p(c | s, h) p(s) p(h) / p(s) = p(c | s, h) p(h).

42-48 Examples: Medical Diagnosis. (A sequence of screenshots of a diagnostic Bayesian network; slide 44 shows the local marginal for each node, sometimes also called the belief.)

49 Car Diagnosis. (Screenshot.)

50 Examples: Car Insurance. (Screenshot.)

51 Example: Pigs Network, a pedigree ("Stammbaum"). (Screenshot.)

52 Reminder: Image segmentation. Joint probability: P(z, x) = P(z | x) P(x). (Figure: samples from the model, the true labeling x and image z, and the most likely labeling.)

53 Reminder: lecture on Computer Vision. Scene type, scene geometry, object classes, object position, object orientation, object shape, depth/occlusions, object appearance, illumination, shadows, motion blur, camera effects.

54 Generative model for images: 3D objects, material, light, background → light coming into the camera → image.

55 Hybrid networks. Directed graphical models (as well as undirected ones) can contain discrete and continuous variables (as long as they define correct distributions): 3D objects (continuous), material (discrete), light (continuous), background (discrete), light coming into the camera (continuous), image (discrete).

56-57 Real World Applications of Bayesian Networks. (Screenshots.)

58 What questions can we ask? The marginal of a single variable, p(x_i): this is what we call probabilistic inference, the most common query. The marginal of two variables: p(x_i, x_j). Conditional queries: p(x_i = 0, x_j = 1 | x_k = 0), p(x_i, x_j | x_k = 0). Optimal decisions: add a utility function to the network and then ask questions about the outcome. Value of information: which node should we fix (observe) to make all marginals as unambiguous as possible? Sensitivity analysis: how does the network behave if I change one marginal?

59 Roadmap for this lecture. Directed Graphical Models: definition and example; probabilistic inference. Undirected Graphical Models: probabilistic inference: message passing; probabilistic inference: sampling. Probabilistic programming languages. Wrap-Up.

60 Probabilistic Inference in DGM. We consider two possible principles to compute marginals (probabilistic inference): inference by enumeration and inference by sampling.

61 Probabilistic inference by enumeration. p(x_1, x_2, x_3) = p(x_1 | x_2, x_3) p(x_2 | x_3) p(x_3). Brute force: just add everything up: p(x_2) = Σ_{x_1, x_3} p(x_1, x_2, x_3) = Σ_{x_1, x_3} p(x_1 | x_2, x_3) p(x_2 | x_3) p(x_3).

62 Probabilistic inference by enumeration. p(x_1, x_2, x_3) = p(x_1 | x_2) p(x_2 | x_3) p(x_3). In some cases you can reduce the cost of the computation: p(x_1) = Σ_{x_2, x_3} p(x_1, x_2, x_3) = Σ_{x_2, x_3} p(x_1 | x_2) p(x_2 | x_3) p(x_3) = Σ_{x_2} p(x_1 | x_2) Σ_{x_3} p(x_2 | x_3) p(x_3). The inner sum is a function (a message M(x_2)) that depends only on x_2.
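
A small Python sketch of this trick with made-up CPTs for binary variables: the message M(x_2) is computed once and reused for every value of x_1, so the O(K^3) enumeration becomes two O(K^2) sweeps:

```python
# Made-up CPTs for binary x1, x2, x3 (illustration only).
p_x3 = {0: 0.6, 1: 0.4}
p_x2_x3 = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}   # p(x2 | x3)
p_x1_x2 = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}   # p(x1 | x2)

# Message M(x2) = sum_{x3} p(x2 | x3) p(x3), computed once.
M = {x2: sum(p_x2_x3[(x2, x3)] * p_x3[x3] for x3 in (0, 1)) for x2 in (0, 1)}

# Marginal p(x1) = sum_{x2} p(x1 | x2) M(x2).
p_x1 = {x1: sum(p_x1_x2[(x1, x2)] * M[x2] for x2 in (0, 1)) for x1 in (0, 1)}
print(p_x1)   # {0: 0.7, 1: 0.3}, sums to 1
```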

63 Probabilistic inference by sampling. This procedure is called ancestor sampling or prior sampling. Goal: create one valid sample from the distribution p(x_1, …, x_n) = ∏_{i=1}^n p(x_i | parents(x_i)), where parents(x_i) is a subset of the variables {x_{i+1}, …, x_n}. Example: p(x_1, x_2, x_3) = p(x_1 | x_2, x_3) p(x_2 | x_3) p(x_3). Procedure: for i = n down to 1, sample x_i ~ p(x_i | parents(x_i)); output (x_1, …, x_n).

64 Probabilistic inference by sampling - Example. p(x_1, x_2, x_3) = p(x_1 | x_2, x_3) p(x_2 | x_3) p(x_3). Procedure: for i = n down to 1, sample x_i ~ p(x_i | parents(x_i)); output (x_1, …, x_n). (Figure: the graph over x_3, x_2, x_1 with one node sampled per step; step 1: x_3, step 2: x_2, step 3: x_1.)
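
A sketch of the procedure in Python for this three-variable chain, with hypothetical CPTs; each variable is drawn only after its parents have been drawn:

```python
import random

def bernoulli0(p0):
    # Return 0 with probability p0, else 1.
    return 0 if random.random() < p0 else 1

def ancestor_sample():
    # Sample in topological order: parents before children (hypothetical CPTs).
    x3 = bernoulli0(0.6)                    # p(x3 = 0) = 0.6
    x2 = bernoulli0({0: 0.7, 1: 0.2}[x3])   # p(x2 = 0 | x3)
    x1 = bernoulli0({0: 0.9, 1: 0.5}[x2])   # p(x1 = 0 | x2)
    return (x1, x2, x3)

print(ancestor_sample())   # one valid sample from the joint
```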

65 How to sample a single variable. How do we sample from a general discrete probability distribution p(x) of one variable, x ∈ {0, 1, …, n}? 1. Define intervals whose lengths are proportional to p(x). 2. Concatenate these intervals. 3. Sample uniformly within the composed interval. 4. Check which interval the sampled value falls into. (The slide shows an example for p(x) with x ∈ {1, 2, 3}, i.e. three values.)
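
The interval construction in code (a sketch; the slide's concrete three-value probabilities are not reproduced here, so made-up values stand in):

```python
import random

def sample_discrete(p):
    # Sample from a discrete distribution given as {value: probability}.
    u = random.random() * sum(p.values())   # uniform point in the composed interval
    acc = 0.0
    for value, prob in p.items():
        acc += prob                         # concatenate intervals of length p(x)
        if u < acc:
            return value                    # u fell into this value's interval
    return value                            # guard against floating-point rounding

print(sample_discrete({1: 0.2, 2: 0.5, 3: 0.3}))
```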

66 Probabilistic inference by sampling. By running ancestor sampling often enough we get (in the limit) the true joint distribution (law of large numbers, "Gesetz der grossen Zahlen"). Procedure: 1) let N(x_1, …, x_n) denote the number of times we have seen a certain sample (x_1, …, x_n); 2) set P'(x_1, …, x_n) = N(x_1, …, x_n) / N, where N is the total number of samples. It holds that P'(x_1, …, x_n) → P(x_1, …, x_n) as N → ∞. The estimated joint distribution can now be used to compute local marginals: P(x_i = k) = Σ_{x: x_i = k} P(x_1, …, x_i = k, …, x_n).
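
Putting the two previous sketches together: estimate the marginal p(x_1 = 0) by counting over many ancestor samples (same hypothetical CPTs as above); by the law of large numbers the estimate approaches the exact value 0.7:

```python
import random
from collections import Counter

def bern0(p0):
    return 0 if random.random() < p0 else 1

def sample():
    # Ancestor sampling on the chain x3 -> x2 -> x1 (hypothetical CPTs).
    x3 = bern0(0.6)
    x2 = bern0({0: 0.7, 1: 0.2}[x3])
    x1 = bern0({0: 0.9, 1: 0.5}[x2])
    return (x1, x2, x3)

N = 100_000
counts = Counter(sample() for _ in range(N))   # N(x1,...,xn) for each assignment
# Local marginal: P(x1 = 0) = sum over all samples x with x1 = 0 of N(x)/N.
print(sum(c for x, c in counts.items() if x[0] == 0) / N)   # ~0.7
```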

67-72 Example. (Six slides of screenshots.)

73 Roadmap for this lecture. Directed Graphical Models: definition and example; probabilistic inference. Undirected Graphical Models: probabilistic inference: message passing; probabilistic inference: sampling. Probabilistic programming languages. Wrap-Up.

74 Reminder: Definition: Factor Graph models. Given an undirected graph G = (V, F, E), where V, F are the sets of variable and factor nodes and E is the set of edges, a factor graph defines a family of distributions: P(x) = (1/f) ∏_{F∈𝓕} ψ_F(x_{N(F)}) where f = Σ_x ∏_{F∈𝓕} ψ_F(x_{N(F)}). Here f is the partition function, F a factor, 𝓕 the set of all factors, N(F) the neighbourhood of a factor, and ψ_F a function (not a distribution) depending on x_{N(F)} (ψ_F: K^{|N(F)|} → R, where x_i ∈ K). Note that the definition of a factor is not linked to a property of the graph (as cliques are).

75 Reminder: Dynamic Programming on chains. E(x) = Σ_i θ_i(x_i) + Σ_{(i,j)∈N} θ_ij(x_i, x_j) (unary terms and pairwise terms in a row). Nodes …, p, q, r, s; messages M_{o→p}(x_p), M_{p→q}(x_q), M_{q→r}(x_r), M_{r→s}(x_s). Pass messages from left to right. A message is a vector with K entries (K is the number of labels). Read out the solution from the final message and the final unary term: a globally exact solution. Other name: min-sum algorithm. Comment: Dmitri Schlesinger called the messages Bellman functions.

76 Reminder: Dynamic Programming on chains. Define the message: M_{q→r}(x_r) = min_{x_q} { M_{p→q}(x_q) + θ_q(x_q) + θ_{q,r}(x_q, x_r) } (information from the previous nodes + local information + connection to the next node). The message stores the best energy up to this point for x_r = k: M_{q→r}(x_r = k) = min_{x_1, …, x_q} E(x_1, …, x_q, x_r = k).
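
A sketch of the min-sum recursion with back-pointers on a small chain (random unary energies and a hypothetical Potts pairwise term); the forward messages are exactly the M_{q→r} above, and backtracking reads out the globally optimal labeling:

```python
import random

K, n = 3, 5                       # K labels, chain of n nodes
random.seed(0)
theta = [[random.random() for _ in range(K)] for _ in range(n)]   # unary energies
pw = lambda a, b: 0.0 if a == b else 1.0                          # Potts pairwise term

# Forward pass: M[i][k] = min over x_1..x_{i-1} of the energy ending in x_i = k.
M = [[0.0] * K for _ in range(n)]
back = [[0] * K for _ in range(n)]
for i in range(1, n):
    for k in range(K):
        best = min(range(K), key=lambda kp: M[i-1][kp] + theta[i-1][kp] + pw(kp, k))
        back[i][k] = best
        M[i][k] = M[i-1][best] + theta[i-1][best] + pw(best, k)

# Read out the solution from the final message plus the final unary term.
x = [0] * n
x[-1] = min(range(K), key=lambda k: M[-1][k] + theta[-1][k])
for i in range(n - 1, 0, -1):
    x[i-1] = back[i][x[i]]
print("MAP labeling:", x)
```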

77 Reminder: Dynamic Programming on chains - example. (Worked example shown as a figure.)

78 Sum-Prod Algorithm for probabilistic inference. Let us consider factor graphs: P(x) = (1/f) ∏_{F∈𝓕} ψ_F(x_{N(F)}) where f = Σ_x ∏_{F∈𝓕} ψ_F(x_{N(F)}). With unary factors ψ_q(x_q) and pairwise factors ψ_{q,r}(x_q, x_r), the message becomes M_{q→r}(x_r) = Σ_{x_q} M_{p→q}(x_q) ψ_q(x_q) ψ_{q,r}(x_q, x_r) (information from the previous nodes · local information · connection to the next node). The method is called the Sum-Prod algorithm since a message is computed by product and sum operations.
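
The corresponding sum-product sketch on a chain: replacing min and + by sum and product turns the same forward pass into one that yields the partition function f and the exact marginal of the last node (hypothetical random potentials ψ):

```python
import random

K, n = 3, 5
random.seed(1)
psi = [[random.uniform(0.5, 1.5) for _ in range(K)] for _ in range(n)]  # unary factors
psi_pw = lambda a, b: 2.0 if a == b else 1.0                            # pairwise factor

# Forward messages: M[k] = sum over x_1..x_{i-1} of the product of factors, x_i = k.
M = [1.0] * K
for i in range(1, n):
    M = [sum(M[kp] * psi[i-1][kp] * psi_pw(kp, k) for kp in range(K))
         for k in range(K)]

unnorm = [M[k] * psi[n-1][k] for k in range(K)]  # unnormalized beliefs at the last node
f = sum(unnorm)                                  # partition function
print("f =", f, " marginal P(x_n) =", [u / f for u in unnorm])
```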

79 Sum-Prod Algorithm for probabilistic inference (continued). (Figure.)

80 Extensions. The algorithm above can easily be extended to chains of arbitrary length. It can also be extended to compute the marginal of an arbitrary node (here q): messages M_{p→q}(x_q) and M_{r→q}(x_q) arrive from both sides.

81 Extensions. To compute f and all marginals we have to compute all messages that go into each node. This is done by two passes over all nodes: from left to right and from right to left (this is referred to as the full Sum-Prod algorithm).

82 Extensions. Both MAP estimation and probabilistic inference can easily be extended to tree structures. Both can also be extended to arbitrary factor graphs (with loops and higher-order potentials). Example: a loop over x_2, x_3, x_4 and a higher-order (order 3) factor over x_2, x_1, x_4.

83 Roadmap for this lecture. Directed Graphical Models: definition and example; probabilistic inference. Undirected Graphical Models: probabilistic inference: message passing; probabilistic inference: sampling. Probabilistic programming languages. Wrap-Up.

84 Reminder: ICM - Iterated Conditional Mode. Gibbs energy: E(x) = θ_12(x_1, x_2) + θ_13(x_1, x_3) + θ_14(x_1, x_4) + θ_15(x_1, x_5) + … (star-shaped model over x_1, …, x_5).

85 Reminder: ICM - Iterated Conditional Mode. Gibbs energy: E(x) = θ_12(x_1, x_2) + θ_13(x_1, x_3) + θ_14(x_1, x_4) + θ_15(x_1, x_5) + … Idea: fix all variables but one and optimize over that one. Select x_1 and optimize: E(x) = θ_12(x_1, x_2) + θ_13(x_1, x_3) + θ_14(x_1, x_4) + θ_15(x_1, x_5). ICM can get stuck in local minima (compare with the global minimum) and depends on the initialization.
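
A minimal ICM sketch for a chain of binary variables (random unary energies, hypothetical Potts pairwise term); every update fixes all variables but one and takes the locally best label, so the energy never increases, but the result depends on the initialization:

```python
import random

random.seed(2)
n, K = 8, 2
theta = [[random.random() for _ in range(K)] for _ in range(n)]   # unary energies
pw = lambda a, b: 0.0 if a == b else 0.5                          # Potts pairwise term

def local_energy(x, i, k):
    # All energy terms that involve variable i when it takes label k.
    e = theta[i][k]
    if i > 0:
        e += pw(x[i-1], k)
    if i < n - 1:
        e += pw(k, x[i+1])
    return e

x = [random.randrange(K) for _ in range(n)]   # the initialization matters!
changed = True
while changed:                                # sweep until no single change helps
    changed = False
    for i in range(n):
        best = min(range(K), key=lambda k: local_energy(x, i, k))
        if best != x[i]:
            x[i], changed = best, True
print("ICM labeling:", x)
```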

86 Reminder: ICM - parallelization. Normal procedure: update one variable per step (steps 1-4). Parallel procedure: update a whole set of mutually non-neighbouring variables per step (steps 1-4). Scheduling is a more complex task in graphs which are not 4-connected.

87 Gibbs Sampling. Task: draw a sample from a multivariate probability distribution p(x) = p(x_1, x_2, …, x_n). This is not trivial if the probability distribution is complex, e.g. an MRF (remember: it is usually given only up to an unknown normalizing constant). The following procedure is used: 1. Start from an arbitrary assignment x^0 = (x_1^0, …, x_n^0). 2. Repeat: 1) pick (randomly) a variable x_i; 2) sample a new value x_i^{t+1} according to the conditional probability distribution p(x_i | x_1^t, …, x_{i-1}^t, x_{i+1}^t, …, x_n^t), i.e. under the condition that all other variables are fixed.

88 Gibbs Sampling. After each elementary sampling step 2) a new assignment x^{t+1} is obtained that differs from the previous one only in the value of x_i. After many (in the limit, infinitely many) iterations of 1)-2) the assignment x^t follows the initial probability distribution p(x), under some (mild) assumptions: The probability to pick each particular variable x_i in 1) is nonzero, i.e. over an infinite number of generation steps each variable is visited infinitely many times. (In computer vision applications, where each variable often corresponds to a pixel, scan-line order is widely used.) For each pair of assignments x^1, x^2 the probability to arrive at x^2 starting from x^1 is nonzero, i.e. independently of the starting point there is a chance to sample each elementary event of the considered probability distribution.

89 Gibbs Sampling for MRFs. For MRFs the elementary sampling step 2) is feasible because p(x_i | x_1, …, x_{i-1}, x_{i+1}, …, x_n) = p(x_i | x_{N(i)}), where N(i) is the neighborhood of i (the Markovian property). In particular, for Gibbs distributions of second order: p(x_i = k | x_{N(i)}) ∝ exp[θ_i(k) + Σ_{j: (i,j)∈N_4} θ_ij(k, x_j)]. This resembles Iterated Conditional Mode, but now we do not decide for the best label; we sample one instead.
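
A Gibbs-sampling sketch for the same chain model as in the ICM sketch above; instead of taking the argmin label we sample one from the local conditional, written here as p(x_i = k | x_N(i)) ∝ exp(-local energy) so that low-energy labels are likely (the minus sign is an assumption about the slide's energy convention):

```python
import math
import random

random.seed(3)
n, K = 8, 2
theta = [[random.random() for _ in range(K)] for _ in range(n)]   # unary energies
pw = lambda a, b: 0.0 if a == b else 0.5                          # Potts pairwise term

def local_energy(x, i, k):
    e = theta[i][k]
    if i > 0:
        e += pw(x[i-1], k)
    if i < n - 1:
        e += pw(k, x[i+1])
    return e

def gibbs_sweep(x):
    # Resample every variable from its local conditional (Markov blanket only).
    for i in range(n):
        w = [math.exp(-local_energy(x, i, k)) for k in range(K)]  # unnormalized
        u, acc = random.random() * sum(w), 0.0
        for k in range(K):
            acc += w[k]
            if u < acc:
                x[i] = k
                break
    return x

x = [random.randrange(K) for _ in range(n)]
for _ in range(100):   # after many sweeps x follows the Gibbs distribution
    x = gibbs_sweep(x)
print("sample:", x)
```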

90 Reminder: How to sample a single variable. How do we sample from a general discrete probability distribution p(x) of one variable, x ∈ {0, 1, …, n}? 1. Define intervals whose lengths are proportional to p(x). 2. Concatenate these intervals. 3. Sample uniformly within the composed interval. 4. Check which interval the sampled value falls into. (The slide shows an example for p(x) with x ∈ {1, 2, 3}, i.e. three values.)

91 Other Samplers. Rejection sampling; Metropolis-Hastings sampling. Comment: the Gibbs sampler and Metropolis-Hastings sampling belong to the class of MCMC samplers.

92 Roadmap for this lecture. Directed Graphical Models: definition and example; probabilistic inference. Undirected Graphical Models: probabilistic inference: message passing; probabilistic inference: sampling. Probabilistic programming languages. Wrap-Up.

93 Probabilistic programming languages. A programming language to easily program inference in factor graphs and other machine learning tasks. Basic idea: associate a distribution with each variable. Normally, in C++: Bool coin = 0; in a probabilistic program: Bool coin = Bernoulli(0.5); % Bernoulli is a distribution with 2 states. (See the Infer.NET documentation.)

94 Another Example: the two-coin example. Draw coin1 (x_1) and coin2 (x_2) independently; each coin has equal probability of being heads (1) or tails (0). Define a new random variable z which is True if and only if both coins are heads: z = x_1 & x_2.

95 Write this as a directed graphical model. Nodes x_1, x_2 with x_i ∈ {0,1} and p(x_i = 1) = p(x_i = 0) = 0.5, and node z with P(z = 1 | x_1, x_2) = 1 if x_1 = x_2 = 1 and 0 otherwise (P(z = 0 | x_1, x_2) is one minus this). Compute the marginal: p(z) = Σ_{x_1, x_2} p(z, x_1, x_2) = Σ_{x_1, x_2} p(z | x_1, x_2) p(x_1) p(x_2); p(z = 1) = 0.25, p(z = 0) = 0.75.
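
The same marginal by brute-force enumeration (a sketch of what the probabilistic program infers):

```python
from itertools import product

# Two fair coins; z = x1 AND x2, so only (1, 1) makes z = 1.
p_z1 = sum(0.5 * 0.5                        # p(x1) * p(x2)
           for x1, x2 in product([0, 1], repeat=2)
           if (x1 & x2) == 1)
print(p_z1, 1 - p_z1)                       # 0.25 0.75
```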

96 Infer.NET example: 2 coins. (Screenshots: the program, running it, and an addition to the program.)

97 Conversion: DGM, UGM, factor graphs. In Infer.NET all models are converted to factor graphs, and inference is then done in the factor graph. In practice you have one concrete distribution (not a family). Let us first convert a directed graphical model to an undirected one, and then the undirected model to a factor graph.

98 Directed graphical model to undirected one. Take the joint distribution, replace all conditioning bars with commas, and then normalize to get a correct distribution. Directed graphical model: p(x_1, …, x_n) = ∏_{i=1}^n p(x_i | parents(x_i)). This gives the UGM: p(x_1, …, x_n) = (1/f) ∏_{i=1}^n p(x_i, parents(x_i)) where f = Σ_x ∏_{i=1}^n p(x_i, parents(x_i)). Note that the function p(x_i, parents(x_i)) is no longer a probability, so it should rather be called ψ(x_i, parents(x_i)) = p(x_i, parents(x_i)). Comment: going from an undirected graphical model to a directed one is difficult, since we do not have the individual distributions, just a big product of factors.

99 From directed graphical model to undirected. p(x_1, x_2, x_3, x_4) = p(x_1 | x_2, x_3) p(x_2) p(x_3 | x_4) p(x_4) becomes p(x_1, x_2, x_3, x_4) = (1/f) p(x_1, x_2, x_3) p(x_2) p(x_3, x_4) p(x_4). This step is called moralization: 1) all parents of a node are connected to each other ("married"); 2) all directed links are converted to undirected ones.

100 Undirected graphical model to factor graph. The next step in Infer.NET: p(x_1, x_2, x_3, x_4) = (1/f) p(x_1, x_2, x_3) p(x_2) p(x_3, x_4) p(x_4) is drawn as a factor graph. This is exactly the same distribution, just visualized differently.

101 Roadmap for this lecture. Directed Graphical Models: definition and example; probabilistic inference. Undirected Graphical Models: probabilistic inference: message passing; probabilistic inference: sampling. Probabilistic programming languages. Wrap-Up.

102 100+ Things we have learned. Introduction/Agents: basic definitions, rational agents, performance measures, PEAS, environment/agent types. Search: single-state problems, state space, tree search, uninformed algorithms (BFS, DFS, DLS, IDS), heuristics, informed algorithms (Greedy, A*), local search (hill climbing, beam search, simulated annealing, genetic algorithms). Logic: syntax, semantics, propositional logic, proofs in propositional logic, truth tables, resolution, knowledge representation using propositional logic. Decision Trees: entropy, information gain, attribute selection, ID3, overfitting. Hidden Markov Models: entity recognition (ER), ER techniques, Hidden Markov Models, Viterbi algorithm, quality measures for ER, cross-validation.

103 100+ Things we have learned. Probability Theory: probability spaces and algebras, events, random variables, distributions and densities, independence. Bayesian Decision Theory: Bayesian risk minimization, maximum a-posteriori decision as a special case. Non-Bayesian formulations: Neyman-Pearson task. Statistical Learning: maximum likelihood principle, unsupervised learning, Expectation-Maximization algorithm. Discriminative Learning: hierarchy of abstractions: generative models, discriminative models, discriminant functions. Discriminative Models: neurons and the perceptron algorithm, linear classifiers, feed-forward neural networks and error back-propagation, Support Vector Machines and kernels.

104 100+ Things we have learned. Computer Vision is hard: Lighthill Report, physics-based vision, semantic-based vision, scene parsing, semantic segmentation, visual illusions, features. Undirected Graphical Models: conversion, factor graphs, Hammersley-Clifford theorem, Gibbs distribution, ICM, dynamic programming, graph cut, move-making methods, applications, Markov random field, image retargeting, denoising, Gaussian mixture model, learning structured models. Recognition in the Wild: object class/instance recognition, object detection, decision forests, information gain, over-fitting, generalization, real-time people pose detection. Directed Graphical Models: MAP versus marginals, probabilistic inference, conditional distributions, fixing marginals, medical diagnosis, inference by enumeration/sampling, Sum-Prod algorithm, Gibbs sampling, probabilistic programming languages.

105 The remaining slides are not relevant for the exam. If you are interested in Computer Vision and/or Machine Learning: Computer Vision 1: Algorithms and Applications (2+2), WS 14/15; Machine Learning: Basics (2+2), WS 14/15; Computer Vision 2: Inference, Models, and Learning, SS 14; other lectures in Logic, etc.

106 Computer Vision 1. VL1 (21.10): Introduction; Ex1: Intro to OpenCV (H. Heidrich). VL2 (28.10): Basics of Digital Image Processing; Ex2: Fast Methods for Filtering; explain homework. VL3 (4.11): Image Formation Models; Ex3: homework. VL4 (11.11): Single View Geometry and Camera Calibration; Ex4: homework. VL5 (18.11): Feature Extraction and Matching; Ex5: homework. VL6 (25.11): Two View Geometry; Ex6: Robust Panoramic Stitching; explain homework. VL7 (2.12): 3D Reconstruction from Multiple Views; Ex7: homework.

107 Computer Vision 1. VL8 (9.12): Dense Motion Estimation; Ex1: homework. VL9 (16.12): Image Segmentation; Ex2: homework. VL10 (6.1): Object Recognition (in progress); Ex3: Object Recognition; explain homework. VL11 (13.1): Part-based Recognition (in progress); Ex4: homework. VL12 (20.1): Model-based Vision (in progress); Ex5: homework. VL13 (27.1): Having Fun with Images (in progress); Ex6: homework. VL14 (3.2): Wrap-Up: 100 things we have learned (in progress); Ex5: homework.

108 Computer Vision 2 - Content. Introduction: structured models (summary of CV 1). Discrete models, inference (state of the art): pairwise models, combinatorial optimization, message passing, etc.; the concept of re-parametrization, tree-reweighted message passing; higher-order models, dual decomposition, P^n-Potts, etc.; continuous-label models, Gaussian random fields, IRLS, PMBP, etc.; probabilistic inference, variational methods, sampling; OpenGM and other state-of-the-art libraries. Discrete models, learning (state of the art): probabilistic learning, Field of Experts; loss-based learning, Regression/Decision Tree Fields, struct-SVM. Applications: object recognition and detection (part-based, deformable shape models, etc.), 3D scene understanding, intrinsic image decomposition, image partitioning, segmentation and matting. Continuous-domain models.

109 Please ask me if you want to work on computer vision. We have openings for Master and Bachelor theses, seminars, project work, and SHKs. Lots of academic collaborations in Europe: Oxford, UCL, Heidelberg, Darmstadt, etc. Good contacts to companies: Adobe, Microsoft, etc.

110 Activities in the CVLD, project 1. The Two-Frame Inverse Rendering Project. Input: two large-displacement images. Output: depth, motion, material, light, objects, segmentation, etc. Applications: image editing, image enhancement, image search, augmented reality. Challenge: build a joint statistical model: 1) How to do efficient inference, such as approximating the rendering equation? 2) What aspects should be learned and what derived from physics? 3) Do you get better results when deriving multiple outputs? 4) What are good scene priors? Can we use synthetic models from graphics?

111 Activities in the CVLD, project 1 (continued). RGBD input. (Screenshots.)

112 Activities in the CVLD, project 1 (continued). Input: two images; scene in Blender: object 1, material wood; object 2, material stone; object 3, material glass.

113 Activities in the CVLD, project 2. Object instance recognition: scan in 3D objects (using KinectFusion [Izadi et al., UIST 2011]); fast detection and 6D object pose estimation of multiple objects from an RGBD stream.

114 Activities in the CVLD, project 3. Improved Regression Tree Fields: a Gibbs distribution with energy E(y, x, w) = Σ_{F∈𝓕} E_F(y_F, x, w_F) (compact factor-graph notation; the figure shows the factor graph around a variable y_i).

115 Activities in the CVLD, project 3. State of the art for image de-convolution: input x = K*y, output y.

116 Roadmap for this lecture. Directed Graphical Models: definition and example; probabilistic inference. Undirected Graphical Models: probabilistic inference: message passing; probabilistic inference: sampling. Probabilistic programming languages. Wrap-Up.

117 Feedback. Please give me feedback!


More information

CS6220: DATA MINING TECHNIQUES

CS6220: DATA MINING TECHNIQUES CS6220: DATA MINING TECHNIQUES Matrix Data: Classification: Part 2 Instructor: Yizhou Sun yzsun@ccs.neu.edu September 21, 2014 Methods to Learn Matrix Data Set Data Sequence Data Time Series Graph & Network

More information

Learning Bayesian Networks (part 1) Goals for the lecture

Learning Bayesian Networks (part 1) Goals for the lecture Learning Bayesian Networks (part 1) Mark Craven and David Page Computer Scices 760 Spring 2018 www.biostat.wisc.edu/~craven/cs760/ Some ohe slides in these lectures have been adapted/borrowed from materials

More information

EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS

EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS Lecture 16, 6/1/2005 University of Washington, Department of Electrical Engineering Spring 2005 Instructor: Professor Jeff A. Bilmes Uncertainty & Bayesian Networks

More information

Directed and Undirected Graphical Models

Directed and Undirected Graphical Models Directed and Undirected Davide Bacciu Dipartimento di Informatica Università di Pisa bacciu@di.unipi.it Machine Learning: Neural Networks and Advanced Models (AA2) Last Lecture Refresher Lecture Plan Directed

More information

Machine Learning 4771

Machine Learning 4771 Machine Learning 4771 Instructor: Tony Jebara Topic 16 Undirected Graphs Undirected Separation Inferring Marginals & Conditionals Moralization Junction Trees Triangulation Undirected Graphs Separation

More information

Bayesian Machine Learning

Bayesian Machine Learning Bayesian Machine Learning Andrew Gordon Wilson ORIE 6741 Lecture 4 Occam s Razor, Model Construction, and Directed Graphical Models https://people.orie.cornell.edu/andrew/orie6741 Cornell University September

More information

Markov Networks. l Like Bayes Nets. l Graphical model that describes joint probability distribution using tables (AKA potentials)

Markov Networks. l Like Bayes Nets. l Graphical model that describes joint probability distribution using tables (AKA potentials) Markov Networks l Like Bayes Nets l Graphical model that describes joint probability distribution using tables (AKA potentials) l Nodes are random variables l Labels are outcomes over the variables Markov

More information

Undirected Graphical Models: Markov Random Fields

Undirected Graphical Models: Markov Random Fields Undirected Graphical Models: Markov Random Fields 40-956 Advanced Topics in AI: Probabilistic Graphical Models Sharif University of Technology Soleymani Spring 2015 Markov Random Field Structure: undirected

More information

Machine Learning for Data Science (CS4786) Lecture 24

Machine Learning for Data Science (CS4786) Lecture 24 Machine Learning for Data Science (CS4786) Lecture 24 Graphical Models: Approximate Inference Course Webpage : http://www.cs.cornell.edu/courses/cs4786/2016sp/ BELIEF PROPAGATION OR MESSAGE PASSING Each

More information

Bayesian Networks (Part II)

Bayesian Networks (Part II) 10-601 Introduction to Machine Learning Machine Learning Department School of Computer Science Carnegie Mellon University Bayesian Networks (Part II) Graphical Model Readings: Murphy 10 10.2.1 Bishop 8.1,

More information

Introduction to Artificial Intelligence. Unit # 11

Introduction to Artificial Intelligence. Unit # 11 Introduction to Artificial Intelligence Unit # 11 1 Course Outline Overview of Artificial Intelligence State Space Representation Search Techniques Machine Learning Logic Probabilistic Reasoning/Bayesian

More information

Introduction to Graphical Models

Introduction to Graphical Models Introduction to Graphical Models The 15 th Winter School of Statistical Physics POSCO International Center & POSTECH, Pohang 2018. 1. 9 (Tue.) Yung-Kyun Noh GENERALIZATION FOR PREDICTION 2 Probabilistic

More information

Lecture 15. Probabilistic Models on Graph

Lecture 15. Probabilistic Models on Graph Lecture 15. Probabilistic Models on Graph Prof. Alan Yuille Spring 2014 1 Introduction We discuss how to define probabilistic models that use richly structured probability distributions and describe how

More information

Be able to define the following terms and answer basic questions about them:

Be able to define the following terms and answer basic questions about them: CS440/ECE448 Section Q Fall 2017 Final Review Be able to define the following terms and answer basic questions about them: Probability o Random variables, axioms of probability o Joint, marginal, conditional

More information

CS 343: Artificial Intelligence

CS 343: Artificial Intelligence CS 343: Artificial Intelligence Bayes Nets: Sampling Prof. Scott Niekum The University of Texas at Austin [These slides based on those of Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley.

More information

Using first-order logic, formalize the following knowledge:

Using first-order logic, formalize the following knowledge: Probabilistic Artificial Intelligence Final Exam Feb 2, 2016 Time limit: 120 minutes Number of pages: 19 Total points: 100 You can use the back of the pages if you run out of space. Collaboration on the

More information

Undirected graphical models

Undirected graphical models Undirected graphical models Semantics of probabilistic models over undirected graphs Parameters of undirected models Example applications COMP-652 and ECSE-608, February 16, 2017 1 Undirected graphical

More information

ECE521 Tutorial 11. Topic Review. ECE521 Winter Credits to Alireza Makhzani, Alex Schwing, Rich Zemel and TAs for slides. ECE521 Tutorial 11 / 4

ECE521 Tutorial 11. Topic Review. ECE521 Winter Credits to Alireza Makhzani, Alex Schwing, Rich Zemel and TAs for slides. ECE521 Tutorial 11 / 4 ECE52 Tutorial Topic Review ECE52 Winter 206 Credits to Alireza Makhzani, Alex Schwing, Rich Zemel and TAs for slides ECE52 Tutorial ECE52 Winter 206 Credits to Alireza / 4 Outline K-means, PCA 2 Bayesian

More information

Intelligent Systems (AI-2)

Intelligent Systems (AI-2) Intelligent Systems (AI-2) Computer Science cpsc422, Lecture 19 Oct, 23, 2015 Slide Sources Raymond J. Mooney University of Texas at Austin D. Koller, Stanford CS - Probabilistic Graphical Models D. Page,

More information

Final Exam, Machine Learning, Spring 2009

Final Exam, Machine Learning, Spring 2009 Name: Andrew ID: Final Exam, 10701 Machine Learning, Spring 2009 - The exam is open-book, open-notes, no electronics other than calculators. - The maximum possible score on this exam is 100. You have 3

More information

CS Lecture 4. Markov Random Fields

CS Lecture 4. Markov Random Fields CS 6347 Lecture 4 Markov Random Fields Recap Announcements First homework is available on elearning Reminder: Office hours Tuesday from 10am-11am Last Time Bayesian networks Today Markov random fields

More information

Bayesian Methods for Machine Learning

Bayesian Methods for Machine Learning Bayesian Methods for Machine Learning CS 584: Big Data Analytics Material adapted from Radford Neal s tutorial (http://ftp.cs.utoronto.ca/pub/radford/bayes-tut.pdf), Zoubin Ghahramni (http://hunch.net/~coms-4771/zoubin_ghahramani_bayesian_learning.pdf),

More information

Bayesian Learning in Undirected Graphical Models

Bayesian Learning in Undirected Graphical Models Bayesian Learning in Undirected Graphical Models Zoubin Ghahramani Gatsby Computational Neuroscience Unit University College London, UK http://www.gatsby.ucl.ac.uk/ Work with: Iain Murray and Hyun-Chul

More information

Variational Inference (11/04/13)

Variational Inference (11/04/13) STA561: Probabilistic machine learning Variational Inference (11/04/13) Lecturer: Barbara Engelhardt Scribes: Matt Dickenson, Alireza Samany, Tracy Schifeling 1 Introduction In this lecture we will further

More information

STA 4273H: Statistical Machine Learning

STA 4273H: Statistical Machine Learning STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 11 Project

More information

Probabilistic Models

Probabilistic Models Bayes Nets 1 Probabilistic Models Models describe how (a portion of) the world works Models are always simplifications May not account for every variable May not account for all interactions between variables

More information

Bayesian Networks Inference with Probabilistic Graphical Models

Bayesian Networks Inference with Probabilistic Graphical Models 4190.408 2016-Spring Bayesian Networks Inference with Probabilistic Graphical Models Byoung-Tak Zhang intelligence Lab Seoul National University 4190.408 Artificial (2016-Spring) 1 Machine Learning? Learning

More information

p L yi z n m x N n xi

p L yi z n m x N n xi y i z n x n N x i Overview Directed and undirected graphs Conditional independence Exact inference Latent variables and EM Variational inference Books statistical perspective Graphical Models, S. Lauritzen

More information

Probabilistic Graphical Models

Probabilistic Graphical Models 2016 Robert Nowak Probabilistic Graphical Models 1 Introduction We have focused mainly on linear models for signals, in particular the subspace model x = Uθ, where U is a n k matrix and θ R k is a vector

More information

Directed Graphical Models

Directed Graphical Models CS 2750: Machine Learning Directed Graphical Models Prof. Adriana Kovashka University of Pittsburgh March 28, 2017 Graphical Models If no assumption of independence is made, must estimate an exponential

More information

Intelligent Systems: Reasoning and Recognition. Reasoning with Bayesian Networks

Intelligent Systems: Reasoning and Recognition. Reasoning with Bayesian Networks Intelligent Systems: Reasoning and Recognition James L. Crowley ENSIMAG 2 / MoSIG M1 Second Semester 2016/2017 Lesson 13 24 march 2017 Reasoning with Bayesian Networks Naïve Bayesian Systems...2 Example

More information

Results: MCMC Dancers, q=10, n=500

Results: MCMC Dancers, q=10, n=500 Motivation Sampling Methods for Bayesian Inference How to track many INTERACTING targets? A Tutorial Frank Dellaert Results: MCMC Dancers, q=10, n=500 1 Probabilistic Topological Maps Results Real-Time

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Probabilistic Graphical Models David Sontag New York University Lecture 4, February 16, 2012 David Sontag (NYU) Graphical Models Lecture 4, February 16, 2012 1 / 27 Undirected graphical models Reminder

More information

CS 188: Artificial Intelligence. Bayes Nets

CS 188: Artificial Intelligence. Bayes Nets CS 188: Artificial Intelligence Probabilistic Inference: Enumeration, Variable Elimination, Sampling Pieter Abbeel UC Berkeley Many slides over this course adapted from Dan Klein, Stuart Russell, Andrew

More information

9/12/17. Types of learning. Modeling data. Supervised learning: Classification. Supervised learning: Regression. Unsupervised learning: Clustering

9/12/17. Types of learning. Modeling data. Supervised learning: Classification. Supervised learning: Regression. Unsupervised learning: Clustering Types of learning Modeling data Supervised: we know input and targets Goal is to learn a model that, given input data, accurately predicts target data Unsupervised: we know the input only and want to make

More information

Notes on Markov Networks

Notes on Markov Networks Notes on Markov Networks Lili Mou moull12@sei.pku.edu.cn December, 2014 This note covers basic topics in Markov networks. We mainly talk about the formal definition, Gibbs sampling for inference, and maximum

More information

Probabilistic Graphical Models for Image Analysis - Lecture 1

Probabilistic Graphical Models for Image Analysis - Lecture 1 Probabilistic Graphical Models for Image Analysis - Lecture 1 Alexey Gronskiy, Stefan Bauer 21 September 2018 Max Planck ETH Center for Learning Systems Overview 1. Motivation - Why Graphical Models 2.

More information

Probabilistic Reasoning. Kee-Eung Kim KAIST Computer Science

Probabilistic Reasoning. Kee-Eung Kim KAIST Computer Science Probabilistic Reasoning Kee-Eung Kim KAIST Computer Science Outline #1 Acting under uncertainty Probabilities Inference with Probabilities Independence and Bayes Rule Bayesian networks Inference in Bayesian

More information

COMP5211 Lecture Note on Reasoning under Uncertainty

COMP5211 Lecture Note on Reasoning under Uncertainty COMP5211 Lecture Note on Reasoning under Uncertainty Fangzhen Lin Department of Computer Science and Engineering Hong Kong University of Science and Technology Fangzhen Lin (HKUST) Uncertainty 1 / 33 Uncertainty

More information

CS 188: Artificial Intelligence Fall 2009

CS 188: Artificial Intelligence Fall 2009 CS 188: Artificial Intelligence Fall 2009 Lecture 14: Bayes Nets 10/13/2009 Dan Klein UC Berkeley Announcements Assignments P3 due yesterday W2 due Thursday W1 returned in front (after lecture) Midterm

More information

Graphical Models - Part II

Graphical Models - Part II Graphical Models - Part II Bishop PRML Ch. 8 Alireza Ghane Outline Probabilistic Models Bayesian Networks Markov Random Fields Inference Graphical Models Alireza Ghane / Greg Mori 1 Outline Probabilistic

More information

Informatics 2D Reasoning and Agents Semester 2,

Informatics 2D Reasoning and Agents Semester 2, Informatics 2D Reasoning and Agents Semester 2, 2017 2018 Alex Lascarides alex@inf.ed.ac.uk Lecture 23 Probabilistic Reasoning with Bayesian Networks 15th March 2018 Informatics UoE Informatics 2D 1 Where

More information

Bayesian networks. Soleymani. CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2018

Bayesian networks. Soleymani. CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2018 Bayesian networks CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2018 Soleymani Slides have been adopted from Klein and Abdeel, CS188, UC Berkeley. Outline Probability

More information

Intelligent Systems (AI-2)

Intelligent Systems (AI-2) Intelligent Systems (AI-2) Computer Science cpsc422, Lecture 19 Oct, 24, 2016 Slide Sources Raymond J. Mooney University of Texas at Austin D. Koller, Stanford CS - Probabilistic Graphical Models D. Page,

More information

Statistical Learning

Statistical Learning Statistical Learning Lecture 5: Bayesian Networks and Graphical Models Mário A. T. Figueiredo Instituto Superior Técnico & Instituto de Telecomunicações University of Lisbon, Portugal May 2018 Mário A.

More information