Conditional Independence and Bayesian (Belief) Networks: Syntax and Semantics. CS151, Spring 2004


This lecture: Conditional Independence; Bayesian (Belief) Networks: syntax and semantics. Reading: Chapter 14.1-14.2

Propositions and Random Variables. Letting A refer to a proposition which may be either true or false, P(A) denotes the probability that A is true, also denoted P(A=TRUE). A is called a binary or propositional random variable. We also have multi-valued random variables, e.g., Weather is one of <sunny, rain, cloudy, snow>.

Priors, Distribution, Joint. Prior or unconditional probabilities of propositions, e.g., P(Cavity) = P(Cavity=TRUE) = 0.1 and P(Weather=sunny) = 0.72, correspond to belief prior to the arrival of any (new) evidence. A probability distribution gives the probabilities of all possible values of the random variable: Weather is one of <sunny, rain, cloudy, snow>, and P(Weather) = <0.72, 0.1, 0.08, 0.1> (normalized, i.e., sums to 1).

Joint probability distribution. A joint probability distribution for a set of variables gives a value for each possible assignment to all the variables. P(Toothache, Cavity) is a 2 by 2 table:

                  Cavity=true   Cavity=false
Toothache=true       0.04          0.01
Toothache=false      0.06          0.89

NOTE: the entries in the table sum to 1, so there are only 3 independent numbers. P(Weather, Cavity) is a 4 by 2 table of values, with one row per value Weather = sunny, rain, cloudy, snow and one column per value Cavity = true, false (values not shown).
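As a concrete illustration (not part of the original slides), the joint table above can be stored directly as a small lookup structure. A minimal Python sketch, with variable names of our own choosing:

```python
# Full joint distribution P(Toothache, Cavity) from the table above,
# keyed by (toothache, cavity) truth values.
joint = {
    (True,  True):  0.04,
    (True,  False): 0.01,
    (False, True):  0.06,
    (False, False): 0.89,
}

# The four entries sum to 1, so only 3 of them are independent numbers.
assert abs(sum(joint.values()) - 1.0) < 1e-9
```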

Conditional Probabilities. Conditional or posterior probabilities, e.g., P(Cavity | Toothache) = 0.8: what is the probability of having a cavity given that the patient has a toothache? Definition of conditional probability: P(A | B) = P(A, B) / P(B), if P(B) ≠ 0. The product rule gives an alternative formulation: P(A, B) = P(A | B) P(B) = P(B | A) P(A).
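A quick numeric check of the product rule, using only numbers already given in these slides (the variable names are ours):

```python
# Product rule: P(Cavity, Toothache) = P(Cavity | Toothache) P(Toothache)
p_toothache = 0.04 + 0.01            # P(Toothache), summed from the joint table
p_cavity_given_toothache = 0.8       # P(Cavity | Toothache), from this slide
p_cavity_and_toothache = p_cavity_given_toothache * p_toothache
print(p_cavity_and_toothache)        # 0.04 (up to floating point), the joint-table entry
```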

Bayes' Rule. From the product rule P(A, B) = P(A | B) P(B) = P(B | A) P(A), we can obtain Bayes' rule: P(A | B) = P(B | A) P(A) / P(B).

Bayes' Rule: Example. P(Cause | Effect) = P(Effect | Cause) P(Cause) / P(Effect). Let M be meningitis and S be stiff neck: P(M) = 0.0001, P(S) = 0.1, P(S | M) = 0.8. Then P(M | S) = P(S | M) P(M) / P(S) = (0.8 × 0.0001) / 0.1 = 0.0008. Note: the posterior probability of meningitis is still very small!
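The same calculation in a few lines of Python (a sketch; the names are ours, not from the lecture):

```python
# Bayes' rule on the meningitis / stiff-neck numbers from the slide.
p_m = 0.0001        # P(M): prior probability of meningitis
p_s = 0.1           # P(S): prior probability of stiff neck
p_s_given_m = 0.8   # P(S | M)

p_m_given_s = p_s_given_m * p_m / p_s   # P(M | S) = P(S | M) P(M) / P(S)
print(p_m_given_s)                      # 0.0008 -- still very small
```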

A generalized Bayes' Rule. A more general version, conditionalized on some background evidence E: P(A | B, E) = P(B | A, E) P(A | E) / P(B | E).

Using the full joint distribution.

                  Cavity=true   Cavity=false
Toothache=true       0.04          0.01
Toothache=false      0.06          0.89

What is the unconditional probability of having a cavity? P(Cavity) = P(Cavity ^ Toothache) + P(Cavity ^ ~Toothache) = 0.04 + 0.06 = 0.1. What is the probability of having either a cavity or a toothache? P(Cavity v Toothache) = P(Cavity, Toothache) + P(Cavity, ~Toothache) + P(~Cavity, Toothache) = 0.04 + 0.06 + 0.01 = 0.11.

Using the full joint distribution.

                  Cavity=true   Cavity=false
Toothache=true       0.04          0.01
Toothache=false      0.06          0.89

What is the probability of having a cavity given that you already have a toothache? P(Cavity | Toothache) = P(Cavity ^ Toothache) / P(Toothache) = 0.04 / (0.04 + 0.01) = 0.8.
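These queries against the full joint can be mechanized as below (a minimal sketch; the helper p and its predicates are our own, not from the lecture):

```python
# Same joint table as above, keyed by (toothache, cavity).
joint = {(True, True): 0.04, (True, False): 0.01,
         (False, True): 0.06, (False, False): 0.89}

def p(event):
    """Sum the joint entries for which the predicate event(toothache, cavity) holds."""
    return sum(pr for (t, c), pr in joint.items() if event(t, c))

print(p(lambda t, c: c))          # P(Cavity)             = 0.04 + 0.06        = 0.10
print(p(lambda t, c: c or t))     # P(Cavity v Toothache) = 0.04 + 0.06 + 0.01 = 0.11
print(p(lambda t, c: c and t) / p(lambda t, c: t))   # P(Cavity | Toothache) = 0.04 / 0.05 = 0.8
```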

Normalization. Suppose we wish to compute a posterior distribution over random variable A given B=b, and suppose A has possible values a_1, ..., a_m. We can apply Bayes' rule for each value of A: P(A=a_1 | B=b) = P(B=b | A=a_1) P(A=a_1) / P(B=b), ..., P(A=a_m | B=b) = P(B=b | A=a_m) P(A=a_m) / P(B=b). Adding these up, and noting that Σ_i P(A=a_i | B=b) = 1, we get P(B=b) = Σ_i P(B=b | A=a_i) P(A=a_i). This is the normalization factor, denoted α = 1/P(B=b): P(A | B=b) = α P(B=b | A) P(A). Typically we compute an unnormalized distribution and normalize at the end, e.g., suppose P(B=b | A) P(A) = <0.4, 0.2, 0.2>; then P(A | B=b) = α <0.4, 0.2, 0.2> = <0.4, 0.2, 0.2> / (0.4 + 0.2 + 0.2) = <0.5, 0.25, 0.25>.
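The <0.4, 0.2, 0.2> example can be normalized mechanically; a short sketch with our own variable names:

```python
# Normalize an unnormalized posterior vector P(B=b | A) P(A).
unnormalized = [0.4, 0.2, 0.2]           # the example vector from the slide
alpha = 1.0 / sum(unnormalized)          # alpha = 1 / P(B=b)
posterior = [alpha * x for x in unnormalized]
print(posterior)                         # [0.5, 0.25, 0.25] (up to floating point)
```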

Marginalization. Given a conditional distribution P(X | Z), we can create the unconditional distribution P(X) by marginalization: P(X) = Σ_z P(X | Z=z) P(Z=z) = Σ_z P(X, Z=z). In general, given a joint distribution over a set of variables, the distribution over any subset (called a marginal distribution for historical reasons) can be calculated by summing out the other variables.
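A small marginalization sketch; the numbers below are made up purely for illustration and do not come from the lecture:

```python
# P(X) = sum_z P(X | Z=z) P(Z=z), with illustrative (made-up) numbers.
p_z = {'z1': 0.3, 'z2': 0.7}                      # P(Z)
p_x_given_z = {'z1': {'x1': 0.9, 'x2': 0.1},      # P(X | Z=z1)
               'z2': {'x1': 0.2, 'x2': 0.8}}      # P(X | Z=z2)

# Summing out Z leaves an unconditional distribution over X.
p_x = {x: sum(p_x_given_z[z][x] * p_z[z] for z in p_z) for x in ('x1', 'x2')}
print(p_x)    # {'x1': 0.41, 'x2': 0.59}
```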

Conditioning. Introducing a variable as an extra condition: P(X | Y) = Σ_z P(X | Y, Z=z) P(Z=z | Y). Intuition: it is often easier to assess each specific circumstance, e.g., P(RunOver | Cross) = P(RunOver | Cross, Light=green) P(Light=green | Cross) + P(RunOver | Cross, Light=yellow) P(Light=yellow | Cross) + P(RunOver | Cross, Light=red) P(Light=red | Cross).

Absolute Independence. Two random variables A and B are (absolutely) independent iff P(A, B) = P(A) P(B). Using the product rule, for A and B independent we can show P(A, B) = P(A | B) P(B) = P(A) P(B), and therefore P(A | B) = P(A). If n Boolean variables are independent, the full joint is P(X_1, ..., X_n) = Π_i P(X_i). The full joint is generally specified by 2^n - 1 numbers, but when the variables are independent only n numbers are needed. Absolute independence is a very strong requirement, seldom met!
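With n independent Boolean variables, n numbers determine the whole joint; a sketch with made-up marginals:

```python
from itertools import product

# Illustrative P(X_i = true) for n = 3 independent Boolean variables (made-up values).
p_true = [0.1, 0.3, 0.5]

def joint_prob(assignment):
    """P(X_1=x_1, ..., X_n=x_n) = prod_i P(X_i=x_i), assuming full independence."""
    prob = 1.0
    for p_i, x_i in zip(p_true, assignment):
        prob *= p_i if x_i else 1.0 - p_i
    return prob

# n = 3 stored numbers generate all 2**n joint entries; a general joint needs 2**n - 1.
total = sum(joint_prob(a) for a in product([True, False], repeat=len(p_true)))
print(total)   # 1.0
```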

Conditional Independence. Some evidence may be irrelevant, allowing simplification, e.g., P(Cavity | Toothache, CubsWin) = P(Cavity | Toothache) = 0.8. This property is known as conditional independence and can be expressed as P(X | Y, Z) = P(X | Z), which says that X and Y are independent given Z. If I have a cavity, the probability that the probe catches in it doesn't depend on whether I have a toothache: 1. P(Catch | Toothache, Cavity) = P(Catch | Cavity), i.e., Catch is conditionally independent of Toothache given Cavity. The same independence holds if I haven't got a cavity: 2. P(Catch | Toothache, ~Cavity) = P(Catch | ~Cavity).

Conditional independence contd. Equivalent statements to P(Catch | Toothache, Cavity) = P(Catch | Cavity) (*):

1.a P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
    P(Toothache | Catch, Cavity) = P(Catch | Toothache, Cavity) P(Toothache | Cavity) / P(Catch | Cavity)
                                 = P(Catch | Cavity) P(Toothache | Cavity) / P(Catch | Cavity)   (from *)
                                 = P(Toothache | Cavity)

1.b P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)
    P(Toothache, Catch | Cavity) = P(Toothache | Catch, Cavity) P(Catch | Cavity)   (product rule)
                                 = P(Toothache | Cavity) P(Catch | Cavity)   (from 1.a)

Using Conditional Independence. The full joint distribution can now be written as P(Toothache, Catch, Cavity) = P(Toothache, Catch | Cavity) P(Cavity) = P(Toothache | Cavity) P(Catch | Cavity) P(Cavity) (from 1.b). This is specified by 2 + 2 + 1 = 5 independent numbers, compared to 7 for the general joint or 3 for unconditionally independent variables.
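A sketch of building the full joint from these 5 numbers. Only P(Cavity) = 0.1 comes from the slides; the four conditional values below are illustrative placeholders:

```python
# Factored form: P(Toothache, Catch, Cavity) = P(Toothache|Cavity) P(Catch|Cavity) P(Cavity)
p_cavity = 0.1                                   # from the earlier slides
p_toothache_given = {True: 0.4, False: 0.05}     # P(Toothache | Cavity), P(Toothache | ~Cavity) -- placeholders
p_catch_given = {True: 0.9, False: 0.2}          # P(Catch | Cavity), P(Catch | ~Cavity) -- placeholders

def joint(toothache, catch, cavity):
    """Recover any joint entry from the five stored numbers."""
    p_c = p_cavity if cavity else 1 - p_cavity
    p_t = p_toothache_given[cavity] if toothache else 1 - p_toothache_given[cavity]
    p_k = p_catch_given[cavity] if catch else 1 - p_catch_given[cavity]
    return p_t * p_k * p_c

# Five stored numbers generate all 2**3 = 8 joint entries, which still sum to 1.
assert abs(sum(joint(t, k, c) for t in (True, False)
                              for k in (True, False)
                              for c in (True, False)) - 1.0) < 1e-9
```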

Belief (Bayes) Networks. A simple, graphical notation for conditional independence assertions, and hence for compact specification of full joint distributions. Syntax: 1. a set of nodes, one per random variable; 2. links mean a parent directly influences its child; 3. a directed, acyclic graph; 4. a conditional distribution (a table) for each node given its parents: P(X_i | Parents(X_i)). In the simplest case, the conditional distribution is represented as a conditional probability table (CPT).

A two-node net and its conditional probability table. Node A has no parents, so it is described by the unconditional probability P(A); P(~A) is given by 1 - P(A). Node B is conditionally dependent on A. It is described by four numbers: P(B | A), P(B | ~A), P(~B | A), and P(~B | ~A). This can be expressed as a 2 by 2 conditional probability table (CPT). But P(~B | A) = 1 - P(B | A) and P(~B | ~A) = 1 - P(B | ~A), so there are only two independent numbers in the CPT.
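A minimal encoding of this two-node network, storing only the independent numbers (the probability values are illustrative, not from the lecture):

```python
# Network A -> B: P(A) plus a CPT giving P(B | A) and P(B | ~A).
p_a = 0.3                            # P(A); P(~A) = 1 - p_a
cpt_b = {True: 0.8, False: 0.1}      # P(B | A=true), P(B | A=false); P(~B | a) = 1 - cpt_b[a]

def p_joint(a, b):
    """Joint P(A=a, B=b) recovered from the network parameters."""
    pa = p_a if a else 1 - p_a
    pb = cpt_b[a] if b else 1 - cpt_b[a]
    return pa * pb

print(sum(p_joint(a, b) for a in (True, False) for b in (True, False)))   # 1.0
```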

Example. I'm at work, neighbor John calls to say my alarm is ringing, but neighbor Mary doesn't call. Sometimes it's set off by minor earthquakes. Is there a burglar? Variables: Burglary, Earthquake, Alarm, JohnCalls, MaryCalls. Network topology reflects "causal" knowledge:
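The network figure itself is not reproduced in this transcript; the sketch below encodes its topology, with CPT values taken from the standard version of this example in Russell & Norvig (Figure 14.2), since no numbers are given here:

```python
# Burglary network: node -> (parents, CPT mapping parent values to P(node = true)).
network = {
    'Burglary':   ([], {(): 0.001}),
    'Earthquake': ([], {(): 0.002}),
    'Alarm':      (['Burglary', 'Earthquake'],
                   {(True, True): 0.95, (True, False): 0.94,
                    (False, True): 0.29, (False, False): 0.001}),
    'JohnCalls':  (['Alarm'], {(True,): 0.90, (False,): 0.05}),
    'MaryCalls':  (['Alarm'], {(True,): 0.70, (False,): 0.01}),
}
```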

Uncertainty and Belief Networks Introduction to Artificial Intelligence CS 151 Lecture 2 continued April 2, 2004