Another look at Bayesian inference


Another look at Bayesian inference. A general scenario:
- Query variables: X
- Evidence (observed) variables and their values: E = e
- Unobserved variables: Y

Inference problem: answer questions about the query variables given the evidence variables. This can be done using the posterior distribution P(X | E = e), which in turn needs to be derived from the full joint P(X, E, Y):

P(X | E = e) = P(X, e) / P(e) = α Σ_y P(X, e, y)

Bayesian networks are a tool for representing joint probability distributions efficiently.
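As a concrete illustration, here is a minimal sketch of inference by enumeration over an explicit joint table. The sketch is not from the slides: the three Boolean variables and all of the probabilities are invented for illustration.

```python
# Inference by enumeration: P(X | E = e) = alpha * sum_y P(X, e, y).
# Toy joint P(X, E, Y) over three Boolean variables (numbers invented).
joint = {
    (True, True, True): 0.12, (True, True, False): 0.15,
    (True, False, True): 0.05, (True, False, False): 0.10,
    (False, True, True): 0.20, (False, True, False): 0.05,
    (False, False, True): 0.15, (False, False, False): 0.18,
}

def posterior_x_given_e(e):
    # Sum out the hidden variable Y to get the unnormalized posterior,
    # then normalize: alpha = 1 / P(e).
    unnorm = {x: sum(joint[(x, e, y)] for y in (True, False))
              for x in (True, False)}
    alpha = 1.0 / sum(unnorm.values())
    return {x: alpha * p for x, p in unnorm.items()}

print(posterior_x_given_e(True))  # {True: ~0.52, False: ~0.48}
```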

Bayesian networks. More commonly called graphical models. A way to depict conditional independence relationships between random variables, and a compact specification of full joint distributions.

Bayesian networks: Structure. Nodes: random variables. Arcs: interactions; an arrow from one variable to another indicates direct influence. The arcs must form a directed, acyclic graph.

Example: N independent coin flips. Complete independence: no interactions, so the network is just the isolated nodes X_1, X_2, ..., X_n with no arcs.
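For instance (a toy sketch, with invented per-coin biases), complete independence means the joint probability of any outcome is just a product of marginals:

```python
# N independent coin flips: P(x_1, ..., x_n) = prod_i P(x_i).
from functools import reduce

p_heads = [0.5, 0.6, 0.7]       # hypothetical biases for three coins
flips   = [True, False, True]   # one joint outcome (True = heads)

probs = [p if f else 1.0 - p for p, f in zip(p_heads, flips)]
print(reduce(lambda a, b: a * b, probs, 1.0))  # 0.5 * 0.4 * 0.7 = 0.14
```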

Example: Naïve Bayes document model. Random variables: X, the document class, and W_1, ..., W_n, the words in the document. The network has a single arc from X to each word node W_i.
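This topology makes the joint factor as P(X, W_1, ..., W_n) = P(X) Π_i P(W_i | X). Below is a minimal sketch of using that factorization to classify a document; the classes, vocabulary, and all probabilities are invented:

```python
import math

# Naive Bayes: P(x, w_1..w_n) = P(x) * prod_i P(w_i | x).
prior = {"sports": 0.5, "politics": 0.5}            # hypothetical P(X)
word_given_class = {                                # hypothetical P(W | X)
    "sports":   {"ball": 0.4, "vote": 0.1, "game": 0.5},
    "politics": {"ball": 0.1, "vote": 0.6, "game": 0.3},
}

def log_joint(cls, words):
    # log P(cls) + sum_i log P(w_i | cls)
    return math.log(prior[cls]) + sum(
        math.log(word_given_class[cls][w]) for w in words)

# Posterior over the class given the observed words, by normalization.
words = ["ball", "game", "game"]
scores = {c: math.exp(log_joint(c, words)) for c in prior}
z = sum(scores.values())
print({c: s / z for c, s in scores.items()})  # favors "sports"
```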

Example: Burglar Alarm. I have a burglar alarm that is sometimes set off by minor earthquakes. My two neighbors, John and Mary, promised to call me at work if they hear the alarm. Example inference tasks:
- Suppose Mary calls and John doesn't call. What is the probability of a burglary?
- Suppose there is a burglary and no earthquake. What is the probability of John calling?
- Suppose the alarm went off. What is the probability of a burglary?

Example: Burglar Alarm. I have a burglar alarm that is sometimes set off by minor earthquakes. My two neighbors, John and Mary, promised to call me at work if they hear the alarm. What are the random variables? Burglary, Earthquake, Alarm, John, Mary. What are the direct influence relationships? A burglar can set the alarm off, an earthquake can set the alarm off, the alarm can cause Mary to call, and the alarm can cause John to call. The resulting arcs are sketched in code below.
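Those influence relationships fix the arcs of the network. Here is a minimal sketch of the structure as a parent map, with an acyclicity check (the dictionary representation is mine, not the slides'):

```python
# Burglary network structure: each node maps to the list of its parents.
parents = {
    "Burglary":   [],
    "Earthquake": [],
    "Alarm":      ["Burglary", "Earthquake"],
    "John":       ["Alarm"],
    "Mary":       ["Alarm"],
}

def is_acyclic(parents):
    """Depth-first search over parent links; a back edge means a cycle."""
    visiting, done = set(), set()
    def visit(v):
        if v in done:
            return True
        if v in visiting:
            return False
        visiting.add(v)
        ok = all(visit(p) for p in parents[v])
        visiting.discard(v)
        done.add(v)
        return ok
    return all(visit(v) for v in parents)

print(is_acyclic(parents))  # True: a valid DAG
```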

Example: Burglar Alarm

Conditional independence relationships.
- Suppose the alarm went off. Does knowing whether there was a burglary change the probability of John calling? No: P(John | Alarm, Burglary) = P(John | Alarm).
- Suppose the alarm went off. Does knowing whether John called change the probability of Mary calling? No: P(Mary | Alarm, John) = P(Mary | Alarm).
- Suppose the alarm went off. Does knowing whether there was an earthquake change the probability of burglary? Yes: P(Burglary | Alarm, Earthquake) ≠ P(Burglary | Alarm).
- Suppose there was a burglary. Does knowing whether John called change the probability that the alarm went off? Yes: P(Alarm | Burglary, John) ≠ P(Alarm | Burglary).

Conditional independence relationships.
- John and Mary are conditionally independent of Burglary and Earthquake given Alarm: children are conditionally independent of ancestors given their parents.
- John and Mary are conditionally independent of each other given Alarm: siblings are conditionally independent of each other given their parents.
- Burglary and Earthquake are not conditionally independent of each other given Alarm: parents are not conditionally independent given children.
- Alarm is not conditionally independent of John and Mary given Burglary and Earthquake: nodes are not conditionally independent of their children given their parents.
General rule: each node is conditionally independent of its non-descendants given its parents.

Conditional independence and the joint distribution. General rule: each node is conditionally independent of its non-descendants given its parents. Suppose the nodes X_1, ..., X_n are sorted in topological order (parents before children). To get the joint distribution P(X_1, ..., X_n), use the chain rule:

P(X_1, ..., X_n) = Π_{i=1..n} P(X_i | X_1, ..., X_{i-1}) = Π_{i=1..n} P(X_i | Parents(X_i))

Conditional probability distributions. To specify the full joint distribution, we need to specify a conditional distribution for each node given its parents: P(X | Parents(X)). For a node X with parents Z_1, ..., Z_n, this is the table P(X | Z_1, ..., Z_n).
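For Boolean variables, such a table can be stored as a map from parent assignments to P(node = true | parents). Here is a sketch for the burglary network; the slide shows its tables only as a figure, so the numbers below are the standard AIMA parameterization, assumed rather than read off the slide:

```python
# CPTs for the burglary network: parent assignment -> P(node = True | parents).
# Numbers are the usual AIMA values (an assumption; the slide's figure governs).
cpt = {
    "Burglary":   {(): 0.001},
    "Earthquake": {(): 0.002},
    "Alarm": {  # keyed by (Burglary, Earthquake)
        (True, True): 0.95, (True, False): 0.94,
        (False, True): 0.29, (False, False): 0.001,
    },
    "John": {(True,): 0.90, (False,): 0.05},  # keyed by (Alarm,)
    "Mary": {(True,): 0.70, (False,): 0.01},  # keyed by (Alarm,)
}

def p(node, value, parent_values):
    """P(node = value | parents = parent_values) for Boolean variables."""
    pt = cpt[node][tuple(parent_values)]
    return pt if value else 1.0 - pt
```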

Example: Burglar Alarm. The conditional probability tables are the model parameters.

The joint probability distribution:

P(X_1, ..., X_n) = Π_{i=1..n} P(X_i | Parents(X_i))

For example, P(j, m, a, ¬b, ¬e) = P(¬b) P(¬e) P(a | ¬b, ¬e) P(j | a) P(m | a).
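Continuing the sketch above (same assumed numbers), this product can be evaluated term by term:

```python
# P(j, m, a, not-b, not-e) as a product of CPT entries, topological order.
# Uses cpt/p from the previous sketch; numbers are assumed AIMA values.
joint = (p("Burglary", False, ())
         * p("Earthquake", False, ())
         * p("Alarm", True, (False, False))
         * p("John", True, (True,))
         * p("Mary", True, (True,)))
print(joint)  # 0.999 * 0.998 * 0.001 * 0.90 * 0.70 ≈ 0.000628
```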

Conditional independence. General rule: X is conditionally independent of every non-descendant node given its parents. Example: causal chain X → Y → Z. Are X and Z independent? Not in general. Is Z independent of X given Y? Yes.

Conditional independence. Common cause, X ← Y → Z: are X and Z independent? No. Are they conditionally independent given Y? Yes. Common effect, X → Y ← Z: are X and Z independent? Yes. Are they conditionally independent given Y? No: observing the common effect makes its causes compete to explain it.
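A small numeric check of the common-effect case, with X → Y ← Z and all probabilities invented:

```python
# Common effect: X -> Y <- Z, with independent causes X and Z.
p_x, p_z = 0.5, 0.5                       # hypothetical priors
p_y_given = {(True, True): 0.9, (True, False): 0.6,
             (False, True): 0.6, (False, False): 0.1}

def joint(x, y, z):
    px = p_x if x else 1 - p_x
    pz = p_z if z else 1 - p_z
    py = p_y_given[(x, z)] if y else 1 - p_y_given[(x, z)]
    return px * pz * py

# Marginally independent: P(x, z) = P(x) P(z) after summing out Y.
pxz = sum(joint(True, y, True) for y in (True, False))
print(pxz, p_x * p_z)                     # 0.25 0.25

# But not independent given Y: P(x | y, z) changes with z.
def p_x_given(y, z):
    num = joint(True, y, z)
    return num / (num + joint(False, y, z))

print(p_x_given(True, True), p_x_given(True, False))  # 0.6 vs ~0.857
```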

Compactness. Suppose we have a Boolean variable X_i with k Boolean parents. How many rows does its conditional probability table have? 2^k rows, one for each combination of parent values. Each row requires one number for P(X_i = true | parent values). If each variable has no more than k parents, how many numbers does the complete network require? O(n 2^k) numbers, vs. O(2^n) for the full joint distribution. How many numbers for the burglary network? 1 + 1 + 4 + 2 + 2 = 10 numbers (vs. 2^5 − 1 = 31): one each for Burglary and Earthquake (no parents), four for Alarm (two parents), and two each for John and Mary (one parent each).
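The same count can be computed from the parent map sketched earlier:

```python
# One number per combination of Boolean parent values, per node.
def num_parameters(parents):
    return sum(2 ** len(ps) for ps in parents.values())

print(num_parameters(parents))  # 1 + 1 + 4 + 2 + 2 = 10
print(2 ** len(parents) - 1)    # 31 numbers for the full joint over 5 vars
```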

Constructing Bayesian networks.
1. Choose an ordering of variables X_1, ..., X_n.
2. For i = 1 to n: add X_i to the network, and select its parents from X_1, ..., X_{i-1} such that P(X_i | Parents(X_i)) = P(X_i | X_1, ..., X_{i-1}).
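A schematic sketch of this loop. The independence test is left as an oracle because in practice it comes from domain knowledge or data; everything here is illustrative rather than the slides' own code:

```python
# Schematic network construction over a chosen variable ordering.
def build_network(order, is_independent):
    """is_independent(x, y, given) should answer whether
    P(x | given) equals P(x | given plus y); supplied by the modeler."""
    parents = {}
    for i, x in enumerate(order):
        predecessors = order[:i]
        # keep only the predecessors X_i actually depends on
        parents[x] = [y for y in predecessors
                      if not is_independent(x, y, predecessors)]
    return parents
```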

Example. Suppose we choose the ordering M, J, A, B, E. [A sequence of figures adds each variable in turn and asks which of the already-added variables must be its parents.]

Example contd. Deciding conditional independence is hard in noncausal directions, and the causal direction seems much more natural. The resulting network is also less compact: 1 + 2 + 4 + 2 + 4 = 13 numbers needed (M alone needs 1, J given M needs 2, A given J and M needs 4, B given A needs 2, and E given A and B needs 4), vs. 10 for the causal ordering.
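Reusing the num_parameters sketch from above, the two orderings can be compared directly; the non-causal parent sets below are the standard textbook ones for this ordering:

```python
# Parent map produced by the ordering M, J, A, B, E.
noncausal = {
    "Mary":       [],
    "John":       ["Mary"],
    "Alarm":      ["Mary", "John"],
    "Burglary":   ["Alarm"],
    "Earthquake": ["Alarm", "Burglary"],
}
print(num_parameters(parents), num_parameters(noncausal))  # 10 13
```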

A more realistic Bayes network: car diagnosis. Initial observation: the car won't start. [Figure legend: orange nodes are "broken, so fix it" nodes, green nodes are testable evidence, and gray nodes are hidden variables included to ensure sparse structure and reduce parameters.]

Car insurance

In research literature. Causal Protein-Signaling Networks Derived from Multiparameter Single-Cell Data. Karen Sachs, Omar Perez, Dana Pe'er, Douglas A. Lauffenburger, and Garry P. Nolan. Science 308 (5721), 523 (22 April 2005).

In research literature. Describing Visual Scenes Using Transformed Objects and Parts. E. Sudderth, A. Torralba, W. T. Freeman, and A. Willsky. International Journal of Computer Vision, No. 1-3, May 2008, pp. 291-330.

In research literature. [Figure: a model with separate audio and video observation streams.] Audiovisual Speech Recognition with Articulator Positions as Hidden Variables. Mark Hasegawa-Johnson, Karen Livescu, Partha Lal and Kate Saenko. International Congress on Phonetic Sciences 1719:299-302, 2007.

In research literature. Detecting interaction links in a collaborating group using manually annotated data. S. Mathur, M.S. Poole, F. Pena-Mora, M. Hasegawa-Johnson, N. Contractor. Social Networks, 10.1016/j.socnet.2012.04.002.

In research literature. Speaking: S_i = 1 if #i is speaking. Link: L_ij = 1 if #i is listening to #j. Neighborhood: N_ij = 1 if they are near one another. Gaze: G_ij = 1 if #i is looking at #j. Indirect: I_ij = 1 if #i and #j are both listening to the same person. Detecting interaction links in a collaborating group using manually annotated data. S. Mathur, M.S. Poole, F. Pena-Mora, M. Hasegawa-Johnson, N. Contractor. Social Networks, 10.1016/j.socnet.2012.04.002.

Summary. Bayesian networks provide a natural representation for (causally induced) conditional independence. Topology + conditional probability tables. Generally easy for domain experts to construct.