CS 730/730W/830: Intro AI


CS 730/730W/830: Intro AI
1 handout: slides
Final blog entries were due.
Wheeler Ruml (UNH), Lecture 27, CS 730, 1 / 15

Reminder

The Alarm Domain

Bayes Nets Reminder

In general (chain rule):

  P(x_1, ..., x_n) = P(x_n | x_{n-1}, ..., x_1) P(x_{n-1}, ..., x_1)
                   = ∏_{i=1}^{n} P(x_i | x_{i-1}, ..., x_1)

A Bayes net specifies the independence assumption

  P(X_i | X_{i-1}, ..., X_1) = P(X_i | parents(X_i))

so the joint distribution factors as

  P(x_1, ..., x_n) = ∏_{i=1}^{n} P(x_i | parents(x_i))

What is the distribution of X given evidence e and unobserved Y?
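The factored joint can be written down directly in code. A minimal sketch using the lecture's alarm network, with the usual textbook CPT values (the specific numbers and variable names are assumptions for illustration):

```python
# Joint probability as a product of CPT entries, for the alarm network.
# CPT values below are the usual textbook numbers (assumed for illustration).

P_B = 0.001                      # P(Burglary = true)
P_E = 0.002                      # P(Earthquake = true)
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(Alarm = true | B, E)
P_J = {True: 0.90, False: 0.05}  # P(JohnCalls = true | Alarm)
P_M = {True: 0.70, False: 0.01}  # P(MaryCalls = true | Alarm)

def p(prob_true, value):
    """P(X = value) from P(X = true)."""
    return prob_true if value else 1.0 - prob_true

def joint(b, e, a, j, m):
    """P(b, e, a, j, m) = prod_i P(x_i | parents(x_i))."""
    return (p(P_B, b) * p(P_E, e) * p(P_A[(b, e)], a)
            * p(P_J[a], j) * p(P_M[a], m))

# e.g. both neighbors call, alarm on, no burglary or earthquake:
print(joint(False, False, True, True, True))   # ≈ 0.000628
```

Each world's probability needs only one lookup per variable, which is what makes the factored representation so compact.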

Approximate Inference

(basic sampling, rejection sampling, likelihood weighting, break)

Sampling According to the Joint Distribution

- sample values for the variables, working top down
- directly implements the semantics of the network
- a generative model
- each sample takes linear time

Rejection Sampling

What is the distribution of X given evidence e and unobserved Y?

Draw worlds from the joint, rejecting those that do not match e. Look at the distribution of X.

- each sample takes linear time, but overall the method is slow if e is unlikely
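Rejection sampling as described can be sketched for the query P(B | j, m) on the alarm network (CPT values again assumed from the running example):

```python
# Rejection sampling for P(B = true | JohnCalls, MaryCalls) on the alarm
# network (CPT values assumed). Worlds inconsistent with the evidence
# are simply thrown away, which wastes most samples here.
import random

P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def prior_sample(rng):
    b = rng.random() < P_B
    e = rng.random() < P_E
    a = rng.random() < P_A[(b, e)]
    return b, e, a, rng.random() < P_J[a], rng.random() < P_M[a]

def rejection_sample_B(n, rng=None):
    """Estimate P(B = true | j, m) from n attempted samples."""
    rng = rng or random.Random(0)
    kept = hits = 0
    for _ in range(n):
        b, _, _, j, m = prior_sample(rng)
        if j and m:              # reject worlds that contradict e
            kept += 1
            hits += b
    return hits / kept if kept else float('nan')

print(rejection_sample_B(500000))   # noisy estimate; exact posterior ≈ 0.284
```

Since P(j, m) is only about 0.002 here, roughly 499 of every 500 samples are rejected, which is exactly the slowness the slide warns about when e is unlikely.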

Likelihood Weighting

What is the distribution of X given evidence e and unobserved Y?

ChooseSample(e):
  w ← 1
  for each variable V_i in topological order:
    if (V_i = v_i) ∈ e then
      w ← w × P(v_i | parents(V_i))
    else
      v_i ← sample from P(V_i | parents(V_i))

(afterwards, normalize the samples so that all the w's sum to 1)

- uses all samples, but needs lots of samples if e comes late in the ordering
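The ChooseSample pseudocode translates to roughly the following sketch; the alarm-network CPT values and the helper names are assumptions for illustration:

```python
# Likelihood weighting, following the ChooseSample pseudocode: evidence
# variables are clamped and contribute P(v | parents) to the weight w;
# the other variables are sampled. Alarm-network CPT values assumed.
import random

P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

ORDER = [('B', lambda v: P_B),
         ('E', lambda v: P_E),
         ('A', lambda v: P_A[(v['B'], v['E'])]),
         ('J', lambda v: P_J[v['A']]),
         ('M', lambda v: P_M[v['A']])]   # topological order, CPT lookups

def weighted_sample(evidence, rng):
    """One sample consistent with the evidence, plus its weight w."""
    w, vals = 1.0, {}
    for name, cpt in ORDER:
        p_true = cpt(vals)
        if name in evidence:
            vals[name] = evidence[name]
            w *= p_true if vals[name] else 1.0 - p_true
        else:
            vals[name] = rng.random() < p_true
    return vals, w

def likelihood_weighting_B(n, rng=None):
    """Weighted estimate of P(B = true | J = true, M = true)."""
    rng = rng or random.Random(0)
    num = den = 0.0
    for _ in range(n):
        vals, w = weighted_sample({'J': True, 'M': True}, rng)
        den += w
        if vals['B']:
            num += w
    return num / den

print(likelihood_weighting_B(200000))   # noisy estimate; exact ≈ 0.284
```

Every sample counts toward the estimate, but because the evidence J and M come last in the ordering, the weights vary widely and many samples are still needed, which is the caveat on the slide.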

Break

- exam 3: calculator; review session May 4
- projects

Exact Inference

(enumeration, variable elimination, EOLQs)

Enumeration Over the Joint Distribution

What is the distribution of X given evidence e and unobserved Y?

  P(X | e) = P(e | X) P(X) / P(e)
           = α P(X, e)
           = α Σ_y P(X, e, y)
           = α Σ_y ∏_{i=1}^{n} P(V_i | parents(V_i))

Example

  P(B | j, m) = P(j, m | B) P(B) / P(j, m)
              = α P(B, j, m)
              = α Σ_e Σ_a P(B, e, a, j, m)
              = α Σ_e Σ_a ∏_{i=1}^{n} P(V_i | parents(V_i))

  P(b | j, m) = α Σ_e Σ_a P(b) P(e) P(a | b, e) P(j | a) P(m | a)
              = α P(b) Σ_e P(e) Σ_a P(a | b, e) P(j | a) P(m | a)

[draw tree]
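The example can be checked numerically. A sketch of enumeration over the hidden variables E and A, with the usual textbook CPT values assumed:

```python
# Inference by enumeration for P(B | j, m): sum the factored joint over
# the hidden variables E and A, then normalize. Alarm-network CPT
# values are the usual textbook numbers (assumed).
from itertools import product

P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def p(pt, v):
    return pt if v else 1.0 - pt

def posterior_B(j, m):
    """Return P(B = true | j, m) by enumerating e and a."""
    unnorm = {}
    for b in (True, False):
        unnorm[b] = sum(p(P_B, b) * p(P_E, e) * p(P_A[(b, e)], a)
                        * p(P_J[a], j) * p(P_M[a], m)
                        for e, a in product((True, False), repeat=2))
    alpha = 1.0 / (unnorm[True] + unnorm[False])   # the α above
    return alpha * unnorm[True]

print(posterior_B(True, True))   # ≈ 0.284
```

Pulling P(b) and P(e) outside the sums, as the last line of the slide does, changes only the amount of repeated work, not the answer.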

Variable Elimination

  P(B | j, m) = α P(B) Σ_e P(e) Σ_a P(a | B, e) P(j | a) P(m | a)

- factors = tables = f_{vars used}(dimensions), e.g. f_A(A, B, E), f_M(A)
- multiplying factors gives a table over the union of their variables
- summing out a variable reduces the table
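The two table operations can be sketched like this; representing a factor as a tuple of variable names plus a dict from boolean assignments to numbers is one convenient choice, not the only one:

```python
# The two table operations used by variable elimination: pointwise
# product and summing a variable out. A Factor here is (variable names,
# dict from boolean assignment tuples to numbers) -- an assumed design.
from itertools import product

class Factor:
    def __init__(self, variables, table):
        self.variables = variables   # tuple of names, e.g. ('A', 'B')
        self.table = table           # {(True, False, ...): probability}

def multiply(f, g):
    """Pointwise product: the result ranges over the union of variables."""
    merged = f.variables + tuple(v for v in g.variables
                                 if v not in f.variables)
    table = {}
    for assign in product((True, False), repeat=len(merged)):
        env = dict(zip(merged, assign))
        table[assign] = (f.table[tuple(env[v] for v in f.variables)]
                         * g.table[tuple(env[v] for v in g.variables)])
    return Factor(merged, table)

def sum_out(var, f):
    """Eliminate var by summing, shrinking the table by one dimension."""
    keep = tuple(v for v in f.variables if v != var)
    table = {}
    for assign, val in f.table.items():
        key = tuple(a for v, a in zip(f.variables, assign) if v != var)
        table[key] = table.get(key, 0.0) + val
    return Factor(keep, table)

# e.g. multiply f_J(A) and f_M(A), then sum A out:
f_J = Factor(('A',), {(True,): 0.90, (False,): 0.05})
f_M = Factor(('A',), {(True,): 0.70, (False,): 0.01})
fm = multiply(f_J, f_M)
print(sum_out('A', fm).table[()])   # 0.9*0.7 + 0.05*0.01 ≈ 0.6305
```

Multiplication grows the table toward the union of the variables involved, and summation shrinks it again; variable elimination interleaves the two so intermediate tables stay as small as possible.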

Variable Elimination

Eliminating variables, e.g. for P(J | b):

  P(J | b) = α P(b) Σ_e P(e) Σ_a P(a | b, e) P(J | a) Σ_m P(m | a)

Since Σ_m P(m | a) = 1, the variable M can be dropped entirely. In general, all variables that are not ancestors of the query or evidence variables are irrelevant!
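A quick numeric check of the claim: since Σ_m P(m | a) = 1, including or dropping M leaves P(J | b) unchanged (alarm-network CPT values assumed as before):

```python
# Numeric check that a non-ancestor leaf like M is irrelevant to
# P(J | b): its CPT sums to 1 over its values, so including it or
# dropping it gives the same posterior. CPT values assumed as before.
from itertools import product

P_E = 0.002
P_A_given_b = {True: 0.95, False: 0.94}   # P(Alarm = true | b, E)
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def p(pt, v):
    return pt if v else 1.0 - pt

def pj_given_b(include_m):
    """P(J = true | b), optionally summing over M explicitly."""
    unnorm = {}
    for jv in (True, False):
        total = 0.0
        for e, a in product((True, False), repeat=2):
            term = p(P_E, e) * p(P_A_given_b[e], a) * p(P_J[a], jv)
            if include_m:
                term *= sum(p(P_M[a], mv) for mv in (True, False))  # = 1
            total += term
        unnorm[jv] = total
    return unnorm[True] / (unnorm[True] + unnorm[False])

print(pj_given_b(True), pj_given_b(False))   # identical values
```

The constant P(b) is omitted because it cancels in the normalization, just as α absorbs it on the slide.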

EOLQs

- What question didn't you get to ask today?
- What's still confusing?
- What would you like to hear more about?

Please write down your most pressing question about AI and put it in the box on your way out. Thanks!