TDT70: Uncertainty in Artificial Intelligence. Chapter 1 and 2

Fundamentals of probability theory

The sample space is the set of possible outcomes of an experiment. A subset of the sample space is called an event. We say that an event A is true for an experiment if the outcome of the experiment is an element of A. To measure our degree of uncertainty about an experiment, we assign a probability P(A) to each event A over the sample space.
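
As a minimal illustration of these definitions (assuming a fair die, which is my own choice of example), the sample space, an event, and its probability can be written directly in Python:

```python
# Sample space of a die roll; an event is a subset of the sample space.
sample_space = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}  # the event "the outcome is even"

# Assuming a fair die, every outcome is equally likely, so
# P(A) = |A| / |sample_space|.
P_A = len(A) / len(sample_space)
print(P_A)  # 0.5
```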

Conditional probabilities

Conditional probability: P(A | B) = P(A, B) / P(B).
The fundamental rule: P(A, B) = P(A | B) P(B).
Bayes' rule: P(B | A) = P(A | B) P(B) / P(A).
The events A and B are independent if: P(A | B) = P(A), or equivalently P(A, B) = P(A) P(B).
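
A quick numerical sanity check of these rules, with probabilities invented purely for illustration (they are not from the slides):

```python
# Hypothetical probabilities for two events A and B.
P_A_and_B = 0.125
P_B = 0.25
P_A = 0.5

# Conditional probability: P(A | B) = P(A, B) / P(B)
P_A_given_B = P_A_and_B / P_B          # 0.5

# Bayes' rule: P(B | A) = P(A | B) * P(B) / P(A)
P_B_given_A = P_A_given_B * P_B / P_A  # 0.25

# Independence: A and B are independent iff P(A | B) = P(A).
print(P_A_given_B == P_A)  # True for these particular numbers
```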

Probability calculus for variables

A variable has a set of possible states, which form its sample space. A variable can be considered an experiment: for each outcome of the experiment, the variable has a corresponding state. For example, if D is a variable representing the outcome of rolling a die, then its state space is sp(D) = {1, 2, 3, 4, 5, 6}. For a variable A with states a_1, ..., a_n, we express our uncertainty about its state through a probability distribution P(A) over these states: P(A) = (x_1, ..., x_n), where x_i is the probability of A being in state a_i. The x_i are non-negative and sum to one.
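
As a small sketch, a distribution P(A) can be stored as a mapping from states to probabilities; here the uniform die distribution from the example above:

```python
# P(D) for a fair die, stored as a dict from state to probability.
P_D = {s: 1 / 6 for s in (1, 2, 3, 4, 5, 6)}

# A valid distribution has non-negative entries that sum to one.
assert all(x >= 0 for x in P_D.values())
assert abs(sum(P_D.values()) - 1.0) < 1e-9
```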

Joint probability tables and marginalization

From a joint probability table P(A, B), the probability distribution P(A) can be calculated by summing, for each state a_i of A, over the states of B that can occur together with a_i: P(a_i) = Σ_j P(a_i, b_j).

Joint probability table example:

        b1    b2    b3
  a1   0.16  0.12  0.12
  a2   0.24  0.28  0.08

Summing out B row by row gives P(A) = (0.40, 0.60).
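
The same marginalization can be done mechanically with NumPy; summing over an axis of the joint table sums out the corresponding variable:

```python
import numpy as np

# The joint probability table P(A, B) from the slide: rows are the
# states of A (a1, a2), columns are the states of B (b1, b2, b3).
P_AB = np.array([[0.16, 0.12, 0.12],
                 [0.24, 0.28, 0.08]])

# Marginalization: P(a_i) = sum_j P(a_i, b_j), i.e. sum out B (axis 1).
P_A = P_AB.sum(axis=1)
print(P_A)  # [0.4 0.6]

# Summing out A (axis 0) instead gives P(B).
P_B = P_AB.sum(axis=0)
print(P_B)  # [0.4 0.4 0.2]
```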

Causal Networks

A causal network is a directed graph whose nodes represent variables. A variable represents a set of possible states of affairs and is in exactly one of its states; which one may be unknown.

Causal networks, example

A way of structuring a situation for reasoning under uncertainty is to construct a graph representing causal relations between events.

(Slide figure: a causal network for the car-start problem, with variables Fuel?, Clean Spark Plugs, Fuel Meter Standing, and Start?)

D-separation

Definition: Two distinct variables A and B in a causal network are d-separated if, for all paths between A and B, there is an intermediate variable V (distinct from A and B) such that either
- the connection is serial or diverging and V is instantiated, or
- the connection is converging, and neither V nor any of V's descendants have received evidence.

If A and B are not d-separated, we call them d-connected.

D-separation, cont.

Serial connection: A → B → C. Evidence may be transmitted through a serial connection unless the state of the intermediate variable is known. When the state of a variable is known, we say that the variable is instantiated. Example: Rainfall → Water level → Flooding.

D-separation, cont.

Diverging connection: B, C, ..., E are children of a common parent A. Evidence may be transmitted through a diverging connection unless the parent A is instantiated. B, C, ..., E are d-separated given A. Example: Hair length ← Sex → Stature.

D-separation, cont.

Converging connection: B, C, ..., E are parents of a common child A. Evidence may be transmitted through a converging connection only if either the variable in the connection or one of its descendants has received evidence. Example: Not enough fuel → Car won't start ← Dirty spark plugs.
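
The three connection rules can be turned into a small path-blocking check. The sketch below is my own encoding (function names and graph representation are assumptions, and a full d-separation test would have to examine every path between two variables, not just one); it decides whether a single given path is blocked by an evidence set:

```python
def descendants(children, v):
    """All descendants of v in a DAG, given a dict mapping each node
    to the list of its children."""
    found, stack = set(), list(children.get(v, []))
    while stack:
        u = stack.pop()
        if u not in found:
            found.add(u)
            stack.extend(children.get(u, []))
    return found

def path_blocked(children, path, evidence):
    """True if the given path is blocked by the evidence set, using the
    serial/diverging/converging rules from the slides."""
    def is_child(u, v):  # True if the edge u -> v exists
        return v in children.get(u, [])
    for prev, v, nxt in zip(path, path[1:], path[2:]):
        if is_child(prev, v) and is_child(nxt, v):
            # Converging: blocked unless v or a descendant has evidence.
            if v not in evidence and not (descendants(children, v) & evidence):
                return True
        elif v in evidence:
            # Serial or diverging: blocked if v is instantiated.
            return True
    return False

# Converging example from the slides: Fuel -> Start <- Spark.
children = {"Fuel": ["Start"], "Spark": ["Start"], "Start": []}
print(path_blocked(children, ["Fuel", "Start", "Spark"], set()))      # True
print(path_blocked(children, ["Fuel", "Start", "Spark"], {"Start"}))  # False
```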

D-separation, cont.

The Markov blanket of a variable A is the set consisting of the parents of A, the children of A, and the variables sharing a child with A. It has the property that when every variable in the Markov blanket is instantiated, A is d-separated from the rest of the network.
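
Given the parent sets of a network, the Markov blanket follows directly from the definition; a minimal sketch (the encoding as a parents dict is my own):

```python
def markov_blanket(parents, a):
    """Parents of a, children of a, and variables sharing a child with a.
    `parents` maps each node to the list of its parents."""
    children = [v for v, ps in parents.items() if a in ps]
    blanket = set(parents.get(a, [])) | set(children)
    for c in children:
        blanket |= set(parents[c])  # co-parents of a's children
    blanket.discard(a)
    return blanket

# Small made-up DAG: A -> C <- B, C -> D.
parents = {"A": [], "B": [], "C": ["A", "B"], "D": ["C"]}
print(markov_blanket(parents, "A"))  # {'B', 'C'}
```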

Bayesian Networks

A Bayesian network consists of the following:
- A set of variables and a set of directed edges between variables.
- Each variable has a finite set of mutually exclusive states.
- The variables together with the directed edges form an acyclic directed graph.
- To each variable A with parents B_1, ..., B_n, a conditional probability table P(A | B_1, ..., B_n) is attached.

Bayesian Networks, cont.

The model's d-separation properties should correspond to our perception of the world's conditional independence properties: if A and B are d-separated given evidence e, then the probability calculus used for Bayesian networks must yield P(A | e) = P(A | B, e).

The general chain rule

Let U = {A_1, ..., A_n} be a set of variables. Then for any probability distribution P(U) we have

P(U) = P(A_n | A_1, ..., A_{n-1}) P(A_{n-1} | A_1, ..., A_{n-2}) ... P(A_2 | A_1) P(A_1).

The chain rule for Bayesian networks

Let BN be a Bayesian network over U = {A_1, ..., A_n}. Then BN specifies a unique joint probability distribution P(U) given by the product of all conditional probability tables specified in BN:

P(U) = ∏_i P(A_i | pa(A_i)),

where pa(A_i) are the parents of A_i in BN, and P(U) reflects the d-separation properties of BN.
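
A minimal sketch of the chain rule on a two-parent network inspired by the car example (all variable names and probabilities below are invented for illustration):

```python
# Fu = fuel?, SP = clean spark plugs?, St = start?; St has parents Fu, SP.
parents = {"Fu": [], "SP": [], "St": ["Fu", "SP"]}

# CPTs: cpt[var][(value, *parent_values)] = probability.
cpt = {
    "Fu": {(True,): 0.98, (False,): 0.02},
    "SP": {(True,): 0.96, (False,): 0.04},
    "St": {(True, True, True): 0.99, (False, True, True): 0.01,
           (True, True, False): 0.05, (False, True, False): 0.95,
           (True, False, True): 0.0,  (False, False, True): 1.0,
           (True, False, False): 0.0, (False, False, False): 1.0},
}

def joint(assignment):
    """P(U) for a full assignment: the product of one CPT entry per
    variable, P(U) = prod_i P(A_i | pa(A_i))."""
    p = 1.0
    for var, pa in parents.items():
        key = (assignment[var],) + tuple(assignment[q] for q in pa)
        p *= cpt[var][key]
    return p

print(joint({"Fu": True, "SP": True, "St": True}))  # 0.98 * 0.96 * 0.99
```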

Inserting evidence

Let A be a variable with n states. A finding on A is an n-dimensional table of zeros and ones, e.g. (0, 0, 0, 1, 0, 0, 1, 0). Semantically, a finding is a statement that certain states of A are impossible.

Let BN be a Bayesian network over the universe U, and let e_1, ..., e_m be findings. Then

P(U, e) = ∏_i P(A_i | pa(A_i)) · ∏_j e_j,

and for A ∈ U we have

P(A | e) = Σ_{U \ {A}} P(U, e) / P(e), where P(e) = Σ_U P(U, e).
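
Evidence insertion can be illustrated on the joint table from the marginalization slide: multiply the finding into the joint, then sum out and normalize (the particular finding is my own choice):

```python
import numpy as np

# Joint table P(A, B) from the marginalization slide.
P_AB = np.array([[0.16, 0.12, 0.12],
                 [0.24, 0.28, 0.08]])

# A finding on B: zeros and ones over B's states. Here e states that
# only b1 is possible -> (1, 0, 0).
e = np.array([1.0, 0.0, 0.0])

# Multiply the finding into the joint to get P(U, e).
P_U_e = P_AB * e                  # e broadcasts along the B axis

# P(e) sums over all of U; P(A | e) sums out B and normalizes.
P_e = P_U_e.sum()                 # 0.4
P_A_given_e = P_U_e.sum(axis=1) / P_e
print(P_A_given_e)                # [0.4 0.6]
```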

Questions?