CMPSCI 240: Reasoning Under Uncertainty


CMPSCI 240: Reasoning Under Uncertainty
Lecture 24
Prof. Hanna Wallach (wallach@cs.umass.edu)
April 24, 2012

Reminders
Check the course website: http://www.cs.umass.edu/~wallach/courses/s12/cmpsci240/
Eighth homework is due on Friday

Grades
1. Add discussion section scores, divide by 14, multiply by 10
2. Add homework scores, divide by 250, multiply by 30
3. Divide midterm score by 100, multiply by 30
4. Add 1-3 to obtain your score (max. possible is 70)
5. If your score is 40 or less, you're in danger of getting a D

Recap

Last Time: Matrices and Vectors for Markov Chains
Transition probability matrix:
A = | p_11  p_12  p_13  ...  p_1S |
    | p_21  p_22  p_23  ...  p_2S |
    | ...   ...   ...   ...  ...  |
    | p_S1  p_S2  p_S3  ...  p_SS |
State probability vector: v^(t) with v^(t)_i = P(X_t = i)
n-step transition probabilities: v^(n) = v^(0) A^n
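The recap above can be checked numerically: propagating a state probability vector n steps is just a vector-matrix product with A^n. A minimal sketch assuming numpy; the 2-state matrix below is a made-up illustration, not one of the lecture's examples.

```python
import numpy as np

# Hypothetical 2-state transition matrix (each row sums to 1).
A = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Starting state probability vector v^(0): begin in state 1 with certainty.
v0 = np.array([1.0, 0.0])

# n-step transition probabilities: v^(n) = v^(0) A^n.
n = 3
vn = v0 @ np.linalg.matrix_power(A, n)

print(vn)        # distribution over states after 3 steps
print(vn.sum())  # still a probability distribution: sums to 1
```

Applying A one step at a time (`v = v @ A` in a loop) gives the same result; `matrix_power` just does the n applications at once.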

Last Time: Steady State
If vA = v then v is a steady state distribution
Most Markov chains have a unique steady state distribution that is approached by successive time steps (applications of the transition matrix A) from any starting distribution
vA = v, together with the normalization constraint sum_i v_i = 1, defines a set of S + 1 simultaneous equations
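Those simultaneous equations can be solved directly: vA = v is equivalent to (A^T - I) v^T = 0, and stacking the normalization constraint on top gives an overdetermined but consistent linear system. A sketch assuming numpy; the 2-state matrix is again a made-up example.

```python
import numpy as np

# Hypothetical 2-state transition matrix (rows sum to 1); not from the lecture.
A = np.array([[0.9, 0.1],
              [0.5, 0.5]])
S = A.shape[0]

# vA = v  <=>  (A^T - I) v^T = 0.  Stack the normalization constraint
# sum_i v_i = 1 underneath and solve the system by least squares.
M = np.vstack([A.T - np.eye(S), np.ones(S)])
b = np.zeros(S + 1)
b[-1] = 1.0
v, *_ = np.linalg.lstsq(M, b, rcond=None)

print(v)          # the steady state distribution
print(v @ A - v)  # ~0: one more application of A leaves v unchanged
```

For this matrix the solution is v = (5/6, 1/6), which you can confirm by hand from 0.1 v_1 = 0.5 v_2 and v_1 + v_2 = 1.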

Examples of Steady State
e.g., draw the transition probability graph and find the steady state distribution for a Markov chain with 4 states and
A = | 0    0.9  0.1  0   |
    | 0.2  0.8  0    0   |
    | 0    0    0    1   |
    | 0    0    0.6  0.4 |

Irreducible Markov Chains
The steady state distribution is v = (0, 0, 0.375, 0.625)
There is no path from states 3 or 4 to states 1 or 2
A Markov chain is irreducible if there exists a path in the transition graph from every state to every other state
If a Markov chain is not irreducible, then it is reducible
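The claimed steady state for the 4-state example can be verified by power iteration: repeatedly apply A from a uniform start and watch the distribution settle. A short sketch assuming numpy, using the lecture's matrix (states are 0-indexed in code).

```python
import numpy as np

# Transition matrix of the 4-state example from the lecture.
A = np.array([[0.0, 0.9, 0.1, 0.0],
              [0.2, 0.8, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [0.0, 0.0, 0.6, 0.4]])

# Power iteration: keep applying A until the distribution stops changing.
v = np.full(4, 0.25)
for _ in range(2000):
    v = v @ A

print(np.round(v, 3))  # all mass ends up on states 3 and 4 (lecture numbering)
```

The mass on states 1 and 2 leaks into the closed pair {3, 4} and never returns, which is why the steady state puts zero probability on them.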

Communicating Classes
State i communicates with state j, i.e., i ↔ j, if there is some way of reaching state j from state i and vice versa
A set of states C is a communicating class if i ↔ j for every i, j ∈ C and no i ∈ C communicates with some j ∉ C
C is closed if the probability of leaving it is zero
A Markov chain is irreducible if its state space forms a single communicating class, i.e., i ↔ j for all i, j ∈ S

Examples of Communicating Classes
e.g., how many communicating classes are there if
A = | 0    0.9  0.1  0   |
    | 0.2  0.8  0    0   |
    | 0    0    0    1   |
    | 0    0    0.6  0.4 |
e.g., what does this say about the reducibility of the chain?
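The question above can be answered mechanically: compute which states can reach which (transitive closure of the transition graph), then group mutually reachable states into classes. A stdlib-only sketch using the lecture's matrix; states are 0-indexed in code, so classes {0, 1} and {2, 3} correspond to the lecture's {1, 2} and {3, 4}.

```python
# Transition matrix of the lecture's 4-state example.
A = [[0.0, 0.9, 0.1, 0.0],
     [0.2, 0.8, 0.0, 0.0],
     [0.0, 0.0, 0.0, 1.0],
     [0.0, 0.0, 0.6, 0.4]]
S = len(A)

# reach[i][j]: is there a path from i to j?  Floyd-Warshall-style closure;
# every state trivially reaches itself via the empty path.
reach = [[i == j or A[i][j] > 0 for j in range(S)] for i in range(S)]
for k in range(S):
    for i in range(S):
        for j in range(S):
            reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])

# i and j communicate iff each reaches the other; group states into classes.
classes = {frozenset(j for j in range(S) if reach[i][j] and reach[j][i])
           for i in range(S)}

print(sorted(sorted(c) for c in classes))  # [[0, 1], [2, 3]]: two classes
print("irreducible:", len(classes) == 1)   # False: the chain is reducible
```

With two communicating classes the chain is reducible, matching the slide's observation that states 3 and 4 cannot reach states 1 and 2.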

Examples of Steady State
e.g., draw the transition probability graph and find the steady state distribution for a Markov chain with 4 states and
A = | 0     0.5   0     0.5  |
    | 0.75  0     0.25  0    |
    | 0     0.75  0     0.25 |
    | 0.75  0     0.25  0    |
e.g., is this Markov chain irreducible?

Periodic Markov Chains
v^(0) = (0.250, 0.250, 0.250, 0.250)
v^(1) = (0.375, 0.312, 0.125, 0.188)
v^(2) = (0.375, 0.281, 0.125, 0.219)
v^(3) = (0.375, 0.281, 0.125, 0.219)
v^(4) = (0.375, 0.281, 0.125, 0.219)
...

Periodic Markov Chains
v^(0) = (1.000, 0.000, 0.000, 0.000)
v^(1) = (0.000, 0.500, 0.000, 0.500)
v^(2) = (0.750, 0.000, 0.250, 0.000)
v^(3) = (0.000, 0.562, 0.000, 0.438)
v^(4) = (0.750, 0.000, 0.250, 0.000)
v^(5) = (0.000, 0.562, 0.000, 0.438)
v^(6) = (0.750, 0.000, 0.250, 0.000)
v^(7) = (0.000, 0.562, 0.000, 0.438)
v^(8) = (0.750, 0.000, 0.250, 0.000)
...

Periodic Markov Chains
The chain is always in state 1 or 3 on even t and in state 2 or 4 on odd t
Let X = {1, 3} and Y = {2, 4}: the chain always moves from X to Y and vice versa
If the chain is in a state in X at time t, then at time t + 2 it must return to a state in X; the same is true for Y

Periodic Markov Chains
A state i is periodic if the probability of returning to i is zero except at regular intervals (e.g., every 2 time steps)
If all states are periodic, then the chain is periodic
An irreducible Markov chain is periodic if there is some k > 1 such that A^k is the transition matrix of a reducible chain
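The k > 1 criterion can be checked directly for the second 4-state example: squaring its transition matrix yields a chain in which the two groups no longer connect. A sketch assuming numpy (states 0-indexed in code, so the groups {0, 2} and {1, 3} correspond to the lecture's X = {1, 3} and Y = {2, 4}).

```python
import numpy as np

# The periodic 4-state transition matrix from the lecture.
A = np.array([[0.00, 0.50, 0.00, 0.50],
              [0.75, 0.00, 0.25, 0.00],
              [0.00, 0.75, 0.00, 0.25],
              [0.75, 0.00, 0.25, 0.00]])

A2 = A @ A  # two-step transition matrix

print(np.round(A2, 4))
# In A^2, states {0, 2} only reach {0, 2} and states {1, 3} only reach
# {1, 3}: A^2 is the transition matrix of a reducible chain, so the
# original (irreducible) chain is periodic with period 2.
```

This is exactly the block structure behind the alternating v^(t) sequences on the earlier slides: two steps of the original chain never cross between the groups.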

Steady State
Periodic: the chain moves from group to group in a periodic way
Reducible: some states can't be reached from others
Steady state: if a Markov chain is aperiodic and irreducible, then there must be some v such that, for any starting distribution and any positive real ε, there is some t beyond which |P(X_t = i) − v_i| ≤ ε for every state i
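A standard remedy for periodicity, not covered in the slides but a common trick: replace A with the "lazy" chain (A + I)/2, which stays put with probability 1/2 and otherwise moves via A. Since vA = v iff v(A + I)/2 = v, the steady state is unchanged, but the period-2 structure disappears and power iteration converges even from the start v^(0) = (1, 0, 0, 0) that oscillated forever above. A sketch assuming numpy:

```python
import numpy as np

# The periodic 4-state transition matrix from the lecture.
A = np.array([[0.00, 0.50, 0.00, 0.50],
              [0.75, 0.00, 0.25, 0.00],
              [0.00, 0.75, 0.00, 0.25],
              [0.75, 0.00, 0.25, 0.00]])

# Lazy chain: flip a fair coin; on heads stay put, on tails move via A.
# vA = v iff v(A + I)/2 = v, so the steady state is the same as for A.
lazy = 0.5 * (A + np.eye(4))

v = np.array([1.0, 0.0, 0.0, 0.0])  # the start that oscillated under A
for _ in range(2000):
    v = v @ lazy

print(np.round(v, 4))  # converges to roughly (0.375, 0.281, 0.125, 0.219)
```

Note the limit matches the v^(2) = v^(3) = ... values from the uniform-start sequence earlier: the uniform start happened to split its mass evenly between the two period groups, so it landed on the steady state directly.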

For Next Time
Check the course website: http://www.cs.umass.edu/~wallach/courses/s12/cmpsci240/
Eighth homework is due on Friday