Markov Chains and Related Matters


Markov Chains and Related Matters

[Figure: a four-state transition diagram. 1 -> 2 (.4) and 1 -> 3 (.6); 2 -> 1 (.5) and 2 -> 3 (.5); 3 -> 1 (.4) and 3 -> 4 (.6); 4 -> 2 (.9) and 4 -> 3 (.1).]

The four nodes are called states. The numbers on the arrows are called transition probabilities. For example, if we are in state 1, there is a probability of .6 of going to state 3 and a probability of .4 of going to state 2. There would be no probability of staying in state 1 or of going to state 4. (These are allowed in Markov chains; my example just doesn't have them.) The outgoing numbers from any state sum to 1.

Q520 Markov Chains 1

These Get Applied in Many Settings

What do IU undergrads do between 6 PM and midnight?

[Figure: the same four-state diagram, with the states labeled:]

1 = study
2 = eat
3 = make phone call
4 = watch TV

Related Model: Automata

[Figure: a four-state automaton whose arrows are labeled with the letters a and b.]

Let's declare 1 to be the start state. We have indicated 4 as the accepting state. Then we read in words, following any paths of arrows we like, starting in 1 and hoping to get to 4.

Some words accepted: aa, aaa, aabababababababa. Some words not accepted: bb, a, any word not ending in a.

Markov Chains and Related Matters

[Figure: the four-state Markov chain again.]

What we have here are probabilities instead of atomic letters like a, b, c. We also don't have a starting or an accepting state. However, it is possible and useful to have these extra features on top of Markov chains, as we'll see.

State Vectors

Suppose we have a large number of IU students, and we measure their state by a random variable X. Let's say that the value space V(X) is {1, 2, 3, 4}, with

1 = study
2 = eat
3 = make phone call
4 = watch TV

We have probabilities like Pr(X = 1), Pr(X = 2), etc. These will be numbers between 0 and 1, with the sum over i of Pr(X = i) equal to 1.

Definition. A state vector for X is a vector x = (x_1, ..., x_n) of numbers between 0 and 1 which sum to 1.

The Evolution of State Vectors

Suppose our original state vector x_0 is (.2, .3, .1, .4).

[Figure: the chain with the entries of x_0 written at the states.]

A good way to think about a Markov chain is that we have a large number of individuals making choices with no memory, and independently of each other. We want to go from x_0 to x_1. How should we get x_1? For state 1, we would have (.5)(.3) + (.4)(.1) = .15 + .04 = .19, since state 2 sends half of its .3 to state 1, and state 3 sends .4 of its .1 there.

The Evolution of State Vectors

We call our original state vector x_0. So x_0 is (.2, .3, .1, .4). Then we do the calculations:

[Figure: the chain with the updated entries written at the states.]

We get x_1 = (.19, .44, .31, .06). Continuing, we want to get x_2, x_3, ..., x_3225, ..., x_n, ....
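The one-step update just described can be checked numerically. Here is a minimal sketch in Python with NumPy (an assumption; the slides use no particular language), with the example chain's transition probabilities written out as a matrix:

```python
import numpy as np

# Transition matrix of the example chain: rows are "from" states 1-4,
# columns are "to" states 1-4; each row sums to 1.
P = np.array([
    [0.0, 0.4, 0.6, 0.0],   # state 1: .4 to state 2, .6 to state 3
    [0.5, 0.0, 0.5, 0.0],   # state 2: .5 to state 1, .5 to state 3
    [0.4, 0.0, 0.0, 0.6],   # state 3: .4 to state 1, .6 to state 4
    [0.0, 0.9, 0.1, 0.0],   # state 4: .9 to state 2, .1 to state 3
])

x0 = np.array([0.2, 0.3, 0.1, 0.4])  # original state vector

x1 = x0 @ P  # one step of the chain
print(x1)    # approximately (.19, .44, .31, .06)
```

The first entry is exactly the hand computation from the previous slide: (.5)(.3) + (.4)(.1) = .19.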

State Vectors and Markov Chains

We'll discuss formulas soon, but first let's remember that state vectors come from random variables. This means that in addition to the state vectors x_0, x_1, ..., x_n, ... we also have random variables X_0, X_1, ..., X_n, .... These have an important independence property:

Pr(X_n | X_1, ..., X_{n-1}) = Pr(X_n | X_{n-1})   (1)

Informally, what happens at time n only depends on time n-1 and not on anything further back. This Markov property in (1) formalizes the idea that we are modeling a system with no memory. In sophisticated books, a Markov chain is defined to be the sequence of random variables, and then the picture is a specification of it.

Markov Chains and Related Matters

There are two ways to turn a Markov chain into a matrix:

[ 0   .4  .6  0  ]        [ 0   .5  .4  0  ]
[ .5  0   .5  0  ]   or   [ .4  0   0   .9 ]
[ .4  0   0   .6 ]        [ .6  .5  0   .1 ]
[ 0   .9  .1  0  ]        [ 0   0   .6  0  ]

Matrices, continued

[ 0   .4  .6  0  ]        [ 0   .5  .4  0  ]
[ .5  0   .5  0  ]   or   [ .4  0   0   .9 ]
[ .4  0   0   .6 ]        [ .6  .5  0   .1 ]
[ 0   .9  .1  0  ]        [ 0   0   .6  0  ]

The first way, the rows add to 1. The second way, the columns do. We always write a_{ij} for the entries in a matrix A, with the row index first. On the left, p_{ij} is the probability of going from state i to state j. On the right, p_{ij} is the probability of going from state j to state i. The two matrices are transposes of each other. One has to be careful in reading books and papers, because both kinds of matrices are used.
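A quick numerical sanity check on the two conventions, sketched in Python/NumPy (an assumed tool, not part of the slides):

```python
import numpy as np

# Row-stochastic form: P[i][j] = probability of going from state i+1 to state j+1.
P = np.array([
    [0.0, 0.4, 0.6, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.4, 0.0, 0.0, 0.6],
    [0.0, 0.9, 0.1, 0.0],
])

# Column-stochastic form: Pt[i][j] = probability of going from state j+1 to state i+1.
Pt = P.T

print(P.sum(axis=1))   # every row of the first matrix sums to 1
print(Pt.sum(axis=0))  # every column of its transpose sums to 1
```

Checking which axis sums to 1 is a cheap way to tell which convention a given book or paper is using.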

Markov Chains and Related Matters

Even more, one has to be careful in writing! Cf. the textbook, Chapter 10: both conventions seem to be in use! To make life easier for us, we'll call P the one on the left and P^t the one on the right. So the rows of P add to 1.

Remember that we started with a state vector x_0 = (.2, .3, .1, .4). To get x_1, we multiply matrices: x_1 = x_0 P. In more detail,

x_1 = (.2 .3 .1 .4) [ 0   .4  .6  0  ]
                    [ .5  0   .5  0  ]
                    [ .4  0   0   .6 ]
                    [ 0   .9  .1  0  ]

Markov Chains, continued

The general formula is x_{n+1} = x_n P, where the operation here is matrix multiplication. We could also work with P^t, and here what we would do is to multiply on the right by the transpose of x_n:

x^t_{n+1} = P^t x^t_n.

For example, we would get x^t_1 by multiplying P^t x^t_0:

[ 0   .5  .4  0  ] [ .2 ]   [ .19 ]
[ .4  0   0   .9 ] [ .3 ] = [ .44 ]
[ .6  .5  0   .1 ] [ .1 ]   [ .31 ]
[ 0   0   .6  0  ] [ .4 ]   [ .06 ]
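The two formulas compute the same thing. A small Python/NumPy sketch (assumed tooling) confirms that x_0 P and P^t x_0^t agree:

```python
import numpy as np

# Example chain, row-stochastic convention.
P = np.array([
    [0.0, 0.4, 0.6, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.4, 0.0, 0.0, 0.6],
    [0.0, 0.9, 0.1, 0.0],
])

x0 = np.array([0.2, 0.3, 0.1, 0.4])

# Row-vector convention: x_1 = x_0 P.
x1_row = x0 @ P

# Column-vector convention: x_1^t = P^t x_0^t.
x1_col = P.T @ x0

print(x1_row)  # the two conventions give the same vector,
print(x1_col)  # approximately (.19, .44, .31, .06)
```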

Regular Markov Chains

Here is one of the main facts about Markov chains. We say that a Markov chain is regular if for each pair of states i and j, it is possible to go from i to j following non-zero arrows. Our example chain is regular, but we'll soon see one which is not.

Then there is a unique long-term state vector x* with the following properties:

1. x* P = x*: so x* is stable.
2. In terms of the transpose, P^t (x*)^t = (x*)^t.
3. No matter what the starting x_0 is, the sequence x_0, x_1, ..., x_n, ... will converge to x* in an appropriate sense.
4. We can find x* by solving a set of linear equations.
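Property 3 invites a brute-force check: just iterate. A minimal sketch in Python/NumPy (an assumption; the slides use no particular language), starting from two quite different state vectors:

```python
import numpy as np

# Example chain, row-stochastic convention; each row sums to 1.
P = np.array([
    [0.0, 0.4, 0.6, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.4, 0.0, 0.0, 0.6],
    [0.0, 0.9, 0.1, 0.0],
])

def iterate(x, steps=500):
    # Apply x_{n+1} = x_n P repeatedly and return the final state vector.
    for _ in range(steps):
        x = x @ P
    return x

a = iterate(np.array([0.2, 0.3, 0.1, 0.4]))  # our x_0 from before
b = iterate(np.array([1.0, 0.0, 0.0, 0.0]))  # everyone starts in state 1

print(a)  # both runs land on (numerically) the same long-term vector x*
print(b)
```

Both starting points settle near (.253, .264, .302, .181), illustrating that x* does not depend on x_0.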

How to find x*

In our example, let's write x* as (w, x, y, z). It will be easier to work with the transpose P^t and to solve P^t (x*)^t = (x*)^t. We want to solve

[ 0   .5  .4  0  ] [ w ]   [ w ]
[ .4  0   0   .9 ] [ x ] = [ x ]
[ .6  .5  0   .1 ] [ y ]   [ y ]
[ 0   0   .6  0  ] [ z ]   [ z ]

When we multiply out the left side, we get

[ 0w + .5x + .4y + 0z ]   [ w ]
[ .4w + 0x + 0y + .9z ] = [ x ]
[ .6w + .5x + 0y + .1z ]   [ y ]
[ 0w + 0x + .6y + 0z ]   [ z ]

Solving it, continued

So we would have to solve

0w + .5x + .4y + 0z = w
.4w + 0x + 0y + .9z = x
.6w + .5x + 0y + .1z = y
0w + 0x + .6y + 0z = z

This is four linear equations in four unknowns, and together with the requirement that w + x + y + z = 1 (since x* is a state vector) it has a unique solution by our theorem. The easiest way to get the solution is to use a program like Matlab.
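The slide leaves the solving to Matlab; an equivalent sketch in Python/NumPy (an assumed substitute) sets up (P^t - I)v = 0 and swaps one redundant equation for the normalization w + x + y + z = 1:

```python
import numpy as np

# Example chain, row-stochastic convention.
P = np.array([
    [0.0, 0.4, 0.6, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.4, 0.0, 0.0, 0.6],
    [0.0, 0.9, 0.1, 0.0],
])

# Stationarity: P^t v = v, i.e. (P^t - I) v = 0.
A = P.T - np.eye(4)
b = np.zeros(4)

# The four fixed-point equations only determine v up to a scalar multiple,
# so replace the last one with the normalization w + x + y + z = 1.
A[3] = np.ones(4)
b[3] = 1.0

v = np.linalg.solve(A, b)
print(v)  # approximately (.253, .264, .302, .181)
```

The result satisfies both v P = v and the sum-to-1 condition, so it is the long-term vector x* of the theorem.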

A Non-Regular Chain

Flip a fair coin forever, stopping when you get a Heads. What's the probability that the first H is on an even-numbered flip?

[Figure: a four-state chain for this process.]

1 = an even number of T's and no H.
2 = an odd number of T's and no H.
3 = H after an even number of all-T flips.
4 = H after an odd number of all-T flips.

We start with x_0 = (1, 0, 0, 0), and we want to know the long-term distribution.
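Although this chain is not regular, we can still iterate x_{n+1} = x_n P and watch where the probability mass goes. A sketch (Python/NumPy assumed; the .5 transition probabilities come from the fair coin, and states 3 and 4 are absorbing):

```python
import numpy as np

# State 1: even number of T's, no H yet; state 2: odd number of T's, no H yet.
# States 3 and 4 record where the first H fell, and are absorbing.
P = np.array([
    [0.0, 0.5, 0.5, 0.0],  # from 1: T -> 2, H -> 3 (H after an even number of T's)
    [0.5, 0.0, 0.0, 0.5],  # from 2: T -> 1, H -> 4 (H after an odd number of T's)
    [0.0, 0.0, 1.0, 0.0],  # 3 is absorbing
    [0.0, 0.0, 0.0, 1.0],  # 4 is absorbing
])

x = np.array([1.0, 0.0, 0.0, 0.0])  # start in state 1: zero T's so far
for _ in range(200):
    x = x @ P

print(x)  # the mass settles at approximately (0, 0, 2/3, 1/3)
```

An odd number of T's before the first H means the H fell on an even-numbered flip, so the long-term mass in state 4 answers the question: the probability is 1/3.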