MATH3200, Lecture 31: Applications of Eigenvectors. Markov Chains and Chemical Reaction Systems


1 Lecture 31: Some Applications of Eigenvectors: Markov Chains and Chemical Reaction Systems Winfried Just Department of Mathematics, Ohio University April 9 and 11, 2018

2 Review: Eigenvectors and left eigenvectors A nonzero column vector x is an eigenvector (aka right eigenvector) of a square matrix A with eigenvalue λ if A x = λ x. A nonzero row vector y is a left eigenvector of a square matrix A with eigenvalue λ if y A = λ y. Note that y left-multiplies A here. By Homework 91, x is a (right) eigenvector of A with eigenvalue λ if, and only if, y = x^T is a left eigenvector of A^T with the same eigenvalue λ.
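As a quick numerical illustration of these two definitions and of the Homework 91 statement, here is a minimal sketch in Python/NumPy; the 2 × 2 matrix A below is hypothetical and chosen only for this example.

import numpy as np

# A hypothetical 2 x 2 matrix, used only to illustrate the definitions.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

x = np.array([1.0, 1.0])   # A x = 3 x, so x is a (right) eigenvector with eigenvalue 3
print(A @ x)               # [3. 3.]

y = np.array([1.0, -1.0])  # y A = 2 y, so y is a left eigenvector with eigenvalue 2
print(y @ A)               # [ 2. -2.]

# Homework 91 statement: since x is a right eigenvector of A with eigenvalue 3,
# the row vector x^T is a left eigenvector of A^T with the same eigenvalue.
print(x @ A.T)             # [3. 3.] = 3 * x^T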

3 Review: Markov chains A Markov chain is a stochastic process. Time proceeds in discrete steps t = 0, 1, 2, ... At each time t the process can only be in one of several states that are numbered 1, ..., n. The probability of being in a given state at time t + 1 depends only on the state at time t. The matrix P = [p_ij]_{n×n} gives the transition probabilities p_ij from state i at time t to state j at time t + 1. When x(t) = [x_1(t), ..., x_n(t)] is the probability distribution for the states at time t, then the probability distribution x(t + 1) at time t + 1 is given by x(t + 1) = x(t) P = [x_1(t), ..., x_n(t)] P.
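The update rule x(t + 1) = x(t) P is just a row-vector-times-matrix product. A minimal sketch in Python/NumPy, with a hypothetical 2-state transition matrix chosen only to illustrate the computation:

import numpy as np

# Hypothetical 2-state transition matrix; each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

x_t = np.array([0.2, 0.8])   # probability distribution over the states at time t
x_next = x_t @ P             # x(t + 1) = x(t) P
print(x_next)                # [0.58 0.42]
print(x_next.sum())          # approximately 1.0 -- the result is again a probability distribution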

4 Review: Markov chains for weather.com light Time proceeds in steps of days. State 1: sunny day, State 2: rainy day. Each day is somehow unambiguously classified in this way. The meaning of the transition probabilities: p_11 is the probability that a sunny day is followed by another sunny day. p_12 is the probability that a sunny day is followed by a rainy day. p_21 is the probability that a rainy day is followed by a sunny day. p_22 is the probability that a rainy day is followed by another rainy day. P = [p_11 p_12; p_21 p_22]. x(t) = [x_1(t), x_2(t)], where x_1(t) is the probability that day t will be a sunny day and x_2(t) is the probability that day t will be a rainy day.

5 An example of P for weather.com light Let P = [p_11 p_12; p_21 p_22] = [0.6 0.4; 0.3 0.7]. A sunny day is followed by another sunny day with probability 0.6. A sunny day is followed by a rainy day with probability 0.4. A rainy day is followed by a sunny day with probability 0.3. A rainy day is followed by another rainy day with probability 0.7. P is a stochastic matrix, which means that each row adds up to 1. This will be true for every transition probability matrix of a Markov chain, as each state i must be followed by some state in the next time step.
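A quick check in NumPy that this particular P is indeed stochastic:

import numpy as np

P = np.array([[0.6, 0.4],
              [0.3, 0.7]])      # the example transition matrix from this slide

print(P.sum(axis=1))            # [1. 1.] -- each row adds up to 1
print(bool(np.all(P >= 0)))     # True   -- all entries are nonnegative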

6 One-step transitions for our example of P Let P = [p_11 p_12; p_21 p_22] = [0.6 0.4; 0.3 0.7]. Consider the following probability distributions for day t: x(t) = [1, 0] means that day t is sunny for sure. y(t) = [0.5, 0.5] means equal likelihood of a sunny or a rainy day. Note that the probabilities of all states always add up to 1. The corresponding probabilities for the next day are: x(t + 1) = [1, 0] P = [1, 0] [0.6 0.4; 0.3 0.7] = [0.6, 0.4] and y(t + 1) = [0.5, 0.5] P = [0.5, 0.5] [0.6 0.4; 0.3 0.7] = [0.45, 0.55].
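The same two one-step computations, verified numerically:

import numpy as np

P = np.array([[0.6, 0.4],
              [0.3, 0.7]])

x_t = np.array([1.0, 0.0])   # day t is sunny for sure
y_t = np.array([0.5, 0.5])   # sunny or rainy with equal probability

print(x_t @ P)               # [0.6 0.4]
print(y_t @ P)               # [0.45 0.55]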

7 The eigenvalues and an eigenvector for P of our example Let P = [0.6 0.4; 0.3 0.7]. Then P - λI = [0.6-λ 0.4; 0.3 0.7-λ] and det(P - λI) = λ^2 - 1.3λ + 0.3 = (1 - λ)(0.3 - λ). The eigenvalues are λ_1 = 1 and λ_2 = 0.3. Let's find an eigenvector with eigenvalue 1: Form P - 1I = [0.6-1 0.4; 0.3 0.7-1] = [-0.4 0.4; 0.3 -0.3]. Solve [-0.4 0.4; 0.3 -0.3] [x_1, x_2]^T = [0, 0]^T, that is, -0.4x_1 + 0.4x_2 = 0 and 0.3x_1 - 0.3x_2 = 0. By setting x_1 = 1, we see that x = [1, 1]^T is an eigenvector with eigenvalue 1 of P.
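The eigenvalue computation above can be double-checked numerically; a minimal sketch:

import numpy as np

P = np.array([[0.6, 0.4],
              [0.3, 0.7]])

eigenvalues, eigenvectors = np.linalg.eig(P)
print(eigenvalues)           # 1.0 and 0.3, in some order

# The eigenvector column belonging to eigenvalue 1 is a scalar multiple of [1, 1]^T.
v = eigenvectors[:, np.isclose(eigenvalues, 1.0)].ravel()
print(v / v[0])              # [1. 1.]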

8 The meaning of the eigenvector [1, 1]^T Eigenvectors with eigenvalues λ ≠ 1 are less important for transition matrices of Markov chains, so we will skip finding an eigenvector with eigenvalue λ_2 = 0.3 in our example. But we will take a closer look at the eigenvector [1, 1]^T with eigenvalue λ_1 = 1. Let A = [a_ij]_{n×n} be any square matrix. Then [1, 1, ..., 1]^T is an eigenvector of A with eigenvalue λ if, and only if, A [1, 1, ..., 1]^T = [a_11 + a_12 + ... + a_1n, a_21 + a_22 + ... + a_2n, ..., a_n1 + a_n2 + ... + a_nn]^T = λ [1, 1, ..., 1]^T. Thus [1, 1, ..., 1]^T is an eigenvector of A with eigenvalue λ if, and only if, each row of A adds up to λ. In particular, [1, 1, ..., 1]^T is an eigenvector of A with eigenvalue 1 if, and only if, A is a stochastic matrix.
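This row-sum criterion is easy to check in code. The 3 × 3 matrix below is hypothetical; any square matrix whose rows all add up to the same number λ behaves the same way:

import numpy as np

A = np.array([[0.2, 0.5, 0.3],
              [0.1, 0.1, 0.8],
              [0.4, 0.4, 0.2]])   # every row adds up to 1, so A is stochastic

ones = np.ones(3)                 # the column vector [1, 1, 1]^T
print(A @ ones)                   # [1. 1. 1.] = 1 * ones -- an eigenvector of A with eigenvalue 1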

9 We have proved a theorem... The observation on the previous slide proves parts (a) and (b) of the following result: Theorem. Let P = [p_ij]_{n×n} be the matrix of transition probabilities for a Markov chain. Then (a) λ = 1 is an eigenvalue of P. (b) [1, 1, ..., 1]^T is an eigenvector of P with eigenvalue 1. (c) Every eigenvalue λ of P satisfies |λ| ≤ λ_1 = 1. Part (c) is a consequence of a more general theorem called the Perron-Frobenius Theorem that goes beyond the scope of this course. This part says that λ = 1 is a so-called leading eigenvalue of P.

10 How about the eigenvectors of P^T? Since every square matrix has the same eigenvalues as its transpose, λ = 1 must also be an eigenvalue of P^T. Let's find a corresponding eigenvector for our example of P: Form P^T - 1I = [0.6-1 0.3; 0.4 0.7-1] = [-0.4 0.3; 0.4 -0.3]. Solve [-0.4 0.3; 0.4 -0.3] [x_1, x_2]^T = [0, 0]^T, that is, -0.4x_1 + 0.3x_2 = 0 and 0.4x_1 - 0.3x_2 = 0. We find that every nonzero vector of the form x = [x_1, (4/3)x_1]^T is an eigenvector with eigenvalue 1 of P^T. These are the only eigenvectors with eigenvalue 1 of P^T. Here it will be useful to find the eigenvector x = [x_1, x_2]^T with x_1 + x_2 = 1, that is, x_1 + (4/3)x_1 = (7/3)x_1 = 1. It is x = [3/7, 4/7]^T ≈ [0.4286, 0.5714]^T.
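The same computation can be done numerically: take the eigenvector of P^T for eigenvalue 1 and rescale it so that its coordinates add up to 1. A minimal sketch:

import numpy as np

P = np.array([[0.6, 0.4],
              [0.3, 0.7]])

eigenvalues, eigenvectors = np.linalg.eig(P.T)
v = eigenvectors[:, np.isclose(eigenvalues, 1.0)].ravel()
x = v / v.sum()              # rescale so that the coordinates add up to 1
print(x)                     # [0.42857143 0.57142857] = [3/7, 4/7]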

11 The meaning of the eigenvector [0.4286, 0.5714]^T of P^T By the result of Homework 91, the vector ([0.4286, 0.5714]^T)^T = [0.4286, 0.5714] is a left eigenvector of P. Moreover, since the coordinates add up to 1 and are nonnegative, [0.4286, 0.5714] is a probability distribution. It follows that if the probability distribution of the weather in our example on day t is x(t) = [0.4286, 0.5714], then the probability distribution of the weather on day t + 1 is x(t + 1) = [0.4286, 0.5714] P = [0.4286, 0.5714]. x = [0.4286, 0.5714] is a stationary (probability) distribution, which means that it remains the same on the next and all future days. In fact, x = [0.4286, 0.5714] is the only stationary distribution in this example.
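A one-line check that [3/7, 4/7] ≈ [0.4286, 0.5714] really is stationary:

import numpy as np

P = np.array([[0.6, 0.4],
              [0.3, 0.7]])
x = np.array([3/7, 4/7])

print(x @ P)                        # [0.42857143 0.57142857] -- the same distribution again
print(bool(np.allclose(x @ P, x)))  # True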

12 These observations generalize Theorem. Let P be the transition probability matrix of a Markov chain with n states and let x = [x_1, x_2, ..., x_n] be a probability distribution. (a) x is a stationary distribution for this Markov chain if, and only if, x is a left eigenvector with eigenvalue 1 of P. (b) There exists at least one stationary distribution x of the Markov chain. (c) If x is the only stationary distribution of the Markov chain, then for any given initial distribution x(0), the distributions x(t) always approach x as t → ∞. Point (b) follows from point (a) and the previous theorem. Note also that in point (c) it is necessary that x be unique, because if we start in a different stationary distribution y, then we stay at y and cannot approach another stationary distribution x.
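For our weather example, part (c) can be watched numerically: start from any initial distribution and apply P over and over. A minimal sketch:

import numpy as np

P = np.array([[0.6, 0.4],
              [0.3, 0.7]])

x = np.array([1.0, 0.0])     # an arbitrary initial distribution x(0)
for t in range(25):
    x = x @ P                # x(t + 1) = x(t) P
print(x)                     # approximately [0.42857143 0.57142857] = [3/7, 4/7]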

13 Some alternative versions of weather.com light Let us consider some other transition probability matrices P for weather.com light Markov chains. Homework 93: (a) Let P_1 = I_2 (the weather always stays the same). Show that in this case every probability distribution x = [x_1, x_2] is a stationary distribution. (b) Let P_2 = [0 1; 1 0]. Show that in this case x = [0.5, 0.5] is the unique stationary distribution. (c) Find a third transition probability matrix P_3 with stationary distribution x = [0.5, 0.5]. (d) Formulate a condition on P that appears to guarantee that x = [0.5, 0.5] is a stationary distribution and prove that it does.

14 Remember Waldo? Waldo is a highly gregarious and motivated student and spends all of his evenings working with six students on his MATH 3200 homework. At 7 p.m. he visits a randomly chosen student i among those six, and then operates as follows: He starts working with i. After 10 minutes, he flips a fair coin. If the coin comes up heads, he continues working with i for another 10 minutes before flipping the coin again. If the coin comes up tails, he moves to the room of a randomly chosen friend of i and repeats the procedure. He never tires of these efforts until 1 a.m. Where should we go looking for Waldo at midnight?

15 The stationary distribution for Waldo Waldo's itinerary can be modeled as a Markov chain with states i = 1, 2, ..., 6, where one time step lasts 10 minutes. State i simply means that Waldo is in i's room. The transition probability matrix for this Markov chain is P = [p_ij]_{6×6} = [1/2 1/4 0 1/4 0 0; 0 1/2 0 1/4 0 1/4; 0 0 1/2 1/4 1/4 0; 1/8 1/8 1/8 1/2 0 1/8; 0 0 1/2 0 1/2 0; 1/6 1/6 0 1/6 0 1/2]. Homework 94: Show that this Markov chain has a unique stationary probability distribution and find it.

16 Eigenvectors with eigenvalue 0 and the nullspace of A Let A be a square matrix. Let N(A) denote the set of all eigenvectors of A with eigenvalue 0 together with the zero vector 0. It is the nullspace of A. (A nullspace can be defined for any matrix A, but only for square matrices can it be described in terms of eigenvectors.) Proposition. (a) N(A) has a nonzero element x ≠ 0 if, and only if, A is singular. (b) N(A) is the set of all solutions x of the homogeneous system A x = 0. (c) If a_1, a_2, ..., a_n denote the column vectors of A, then N(A) is the set of all vectors [x_1, x_2, ..., x_n]^T of coefficients such that x_1 a_1 + x_2 a_2 + ... + x_n a_n = 0. (d) N(A) is the set of all vectors x such that T_A(x) = 0. N(A) is also called the kernel of T_A.
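Numerically, one convenient way to find N(A) is through the singular value decomposition: the right singular vectors that belong to (numerically) zero singular values form a basis of the nullspace. A sketch with a hypothetical singular 3 × 3 matrix:

import numpy as np

# The third column is the sum of the first two, so the columns are
# linearly dependent and A is singular.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [2.0, 0.0, 2.0]])

U, s, Vt = np.linalg.svd(A)
basis = Vt[s < 1e-12].T                  # columns form a basis of N(A)
print(basis)
print(bool(np.allclose(A @ basis, 0)))   # True: vectors in N(A) satisfy A x = 0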

17 Review: Chemical reaction networks; net change of concentrations Consider a chemical reaction network like: A + 2B ⇌ 2C, A + B ⇌ D, A + 2C ⇌ 2D, B + D ⇌ 2C. If the initial concentrations are denoted by [A]_0, [B]_0, [C]_0, [D]_0 and the concentrations are measured again after some time and denoted by [A]_1, [B]_1, [C]_1, [D]_1, then the vector w = [[A]_1 - [A]_0, [B]_1 - [B]_0, [C]_1 - [C]_0, [D]_1 - [D]_0] represents the net change in concentrations. If some coordinate [X]_1 - [X]_0 is positive, then a net production of compound X was observed; if some coordinate [X]_1 - [X]_0 is negative, then a net consumption of compound X was observed.
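In code, w is just the componentwise difference of the two concentration vectors; the numbers below are hypothetical:

import numpy as np

c0 = np.array([1.0, 2.0, 0.5, 0.0])   # initial concentrations [A]_0, [B]_0, [C]_0, [D]_0
c1 = np.array([0.8, 1.7, 0.9, 0.1])   # later concentrations [A]_1, [B]_1, [C]_1, [D]_1

w = c1 - c0                           # net change in concentrations
print(w)   # [-0.2 -0.3  0.4  0.1]: net consumption of A and B, net production of C and D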

18 Review: Chemical reaction networks; reaction vectors and stoichiometric matrix The reaction vectors of the chemical reaction network 1: A + 2B ⇌ 2C, 2: A + 2C ⇌ 2D, 3: A + B ⇌ D, 4: B + D ⇌ 2C are v_1 = [-1, -2, 2, 0]^T, v_2 = [-1, 0, -2, 2]^T, v_3 = [-1, -1, 0, 1]^T, v_4 = [0, -1, 2, -1]^T, with coordinates in the order A, B, C, D. They represent the net changes in concentrations if only one reaction occurs and consumes one mole of its first reactant. They can be written as the columns of the stoichiometric matrix S.
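A sketch of how these reaction vectors can be assembled into S in NumPy (coordinates in the order A, B, C, D, as above):

import numpy as np

v1 = [-1, -2,  2,  0]   # A + 2B <-> 2C
v2 = [-1,  0, -2,  2]   # A + 2C <-> 2D
v3 = [-1, -1,  0,  1]   # A + B  <-> D
v4 = [ 0, -1,  2, -1]   # B + D  <-> 2C

S = np.column_stack([v1, v2, v3, v4])   # the reaction vectors become the columns of S
print(S)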

19 Review: The linear transformation T_S If we let k = [k_1, k_2, k_3, k_4]^T be the column vector of average net rates at which the reactions occur over a given time interval, then the matrix product S k = w gives us the net change in concentrations. Positive values k_i > 0 signify that the forward reaction dominates; negative values k_i < 0 signify that the backward reaction dominates. When k = 0 over arbitrarily short time intervals, then each reaction is at equilibrium. When S k = 0 over arbitrarily short time intervals, then no observable change occurs and the system is at equilibrium. The nullspace N(S) is the set of all rate vectors k for which the system is at equilibrium.
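A small numerical illustration of w = S k, using the matrix S built from the reaction vectors on the previous slide and a hypothetical rate vector k:

import numpy as np

S = np.array([[-1, -1, -1,  0],
              [-2,  0, -1, -1],
              [ 2, -2,  0,  2],
              [ 0,  2,  1, -1]])    # columns are the reaction vectors v_1, ..., v_4

k = np.array([0.5, 0.2, -0.1, 0.3])   # hypothetical average net reaction rates
w = S @ k                             # net change in the concentrations of A, B, C, D
print(w)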

20 The rank of the stoichiometric matrix S Recall the result of Group Work 6: Proposition. Suppose S represents a stoichiometric matrix of order m × n for n reactions between m chemical species in a closed reaction system (without net inflow, net outflow, or contributions from or to other reactions). Then r(S) < m. It follows that if m = n, then S is singular, so that it has at least one eigenvector with eigenvalue 0. Each such eigenvector represents a vector of reaction rates where the system is at equilibrium, but at least one reaction is not at equilibrium.
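For the 4 × 4 stoichiometric matrix S of our example network (with the reaction vectors as written on the previous slides), the rank can be checked directly:

import numpy as np

S = np.array([[-1, -1, -1,  0],
              [-2,  0, -1, -1],
              [ 2, -2,  0,  2],
              [ 0,  2,  1, -1]])

print(np.linalg.matrix_rank(S))   # strictly less than m = 4, so S is singular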

21 Homework problems Homework 95: Let S be a stoichiometric matrix of order n × n, and let k be an eigenvector with eigenvalue 0 for S. Show that k must have at least 2 nonzero coordinates. Homework 96: Let S be a stoichiometric matrix for the chemical reaction network A + 2B ⇌ 2C, A + B ⇌ D, A + 2C ⇌ 2D, B + D ⇌ 2C. Find the set of all eigenvectors of S with eigenvalue 0.
