Special Mathematics. Tutorial 13. Markov chains


Tutorial 13 Markov chains

"The future starts today, not tomorrow." Pope John Paul II

A sequence of trials of an experiment is a finite Markov chain if:

- the outcome of each trial is one of a finite set of states $\Omega = \{i_1, i_2, \ldots, i_n\}$;
- the outcome of a trial depends only on the present state, and not on any past states:
$$P(X_{k+1} = j_{k+1} \mid X_k = j_k, X_{k-1} = j_{k-1}, \ldots, X_0 = j_0) = P(X_{k+1} = j_{k+1} \mid X_k = j_k)$$
for all states $j_0, j_1, \ldots, j_{k+1}$ from $\Omega$.

We will work with time-homogeneous Markov chains, i.e. chains whose probability transition matrix $P$ does not depend on $k$:
$$P = \begin{pmatrix} P_{11} & P_{12} & \ldots & P_{1n} \\ \vdots & \vdots & & \vdots \\ P_{n1} & P_{n2} & \ldots & P_{nn} \end{pmatrix}, \qquad \text{where } P(X_{k+1} = j \mid X_k = i) = P_{ij}.$$

The probability that the system is in state $i$ after $k$ steps is denoted by
$$p_i(k) = P(X_k = i),$$
which gives the following distribution for the random variable $X_k$:
$$X_k : \begin{pmatrix} i_1 & i_2 & \ldots & i_n \\ p_{i_1}(k) & p_{i_2}(k) & \ldots & p_{i_n}(k) \end{pmatrix}.$$

One has the property
$$p_i(k) = \sum_{j=1}^{n} p_j(k-1)\, P_{ji},$$
or, written in vector form,
$$p(k) = p(k-1)\, P.$$

Suppose a Markov chain has initial probability vector
$$p(0) = (p_{i_1}(0), p_{i_2}(0), \ldots, p_{i_n}(0))$$
and transition matrix $P$; then the probability vector after $n$ repetitions (steps) of the experiment is
$$p(n) = p(0)\, P^n.$$

The following identity holds for arbitrary states $j_0, \ldots, j_k$:
$$P(X_0 = j_0, X_1 = j_1, X_2 = j_2, \ldots, X_k = j_k) = p_{j_0}(0)\, P_{j_0 j_1} P_{j_1 j_2} \ldots P_{j_{k-1} j_k}.$$

Absorbing Markov chains

- A state $i$ is absorbing if $P_{ii} = 1$.
- A Markov chain is an absorbing Markov chain if it has at least one absorbing state and it is possible to go from any nonabsorbing state to an absorbing state.
- Let $P$ be the transition matrix of an absorbing Markov chain. Rearrange the rows and columns so that the absorbing states come first. Then $P$ has the block form
$$P = \begin{pmatrix} I & 0 \\ R & Q \end{pmatrix}.$$
- The fundamental matrix is defined as $F = (I - Q)^{-1}$, and it can be shown that
$$P^n \to \begin{pmatrix} I & 0 \\ FR & 0 \end{pmatrix}, \quad n \to \infty.$$
- The matrix $FR$ gives the probabilities that a particular initial nonabsorbing state will lead to a particular absorbing state.

Regular Markov chains

- A Markov chain is a regular Markov chain if its transition matrix is regular, i.e. some power of it has all entries positive.
- For a regular Markov chain there exists a unique probability vector $v$ such that for every probability vector $v_0$:
$$v_0\, P^n \to v, \quad n \to \infty.$$
- The vector $v$ is called the equilibrium vector, and it gives the long-range trend of the Markov chain.
- The vector $v = (v_1, v_2, \ldots, v_n)$ is found from the identities
$$v\, P = v \qquad \text{and} \qquad v_1 + v_2 + \ldots + v_n = 1.$$
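These formulas translate directly into a few lines of linear algebra. Below is a minimal Python/NumPy sketch of the three computations used throughout this tutorial: the evolution $p(n) = p(0)P^n$, the fundamental matrix $F = (I - Q)^{-1}$ of an absorbing chain, and the equilibrium vector of a regular chain obtained by solving $vP = v$ together with $v_1 + \ldots + v_n = 1$. The function names are our own, not part of the tutorial.

```python
import numpy as np

def evolve(p0, P, n):
    """p(n) = p(0) P^n for a row distribution p0 and transition matrix P."""
    return p0 @ np.linalg.matrix_power(P, n)

def fundamental_matrix(Q):
    """F = (I - Q)^{-1}, where Q is the nonabsorbing block of an absorbing chain."""
    return np.linalg.inv(np.eye(Q.shape[0]) - Q)

def equilibrium(P):
    """Equilibrium vector of a regular chain: solve v P = v with sum(v) = 1.

    v P = v is rewritten as (P^T - I) v^T = 0; one of these (dependent)
    equations is replaced by the normalization v_1 + ... + v_n = 1.
    """
    n = P.shape[0]
    A = P.T - np.eye(n)
    A[-1, :] = 1.0          # replace the last equation by v_1 + ... + v_n = 1
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)
```

For a regular chain the linear system in `equilibrium` has a unique solution, so a direct solve suffices; for very large chains one would use an eigenvector routine instead.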

Solved problems

Problem 1. At the end of June 40% of the voters were registered as liberal, 45% as conservative, and 15% as independent. Over a one-month period, the liberals retained 80% of their constituency, while 15% switched to conservative and 5% to independent. The conservatives retained 70% and lost 30% to the liberals. The independents retained 60% and lost 20% each to the liberals and the conservatives. Assume that these trends continue.

a. Write a transition matrix using this information.
b. Find the percentage of each type of voter at the end of August.
c. If the elections are in October 2018, which party will have the best chance to win?

Solution:

- The transition matrix, using the states L (liberal), C (conservative) and I (independent), is
$$P = \begin{pmatrix} 0.80 & 0.15 & 0.05 \\ 0.30 & 0.70 & 0 \\ 0.20 & 0.20 & 0.60 \end{pmatrix}.$$
- Identify the initial probability vector: $p(0) = (0.40, 0.45, 0.15)$.
- After two months the probability vector is computed using the formula $p(2) = p(0)\, P^2$.
- Observe that $P^2$ has all entries positive, so $P$ is a regular matrix and we have a regular Markov chain.
- Formulate the main property of a regular Markov chain: there exists a unique probability vector $v$ such that for every probability vector $v_0$, $v_0\, P^n \to v$ as $n \to \infty$.
- Find this equilibrium vector $v = (v_1, v_2, v_3)$, which gives the long-range trend of the Markov chain, from the equations $v\, P = v$ and $v_1 + v_2 + v_3 = 1$.
- The vector $v$ gives the situation in October 2018.
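A minimal numerical check of this solution, reusing the sketch above (the rounded values in the comments are what these computations yield):

```python
import numpy as np

P = np.array([[0.80, 0.15, 0.05],
              [0.30, 0.70, 0.00],
              [0.20, 0.20, 0.60]])
p0 = np.array([0.40, 0.45, 0.15])

# b. End of August = two one-month steps: p(2) = p(0) P^2.
p2 = p0 @ np.linalg.matrix_power(P, 2)
print(p2)   # approx (0.5315, 0.3783, 0.0903)

# c. Equilibrium vector: v P = v with v1 + v2 + v3 = 1.
A = P.T - np.eye(3)
A[-1, :] = 1.0
v = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
print(v)    # (24/41, 14/41, 3/41) approx (0.5854, 0.3415, 0.0732)
```

The liberals keep the largest long-run share, so they have the best chance to win in October.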

Problem 2. A large group of mice is kept in a cage having connected compartments A, B, and C. Mice in compartment A move to B with probability 0.3 and to C with probability 0.4. Mice in B move to A or C with probabilities 0.2 and 0.25, respectively. The door of compartment C cannot be opened from inside. Find the probability that a mouse from compartment A will eventually end up in compartment C.

Solution: The probability transition matrix of the associated Markov chain is
$$\begin{array}{c|ccc} & A & B & C \\ \hline A & 0.3 & 0.3 & 0.4 \\ B & 0.2 & 0.55 & 0.25 \\ C & 0 & 0 & 1 \end{array}$$
Thus C is an absorbing state ($P_{CC} = 1$), and since C can be reached from both A and B, this is an absorbing Markov chain. Rearranging the rows and columns so that the absorbing state comes first puts $P$ in the block form $\begin{pmatrix} I & 0 \\ R & Q \end{pmatrix}$; the matrix $FR$, where $F = (I - Q)^{-1}$ is the fundamental matrix, then gives the probabilities that each initial nonabsorbing state leads to each absorbing state. Rearranging the states one gets
$$\begin{array}{c|ccc} & C & A & B \\ \hline C & 1 & 0 & 0 \\ A & 0.4 & 0.3 & 0.3 \\ B & 0.25 & 0.2 & 0.55 \end{array}$$
thus
$$R = \begin{pmatrix} 0.4 \\ 0.25 \end{pmatrix} \qquad \text{and} \qquad Q = \begin{pmatrix} 0.3 & 0.3 \\ 0.2 & 0.55 \end{pmatrix}.$$
Finally one has
$$F = (I - Q)^{-1} \approx \begin{pmatrix} 1.76 & 1.18 \\ 0.78 & 2.75 \end{pmatrix} \qquad \text{and} \qquad FR = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Thus a mouse from compartment A will eventually end up trapped in compartment C with probability 1. This is as expected: C is the only absorbing state of the chain, so absorption there is certain.
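A minimal NumPy sketch of this fundamental-matrix computation:

```python
import numpy as np

# Nonabsorbing block Q (states A, B) and absorption column R (into C).
Q = np.array([[0.30, 0.30],
              [0.20, 0.55]])
R = np.array([[0.40],
              [0.25]])

F = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix F = (I - Q)^{-1}
print(F)       # approx [[1.7647, 1.1765], [0.7843, 2.7451]]
print(F @ R)   # [[1.0], [1.0]]: absorption in C is certain from both A and B
```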

Proposed problems

Problem 1. Write the transition diagram corresponding to the transition matrix
$$\begin{pmatrix} 0.5 & 0.3 & 0.2 \\ 0 & 1 & 0 \\ 0.2 & 0.2 & 0.6 \end{pmatrix}$$
and, conversely, write the transition matrix corresponding to the given diagram (diagram not reproduced in this transcription).

Problem 2. At the "Politehnica" University a student has a 15% chance of flunking out during a given year, a 25% chance of repeating the year, and a 60% chance of finishing the year. For a 3rd-year student the possible states are: 3rd-year student, 4th-year student, has flunked out, has graduated. Find a transition matrix. Find the probability that a 3rd-year student will graduate.

Problem 3. A market analyst is interested in whether consumers prefer Dell or Gateway computers. Two market surveys taken one year apart reveal the following: 10% of Dell owners had switched to Gateway and the rest continued with Dell; 35% of Gateway owners had switched to Dell and the rest continued with Gateway. Find the distribution of the market after a long period of time.

Problem 4. A security guard can stand in front of any one of the three doors of a building, and every minute he decides whether to move to another door chosen at random. If he is at the middle door, he is equally likely to stay where he is, move to the door on the left, or move to the door on the right. If he is at the door on either end, he is equally likely to stay where he is or to move to the middle door. Write the transition probability matrix and prove that it corresponds to a regular Markov chain. Find the long-range trend for the fraction of time the guard spends in front of each door.

Problem 5. Let $\Omega = \{C, R, S, G\}$ denote the space of weather conditions, where C = cloudy, R = rainy, S = snowy and G = good. Suppose we have the probability transition matrix
$$P = \begin{pmatrix} 0.35 & 0.25 & 0.15 & 0.25 \\ 0.35 & 0.35 & 0.20 & 0.10 \\ 0.35 & 0.15 & 0.45 & 0.05 \\ 0.34 & 0.05 & 0.01 & 0.60 \end{pmatrix}.$$
If on Monday the weather is good, what is the weather forecast for Wednesday (i.e. the chances that it will be cloudy, rainy, snowy or good)? Find the chance that on Tuesday it will be rainy, on Wednesday it will be cloudy, and on Thursday it will rain again.

Problem 6. We simplify the previous problem, assuming now only three possible weather conditions C, R and G, with the probability matrix
$$P = \begin{pmatrix} 0.5 & 0.2 & 0.3 \\ 0.4 & 0.4 & 0.2 \\ 0.3 & 0.3 & 0.4 \end{pmatrix}.$$
If it is rainy on Monday, what is the weather forecast for Christmas Day?

Problem 7. A computer system can operate in two different modes. Every hour, it remains in the same mode or switches to a different mode according to the transition probability matrix
$$P = \begin{pmatrix} 0.4 & 0.6 \\ 0.6 & 0.4 \end{pmatrix}.$$
If the system is in Mode I at 5:30 pm, what is the probability that it will be in Mode I at 7:30 pm on the same day? Draw the state transition diagrams for the corresponding Markov chains of these two problems.
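All of these forecasting questions reduce to reading an entry of a matrix power. For instance, in Problem 7 the interval from 5:30 pm to 7:30 pm is two one-hour steps, so the required probability is the (Mode I, Mode I) entry of $P^2$; a minimal sketch, taking Mode I as state 0:

```python
import numpy as np

P = np.array([[0.4, 0.6],
              [0.6, 0.4]])

P2 = np.linalg.matrix_power(P, 2)  # two one-hour steps
print(P2[0, 0])   # 0.4*0.4 + 0.6*0.6 = 0.52
```

The Wednesday forecast of Problem 5 is obtained the same way, as the row of $P^2$ corresponding to state G.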