Math 166: Topics in Contemporary Mathematics II
Xin Ma
Texas A&M University
November 26, 2017
A Review

A Markov process is a finite sequence of experiments in which
1. each experiment has the same possible outcomes, and
2. the probabilities of the outcomes depend only on the preceding experiment.

We can apply matrix multiplication to find the probability distribution X_n at the n-th stage:

X_n = T^n X_0,

where T is the transition matrix and X_0 is the initial distribution.

We say that a matrix T is a transition (or stochastic) matrix if
1. the matrix T is square,
2. all entries of T are between 0 and 1, and
3. the entries in each column sum to 1.
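The stage-n distribution X_n = T^n X_0 can be sketched in code by applying T repeatedly. The function names and the 2-state matrix below are illustrative choices, not from the slides; columns of T sum to 1, matching the convention above.

```python
def step(T, x):
    """One stage of the Markov process: returns the product T x."""
    n = len(x)
    return [sum(T[i][j] * x[j] for j in range(n)) for i in range(n)]

def distribution(T, x0, n):
    """X_n = T^n X_0, computed by applying T to the distribution n times."""
    x = x0
    for _ in range(n):
        x = step(T, x)
    return x

# A 2-state example: state 1 keeps 0.8 of its mass each stage, state 2 keeps 0.7.
T = [[0.8, 0.3],
     [0.2, 0.7]]
x5 = distribution(T, [1.0, 0.0], 5)
print([round(p, 4) for p in x5])
```

Each stage moves the distribution closer to the steady state (0.6, 0.4) for this particular matrix.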
A Review (continued)

Sometimes X_n approaches a fixed column X_L as n becomes large. We call X_L the steady-state distribution or the limiting distribution. If a Markov process has a steady-state distribution, we say the process is regular. A Markov process is regular if, and only if, the associated transition (stochastic) matrix is regular.

Note: If a Markov process has a steady-state distribution, it will ALWAYS approach this column matrix X_L, no matter what the initial distribution is.

A transition (or stochastic) matrix T is regular if some power of T has all positive entries; in other words, all the entries are strictly greater than 0.

If there is a matrix L such that T^n approaches L as n grows without bound, we call L the limiting matrix. If T is regular, the limit L exists. If T is absorbing (to be discussed soon), the limit L also exists.
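Regularity can be tested directly from the definition: raise T to successive powers and check for a power with all strictly positive entries. This is a minimal sketch; the helper names and the cutoff `max_power` are my own choices (for a genuinely regular matrix a small cutoff suffices).

```python
def mat_mul(A, B):
    """Product of two square matrices stored as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_regular(T, max_power=20):
    """True if some power T^k (k <= max_power) has all positive entries."""
    P = T
    for _ in range(max_power):
        if all(entry > 0 for row in P for entry in row):
            return True
        P = mat_mul(P, T)
    return False

# This matrix has a zero entry, but T^2 is already all positive, so it is regular.
T = [[0.0, 0.5],
     [1.0, 0.5]]
print(is_regular(T))
```

By contrast, a matrix with an absorbing state, such as [[1, 0.3], [0, 0.7]], keeps a zero below the diagonal in every power and is never regular.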
Definition: In a Markov process, a state is called absorbing if it has the property that once entered, it is impossible to leave.

For example, consider a 3-state Markov process with transition matrix T. If the second state is absorbing, then for any natural number n we always have

T^n [0, 1, 0]^T = [0, 1, 0]^T.

Definition: A Markov process is called absorbing if
1. there is at least one absorbing state, and
2. it is possible to move from any nonabsorbing state to one of the absorbing states in a finite number of steps.

The transition matrix for an absorbing Markov process is said to be an absorbing stochastic matrix.
Example: A mouse house has 3 rooms: Room A, B, and C. Room A has cheese in it. Every minute, the mouse makes a choice. If the mouse is in Room B, the probability is 0.6 that it will move to Room A and 0.4 that it will move to Room C. If the mouse is in Room C, the probability that it will stay there is 0.1, the probability that it will move to Room A is 0.7, and the probability that it will move to Room B is 0.2. Once the mouse enters Room A, it never leaves.

a. Write the transition matrix T for this Markov process. Is it an absorbing Markov process?

With columns giving the current room and rows the next room,

          A    B    C
    A  [ 1.0  0.6  0.7 ]
T = B  [ 0.0  0.0  0.2 ]
    C  [ 0.0  0.4  0.1 ]

Yes, because A is absorbing and the nonabsorbing states can reach it: P(B to A) = 0.6 > 0 and P(C to A) = 0.7 > 0.
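The matrix for the mouse example can be checked in code. The room ordering (A, B, C) and the column-stochastic layout are taken from the course convention above; the variable names are mine.

```python
# Columns are the current room (A, B, C); rows are the next room.
T = [[1.0, 0.6, 0.7],   # to A
     [0.0, 0.0, 0.2],   # to B
     [0.0, 0.4, 0.1]]   # to C

# Stochastic check: every column sums to 1 and entries lie in [0, 1].
cols = [[T[i][j] for i in range(3)] for j in range(3)]
stochastic = (all(abs(sum(c) - 1.0) < 1e-9 for c in cols)
              and all(0.0 <= e <= 1.0 for row in T for e in row))

# Room A is absorbing: its column is exactly (1, 0, 0)^T.
a_absorbing = (cols[0] == [1.0, 0.0, 0.0])

print(stochastic, a_absorbing)
```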
Remark: If a Markov process is absorbing, then there is at least one ONE on the main diagonal of the associated transition matrix. This is a necessary condition but not a sufficient one; you also need to verify condition 2 in the definition.

b. If there are a bunch of mice in the house, with 10% in Room A, 60% in Room B, and 30% in Room C, what is the room distribution after 5 minutes?

X_0 = [0.1, 0.6, 0.3]^T. Then X_5 = T^5 X_0 ≈ [0.9958, 0.0014, 0.0028]^T.
c. If the mouse is initially in Room C, what are the probabilities of being in each room after 5 minutes?

Z_0 = [0, 0, 1]^T. Then Z_5 = T^5 Z_0 ≈ [0.99597, 0.00178, 0.00225]^T.

d. Find T^5 and guess what the limiting matrix L should be.

        [ 1  0.99508  0.99597 ]
T^5 ≈   [ 0  0.00136  0.00178 ]
        [ 0  0.00356  0.00225 ]

We may expect that

      [ 1  1  1 ]
L =   [ 0  0  0 ]
      [ 0  0  0 ]

e. Find the long-term behavior if the mouse starts at B and at C.

Y_0 = [0, 1, 0]^T and Z_0 = [0, 0, 1]^T. The long-term behavior: L Y_0 = [1, 0, 0]^T and L Z_0 = [1, 0, 0]^T as well.
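The guess for L can be checked numerically by raising the mouse matrix to a high power: with a single absorbing state (Room A), every column of T^n should approach (1, 0, 0)^T. The helper name and the choice of the 50th power are illustrative.

```python
def mat_mul(A, B):
    """Product of two square matrices stored as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Mouse-house matrix: Room A absorbing, columns are the current room.
T = [[1.0, 0.6, 0.7],
     [0.0, 0.0, 0.2],
     [0.0, 0.4, 0.1]]

P = T
for _ in range(49):        # after the loop, P = T^50
    P = mat_mul(P, T)

L_approx = [[round(e, 4) for e in row] for row in P]
print(L_approx)
```

To four decimal places the result is the matrix with ones across the first row and zeros elsewhere, confirming that the mouse ends up in Room A regardless of where it starts.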
Remark: If an absorbing Markov process has only one absorbing state, say the i-th state, then the long-term probability of ending in the i-th state is one, no matter what the initial distribution is. This is why we call it absorbing. Accordingly, all the entries of the i-th row of the limiting matrix are 1 and the other entries are 0, as the example above shows.

Now let us consider an example of a Markov process with more than one absorbing state.

Example: A mouse house has 3 rooms: Room A, B, and C. Room A has cheese in it and Room B has a trap in it. Every minute, the mouse makes a choice. If the mouse is in Room C, the probability is 0.2 that it will stay there, 0.3 that it will move to Room A, and 0.5 that it will move to Room B. Once the mouse enters either Room A or Room B, it never leaves.
a. Write the transition matrix T for this Markov process. Is it an absorbing Markov process?

With columns giving the current room and rows the next room,

          A    B    C
    A  [ 1    0    0.3 ]
T = B  [ 0    1    0.5 ]
    C  [ 0    0    0.2 ]

Yes, because states A and B are absorbing and the nonabsorbing state can reach them: P(C to A) = 0.3 > 0 and P(C to B) = 0.5 > 0.

b. If there are a bunch of mice in the house, with 10% in Room A, 60% in Room B, and 30% in Room C, what is the room distribution after 5 minutes?

X_0 = [0.1, 0.6, 0.3]^T. Then X_5 = T^5 X_0 ≈ [0.2125, 0.7874, 0.0001]^T.
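Part (b) can be reproduced with the same repeated-multiplication idea as before. This is a sketch; the function name is mine, while the matrix and the initial distribution come from the example above.

```python
def step(T, x):
    """One stage of the Markov process: returns the product T x."""
    n = len(x)
    return [sum(T[i][j] * x[j] for j in range(n)) for i in range(n)]

# Two absorbing rooms (A and B); Room C keeps probability 0.2 per minute.
T = [[1.0, 0.0, 0.3],
     [0.0, 1.0, 0.5],
     [0.0, 0.0, 0.2]]

x = [0.1, 0.6, 0.3]        # 10% in A, 60% in B, 30% in C
for _ in range(5):
    x = step(T, x)

print([round(p, 4) for p in x])
```

The mass in Room C shrinks by a factor of 0.2 each minute, so almost everything has been absorbed into A or B after five steps.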
c. If the mouse is initially in Room C, what are the probabilities of being in each room after 5 minutes?

X_0 = [0, 0, 1]^T. Then X_5 = T^5 X_0 ≈ [0.3749, 0.6248, 0.0003]^T.

d. Find T^5 and T^50 and guess what the limiting matrix L should be.

By calculator:

        [ 1  0  0.3749 ]           [ 1  0  0.375 ]
T^5 ≈   [ 0  1  0.6248 ]   T^50 ≈  [ 0  1  0.625 ]
        [ 0  0  0.0003 ]           [ 0  0  0     ]

Then we may expect

      [ 1  0  0.375 ]
L =   [ 0  1  0.625 ]
      [ 0  0  0     ]
e. Find the long-term behavior if the mouse starts at C.

Z_0 = [0, 0, 1]^T. The long-term behavior: L Z_0 = [0.375, 0.625, 0]^T.
Remark:
1. If a Markov process has 2 or more absorbing states, then the long-term probabilities of ending in the nonabsorbing states will be 0, and the long-term probabilities of ending in the absorbing states will depend on the initial distribution.
2. We can deduce the limiting matrix L by computing T^n for large n (usually n = 50 is sufficiently large; then round your answers to four decimal places). Then find the ending distribution from the initial distribution X_0 as L X_0.

Note: No matter how many absorbing states a Markov process has, eventually everything will be absorbed by the absorbing states.
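The remark's recipe, approximate L by T^50 and then read off the ending distribution as L X_0, can be sketched directly with the two-absorbing-room matrix from the example above. The helper name is mine.

```python
def mat_mul(A, B):
    """Product of two square matrices stored as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Rooms A and B absorbing; Room C transient with stay-probability 0.2.
T = [[1.0, 0.0, 0.3],
     [0.0, 1.0, 0.5],
     [0.0, 0.0, 0.2]]

P = T
for _ in range(49):        # P = T^50, a good approximation of L
    P = mat_mul(P, T)

x0 = [0.1, 0.6, 0.3]       # initial room distribution
ending = [sum(P[i][j] * x0[j] for j in range(3)) for i in range(3)]
print([round(p, 4) for p in ending])
```

The 30% of mice starting in Room C split 0.375 to A and 0.625 to B in the long run, so the ending distribution depends on X_0, exactly as point 1 of the remark says.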
Example: Determine whether the following stochastic transition matrices are absorbing. If absorbing, find the limiting matrix.

a. A: (Yes.) There is only one absorbing state, so the limiting matrix L has ones across that state's row and zeros everywhere else (also try A^50 to verify this).

b. B: (No), since there is no absorbing state (no diagonal entry is one).
c. (Yes.) The second state is absorbing, and the 1st and 3rd states can each reach the 2nd state.

d. (No.) Although the 1st state is absorbing, the other two states cannot reach it.

e. (Yes.) The 1st and 3rd states are absorbing. Neither of the other two states can reach the 3rd state, but both of them can reach the 1st state.
MATH 9B FINAL EXAM PROBABILITY REVIEW PROBLEMS SPRING, 00 This handout is meant to provide a collection of exercises that use the material from the probability and statistics portion of the course The
More informationMath 671: Tensor Train decomposition methods II
Math 671: Tensor Train decomposition methods II Eduardo Corona 1 1 University of Michigan at Ann Arbor December 13, 2016 Table of Contents 1 What we ve talked about so far: 2 The Tensor Train decomposition
More informationSystem of Linear Equation: with more than Two Equations and more than Two Unknowns
System of Linear Equation: with more than Two Equations and more than Two Unknowns Michigan Department of Education Standards for High School: Standard 1: Solve linear equations and inequalities including
More informationConcepts. Materials. Objective
. Activity 10 From a Distance... You Can See It! Teacher Notes Concepts Midpoint between two points Distance between two points Pythagorean Theorem Calculator Skills Entering fractions: N Setting decimal
More information5.1 Introduction to Matrices
5.1 Introduction to Matrices Reminder: A matrix with m rows and n columns has size m x n. (This is also sometimes referred to as the order of the matrix.) The entry in the ith row and jth column of a matrix
More informationLeslie matrices and Markov chains.
Leslie matrices and Markov chains. Example. Suppose a certain species of insect can be divided into 2 classes, eggs and adults. 10% of eggs survive for 1 week to become adults, each adult yields an average
More informationLinear Equations in Linear Algebra
1 Linear Equations in Linear Algebra 1.7 LINEAR INDEPENDENCE LINEAR INDEPENDENCE Definition: An indexed set of vectors {v 1,, v p } in n is said to be linearly independent if the vector equation x x x
More informationFinite-Horizon Statistics for Markov chains
Analyzing FSDT Markov chains Friday, September 30, 2011 2:03 PM Simulating FSDT Markov chains, as we have said is very straightforward, either by using probability transition matrix or stochastic update
More informationChapter 1. Vectors, Matrices, and Linear Spaces
1.7 Applications to Population Distributions 1 Chapter 1. Vectors, Matrices, and Linear Spaces 1.7. Applications to Population Distributions Note. In this section we break a population into states and
More informationWeek 2. Section Texas A& M University. Department of Mathematics Texas A& M University, College Station 22 January-24 January 2019
Week 2 Section 1.2-1.4 Texas A& M University Department of Mathematics Texas A& M University, College Station 22 January-24 January 2019 Oğuz Gezmiş (TAMU) Topics in Contemporary Mathematics II Week2 1
More informationLinear Equations in Linear Algebra
Linear Equations in Linear Algebra.7 LINEAR INDEPENDENCE LINEAR INDEPENDENCE Definition: An indexed set of vectors {v,, v p } in n is said to be linearly independent if the vector equation x x x 2 2 p
More informationSolving Ax = b w/ different b s: LU-Factorization
Solving Ax = b w/ different b s: LU-Factorization Linear Algebra Josh Engwer TTU 14 September 2015 Josh Engwer (TTU) Solving Ax = b w/ different b s: LU-Factorization 14 September 2015 1 / 21 Elementary
More informationOutlines. Discrete Time Markov Chain (DTMC) Continuous Time Markov Chain (CTMC)
Markov Chains (2) Outlines Discrete Time Markov Chain (DTMC) Continuous Time Markov Chain (CTMC) 2 pj ( n) denotes the pmf of the random variable p ( n) P( X j) j We will only be concerned with homogenous
More informationDirect Methods for Solving Linear Systems. Matrix Factorization
Direct Methods for Solving Linear Systems Matrix Factorization Numerical Analysis (9th Edition) R L Burden & J D Faires Beamer Presentation Slides prepared by John Carroll Dublin City University c 2011
More informationMarkov Chains Absorption (cont d) Hamid R. Rabiee
Markov Chains Absorption (cont d) Hamid R. Rabiee 1 Absorbing Markov Chain An absorbing state is one in which the probability that the process remains in that state once it enters the state is 1 (i.e.,
More informationRoberto s Notes on Linear Algebra Chapter 9: Orthogonality Section 2. Orthogonal matrices
Roberto s Notes on Linear Algebra Chapter 9: Orthogonality Section 2 Orthogonal matrices What you need to know already: What orthogonal and orthonormal bases for subspaces are. What you can learn here:
More informationIntroduction to Measurements. Introduction to Measurements. Introduction to Measurements. Introduction to Measurements. Introduction to Measurements
CIVL 1112 Surveying - Precision and 1/8 Typically, we are accustomed to counting but not measuring. Engineers are concerned with distances, elevations, volumes, direction, and weights. Fundamental principle
More information