MATH 446/546 Test 2 Fall 2014


MATH 446/546 Test 2, Fall 2014. Note the problems are separated into two sections: a set for all students and an additional set for those taking the course at the 546 level. Please read and follow all of these directions. Answer all the questions appropriate to the level at which you are taking the course. You may use any calculator you have at your disposal (please, not your phone). I have allowed you access to the IUP Sage server if you would like to use it as a calculator. Show all of your work to get full credit; this includes algebraic expressions. Note this specifically means that you should write down symbolic matrix formulations of the computations you are doing with Sage and Matlab. Write a sentence answer for each of the word-problem parts stated. Answer all questions neatly. Work alone! NAME:

All Students

1. Given the probability transition matrix P for three states.

(a) (5 pts) Calculate the probability that, starting from state 1, the Markov chain ends in state 3 in 4 steps.

Here we compute P^4 and use the entry in row 1, column 3. Thus the probability that, having started in state 1, we are in state 3 after 4 steps of the chain is the (1, 3) entry of P^4.

(b) (5 pts) What is the steady-state distribution for the given Markov chain?

Here we are looking for a vector π such that π = πP. Thus, we solve (P^T − I)π = 0 with one of the rows replaced by the constraint that Σ_i π_i = 1. Solving this linear system for π yields the steady-state distribution:

π = ( , , )

Note this is very similar to the rows of P raised to a high power, e.g. the rows of P^100.
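As a minimal numerical sketch of both parts in plain Python with NumPy (rather than Sage or Matlab), note that the 3×3 matrix below is an illustrative assumption of mine, not the matrix from the exam; any row-stochastic matrix works the same way:

```python
import numpy as np

# Hypothetical 3-state transition matrix (a stand-in, NOT the exam's P).
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.4, 0.2],
              [0.1, 0.3, 0.6]])

# (a) 4-step transition probabilities: P^4; entry (0, 2) is
# "start in state 1, end in state 3" with 0-based indexing.
P4 = np.linalg.matrix_power(P, 4)
print("P(X_4 = 3 | X_0 = 1) =", P4[0, 2])

# (b) Steady state: solve (P^T - I) pi = 0 with one row replaced
# by the normalization constraint sum(pi) = 1.
A = P.T - np.eye(3)
A[-1, :] = 1.0                     # replace last row with the constraint
b = np.array([0.0, 0.0, 1.0])      # RHS: zeros except the constraint
pi = np.linalg.solve(A, b)
print("steady state:", pi)
print("rows of a high power:", np.linalg.matrix_power(P, 100)[0])
```

The last line illustrates the remark above: for an irreducible, aperiodic chain every row of P^100 is essentially the steady-state vector π.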

2. (5 pts) A police car is on patrol in a neighborhood known for its gang activities. During a patrol, there is a 60% chance of responding in time to the location where help is needed; else, regular patrol will continue. Upon receiving a call, there is a 10% chance of cancellation (in which case normal patrol is resumed) and a 30% chance that the car is already responding to a previous call. When the police car arrives at the scene, there is a 10% chance that the instigators will have fled (in which case the car returns to patrol) and a 40% chance that apprehension is made immediately. Else, the officers will search the area. If apprehension occurs, there is a 60% chance of transporting the suspects to the police station; else, they are released and the car returns to patrol. Express the probabilistic activities of the police patrol in the form of a diagram, and then a probability transition matrix.

The trick to this problem is to consider all the possible states. The car can be in any of the following five places described in the problem:

Patrol (1), Call (2), Scene (3), Apprehend/Transit (4), Station (5)

[Diagram: the five states Patrol, Call, Scene, Apprehend, and Station, with arrows for the transitions — respond (0.6), cancellation, fled, search, apprehend, transport, release — carrying the probabilities shown in the matrix below.]

Using this numbering scheme allows the following probability transition matrix to be created:

      [ 0.4  0.6  0    0    0   ]
      [ 0.1  0.3  0.6  0    0   ]
P =   [ 0.1  0    0.5  0.4  0   ]
      [ 0.4  0    0    0    0.6 ]
      [ 1    0    0    0    0   ]
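The matrix above can be checked numerically. In this sketch the entries come from the stated percentages; the state labels and the long-run-fraction computation at the end are my additions, not part of the exam:

```python
import numpy as np

# States: 1=Patrol, 2=Call, 3=Scene, 4=Apprehend, 5=Station (0-based rows).
P = np.array([
    [0.4, 0.6, 0.0, 0.0, 0.0],   # Patrol: respond (0.6) or keep patrolling
    [0.1, 0.3, 0.6, 0.0, 0.0],   # Call: cancelled, busy on a prior call, or arrive
    [0.1, 0.0, 0.5, 0.4, 0.0],   # Scene: fled, keep searching, or apprehend
    [0.4, 0.0, 0.0, 0.0, 0.6],   # Apprehend: release or transport to station
    [1.0, 0.0, 0.0, 0.0, 0.0],   # Station: return to patrol
])
assert np.allclose(P.sum(axis=1), 1.0)  # every row must be a distribution

# Long-run fraction of time in each state: the left eigenvector of P
# for eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()
print(dict(zip(["Patrol", "Call", "Scene", "Apprehend", "Station"],
               pi.round(3))))
```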

3. Patients suffering from kidney failure can either get a transplant or undergo periodic dialysis. During any one year, 30% undergo cadaveric transplants and 10% receive living-donor kidneys. In the year following a transplant, 30% of the cadaveric-transplant recipients and 15% of the living-donor recipients go back to dialysis. Death percentages among the two groups are 20% and 10%, respectively. Of those in the dialysis pool, 10% die, and of those who survive more than one year after a transplant, 5% die and 5% go back to dialysis.

[Diagram: states Dialysis (1), Cadaveric Transplant (2), Living-Donor Transplant (3), Good Post-Transplant Year (4), and Death (5), with the transition probabilities shown in the matrix below.]

(a) (6 pts) What is a valid transition matrix for the described Markov chain?

The transition matrix for the given situation, using the state numbers as given in the diagram, is as follows:

      [ 0.50  0.30  0.10  0     0.10 ]
      [ 0.30  0     0     0.50  0.20 ]
P =   [ 0.15  0     0     0.75  0.10 ]
      [ 0.05  0     0     0.90  0.05 ]
      [ 0     0     0     0     1    ]

(b) (5 pts) What is the expected number of years a patient stays on dialysis?

The expected number of visits to state j starting in state i is given by element (i, j) of (I − N)^{-1}, where N is obtained from the probability transition matrix P by removing the absorbing state. This gives

      [ 0.50  0.30  0.10  0    ]
N =   [ 0.30  0     0     0.50 ]
      [ 0.15  0     0     0.75 ]
      [ 0.05  0     0     0.90 ]

and

                [ 3.54  1.06  0.35   7.96 ]
(I − N)^{-1} ≈  [ 1.95  1.58  0.19   9.38 ]
                [ 1.86  0.56  1.19  11.68 ]
                [ 1.77  0.53  0.18  13.98 ]

Here it can be seen from the (1, 1) entry that a patient will stay on dialysis approximately 3.54 years.

(c) (5 pts) What is the longevity of a patient who starts on dialysis?

The expected time to absorption may be found by summing the values of (I − N)^{-1} across a row; the row-i sum is the expected number of transitions until absorption when starting in state i:

(I − N)^{-1} 1 ≈ ( 12.92, 13.11, 15.28, 16.46 )^T

The expected number of years until someone who starts on dialysis dies is 12.92 years.

(d) (5 pts) What is the life expectancy of a patient who survives 1 year or longer after a transplant?

Here we can use the same work we had from the previous part. The row-4 sum, corresponding to having a good post-transplant year, is approximately 16.46. Thus, we expect a patient who survives 1 year or longer after a transplant to live 16.46 more years.

(e) (5 pts) What is the expected number of years before an at-least-1-year transplant survivor goes back to dialysis or dies?

Here we can consider the matrix (I − N)^{-1}, which gives the expected time in state j starting in state i. The row-4, column-4 entry is the expected total time spent in a good post-transplant year prior to death; specifically, we see a person is expected to spend 13.98 years in this particular state. To answer the question as worded, however, we need to treat going back on dialysis as an absorbing state and place it last in the transition matrix. Mapping the states as 2 → 1, 3 → 2, 4 → 3, 5 → 4, 1 → 5, the new probability transition matrix is:

      [ 0  0  0.50  0.20  0.30 ]
      [ 0  0  0.75  0.10  0.15 ]
P =   [ 0  0  0.90  0.05  0.05 ]
      [ 0  0  0     1     0    ]
      [ 0  0  0     0     1    ]
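These fundamental-matrix values are easy to reproduce with a few lines of NumPy; this sketch assumes the transient submatrix N reconstructed above:

```python
import numpy as np

# Transient part N of the kidney chain
# (states: Dialysis, Cadaveric, Living-donor, Good post-transplant year).
N = np.array([
    [0.50, 0.30, 0.10, 0.00],
    [0.30, 0.00, 0.00, 0.50],
    [0.15, 0.00, 0.00, 0.75],
    [0.05, 0.00, 0.00, 0.90],
])

M = np.linalg.inv(np.eye(4) - N)   # fundamental matrix (I - N)^{-1}
print(M.round(2))                  # entry (i, j): expected visits to j from i
print(M.sum(axis=1).round(2))      # row sums: expected years to absorption
# Expected output (rounded): M[0, 0] ~ 3.54 years on dialysis,
# row sums ~ [12.92, 13.11, 15.28, 16.46], and M[3, 3] ~ 13.98.
```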

which can be separated into the following transient and absorbing blocks:

      [ 0  0  0.50 ]            [ 0.20  0.30 ]
N =   [ 0  0  0.75 ]    and A = [ 0.10  0.15 ]
      [ 0  0  0.90 ]            [ 0.05  0.05 ]

The expected time to absorption is given by:

(I − N)^{-1} 1 = ( 6.0, 8.5, 10.0 )^T

and starting in the state corresponding to a good post-transplant year we expect 10 years before death or dialysis.
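A matching sketch for part (e), assuming the reordered N and A above (only N is needed for the time-to-absorption computation):

```python
import numpy as np

# Part (e): dialysis joins death as an absorbing state, so the
# transient states are Cadaveric, Living-donor, Good year.
N = np.array([
    [0.0, 0.0, 0.50],
    [0.0, 0.0, 0.75],
    [0.0, 0.0, 0.90],
])
A = np.array([   # transitions into the absorbing states (Death, Dialysis)
    [0.20, 0.30],
    [0.10, 0.15],
    [0.05, 0.05],
])
t = np.linalg.inv(np.eye(3) - N) @ np.ones(3)
print(t)   # expected years to death-or-dialysis: [6.0, 8.5, 10.0]
```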

546 Additional Problems

1. (10 pts) A doubly stochastic Markov chain with m states in the state space S is one where Σ_{i ∈ S} p_ij = 1 for all j ∈ S. Show that the vector (1/m, 1/m, ..., 1/m) is a stationary distribution.

Here we aim to show that for a matrix P satisfying both Σ_{i ∈ S} p_ij = 1 for all j ∈ S (columns sum to 1) and Σ_{j ∈ S} p_ij = 1 for all i ∈ S (rows sum to 1), the vector π = (1/m, 1/m, ..., 1/m) satisfies π = πP.

Proof: Let π_i = 1/m for every i ∈ S. Then the jth element of the row vector πP is

(πP)_j = Σ_{i ∈ S} π_i p_ij = Σ_{i ∈ S} (1/m) p_ij = (1/m) Σ_{i ∈ S} p_ij = (1/m)(1) = 1/m = π_j.

Thus πP = π, so π = (1/m, 1/m, ..., 1/m) is a stationary distribution, as was to be shown.
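A small numerical check of this fact; the construction of a doubly stochastic matrix by averaging random permutation matrices (any such mixture is doubly stochastic, by Birkhoff's theorem) and the size m = 5 are my choices for illustration, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 5
P = np.zeros((m, m))
for _ in range(10):
    perm = np.eye(m)[rng.permutation(m)]  # a random permutation matrix
    P += perm / 10                        # average of permutation matrices

# Doubly stochastic: both the rows and the columns sum to 1.
assert np.allclose(P.sum(axis=0), 1) and np.allclose(P.sum(axis=1), 1)

pi = np.full(m, 1 / m)
print(np.allclose(pi @ P, pi))  # True: the uniform vector is stationary
```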
