Markov Chains, Chapter 16

1. Markov Chains (title slide)
2. Why Study Markov Chains?

Decision analysis focuses on decision making in the face of uncertainty about one future event. However, many decisions need to consider uncertainty about a sequence of future events:
- Uncertain demand for GM SUVs each month over the next year
- Uncertain weather in Napa Valley every week over the grape season
- Uncertain daily evolution of stock prices

We need probability models for systems that evolve over time in a probabilistic manner: stochastic processes. Markov chains are special stochastic processes: the probabilities indicating how the process will evolve in the future depend only on the present state of the process. They provide the conceptual foundation for Markov decision processes, perhaps the most widely used probabilistic decision models.
3. Overview

- Stochastic processes
- Markov chains
- Chapman-Kolmogorov equations
- State classification
- First passage times
- Long-run properties
- Absorbing states
4. Event vs. Random Variable

What is a random variable? Recall: a sample space is the set of all possible outcomes of an experiment. A random variable takes numerical values depending on the outcome of the experiment.

Examples of random variables:
- X = number on a die (integer values)
- X = number of customers purchasing an item (integer values)
- X = inches of rain (could be integer- or real-valued)
- X = time until a customer gets served (real-valued)
5. Stochastic Processes

Suppose now we take a series of observations of a random variable: X_0, X_1, X_2, ... A stochastic process is an indexed collection of random variables {X_t}, where t is an index from a given set T. (The index t often denotes time.)

Examples:
- Roll a die 10 times, with X_i = number on the die on the i-th roll, i = 1, 2, ..., 10. Note that X_i takes integer values from 1 to 6. The stochastic process {X_t} = {X_1, X_2, ...} denotes the sequence of rolls.
- Sales of an item, with X_t = number of items sold on day t, t = 0, 1, 2, ... Then the stochastic process {X_t} = {X_0, X_1, X_2, ...} provides a mathematical representation of how the sales evolve starting today.
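The die-rolling process above is easy to realize in code. This short Python sketch (an illustration, not part of the original slides) draws one realization of {X_1, ..., X_10}:

```python
import random

random.seed(0)

# A stochastic process {X_t}: X_i = result of the i-th die roll, i = 1..10.
# Each X_i is a random variable taking integer values in the state space
# {1, ..., 6}; the list below is one realization of the whole process.
rolls = [random.randint(1, 6) for _ in range(10)]

# Every realization stays inside the state space.
assert all(1 <= x <= 6 for x in rolls)
print(rolls)
```

Running the simulation again (with a different seed) gives a different realization of the same process.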
6. Gambler's Ruin Example

Consider a gambling game where on each turn you win $1 with probability p and lose $1 with probability 1-p. The game ends when you either accumulate $3 or go broke. You start with $1.

Let X_t denote your fortune after t turns of the game. Then the stochastic process {X_t} = {X_1, X_2, ...} describes how your gambling fortune evolves.

Questions you might want to answer:
- Should you play?
- Will the game eventually end?
- What is the probability you win $3 or go broke?
- How does everything change with p?
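The questions on this slide can already be explored by simulation. The sketch below (an added illustration; the helper name is ours) estimates the probability of reaching $3 before going broke, using p = 0.4, the value the slides adopt for this example later:

```python
import random

def play_gamblers_ruin(p, start=1, target=3):
    """Play one game: win $1 w.p. p, lose $1 w.p. 1 - p; stop at $0 or $target."""
    fortune = start
    while 0 < fortune < target:
        fortune += 1 if random.random() < p else -1
    return fortune  # either 0 (broke) or target (won)

# Monte Carlo estimate of P(win $3 before going broke) starting from $1.
random.seed(42)
games = [play_gamblers_ruin(p=0.4) for _ in range(100_000)]
p_win = sum(g == 3 for g in games) / len(games)
print(round(p_win, 3))  # close to the exact answer 4/19 ≈ 0.2105
```

Note that every simulated game terminated, which is consistent with the intuition that the game eventually ends.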
7. State Space of a Stochastic Process

The value of X_t is the characteristic of interest. X_t may be continuous or discrete, but we'll focus on discrete. Example: X_t = number of defective items on day t.

[Figure: one realization of the stochastic process, plotting X_t against t.]

Here X_4 = 2: we say the state of our stochastic process at time t = 4 is 2.
8. States of a Stochastic Process

Note that a stochastic process describes how the state of a system evolves over discrete time points. Here we have discrete states AND discrete time points. In fact, we'll consider a finite number of possible states, labeled 0, 1, 2, ..., M. These states will be mutually exclusive and exhaustive. What do those mean?
- Mutually exclusive: states have no intersection; the process cannot be in two different states at the same time.
- Exhaustive: all possible outcomes are included in the states.
9. Types of Stochastic Processes

There are several types of stochastic processes, depending on how future values probabilistically depend on present and past values:
- In general, future values may depend on the present value as well as all the past values (for example, stock prices may depend on past values).
- On the other hand, future values may be completely independent of present and past values (as in fair coin tossing or fair die rolling).
- In some cases, future values may be independent of past values and depend only on the present value (as in the gambling example).

In INDE 411, we will focus on this last category of stochastic processes, called Markov chains. Hence our stochastic processes {X_t} are discrete-time, finite-state Markov chains.
10. Weather Example

Let X_t be a random variable that takes value 0 if the weather is dry on day t and value 1 if the weather is rainy on day t. Then the stochastic process {X_t} = {X_0, X_1, X_2, ...} provides a mathematical representation of how the weather evolves starting today (t = 0), and the state of the system is dry or rainy.

Suppose the probability that tomorrow is dry is 0.8 if today is dry, but 0.6 if it rains today. We write:

P(dry tomorrow | dry today) = 0.8 = P(X_1 = 0 | X_0 = 0)
P(dry tomorrow | rainy today) = 0.6 = P(X_1 = 0 | X_0 = 1)

Or, for any day t, we write:

P(X_{t+1} = 0 | X_t = 0) = 0.8 and P(X_{t+1} = 0 | X_t = 1) = 0.6
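As a quick worked consequence of these numbers (an added illustration), the probability of a dry day two days ahead follows by conditioning on tomorrow's weather:

```python
# One-step conditional probabilities from the slide (0 = dry, 1 = rain).
p_dry_given_dry = 0.8   # P(X_{t+1} = 0 | X_t = 0)
p_dry_given_rain = 0.6  # P(X_{t+1} = 0 | X_t = 1)

# Condition on tomorrow's weather (law of total probability):
# P(X_2=0 | X_0=0) = P(dry->dry)*P(dry->dry) + P(dry->rain)*P(rain->dry)
p_dry_in_two_days = (p_dry_given_dry * p_dry_given_dry
                     + (1 - p_dry_given_dry) * p_dry_given_rain)
print(p_dry_in_two_days)  # ≈ 0.76
```

That is, 0.8 * 0.8 + 0.2 * 0.6 = 0.76: the same kind of computation the Chapman-Kolmogorov equations in the overview generalize.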
11. Weather Example, continued

Suppose we are given the states of the weather on days 0, 1, 2, 3. That is, suppose we know X_0 = 0, X_1 = 0, X_2 = 1, X_3 = 0 (dry, dry, rainy, dry). What is the probability that X_4 = 0? Mathematically, what is P(X_4 = 0 | X_3 = 0, X_2 = 1, X_1 = 0, X_0 = 0)?

We have P(X_4 = 0 | X_3 = 0) = 0.8, and in writing this number we did not care about the values of X_2, X_1, X_0. This observation is true for any values of X_3, X_2, X_1, X_0, and in fact for any t. Intuitively, given today's weather and the weather in the past, the conditional probability of tomorrow's weather is independent of the weather in the past and depends only on today's weather (this is called the Markovian property).
12. Markovian Property

A stochastic process {X_t} satisfies the Markovian property if

P(X_{t+1} = j | X_0 = k_0, X_1 = k_1, ..., X_{t-1} = k_{t-1}, X_t = i) = P(X_{t+1} = j | X_t = i)

for all t = 0, 1, 2, ... and for every possible pair of states i, j.

What does this mean? The future depends only on the present, not on the past. Or: given the current state and the past states, the conditional probability of the next state is independent of the past states and depends only on the current state.
13. Markov Chain Definition

A stochastic process {X_t}, t = 0, 1, 2, ..., is a Markov chain if it satisfies the Markovian property.
14. One-Step Transition Probabilities

The conditional probabilities P(X_{t+1} = j | X_t = i) are called the one-step transition probabilities. One-step transition probabilities are stationary if, for all t,

P(X_{t+1} = j | X_t = i) = P(X_1 = j | X_0 = i) = p_ij

Interpretation: the conditional probabilities don't change over time; they are the same for all t.

[Figure: a transition from state i at time t to state j at time t+1.]
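Stationarity means one long sample path contains many observations of the same conditional probability. This sketch (an added illustration, reusing the weather chain's numbers) simulates a long path and recovers p_00 empirically:

```python
import random

random.seed(7)

# Weather chain (0 = dry, 1 = rain): P(next state is dry | current state).
p_next_dry = {0: 0.8, 1: 0.6}

# Simulate a long sample path starting from a dry day.
path = [0]
for _ in range(200_000):
    path.append(0 if random.random() < p_next_dry[path[-1]] else 1)

# Because the chain is stationary, every dry day is an observation of the
# same conditional probability p_00 = P(X_{t+1} = 0 | X_t = 0).
dry_days = [t for t in range(len(path) - 1) if path[t] == 0]
p00_hat = sum(path[t + 1] == 0 for t in dry_days) / len(dry_days)
print(round(p00_hat, 2))  # close to the true value 0.8
```

If the transition probabilities changed over time, pooling transitions from different t like this would not be justified.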
15-16. One-Step Transition Probabilities for the Weather Markov Chain

For the weather chain:

p_00 = P(X_{t+1} = 0 | X_t = 0) = 0.8
p_10 = P(X_{t+1} = 0 | X_t = 1) = 0.6
p_01 = P(X_{t+1} = 1 | X_t = 0) = 1 - P(X_{t+1} = 0 | X_t = 0) = 0.2
p_11 = P(X_{t+1} = 1 | X_t = 1) = 1 - P(X_{t+1} = 0 | X_t = 1) = 0.4

One-step transition matrix: arrange the four one-step transition probabilities in a matrix P whose rows and columns correspond to states and whose entries are p_ij = P(X_{t+1} = j | X_t = i):

          State 0  State 1
State 0 [   0.8      0.2   ]
State 1 [   0.6      0.4   ]
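The weather matrix can be written down and checked in a few lines of Python (an added sketch, not from the slides). Squaring it gives the two-step transition probabilities, previewing the Chapman-Kolmogorov equations listed in the overview:

```python
# One-step transition matrix for the weather chain (0 = dry, 1 = rain).
P = [[0.8, 0.2],
     [0.6, 0.4]]

# Sanity check: each row is a probability distribution over next states.
assert all(abs(sum(row) - 1.0) < 1e-9 for row in P)

def mat_mult(A, B):
    """Multiply two square matrices stored as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# P squared holds the two-step transition probabilities, e.g.
# P2[0][0] = P(dry two days from now | dry today).
P2 = mat_mult(P, P)
print(P2)  # ≈ [[0.76, 0.24], [0.72, 0.28]]
```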
17. Transition Matrix

Stationary one-step transition probabilities can be represented using a one-step transition matrix P, with p_ij = P(X_{t+1} = j | X_t = i) for i, j in {0, 1, ..., M}:

    [ p_00  p_01  ...  p_0M ]
P = [ p_10  p_11  ...  p_1M ]
    [ ...   ...   ...  ...  ]
    [ p_M0  p_M1  ...  p_MM ]
18. Markov Chain State Transition Diagram

A Markov chain with its stationary transition probabilities can also be illustrated using a state transition diagram.

Weather example (state 0 = Dry, state 1 = Rain):

[Diagram: two states, Dry (0) and Rain (1), with arcs labeled by the transition probabilities 0.8, 0.2, 0.6, 0.4.]
19. Weather Example with Variable Probabilities

State transition diagram:

[Diagram: Dry (0) stays dry with probability p and moves to Rain (1) with probability 1-p; Rain stays rainy with probability 1-q and moves to Dry with probability q.]

Probability transition matrix:

         Dry 0  Rain 1
Dry 0  [   p     1-p  ]
Rain 1 [   q     1-q  ]
20. Gambler's Ruin Stochastic Process

Consider again the gambling game, now with probability p = 0.4 of winning on any turn; you start with $1 and stop when you go broke or have $3.
- What are the random variables of interest, X_t? X_t = your fortune (in $) on turn t.
- What are the possible values (states) of the random variables? {0, 1, 2, 3}
- What is the index t? The turn of the game.
21. Gambler's Ruin as a Markov Chain

Does the Gambler's Ruin stochastic process satisfy the Markovian property? Yes. Intuitively, given your current gambling fortune and all past gambling fortunes, the conditional probability of your fortune after one more gamble is independent of your past fortunes and depends only on your current fortune. More formally, P(X_5 = 0 | X_4 = 1, X_3 = 2, X_2 = 1, X_1 = 2, X_0 = 1) = 0.6; in writing this number, you did not care about the values of X_3, X_2, X_1, X_0.

Is the Gambler's Ruin stochastic process stationary? Yes. Intuitively, the probability of winning is the same on every turn of the game. More formally, P(X_{t+1} = 0 | X_t = 1) = 0.6 for all t.
22. Gambler's Ruin Markov Chain

Suppose the probability of winning on any turn is p = 0.4.

[Diagram: states 0, 1, 2, 3; from states 1 and 2 you move up with probability 0.4 and down with probability 0.6; states 0 and 3 are absorbing.]

One-step transition matrix P (rows and columns are states 0 through 3):

    [ 1    0    0    0   ]
P = [ 0.6  0    0.4  0   ]
    [ 0    0.6  0    0.4 ]
    [ 0    0    0    1   ]
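This matrix can be built in code and raised to a high power to read off the absorption probabilities asked about on the earlier Gambler's Ruin slide (an added sketch; the helper name is ours):

```python
# Gambler's Ruin one-step transition matrix with p = 0.4 (states 0..3).
p = 0.4
P = [[1.0,   0.0, 0.0, 0.0],   # $0: broke, absorbing
     [1 - p, 0.0, p,   0.0],   # $1: lose -> $0, win -> $2
     [0.0, 1 - p, 0.0, p],     # $2: lose -> $1, win -> $3
     [0.0,   0.0, 0.0, 1.0]]   # $3: target reached, absorbing

def mat_mult(A, B):
    """Multiply two square matrices stored as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Raise P to a high power: entry [1][3] approaches the probability of
# eventually ending with $3 when starting from $1.
Pn = P
for _ in range(100):
    Pn = mat_mult(Pn, P)
print(round(Pn[1][3], 4))  # ≈ 0.2105, i.e. 4/19
```

The rows of the high power concentrate on the two absorbing states 0 and 3, which is the "absorbing states" behavior listed in the overview.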
23. Gambler's Ruin Example with Variable Probability

Probability p of winning on any turn.

[Diagram: states 0, 1, 2, 3; from states 1 and 2 you move up with probability p and down with probability 1-p; states 0 and 3 are absorbing.]

Probability transition matrix:

    [ 1    0    0    0 ]
P = [ 1-p  0    p    0 ]
    [ 0    1-p  0    p ]
    [ 0    0    0    1 ]
24. Inventory Example

A camera store stocks a particular model of camera. Orders may be placed on Saturday night, and the cameras are delivered first thing Monday morning. The store uses an (s, S) policy:
- If the number of cameras in inventory is less than s, order enough to bring the supply up to S.
- If the number of cameras in inventory is greater than or equal to s, do not order any cameras.

The store sets the policy with s = 1 and S = 3:
- If zero cameras are on hand on Saturday night, order 3 cameras.
- If one or more cameras are on hand on Saturday night, do not order any cameras.
25. Inventory Example

- What are the random variables of interest, X_t? X_t = number of cameras in inventory on Saturday night of week t.
- What are the possible values (states) of these random variables? {0, 1, 2, 3}
- What is the index t? Weeks.
26. Inventory Example

Graph one possible realization of the stochastic process:

[Figure: a sample path with X_0 = 3, X_1 = 2, X_2 = 0, X_3 = 1 on successive Saturday nights.]
27. Inventory Example

Describe X_{t+1} as a function of X_t, the number of cameras on hand at the end of week t, under the (s = 1, S = 3) inventory policy. X_0 represents the initial number of cameras on hand. Let D_i represent the demand for cameras during week i, and assume the D_i are independent and identically distributed (iid) random variables. Then:

X_{t+1} = max{3 - D_{t+1}, 0}    if X_t = 0 (order)
X_{t+1} = max{X_t - D_{t+1}, 0}  if X_t >= 1 (don't order)
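The recursion above translates directly into code. This sketch (an added illustration; demands here are placeholders, since the slides only later model demand as Poisson with mean 1) simulates a few weeks of the inventory chain:

```python
import random

def next_inventory(x_t, demand):
    """One step of the (s=1, S=3) inventory recursion:
    order up to 3 cameras when X_t = 0, otherwise sell from stock on hand."""
    if x_t == 0:
        return max(3 - demand, 0)   # ordered up to S = 3, then sold
    return max(x_t - demand, 0)     # no order placed

# One hypothetical realization over five weeks.
random.seed(3)
x = 3  # X_0
for t in range(5):
    d = random.randint(0, 3)        # placeholder weekly demand
    x = next_inventory(x, d)
    assert 0 <= x <= 3              # the state space is {0, 1, 2, 3}
    print(f"week {t + 1}: demand = {d}, inventory = {x}")
```

Note that the recursion keeps the state in {0, 1, 2, 3} no matter how large the demand is, because of the max with 0.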
28. State Transition Diagram: Inventory Example

[Diagram: states 0 through 3 with arcs labeled by demand probabilities such as P(D = 0), P(D = 1), P(D = 2), P(D >= 1), P(D >= 2), P(D >= 3).]

Probability transition matrix (rows and columns are states 0 through 3):

    [ P(D >= 3)  P(D = 2)  P(D = 1)  P(D = 0) ]
P = [ P(D >= 1)  P(D = 0)  0         0        ]
    [ P(D >= 2)  P(D = 1)  P(D = 0)  0        ]
    [ P(D >= 3)  P(D = 2)  P(D = 1)  P(D = 0) ]
29. Demand Probabilities with the Poisson Distribution

Assume D_t ~ Poisson(λ = 1) for all t. Recall the Poisson pmf:

P(D = n) = λ^n e^{-λ} / n!   for n = 0, 1, 2, ...

With λ = 1:

P(D = 0) = 1^0 e^{-1} / 0! = e^{-1} ≈ 0.368
P(D = 1) = 1^1 e^{-1} / 1! = e^{-1} ≈ 0.368
P(D = 2) = 1^2 e^{-1} / 2! = e^{-1}/2 ≈ 0.184
P(D >= 1) = 1 - P(D = 0) ≈ 0.632
P(D >= 2) = 1 - P(D = 0) - P(D = 1) ≈ 0.264
P(D >= 3) = 1 - P(D = 0) - P(D = 1) - P(D = 2) ≈ 0.080
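These values can be reproduced directly from the pmf; the small Python computation below is added here as a sanity check:

```python
from math import exp, factorial

lam = 1.0  # weekly demand rate: D_t ~ Poisson(1)

def pois(n):
    """Poisson pmf: P(D = n) = lam^n * e^(-lam) / n!"""
    return lam ** n * exp(-lam) / factorial(n)

p0, p1, p2 = pois(0), pois(1), pois(2)
p_ge1 = 1 - p0              # P(D >= 1)
p_ge2 = 1 - p0 - p1         # P(D >= 2)
p_ge3 = 1 - p0 - p1 - p2    # P(D >= 3)
print(round(p0, 3), round(p1, 3), round(p2, 3))           # ≈ 0.368 0.368 0.184
print(round(p_ge1, 3), round(p_ge2, 3), round(p_ge3, 3))  # ≈ 0.632 0.264 0.080
```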
30. Inventory Example Transition Probabilities

Write P, the one-step transition matrix:

    [ P(D >= 3)  P(D = 2)  P(D = 1)  P(D = 0) ]     [ 0.080  0.184  0.368  0.368 ]
P = [ P(D >= 1)  P(D = 0)  0         0        ]  =  [ 0.632  0.368  0      0     ]
    [ P(D >= 2)  P(D = 1)  P(D = 0)  0        ]     [ 0.264  0.368  0.368  0     ]
    [ P(D >= 3)  P(D = 2)  P(D = 1)  P(D = 0) ]     [ 0.080  0.184  0.368  0.368 ]
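Assembling the numeric matrix in code makes it easy to verify that each row is a valid probability distribution (a sketch added here; variable names are ours):

```python
from math import exp

# Poisson(1) demand probabilities, computed exactly and rounded for display.
p0 = p1 = exp(-1)       # P(D = 0) = P(D = 1) = e^(-1)
p2 = exp(-1) / 2        # P(D = 2)
ge1 = 1 - p0            # P(D >= 1)
ge2 = 1 - p0 - p1       # P(D >= 2)
ge3 = 1 - p0 - p1 - p2  # P(D >= 3)

# One-step transition matrix for the inventory chain, states 0..3.
P = [[ge3, p2, p1, p0],
     [ge1, p0, 0.0, 0.0],
     [ge2, p1, p0, 0.0],
     [ge3, p2, p1, p0]]

# Each row is a conditional distribution over next states, so it sums to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
print([[round(entry, 3) for entry in row] for row in P])
```

The row-sum check is a useful habit: any transition matrix whose rows do not sum to 1 has a modeling or transcription error.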
More informationMarkov Chains. As part of Interdisciplinary Mathematical Modeling, By Warren Weckesser Copyright c 2006.
Markov Chains As part of Interdisciplinary Mathematical Modeling, By Warren Weckesser Copyright c 2006 1 Introduction A (finite) Markov chain is a process with a finite number of states (or outcomes, or
More informationMAS275 Probability Modelling Exercises
MAS75 Probability Modelling Exercises Note: these questions are intended to be of variable difficulty. In particular: Questions or part questions labelled (*) are intended to be a bit more challenging.
More informationDefinition and Examples of DTMCs
Definition and Examples of DTMCs Natarajan Gautam Department of Industrial and Systems Engineering Texas A&M University 235A Zachry, College Station, TX 77843-3131 Email: gautam@tamuedu Phone: 979-845-5458
More informationRandom Processes. DS GA 1002 Probability and Statistics for Data Science.
Random Processes DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Aim Modeling quantities that evolve in time (or space)
More informationReinforcement Learning Wrap-up
Reinforcement Learning Wrap-up Slides courtesy of Dan Klein and Pieter Abbeel University of California, Berkeley [These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley.
More informationInventory Model (Karlin and Taylor, Sec. 2.3)
stochnotes091108 Page 1 Markov Chain Models and Basic Computations Thursday, September 11, 2008 11:50 AM Homework 1 is posted, due Monday, September 22. Two more examples. Inventory Model (Karlin and Taylor,
More informationQuestion Paper Code : AEC11T03
Hall Ticket No Question Paper Code : AEC11T03 VARDHAMAN COLLEGE OF ENGINEERING (AUTONOMOUS) Affiliated to JNTUH, Hyderabad Four Year B Tech III Semester Tutorial Question Bank 2013-14 (Regulations: VCE-R11)
More informationELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables
Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Random Variable Discrete Random
More informationStochastic processes and stopping time Exercises
Stochastic processes and stopping time Exercises Exercise 2.1. Today is Monday and you have one dollar in your piggy bank. Starting tomorrow, every morning until Friday (inclusively), you toss a coin.
More informationPlease simplify your answers to the extent reasonable without a calculator. Show your work. Explain your answers, concisely.
Please simplify your answers to the extent reasonable without a calculator. Show your work. Explain your answers, concisely. 1. Consider a game which involves flipping a coin: winning $1 when it lands
More informationBinomial Probability. Permutations and Combinations. Review. History Note. Discuss Quizzes/Answer Questions. 9.0 Lesson Plan
9.0 Lesson Plan Discuss Quizzes/Answer Questions History Note Review Permutations and Combinations Binomial Probability 1 9.1 History Note Pascal and Fermat laid out the basic rules of probability in a
More informationProbability the chance that an uncertain event will occur (always between 0 and 1)
Quantitative Methods 2013 1 Probability as a Numerical Measure of the Likelihood of Occurrence Probability the chance that an uncertain event will occur (always between 0 and 1) Increasing Likelihood of
More informationWeek 04 Discussion. a) What is the probability that of those selected for the in-depth interview 4 liked the new flavor and 1 did not?
STAT Wee Discussion Fall 7. A new flavor of toothpaste has been developed. It was tested by a group of people. Nine of the group said they lied the new flavor, and the remaining 6 indicated they did not.
More informationComputer Science CPSC 322. Lecture 18 Marginalization, Conditioning
Computer Science CPSC 322 Lecture 18 Marginalization, Conditioning Lecture Overview Recap Lecture 17 Joint Probability Distribution, Marginalization Conditioning Inference by Enumeration Bayes Rule, Chain
More informationSTAT 516: Basic Probability and its Applications
Lecture 3: Conditional Probability and Independence Prof. Michael September 29, 2015 Motivating Example Experiment ξ consists of rolling a fair die twice; A = { the first roll is 6 } amd B = { the sum
More informationRandom Walks and Quantum Walks
Random Walks and Quantum Walks Stephen Bartlett, Department of Physics and Centre for Advanced Computing Algorithms and Cryptography, Macquarie University Random Walks and Quantum Walks Classical random
More informationLecture 3 Probability Basics
Lecture 3 Probability Basics Thais Paiva STA 111 - Summer 2013 Term II July 3, 2013 Lecture Plan 1 Definitions of probability 2 Rules of probability 3 Conditional probability What is Probability? Probability
More informationWhere are we in CS 440?
Where are we in CS 440? Now leaving: sequential deterministic reasoning Entering: probabilistic reasoning and machine learning robability: Review of main concepts Chapter 3 Making decisions under uncertainty
More informationLecture 5: Introduction to Markov Chains
Lecture 5: Introduction to Markov Chains Winfried Just Department of Mathematics, Ohio University January 24 26, 2018 weather.com light The weather is a stochastic process. For now we can assume that this
More informationWhat is Probability? Probability. Sample Spaces and Events. Simple Event
What is Probability? Probability Peter Lo Probability is the numerical measure of likelihood that the event will occur. Simple Event Joint Event Compound Event Lies between 0 & 1 Sum of events is 1 1.5
More informationLecture 1: Basics of Probability
Lecture 1: Basics of Probability (Luise-Vitetta, Chapter 8) Why probability in data science? Data acquisition is noisy Sampling/quantization external factors: If you record your voice saying machine learning
More informationSenior Math Circles November 19, 2008 Probability II
University of Waterloo Faculty of Mathematics Centre for Education in Mathematics and Computing Senior Math Circles November 9, 2008 Probability II Probability Counting There are many situations where
More information