Discrete Markov Processes. 1. Introduction

1.1. Probability Spaces and Random Variables

Sample space Ω: a model for events. F is a family of subsets of Ω such that
(1) if A ∈ F, then Aᶜ ∈ F,
(2) if A₁, A₂, ... ∈ F, then A₁ ∪ A₂ ∪ ... ∈ F,
(3) Ω ∈ F.
Such a family F is called a σ-algebra. A probability measure is a function P: F → [0, 1] such that
(a) P(Ω) = 1,
(b) if the events Aᵢ are disjoint, then P(A₁ ∪ A₂ ∪ ...) = P(A₁) + P(A₂) + ...

Events A₁, A₂, ... are independent if, for any finite subset of indices n₁, n₂, ...,
P(Aₙ₁ ∩ Aₙ₂ ∩ ...) = P(Aₙ₁) P(Aₙ₂) ...
The triple (Ω, F, P) is a probability space.

X: Ω → ℝ is a random variable if, for all x, {ω : X(ω) ≤ x} ∈ F. A collection of random variables {Xₜ, t ∈ T}, defined on (Ω, F, P), is a stochastic process.

F(x) = P(X ≤ x) is the cumulative distribution function of X; F′(x) = f(x) is the density function of X (if it exists).
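On a finite sample space the σ-algebra axioms and additivity can be checked exhaustively. A minimal sketch, assuming Ω = {1, 2, 3, 4} with F = the power set and the uniform measure P(A) = |A|/|Ω| (all choices here are illustrative, not from the notes):

```python
from itertools import combinations

omega = frozenset({1, 2, 3, 4})

# F = the power set of omega: the largest sigma-algebra on omega.
F = {frozenset(c) for r in range(len(omega) + 1)
     for c in combinations(omega, r)}

# Sigma-algebra axioms; omega is finite, so countable unions reduce to finite ones.
assert all(omega - A in F for A in F)          # (1) closed under complements
assert all(A | B in F for A in F for B in F)   # (2) closed under unions
assert omega in F                              # (3)

# Uniform probability measure P(A) = |A| / |omega|.
def P(A):
    return len(A) / len(omega)

A, B = frozenset({1, 2}), frozenset({3, 4})    # disjoint events
print(P(omega), P(A | B), P(A) + P(B))         # additivity: P(A u B) = P(A) + P(B)
```

For a σ-algebra smaller than the power set, e.g. {∅, {1, 2}, {3, 4}, Ω}, the same three assertions go through; axiom (2) is what fails if an arbitrary family of subsets is used instead.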

G(s) = E[exp(sX)] is the moment generating function of X. Abbreviations: c.d.f., d.f., and m.g.f.

Ex. 1. X ~ N(0, 1), or X has the standard normal distribution, if f(x) = (2π)^(-1/2) exp(-x²/2).
Ex. 2. X ~ Exp(λ), λ > 0, or X has the exponential distribution, if F(x) = 1 - exp(-λx) for x > 0.
Ex. 3. X ~ Po(λ), λ > 0, or X has the Poisson distribution, if P(X = k) = exp(-λ) λᵏ/k! for k = 0, 1, 2, ...
Ex. 4. X ~ Bin(n, p), 0 < p < 1 and n = 1, 2, ..., or X has the binomial distribution, if P(X = k) = n!/(k!(n - k)!) pᵏ (1 - p)ⁿ⁻ᵏ for k = 0, 1, ..., n.
Ex. 5. X ~ Geo(p), 0 < p < 1, or X has the geometric distribution, if P(X = k) = p(1 - p)ᵏ⁻¹, k = 1, 2, ...
Ex. 6. T = {1, 2, 3, ...}, Xₜ ~ Ber(p), t ∈ T, independent.
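The m.g.f. of Ex. 2 can be checked by Monte Carlo: for X ~ Exp(λ), G(s) = E[exp(sX)] = λ/(λ - s) for s < λ. A quick sketch (the rate λ = 2 and the point s = 0.5 are arbitrary illustrative choices):

```python
import math
import random

random.seed(0)

lam = 2.0    # rate of the exponential distribution (illustrative choice)
s = 0.5      # m.g.f. argument; must satisfy s < lam
n = 200_000

# Monte Carlo estimate of G(s) = E[exp(s*X)] for X ~ Exp(lam).
samples = [random.expovariate(lam) for _ in range(n)]
mgf_mc = sum(math.exp(s * x) for x in samples) / n

# Closed form: G(s) = lam / (lam - s) for s < lam.
mgf_exact = lam / (lam - s)
print(mgf_mc, mgf_exact)
```

The same loop with `math.exp(s * x)` replaced by an indicator `x <= x0` estimates the c.d.f. instead of the m.g.f.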

Then {Xₜ, t ∈ T} is a Bernoulli process. Here Ω = {0, 1} × {0, 1} × ..., and F = all subsets of Ω.

1.2. Structure of Gambling Problems

T = {1, 2, 3, ...} corresponds to a sequence of games.
Def. Xₜ = 1 if the player wins game t, Xₜ = 0 otherwise. Assume the Bernoulli process model of Ex. 6.
Suppose each game costs 1 unit for the player. If the player wins, he gets y + 1 units.
Def. Yₜ = result (or winnings) of game t: Yₜ = -1 if the player loses, Yₜ = y > 0 if the player wins. If y = (1 - p)/p, then E[Yₜ] = 0, and the game is fair.
Def. Nₜ = X₁ + ... + Xₜ = number of times the player wins in the first t games
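The fairness condition y = (1 - p)/p can be checked by simulating the winnings Yₜ directly. A sketch with the illustrative choice p = 0.4 (so y = 1.5):

```python
import random

random.seed(1)

p = 0.4              # win probability (illustrative choice)
y = (1 - p) / p      # payoff that makes the game fair: E[Y_t] = 0
n = 100_000

# Y_t = y on a win (probability p), -1 on a loss (probability 1 - p).
winnings = [y if random.random() < p else -1.0 for _ in range(n)]
mean_Y = sum(winnings) / n
print(mean_Y)        # close to 0 for the fair payoff
```

Replacing y by anything smaller makes the sample mean drift negative, which is the "unfair" regime used in the gambling application below.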

Sₜ = Y₁ + ... + Yₜ = total winnings in the first t games.
Ex. If Fₜ = history up to t, then E[Nₜ₊₁ | Fₜ] = Nₜ + p.
We can apply the Central Limit Theorem to Nₜ and Sₜ!

Borel–Cantelli lemma. Let A₁, A₂, ... be events with P(Aₖ) = pₖ. Suppose Σₖ pₖ < ∞. Then, with probability 1, only finitely many of the events Aₖ occur.
Proof. Let Bₙ = ∪_{k≥n} Aₖ = at least one of the events Aₙ, Aₙ₊₁, ... occurs, and B = ∩_{n≥1} Bₙ = infinitely many of the events A₁, A₂, ... occur. Then P(B) ≤ P(Bₙ) ≤ Σ_{k≥n} pₖ → 0.

Application to gambling: a Bernoulli process with 0 < p < 1/2, and y = 1 (unfair!).
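The lemma can be illustrated numerically with independent events Aₖ of probability pₖ = 1/k², a summable sequence (Σₖ pₖ = π²/6). This choice of pₖ is an assumption for the sketch, not from the notes:

```python
import random

random.seed(2)

# Independent events A_k with P(A_k) = 1/k**2, so sum_k P(A_k) < infinity.
# Borel-Cantelli: with probability 1 only finitely many A_k occur.
occurred = [k for k in range(1, 1_000_000)
            if random.random() < 1.0 / k**2]
print(len(occurred), occurred[-1])
```

A₁ has probability 1 and always occurs, but the total number of occurrences over a million trials stays tiny, matching the expected count Σₖ 1/k² ≈ 1.64.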

P(the player breaks even infinitely often) = P(n wins in 2n games for infinitely many n) = 0.
(Use Stirling's formula: n! ~ (2πn)^(1/2) nⁿ e⁻ⁿ.)

1.3. Conditional Probabilities and Expectations

Def. A, B ∈ F, P(B) > 0. The conditional probability of A given B is P(A | B) = P(A ∩ B)/P(B).
Ex. A = {X > a}, B = {X > b}, a > b > 0, where X ~ Exp(λ). Then P(A | B) = exp(-(a - b)λ).
Def. X, Y discrete rv's with P(Y = y) > 0. The conditional expectation of X given Y = y is E[X | Y = y] = Σₓ x P(X = x | Y = y). Note that E[X | Y] is a rv that depends on Y, but is a constant w.r.t. X! Recall: E[E[X | Y]] = E[X].
Ex. Xᵢ ~ Po(λᵢ), λᵢ > 0, i = 1, 2, independent. Y = X₁ + X₂. Then the distribution of X₁ given Y = y is (X₁ | Y = y) ~ Bin(y, λ₁/(λ₁ + λ₂)). Therefore, E[X₁ | Y = y] = y λ₁/(λ₁ + λ₂).
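The Poisson example above can be checked by conditioning in a simulation: sample (X₁, X₂), keep only the runs with X₁ + X₂ = y, and average X₁. The rates λ₁ = 1.5, λ₂ = 2.5 and the value y = 4 are arbitrary illustrative choices, and the Poisson sampler is Knuth's product-of-uniforms method:

```python
import math
import random

random.seed(3)

def poisson(lam):
    # Knuth's method: count uniforms until their product drops below exp(-lam).
    limit, k, prod = math.exp(-lam), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

lam1, lam2 = 1.5, 2.5    # illustrative rates
y = 4                    # conditioning value
vals = []
for _ in range(400_000):
    x1, x2 = poisson(lam1), poisson(lam2)
    if x1 + x2 == y:     # condition on the event {Y = y}
        vals.append(x1)

cond_mean = sum(vals) / len(vals)
theory = y * lam1 / (lam1 + lam2)   # E[X1 | Y = y] = y*lam1/(lam1+lam2)
print(cond_mean, theory)
```

With these rates the theoretical value is 4 · 1.5/4 = 1.5, and the conditional sample mean lands close to it.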

Def. X, Y discrete rv's with P(Y = y) > 0 for all y. The conditional variance of X given Y is Var(X | Y) = E[X² | Y] - E[X | Y]².
Th. Var(X) = E[Var(X | Y)] + Var(E[X | Y]).

1.4. Expectation of a Waiting Time

A rv X ≥ 0 defined on (Ω, F, P) can be interpreted as a waiting time, with survival function p(x) = P(X > x). Define a Bernoulli process {Xₜ, t ∈ T} such that Xₜ(ω) = 1 if X(ω) > t, and Xₜ(ω) = 0 otherwise, for all t. Then, for an integer-valued waiting time, it follows that E[X] = Σ_{t≥0} P(X > t).
Ex. 1. X ~ Exp(λ), λ > 0, and a > 0. Then E[X] = 1/λ, and E[X | X > a] = a + 1/λ.
Ex. 2. X ~ Geo(p), 0 < p < 1, satisfies P(X > k) = (1 - p)ᵏ, so E[X] = 1/p.
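Ex. 2 is a direct instance of the tail-sum formula, and the geometric series can be summed numerically. A quick check (p = 0.3 is an arbitrary choice; the infinite sum is truncated at 10,000 terms, far beyond machine precision):

```python
p = 0.3   # success probability (illustrative choice)

# Tail-sum formula for a nonnegative integer-valued waiting time:
# E[X] = sum_{k>=0} P(X > k); for X ~ Geo(p), P(X > k) = (1 - p)**k.
tail_sum = sum((1 - p) ** k for k in range(10_000))
print(tail_sum, 1 / p)   # both equal 1/p
```

The same truncation works for any survival function with geometric-or-faster decay; for heavy tails the truncation point would matter.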

1.5. Two Applications

Ex. 1. St. Petersburg paradox: if you increase bets to cover past losses, but stop at the first win, you win with certainty in fair, and even in unfavorable, games!
Ex. 2. N ~ Po(λ) is the number of accidents, Xᵢ ~ Po(μ) is the number hospitalized in accident i, Y = X₁ + ... + X_N. Var(Y) = ?
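Ex. 2 is answered by the conditional-variance theorem of 1.3: Var(Y) = E[N] Var(X₁) + Var(N) E[X₁]² = λμ + λμ² = λμ(1 + μ). A simulation sketch (λ = 3, μ = 2 are arbitrary illustrative rates; the Poisson sampler is Knuth's method again):

```python
import math
import random

random.seed(5)

def poisson(lam):
    # Knuth's method: count uniforms until their product drops below exp(-lam).
    limit, k, prod = math.exp(-lam), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

lam, mu = 3.0, 2.0   # illustrative rates for N and the X_i
n = 100_000

# Y = X_1 + ... + X_N: a Poisson number of Poisson summands.
ys = [sum(poisson(mu) for _ in range(poisson(lam))) for _ in range(n)]

mean_y = sum(ys) / n
var_y = sum((v - mean_y) ** 2 for v in ys) / n
theory = lam * mu * (1 + mu)   # Var(Y) = lam*mu + lam*mu**2
print(mean_y, var_y, theory)
```

Here E[Y] = λμ = 6 by E[E[Y | N]] = E[N]E[X₁], and the sample variance lands near λμ(1 + μ) = 18.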