Steven R. Dunbar
Department of Mathematics
203 Avery Hall
University of Nebraska-Lincoln
Lincoln, NE 68588-0130
http://www.math.unl.edu
Voice: 402-472-3731
Fax: 402-472-8466

Stochastic Processes and Advanced Mathematical Finance

Stochastic Processes

Rating: Student: contains scenes of mild algebra or calculus that may require guidance.

Section Starter Question

Name something that is both random and varies over time. Does the randomness depend on the history of the process or only on its current state?

Key Concepts

1. A sequence or interval of random outcomes, that is, a string of random outcomes depending on time as well as on the underlying randomness, is a stochastic process. With the inclusion of a time variable, the rich range of random outcome distributions becomes a great variety of stochastic processes. Nevertheless, the most commonly studied types of random processes have connections.

2. Stochastic processes are functions of two variables, the time index and the sample point. As a consequence, stochastic processes can be interpreted in several ways. The simplest is to look at the stochastic process at a fixed value of time. The result is a random variable with a probability distribution, just as studied in elementary probability.

3. Another way to look at a stochastic process is to consider it as a function of the sample point ω. Each ω maps to an associated function of time X(t). This means that one may look at a stochastic process as a mapping from the sample space Ω to a set of functions. In this interpretation, stochastic processes are a generalization of the random variables of elementary probability theory.

Vocabulary

1. A sequence or interval of random outcomes, that is, random outcomes dependent on time, is a stochastic process.

2. Let J be a subset of the non-negative real numbers. Let Ω be a set, usually called the sample space or probability space. An element ω of Ω is a sample point or sample path. Let S be a set of values, often the real numbers, called the state space. A stochastic process is a function X : J × Ω → S, that is, a function of both time and the sample point with values in the state space.

3. The particular stochastic process usually called a simple random walk T_n gives the position in the integers after taking a step to the right for a head and a step to the left for a tail.

4. A generalization of a Markov chain is a Markov process. In a Markov process, we allow the index set to be either a discrete set of times, such as the integers, or an interval, such as the non-negative reals. Likewise the state space may be either a set of discrete values or an interval, even the whole real line. In mathematical notation, a stochastic process X(t) is called Markov if for every n and t_1 < t_2 < ... < t_n and real number x_n, we have

P[X(t_n) ≤ x_n | X(t_{n-1}), ..., X(t_1)] = P[X(t_n) ≤ x_n | X(t_{n-1})].

Many of the models in this text will naturally be Markov processes because of the intuitive appeal of this memory-less property.

Mathematical Ideas

Definition and Notations

A sequence or interval of random outcomes, that is, random outcomes dependent on time, is a stochastic process. Stochastic is a synonym for random.

The word is of Greek origin and means "pertaining to chance" (Greek stokhastikos, skillful in aiming; from stokhastes, diviner; from stokhazesthai, to guess at, to aim at; and from stochos, target, aim, guess). The modifier stochastic indicates that a subject is random in some aspect. Stochastic is often used in contrast to deterministic, which means that random phenomena are not involved.

More formally, let J be a subset of the non-negative real numbers. Usually J is the nonnegative integers 0, 1, 2, ... or the nonnegative reals {t : t ≥ 0}. J is the index set of the process, and we usually refer to t ∈ J as the time variable. Let Ω be a set, usually called the sample space or probability space. An element ω of Ω is a sample point or sample path. Let S be a set of values, often the real numbers, called the state space. A stochastic process is a function X : J × Ω → S, a function of both time and the sample point to the state space.

Because we are usually interested in the probability of sets of sample points that lead to a set of outcomes in the state space, and not in the individual sample points, the common practice is to suppress the dependence on the sample point. That is, we usually write X(t) instead of the more complete X(t, ω). Furthermore, especially if the time set is discrete, say the nonnegative integers, we usually write the index or time variable as a subscript. Thus X_n would be the usual notation for a stochastic process indexed by the nonnegative integers, and X_t or X(t) is a stochastic process indexed by the non-negative reals. Because of the randomness, we can think of a stochastic process as a random sequence if the index set is the nonnegative integers and as a random function if the time variable is the nonnegative reals.

Examples

The most fundamental example of a stochastic process is a coin flip sequence. The index set is the set of positive integers, counting the number of the flip. The sample space is the set of all possible infinite coin flip sequences Ω = {HHTHTTTHT..., THTHTTHHT..., ...}. We take the state space to be the set {1, 0}, so that X_n = 1 if flip n comes up heads and X_n = 0 if the flip comes up tails. Then the coin flip stochastic process can be viewed as the set of all random sequences of 1's and 0's. An associated random process is to take X_0 = 0 and S_n = \sum_{j=0}^{n} X_j for n ≥ 0. Now the state space is the set of nonnegative integers. The stochastic process S_n counts the number of heads encountered in the flipping sequence up to flip number n.
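To make the definitions concrete, here is a minimal simulation sketch in Python of one sample path of the coin flip process X_n and the associated head-counting process S_n. The path length, fair-coin probability, and seed are arbitrary choices made only for illustration.

    import random

    def coin_flip_path(n_flips, p_heads=0.5, seed=None):
        """Simulate one sample path of the coin flip process X_n
        and the head-counting process S_n = X_0 + X_1 + ... + X_n."""
        rng = random.Random(seed)
        X = [0]                     # X_0 = 0 by convention
        S = [0]                     # S_0 = 0
        for _ in range(n_flips):
            x = 1 if rng.random() < p_heads else 0   # 1 = heads, 0 = tails
            X.append(x)
            S.append(S[-1] + x)
        return X, S

    X, S = coin_flip_path(10, seed=42)
    print("flips  X_n:", X)
    print("counts S_n:", S)

Each run of the function corresponds to choosing one sample point ω; the returned lists are the values X_n(ω) and S_n(ω) along that single path.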

Alternatively, we can take the same index set and the same probability space of coin flip sequences and define Y_n = 1 if flip n comes up heads and Y_n = -1 if the flip comes up tails. This is just another way to encode the coin flips, now as random sequences of 1's and -1's. A more interesting associated random process is to take Y_0 = 0 and T_n = \sum_{j=0}^{n} Y_j for n ≥ 0. Now the state space is the set of integers. The stochastic process T_n gives the position on the integer number line after taking a step to the right for a head and a step to the left for a tail. This particular stochastic process is usually called a simple random walk. We can generalize the random walk by allowing the state space to be the set of points with integer coordinates in two-, three-, or higher-dimensional space, called the integer lattice, and using some random device to select the direction at each step.

Markov Chains

A Markov chain is a sequence of random variables X_j where the index j runs through 0, 1, 2, .... The sample space is not specified explicitly, but it involves a sequence of random selections detailed by the effect in the state space. The state space may be either a finite or an infinite set of discrete states. The defining property of a Markov chain is that

P[X_j = l | X_0 = k_0, X_1 = k_1, ..., X_{j-1} = k_{j-1}] = P[X_j = l | X_{j-1} = k_{j-1}].

In more detail, the probability of transition from state k_{j-1} at time j-1 to state l at time j depends only on k_{j-1} and l, not on the history X_0 = k_0, X_1 = k_1, ..., X_{j-2} = k_{j-2} of how the process got to k_{j-1}.

A simple random walk is an example of a Markov chain. The states are the integers and the transition probabilities are

P[X_j = l | X_{j-1} = k] = 1/2 if l = k - 1 or l = k + 1,
P[X_j = l | X_{j-1} = k] = 0 otherwise.

Another example is the position of a game piece in the board game Monopoly. The index set is the nonnegative integers listing the plays of the game, with X_0 denoting the starting position at the Go corner. The sample space is the set of infinite sequences of rolls of a pair of dice. The state space is the set of 40 real-estate properties and other positions around the board.
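The following sketch, with parameter choices that are illustrative rather than prescribed by the text, simulates the simple random walk T_n described above and empirically checks its Markov chain transition behavior: from any state the walk moves up or down by one, each with probability about 1/2.

    import random

    def simple_random_walk(n_steps, seed=None):
        """Simulate T_n = Y_1 + ... + Y_n with Y_j = +1 (head) or -1 (tail)."""
        rng = random.Random(seed)
        T = [0]                                   # T_0 = 0
        for _ in range(n_steps):
            step = 1 if rng.random() < 0.5 else -1
            T.append(T[-1] + step)
        return T

    T = simple_random_walk(10_000, seed=1)
    # Empirical check of the transition probabilities: whatever the current
    # state k, the next state is k + 1 with frequency close to 1/2.
    up_steps = sum(1 for a, b in zip(T, T[1:]) if b == a + 1)
    print("fraction of +1 steps:", up_steps / (len(T) - 1))   # close to 0.5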

Markov chains are an important and useful class of stochastic processes. Markov chains extended to making optimal decisions under uncertainty are Markov decision processes. Another extension, to signal processing and bioinformatics, is the hidden Markov model. Mathematicians have extensively studied and classified Markov chains and their extensions, but we will not examine them carefully in this text.

A generalization of a Markov chain is a Markov process. In a Markov process, we allow the index set to be either a discrete set of times, such as the integers, or an interval, such as the nonnegative reals. Likewise the state space may be either a set of discrete values or an interval, even the whole real line. In mathematical notation, a stochastic process X(t) is called Markov if for every n and t_1 < t_2 < ... < t_n and real number x_n, we have

P[X(t_n) ≤ x_n | X(t_{n-1}), ..., X(t_1)] = P[X(t_n) ≤ x_n | X(t_{n-1})].

Many of the models in this text will naturally be Markov processes because of the intuitive modeling appeal of this memory-less property.

Many stochastic processes are naturally expressed as taking place in a discrete state space with a continuous time index. For example, consider radioactive decay, counting the number of atomic decays that have occurred up to time t by using a Geiger counter. The discrete state variable is the number of clicks heard. The mathematical Poisson process is an excellent model of this physical process.

More generally, instead of radioactive events giving a single daughter particle, imagine a birth event with a random number of offspring (distributed according to some probability law) born at random times. Then the stochastic process measures the population in time. These are birth processes, and they make excellent models in population biology and the physics of cosmic rays. Continuing to generalize, imagine that each individual in the population has a random life-span distributed according to some law, then dies. This gives a birth-and-death process. In another variation, imagine a disease in which a random number of susceptible individuals get infected, in turn infecting a random number of other individuals in the population, then recovering and becoming immune. The stochastic process counts the number of susceptible, infected, and recovered individuals at any time, an SIR epidemic process.
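Before turning to queues, here is a minimal sketch of the simplest discrete-state, continuous-time example above, the Poisson counting process, using the standard construction by independent exponential waiting times. The rate and time horizon are assumptions made only for this demonstration.

    import random

    def poisson_process_times(rate, t_max, seed=None):
        """Event times of a Poisson process on [0, t_max]: successive
        inter-event gaps are independent Exponential(rate) variables."""
        rng = random.Random(seed)
        times, t = [], 0.0
        while True:
            t += rng.expovariate(rate)    # waiting time to the next event
            if t > t_max:
                return times
            times.append(t)

    events = poisson_process_times(rate=2.0, t_max=5.0, seed=3)
    print("number of events N(5):", len(events))   # mean is rate * t_max = 10
    print("event times:", [round(t, 2) for t in events])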

In another variation, consider customers arriving at a service counter at random intervals with some specified distribution, often taken to be an exponential probability distribution with parameter λ. The customers get served one-by-one, each taking a random service time, again often taken to be exponentially distributed. The state space is the number of customers waiting for service, the queue length at any time. These are called queuing processes. Mathematically, these processes can be studied with compound Poisson processes.

Continuous Space Processes

Continuous space processes usually take the state space to be the real numbers or some interval of the reals. One example is the magnitude of noise on top of a signal, say a radio message. In practice the magnitude of the noise can be taken to be a random variable taking values in the real numbers and changing in time. Then subtracting off the known signal leaves a continuous-time, continuous-state-space stochastic process. To mitigate the noise's effect, engineers model the characteristics of the process. To model the noise means to specify the probability distribution of the random magnitude. A simple model is to take the distribution of values to be normally distributed, leading to the class of Gaussian processes, including white noise.

Another continuous-space and continuous-time stochastic process is a model of the motion of particles suspended in a liquid or a gas. The random thermal perturbations in a liquid are responsible for a random walk phenomenon known as Brownian motion, also called the Wiener process, and the collisions of molecules in a gas are a random walk responsible for diffusion. In this process, we measure the position of the particle over time, so the process is a map from the nonnegative real numbers to one-, two-, or three-dimensional real space. Random walks have fascinating mathematical properties. Scientists make the model more realistic by including the effects of inertia, leading to a more refined form of Brownian motion called the Ornstein-Uhlenbeck process.

Extending this idea to economics, we will model market prices of financial assets such as stocks as a continuous-time, continuous-space process. Random market forces create small but constantly occurring price changes. This results in a stochastic process from a continuous variable representing time to the reals or nonnegative reals representing prices. Refining the model so that prices are nonnegative leads to the stochastic process known as Geometric Brownian Motion.
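As a rough illustration of these continuous-time, continuous-state models, the sketch below samples a discretized Brownian motion path W and the geometric Brownian motion price path S driven by it. The drift, volatility, initial price, and grid size are assumed values for demonstration, and the time grid is only an approximation of the continuous-time process.

    import math
    import random

    def brownian_and_gbm(n_steps, dt=0.01, mu=0.05, sigma=0.2, s0=100.0, seed=None):
        """W accumulates independent Normal(0, dt) increments; the price path is
        S(t) = s0 * exp((mu - sigma^2 / 2) * t + sigma * W(t)) on the same grid."""
        rng = random.Random(seed)
        W, S = [0.0], [s0]
        for k in range(1, n_steps + 1):
            W.append(W[-1] + rng.gauss(0.0, math.sqrt(dt)))
            t = k * dt
            S.append(s0 * math.exp((mu - 0.5 * sigma ** 2) * t + sigma * W[-1]))
        return W, S

    W, S = brownian_and_gbm(500, seed=7)
    print("W(5) =", round(W[-1], 3), "  S(5) =", round(S[-1], 2))

Note that the exponential form keeps every simulated price strictly positive, which is exactly the refinement that motivates geometric Brownian motion as a price model.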

Family of Stochastic Processes

A sequence or interval of random outcomes, that is to say, a string of random outcomes depending on time as well as on the underlying randomness, is a stochastic process. With the inclusion of a time variable, the rich range of random outcome distributions becomes a great variety of stochastic processes. Nevertheless, the most commonly studied types of random processes have connections. A tree is in Figure 1, along with an indication of the stochastic process types studied in this text.

Ways to Interpret Stochastic Processes

Stochastic processes are functions of two variables, the time index and the sample point. As a consequence, stochastic processes can be interpreted in several ways. The simplest is to look at the stochastic process at a fixed value of time. The result is a random variable with a probability distribution, just as studied in elementary probability.

Another way to look at a stochastic process is to consider the stochastic process as a function of the sample point ω. Each ω maps to an associated function X(t). This means that one can look at a stochastic process as a mapping from the sample space Ω to a set of functions. In this interpretation, stochastic processes are a generalization of the random variables of elementary probability theory. In elementary probability theory, random variables are a mapping from a sample space to the real numbers; for stochastic processes the mapping is from a sample space to a space of functions. Now we ask questions like: "What is the probability of the set of functions that exceed a fixed value on a fixed time interval?"; "What is the probability of the set of functions having a certain limit at infinity?"; and "What is the probability of the set of functions that are differentiable everywhere?" This is a fruitful way to consider stochastic processes, but it requires sophisticated mathematical tools and careful analysis.

Another way to look at stochastic processes is to ask what happens at special times. For example, consider the time it takes until the function takes on one of two certain values, say a and b. Then ask "What is the probability that the stochastic process assumes the value a before it assumes the value b?" Note that the time at which each function assumes the value a is different; it is a random time. This provides an interaction between the time variable and the sample point through the values of the function. This too is a fruitful way to think about stochastic processes.
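As a small illustration of such a question about a random time, the Monte Carlo sketch below estimates, for the symmetric simple random walk started at 0, the probability of reaching a = 3 before b = -5. The endpoint values and trial count are assumptions chosen for demonstration; for this symmetric walk the exact answer is (0 - b)/(a - b) = 5/8.

    import random

    def hits_a_before_b(a, b, seed=None):
        """Run one symmetric simple random walk from 0 until it reaches a or b;
        return True if a is reached first (assumes a > 0 > b)."""
        rng = random.Random(seed)
        position = 0
        while b < position < a:
            position += 1 if rng.random() < 0.5 else -1
        return position == a

    trials = 20_000
    hits = sum(hits_a_before_b(3, -5, seed=1000 + i) for i in range(trials))
    print("estimated P[hit 3 before -5]:", hits / trials)   # exact value is 5/8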

[Figure 1 is a tree diagram; its node labels are: Markov Processes; Discrete Time; Continuous Time; Finite State Space; Infinite Discrete State Space; Continuous State Space; Markov Chain; Classic Random Walk on Integer Lattices; Gaussian Processes; Poisson Processes; Markov Decision Processes; Hidden Markov Models; Wiener Process (Brownian Motion); White Noise; Compound Poisson Processes; Geometric Brownian Motion; Diffusions; Birth and Death Processes; Queueing Processes; Ornstein-Uhlenbeck Process.]

Figure 1: A tree of some stochastic processes, from most general at the top to more specific at the end leaves. Stochastic processes studied in this text are red.

In this text, we will consider each of these approaches, with the corresponding questions.

Sources

The material in this section is adapted from many texts on probability theory and stochastic processes, especially the classic texts by S. Karlin and H. Taylor, S. Ross, and W. Feller.

Problems to Work for Understanding

Reading Suggestion:

References

[1] William Feller. An Introduction to Probability Theory and Its Applications, Volume I. John Wiley and Sons, third edition, 1973. QA 273 F3712.

[2] Sheldon Ross. A First Course in Probability. Macmillan, 1976.

[3] Sheldon M. Ross. Introduction to Probability Models. Academic Press, 9th edition, 2006.

[4] H. M. Taylor and Samuel Karlin. An Introduction to Stochastic Modeling. Academic Press, third edition, 1998.

Outside Readings and Links:

1. Origlio, Vincenzo. "Stochastic." From MathWorld, A Wolfram Web Resource, created by Eric W. Weisstein.

2. Weisstein, Eric W. "Stochastic Process." From MathWorld, A Wolfram Web Resource.

3. Weisstein, Eric W. "Markov Chain." From MathWorld, A Wolfram Web Resource.

4. Weisstein, Eric W. "Markov Process." From MathWorld, A Wolfram Web Resource.

I check all the information on each page for correctness and typographical errors. Nevertheless, some errors may occur and I would be grateful if you would alert me to such errors. I make every reasonable effort to present current and accurate information for public use; however, I do not guarantee the accuracy or timeliness of information on this website. Your use of the information from this website is strictly voluntary and at your own risk.

I have checked the links to external sites for usefulness. Links to external websites are provided as a convenience. I do not endorse, control, monitor, or guarantee the information contained in any external website. I don't guarantee that the links are active at all times. Use the links here with the same caution as you would all information on the Internet. This website reflects the thoughts, interests and opinions of its author. They do not explicitly represent official positions or policies of my employer. Information on this website is subject to change without notice.

Steve Dunbar's Home Page, http://www.math.unl.edu/~sdunbar1
Email to Steve Dunbar, sdunbar1 at unl dot edu
Last modified: Processed from LaTeX source on July 15, 2016