Martingales Stopping Time Processes


IOSR Journal of Mathematics (IOSR-JM), e-ISSN: 2278-5728, p-ISSN: 2319-765X, Volume 11, Issue 1 Ver. II (Jan.-Feb. 2015), PP 59-64, www.iosrjournals.org

I. Fulatan
Department of Mathematics, Ahmadu Bello University, Zaria, Nigeria

Abstract: We begin with some preliminaries on measure-theoretic probability theory, which in turn allows us to discuss the definition and basic properties of martingales. We then discuss the idea of stopping times and stopping processes. We state auxiliary results and prove a theorem on a stopping process using the càdlàg property of the process.

Keywords: Càdlàg, martingale, stopping process, stopping time

I. Introduction
We shall provide an explanation of the theory of martingales and the formal definition. We also define a stopping time, give some simple explanation of the classical (discrete) time and other related stopping times. The concepts of stochastic process and the càdlàg property will be introduced and finally used to prove a theorem on a stopping process.

In modern probability theory the model for a random experiment is called a probability space, and this consists of the triple (Ω, F, P), where Ω is a set called the sample space, F is a σ-algebra of subsets of Ω, and P is a probability measure on (Ω, F), meaning that every set in F is measurable and P(Ω) = 1. The measure P gives the probability that an outcome occurs. In discrete probability, one is usually interested in random variables, which are real-valued functions on the sample space. A random variable is then a function X : Ω → R which is measurable with respect to F. The expected value of a random variable is its integral with respect to the measure P, i.e.

E[X] = ∫_Ω X(ω) dP(ω),

and we say that a random variable is integrable if E[|X|] < ∞. In the case of the conditional expectation: given a sub-σ-algebra A of F, the conditional expectation E[X | A] is the expectation of X given that the events contained in A have occurred. A filtration on (Ω, F, P) is an increasing sequence of sub-σ-algebras of F such that F_1 ⊆ F_2 ⊆ F_3 ⊆ ... ⊆ F.

1.1 Martingale
It is already known that a martingale is (informally) a random process X_0, X_1, ... which models a player's fortune in a fair game. That is, his expected fortune at time t + 1 given the history up to this point is equal to his current fortune:

E[X_{t+1} | X_1, ..., X_t] = X_t.   (1.1.1)

This in turn implies, for all t,

E[X_{t+1}] = E[ E[X_{t+1} | X_1, ..., X_t] ] = E[X_t] = ... = E[X_0].   (1.1.2)

So the player's expected fortune at any time is equal to his starting expected fortune. Basically, the requirements for a martingale are a probability space (Ω, F, P), a sequence of σ-algebras F_0 ⊆ F_1 ⊆ ... ⊆ F_n ⊆ F and a sequence of random variables X_0, X_1, ..., X_n, that is, a stochastic process. Filtrations are important because they provide a concise way of finding a martingale, since the conditions for a martingale are that:
(a) each X_t is F_t-measurable (i.e. if we know the information in F_t, then we know the value of X_t); this is to say that X is adapted to the filtration {F_t};
(b) for each t, E[|X_t|] < ∞;
(c) E[X_{t+1} | F_t] = X_t, a.s.
This means that martingales tend to go neither up nor down, [3]. Martingales are an allegory of life [5]. Supermartingales tend to go down, E[X_{t+1} | F_t] ≤ X_t, and submartingales tend to go up, E[X_{t+1} | F_t] ≥ X_t. See also [6] and [7].
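As an illustration of conditions (a)-(c) and of property (1.1.2), the following minimal Python sketch (ours, not part of the original paper; the helper name simulate_walk is an assumption) simulates the standard example of a martingale, a symmetric ±1 random walk, and checks by Monte Carlo that the mean fortune stays at the starting value E[X_0] = 0.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_walk(n_steps, n_paths, rng):
    """Simulate n_paths independent symmetric +/-1 random walks with X_0 = 0."""
    steps = rng.choice([-1, 1], size=(n_paths, n_steps))
    return np.hstack([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)])

paths = simulate_walk(n_steps=50, n_paths=100_000, rng=rng)

# Property (1.1.2): E[X_t] should equal E[X_0] = 0 at every t (up to Monte Carlo error).
print(paths.mean(axis=0)[[0, 10, 25, 50]])
```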

To find when to stop a martingale at a random time, one needs a rule for stopping a random process which does not depend on the future. Returning to (1.1.2), one asks whether the game remains fair when stopped at a randomly chosen time. Specifically, if τ is a random stopping time and X_τ denotes the game stopped at this time, do we have E[X_τ] = E[X_0] as well? In general the answer is no, as pointed out by [1]. The duo envisaged a situation where the player is allowed to go into debt by any amount and to play for an arbitrarily long time; in such a situation, the player may inevitably come out ahead. There are conditions which will guarantee fairness, and Doyle and Snell give them in the following theorem.

1.2 Theorem [Martingale Stopping Theorem]
A fair game that is stopped at a random time will remain fair to the end of the game if it is assumed that:
(a) there is a finite amount of money in the world;
(b) a player must stop if he wins all of this money or goes into debt by this amount.

A more formal statement of the above theorem, with proof, is provided in [2], and it is called Doob's Optional-Stopping Theorem. In the exposition, basically, the concepts of [3] and [4] are employed. The statement "there is a finite amount of money in the world" is encoded by assuming that the random variables X_t are uniformly bounded, which is condition (b) below. Similarly, the requirement that the player must stop after a finite amount of time is obtained by requiring that τ be almost surely bounded, which is condition (a) below. The third condition under which the theorem holds is essentially a limit on the size of a bet at any given time, and this is (c).

1.3 Theorem (Doob's Optional-Stopping Theorem)
Let (Ω, F, {F_t}, P) be a filtered probability space and X a martingale with respect to {F_t}. Let τ be a stopping time. Suppose that any one of the following conditions holds:
(a) there is a positive integer N such that τ(ω) ≤ N for all ω ∈ Ω;
(b) there is a positive integer K such that |X_t(ω)| ≤ K for all t and all ω ∈ Ω, and τ is a.s. finite;
(c) E[τ] < ∞, and there is a positive real number K such that |X_t(ω) − X_{t−1}(ω)| ≤ K for all t ≥ 1 and all ω ∈ Ω.
Then X_τ is integrable and E[X_τ] = E[X_0].

II. Stopping Time
Let (Ω, F, P) be a probability space and let {F_t}_{t=0}^{n} be a filtration. A stopping time is a map (random variable) τ : Ω → {0, 1, 2, ..., n} ∪ {∞} with the property that

{ω : τ(ω) ≤ t} ∈ F_t for all t = 0, 1, 2, ..., n.

We say that τ is almost surely finite if P(τ < ∞) = 1. Usually we take T = {0, 1, 2, ..., n} as the index set for the time variable t, and the σ-algebra F_t is the collection of events which are observable up to and including time t. The condition that τ is a stopping time means that the outcome of the event {τ ≤ t} is known at time t, [8]. No knowledge of the future is required, since such a rule would surely result in an unfair game. In the discrete-time situation, where T = {0, 1, 2, ..., n}, the condition {τ ≤ t} ∈ F_t means that at any time t ∈ T one knows, based on the information up to time t, whether one has been stopped already or not; this is equivalent to the requirement that {τ = t} ∈ F_t. This equivalence does not hold in the continuous-time case, in which T is an interval of the real numbers and hence uncountable, because σ-algebras are not in general closed under taking uncountable unions of events.
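To see Theorem 1.3 at work, the sketch below (ours; the helper name stopped_value is an assumption) uses the stopping time τ = first time a symmetric ±1 walk started at 0 hits −a or +b. The stopped values are bounded and τ is a.s. finite, so condition (b) applies, and the Monte Carlo average of X_τ comes out close to E[X_0] = 0.

```python
import numpy as np

rng = np.random.default_rng(1)

def stopped_value(a, b, rng, max_steps=1_000_000):
    """Run one +/-1 walk from 0 until it first hits -a or +b and return X_tau."""
    x = 0
    for _ in range(max_steps):
        x += rng.choice([-1, 1])
        if x == -a or x == b:        # the stopping rule looks only at the past
            return x
    return x                          # safety cap, essentially never reached

samples = [stopped_value(a=3, b=7, rng=rng) for _ in range(20_000)]
print(np.mean(samples))               # close to 0 = E[X_0], as Theorem 1.3(b) predicts
```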

2.1 Independent Stopping Time
Let {X_t : t ≥ 0} be a stochastic process and suppose that τ is any random time that is independent of the process. Then τ is a stopping time: here τ does not depend on {X_t : t ≥ 0} at all (past or future). One example: before one starts gambling, one decides that one will stop gambling after the 10th gamble, so that P(τ = 10) = 1. Another example: every day, after looking at the stock price, one throws a fair die, regardless of all else, and decides to sell the stock the first time the die shows an odd number. In this case τ is independent of the stock pricing.

2.2 Non-stopping Times (Last Exit Time)
Take the example of a rat in an open maze, in which the rat eventually reaches freedom (state 0) and never returns into the maze. Suppose that the rat starts off in cell 1, X_0 = 1. Let τ denote the last time the rat visits cell 1 before leaving the maze:

τ = max{t ≥ 0 : X_t = 1}.   (2.2.1)

We need to know the future to determine such a time. For instance, the event {τ = 0} tells us that in fact the rat never returned to cell 1:

{τ = 0} = {X_0 = 1, X_1 ≠ 1, X_2 ≠ 1, ...}.   (2.2.2)

So this depends on all of the future, not just X_0, and is therefore not a stopping time. In general, a last exit time (the last time a process hits a given state or set of states) is not a stopping time, for in order to know that the last visit has occurred, one must know the future. Possibly the most popular example of a stopping time is the hitting time of a stochastic process, that is, the first time at which a certain pre-specified set is hit by the considered process. The converse of this statement is true as well: for any stopping time, there exists an adapted stochastic process and a Borel-measurable set such that the corresponding hitting time is exactly this stopping time. This statement and other constructions of stopping times can also be found in [9], [10], [11], [12] and [13].

III. Stochastic Processes
By a stochastic process we mean a collection or family of random variables {X_t : t ∈ T} indexed by time, T = [0, ∞). Since the index t represents time, we think of X_t as the state or the position of the process at time t. T is called the parameter set, and the set of values taken by X_t is the state space of the process. If T is countable, the process is said to be a discrete-parameter process; if T is not countable, the process is said to be a continuous-parameter process.

A random variable τ is a stopping time for a stochastic process {X_t} if it is a stopping time for the natural filtration of X, that is, F_t = σ(X_s : s ≤ t). The first time that an adapted process {X_t} hits a given value or set of values is a stopping time. The inclusion of ∞ in the range of τ is to cover all cases where X_t never hits the given values; that is, τ = ∞ if the random event never happens. In other words, let {X_t, t ≥ 0} be a stochastic process. A stopping time with respect to {X_t, t ≥ 0} is a random time τ such that for each t ≥ 0 the event {τ = t} is completely determined by (at most) the total information known up to time t, {X_0, X_1, ..., X_t}. In the context of gambling, in which X_t denotes our total earnings after the gamble at time t, a stopping time τ is thus a rule that tells us at what time to stop gambling. Our decision to stop after a given gamble can only depend (at most) on the information known at that time (not on future information). This is to say that a gambler stops the betting process when he reaches a certain goal. The time it takes to reach this goal is generally not deterministic; rather, it is a random variable depending on the current result of the bet as well as the combined information from all previous bets. If we let τ be a stopping time and X be a random process, then for any positive integer t and ω ∈ Ω we define

(τ ∧ t)(ω) = min{τ(ω), t}.
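The contrast between a hitting time and the last exit time of Section 2.2 can be made concrete on a simulated path. In the sketch below (ours; the function names are assumptions), the first hitting time is decided by scanning the path forward using only the values seen so far, whereas the last exit time can only be evaluated once the whole path is known.

```python
def first_hitting_time(path, target_set):
    """Stopping time: stop at the first visit to target_set, using only X_0, ..., X_t."""
    for t, x in enumerate(path):
        if x in target_set:
            return t
    return float("inf")                   # the set is never hit

def last_exit_time(path, state):
    """Not a stopping time: the whole path is needed to know which visit is the last."""
    visits = [t for t, x in enumerate(path) if x == state]
    return visits[-1] if visits else float("inf")

path = [1, 2, 1, 3, 1, 0, 0, 0]           # toy maze trajectory, absorbed at state 0
print(first_hitting_time(path, {0}))      # 5
print(last_exit_time(path, 1))            # 4, only knowable after the whole path is seen
```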

3.1 Stopped Process
Given a stochastic process {X_t} which is adapted to the filtration {F_t}, let τ be a stopping time with respect to {F_t}. Define the random variable

X_τ(ω) = X_{τ(ω)}(ω) if τ(ω) < ∞, and 0 otherwise,

or equivalently, X_τ = X_τ 1_{{τ < ∞}}. The process X^τ = {X_{τ ∧ t} : t ∈ T} is called a stopped process; it coincides with X_t for t ≤ τ and equals X_τ for t ≥ τ.

Following [14], one can intuitively choose a stochastic process which assumes the value 1 until just before the stopping time is reached, from which point on it is 0. The 1-0-process therefore first hits the Borel set {0} at the stopping time. In this way a 1-1 relation is established (Theorem 3.5) between stopping times and adapted càdlàg (see 3.2) processes that are non-increasing and take only the values 0 and 1.

A 1-0-process as described above is akin to a random traffic light which can go through three possible scenarios over time (with green standing for 1 and red standing for 0):
(i) stays red forever (stopped immediately);
(ii) green at the beginning, then turns red and stays red for the rest of the time (stopped at some stage);
(iii) stays green forever (never stopped).
By the adaptedness of the 1-0-process, it can only change based on information up to the corresponding point in time. This intuitive interpretation of a stopping time as the time at which such a traffic light turns red seems easier to understand than the concept of a random time which is known once it has been reached.

3.2 Definition
Càdlàg (a French acronym standing for "continue à droite, limitée à gauche") means right-continuous with left-hand limits (RCLL). A real-valued stochastic process X is called càdlàg if for all ω ∈ Ω the paths t ↦ X_t(ω), t ∈ T, have the RCLL property.

3.3 Definition
An adapted càdlàg process X on (Ω, F, {F_t}_{t∈T}, P) is a stopping process if

X_t(ω) ∈ {0, 1} for all t ∈ T and ω ∈ Ω,   (3.3.1)

X_s(ω) ≥ X_t(ω) for all s ≤ t, s, t ∈ T.   (3.3.2)

For a finite or infinite discrete time axis T = {t_j : j = 0, 1, 2, ...} with t_j < t_{j+1}, an adapted process satisfying (3.3.1) and (3.3.2) is automatically càdlàg.

3.4 Definition
For a stopping process X on (Ω, F, {F_t}, P) we define

τ_X(ω) = min{t ∈ T : X_t(ω) = 0} if such a t exists, and τ_X(ω) = ∞ otherwise.   (3.4.1)

The minimum in the first case exists because of the càdlàg property of each path. By definition,

{τ_X > t} = {X_t = 1},   (3.4.2)

which implies that {τ_X ≤ t} = {X_t = 1}^c, and, by the adaptedness of X, that

{τ_X ≤ t} ∈ F_t.   (3.4.3)

Therefore τ_X is a stopping time for the stopping process X; τ_X is the first time of hitting the Lebesgue-measurable set {0}.
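The stopped process of Section 3.1 is straightforward to compute on a realized path. The sketch below (ours; stop_process is an assumed name, and τ is taken to be finite on the example path) returns the path t ↦ X_{τ ∧ t}: it follows X up to time τ and then holds the value X_τ.

```python
def stop_process(path, tau):
    """Return the stopped path (X_{tau ∧ t})_t for a finite stopping value tau."""
    return [path[min(tau, t)] for t in range(len(path))]

path = [0, 1, 2, 1, 2, 3, 4, 3]
tau = 4                                   # a stopping time realized at t = 4 on this path
print(stop_process(path, tau))            # [0, 1, 2, 1, 2, 2, 2, 2]
```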

3.5 Theorem [14]
The map

f : X ↦ τ_X   (3.5.1)

is a bijection between the stopping processes and the stopping times on (Ω, F, {F_t}, P), such that

f^{-1} : τ ↦ X^τ,   (3.5.2)

where X^τ is the process

X_t^τ(ω) = 1_{{τ(ω) > t}}, t ∈ T.   (3.5.3)

Hence f maps stopping processes to stopping times and f^{-1} maps stopping times to stopping processes.

Proof
By (3.4.3), f maps stopping processes to stopping times. Under f, different stopping processes lead to different stopping times, as is seen from (3.4.1); f is therefore an injection. For any stopping time τ, we have from (3.4.1) and (3.4.2) that

τ_{X^τ}(ω) = min{t ∈ T : 1_{{τ(ω) > t}} = 0} if such a t exists, and ∞ otherwise.   (3.5.4)

Since 1_{{τ(ω) > t}} = 0 exactly when τ(ω) ≤ t, the right-hand side of (3.5.4) is τ(ω). This means that τ_{X^τ}(ω) = τ(ω), thus f is surjective, and this proves (3.5.1). Now, from (3.5.1) and (3.5.3), we have X_t^{τ_X} = 1_{{τ_X > t}} = X_t, and this proves (3.5.2).

3.6 Theorem
For a stopping time τ, the stochastic process X^τ = (X_t^τ)_{t∈T} defined by (3.5.3) is a stopping process.

Proof
By definition,

X_t^τ(ω) = 1_{{τ(ω) > t}} ∈ {0, 1}.   (3.6.1)

By the adaptedness of τ we have {τ > t} = {τ ≤ t}^c ∈ F_t, which implies that X^τ is an adapted process. We also have from the definition that X_s^τ ≥ X_t^τ for s ≤ t, since {τ > t} ⊆ {τ > s}. Finally, for each ω the path equals 1 on [0, τ(ω)) and 0 from τ(ω) onwards, so lim_{s↓t} X_s^τ(ω) = X_t^τ(ω) and the left-hand limits exist; hence X^τ is càdlàg and is therefore a stopping process.
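In discrete time the correspondence in Theorems 3.5 and 3.6 can be checked mechanically. The sketch below (ours; it assumes a finite time axis T = {0, ..., horizon} and represents "never stopped" by math.inf) builds the 1-0 process X_t = 1_{τ > t} from a stopping time and recovers τ back as τ_X = min{t : X_t = 0}, so the round trip f ∘ f^{-1} is the identity on stopping times.

```python
import math

def process_from_time(tau, horizon):
    """f^{-1}: the 1-0 stopping process X_t = 1 if tau > t else 0 (equation (3.5.3))."""
    return [1 if tau > t else 0 for t in range(horizon + 1)]

def time_from_process(X):
    """f: tau_X = min{t : X_t = 0}, or infinity if the path never hits 0 (Definition 3.4)."""
    for t, x in enumerate(X):
        if x == 0:
            return t
    return math.inf

horizon = 10
for tau in [0, 4, math.inf]:               # stopped immediately, stopped at t = 4, never stopped
    X = process_from_time(tau, horizon)
    print(X, time_from_process(X) == tau)  # the round trip returns the original tau
```

On a finite window, a stopping time beyond the horizon is of course indistinguishable from "never stopped"; the bijection proper lives on the full time axis T.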

IV. Conclusion
We have, in a nutshell, proved a theorem whose process is càdlàg. One can conclude that many processes with such a property can be made into a stopping process by a stopping time. Thus, it is important to notice that, using this construction, the result can be extended to continuous processes, perhaps by making them càdlàg.

References
[1]. Doyle, P. G. and Snell, J. L. (1984). Random Walks and Electric Networks. Carus Mathematical Monographs, Mathematical Association of America, Washington DC.
[2]. Lalonde, M. S. (201?). The Martingale Stopping Theorem. Lecture notes, http://math.dartmouth.edu (math100w13).
[3]. Williams, D. (1991). Probability with Martingales. Cambridge University Press (Cambridge Mathematical Textbooks).
[4]. Doob, J. L. (1971). What is a Martingale? American Mathematical Monthly 78 (5), 451-463.
[5]. Gut, A. (2004). Probability: A Graduate Course. Springer Texts in Statistics. Springer-Verlag.
[6]. Gusak, D., Kukush, A. and Mishura, Y. (2010). Theory of Stochastic Processes. Springer-Verlag.
[7]. Stroock, D. W. (1993). Probability Theory, an Analytic View. Cambridge University Press.
[8]. Kruglov, V. M. (2011). Stopping Times. Moscow University Computational Mathematics and Cybernetics 35 (2), 55-62.
[9]. Lutz, P. (2012). Early Stopping - But When? Lecture Notes in Computer Science 7700, 53-67.
[10]. Lempa, J. (2012). Optimal Stopping with Information Constraint. Applied Mathematics and Optimization 66 (2), 147-173.
[11]. Persi, D. and Laurent, M. (2009). On Times to Quasi-Stationarity for Birth and Death Processes. Journal of Theoretical Probability 22 (3), 558-586.
[12]. Neil, F. (1980). Construction of Stopping Times T such that F_T mod P. Lecture Notes in Mathematics 794, 412-423.
[13]. Dudley, R. M. and Gutmann, S. (1977). Stopping Times with Given Laws. Lecture Notes in Mathematics 581, 51-58.
[14]. Fischer, T. (2012). On Simple Representations of Stopping Times and Stopping Time σ-Algebras. Lecture notes on stochastic processes, Institute of Mathematics, University of Wuerzburg, Germany.