An Introduction to Stochastic Calculus


1 An Introduction to Stochastic Calculus

Haijun Li
Department of Mathematics
Washington State University
Weeks 3-4

2 Outline

1. Conditional Expectation: A Motivating Example; σ-fields
2. The General Conditional Expectation: The Conditional Expectation Given Known Information; Rules for Calculation of Conditional Expectations; The Projection Property of Conditional Expectations

3 Conditioning Based on Available Information

Let X be a random variable defined on a probability space (Ω, F, P), and let B ⊆ Ω with P(B) > 0. The conditional probability of A given B is defined as

P(A | B) := P(A ∩ B) / P(B).

That is, P(· | B) : F → [0, 1] is a probability measure given the information that the event B has occurred. The conditional distribution function of X given B is defined as

F(x | B) := P(X ≤ x | B) = P({X ≤ x} ∩ B) / P(B).

4 Conditional Average

Let F(x) = P(X ≤ x) denote the distribution function of X. The conditional expectation of X given B is the average with respect to the conditional distribution F(x | B). That is,

E(X | B) := ∫ x dF(x | B) = (∫_B x dF(x)) / P(B) = E(X I_B) / P(B),

where I_B : Ω → {0, 1} is the indicator function of the event B:

I_B(ω) = 1 if ω ∈ B, and I_B(ω) = 0 if ω ∉ B.

E(X | B) can be viewed as our estimate of X given the information that the event B has occurred. E(X | B^c) is defined similarly. Together, E(X | B) and E(X | B^c) provide our estimate of X depending on whether or not B occurs.
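The formula E(X | B) = E(X I_B) / P(B) is easy to check by Monte Carlo: dividing the sample average of X I_B by the sample frequency of B is the same as averaging X over only those outcomes where B occurred. A minimal sketch (the event B = {X ≤ 1/2} and the sample size are illustrative choices, not from the slides); for X uniform on (0, 1] the exact answer is E(X | X ≤ 1/2) = 1/4:

```python
import random

def cond_expectation(sample, event):
    """Estimate E(X | B) = E(X I_B) / P(B) from a sample:
    average X over the outcomes where B occurred."""
    hits = [x for x in sample if event(x)]
    return sum(hits) / len(hits)

random.seed(0)
xs = [random.random() for _ in range(200_000)]   # X uniform on (0, 1]
# B = {X <= 1/2}: conditionally, X is uniform on (0, 1/2], so E(X | B) = 1/4
est = cond_expectation(xs, lambda x: x <= 0.5)
```

With 200,000 draws the estimate lands within about 0.002 of 1/4.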

5 Conditional Expectation Under Discrete Conditioning

Think of I_B as a random variable that carries the information on whether the event B occurs. The conditional expectation E(X | I_B) of X given I_B is the random variable defined as

E(X | I_B)(ω) := E(X | B) if ω ∈ B, and E(X | I_B)(ω) := E(X | B^c) if ω ∉ B.

The random variable E(X | I_B) is our estimate of X based on the information provided by I_B.

Consider a discrete random variable Y on Ω taking distinct values y_i, i = 1, 2, .... Let A_i = {ω ∈ Ω : Y(ω) = y_i}. Note that Y carries the information on whether or not the events A_i occur. Define the conditional expectation of X given Y by

E(X | Y)(ω) := E(X | A_i) = E(X | Y = y_i), if ω ∈ A_i, i = 1, 2, ....

The random variable E(X | Y) can be viewed as our estimate of X based on the information carried by Y.

6 Example: Uniform Random Variable

Consider the random variable X(ω) = ω on Ω = (0, 1], with density function f_X(x) = 1 for all x ∈ (0, 1]. Assume that one of the events

A_i = ((i - 1)/n, i/n], i = 1, ..., n,

occurred. Then

E(X | A_i) = (1 / P(A_i)) ∫_{A_i} x f_X(x) dx = (2i - 1) / (2n)

(i.e., the center of A_i). The value E(X | A_i) is the updated expectation on the new space A_i, given the information that A_i occurred.

7 Estimating Uniform Random Variable

Figure: Left: a uniform random variable X on (0, 1] (dotted line) and its expectation (solid line). Right: the random variable X (dotted line) and the conditional expectations E(X | A_i) (solid lines), where A_i = ((i - 1)/5, i/5], i = 1, ..., 5. These conditional expectations can be interpreted as the values of a discrete random variable E(X | Y) with distinct constant values on the sets A_i.

8 Estimating Uniform Random Variable

Consider again A_i = ((i - 1)/n, i/n], i = 1, ..., n, so that E(X | A_i) = (2i - 1)/(2n) (i.e., the center of A_i). Define

Y(ω) := Σ_{i=1}^n ((i - 1)/n) I_{A_i}(ω).

The conditional expectation is

E(X | Y)(ω) = (2i - 1)/(2n), if ω ∈ A_i, i = 1, ..., n.

Since E(X | Y)(ω) is the average of X given the information that ω ∈ ((i - 1)/n, i/n], E(X | Y) is a coarser version of X, that is, an approximation to X, given the information which of the A_i occurred. E(X | Y) ≈ X when n is sufficiently large.
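The claim that E(X | Y) approximates X as the partition refines can be seen numerically. A sketch under the slide's setup (the partition sizes n = 5, 50, 500 are illustrative choices): the mean square error E[X - E(X | Y)]^2 equals the variance of a uniform variable on an interval of length 1/n, namely 1/(12 n^2), so it shrinks rapidly as n grows.

```python
import random

def coarse(x, n):
    """E(X | Y)(omega) for X uniform on (0, 1] and the partition
    A_i = ((i-1)/n, i/n]: the center (2i - 1)/(2n) of the cell containing x."""
    i = min(int(x * n) + 1, n)
    return (2 * i - 1) / (2 * n)

random.seed(1)
xs = [random.random() for _ in range(100_000)]
# Mean square error E(X - E(X|Y))^2 for increasingly fine partitions
mse = {n: sum((x - coarse(x, n)) ** 2 for x in xs) / len(xs) for n in (5, 50, 500)}
```

For n = 5 the error is about 1/300 ≈ 0.0033, and each tenfold refinement divides it by 100.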

9 Properties of Conditional Expectation

The conditional expectation is linear: for random variables X_1, X_2 and constants c_1, c_2,

E(c_1 X_1 + c_2 X_2 | Y) = c_1 E(X_1 | Y) + c_2 E(X_2 | Y).

The expectation is preserved: EX = E[E(X | Y)].

If X and Y are independent, then E(X | Y) = EX.

The random variable E(X | Y) is a (measurable) function of Y: E(X | Y) = g(Y), where

g(y) = Σ_{i=1}^∞ E(X | Y = y_i) I_{{y_i}}(y).
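When E(X | Y) is computed by cell averages, the identity EX = E[E(X | Y)] holds at the sample level (up to rounding), not just in the limit. A sketch with an illustrative toy model of my choosing, not from the slides (Y uniform on {0, 1, 2}, X = Y plus independent uniform noise, so EX = 1.5):

```python
import random

random.seed(2)
ys = [random.choice([0, 1, 2]) for _ in range(100_000)]
xs = [y + random.random() for y in ys]          # X = Y + noise

# E(X | Y = y): average of X over the event {Y = y}
cond = {y0: sum(x for y, x in zip(ys, xs) if y == y0) / ys.count(y0)
        for y0 in (0, 1, 2)}

ex = sum(xs) / len(xs)                          # EX
e_cond = sum(cond[y] for y in ys) / len(ys)     # E[E(X | Y)]
```

The two averages agree because summing the cell average over each cell reproduces the total sum, which is exactly the argument E(X I_A) = E(Z I_A) on the later slide with A = Ω.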

10 σ-fields

Observe that the values of Y did not really matter for the definition of E(X | Y) under discrete conditioning; what was crucial is that the conditioning events A_i describe the information carried by the distinct values of Y. That is, we estimate the random variable X via E(X | Y) based on the information provided by the observable events A_i and their composite events, such as A_i ∪ A_j, A_i ∩ A_j, ....

Definition of σ-fields. A σ-field F on Ω is a collection of subsets (observable events) of Ω satisfying the following conditions:

∅ ∈ F and Ω ∈ F.
If A ∈ F, then A^c ∈ F.
If A_1, A_2, ... ∈ F, then ∪_{i=1}^∞ A_i ∈ F and ∩_{i=1}^∞ A_i ∈ F.

11 Generated σ-fields

For any collection C of events, let σ(C) denote the smallest σ-field containing C, obtained by adding all possible unions, intersections and complements. σ(C) is said to be generated by C. Some examples:

F = {∅, Ω} = σ({∅}).
F = {∅, Ω, B, B^c} = σ({B}).
F = {A : A ⊆ Ω} = σ({A : A ⊆ Ω}), the collection of all subsets of Ω.
Let C = {(a, b] : -∞ < a < b < ∞}; then any set in B^1 = σ(C) is called a Borel subset of R.
Let C = {(a_1, b_1] × ... × (a_d, b_d] : -∞ < a_i < b_i < ∞, i = 1, ..., d}; then any set in B^d = σ(C) is called a Borel subset of R^d.

12 σ-fields Generated by Random Variables

Let Y be a discrete random variable taking distinct values y_i, i = 1, 2, .... Define A_i = {ω : Y(ω) = y_i}, i = 1, 2, .... A typical set in the σ-field σ({A_i}) is of the form

A = ∪_{i∈I} A_i, I ⊆ {1, 2, ...}.

σ({A_i}) is called the σ-field generated by Y, denoted σ(Y).

Let Y be a d-dimensional random vector and

A(a, b] = {ω : Y(ω) ∈ (a, b]}, -∞ < a_i < b_i < ∞, i = 1, ..., d.

The σ-field σ({A(a, b] : a, b ∈ R^d}) is called the σ-field generated by Y, denoted σ(Y). σ(Y) provides the essential information about the structure of Y, and contains all the observable events {ω : Y(ω) ∈ C}, where C is a Borel subset of R^d.

13 σ-fields Generated by Stochastic Processes

For a stochastic process Y = (Y_t, t ∈ T) and any (measurable) set C of functions on T, let

A(C) = {ω : the sample path (Y_t(ω), t ∈ T) belongs to C}.

The σ-field generated by the process Y is the smallest σ-field that contains all events of the form A(C).

Example: For Brownian motion B = (B_t, t ≥ 0), let F_t := σ({B_s, s ≤ t}) denote the σ-field generated by Brownian motion prior to time t. F_t contains the essential information about the structure of the process B on [0, t]. One can show that this σ-field is generated by all sets of the form

A_{t_1,...,t_n}(C) = {ω : (B_{t_1}(ω), ..., B_{t_n}(ω)) ∈ C}, 0 ≤ t_1 ≤ t_2 ≤ ... ≤ t_n ≤ t,

for all n-dimensional Borel sets C.

14 Information Represented by σ-fields

For a random variable, a random vector or a stochastic process Y on Ω, the σ-field σ(Y) generated by Y contains the essential information about the structure of Y as a function of ω ∈ Ω. It consists of all subsets {ω : Y(ω) ∈ C} for suitable sets C. Because Y generates a σ-field, we also say that Y contains the information represented by σ(Y), or that Y carries the information σ(Y).

For any measurable function f acting on Y, since

{ω : f(Y(ω)) ∈ C} = {ω : Y(ω) ∈ f^{-1}(C)} for every measurable set C,

we have σ(f(Y)) ⊆ σ(Y). That is, a function f acting on Y does not provide new information about the structure of Y.

Example: For Brownian motion B = (B_t, t ≥ 0), consider the function f(B) = sup_{0≤t≤1} B_t. Then σ(f(B)) ⊆ σ({B_s, s ≤ t}) for any t ≥ 1.

15 The General Conditional Expectation

Let (Ω, F, P) be a probability space, and let Y, Y_1 and Y_2 denote random variables (or random vectors, or stochastic processes) defined on Ω.

The information of Y is contained in F (Y does not contain more information than F): σ(Y) ⊆ F.
Y_1 contains more information than Y_2: σ(Y_2) ⊆ σ(Y_1).

Conditional Expectation Given a σ-field. Let X be a random variable defined on Ω. The conditional expectation of X given F is a random variable, denoted E(X | F), with the following properties:

E(X | F) does not contain more information than that contained in F: σ(E(X | F)) ⊆ F.
For any event A ∈ F, E(X I_A) = E(E(X | F) I_A).

By virtue of the Radon-Nikodym theorem, one can show the existence and almost sure (a.s.) uniqueness of E(X | F).

16 Conditional Expectation Given Generated Information

Let Y be a random variable (random vector or stochastic process) on Ω. The conditional expectation of X given Y, denoted E(X | Y), is defined as E(X | Y) := E(X | σ(Y)).

The random variables X and E(X | F) are close to each other, not in the sense that they coincide for every ω, but in the sense that the averages (expectations) of X and E(X | F) over suitable sets A agree. The conditional expectation E(X | F) is a coarser version of the original random variable X and is our estimate of X given the information F.

Example: Let Y be a discrete random variable taking distinct values y_i, i = 1, 2, .... Any set A ∈ σ(Y) can be written as

A = ∪_{i∈I} A_i = ∪_{i∈I} {ω : Y(ω) = y_i}, for some I ⊆ {1, 2, ...}.

Let Z := E(X | Y). Then σ(Z) ⊆ σ(Y) and Z(ω) = E(X | A_i) for ω ∈ A_i. Observe that

E(X I_A) = E(X Σ_{i∈I} I_{A_i}) = Σ_{i∈I} E(X I_{A_i}) = Σ_{i∈I} E(X | A_i) P(A_i) = E(Z I_A).

17 Remarks

Classical Conditional Expectation: Let B be an event with P(B) > 0 and P(B^c) > 0. Define F_B := σ({B}) = {∅, Ω, B, B^c}. Then E(X | F_B)(ω) = E(X | B) for ω ∈ B.

Classical Conditional Probability: If X = I_A, then

E(I_A | F_B)(ω) = E(I_A | B) = P(A ∩ B) / P(B) = P(A | B), for ω ∈ B.

The idea of defining the conditional expectation as averages of a random variable given a σ-field (information set) goes back to Andrey Kolmogorov (1933, Foundations of the Theory of Probability).

18 Rules for Calculation of Conditional Expectations

Let X, X_1, X_2 denote random variables defined on (Ω, F, P).

1. For any two constants c_1, c_2, E(c_1 X_1 + c_2 X_2 | F) = c_1 E(X_1 | F) + c_2 E(X_2 | F).
2. EX = E[E(X | F)].
3. If X and F are independent, then E(X | F) = EX. In particular, if X and Y are independent, then E(X | Y) = EX.
4. If σ(X) ⊆ F, then E(X | F) = X. In particular, if X is a function of Y, then σ(X) ⊆ σ(Y) and E(X | Y) = X.
5. If σ(X) ⊆ F, then E(X X_1 | F) = X E(X_1 | F). In particular, if X is a function of Y, then σ(X) ⊆ σ(Y) and E(X X_1 | Y) = X E(X_1 | Y).
6. If F' and F are two σ-fields with F' ⊆ F, then E(X | F') = E[E(X | F) | F'] and E(X | F') = E[E(X | F') | F].

The more information (F) we know, the finer the estimate (E(X | F) vs. E(X | F')) we can provide for X.
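Rule 5 ("taking out what is known") can be verified exactly on a sample, because a function of Y is constant on each event {Y = y_0}, so it factors out of the cell average. A sketch with illustrative choices of my own, not from the slides (Y a fair coin, X = 2Y + 1, and X_1 = Y plus Gaussian noise):

```python
import random

random.seed(3)
ys = [random.choice([0, 1]) for _ in range(50_000)]      # Y: fair coin
x1 = [y + random.gauss(0, 1) for y in ys]                # X1 depends on Y

def cond(vals, y0):
    """Sample version of E(V | Y = y0): average over the event {Y = y0}."""
    sel = [v for y, v in zip(ys, vals) if y == y0]
    return sum(sel) / len(sel)

# X = 2Y + 1 is a function of Y, so E(X X1 | Y = y0) = (2 y0 + 1) E(X1 | Y = y0)
gaps = [cond([(2 * y + 1) * v for y, v in zip(ys, x1)], y0) -
        (2 * y0 + 1) * cond(x1, y0) for y0 in (0, 1)]
```

Both sides agree up to floating-point rounding, with no Monte Carlo error at all, since the identity holds event by event.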

19 Examples

Example 1: If X and Y are independent, then E(XY | Y) = Y EX and E(X + Y | Y) = EX + Y.

Example 2: Consider Brownian motion B = (B_t, t ≥ 0). The σ-fields F_s = σ(B_x, x ≤ s) represent an increasing stream of information about the structure of the process. Find E(B_t | F_s) = E(B_t | B_x, x ≤ s) for s ≥ 0.

If s ≥ t, then F_s ⊇ F_t and thus E(B_t | F_s) = B_t.
If s < t, then E(B_t | F_s) = E(B_t - B_s | F_s) + E(B_s | F_s) = 0 + B_s = B_s, since the increment B_t - B_s is independent of F_s with mean 0.

Hence E(B_t | F_s) = B_{min(s,t)}.
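Example 2 can be checked against the defining property E(X I_A) = E(E(X | F) I_A): for s < t, E(B_t | F_s) = B_s means the residual B_t - B_s averages to zero over any event in F_s. A Monte Carlo sketch (the times s = 1, t = 2, the sample size, and the event {B_s > 0} are illustrative choices):

```python
import random, math

random.seed(4)
s, t, n = 1.0, 2.0, 200_000
bs = [random.gauss(0, math.sqrt(s)) for _ in range(n)]     # B_s ~ N(0, s)
bt = [b + random.gauss(0, math.sqrt(t - s)) for b in bs]   # independent increment

# E(B_t | F_s) = B_s  <=>  E[(B_t - B_s) I_A] = 0 for events A in F_s,
# here with A = {B_s > 0}
resid = sum(y - b for b, y in zip(bs, bt) if b > 0) / n
```

The residual is zero up to Monte Carlo noise of order 1/sqrt(n).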

20 Another Example: Squared Brownian Motion

Consider again Brownian motion B = (B_t, t ≥ 0), with the σ-fields F_s = σ(B_x, x ≤ s). Define X_t := B_t^2 - t, t ≥ 0.

If s ≥ t, then F_s ⊇ F_t and thus E(X_t | F_s) = X_t.
If s < t, observe that

X_t = [(B_t - B_s) + B_s]^2 - t = (B_t - B_s)^2 + B_s^2 + 2 B_s (B_t - B_s) - t.

Since (B_t - B_s) and (B_t - B_s)^2 are independent of F_s, we have

E[(B_t - B_s)^2 | F_s] = E(B_t - B_s)^2 = t - s,
E[B_s (B_t - B_s) | F_s] = B_s E(B_t - B_s) = 0.

Since σ(B_s^2) ⊆ σ(B_s) ⊆ F_s, we have E(B_s^2 | F_s) = B_s^2. Thus

E(X_t | F_s) = (t - s) + B_s^2 + 0 - t = X_s.

Hence E(X_t | F_s) = X_{min(s,t)}.
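The same numerical check works for X_t = B_t^2 - t: for s < t, E(X_t | F_s) = X_s means E[(X_t - X_s) I_A] = 0 for any event A in F_s. A sketch with illustrative choices (s = 1, t = 3, event {B_s > 0}):

```python
import random, math

random.seed(5)
s, t, n = 1.0, 3.0, 200_000
bs = [random.gauss(0, math.sqrt(s)) for _ in range(n)]     # B_s ~ N(0, s)
bt = [b + random.gauss(0, math.sqrt(t - s)) for b in bs]   # independent increment

# X_t = B_t^2 - t: E(X_t | F_s) = X_s  <=>  E[(X_t - X_s) I_A] = 0, A in F_s
drift = sum((y * y - t) - (b * b - s) for b, y in zip(bs, bt) if b > 0) / n
```

Per the decomposition on the slide, the two nonzero pieces (B_t - B_s)^2 and -(t - s) cancel in expectation, so the sample drift is zero up to Monte Carlo noise.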

21 The Projection Property of Conditional Expectations

We now formulate precisely the meaning of the statement that the conditional expectation E(X | F) can be understood as the optimal estimate of X given the information F. Define

L^2(F) := {Z : σ(Z) ⊆ F, EZ^2 < ∞}.

If F = σ(Y), then Z ∈ L^2(σ(Y)) implies that Z is a function of Y.

The Projection Property. Let X be a random variable with EX^2 < ∞. The conditional expectation E(X | F) is the random variable in L^2(F) which is closest to X in the mean square sense:

E[X - E(X | F)]^2 = min_{Z ∈ L^2(F)} E(X - Z)^2.

If F = σ(Y), then E(X | Y) is the function of Y which has a finite second moment and which is closest to X in the mean square sense.
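The projection property says E(X | Y) beats every other square-integrable function of Y in mean square error. For discrete Y this is just the fact that cell averages minimize the within-cell sum of squares. A sketch with an illustrative model of my own (Y uniform on {0, 1, 2}, X = Y + standard normal noise, and a deliberately poor competitor g ≡ 1):

```python
import random

random.seed(6)
ys = [random.choice([0, 1, 2]) for _ in range(50_000)]
xs = [y + random.gauss(0, 1) for y in ys]

def mse(g):
    """Mean square error E(X - g(Y))^2 for a function g given as a dict."""
    return sum((x - g[y]) ** 2 for y, x in zip(ys, xs)) / len(ys)

# Cell averages: the sample version of E(X | Y)
cond = {y0: sum(x for y, x in zip(ys, xs) if y == y0) / ys.count(y0)
        for y0 in (0, 1, 2)}
best = mse(cond)                  # ~ Var(noise) = 1, the irreducible error
worse = mse({0: 1, 1: 1, 2: 1})   # a competitor that ignores Y
```

Any other function of Y gives a strictly larger sample MSE, because the cell mean minimizes the sum of squared deviations within each event {Y = y_0}.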

22 The Best Prediction Based on Available Information

It follows from the projection property that the conditional expectation E(X | F) can be viewed as the best prediction of X given the information F. For example, for Brownian motion B = (B_t, t ≥ 0), we have, for s ≤ t,

E(B_t | B_x, x ≤ s) = B_s, and E(B_t^2 - t | B_x, x ≤ s) = B_s^2 - s.

That is, the best predictions of the future values B_t and B_t^2 - t, given the information about Brownian motion up to the present time s, are the present values B_s and B_s^2 - s, respectively. This property characterizes the whole class of martingales with a finite second moment.


More information

Lecture 21: Expectation of CRVs, Fatou s Lemma and DCT Integration of Continuous Random Variables

Lecture 21: Expectation of CRVs, Fatou s Lemma and DCT Integration of Continuous Random Variables EE50: Probability Foundations for Electrical Engineers July-November 205 Lecture 2: Expectation of CRVs, Fatou s Lemma and DCT Lecturer: Krishna Jagannathan Scribe: Jainam Doshi In the present lecture,

More information

Solutions Homework 6

Solutions Homework 6 1 Solutions Homework 6 October 26, 2015 Solution to Exercise 1.5.9: Part (a) is easy: we know E[X Y ] = k if X = k, a.s. The function φ(y) k is Borel measurable from the range space of Y to, and E[X Y

More information

Probability Theory II. Spring 2016 Peter Orbanz

Probability Theory II. Spring 2016 Peter Orbanz Probability Theory II Spring 2016 Peter Orbanz Contents Chapter 1. Martingales 1 1.1. Martingales indexed by partially ordered sets 1 1.2. Martingales from adapted processes 4 1.3. Stopping times and

More information

4th Preparation Sheet - Solutions

4th Preparation Sheet - Solutions Prof. Dr. Rainer Dahlhaus Probability Theory Summer term 017 4th Preparation Sheet - Solutions Remark: Throughout the exercise sheet we use the two equivalent definitions of separability of a metric space

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 3 9/10/2008 CONDITIONING AND INDEPENDENCE

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 3 9/10/2008 CONDITIONING AND INDEPENDENCE MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 3 9/10/2008 CONDITIONING AND INDEPENDENCE Most of the material in this lecture is covered in [Bertsekas & Tsitsiklis] Sections 1.3-1.5

More information

Topology, Math 581, Fall 2017 last updated: November 24, Topology 1, Math 581, Fall 2017: Notes and homework Krzysztof Chris Ciesielski

Topology, Math 581, Fall 2017 last updated: November 24, Topology 1, Math 581, Fall 2017: Notes and homework Krzysztof Chris Ciesielski Topology, Math 581, Fall 2017 last updated: November 24, 2017 1 Topology 1, Math 581, Fall 2017: Notes and homework Krzysztof Chris Ciesielski Class of August 17: Course and syllabus overview. Topology

More information

Survival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen Stanford University

Survival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen Stanford University Survival Analysis: Counting Process and Martingale Lu Tian and Richard Olshen Stanford University 1 Lebesgue-Stieltjes Integrals G( ) is a right-continuous step function having jumps at x 1, x 2,.. b f(x)dg(x)

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

Independent random variables

Independent random variables CHAPTER 2 Independent random variables 2.1. Product measures Definition 2.1. Let µ i be measures on (Ω i,f i ), 1 i n. Let F F 1... F n be the sigma algebra of subsets of Ω : Ω 1... Ω n generated by all

More information

I forgot to mention last time: in the Ito formula for two standard processes, putting

I forgot to mention last time: in the Ito formula for two standard processes, putting I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy

More information

Measure and integration

Measure and integration Chapter 5 Measure and integration In calculus you have learned how to calculate the size of different kinds of sets: the length of a curve, the area of a region or a surface, the volume or mass of a solid.

More information

Measure-theoretic probability

Measure-theoretic probability Measure-theoretic probability Koltay L. VEGTMAM144B November 28, 2012 (VEGTMAM144B) Measure-theoretic probability November 28, 2012 1 / 27 The probability space De nition The (Ω, A, P) measure space is

More information

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets

More information

18.175: Lecture 3 Integration

18.175: Lecture 3 Integration 18.175: Lecture 3 Scott Sheffield MIT Outline Outline Recall definitions Probability space is triple (Ω, F, P) where Ω is sample space, F is set of events (the σ-algebra) and P : F [0, 1] is the probability

More information

Lecture 21 Representations of Martingales

Lecture 21 Representations of Martingales Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let

More information

Probability: Handout

Probability: Handout Probability: Handout Klaus Pötzelberger Vienna University of Economics and Business Institute for Statistics and Mathematics E-mail: Klaus.Poetzelberger@wu.ac.at Contents 1 Axioms of Probability 3 1.1

More information

Week 2. Review of Probability, Random Variables and Univariate Distributions

Week 2. Review of Probability, Random Variables and Univariate Distributions Week 2 Review of Probability, Random Variables and Univariate Distributions Probability Probability Probability Motivation What use is Probability Theory? Probability models Basis for statistical inference

More information

Stochastic Processes and Integrals

Stochastic Processes and Integrals Wayne State University Mathematics Faculty Research Publications Mathematics 1-1-217 Stochastic Processes and Integrals Jose L. Menaldi Wayne State University, menaldi@wayne.edu Recommended Citation Menaldi,

More information

Jointly measurable and progressively measurable stochastic processes

Jointly measurable and progressively measurable stochastic processes Jointly measurable and progressively measurable stochastic processes Jordan Bell jordan.bell@gmail.com Department of Mathematics, University of Toronto June 18, 2015 1 Jointly measurable stochastic processes

More information

Lectures for APM 541: Stochastic Modeling in Biology. Jay Taylor

Lectures for APM 541: Stochastic Modeling in Biology. Jay Taylor Lectures for APM 541: Stochastic Modeling in Biology Jay Taylor November 3, 2011 Contents 1 Distributions, Expectations, and Random Variables 4 1.1 Probability Spaces...................................

More information

Measurability Is Not About Information. Juan Dubra and Federico Echenique. March 2001

Measurability Is Not About Information. Juan Dubra and Federico Echenique. March 2001 Measurability Is Not About Information By Juan Dubra and Federico Echenique March 2001 COWLES FOUNDATION DISCUSSION PAPER NO. 1296 COWLES FOUNDATION FOR RESEARCH IN ECONOMICS YALE UNIVERSITY Box 208281

More information

ABSTRACT EXPECTATION

ABSTRACT EXPECTATION ABSTRACT EXPECTATION Abstract. In undergraduate courses, expectation is sometimes defined twice, once for discrete random variables and again for continuous random variables. Here, we will give a definition

More information

Statistical methods in recognition. Why is classification a problem?

Statistical methods in recognition. Why is classification a problem? Statistical methods in recognition Basic steps in classifier design collect training images choose a classification model estimate parameters of classification model from training images evaluate model

More information

Basic Definitions: Indexed Collections and Random Functions

Basic Definitions: Indexed Collections and Random Functions Chapter 1 Basic Definitions: Indexed Collections and Random Functions Section 1.1 introduces stochastic processes as indexed collections of random variables. Section 1.2 builds the necessary machinery

More information

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A ) 6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined

More information

Martingale Theory for Finance

Martingale Theory for Finance Martingale Theory for Finance Tusheng Zhang October 27, 2015 1 Introduction 2 Probability spaces and σ-fields 3 Integration with respect to a probability measure. 4 Conditional expectation. 5 Martingales.

More information

STAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes

STAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes STAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes This section introduces Lebesgue-Stieltjes integrals, and defines two important stochastic processes: a martingale process and a counting

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

CHAPTER 1. Martingales

CHAPTER 1. Martingales CHAPTER 1 Martingales The basic limit theorems of probability, such as the elementary laws of large numbers and central limit theorems, establish that certain averages of independent variables converge

More information

UCSD ECE153 Handout #27 Prof. Young-Han Kim Tuesday, May 6, Solutions to Homework Set #5 (Prepared by TA Fatemeh Arbabjolfaei)

UCSD ECE153 Handout #27 Prof. Young-Han Kim Tuesday, May 6, Solutions to Homework Set #5 (Prepared by TA Fatemeh Arbabjolfaei) UCSD ECE53 Handout #7 Prof. Young-Han Kim Tuesday, May 6, 4 Solutions to Homework Set #5 (Prepared by TA Fatemeh Arbabjolfaei). Neural net. Let Y = X + Z, where the signal X U[,] and noise Z N(,) are independent.

More information

Course 212: Academic Year Section 1: Metric Spaces

Course 212: Academic Year Section 1: Metric Spaces Course 212: Academic Year 1991-2 Section 1: Metric Spaces D. R. Wilkins Contents 1 Metric Spaces 3 1.1 Distance Functions and Metric Spaces............. 3 1.2 Convergence and Continuity in Metric Spaces.........

More information

Exercise Exercise Homework #6 Solutions Thursday 6 April 2006

Exercise Exercise Homework #6 Solutions Thursday 6 April 2006 Unless otherwise stated, for the remainder of the solutions, define F m = σy 0,..., Y m We will show EY m = EY 0 using induction. m = 0 is obviously true. For base case m = : EY = EEY Y 0 = EY 0. Now assume

More information

Lecture 2: Random Variables and Expectation

Lecture 2: Random Variables and Expectation Econ 514: Probability and Statistics Lecture 2: Random Variables and Expectation Definition of function: Given sets X and Y, a function f with domain X and image Y is a rule that assigns to every x X one

More information

Lecture 5: Expectation

Lecture 5: Expectation Lecture 5: Expectation 1. Expectations for random variables 1.1 Expectations for simple random variables 1.2 Expectations for bounded random variables 1.3 Expectations for general random variables 1.4

More information

GEOMETRY AND PROBABILITY Math Fall Renato Feres

GEOMETRY AND PROBABILITY Math Fall Renato Feres GEOMETRY AND PROBABILITY Math 545 - Fall 2001 Renato Feres Contents Chapter 1. An Informal Introduction to Brownian Motion 5 1.1. Stochastic Differential Equations 5 1.2. The Microscopic View 6 1.3. Temperature

More information

Lecture 11. Probability Theory: an Overveiw

Lecture 11. Probability Theory: an Overveiw Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

Lecture 12: Multiple Random Variables and Independence

Lecture 12: Multiple Random Variables and Independence EE5110: Probability Foundations for Electrical Engineers July-November 2015 Lecture 12: Multiple Random Variables and Independence Instructor: Dr. Krishna Jagannathan Scribes: Debayani Ghosh, Gopal Krishna

More information

This chapter contains a very bare summary of some basic facts from topology.

This chapter contains a very bare summary of some basic facts from topology. Chapter 2 Topological Spaces This chapter contains a very bare summary of some basic facts from topology. 2.1 Definition of Topology A topology O on a set X is a collection of subsets of X satisfying the

More information