MATH 6605: SUMMARY LECTURE NOTES

These notes summarize the lectures on weak convergence of stochastic processes. If you see any typos, please let me know.

1. Construction of Stochastic Processes

A stochastic process is a collection $\{X_t,\ t \in T\}$ of random variables on a probability space $(\Omega, \mathcal{F}, P)$. A natural interpretation of the index is time, but many other indices are possible.

Example 1. Simple symmetric random walk ($T = \mathbb{N}$) and Poisson process ($T = \mathbb{R}_+$).

For each $t_1, t_2, \ldots, t_k \in T$, the random vector $(X_{t_1}, \ldots, X_{t_k})$ has some distribution. Let's denote this as
\[ \mu_{t_1,\ldots,t_k}(H) = P\big( (X_{t_1}, \ldots, X_{t_k}) \in H \big), \qquad H \in \mathcal{B}^k, \]
where $\mathcal{B}^k$ are the Borel sets of $\mathbb{R}^k$. The collection of these distributions as the sets $t_1, t_2, \ldots, t_k \in T$ vary,
\[ \{ \mu_{t_1,\ldots,t_k} : k \in \mathbb{N},\ t_1, t_2, \ldots, t_k \in T \text{ distinct} \}, \]
forms the finite dimensional distributions (f.d.d.) of the process $\{X_t,\ t \in T\}$. Equivalently, we could specify the finite dimensional cumulative distribution functions and still have the same amount of information.

For any distinct $t_1, t_2, \ldots, t_k \in T$, the f.d.d.'s satisfy the consistency conditions:

C1. $\mu_{t_1,\ldots,t_k}(H_1 \times \cdots \times H_k) = \mu_{t_{s_1},\ldots,t_{s_k}}(H_{s_1} \times \cdots \times H_{s_k})$ for $H_i \in \mathcal{B}$, $i = 1, \ldots, k$, and any permutation $(s_1, \ldots, s_k)$ of $(1, \ldots, k)$.

C2. $\mu_{t_1,t_2,\ldots,t_k}(\mathbb{R} \times H_2 \times \cdots \times H_k) = \mu_{t_2,\ldots,t_k}(H_2 \times \cdots \times H_k)$ for $H_i \in \mathcal{B}$, $i = 2, \ldots, k$.

It turns out that any collection of distributions that satisfies the consistency conditions specifies a unique measure for a stochastic process. That is:

Theorem 1 (Kolmogorov Existence Theorem). A family of Borel probability measures $\{\mu_{t_1,\ldots,t_k} ;\ k \in \mathbb{N},\ t_1, \ldots, t_k \in T \text{ distinct}\}$ satisfies the consistency conditions C1 and C2 if and only if there exist a unique triple $(\Omega = \mathbb{R}^T, \mathcal{F}^T, P)$ and random variables $\{X_t ;\ t \in T\}$ defined on this triple, such that for all $k \in \mathbb{N}$, distinct $t_1, \ldots, t_k \in T$, and Borel $H \subseteq \mathbb{R}^k$,
\[ P\big( (X_{t_1}, \ldots, X_{t_k}) \in H \big) = \mu_{t_1,\ldots,t_k}(H). \]
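As a concrete illustration (not part of the original notes), the f.d.d.'s of standard Brownian motion are the mean-zero Gaussian laws with covariance $\mathrm{Cov}(B_s, B_t) = \min(s,t)$; for Gaussian families, consistency condition C2 amounts to deleting the corresponding row and column of the covariance matrix. A minimal NumPy sketch (function names here are illustrative only):

```python
import numpy as np

def bm_fdd_cov(times):
    """Covariance matrix of (B_{t_1}, ..., B_{t_k}) for standard Brownian
    motion, using Cov(B_s, B_t) = min(s, t)."""
    t = np.asarray(times, dtype=float)
    return np.minimum.outer(t, t)

# Consistency condition C2: marginalizing out the first coordinate of
# mu_{t1,t2,t3} should give mu_{t2,t3}.  For these Gaussian f.d.d.'s this
# reduces to deleting the t1 row and column of the covariance matrix.
times = [0.2, 0.5, 0.9]
full = bm_fdd_cov(times)
assert np.allclose(full[1:, 1:], bm_fdd_cov(times[1:]))

# Sampling from one finite dimensional distribution:
rng = np.random.default_rng(0)
print(rng.multivariate_normal(mean=np.zeros(len(times)), cov=full, size=3))
```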

A key quantity in Theorem 1 is the σ-algebra $\mathcal{F}^T$. This is the σ-algebra generated by the cylinder sets:
\[ \mathcal{F}^T = \sigma\big( \{X_t \in H\} ;\ t \in T,\ H \subseteq \mathbb{R} \text{ Borel} \big). \]
One way to think of the cylinder sets is by imagining a finite number of windows at a finite number of times. The cylinder set contains all of the functions which pass through each of the windows. The σ-algebra $\mathcal{F}^T$ is sufficiently detailed to answer questions about $\Omega = \mathbb{R}^T$ if $T$ is countable. However, if $T$ is uncountable, $\mathbb{R}^T$ is much too large and $\mathcal{F}^T$ is much too small to answer questions of interest. See, for example, Chapter 36 in [B2], the section titled "The inadequacy of $\mathcal{F}^T$".

Example 2. Brownian Motion. Brownian motion has many properties which are quite well-known. One of these is that it is continuous a.s. However, $\mathcal{F}^T$ is too small a σ-algebra to ask this question. Instead, we will show that there exists a version of Brownian motion which has continuous sample paths.

Let $C[0,1]$ be the space of continuous functions on the closed interval $[0,1]$ (we could also work with $C[0,10]$ or $C[0,\infty)$; the choice $C[0,1]$ is made for convenience). This is a complete separable metric space for the metric $d(x,y) = \sup_{t \in [0,1]} |x(t) - y(t)|$ (uniform convergence). The next result tells us what conditions on the finite dimensional distributions guarantee the existence of a process with continuous sample paths.

Theorem 2. Let $X(t)$ be a stochastic process such that
\[ E\big[ |X(t) - X(s)|^{\beta} \big] \le M |t - s|^{1+\alpha} \]
for some $\beta, \alpha > 0$ and $M < \infty$. Then the stochastic process can be realized on the space of continuous functions, $C[0,1]$.

The idea of the proof is to show that the polygonal approximation $X^{(n)}(t)$ (linear interpolation of $X$ on the dyadic points $i 2^{-n}$) converges uniformly. The proof appears, for example, in Chapter 5 of [V]. Notably, the rate of the approximation depends only on the parameters $M, \alpha, \beta$. Specifically, one can show that there exists a finite constant $C = C(M, \alpha)$ so that
\[ P\Big( \sup_{t \in [0,1]} |X^{(n)}(t) - X(t)| > \gamma \Big) \le \frac{C}{\gamma^{\beta}}\, 2^{-n\alpha}. \tag{1.1} \]
In particular, the theorem tells us that there exists a version of $X(t)$ which is continuous a.s. and has the correct finite dimensional distributions.

Example 3. Check the property for Brownian Motion.
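For Example 3, recall that $B_t - B_s \sim N(0, t-s)$, so $E[|B_t - B_s|^4] = 3|t-s|^2$ and the hypothesis of Theorem 2 holds with $\beta = 4$, $\alpha = 1$, $M = 3$. A quick Monte Carlo check of this moment identity (an illustrative NumPy sketch, not part of the original notes):

```python
import numpy as np

# E[|B_t - B_s|^4] for Brownian motion: the increment is N(0, t - s), whose
# fourth moment is 3 (t - s)^2.  This verifies Theorem 2 with beta=4, alpha=1.
rng = np.random.default_rng(1)
s, t = 0.3, 0.7
increments = rng.normal(0.0, np.sqrt(t - s), size=10**6)
print(np.mean(increments**4))    # Monte Carlo estimate
print(3 * (t - s) ** 2)          # exact value, 0.48
```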

Therefore we may realize Brownian Motion on the triple $(\Omega = C[0,1], \mathcal{F}^T, P)$. Let $\mathcal{B}$ denote the σ-algebra generated by the open sets of $C[0,1]$. It is not difficult to show that $\mathcal{B} = \mathcal{F}^T$. This is important in light of how we handle weak convergence (see Section 2).

Example 4. Define the Brownian Bridge. Check for continuity.

Theorem 3. Brownian Motion is nowhere differentiable.

The proof appears, for example, in Section 7.2 of [D].

2. Weak Convergence in Metric Spaces

This section summarizes the main results on weak convergence in metric spaces. For references, see the appropriate sections in [B1, S, V].

Let $\mathcal{X}$ be a separable metric space, and let $\mathcal{B}$ denote the σ-algebra generated by the open sets of $\mathcal{X}$. Note that this guarantees that all continuous functions are measurable. A sequence of probability measures on $(\mathcal{X}, \mathcal{B})$ converges weakly, $\mu_n \Rightarrow \mu$, if
\[ \int f\, d\mu_n \to \int f\, d\mu \]
for all continuous bounded functions $f : \mathcal{X} \to \mathbb{R}$. If $X_n$ is a sequence of random variables on $\mathcal{X}$ with $\mathcal{L}(X_n) = \mu_n$ and $\mathcal{L}(X) = \mu$, then the statements $X_n \Rightarrow X$ and $\mu_n \Rightarrow \mu$ are equivalent.

Theorem 4. The following are equivalent.
(1) $\mu_n \Rightarrow \mu$;
(2) $\limsup_n \mu_n(C) \le \mu(C)$ for all closed $C$;
(3) $\liminf_n \mu_n(G) \ge \mu(G)$ for all open $G$;
(4) $\lim_n \mu_n(A) = \mu(A)$ for all $A \in \mathcal{B}$ such that $\mu(\partial A) = 0$.

Theorem 5 (Continuous Mapping Theorem). Suppose that $X_n \Rightarrow X$ on $\mathcal{X}$ and let $g : \mathcal{X} \to \mathbb{R}$ be a function which is continuous $\mathcal{L}(X)$-a.s. Then $g(X_n) \Rightarrow g(X)$.

Similar to our definitions before, we say that a sequence $\{\mu_n\}$ of probability measures on $\mathcal{X}$ is tight if for all $\varepsilon > 0$ there exists a compact set $K = K(\varepsilon) \subseteq \mathcal{X}$ such that $\sup_n \mu_n(K^c) < \varepsilon$. For $\mathcal{X} = \mathbb{R}^d$, the definition is the same if we replace the set $K$ with a bounded rectangle.

Theorem 6. Suppose that $\{\mu_n\}$ is tight and has only one limit point $\{\mu\}$. Then $\mu_n \Rightarrow \mu$.

We have already studied and proved these results for $\mathcal{X} = \mathbb{R}$ or $\mathcal{X} = \mathbb{R}^d$. We did not prove the general case in lecture, as the proofs are similar in nature.
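To make the definition of weak convergence concrete, here is an illustrative NumPy sketch (not part of the original notes) that tests $\int f\, d\mu_n \to \int f\, d\mu$ for one bounded continuous $f$, with $\mu_n$ the law of a standardized sum of $n$ Rademacher variables and $\mu$ the standard normal law; by the CLT, $\mu_n \Rightarrow \mu$ on $\mathbb{R}$, and $E[\cos Z] = e^{-1/2} \approx 0.6065$ for $Z \sim N(0,1)$:

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    return np.cos(x)           # a bounded continuous test function

def integral_f_mu_n(n, reps=50_000):
    """Monte Carlo estimate of int f d(mu_n), mu_n = law of S_n / sqrt(n)."""
    steps = rng.choice([-1.0, 1.0], size=(reps, n))
    x_n = steps.sum(axis=1) / np.sqrt(n)
    return f(x_n).mean()

print([round(integral_f_mu_n(n), 3) for n in (4, 16, 64, 256)])
print(round(np.exp(-0.5), 3))  # int f d(mu) = E[cos Z] = exp(-1/2)
```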

3. Weak Convergence in C[0, 1]

Our next goal is to apply the results of Section 2 to weak convergence of stochastic processes in $C[0,1]$. By Theorem 6, to check weak convergence of stochastic processes it is enough to check (1) that the sequence of measures is tight, and (2) that the finite dimensional distributions converge. Checking tightness can be quite a work-out at times. We will look at three different ways of doing this. But first, we need to figure out what compact sets in $C[0,1]$ look like. A family of functions $\mathcal{F}$ is equicontinuous if
\[ \lim_{\delta \to 0} \sup_{x \in \mathcal{F}} \sup_{|s-t| < \delta} |x(t) - x(s)| = 0. \]
This allows us to characterize the compact sets in $C[0,1]$.

Theorem 7 (Ascoli-Arzelà). A set $K \subseteq C[0,1]$ has compact closure if it is equicontinuous and $\sup_{x \in K} |x(0)| < \infty$.

First Results. An immediate consequence of Theorem 7 is the following.

Proposition 8. Suppose that $X_n \in C[0,1]$ and that
(1) $\lim_{M \to \infty} \sup_n P(|X_n(0)| > M) = 0$;
(2) for all $\epsilon > 0$, $\lim_{\delta \to 0} \sup_n P\big( \sup_{|s-t| \le \delta} |X_n(t) - X_n(s)| > \epsilon \big) = 0$.
Then $\{\mathcal{L}(X_n)\}$ is tight.

The result gives a direct brute-force approach to checking tightness in many cases. The difficulty of course lies in the presence of the sup inside the probability. The next theorem gives conditions which are much easier to check.

Theorem 9. Suppose that $\mu_n = \mathcal{L}(X_n)$ where $X_n : [0,1] \to \mathbb{R}$. If
\[ E\big[ |X_n(t) - X_n(s)|^{\beta} \big] \le M |t - s|^{1+\alpha} \]
for some $M < \infty$ and $\alpha, \beta > 0$, all independent of $n$, and if
\[ \sup_n P(|X_n(0)| > \gamma) \to 0 \]
as $\gamma \to \infty$, then $\{\mu_n\}$ is a tight sequence of probability measures on $C[0,1]$.

Sketch of proof. In class I sketched a proof of this result. A key to this was taking (1.1) as fact.

Let $X$ be any stochastic process which satisfies the conditions of the theorem, and let $X^{(m)}$ denote its polygonal approximation on $\{0, 1/2^m, \ldots, 1\}$. For $\delta = 2^{-m}$,
\[
\begin{aligned}
P\Big( \sup_{|t-s| \le \delta} |X^{(m)}(t) - X^{(m)}(s)| > \epsilon \Big)
&\le P\Big( \sup_{i} \Big| X\Big(\tfrac{i+1}{2^m}\Big) - X\Big(\tfrac{i}{2^m}\Big) \Big| > \tfrac{\epsilon}{2} \Big) \\
&\le 2^m \sup_{i} P\Big( \Big| X\Big(\tfrac{i+1}{2^m}\Big) - X\Big(\tfrac{i}{2^m}\Big) \Big| > \tfrac{\epsilon}{2} \Big) \\
&\le 2^m \sup_{i} \epsilon^{-\beta} 2^{\beta}\, E\Big[ \Big| X\Big(\tfrac{i+1}{2^m}\Big) - X\Big(\tfrac{i}{2^m}\Big) \Big|^{\beta} \Big] && \text{(Chebyshev)} \\
&\le \epsilon^{-\beta} 2^{\beta} M\, 2^{m} 2^{-m(1+\alpha)} = \epsilon^{-\beta} 2^{\beta} M\, 2^{-\alpha m}. && \text{(assumptions)}
\end{aligned}
\]
Therefore, together with (1.1), we obtain that
\[
P\Big( \sup_{|t-s| \le \delta} |X(t) - X(s)| > \epsilon \Big)
\le P\Big( \sup_{|t-s| \le \delta} |X^{(m)}(t) - X^{(m)}(s)| > \tfrac{\epsilon}{3} \Big)
+ 2\, P\Big( \sup_{t} |X^{(m)}(t) - X(t)| > \tfrac{\epsilon}{3} \Big)
\le \frac{6^{\beta} M + 2 \cdot 3^{\beta}\, C(M,\alpha)}{\epsilon^{\beta}}\, \delta^{\alpha}.
\]
Since the bounds are uniform in any choice of $X$ satisfying the conditions of the theorem, the result follows.

Example 5. Check that the continuous polygonal version of the re-scaled random walk $\tfrac{1}{\sqrt{n}} X_{nt}$ converges to Brownian motion.

Empirical Processes. Let $X_1, X_2, \ldots$ be iid random variables with some cumulative distribution function $F$. The empirical distribution function is defined as
\[ F_n(x) = \frac{1}{n} \sum_{i=1}^{n} I(X_i \le x). \]
This quantity is of key interest in statistics. Our goal is to show the following two results.

Theorem 10 (Glivenko-Cantelli). $\sup_x |F_n(x) - F(x)| \to 0$, a.s.
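As a numerical illustration (a sketch, not part of the original notes, assuming Uniform[0,1] samples so that $F(x) = x$): the supremum $\sup_x |F_n(x) - F(x)|$ is attained at a sample point and can be computed from the order statistics. The raw distance shrinks (Glivenko-Cantelli), while the $\sqrt{n}$-scaled distance stays of order one, which is the content of the next theorem.

```python
import numpy as np

rng = np.random.default_rng(3)

def ks_distance(sample):
    """sup_x |F_n(x) - x| for a Uniform[0,1] sample, via order statistics."""
    x = np.sort(sample)
    n = len(x)
    d_plus = (np.arange(1, n + 1) / n - x).max()   # F_n(x_i) - x_i
    d_minus = (x - np.arange(0, n) / n).max()      # x_i - F_n(x_i-)
    return max(d_plus, d_minus)

for n in (10**2, 10**3, 10**4, 10**5):
    d = ks_distance(rng.uniform(size=n))
    print(n, round(d, 4), round(np.sqrt(n) * d, 4))
```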

Theorem 11 (Donsker). Let $U(x)$ denote the Brownian Bridge. Then
\[ \sup_x \sqrt{n}\, |F_n(x) - F(x)| \Rightarrow \sup_x |U(F(x))|. \]
One can also show that
\[ P\Big( \sup_{0 \le t \le 1} |U(t)| \ge \alpha \Big) = 2 \sum_{k=1}^{\infty} (-1)^{k+1} e^{-2k^2\alpha^2}. \]

In this section we will prove the above results by showing:

Theorem 12. Let $U(x)$ denote the Brownian Bridge, and suppose that $F(x) = x$; that is, we are sampling from the Uniform[0,1] distribution. Then
\[ Y_n(x) := \{ \sqrt{n}(F_n(x) - x) \} \Rightarrow \{ U(F(x)) \}. \]

Exercise 1. Show that Theorem 13 implies Theorems 10 and 11.

Exercise 2. Show that Theorem 13 holds in finite dimensional distributions.

As before, to prove Theorem 13, we need to first place $F_n$ in the space of continuous functions. To do this, define $G_n$ to be the distribution function corresponding to a uniform distribution of mass $1/n$ over the intervals $[X_i, X_{i+1}]$ for $i = 1, \ldots, n$ (with the sample points written in increasing order), where $X_0 = 0$ and $X_{n+1} = 1$. It is not difficult to show that
\[ |F_n(x) - G_n(x)| \le \frac{1}{n}, \qquad 0 \le x \le 1. \tag{3.2} \]
The theorem we will prove is the following.

Theorem 13. Let $U(x)$ denote the Brownian Bridge, and suppose that $F(x) = x$; that is, we are sampling from the Uniform[0,1] distribution. Then
\[ Z_n(x) := \{ \sqrt{n}(G_n(x) - x) \} \Rightarrow \{ U(x) \}. \]

Since we already have convergence of finite dimensional distributions (exercise), by (3.2) and Proposition 8 [correction: actually, I should refer to Proposition 16 here], to prove tightness it remains to show that for all $\eta > 0$,
\[ P\Big( \sup_{|t-s| \le \delta} |Y_n(t) - Y_n(s)| > \eta \Big) \to 0 \]
as $\delta \to 0$. Because $F(x) = x$, it is enough to check
\[ P\Big( \sup_{t \le \delta} |Y_n(t)| > \eta \Big) \to 0. \]

Define the quantities
\[ M_m = \max_{1 \le i \le m} \min\big\{ |Y_n(\delta i/m)|,\ |Y_n(\delta) - Y_n(\delta i/m)| \big\}, \qquad M_m' = \max_{1 \le i \le m} |Y_n(\delta i/m)|. \]
Since $M_m' \le M_m + |Y_n(\delta)|$,
\[ P(M_m' > \eta) \le P(M_m > \eta/2) + P(|Y_n(\delta)| > \eta/2). \]
We will show that $P(M_m > \eta/2) \le B \eta^{-4} \delta^2$ in a little bit. By right-continuity of $F_n$ and Chebyshev, we can let $m \to \infty$ in the above to get that
\[ P\Big( \sup_{t \le \delta} |Y_n(t)| > \eta \Big) \le \frac{B\, \delta^2}{\eta^4} + \frac{2^4}{\eta^4}\, E\big[ |Y_n(\delta)|^4 \big]. \]
Noting that $\lim_n E[|Y_n(\delta)|^4] \le 3\delta^2$ completes the proof.

Theorem 14. Let $\xi_1, \xi_2, \ldots$ be random variables (not necessarily iid), and define $S_k = \xi_1 + \xi_2 + \cdots + \xi_k$, with $S_0 = 0$. Let
\[ M_m = \max_{0 \le k \le m} \min\big\{ |S_k|,\ |S_m - S_k| \big\}. \]
Suppose there exist non-negative numbers $u_1, u_2, \ldots$ such that
\[ E\big[ |S_j - S_i|^2\, |S_k - S_j|^2 \big] \le \Big( \sum_{i < l \le k} u_l \Big)^{2}, \qquad 0 \le i \le j \le k \le m. \]
Then there exists a universal constant $K \ge 1$ such that
\[ P(M_m \ge \lambda) \le \frac{K}{\lambda^4} (u_1 + \cdots + u_m)^2. \]

Exercise 3. Show that Theorem 14 implies the remaining bound $P(M_m > \eta/2) \le B \eta^{-4} \delta^2$, for some positive $B < \infty$. That is, let $\xi_i = Y_n(\delta i/m) - Y_n(\delta (i-1)/m)$, and show that in this case $S_k$ satisfies the displayed condition of Theorem 14.

Using Martingale Theory. I will omit the write-up of the basic martingale material, since it is not only fairly straightforward, but also appears in Chapter 14 of [R]. The key result that we are interested in is:

Theorem 15. If $\{X_n\}$ is a submartingale, then for all $\alpha > 0$,
\[ P\Big( \max_{0 \le i \le n} X_i \ge \alpha \Big) \le \frac{E[|X_n|]}{\alpha}. \]
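A quick Monte Carlo sanity check of Theorem 15 (an illustrative sketch, not part of the original notes): for the simple symmetric random walk $S_k$, the process $|S_k|$ is a submartingale, so $P(\max_{k \le n} |S_k| \ge \alpha) \le E[|S_n|]/\alpha$.

```python
import numpy as np

# Check Doob's maximal inequality for the submartingale |S_k|, where S_k is a
# simple symmetric random walk of length n.
rng = np.random.default_rng(4)
n, reps, alpha = 100, 50_000, 25.0

steps = rng.choice([-1, 1], size=(reps, n))
paths = np.cumsum(steps, axis=1)                    # S_1, ..., S_n per path
lhs = np.mean(np.abs(paths).max(axis=1) >= alpha)   # P(max_k |S_k| >= alpha)
rhs = np.mean(np.abs(paths[:, -1])) / alpha         # E[|S_n|] / alpha
print(lhs, rhs)   # the left-hand side should not exceed the right-hand side
```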

In this section we will apply Theorem 15 to provide an alternative proof of Example 5. First, though, I need to address a technical issue. I realize that Proposition 8 is a little too stringent for our purposes, and so we will next state a slightly easier version.

Proposition 16. Suppose that $X_n \in C[0,1]$ and let $\mu_n = \mathcal{L}(X_n)$. The sequence $\{\mu_n\}$ is tight if and only if
(1) $\lim_{M \to \infty} \sup_n P(|X_n(0)| > M) = 0$;
(2) for all $\epsilon, \eta > 0$, there exist a $\delta \in (0,1)$ and an integer $n_0$ such that
\[ \sup_{n \ge n_0} P\Big( \sup_{|s-t| \le \delta} |X_n(t) - X_n(s)| > \epsilon \Big) \le \eta. \]

Sketch of proof. The result follows from Proposition 8 and the fact that any random variable $X \in C[0,1]$ has a tight measure. This last fact is a bit technical, and its proof appears, for example, in [B1], page 10, Theorem 1.4. Therefore, by the Arzelà-Ascoli result, for any fixed $n$,
\[ \lim_{\delta \to 0} P\Big( \sup_{|s-t| \le \delta} |X_n(t) - X_n(s)| > \epsilon \Big) = 0. \]
Now, since each of $\mu_1, \mu_2, \ldots, \mu_{n_0}$ is tight, we can strengthen condition (2) above, by decreasing $\delta$ if necessary, to obtain condition (2) of Proposition 8.

Example 6. Let $X_n$ be a symmetric random walk. Using Proposition 16 and Theorem 15, show that $\tfrac{1}{\sqrt{n}} X_{nt}$ converges to Brownian motion.

4. Continuous-time Martingales

Consider a stochastic process $\{X_t\}_{t \ge 0}$ defined on $(\Omega, \mathcal{F}, P)$. A filtration is a family of σ-algebras $\{\mathcal{F}_t\}_{t \ge 0}$ such that $\mathcal{F}_s \subseteq \mathcal{F}_t \subseteq \mathcal{F}$ for all $s \le t$. The process $X_t$ is a martingale with respect to the filtration $\{\mathcal{F}_t\}$ if
(1) $E[|X_t|] < \infty$;
(2) $E[X_t \mid \mathcal{F}_s] = X_s$ for all $s \le t$.
As before, if the equality is replaced with $\ge$ then we have a submartingale, and if it is replaced with $\le$ then we have a supermartingale.

Example 7. $B_t$ and $B_t^2 - t$ are both martingales with respect to the canonical filtration of Brownian motion, $\mathcal{F}_t = \sigma(B_u,\ u \le t)$.

Example 8. A stochastic process $X_t$ is a Brownian motion if and only if for all $\lambda$, $Z_t = \exp\{\lambda X_t - \lambda^2 t/2\}$ is a martingale.
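A quick numerical check of Example 8 (an illustrative sketch, not part of the original notes): since $B_t \sim N(0,t)$, the exponential martingale has constant expectation $E[Z_t] = E[\exp\{\lambda B_t - \lambda^2 t/2\}] = 1$ for every $t$.

```python
import numpy as np

# Monte Carlo check that E[exp(lambda*B_t - lambda^2*t/2)] = 1 for each t.
rng = np.random.default_rng(5)
lam = 1.5
for t in (0.5, 1.0, 2.0):
    b_t = rng.normal(0.0, np.sqrt(t), size=10**6)
    z_t = np.exp(lam * b_t - lam**2 * t / 2)
    print(t, round(z_t.mean(), 3))   # each estimate should be close to 1
```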

Exercise 4. Let $N_t$ be a Poisson process with rate $\lambda$. Find functions $a(t), b(t)$ such that $N_t - a(t)$ and $N_t^2 - b(t)$ are both martingales with $\mathcal{F}_t = \sigma(N_u,\ u \le t)$.

A non-negative random variable $\tau$ is a stopping time with respect to a filtration $\{\mathcal{F}_t\}_{t \ge 0}$ if for each $t \ge 0$, $\{\omega : \tau(\omega) \le t\} \in \mathcal{F}_t$. That is, by time $t$, you know whether or not you have stopped. There is also a continuous version of the optional sampling theorem.

Theorem 17 (Optional Sampling Theorem). Let $M_t$ be a right-continuous martingale with respect to $\{\mathcal{F}_t\}_{t \ge 0}$, and let $\tau, \sigma$ be two stopping times such that $0 \le \sigma \le \tau \le M < \infty$. Then $E[M_\sigma] = E[M_\tau]$. In particular, it follows that if $\tau$ is a bounded stopping time, then $E[M_\tau] = E[M_0]$.

Note that the theorem does not hold if the stopping time is not bounded. This idea may be used to find the distribution of the hitting time of Brownian motion.

Exercise 5. Let $\tau = \inf\{t \ge 0 : B_t \ge 1\}$. Then $\tau$ is not a bounded stopping time, but $\tau \wedge n$ is, for any $n$. Therefore, we may apply the optional sampling theorem to the martingale in Example 8 and $\tau \wedge n$. Letting $n$ go to infinity (using the BCT), we obtain that
\[ E\big[ \exp\{-\lambda^2 \tau/2\} \big] = \exp\{-\lambda\}. \]
A transformation and inversion of the Laplace transform gets us to
\[ P(\tau \le t) = \int_0^t \frac{e^{-\frac{1}{2u}}}{\sqrt{2\pi u^3}}\, du. \]

Another way of computing this distribution is through the reflection principle.

Theorem 18. If $B_t$ is a Brownian motion starting at $0$, then for any $a > 0$,
\[ P\Big( \sup_{0 \le s \le t} B_s \ge a \Big) = 2\, P(B_t \ge a). \]
Here, we have simply that
\[ P(\tau \le t) = P\Big( \sup_{0 \le s \le t} B_s \ge 1 \Big) = 2\, P(B_t \ge 1), \]
and a simple transformation yields the result.

Another application (albeit much more tedious) of the reflection principle is the proof of the next result.

Theorem 19. Let $U(t)$ denote a Brownian Bridge. Then
\[ P\Big( \sup_{0 \le s \le 1} |U(s)| \le b \Big) = 1 - 2 \sum_{k \ge 1} (-1)^{k+1} e^{-2k^2 b^2}. \]
The proof appears in, for example, [D].
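As an illustration of Theorem 19 (a sketch, not part of the original notes): a Brownian bridge can be simulated on a fine grid as $U(t) = B(t) - t B(1)$, and the empirical distribution of $\sup_{0 \le s \le 1} |U(s)|$ compared against the series; the grid maximum slightly undershoots the true supremum.

```python
import numpy as np

rng = np.random.default_rng(6)
m, reps, b = 1000, 5000, 1.0

dt = 1.0 / m
bm = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(reps, m)), axis=1)  # B(i/m)
t = np.arange(1, m + 1) * dt
bridge = bm - t * bm[:, [-1]]                      # U(t) = B(t) - t B(1)
mc = np.mean(np.abs(bridge).max(axis=1) <= b)

k = np.arange(1, 50)
series = 1 - 2 * np.sum((-1.0) ** (k + 1) * np.exp(-2 * k**2 * b**2))
print(round(mc, 3), round(series, 3))              # both close to 0.73 for b = 1
```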

References

[B1] Patrick Billingsley; Convergence of Probability Measures, Wiley.
[B2] Patrick Billingsley; Probability and Measure, 3rd Edition, Wiley.
[D] Rick Durrett; Probability: Theory and Examples, 3rd Edition, Duxbury Press.
[R] Jeff Rosenthal; A First Look at Rigorous Probability Theory, 2nd Edition, World Scientific.
[S] Galen Shorack; Probability for Statisticians, Springer.
[V] S.R.S. Varadhan; Stochastic Processes, Courant.

Prepared by Hanna K. Jankowski
Department of Mathematics and Statistics, York University
hkj@mathstat.yorku.ca
