Survival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen, Stanford University
2 Lebesgue–Stieltjes Integrals

$G(\cdot)$ is a right-continuous step function having jumps at $x_1, x_2, \ldots$. For such $G$,

$$\int_a^b f(x)\,dG(x) = \sum_{a < x_j \le b} f(x_j)\{G(x_j) - G(x_j-)\} = \sum_{a < x_j \le b} f(x_j)\,\Delta G(x_j).$$
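As a concrete illustration, the sum form of the integral is easy to compute directly. The function name `ls_integral` and the example data below are our own, not from the slides:

```python
# Lebesgue-Stieltjes integral of f with respect to a right-continuous step
# function G with jumps at x_1, x_2, ...: the integral over (a, b] reduces to
# a sum of f(x_j) times the jump sizes Delta G(x_j).

def ls_integral(f, jump_points, jump_sizes, a, b):
    """Compute int_a^b f(x) dG(x) for a step function G."""
    return sum(f(x) * dG for x, dG in zip(jump_points, jump_sizes) if a < x <= b)

# Example: G jumps by 1 at x = 1, 2, 3 (G counts points), f(x) = x**2.
# Only the jumps at x = 1 and x = 2 fall in (0, 2.5].
total = ls_integral(lambda x: x ** 2, [1.0, 2.0, 3.0], [1.0, 1.0, 1.0], 0.0, 2.5)
# total = 1^2 + 2^2 = 5.0
```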
3 One-sample problem

In the one-sample problem the observations are $(U_i, \delta_i)$, $i = 1, 2, \ldots, n$. Let

$$N(t) = \sum_{i=1}^n I(U_i \le t, \delta_i = 1)$$

be the number of observed failures by time $t$, and let $\tau_1 < \cdots < \tau_K$ denote the distinct observed failure times, with $d_j$ failures at $\tau_j$. Then

$$\int_0^\infty f(t)\,dN(t) = \sum_{j=1}^K f(\tau_j)\,d_j.$$

Taking $f(t) = \{\sum_{i=1}^n I(U_i \ge t)\}^{-1} = Y(t)^{-1}$, the reciprocal of the number at risk, gives

$$\int_0^\infty f(t)\,dN(t) = \sum_{j=1}^K \frac{d_j}{Y(\tau_j)},$$

and integrating over $[0, t]$ instead gives $\sum_{j: \tau_j \le t} d_j / Y(\tau_j)$, the NA (Nelson–Aalen) estimator of the cumulative hazard function!
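The same computation is easy to carry out on data. Below is a minimal sketch of the Nelson–Aalen estimator as the stochastic integral $\int_0^t dN(s)/Y(s)$; the function name and the toy data are our own illustration:

```python
# Nelson-Aalen estimator of the cumulative hazard, computed as the
# stochastic integral int_0^t dN(s) / Y(s) = sum_{tau_j <= t} d_j / Y(tau_j).

def nelson_aalen(times, events):
    """Return (tau_j, H_hat(tau_j)) at each distinct observed failure time.

    times  : observed times U_i (failure or censoring)
    events : indicators delta_i (1 = failure, 0 = censored)
    """
    taus = sorted({u for u, d in zip(times, events) if d == 1})
    H, out = 0.0, []
    for tau in taus:
        d_j = sum(1 for u, d in zip(times, events) if u == tau and d == 1)  # failures at tau
        y_j = sum(1 for u in times if u >= tau)                             # number at risk
        H += d_j / y_j
        out.append((tau, H))
    return out

# Example: failures at t = 1, 3; censoring at t = 2, 4 (n = 4).
est = nelson_aalen([1, 2, 3, 4], [1, 0, 1, 0])
# est = [(1, 0.25), (3, 0.75)]:  H(1) = 1/4, H(3) = 1/4 + 1/2.
```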
4 Two-sample problem

In the two-sample problem the observations are $(U_i, \delta_i, Z_i)$, $i = 1, 2, \ldots, n$, with $Z_i \in \{0, 1\}$ the group label. For $l = 0, 1$, let

$$N_l(t) = \sum_{i=1}^n I(U_i \le t, \delta_i = 1, Z_i = l) \quad \text{(observed failures in group } l \text{ by time } t\text{)},$$

$$Y_l(t) = \sum_{i=1}^n I(U_i \ge t, Z_i = l) \quad \text{(number at risk in group } l \text{ at time } t\text{)},$$

and $Y(t) = Y_0(t) + Y_1(t)$. Consider the stochastic integral

$$W = \int_0^\infty \frac{Y_0(t)}{Y(t)}\,dN_1(t) - \int_0^\infty \frac{Y_1(t)}{Y(t)}\,dN_0(t).$$
5 Two-sample problem

$$W = \sum_{j=1}^K \frac{Y_0(\tau_j)}{Y(\tau_j)}\,\Delta N_1(\tau_j) - \sum_{j=1}^K \frac{Y_1(\tau_j)}{Y(\tau_j)}\,\Delta N_0(\tau_j)
= \sum_{j=1}^K \left[ \frac{Y_0(\tau_j)}{Y(\tau_j)}\,d_{1j} - \frac{Y_1(\tau_j)}{Y(\tau_j)}\,d_{0j} \right]$$

$$= \sum_{j=1}^K \frac{Y_0(\tau_j)\,d_{1j} - Y_1(\tau_j)\,(d_j - d_{1j})}{Y(\tau_j)}
= \sum_j \left( d_{1j} - d_j\,\frac{Y_1(\tau_j)}{Y(\tau_j)} \right) = \sum_j (O_j - E_j),$$

where $d_{lj}$ is the number of failures in group $l$ at $\tau_j$ and $d_j = d_{0j} + d_{1j}$.
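A direct computation of $W = \sum_j (O_j - E_j)$, the numerator of the logrank statistic, can be sketched as follows (the function name and toy data are hypothetical, assuming group labels $Z_i \in \{0, 1\}$):

```python
# Logrank numerator W = sum over distinct failure times tau_j of
# O_j - E_j = d_{1j} - d_j * Y_1(tau_j) / Y(tau_j).

def logrank_numerator(times, events, groups):
    taus = sorted({u for u, d in zip(times, events) if d == 1})
    W = 0.0
    for tau in taus:
        d1 = sum(1 for u, d, z in zip(times, events, groups)
                 if u == tau and d == 1 and z == 1)          # failures in group 1
        d = sum(1 for u, d_ in zip(times, events) if u == tau and d_ == 1)
        y1 = sum(1 for u, z in zip(times, groups) if u >= tau and z == 1)
        y = sum(1 for u in times if u >= tau)
        W += d1 - d * y1 / y                                 # O_j - E_j
    return W

# Example: n = 3, failures at t = 1 (group 1) and t = 2 (group 0).
W = logrank_numerator([1, 2, 3], [1, 1, 0], [1, 0, 1])
# W = (1 - 2/3) + (0 - 1/2) = -1/6
```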
6 Regression problem

In Cox regression the observations are $(U_i, \delta_i, Z_i)$, $i = 1, 2, \ldots, n$. Let $N_i(t) = I(U_i \le t, \delta_i = 1)$. The score function derived from the PL (partial likelihood) function is

$$S_n(\beta) = \sum_{i=1}^n \delta_i \left\{ Z_i - \frac{S^{(1)}(\beta, U_i)}{S^{(0)}(\beta, U_i)} \right\}
= \sum_{i=1}^n \int_0^\infty \left\{ Z_i - \frac{S^{(1)}(\beta, s)}{S^{(0)}(\beta, s)} \right\} dN_i(s).$$
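For a scalar covariate the score can be evaluated numerically, using $S^{(k)}(\beta, t) = \sum_{i} I(U_i \ge t)\,Z_i^k\,e^{\beta Z_i}$, $k = 0, 1$; a sketch, with a function name of our own choosing:

```python
# Partial-likelihood score S_n(beta) for a scalar covariate:
# sum over failures of Z_i - S1(beta, U_i) / S0(beta, U_i),
# where Sk(beta, t) = sum_j I(U_j >= t) Z_j^k exp(beta * Z_j).
import math

def cox_score(beta, times, events, z):
    def S(k, t):
        return sum((zj ** k) * math.exp(beta * zj)
                   for uj, zj in zip(times, z) if uj >= t)
    return sum(zi - S(1, ui) / S(0, ui)
               for ui, di, zi in zip(times, events, z) if di == 1)
```

A useful sanity check: when all covariates are equal, the risk-set average equals each $Z_i$ and the score vanishes for every $\beta$; and at $\beta = 0$ with $Z_i \in \{0, 1\}$ the score reduces to the logrank numerator.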
7 σ-algebra

$F$ is a non-empty collection of subsets of $\Omega$. $F$ is called a σ-algebra if

- $E \in F$ implies $E^c \in F$;
- $E_i \in F$, $i = 1, 2, \ldots$ implies $\bigcup_{i=1}^\infty E_i \in F$.
8 Probability Space

A probability space $(\Omega, F, P)$ is an abstract space $\Omega$ equipped with a σ-algebra $F$ and a set function $P$ defined on $F$ such that

1. $P\{E\} \ge 0$ for every $E \in F$;
2. $P\{\Omega\} = 1$;
3. $P\{\bigcup_{i=1}^\infty E_i\} = \sum_{i=1}^\infty P\{E_i\}$ whenever $E_i \in F$, $i = 1, 2, \ldots$, and $E_i \cap E_j = \emptyset$ for $i \ne j$.
9 Random Variables and Stochastic Processes

$(\Omega, F, P)$ is a probability space. A random variable $X : \Omega \to R$ is a real-valued function satisfying $\{\omega : X(\omega) \in (-\infty, b]\} \in F$ for any $b$. A stochastic process is a family of random variables $X = \{X(t) : t \in \Gamma\}$ indexed by a set $\Gamma$, all defined on the same probability space $(\Omega, F, P)$. Often $\Gamma = [0, \infty)$ in survival analysis.
10 Filtration and Adaptability

A family of σ-algebras $\{F_t : t \ge 0\}$ is called a filtration of a probability space $(\Omega, F, P)$ if $F_t \subset F$ and $F_s \subset F_t$ for $s \le t$. A filtration $\{F_t : t \ge 0\}$ is right-continuous if for any $t$, $F_{t+} = F_t$, where $F_{t+} = \bigcap_{\epsilon > 0} F_{t+\epsilon}$. A stochastic process $\{X(t) : t \ge 0\}$ is said to be adapted to $\{F_t : t \ge 0\}$ if $X(t)$ is $F_t$-measurable for each $t$, i.e., $\{\omega : X(t; \omega) \in (-\infty, b]\} \in F_t$ for any $b$.
11 History of a stochastic process

Let $X = \{X(t) : t \ge 0\}$ be a stochastic process, and let $F_t = \sigma\{X(s) : 0 \le s \le t\}$, the smallest σ-algebra making all of the random variables $X(s)$, $0 \le s \le t$, measurable. The filtration $\{F_t : t \ge 0\}$ is called the history of $X$.
12 Conditional Expectation

Suppose that $Y$ is an integrable random variable on a probability space $(\Omega, F, P)$, and let $G$ be a sub-σ-algebra of $F$. Let $X$ be a random variable satisfying

1. $X$ is $G$-measurable;
2. $\int_B Y\,dP = \int_B X\,dP$ for all sets $B \in G$.

The random variable $X$ is then called the conditional expectation of $Y$ given $G$, denoted by $E(Y \mid G)$.
13 Conditional Expectation

- If $F_t = \{\emptyset, \Omega\}$, then $E(X \mid F_t) = E(X)$.
- $E\{E(X \mid F_t)\} = E(X)$.
- If $F_s \subset F_t$, then $E\{E(X \mid F_s) \mid F_t\} = E(X \mid F_s)$.
- If $Y$ is $F_t$-measurable, then $E(XY \mid F_t) = Y E(X \mid F_t)$.
- If $X$ and $Y$ are independent, then $E(Y \mid \sigma(X)) = E(Y)$.
- $E(aX + bY \mid F_t) = a E(X \mid F_t) + b E(Y \mid F_t)$.
- If $g(\cdot)$ is a convex function, then $E\{g(X) \mid F_t\} \ge g\{E(X \mid F_t)\}$ (Jensen's inequality).
- $E(Y \mid \sigma(X_t, t \in \Gamma))$ is written $E(Y \mid X_t, t \in \Gamma)$.
14 Example of conditional expectation

The probability space $(\Omega, F, P)$:

1. $\Omega = [0, 1)$;
2. $F$ is the Borel σ-algebra generated by all the open intervals $(a, b) \subset [0, 1)$;
3. the probability measure is Lebesgue measure, $P((a, b)) = b - a$.

Define the σ-fields

1. $F_1 = \sigma([0, 1/2), [1/2, 1))$;
2. $F_2 = \sigma([0, 1/4), [1/4, 2/4), [2/4, 3/4), [3/4, 1))$;
3. $F_k = \sigma([i/2^k, (i+1)/2^k),\ i = 0, 1, \ldots, 2^k - 1)$.
15 Example of conditional expectation

Random variable $X : [0, 1) \to R^1$: $X(\omega) = f(\omega)$. Let $X_k = E(X \mid F_k)$. Then

- $X_k$ is $F_k$-measurable;
- $X_k(\omega)$ is a step function: $X_k(\omega) = a_i$ for $\omega \in [i/2^k, (i+1)/2^k)$;
- $\int_B X_k\,dP = \int_B X\,dP$ for all $B \in F_k$.

For $i = 0, 1, \ldots, 2^k - 1$,

$$a_i = 2^k \int_{i/2^k}^{(i+1)/2^k} f(\omega)\,d\omega.$$
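The averaging formula for $a_i$ can be checked numerically: the conditional expectation given $F_k$ is the step function equal to the average of $f$ over each dyadic interval. The sketch below approximates those averages on a fine grid; the function name and grid size are our own choices:

```python
# E(X | F_k) on Omega = [0, 1): a step function whose value on each dyadic
# interval [i/2^k, (i+1)/2^k) is the average of f over that interval.
# The averages are approximated by a midpoint rule on a fine grid.

def cond_exp_dyadic(f, k, grid=1 << 16):
    """Return the 2^k step-function values a_i = 2^k * int_{i/2^k}^{(i+1)/2^k} f."""
    vals = [f((j + 0.5) / grid) for j in range(grid)]
    per = grid >> k                      # grid points per dyadic interval
    return [sum(vals[i * per:(i + 1) * per]) / per for i in range(1 << k)]

# Example: f(w) = w with k = 1 gives the averages over [0, 1/2) and [1/2, 1).
a = cond_exp_dyadic(lambda w: w, 1)
# a is approximately [0.25, 0.75]
```

Note that averaging the $a_i$ recovers $E(X)$, in line with the iterated-expectation property $E\{E(X \mid F_k)\} = E(X)$ from slide 13.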
16 Martingale

Let $X(\cdot) = \{X(t) : t \ge 0\}$ be a right-continuous stochastic process with left-hand limits, and let $\{F_t\}$ be a filtration on a common probability space. $X(\cdot)$ is a martingale if

1. $X$ is adapted to $\{F_t : t \ge 0\}$;
2. $E|X(t)| < \infty$ for any $t$;
3. $E[X(t + s) \mid F_t] = X(t)$ for any $t, s \ge 0$.

$X(\cdot)$ is called a sub-martingale if "$=$" is replaced by "$\ge$", and a super-martingale if "$=$" is replaced by "$\le$".
17 Martingale

In survival analysis, $F_t = \sigma\{X(u) : 0 \le u \le t\}$. For $t, s \ge 0$,

$$E[X(t + s) \mid F_t] = E[X(t + s) \mid X(u), 0 \le u \le t],$$

where $(X(u), 0 \le u \le t)$ is the history of the process $X(\cdot)$ from $0$ to $t$.
18 Example of martingale

$Y_1, Y_2, \ldots$ are i.i.d. random variables satisfying

$$Y_i = \begin{cases} +1 & \text{w.p. } 1/2 \\ -1 & \text{w.p. } 1/2. \end{cases}$$

Define $X(0) = 0$ and $X(n) = \sum_{j=1}^n Y_j$, $n = 1, 2, \ldots$ Then $(X(u) : 0 \le u \le n) = (X(1), \ldots, X(n))$ carries the same information as $(Y_1, \ldots, Y_n)$, so

$$F_n = \sigma(X(1), \ldots, X(n)) = \sigma(Y_1, \ldots, Y_n).$$
19 Example of martingale

Clearly, $E|X(n)| < \infty$ for each $n$. Then

$$E[X(n + k) \mid F_n] = E\left[\sum_{j=1}^{n+k} Y_j \,\Big|\, X(1), \ldots, X(n)\right]
= E\left[\sum_{j=1}^{n+k} Y_j \,\Big|\, Y_1, \ldots, Y_n\right]$$

$$= Y_1 + \cdots + Y_n + E[Y_{n+1} + \cdots + Y_{n+k} \mid Y_1, \ldots, Y_n]
= Y_1 + \cdots + Y_n = X(n).$$
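The martingale property of the random walk can also be checked by simulation: fix a realized prefix $Y_1, \ldots, Y_n$, average $X(n+k)$ over fresh continuations, and the average should be close to $X(n)$. A sketch, with arbitrarily chosen sample sizes:

```python
# Monte Carlo check of E[X(n+k) | F_n] = X(n) for the +/-1 random walk:
# conditionally on the first n steps, the future increments average to zero.
import random

random.seed(0)
n, k, reps = 5, 10, 200_000

# Fix one realized prefix Y_1..Y_n, then average X(n+k) over fresh continuations.
prefix = [random.choice((-1, 1)) for _ in range(n)]
xn = sum(prefix)
avg = sum(xn + sum(random.choice((-1, 1)) for _ in range(k))
          for _ in range(reps)) / reps
# avg should be close to xn
```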
20 Two important properties of martingales

Let $X$ be a martingale with respect to a filtration $\{F_t : t \ge 0\}$. Then

- $E\{X(t) \mid F_{t-}\} = X(t-)$, where $F_{t-} = \sigma\left(\bigcup_{s<t} F_s\right)$;
- $E\{dX(t) \mid F_{t-}\} = 0$, since $E\{X(t) - X(t - \epsilon) \mid F_{t-\epsilon}\} = X(t - \epsilon) - X(t - \epsilon) = 0$.
21 Counting Process

A stochastic process $N(\cdot) = (N(t) : t \ge 0)$ is called a counting process if

1. $N(0) = 0$;
2. $N(t) < \infty$ for all $t$;
3. with probability 1, $N(t)$ is a right-continuous step function with jumps of size $+1$.

$N_i(t)$, $N(t)$ and $N_l(t)$ defined earlier are all counting processes.
22 Compensator Process

With $Y(u) = I(U \ge u)$ the at-risk indicator and $h(\cdot)$ the hazard function of $T$, define the stochastic process $A(\cdot)$ by

$$A(t) \stackrel{\text{def}}{=} \int_0^t Y(u)\,h(u)\,du.$$

Then $E(N(t)) = E(A(t))$, so $M(t) = N(t) - A(t)$ is a mean-zero process.
23 Example

$T \sim \text{EXP}(\lambda)$, so $h(t) = \lambda$ and

$$A(t) = \int_0^t h(u) Y(u)\,du = \lambda \int_0^t Y(u)\,du = \lambda \min(t, U),$$

$$M(t) = I(U \le t)\,\delta - \lambda \min(t, U).$$

In general,

$$M(t) = I(U \le t)\,\delta - \int_0^t Y(u) h(u)\,du.$$

Indeed, $M(t)$ is a martingale under the independent censoring assumption!
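That $M(t)$ is mean zero can be checked by simulation. The sketch below assumes, for illustration only, an independent censoring time $C \sim \text{EXP}(\mu)$, which is not specified on the slide:

```python
# Simulation check that M(t) = I(U <= t) * delta - lambda * min(t, U) has
# mean zero when T ~ Exp(lam) and C ~ Exp(mu) are independent,
# with U = min(T, C) and delta = I(T <= C).
import random

random.seed(1)
lam, mu, t, reps = 1.0, 0.5, 0.8, 400_000
total = 0.0
for _ in range(reps):
    T = random.expovariate(lam)          # failure time
    C = random.expovariate(mu)           # independent censoring time
    U, delta = min(T, C), (1 if T <= C else 0)
    total += (1 if U <= t and delta == 1 else 0) - lam * min(t, U)
mean_M = total / reps
# mean_M should be close to 0
```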
24 Martingale

Let $M(t) = N(t) - \int_0^t Y(s) h(s)\,ds$ and $F_t = \sigma\{N(u), N_C(u), u \in [0, t]\}$, where $N_C(u) = I(U \le u, \delta = 0)$ is the censoring counting process. The key is to show that $E(M(t+s) - M(t) \mid F_t) = 0$ a.s., i.e., that

$$E[N(t+s) - N(t) \mid F_t] = E\left[\int_t^{t+s} h(u) Y(u)\,du \,\Big|\, F_t\right].$$

Conditioning on the value of $Y(t)$,

$$E[N(t+s) - N(t) \mid F_t] = \begin{cases} 0 & \text{if } Y(t) = 0, \\ E[N(t+s) - N(t) \mid Y(t) = 1] & \text{if } Y(t) = 1. \end{cases}$$
25 Martingale

Since $N(t+s) - N(t)$ is zero or one,

$$E[N(t+s) - N(t) \mid Y(t) = 1] = P[N(t+s) - N(t) = 1 \mid Y(t) = 1]$$

$$= P[t < U \le t+s, \delta = 1 \mid U \ge t] = \frac{P[t < T \le t+s, C > T]}{P[U \ge t]}.$$
26 Martingale

Similarly, conditioning on the value of $Y(t)$,

$$E\left[\int_t^{t+s} h(u) Y(u)\,du \,\Big|\, F_t\right] = \begin{cases} 0 & \text{if } Y(t) = 0, \\ E\left[\int_t^{t+s} h(u) Y(u)\,du \,\Big|\, Y(t) = 1\right] & \text{if } Y(t) = 1. \end{cases}$$

When $Y(t) = 1$,

$$E\left[\int_t^{t+s} h(u) Y(u)\,du \,\Big|\, Y(t) = 1\right] = \int_t^{t+s} h(u)\,E[Y(u) \mid Y(t) = 1]\,du = \int_t^{t+s} h(u)\,P[U \ge u \mid U \ge t]\,du$$

$$= \frac{\int_t^{t+s} h(u)\,P(U \ge u)\,du}{P(U \ge t)} = \frac{P[t < T \le t+s, C > T]}{P[U \ge t]}.$$
27 Martingale

In the last step, we used

$$\frac{\int_t^{t+s} h(u)\,P(U \ge u)\,du}{P(U \ge t)} = \frac{P[t < T \le t+s, C > T]}{P[U \ge t]},$$

which holds if

$$h(u) = \lim_{\epsilon \downarrow 0} \frac{P(u \le T < u + \epsilon \mid T \ge u, C \ge u)}{\epsilon},$$

which is the noninformative censoring assumption!
28 Predictable process

A stochastic process $X(\cdot)$ is called predictable with respect to a filtration $\{F_t\}$ if, for every $t$, $X(t)$ is measurable with respect to $F_{t-}$. Any left-continuous adapted process is predictable. $N(t)$ is not predictable, since it is right-continuous, but $Y(t) = I(U \ge t)$ is a predictable process (it is left-continuous and adapted).
29 Doob–Meyer decomposition

Suppose that $X(\cdot)$ is a non-negative sub-martingale adapted to $\{F_t : t \ge 0\}$. Then there exists a right-continuous, non-decreasing predictable process $A(\cdot)$ such that $E(A(t)) < \infty$ for all $t$ and $M(\cdot) = X(\cdot) - A(\cdot)$ is a martingale. If we require $A(0) = 0$ a.s., then $A(\cdot)$ is unique! The process $A(\cdot)$ is called the compensator of $X(\cdot)$. In particular, $\int_0^t h(u) Y(u)\,du$ is the unique compensator of the counting process $N(t) = I(U \le t, \delta = 1)$.
30 Properties of the martingale associated with a counting process

If $M(\cdot)$ is a martingale, then $M^2(\cdot)$ is a sub-martingale (why? Jensen's inequality). By the Doob–Meyer decomposition, there is a unique predictable process $\langle M, M \rangle$ such that $M^2(\cdot) - \langle M, M \rangle(\cdot)$ is a martingale; in particular, $E(M^2(t)) = E\{\langle M, M \rangle(t)\}$. If $M(\cdot) = N(\cdot) - A(\cdot)$, then $\langle M, M \rangle(\cdot) = A(\cdot)$. We can therefore calculate the variance of $N(\cdot) - A(\cdot)$:

$$\text{var}\{N(t) - A(t)\} = E\{A(t)\} = E\int_0^t Y(u) h(u)\,du = \int_0^t P(U \ge u)\,h(u)\,du.$$
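The variance formula can be verified by Monte Carlo in the exponential example; as before we assume, for illustration only, independent $T \sim \text{EXP}(\lambda)$ and $C \sim \text{EXP}(\mu)$, so that $P(U \ge u) = e^{-(\lambda+\mu)u}$ and the integral has a closed form:

```python
# Monte Carlo check of var{N(t) - A(t)} = int_0^t P(U >= u) h(u) du
# for T ~ Exp(lam), C ~ Exp(mu) independent, h(u) = lam.
import math
import random

random.seed(2)
lam, mu, t, reps = 1.0, 0.5, 1.0, 400_000
ms = []
for _ in range(reps):
    T = random.expovariate(lam)
    C = random.expovariate(mu)
    U, delta = min(T, C), (1 if T <= C else 0)
    ms.append((1 if U <= t and delta == 1 else 0) - lam * min(t, U))
emp_var = sum(m * m for m in ms) / reps - (sum(ms) / reps) ** 2

# P(U >= u) = exp(-(lam + mu) u), so the integral evaluates in closed form:
theory = lam * (1.0 - math.exp(-(lam + mu) * t)) / (lam + mu)
# emp_var and theory should agree to within Monte Carlo error
```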
More informationEE514A Information Theory I Fall 2013
EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/
More informationBrownian Motion and Conditional Probability
Math 561: Theory of Probability (Spring 2018) Week 10 Brownian Motion and Conditional Probability 10.1 Standard Brownian Motion (SBM) Brownian motion is a stochastic process with both practical and theoretical
More informationMath-Stat-491-Fall2014-Notes-I
Math-Stat-491-Fall2014-Notes-I Hariharan Narayanan October 2, 2014 1 Introduction This writeup is intended to supplement material in the prescribed texts: Introduction to Probability Models, 10th Edition,
More informationP (A G) dp G P (A G)
First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume
More informationCIMPA SCHOOL, 2007 Jump Processes and Applications to Finance Monique Jeanblanc
CIMPA SCHOOL, 27 Jump Processes and Applications to Finance Monique Jeanblanc 1 Jump Processes I. Poisson Processes II. Lévy Processes III. Jump-Diffusion Processes IV. Point Processes 2 I. Poisson Processes
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationExercises Measure Theoretic Probability
Exercises Measure Theoretic Probability Chapter 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary family
More information