Written examination: Stochastic processes
Written examination in MSF200/MVE330, 7.5 hp.

Time: Friday, 31 May 2013.

Examiner and on duty: Serik Sagitov, room H326 in the MV building.

Allowed aids: pocket calculator, own formula sheet of 4 A4 pages (2 sheets).

CTH: grade 3 requires 12 points, grade 4 requires 18 points, grade 5 requires 24 points. GU: grade G requires 12 points, grade VG requires 20 points. Any bonus points are included.

1. (6 points) Define a process Y_n recursively by

Y_n = Z_n + 0.3 Y_{n-1}, n ≥ 1,

where the Z_n are independent random variables with zero means and unit variances.

(a) Under which conditions on Y_0 is this process weakly stationary? Compute its autocovariance function.
(b) Find the best linear predictor Ŷ_{r+3} of Y_{r+3} given (Y_0, ..., Y_r).
(c) What is the mean squared error of this prediction?
(d) Give an example when this process is strongly stationary.

2. (4 points) Consider a Poisson process with parameter λ.

(a) Describe the Poisson process as a renewal process. Compute its renewal function.
(b) Let E(t) be the excess life at time t. Write down a renewal equation for P(E(t) > x). Solve this equation to find the distribution of E(t).
(c) What is the stationary distribution of the excess life?

3. (4 points) Let X_1, X_2, ... be iid random variables with finite mean μ, and let M be a stopping time with respect to the sequence (X_n) such that E(M) < ∞. Then

E(X_1 + ⋯ + X_M) = μ E(M).

(a) Give a detailed proof of the above formulated Wald's equation lemma.
(b) Illustrate by an example that the statement may fail if M is not a stopping time.

4. (6 points) Let {X_n}_{n=1}^∞ be a sequence of stochastic variables defined by

X_n = I_1 sin(φn) + I_2 cos(φn),

where I_1, I_2, φ are independent random variables such that P(I_1 = 1) = P(I_1 = -1) = P(I_2 = 1) = P(I_2 = -1) = 1/2 and φ is uniformly distributed over [-π, π].

(a) On the same graph draw three examples of trajectories of the process {X_n}. What is random in these trajectories?
(b) Verify that {X_n} is a stationary process.
(c) Find the spectral density function of {X_n}.
(d) Find the spectral representation of the process {X_n} in the following form:

X_n = ∫_0^π cos(nλ) dU(λ) + ∫_0^π sin(nλ) dV(λ).
5. (4 points) Let {X_n} be iid random variables with P(X_n = 2) = P(X_n = 0) = 1/2.

(a) Show that the product S_n = X_1 ⋯ X_n is a martingale.
(b) Show that S_n → 0 in probability but not in mean.
(c) Does S_n converge a.s.?

6. (3 points) For a filtration (F_n) let B_n ∈ F_n. Put X_n = 1_{B_1} + ⋯ + 1_{B_n}.

(a) Show that (X_n) is a submartingale.
(b) What is the Doob decomposition for X_n?

7. (3 points) Theorem. Let (Y_n, F_n) be a submartingale and let T be a stopping time. Then (Y_{T∧n}, F_n) is a submartingale. If, moreover, E|Y_n| < ∞, then (Y_n - Y_{T∧n}, F_n) is also a submartingale.

(a) Referring to the above theorem, carefully deduce the following statement: if (Y_n, F_n) is a martingale and T is a stopping time, then (Y_{T∧n}, F_n) and (Y_n - Y_{T∧n}, F_n) are martingales.
(b) Illustrate this theorem with a simple example.

Partial answers and solutions are also welcome. Good luck!
Solutions

1. This is an example of the AR(1) model Y_n = αY_{n-1} + Z_n with α = 0.3. Observe that for n, m ≥ 0,

Y_{n+m} = Z_{n+m} + αZ_{n+m-1} + ⋯ + α^{n-1}Z_{m+1} + α^n Y_m.

1a. Let Y_0 have mean μ and variance σ². In the stationary case Y_n should also have mean μ and variance σ². This leads to the equations

μ = α^n μ,  σ² = 1 + α² + ⋯ + α^{2n-2} + α^{2n}σ²,

which imply μ = 0 and σ² = 1/(1 - α²). Hence, using these values, we compute for n ≥ 0

c(n) = Cov(Y_{n+m}, Y_m) = E(Y_{n+m}Y_m) = α^n E(Y_m²) = α^n/(1 - α²) = (0.3)^n/0.91.

1b. The best linear predictor Ŷ_{r+k} = Σ_{j=0}^r a_j Y_{r-j} is found from the equations

Σ_{j=0}^r a_j c(|j - m|) = c(k + m), 0 ≤ m ≤ r,

or

Σ_{j=0}^r a_j α^{|j-m|} = α^{k+m}, 0 ≤ m ≤ r.

Intuition suggests seeking a solution of the form Ŷ_{r+k} = a_0 Y_r. Indeed, plugging in a_1 = ⋯ = a_r = 0 we easily obtain a_0 = α^k. Thus Ŷ_{r+k} = (0.3)^k Y_r, and Ŷ_{r+3} = 0.027 Y_r.

1c. The mean squared error is

E((Y_{r+k} - Ŷ_{r+k})²) = E((Y_{r+k} - α^k Y_r)²) = E((Z_{r+k} + αZ_{r+k-1} + ⋯ + α^{k-1}Z_{r+1})²) = (1 - α^{2k})/(1 - α²) = (1 - (0.09)^k)/0.91 ≈ 1.098 for k = 3.

1d. If the Z_n are standard normal and Y_0 has a normal distribution with mean zero and variance σ² = 1/(1 - α²) ≈ 1.0989, then the process becomes Gaussian. For Gaussian processes weak and strong stationarity are equivalent.

2a. Let T_0 = 0 and T_n = X_1 + ⋯ + X_n, where the X_i are iid exponential inter-arrival times with parameter λ. The Poisson process N(t) is the renewal process counting the renewal events during the time interval (0, t], so that

{N(t) ≥ n} = {T_n ≤ t},  T_{N(t)} ≤ t < T_{N(t)+1}.

Its renewal function m(t) := E(N(t)) satisfies the renewal equation

m(t) = F(t) + ∫_0^t m(t - u) dF(u),  F(t) = 1 - e^{-λt}.

In terms of the Laplace-Stieltjes transform m̂(θ) := ∫_0^∞ e^{-θt} dm(t) we get

m̂(θ) = F̂(θ) + m̂(θ)F̂(θ).

Since F̂(θ) = λ/(θ + λ), we obtain m̂(θ) = λ/θ, implying m(t) = λt.
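The claims in 1a-1c lend themselves to a quick numerical sanity check. The sketch below (illustrative only; names and parameters are ours) simulates many independent stationary AR(1) paths with α = 0.3, taking the innovations Z_n to be standard normal, a convenient special case of the zero-mean unit-variance assumption, and compares the empirical lag-1 autocovariance and the prediction error of Ŷ_{r+3} = 0.027 Y_r with the formulas above.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, k, r = 0.3, 3, 10
reps, n = 200_000, 20  # independent paths, path length

# Start in the stationary regime: Y_0 ~ N(0, 1/(1 - alpha^2)).
y = rng.normal(0.0, np.sqrt(1.0 / (1.0 - alpha**2)), size=reps)
path = [y]
for _ in range(n):
    y = alpha * y + rng.normal(size=reps)  # Y_n = 0.3 Y_{n-1} + Z_n
    path.append(y)
path = np.asarray(path)

c1 = np.mean(path[r] * path[r + 1])                   # empirical c(1)
mse = np.mean((path[r + k] - alpha**k * path[r])**2)  # error of 0.027 * Y_r

print(c1, alpha / (1 - alpha**2))                  # both close to 0.3297
print(mse, (1 - alpha**(2 * k)) / (1 - alpha**2))  # both close to 1.0981
```

With 200,000 replicates the empirical values agree with c(1) = 0.3/0.91 and the MSE (1 - 0.3⁶)/0.91 to two decimal places.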
2b. The excess lifetime is E(t) := T_{N(t)+1} - t. The distribution of E(t) is given by the following renewal equation for b(t) := P(E(t) > y): conditioning on X_1,

b(t) = E(1_{E(t)>y}) = E(1_{E(t-X_1)>y} 1_{X_1 ≤ t} + 1_{X_1 > t+y}) = 1 - F(t + y) + ∫_0^t b(t - x) dF(x).

This renewal equation gives

P(E(t) > y) = 1 - F(t + y) + ∫_0^t (1 - F(t + y - u)) dm(u)
= e^{-λ(t+y)} + λ ∫_0^t e^{-λ(t+y-u)} du
= e^{-λ(t+y)} + λ ∫_y^{y+t} e^{-λx} dx = e^{-λy}.

Thus E(t) is exponentially distributed with parameter λ.

2c. From the previous calculation we see that the stationary distribution of the excess life is exponential with parameter λ. This is in agreement with the general result (with μ standing for the mean inter-arrival time)

lim_{t→∞} P(E(t) ≤ y) = μ^{-1} ∫_0^y (1 - F(x)) dx = λ ∫_0^y e^{-λx} dx = 1 - e^{-λy}.

3a. Let X_1, X_2, ... be iid random variables with finite mean μ, and let M be a stopping time with respect to the sequence (X_n) such that E(M) < ∞. Then

E(X_1 + ⋯ + X_M) = μ E(M).

Proof. By dominated convergence,

E(X_1 + ⋯ + X_M) = E(lim_{n→∞} Σ_{i=1}^n X_i 1_{M≥i}) = lim_{n→∞} E(Σ_{i=1}^n X_i 1_{M≥i}) = Σ_{i=1}^∞ E(X_i) P(M ≥ i) = μ E(M).

Here we used the independence between {M ≥ i} and X_i, which follows from the fact that {M ≥ i} is the complementary event to {M ≤ i - 1} ∈ σ{X_1, ..., X_{i-1}}.

3b. Turn to the Poisson process and observe that M = N(t) is not a stopping time, and in this case E(T_{N(t)}) ≠ μ m(t). Indeed, for the Poisson process μ m(t) = t, while T_{N(t)} = t - C(t), where C(t) > 0 is the current lifetime.
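Both halves of Wald's lemma can be seen in a small simulation. The setup below is an illustrative choice of ours, not from the exam: the X_i are uniform on {1, 2, 3} with μ = 2, M is the first index with X_i = 3 (a stopping time with E(M) = 3), while M' = M - 1 "peeks ahead" at X_{M'+1} and is therefore not a stopping time.

```python
import numpy as np

rng = np.random.default_rng(1)
reps, mu = 50_000, 2.0  # X_i uniform on {1, 2, 3}, so mu = 2

sum_stop = len_stop = sum_peek = len_peek = 0.0
for _ in range(reps):
    xs = []
    while True:
        x = int(rng.integers(1, 4))
        xs.append(x)
        if x == 3:          # M = first i with X_i = 3: a stopping time
            break
    sum_stop += sum(xs);      len_stop += len(xs)
    sum_peek += sum(xs[:-1]); len_peek += len(xs) - 1  # M' = M - 1 peeks ahead

print(sum_stop / reps, mu * len_stop / reps)  # both close to 6: Wald holds
print(sum_peek / reps, mu * len_peek / reps)  # about 3 vs 4: Wald fails
```

For M' the summands X_1, ..., X_{M'} are exactly those conditioned to avoid the value 3, so their mean is 1.5 rather than μ = 2, and E(X_1 + ⋯ + X_{M'}) = 3 ≠ μ E(M') = 4; this is the same mechanism as the current-lifetime bias in 3b.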
4a. What looks in the figure like a cloud of stationarily allocated points is produced by three sinusoids:

pluses: sin(πn/5) + cos(πn/5),
circles: sin(πn/5) - cos(πn/5),
stars: sin(πn/3) + cos(πn/3).

The randomness is created by the random choice of the frequency φ and the signs I_1, I_2.

4b. The process X_n = I_1 sin(φn) + I_2 cos(φn) has zero means by independence and E(I_1) = E(I_2) = 0. Its autocovariances are

Cov(X_n, X_m) = E(X_n X_m) = E(sin(φn) sin(φm) + cos(φn) cos(φm)) = E(cos(φ(n - m))) = (2π)^{-1} ∫_{-π}^π cos(φ(n - m)) dφ = 1_{n=m}.

Thus (X_n) is a weakly stationary sequence with mean zero, variance 1, and zero autocorrelation: ρ(n) = 0 for n ≠ 0.

4c. The spectral density g must satisfy ρ(n) = ∫_0^π cos(λn) g(λ) dλ. Then it is the uniform density g(λ) = π^{-1} 1_{λ∈[0,π]}.

4d. In this case the spectral representation of the process {X_n} is straightforward. Writing ψ := |φ|, which is uniform on [0, π], and J := I_1 sign(φ), which again takes the values ±1 with probability 1/2 each independently of (ψ, I_2), we have X_n = J sin(ψn) + I_2 cos(ψn), so

X_n = ∫_0^π cos(nλ) dU(λ) + ∫_0^π sin(nλ) dV(λ),

where

U(λ) = I_2 1_{ψ≤λ},  V(λ) = J 1_{ψ≤λ}.

Let us check the key properties of the random process (U(λ), V(λ)):

(i) U(λ) and V(λ) have zero means,
(ii) U(λ_1) and V(λ_2) are uncorrelated,
(iii) Var(U(λ)) = Var(V(λ)) = E(1_{ψ≤λ}) = λ/π = G(λ),
(iv) the increments of U are uncorrelated, and the increments of V are uncorrelated.

The last property is obtained as follows: for λ_1 < λ_2 < λ_3,

E[(U(λ_3) - U(λ_2))(U(λ_2) - U(λ_1))] = E[(1_{ψ≤λ_3} - 1_{ψ≤λ_2})(1_{ψ≤λ_2} - 1_{ψ≤λ_1})] = E(1_{λ_2<ψ≤λ_3} 1_{λ_1<ψ≤λ_2}) = 0.

5a. We have S_n = 2^n with probability 2^{-n} and S_n = 0 with probability 1 - 2^{-n}, so that E(S_n) = 1. Clearly, E(S_{n+1} | S_n = 2^n) = 2^n and E(S_{n+1} | S_n = 0) = 0, so that E(S_{n+1} | S_n) = S_n.

5b. On one hand, P(|S_n| > ε) = 2^{-n} → 0, so that S_n → 0 in probability. On the other hand, E(S_n) = 1, so that S_n does not converge to 0 in mean.

5c. Put A_n = {S_n = 0}. We have A_n ⊂ A_{n+1} and A = ∪_n A_n = {ω : S_n(ω) → 0}. Since P(A) = lim P(A_n) = 1, we conclude that S_n → 0 almost surely.

6a. We verify three properties.
(i) Since B_i ∈ F_i ⊂ F_n for i ≤ n, we conclude that X_n is measurable with respect to F_n.
(ii) The random variables X_n have finite means a_n := E(X_n) = P(B_1) + ⋯ + P(B_n).
(iii) E(X_{n+1} | F_n) = X_n + P(B_{n+1} | F_n) ≥ X_n.
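For a concrete instance of Problem 6 (an illustrative setup of ours, not from the exam): take independent fair coin flips ξ_k, let F_n = σ(ξ_1, ..., ξ_n) and B_k = {ξ_k = 1}, so that X_n counts heads among the first n flips. Since the B_k are independent of the past, the compensator is the deterministic sequence n/2, and M_n = X_n - n/2 should be a martingale. The sketch checks two consequences: E(M_n) = 0 for every n, and the martingale increments are uncorrelated with F_n-measurable quantities such as X_n.

```python
import numpy as np

rng = np.random.default_rng(2)
reps, n = 100_000, 12

xi = rng.integers(0, 2, size=(reps, n))  # independent fair coins
X = xi.cumsum(axis=1)                    # X_k = 1_{B_1} + ... + 1_{B_k}
M = X - 0.5 * np.arange(1, n + 1)        # candidate martingale part X_k - k/2

# Martingale increments M_{k+1} - M_k = xi_{k+1} - 1/2 are mean zero and
# uncorrelated with the past, e.g. with X_k itself:
incr = M[:, 1:] - M[:, :-1]
print(np.abs(M.mean(axis=0)).max())                   # close to 0
print(np.abs((incr * X[:, :-1]).mean(axis=0)).max())  # close to 0
```

Replacing the independent B_k by events whose conditional probabilities depend on the past would make the compensator random, but the same two checks would still apply to X_n minus its compensator.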
6b. Doob's decomposition: X_n = M_n + S_n, where (M_n, F_n) is a martingale and (S_n, F_n) is an increasing predictable process (called the compensator of the submartingale). Here M and S are computed as follows: M_0 = 0, S_0 = 0, and for n ≥ 0,

M_{n+1} - M_n = X_{n+1} - E(X_{n+1} | F_n) = 1_{B_{n+1}} - P(B_{n+1} | F_n),
S_{n+1} - S_n = E(X_{n+1} | F_n) - X_n = P(B_{n+1} | F_n).

Thus S_n = Σ_{k=1}^n P(B_k | F_{k-1}) and M_n = X_n - S_n. In particular, if each B_{n+1} is independent of F_n, then S_n = a_n and M_n = X_n - a_n.

7a. Theorem 1. Let (Y_n, F_n) be a submartingale and let T be a stopping time. Then (Y_{T∧n}, F_n) is a submartingale. If, moreover, E|Y_n| < ∞, then (Y_n - Y_{T∧n}, F_n) is also a submartingale.

Changing the sign, we easily get the next sister theorem.

Theorem 2. Let (Y_n, F_n) be a supermartingale and let T be a stopping time. Then (Y_{T∧n}, F_n) is a supermartingale. If, moreover, E|Y_n| < ∞, then (Y_n - Y_{T∧n}, F_n) is also a supermartingale.

Since a martingale is both a submartingale and a supermartingale, these two theorems imply the following desired result.

Theorem 3. Let (Y_n, F_n) be a martingale and let T be a stopping time. Then (Y_{T∧n}, F_n) is a martingale. If, moreover, E|Y_n| < ∞, then (Y_n - Y_{T∧n}, F_n) is also a martingale.

7b. Consider a symmetric simple random walk Y_n and take T to be the first hitting time of the level a. (The figure, omitted here, shows trajectories of the three martingales Y_n, Y_{T∧n}, and Y_n - Y_{T∧n}.)
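The example in 7b can also be checked numerically. The sketch below (illustrative parameters of ours) simulates many symmetric simple random walks, stops each at the first hitting time T of the level a = 5, and verifies that the stopped process Y_{T∧n} keeps the martingale mean E(Y_{T∧n}) = 0 for every n, as Theorem 3 requires, even though a substantial fraction of paths have already been absorbed at a.

```python
import numpy as np

rng = np.random.default_rng(3)
reps, n, a = 100_000, 50, 5

steps = rng.choice([-1, 1], size=(reps, n))
Y = steps.cumsum(axis=1)               # symmetric simple random walk

# T = first time the walk reaches level a (set to n if it never does,
# in which case T ∧ k = k for all k <= n and the path is left as is).
reached = (Y >= a).any(axis=1)
first = np.argmax(Y >= a, axis=1)      # argmax gives the first True index
hit = np.where(reached, first, n)

# Freeze each path at level a from time T onward to obtain Y_{T ∧ k}.
idx = np.arange(n)
Ystop = np.where(idx[None, :] >= hit[:, None], a, Y)

print(np.abs(Ystop.mean(axis=0)).max())  # E(Y_{T∧k}) = 0: close to 0
print(reached.mean())                    # P(T <= n) is far from negligible
```

Note that the unstopped limit Y_T equals a on {T < ∞}, so letting n → ∞ without the stopping shows why the integrability conditions in the optional stopping theorem cannot be dropped.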
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 9 10/2/2013 Conditional expectations, filtration and martingales Content. 1. Conditional expectations 2. Martingales, sub-martingales
More informationELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process
Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Definition of stochastic process (random
More informationRandom variables. DS GA 1002 Probability and Statistics for Data Science.
Random variables DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Motivation Random variables model numerical quantities
More informationSTAT 331. Martingale Central Limit Theorem and Related Results
STAT 331 Martingale Central Limit Theorem and Related Results In this unit we discuss a version of the martingale central limit theorem, which states that under certain conditions, a sum of orthogonal
More informationStochastic Processes
Stochastic Processes A very simple introduction Péter Medvegyev 2009, January Medvegyev (CEU) Stochastic Processes 2009, January 1 / 54 Summary from measure theory De nition (X, A) is a measurable space
More informationExercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0).
Exercises: sheet 1 1. Prove: Let X be Poisson(s) and Y be Poisson(t) distributed. If X and Y are independent, then X + Y is Poisson(t + s) distributed (t, s > 0). This means that the property of a convolution
More informationf X (x) = λe λx, , x 0, k 0, λ > 0 Γ (k) f X (u)f X (z u)du
11 COLLECTED PROBLEMS Do the following problems for coursework 1. Problems 11.4 and 11.5 constitute one exercise leading you through the basic ruin arguments. 2. Problems 11.1 through to 11.13 but excluding
More informationExercise Exercise Homework #6 Solutions Thursday 6 April 2006
Unless otherwise stated, for the remainder of the solutions, define F m = σy 0,..., Y m We will show EY m = EY 0 using induction. m = 0 is obviously true. For base case m = : EY = EEY Y 0 = EY 0. Now assume
More informationChapter 2 Random Processes
Chapter 2 Random Processes 21 Introduction We saw in Section 111 on page 10 that many systems are best studied using the concept of random variables where the outcome of a random experiment was associated
More informationStat 426 : Homework 1.
Stat 426 : Homework 1. Moulinath Banerjee University of Michigan Announcement: The homework carries 120 points and contributes 10 points to the total grade. (1) A geometric random variable W takes values
More informationCONVERGENCE OF RANDOM SERIES AND MARTINGALES
CONVERGENCE OF RANDOM SERIES AND MARTINGALES WESLEY LEE Abstract. This paper is an introduction to probability from a measuretheoretic standpoint. After covering probability spaces, it delves into the
More informationPart IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Theorems Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationSystem Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models
System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder
More information