1 Simulating normal (Gaussian) rvs with applications to simulating Brownian motion and geometric Brownian motion in one and two dimensions
Copyright © 2007 by Karl Sigman

Fundamental to many applications in financial engineering is the normal (Gaussian) distribution. It is the building block for simulating such basic stochastic processes as Brownian motion and geometric Brownian motion. In this section we go over algorithms for generating univariate normal rvs, and learn how to use such algorithms to construct sample paths of Brownian motion and geometric Brownian motion, in both one and two dimensions, at a desired sequence of times t_1 < t_2 < ⋯ < t_k.

1.1 Generating a univariate normal random variable

We focus here on the generation of a rv Z ∼ N(0, 1), for we can always then transform it into an X ∼ N(µ, σ²) via X = σZ + µ. For example, if {B(t) : t ≥ 0} denotes a standard Brownian motion (BM), then for any fixed t > 0, B(t) ∼ N(0, t) can be constructed via B(t) = √t Z. If it is desired to simulate the pair (B(t_1), B(t_2)), where 0 < t_1 < t_2, then we can use the recursion B(t_2) = B(t_1) + (B(t_2) − B(t_1)) and the basic stationary and independent increments properties of BM: generate two iid N(0, 1) rvs Z_1, Z_2 and set

B(t_1) = √t_1 Z_1,  B(t_2) = √t_1 Z_1 + √(t_2 − t_1) Z_2.

The point is that B(t_1) ∼ N(0, t_1), and independently B(t_2) − B(t_1) ∼ N(0, t_2 − t_1). This method extends in the obvious fashion to the generation of the k-dimensional multivariate normal vector (B(t_1), B(t_2), …, B(t_k)), in which 0 < t_1 < t_2 < ⋯ < t_k, k ≥ 2.

N(0, 1) density and cdf

The density of Z is given by

f(x) = (1/√(2π)) e^{−x²/2},  x ∈ ℝ,

and the cdf by

Φ(x) = P(Z ≤ x) = (1/√(2π)) ∫_{−∞}^{x} e^{−y²/2} dy,  x ∈ ℝ.

Using the inverse transform method is not possible here, since we do not have an explicit closed formula for Φ^{−1}(y). One could of course approximate this inverse function by some closed-form function and then use that, and there are algorithms in the literature for doing so. But it is possible to exactly generate a Z ∼ N(0, 1) using alternative clever algorithms, so that is what we will do.
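To make the two constructions above concrete — the location–scale transform X = σZ + µ and the increment recursion for the pair (B(t_1), B(t_2)) — here is a minimal Python/NumPy sketch. The seed and the particular values of µ, σ, t_1, t_2 are arbitrary illustration choices, not part of the notes:

```python
import numpy as np

rng = np.random.default_rng(42)  # seed chosen arbitrarily, for reproducibility

# X ~ N(mu, sigma^2) from a unit normal Z via X = sigma*Z + mu
mu, sigma = 1.0, 2.0
Z = rng.standard_normal()
X = sigma * Z + mu

# The pair (B(t1), B(t2)), 0 < t1 < t2, via stationary independent increments
t1, t2 = 0.5, 2.0
Z1, Z2 = rng.standard_normal(2)
B_t1 = np.sqrt(t1) * Z1              # B(t1) ~ N(0, t1)
B_t2 = B_t1 + np.sqrt(t2 - t1) * Z2  # added increment ~ N(0, t2 - t1), independent
```

Any routine producing unit normals (such as the two algorithms described next) can stand in for `standard_normal` here.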
Acceptance–rejection algorithm for Z

Recall from our lecture on the acceptance–rejection method that we already have an algorithm for generating a rv Z ∼ N(0, 1):

Algorithm for generating Z ∼ N(0, 1) via the acceptance–rejection method:

1. Generate two independent exponentials at rate 1: Y_1 = −ln(U_1) and Y_2 = −ln(U_2).
2. If Y_2 ≥ (Y_1 − 1)²/2, set |Z| = Y_1; otherwise go back to 1.
3. Generate U. Set Z = |Z| if U ≤ 0.5, and Z = −|Z| if U > 0.5.

On average, this algorithm requires 1 + 2√(2e/π) ≈ 3.64 uniforms to produce one copy of Z. Since we will of course use the algorithm many, many times over, we can do even better by using the free independent exponential produced as overshoot in a success at step 2, Y := Y_2 − (Y_1 − 1)²/2, as the Y_1 in step 1 of the next round; this reduces the expected number to 2.64.

We next introduce another algorithm, one that generates pairs (Z_1, Z_2) of independent normals each time. Called the polar method, it requires some more complex calculations (sin, cos, etc.) but has the advantage of requiring exactly one uniform for each copy of Z — a fixed, non-random number. In a variety of real applications it is very desirable to synchronize the use of random numbers across different runs in a simulation so as to reduce the variance of an estimate, and thus an algorithm consuming a non-random number of uniforms might well be useful at times. So we include it here in our study.

Algorithm for generating Z ∼ N(0, 1) via the polar method:

1. Generate an exponential at rate 1/2, D = −2 ln(U_1), and a uniform over (0, 2π), Θ = 2πU_2.
2. Set Z_1 = √D cos Θ, Z_2 = √D sin Θ.

The polar method is easily derived (proved) by starting with an iid N(0, 1) pair (Z_1, Z_2), mapping the vector (Z_1, Z_2) into polar coordinates — D = Z_1² + Z_2² (the squared distance from the origin) and Θ = arctan(Z_2/Z_1) (the angle, counterclockwise, from the horizontal axis to the point (Z_1, Z_2)) — then deriving (using elementary multivariate calculus, with the Jacobian, etc.) the joint density function g(d, θ) of (D, Θ), to reveal that D is an exponential at rate 1/2 and, independent of D, the angle Θ is uniform over (0, 2π):

g(d, θ) = (1/2)e^{−d/2} × (1/(2π)),  d > 0, θ ∈ (0, 2π).

Using this fact inversely then yields the algorithm: we can produce (Z_1, Z_2) by mapping an independent pair (D, Θ) back into rectangular coordinates; that is exactly what step 2 in the algorithm does.
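Both algorithms above are short enough to state directly in code. The sketch below follows the steps literally; the function names are my own, and I use 1 − U in place of U inside the logarithms to avoid log(0) (1 − U is also uniform on (0, 1]):

```python
import math
import random

def polar_normal(u1: float, u2: float) -> tuple[float, float]:
    """Polar method: map two independent uniforms into two iid N(0,1) rvs."""
    d = -2.0 * math.log(1.0 - u1)  # D ~ exponential at rate 1/2 (squared radius)
    theta = 2.0 * math.pi * u2     # angle uniform over (0, 2*pi)
    r = math.sqrt(d)
    # map (D, Theta) back into rectangular coordinates
    return r * math.cos(theta), r * math.sin(theta)

def ar_normal(rand=random.random) -> float:
    """Acceptance-rejection: propose |Z| from an exponential, then attach a sign."""
    while True:
        y1 = -math.log(1.0 - rand())     # exponential at rate 1
        y2 = -math.log(1.0 - rand())
        if y2 >= (y1 - 1.0) ** 2 / 2.0:  # accept |Z| = Y1
            return y1 if rand() <= 0.5 else -y1  # random sign
```

This basic version does not implement the overshoot-recycling refinement mentioned above; doing so only requires carrying Y_2 − (Y_1 − 1)²/2 over to the next call.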
1.2 Simulating 1-dimensional Brownian motion (BM) and geometric Brownian motion (GBM)

1.2.1 Standard BM

A stochastic process B = {B(t) : t ≥ 0} possessing (wp1) continuous sample paths is called standard Brownian motion (BM) if

1. B(0) = 0;
2. B has both stationary and independent increments;
3. B(t) − B(s) has a normal distribution with mean 0 and variance t − s, 0 ≤ s < t.

(2) and (3) together can be summarized by: if t_0 = 0 < t_1 < t_2 < ⋯ < t_k, then the increment rvs B(t_i) − B(t_{i−1}), i ∈ {1, …, k}, are independent and distributed as N(0, t_i − t_{i−1}). In particular, B(t_i) − B(t_{i−1}) is independent of B(t_{i−1}) = B(t_{i−1}) − B(0).

If we only wish to simulate B(t) at one fixed value t > 0, then we need only generate a unit normal Z ∼ N(0, 1) and set B(t) = √t Z. But typically we will want to simulate, say, k values at times t_0 = 0 < t_1 < t_2 < ⋯ < t_k, and we can easily do so as follows:
Generate k iid unit normals Z_1, Z_2, …, Z_k, then construct the independent increments via

B(t_i) − B(t_{i−1}) = √(t_i − t_{i−1}) Z_i,  i = 1, …, k.

Thus to simulate the values B(t_1), …, B(t_k) we sequentially generate unit normals Z_1, Z_2, …, Z_k and use the recursion

B(t_{i+1}) = B(t_i) + (B(t_{i+1}) − B(t_i)) = B(t_i) + √(t_{i+1} − t_i) Z_{i+1},  i ∈ {0, …, k − 1}.

Simulating standard BM at times 0 = t_0 < t_1 < t_2 < ⋯ < t_k: sequentially generate unit normals Z_1, Z_2, …, Z_k, and recursively define

B(t_1) = √t_1 Z_1
B(t_2) = B(t_1) + √(t_2 − t_1) Z_2 = √t_1 Z_1 + √(t_2 − t_1) Z_2
⋮
B(t_k) = Σ_{i=1}^{k} √(t_i − t_{i−1}) Z_i.

In the end, we see that to simulate BM at a collection of specific times we need only generate unit normals.

1.2.2 BM with drift

X(t) = σB(t) + µt will denote the BM with drift µ and variance term σ > 0. It has continuous sample paths and is defined by

1. X(0) = 0;
2. X has both stationary and independent increments;
3. X(t) − X(s) has a normal distribution with mean µ(t − s) and variance σ²(t − s), 0 ≤ s < t.

X(t) − X(s) thus can be constructed (simulated) by generating a standard normal rv Z and setting

X(t) − X(s) = σ√(t − s) Z + µ(t − s).

Again, by the stationary and independent increments, we can simulate such a BM at times 0 = t_0 < t_1 < t_2 < ⋯ < t_k by generating k iid unit normals Z_1, Z_2, …, Z_k and using the recursion

X(t_{i+1}) = X(t_i) + (X(t_{i+1}) − X(t_i)) = X(t_i) + σ√(t_{i+1} − t_i) Z_{i+1} + µ(t_{i+1} − t_i).

Simulating BM with drift µ and variance term σ at times 0 = t_0 < t_1 < t_2 < ⋯ < t_k: sequentially generate unit normals Z_1, Z_2, …, Z_k, and recursively define

X(t_1) = σ√t_1 Z_1 + µt_1
X(t_2) = X(t_1) + σ√(t_2 − t_1) Z_2 + µ(t_2 − t_1) = σ√t_1 Z_1 + µt_1 + σ√(t_2 − t_1) Z_2 + µ(t_2 − t_1)
⋮
X(t_k) = Σ_{i=1}^{k} [σ√(t_i − t_{i−1}) Z_i + µ(t_i − t_{i−1})].
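The recursion for BM with drift translates directly into vectorized code: scale each unit normal by σ√(t_i − t_{i−1}), add the drift contribution µ(t_i − t_{i−1}), and take cumulative sums. A minimal sketch (the function name and NumPy interface are my own choices, not from the notes):

```python
import numpy as np

def simulate_bm(times, mu=0.0, sigma=1.0, rng=None):
    """Simulate BM with drift mu and variance term sigma at the strictly
    increasing times t_1 < ... < t_k (t_0 = 0 is implicit)."""
    if rng is None:
        rng = np.random.default_rng()
    times = np.asarray(times, dtype=float)
    dt = np.diff(np.concatenate(([0.0], times)))    # increments t_i - t_{i-1}
    Z = rng.standard_normal(len(times))             # iid unit normals
    increments = sigma * np.sqrt(dt) * Z + mu * dt  # ~ N(mu*dt, sigma^2 * dt)
    return np.cumsum(increments)                    # X(t_1), ..., X(t_k)
```

Setting mu=0 and sigma=1 recovers standard BM.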
1.2.3 Geometric BM

Geometric Brownian motion (GBM) is given by S(t) = S(0)e^{X(t)}, t ≥ 0, where X(t) = σB(t) + µt, t ≥ 0, is a BM. e^{X(t)} has a lognormal distribution for each fixed t > 0. In general, if Y = e^X is lognormal with X ∼ N(µ, σ²), then we can easily simulate Y via setting Y = e^{σZ + µ}, with Z ∼ N(0, 1).

Moreover, for any 0 ≤ s < t it holds that

S(t) = S(0) × (S(s)/S(0)) × (S(t)/S(s)) = S(0) e^{X(s)} e^{X(t) − X(s)},

and since the increment X(s) is independent of the increment X(t) − X(s), we conclude that the consecutive ratios S(s)/S(0) and S(t)/S(s) are independent lognormals. We can thus simulate the pair (S(s), S(t)) by generating two iid N(0, 1) rvs Z_1, Z_2 and setting

S(s) = S(0)e^{σ√s Z_1 + µs},  S(t) = S(s)e^{σ√(t−s) Z_2 + µ(t−s)} = S(0)e^{σ√s Z_1 + µs} e^{σ√(t−s) Z_2 + µ(t−s)}.

More generally, for 0 = t_0 < t_1 < t_2 < ⋯ < t_k, define Y_i = S(t_i)/S(t_{i−1}), i ∈ {1, 2, …, k}. Then we can write

S(t_1) = S(0)Y_1
S(t_2) = S(t_1)Y_2 = S(0)Y_1Y_2
⋮
S(t_k) = S(t_{k−1})Y_k = S(0)Y_1Y_2 ⋯ Y_k.

The Y_i are independent lognormal rvs and can be constructed by generating k iid N(0, 1) rvs Z_1, Z_2, …, Z_k and setting

Y_i = e^{σ√(t_i − t_{i−1}) Z_i + µ(t_i − t_{i−1})},  i ∈ {1, 2, …, k}.   (1)

Simulating paths of GBM is thus an easy consequence of our algorithm for simulating paths of BM, since for 0 = t_0 < t_1 < ⋯ < t_k the following recursion holds:

S(t_{i+1}) = S(t_i) e^{X(t_{i+1}) − X(t_i)},  i ∈ {0, 1, …, k − 1}.

Simulating geometric BM with drift µ and variance term σ at times 0 = t_0 < t_1 < t_2 < ⋯ < t_k: sequentially generate unit normals Z_1, Z_2, …, Z_k, set the Y_i as in (1), and recursively define

S(t_1) = S(0)Y_1
S(t_2) = S(t_1)Y_2 = S(0)Y_1Y_2
⋮
S(t_k) = S(t_{k−1})Y_k = S(0)Y_1Y_2 ⋯ Y_k.
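Since S(t_i) = S(t_{i−1})Y_i with the Y_i as in (1), a GBM path is just the cumulative product of independent lognormals, i.e. the exponential of a BM path. The sketch below uses the notes' parameterization S(t) = S(0)e^{σB(t) + µt}, so µ here is the drift of the exponent, not the mean growth rate; the function name is my own:

```python
import numpy as np

def simulate_gbm(times, s0=1.0, mu=0.0, sigma=1.0, rng=None):
    """Simulate S(t) = S(0) * exp(sigma*B(t) + mu*t) at increasing times,
    via the independent lognormal ratios Y_i = S(t_i)/S(t_{i-1})."""
    if rng is None:
        rng = np.random.default_rng()
    times = np.asarray(times, dtype=float)
    dt = np.diff(np.concatenate(([0.0], times)))
    Z = rng.standard_normal(len(times))
    log_Y = sigma * np.sqrt(dt) * Z + mu * dt  # log of the ratios Y_i
    return s0 * np.exp(np.cumsum(log_Y))       # S(t_1), ..., S(t_k)
```

Working on the log scale (cumulative sums of the log-ratios, one final exponential) is numerically preferable to multiplying the Y_i directly.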
1.3 Simulating correlated BM and GBM in two dimensions, in which the two BMs have a specific desired correlation ρ

In many financial applications, there are several correlated assets that make up a portfolio, or from which certain options/derivatives are created (spread options, basket options, etc.). Here we focus on the case of two correlated assets.

Let W_1(t) and W_2(t) denote standard Brownian motions. Consider two Brownian motions X_1(t) = σ_1W_1(t) + µ_1t and X_2(t) = σ_2W_2(t) + µ_2t. X(t) = (X_1(t), X_2(t))^T is a two-dimensional Brownian motion, where we shall assume the coordinates have correlation coefficient ρ: for a given −1 < ρ < 1,

Cov(X_1(t), X_2(t)) / (σ_1√t σ_2√t) = ρ,  t > 0.   (2)

To construct this BM, we start with two independent standard BMs B_1(t) and B_2(t), define B(t) = (B_1(t), B_2(t))^T, define the 2 × 2 matrix

A = [ σ_1        0
      σ_2ρ       σ_2√(1 − ρ²) ],

and construct

X(t) = AB(t) + µt,  t ≥ 0,

where µ = (µ_1, µ_2)^T. Letting

Σ = AA^T = [ σ_1²      σ_1σ_2ρ
             σ_1σ_2ρ   σ_2²    ],

we say that X is a two-dimensional BM with drift vector µ and covariance matrix Σ; we denote this by X ∼ BM(µ, Σ). From (2), this implies that for each t > 0, Cov(X_1(t), X_2(t)) = ρσ_1σ_2t. As with BM in one dimension, X has both stationary and independent increments.

If we wish to simulate X(t) at a fixed time t > 0, then we can generate Z_1, Z_2 iid N(0, 1), define the vector Z = (Z_1, Z_2)^T, and set X(t) = √t AZ + µt. For generating the pair X(t_1), X(t_2) at the times 0 < t_1 < t_2, we generate a pair of N(0, 1) vectors Z_1 = (Z_{1,1}, Z_{1,2})^T, Z_2 = (Z_{2,1}, Z_{2,2})^T and define

X(t_1) = √t_1 AZ_1 + µt_1
X(t_2) = X(t_1) + √(t_2 − t_1) AZ_2 + µ(t_2 − t_1) = √t_1 AZ_1 + µt_1 + √(t_2 − t_1) AZ_2 + µ(t_2 − t_1).

This then yields:
An algorithm for simulating X ∼ BM(µ, Σ) at times 0 = t_0 < t_1 < t_2 < ⋯ < t_k: sequentially generate k iid pairs of independent N(0, 1) rvs, Z_1 = (Z_{1,1}, Z_{1,2})^T, Z_2 = (Z_{2,1}, Z_{2,2})^T, …, Z_k = (Z_{k,1}, Z_{k,2})^T, then recursively define

X(t_1) = √t_1 AZ_1 + µt_1
X(t_2) = X(t_1) + √(t_2 − t_1) AZ_2 + µ(t_2 − t_1)
⋮
X(t_k) = X(t_{k−1}) + √(t_k − t_{k−1}) AZ_k + µ(t_k − t_{k−1}) = Σ_{i=1}^{k} [√(t_i − t_{i−1}) AZ_i + µ(t_i − t_{i−1})].

From here we can then define correlated geometric BMs (GBMs):

S_1(t) = S_1(0)e^{X_1(t)},  S_2(t) = S_2(0)e^{X_2(t)}.

To simulate these, we use the algorithm above for the BM, coordinate by coordinate, utilizing the rows of the matrix A = (A_{i,j}) given above, together with the algorithm at the end of Section 1.2.3 for simulating a one-dimensional GBM.

An algorithm for simulating two correlated GBMs with underlying two-dimensional X ∼ BM(µ, Σ) at times 0 = t_0 < t_1 < t_2 < ⋯ < t_k: sequentially generate k iid pairs of independent N(0, 1) rvs, Z_1 = (Z_{1,1}, Z_{1,2})^T, Z_2 = (Z_{2,1}, Z_{2,2})^T, …, Z_k = (Z_{k,1}, Z_{k,2})^T, then recursively define

S_1(t_1) = S_1(0) exp{√t_1 Σ_{j=1}^{2} A_{1,j}Z_{1,j} + µ_1t_1}
S_1(t_2) = S_1(t_1) exp{√(t_2 − t_1) Σ_{j=1}^{2} A_{1,j}Z_{2,j} + µ_1(t_2 − t_1)}
⋮
S_1(t_k) = S_1(t_{k−1}) exp{√(t_k − t_{k−1}) Σ_{j=1}^{2} A_{1,j}Z_{k,j} + µ_1(t_k − t_{k−1})}

S_2(t_1) = S_2(0) exp{√t_1 Σ_{j=1}^{2} A_{2,j}Z_{1,j} + µ_2t_1}
S_2(t_2) = S_2(t_1) exp{√(t_2 − t_1) Σ_{j=1}^{2} A_{2,j}Z_{2,j} + µ_2(t_2 − t_1)}
⋮
S_2(t_k) = S_2(t_{k−1}) exp{√(t_k − t_{k−1}) Σ_{j=1}^{2} A_{2,j}Z_{k,j} + µ_2(t_k − t_{k−1})}.

Remark 1.1 The case of three or more dimensions can be handled analogously to our method here; we will cover this in more detail later.
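The two boxed algorithms combine into one routine: form the matrix A with AA^T = Σ, scale each pair Z_i by √(t_i − t_{i−1}), add the drift, and exponentiate coordinate by coordinate. A minimal sketch (function name and interface are my own choices, not from the notes):

```python
import numpy as np

def simulate_gbm2(times, s0, mu, sigma, rho, rng=None):
    """Simulate two correlated GBMs driven by a 2-dim BM(mu, Sigma).
    s0, mu, sigma are length-2 sequences; rho is the desired correlation."""
    if rng is None:
        rng = np.random.default_rng()
    # The 2x2 matrix A with A A^T = Sigma (a Cholesky-type factor)
    A = np.array([[sigma[0], 0.0],
                  [sigma[1] * rho, sigma[1] * np.sqrt(1.0 - rho ** 2)]])
    mu = np.asarray(mu, dtype=float)
    times = np.asarray(times, dtype=float)
    dt = np.diff(np.concatenate(([0.0], times)))
    s1, s2 = float(s0[0]), float(s0[1])
    out = np.empty((len(times), 2))
    for i, d in enumerate(dt):
        Zi = rng.standard_normal(2)            # Z_i = (Z_{i,1}, Z_{i,2})^T
        incr = np.sqrt(d) * (A @ Zi) + mu * d  # increment of X = (X_1, X_2)^T
        s1 *= np.exp(incr[0])
        s2 *= np.exp(incr[1])
        out[i] = (s1, s2)
    return out                                 # row i holds (S_1(t_i), S_2(t_i))
```

For three or more assets, the same pattern applies with A taken as any square root (e.g. the Cholesky factor) of the desired covariance matrix Σ.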
More informationGaussian Processes. Le Song. Machine Learning II: Advanced Topics CSE 8803ML, Spring 2012
Gaussian Processes Le Song Machine Learning II: Advanced Topics CSE 8803ML, Spring 01 Pictorial view of embedding distribution Transform the entire distribution to expected features Feature space Feature
More information6. The econometrics of Financial Markets: Empirical Analysis of Financial Time Series. MA6622, Ernesto Mordecki, CityU, HK, 2006.
6. The econometrics of Financial Markets: Empirical Analysis of Financial Time Series MA6622, Ernesto Mordecki, CityU, HK, 2006. References for Lecture 5: Quantitative Risk Management. A. McNeil, R. Frey,
More informationGaussian distributions and processes have long been accepted as useful tools for stochastic
Chapter 3 Alpha-Stable Random Variables and Processes Gaussian distributions and processes have long been accepted as useful tools for stochastic modeling. In this section, we introduce a statistical model
More informationThe distribution inherited by Y is called the Cauchy distribution. Using that. d dy ln(1 + y2 ) = 1 arctan(y)
Stochastic Processes - MM3 - Solutions MM3 - Review Exercise Let X N (0, ), i.e. X is a standard Gaussian/normal random variable, and denote by f X the pdf of X. Consider also a continuous random variable
More information9. Multivariate Linear Time Series (II). MA6622, Ernesto Mordecki, CityU, HK, 2006.
9. Multivariate Linear Time Series (II). MA6622, Ernesto Mordecki, CityU, HK, 2006. References for this Lecture: Introduction to Time Series and Forecasting. P.J. Brockwell and R. A. Davis, Springer Texts
More informationP (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n
JOINT DENSITIES - RANDOM VECTORS - REVIEW Joint densities describe probability distributions of a random vector X: an n-dimensional vector of random variables, ie, X = (X 1,, X n ), where all X is are
More informationMarkov Chains and Hidden Markov Models
Chapter 1 Markov Chains and Hidden Markov Models In this chapter, we will introduce the concept of Markov chains, and show how Markov chains can be used to model signals using structures such as hidden
More informationProbability and Stochastic Processes
Probability and Stochastic Processes A Friendly Introduction Electrical and Computer Engineers Third Edition Roy D. Yates Rutgers, The State University of New Jersey David J. Goodman New York University
More informationSTA 2101/442 Assignment 3 1
STA 2101/442 Assignment 3 1 These questions are practice for the midterm and final exam, and are not to be handed in. 1. Suppose X 1,..., X n are a random sample from a distribution with mean µ and variance
More informationChance Constrained Programming
IE 495 Lecture 22 Chance Constrained Programming Prof. Jeff Linderoth April 21, 2003 April 21, 2002 Stochastic Programming Lecture 22 Slide 1 Outline HW Fixes Chance Constrained Programming Is Hard Main
More informationClass 11 Non-Parametric Models of a Service System; GI/GI/1, GI/GI/n: Exact & Approximate Analysis.
Service Engineering Class 11 Non-Parametric Models of a Service System; GI/GI/1, GI/GI/n: Exact & Approximate Analysis. G/G/1 Queue: Virtual Waiting Time (Unfinished Work). GI/GI/1: Lindley s Equations
More informationStatistics of stochastic processes
Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014
More informationSTA 294: Stochastic Processes & Bayesian Nonparametrics
MARKOV CHAINS AND CONVERGENCE CONCEPTS Markov chains are among the simplest stochastic processes, just one step beyond iid sequences of random variables. Traditionally they ve been used in modelling a
More informationT n B(T n ) n n T n. n n. = lim
Homework..7. (a). The relation T n T n 1 = T(B n ) shows that T n T n 1 is an identically sequence with common law as T. Notice that for any n 1, by Theorem.16 the Brownian motion B n (t) is independent
More informationNumerical Methods with Lévy Processes
Numerical Methods with Lévy Processes 1 Objective: i) Find models of asset returns, etc ii) Get numbers out of them. Why? VaR and risk management Valuing and hedging derivatives Why not? Usual assumption:
More informationXt i Xs i N(0, σ 2 (t s)) and they are independent. This implies that the density function of X t X s is a product of normal density functions:
174 BROWNIAN MOTION 8.4. Brownian motion in R d and the heat equation. The heat equation is a partial differential equation. We are going to convert it into a probabilistic equation by reversing time.
More informationLecture 25: Review. Statistics 104. April 23, Colin Rundel
Lecture 25: Review Statistics 104 Colin Rundel April 23, 2012 Joint CDF F (x, y) = P [X x, Y y] = P [(X, Y ) lies south-west of the point (x, y)] Y (x,y) X Statistics 104 (Colin Rundel) Lecture 25 April
More informationMA 8101 Stokastiske metoder i systemteori
MA 811 Stokastiske metoder i systemteori AUTUMN TRM 3 Suggested solution with some extra comments The exam had a list of useful formulae attached. This list has been added here as well. 1 Problem In this
More informationUCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, Practice Final Examination (Winter 2017)
UCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, 208 Practice Final Examination (Winter 207) There are 6 problems, each problem with multiple parts. Your answer should be as clear and readable
More information