AMS 216 Stochastic Differential Equations Lecture 03 Copyright by Hongyun Wang, UCSC
Review of probability theory (continued)

We show that the sum of independent normal random variables is a normal random variable.

Characteristic function (CF) of a random variable

Random variable: X(ω); density: ρ_X(x).

The characteristic function is defined as

    φ_X(ξ) = E[exp(iξX)] = ∫_{-∞}^{+∞} exp(iξx) ρ_X(x) dx

This is very similar to the Fourier transform (FT).

Fourier transform:

    Forward transform:   f̂(ξ) = ∫_{-∞}^{+∞} exp(−i2πξx) f(x) dx

    Backward transform:  f₂(x) = ∫_{-∞}^{+∞} exp(i2πξx) f̂(ξ) dξ

One can show that f₂(x) = f(x); therefore, the backward transform is called the inverse transform.

Connection between characteristic function (CF) and Fourier transform (FT):

    φ_X(ξ) = ρ̂_X(ξ')  evaluated at  ξ' = −ξ/(2π)
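The defining integral is easy to probe numerically. The sketch below (an illustration added here, not part of the lecture; the Uniform(0,1) example and the grid size are arbitrary choices) approximates φ_X(ξ) = ∫ exp(iξx) ρ_X(x) dx by the trapezoid rule and compares it with the closed-form CF of Uniform(0,1), which is (exp(iξ) − 1)/(iξ):

```python
import numpy as np

# Approximate phi_X(xi) = E[exp(i*xi*X)] = integral of exp(i*xi*x) * rho_X(x) dx
# for X ~ Uniform(0, 1), whose exact CF is (exp(i*xi) - 1) / (i*xi).
x = np.linspace(0.0, 1.0, 100001)   # grid over the support of the density
rho = np.ones_like(x)               # density of Uniform(0, 1)
dx = x[1] - x[0]

def phi(xi):
    vals = np.exp(1j * xi * x) * rho
    # trapezoid rule: interior points get full weight, endpoints half weight
    return (vals.sum() - 0.5 * (vals[0] + vals[-1])) * dx

xi = 2.5
exact = (np.exp(1j * xi) - 1.0) / (1j * xi)
print(abs(phi(xi) - exact))         # tiny discretization error
```

The same routine works for any density on a bounded interval; only the grid and rho change.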
If f̂(ξ) = ĝ(ξ), then f(x) = g(x). Likewise, if φ_X(ξ) = φ_Y(ξ), then ρ_X(s) = ρ_Y(s).

Characteristic function (CF) of a normal random variable:

X ~ N(μ, σ²), with density

    ρ_X(x) = 1/√(2πσ²) · exp(−(x − μ)²/(2σ²))

Characteristic function:

    φ_X(ξ) = E[exp(iξX)] = ∫ exp(iξx) ρ_X(x) dx
           = ∫ 1/√(2πσ²) · exp( [−(x − μ)² + i2σ²ξx] / (2σ²) ) dx

Completing the square in the exponent gives

    −(x − μ)² + i2σ²ξx = −(x − μ − iσ²ξ)² + 2σ²(iμξ − σ²ξ²/2)

so

    φ_X(ξ) = exp(iμξ − σ²ξ²/2) ∫ 1/√(2πσ²) · exp(−(x − μ − iσ²ξ)²/(2σ²)) dx
           = exp(iμξ − σ²ξ²/2) ∫ 1/√(2πσ²) · exp(−s²/(2σ²)) ds
           = exp(iμξ − σ²ξ²/2)

We obtain:

    X ~ N(μ, σ²)  ⟹  φ_X(ξ) = exp(iμξ − σ²ξ²/2)

Since we can map the characteristic function back to the probability density function, we have the theorem below.
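The closed form can be checked by Monte Carlo (a sketch added for illustration; the values of μ, σ, ξ and the sample size are arbitrary): the sample average of exp(iξX) over many draws of X ~ N(μ, σ²) should approach exp(iμξ − σ²ξ²/2).

```python
import numpy as np

# Monte Carlo check of the CF of a normal random variable:
# the empirical mean of exp(i*xi*X) approaches exp(i*mu*xi - sigma^2*xi^2/2).
rng = np.random.default_rng(0)
mu, sigma, xi = 1.5, 2.0, 0.7
samples = rng.normal(mu, sigma, size=1_000_000)

empirical = np.exp(1j * xi * samples).mean()
exact = np.exp(1j * mu * xi - sigma**2 * xi**2 / 2)
print(abs(empirical - exact))   # O(1/sqrt(n)) sampling error
```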
Theorem: X ~ N(μ, σ²) if and only if φ_X(ξ) = exp(iμξ − σ²ξ²/2).

Next, we look at the characteristic function of the sum of two independent random variables. It is simply the product of the two individual characteristic functions.

Theorem: If X and Y are independent, then

    φ_{X+Y}(ξ) = φ_X(ξ) φ_Y(ξ)

Proof:

    φ_{X+Y}(ξ) = E[exp(iξ(X + Y))] = E[exp(iξX) exp(iξY)]
               = E[exp(iξX)] E[exp(iξY)]   (using the independence)
               = φ_X(ξ) φ_Y(ξ)

Applying this theorem to two independent normal random variables, we conclude:

Theorem: Suppose X ~ N(μ₁, σ₁²) and Y ~ N(μ₂, σ₂²), and suppose X and Y are independent. Then we have

    (X + Y) ~ N(μ₁ + μ₂, σ₁² + σ₂²)

That is, the sum of independent normals is normal.

Proof:

    φ_{X+Y}(ξ) = φ_X(ξ) φ_Y(ξ)
               = exp(iμ₁ξ − σ₁²ξ²/2) exp(iμ₂ξ − σ₂²ξ²/2)
               = exp(i(μ₁ + μ₂)ξ − (σ₁² + σ₂²)ξ²/2)
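The conclusion is easy to verify empirically (an illustrative sketch with arbitrary parameters, not part of the lecture): sample two independent normals, and check that the sum has the predicted mean and variance and that its empirical CF matches exp(i(μ₁+μ₂)ξ − (σ₁²+σ₂²)ξ²/2).

```python
import numpy as np

# X ~ N(1, 2^2), Y ~ N(-0.5, 1.5^2), independent
# => X + Y ~ N(0.5, 2^2 + 1.5^2) = N(0.5, 6.25)
rng = np.random.default_rng(1)
n = 1_000_000
s = rng.normal(1.0, 2.0, n) + rng.normal(-0.5, 1.5, n)

print(s.mean())   # ~ 0.5
print(s.var())    # ~ 6.25

# Empirical CF of the sum vs. the normal CF with the summed parameters
xi = 0.6
empirical_cf = np.exp(1j * xi * s).mean()
exact_cf = np.exp(1j * 0.5 * xi - 6.25 * xi**2 / 2)
print(abs(empirical_cf - exact_cf))   # small sampling error
```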
Before we start the discussion of stochastic differential equations, let us look at another example on probability to demonstrate the importance of the framework of repeated experiments.

Monty Hall's game:

Your group and Mike's group will go summer camping together, but each group has a different itinerary in mind. To decide on an itinerary, you and Mike play a game ONCE, with Mike hosting.

Specifications of the game:

1) The host (Mike) puts a card in one of the 3 boxes without you looking (so he knows which box has the card but you don't).
2) You select a box.
3) If the box you pick contains the card, you win (and you will have priority in the itinerary planning). Otherwise, you lose (and Mike will have priority in the itinerary planning).
4) At the end, all boxes are opened to verify that the host is not cheating.

The incident: After you select box #1, Mike says "Let us play it the Monty Hall style." He opens box #2, which is empty, and offers you the option of switching to box #3.

Question: Should you switch?

Answer: The behavior of the host (Mike) is incompletely specified. There are many possible ways the game can be repeated.

Version 1: Monty Hall's game

This is the mathematicians' definition of Monty Hall's game. The real game show hosted by Monty Hall did not actually follow these rules.

*) Upon your initial selection, the host must open a box that you did not pick.
*) The host must open an empty box.
*) The host must offer you the option of switching.

For this game, we have

    Pr(winning | not switching) = 1/3
    Pr(winning | switching) = 2/3

See Appendix A1 for the derivation.
Version 2: (The greedy host)

The greedy host wants to lure you away from the correct box.

*) Upon your initial selection, the host has the option of opening a box that you did not pick. The greedy host will open a box if and only if your initial selection is correct.
*) If the host opens a box, he must open an empty box.
*) If the host opens a box, he must offer you the option of switching.

For this game, we have

    Pr(winning | not switching if offered) = 1/3
    Pr(winning | switching if offered) = 0

See Appendix A2 for the derivation.

Version 3: (The less greedy host)

The less greedy host still wants to lure you away from the correct box, but he wants to avoid this behavior being easily recognized in repeated games.

*) Upon your initial selection, the host has the option of opening a box that you did not pick. The less greedy host adds some randomness to the decision on whether or not to open a box:

    Pr(opening a box | your initial selection is incorrect) = p₁
    Pr(opening a box | your initial selection is correct) = p₂

*) If the host opens a box, he must open an empty box.
*) If the host opens a box, he must offer you the option of switching.

Version 2 is the special case of Version 3 with p₁ = 0 and p₂ = 1.

For this game, we have

    Pr(winning | not switching if offered) = 1/3
    Pr(winning | switching if offered) = (1 + 2p₁ − p₂)/3

For example, for p₁ = 0.25 and p₂ = 0.75,

    Pr(winning | switching if offered) = 0.25

See Appendix A3 for the derivation.

Key observation: When you encounter an incompletely specified game only once, you have to make a model perceiving how the game is repeated. The model is subjective.
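The host model with opening probabilities p₁ (initial pick incorrect) and p₂ (initial pick correct) can be simulated directly. A minimal sketch (added for illustration; box labels and sample size are arbitrary) plays the game many times with the "always switch if offered" strategy and compares the win frequency with (1 + 2p₁ − p₂)/3:

```python
import random

def play(p1, p2, switch_if_offered, rng):
    """One round of Version 3 of the game. Returns True if you win."""
    car = rng.randrange(3)    # box holding the card
    pick = rng.randrange(3)   # your initial selection
    p_open = p2 if pick == car else p1
    if rng.random() < p_open:
        # host opens an empty box you did not pick, then offers the switch
        opened = next(b for b in range(3) if b != pick and b != car)
        if switch_if_offered:
            pick = next(b for b in range(3) if b != pick and b != opened)
    return pick == car

rng = random.Random(0)
n = 200_000
wins = sum(play(0.25, 0.75, True, rng) for _ in range(n))
print(wins / n)   # approx (1 + 2*0.25 - 0.75)/3 = 0.25
```

Setting p1 = 0 and p2 = 1 reproduces Version 2 (win probability 0 when always switching), and forcing the host to always open reproduces Version 1.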
Stochastic differential equation

    dX(t) = b(X(t), t) dt + a(X(t), t) dW(t)

or, in a more concise form,

    dX = b(X, t) dt + a(X, t) dW

We need to introduce W(t).

The Wiener process (Brownian motion)

Definition 1: The Wiener process, denoted by W(t), satisfies
1) W(0) = 0
2) For t ≥ 0, W(t) ~ N(0, t)
3) For t₄ ≥ t₃ ≥ t₂ ≥ t₁ ≥ 0, the increments W(t₂) − W(t₁) and W(t₄) − W(t₃) are independent.

Definition 2:
1) W(0) = 0
2) For t₂ ≥ t₁ ≥ 0, W(t₂) − W(t₁) ~ N(0, t₂ − t₁)
3) For t₄ ≥ t₃ ≥ t₂ ≥ t₁ ≥ 0, the increments W(t₂) − W(t₁) and W(t₄) − W(t₃) are independent.

Definition 2 appears to be stronger than Definition 1.

Question: Are these two definitions equivalent?

Answer: Yes.

Theorem: Suppose X and Y are independent. Suppose X ~ N(μ₁, σ₁²) and (X + Y) ~ N(μ₂, σ₂²). Then we have

    Y ~ N(μ₂ − μ₁, σ₂² − σ₁²)

Proof: Homework problem.

Using this theorem, we show that Definition 1 is as strong as Definition 2.
We start with Definition 1 and derive Definition 2.

Definition 1 ⟹ W(t₁) ~ N(0, t₁) and W(t₂) ~ N(0, t₂).

We write W(t₂) as a sum:

    W(t₂) = W(t₁) + (W(t₂) − W(t₁))

For t₂ ≥ t₁ ≥ 0, W(t₁) and (W(t₂) − W(t₁)) are independent. Applying the theorem above, we conclude

    W(t₂) − W(t₁) ~ N(0, t₂ − t₁)

which is Definition 2.

Appendix A1

Suppose you never switch.

    Pr(winning | not switching) = Pr(your initial selection is correct) = 1/3

Suppose you always switch. Recall that the host must open an empty box. When your initial selection is incorrect and you switch, you will switch to the correct box.

    Pr(winning | switching) = Pr(your initial selection is incorrect) = 2/3

Appendix A2

Suppose you never switch if offered.

    Pr(winning | not switching if offered) = Pr(your initial selection is correct) = 1/3

Suppose you always switch if offered. Let

    A = "your initial selection is correct"
    B = "the host opens an empty box"
    S = "switching if offered"
    W = "winning"

The greedy host opens a box and offers you the option of switching if and only if your initial selection is correct: B ⟺ A.

We use the law of total probability:

    Pr(winning | switching if offered) = Pr(W | A and S) Pr(A) + Pr(W | A^C and S) Pr(A^C)
                                       = 0·(1/3) + 0·(2/3) = 0

Appendix A3

Suppose you never switch if offered.

    Pr(winning | not switching if offered) = Pr(your initial selection is correct) = 1/3

Suppose you always switch if offered. Let

    A = "your initial selection is correct"
    B = "the host opens an empty box"
    S = "switching if offered"
    W = "winning"

The host decides whether or not to open a box with probabilities

    Pr(B | A^C) = p₁
    Pr(B | A) = p₂

We use the law of total probability:

    Pr(winning | switching if offered)
      = Pr(W | A and B and S) Pr(A and B)
      + Pr(W | A and B^C and S) Pr(A and B^C)
      + Pr(W | A^C and B and S) Pr(A^C and B)
      + Pr(W | A^C and B^C and S) Pr(A^C and B^C)

We first calculate the various terms used in the law of total probability:

    Pr(A and B) = Pr(B | A) Pr(A) = p₂·(1/3)
    Pr(A and B^C) = Pr(B^C | A) Pr(A) = (1 − p₂)·(1/3)
    Pr(A^C and B) = Pr(B | A^C) Pr(A^C) = p₁·(2/3)
    Pr(A^C and B^C) = Pr(B^C | A^C) Pr(A^C) = (1 − p₁)·(2/3)

    Pr(W | A and B and S) = 0
    Pr(W | A and B^C and S) = 1
    Pr(W | A^C and B and S) = 1
    Pr(W | A^C and B^C and S) = 0

Substituting these terms into the law of total probability, we obtain

    Pr(winning | switching if offered) = 0 + (1 − p₂)·(1/3) + p₁·(2/3) + 0
                                       = (1 + 2p₁ − p₂)/3
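Returning to the Wiener process: its defining properties are easy to probe by simulation. A minimal sketch (an illustration with arbitrary time points and sample sizes, assuming the standard construction of a discrete Brownian path from independent N(0, Δt) increments) checks that W(t₂) − W(t₁) has mean 0 and variance t₂ − t₁:

```python
import numpy as np

# Build many Brownian paths on [0, 1] from independent N(0, dt) increments,
# then check that W(t2) - W(t1) has mean ~ 0 and variance ~ t2 - t1.
rng = np.random.default_rng(2)
n_paths, n_steps = 200_000, 100
dt = 1.0 / n_steps
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(increments, axis=1)   # W[:, k] approximates W((k+1)*dt)

t1, t2 = 0.3, 0.8                   # grid indices 29 and 79
dW = W[:, 79] - W[:, 29]
print(dW.mean())   # ~ 0
print(dW.var())    # ~ t2 - t1 = 0.5
```

The independence of non-overlapping increments holds by construction here, since disjoint index ranges of the increment array are independent draws.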
More informationLecture 22 Girsanov s Theorem
Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n
More informationBrownian Motion and Poisson Process
and Poisson Process She: What is white noise? He: It is the best model of a totally unpredictable process. She: Are you implying, I am white noise? He: No, it does not exist. Dialogue of an unknown couple.
More information6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )
6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined
More informationA review: The Laplacian and the d Alembertian. j=1
Chapter One A review: The Laplacian and the d Alembertian 1.1 THE LAPLACIAN One of the main goals of this course is to understand well the solution of wave equation both in Euclidean space and on manifolds
More informationProblem Set 1. MAS 622J/1.126J: Pattern Recognition and Analysis. Due: 5:00 p.m. on September 20
Problem Set MAS 6J/.6J: Pattern Recognition and Analysis Due: 5:00 p.m. on September 0 [Note: All instructions to plot data or write a program should be carried out using Matlab. In order to maintain a
More information18.01 Single Variable Calculus Fall 2006
MIT OpenCourseWare http://ocw.mit.edu 18.01 Single Variable Calculus Fall 2006 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. Exam 4 Review 1. Trig substitution
More informationDeep Learning for Computer Vision
Deep Learning for Computer Vision Lecture 3: Probability, Bayes Theorem, and Bayes Classification Peter Belhumeur Computer Science Columbia University Probability Should you play this game? Game: A fair
More informationNotes by Maksim Maydanskiy.
SPECTRAL FLOW IN MORSE THEORY. 1 Introduction Notes by Maksim Maydanskiy. Spectral flow is a general formula or computing the Fredholm index of an operator d ds +A(s) : L1,2 (R, H) L 2 (R, H) for a family
More information(3) Review of Probability. ST440/540: Applied Bayesian Statistics
Review of probability The crux of Bayesian statistics is to compute the posterior distribution, i.e., the uncertainty distribution of the parameters (θ) after observing the data (Y) This is the conditional
More informationSolution. This is a routine application of the chain rule.
EXAM 2 SOLUTIONS 1. If z = e r cos θ, r = st, θ = s 2 + t 2, find the partial derivatives dz ds chain rule. Write your answers entirely in terms of s and t. dz and dt using the Solution. This is a routine
More informationLES of Turbulent Flows: Lecture 3
LES of Turbulent Flows: Lecture 3 Dr. Jeremy A. Gibbs Department of Mechanical Engineering University of Utah Fall 2016 1 / 53 Overview 1 Website for those auditing 2 Turbulence Scales 3 Fourier transforms
More information