Solution to final homework!! Friday 5 May. Please report any typos immediately for a timely response.
- Silas Mathews
Question 1

Write $E(W_{t_n} \mid W_{t_{n-1}} \text{ near } u_{n-1}, \dots, W_{t_1} \text{ near } u_1)$, LHS for short. Add the quantity $W_{t_{n-1}} - W_{t_{n-1}}$ into the expected value of the LHS. Apply the independent increments property of the Wiener process and we get

$$\text{LHS} = E(W_{t_n} - W_{t_{n-1}} \mid W_{t_{n-1}} \text{ near } u_{n-1}, \dots) + W_{t_{n-1}}.$$

Finally apply linearity of expectation and the result follows.

Question 2

Recall that for a Wiener process $W_t$,

$$W_t - W_s \sim \text{Gaussian}(0, t - s). \tag{1}$$

To prove the finiteness of $E|W_t|$, it's easy to derive using integration that

$$E|W_t| = \sqrt{\frac{2t}{\pi}} < \infty.$$

Next, by independent increments and the fact that $W_t - W_s$ is Gaussian with mean 0, $E(W_t - W_s \mid \mathcal{F}_s) = 0$. Therefore

$$E(W_t \mid \mathcal{F}_s) = E(W_t - W_s \mid \mathcal{F}_s) + E(W_s \mid \mathcal{F}_s) = 0 + W_s = W_s,$$

and so $W_t$ is a martingale.

Now look at $U_t = W_t^2 - t$:

$$E|U_t| \le E W_t^2 + t = \text{Var}(W_t) + t = 2t < \infty.$$

Also, for $s < t$,

$$E(U_t \mid \mathcal{F}_s) = E(W_t^2 - t \mid \mathcal{F}_s) = E[(W_t - W_s)^2 + 2(W_t - W_s)W_s + W_s^2 \mid \mathcal{F}_s] - t = E[(W_t - W_s)^2] + 2 W_s\, E(W_t - W_s \mid \mathcal{F}_s) + W_s^2 - t = \text{Var}(W_t - W_s) + W_s^2 - t = (t - s) + W_s^2 - t = U_s.$$
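The two moment identities above ($E|W_t| = \sqrt{2t/\pi}$ and $E(U_t) = 0$) are easy to spot-check numerically. A sketch in plain Python (not part of the original solution; the sample size and the choice $t = 2$ are arbitrary): draw $W_t \sim \text{Gaussian}(0, t)$ directly and compare the empirical averages with the closed forms.

```python
import math
import random

random.seed(0)

t = 2.0
n = 200_000

# Sample W_t ~ Gaussian(0, t) directly and compare the empirical
# mean of |W_t| with the closed form sqrt(2t/pi) derived above.
samples = [random.gauss(0.0, math.sqrt(t)) for _ in range(n)]

mean_abs = sum(abs(w) for w in samples) / n
mean_u = sum(w * w - t for w in samples) / n   # E[U_t] = E[W_t^2 - t] = 0

print(mean_abs, math.sqrt(2 * t / math.pi))  # both close to 1.128
print(mean_u)                                # close to 0
```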
The last two equalities follow from (1). Finally, for $V_t = \exp\{\lambda W_t - \frac{1}{2}\lambda^2 t\}$, notice that

$$E|V_t| = \exp\{-\tfrac{1}{2}\lambda^2 t\}\, E(\exp\{\lambda W_t\}) = 1 < \infty.$$

The last equality comes from the fact that the expected value is the mgf of a Gaussian(0, t) evaluated at $\lambda$. Plug-n-chug the exact value of the mgf and it evaluates to 1. Finally, for $s < t$,

$$E(V_t \mid \mathcal{F}_s) = e^{-\frac{1}{2}\lambda^2 t}\, E(\exp\{\lambda(W_t - W_s + W_s)\} \mid \mathcal{F}_s) = e^{-\frac{1}{2}\lambda^2 t}\, E(\exp\{\lambda(W_t - W_s)\} \mid \mathcal{F}_s)\, E(\exp\{\lambda W_s\} \mid \mathcal{F}_s) = e^{-\frac{1}{2}\lambda^2 t}\, e^{\frac{1}{2}\lambda^2 (t - s)}\, \exp\{\lambda W_s\} = e^{\lambda W_s - \frac{1}{2}\lambda^2 s} = V_s.$$

Question 3

Let $T = \min\{t \ge 0 : W_t \in \{a, b\}\}$ with $a < 0 < b$, let $p_b$ be the probability of hitting $b$ before $a$, and similarly for $p_a$. Verify yourself that the optional stopping theorem can be used (most importantly the uniform integrability property). Thus

$$E(W_T) = b p_b + a(1 - p_b) = E(W_0) = 0 \implies p_b = \frac{-a}{b - a},$$

and similarly $p_a = \frac{b}{b - a}$.

Next, apply the optional stopping theorem one more time, to $U_t = W_t^2 - t$, to see that

$$E(U_T) = E(U_0) = 0 \implies E(W_T^2) = E(T).$$

Note that $E(T) = E(W_T^2) \le a^2 + b^2 < \infty$ and

$$E(W_T^2) = a^2 p_a + b^2 p_b = -ab \implies E(T) = -ab.$$
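The optional-stopping answers $p_b = -a/(b-a)$ and $E(T) = -ab$ hold exactly for the simple symmetric random walk as well (the same martingale argument with $S_n$ and $S_n^2 - n$), which gives a cheap simulation check. A sketch, with illustrative barriers $a = -2$, $b = 3$ (not values from the assignment):

```python
import random

random.seed(1)

a, b = -2, 3          # absorbing barriers, a < 0 < b
trials = 20_000

hits_b = 0
total_steps = 0
for _ in range(trials):
    s, steps = 0, 0
    while a < s < b:          # run until one barrier is hit
        s += random.choice((-1, 1))
        steps += 1
    hits_b += (s == b)
    total_steps += steps

p_b_hat = hits_b / trials          # expect -a/(b-a) = 0.4
mean_T_hat = total_steps / trials  # expect -a*b = 6
print(p_b_hat, mean_T_hat)
```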
Question 4

We assume that $g(u, t)$ satisfies

$$g(u, t) = \int \pi(v, s \mid u)\, g(v, t + s)\, dv. \tag{2}$$

Then

$$E(Z_{t+h} \mid \mathcal{F}_t) = E(Z_{t+h} \mid X_t = u) = E(g(X_{t+h}, t + h) \mid X_t = u) = \int g(v, t + h)\, \pi(v, h \mid u)\, dv = g(u, t).$$

The first and last equalities are derived from the fact that $X$ is a Markov process and from (2), respectively.

For the case $g(x, t) = x^2 - t$,

$$\int \pi(x, s \mid u)\, g(x, t + s)\, dx = \int \frac{1}{\sqrt{2\pi s}} \exp\left\{-\frac{(x - u)^2}{2s}\right\} \left[x^2 - (t + s)\right] dx = \int \frac{1}{\sqrt{2\pi s}} \exp\left\{-\frac{(x - u)^2}{2s}\right\} x^2\, dx - (t + s) = s + u^2 - (t + s) = u^2 - t = g(u, t).$$

Similarly for $g(x, t) = \exp\{\lambda x - \frac{\lambda^2 t}{2}\}$, use your wonderful integration skills to show that

$$\int \pi(x, s \mid u)\, g(x, t + s)\, dx = \int \frac{1}{\sqrt{2\pi s}} \exp\left\{-\frac{(x - u)^2}{2s}\right\} \exp\left\{\lambda x - \frac{\lambda^2 (t + s)}{2}\right\} dx = g(u, t).$$
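The Gaussian integral in the $x^2 - t$ case can be verified by direct numerical quadrature. A sketch in plain Python (the values of $u$, $s$, $t$ are arbitrary picks, not from the assignment):

```python
import math

def gauss_density(x, u, s):
    # Transition density pi(x, s | u) of the Wiener process.
    return math.exp(-(x - u) ** 2 / (2 * s)) / math.sqrt(2 * math.pi * s)

u, s, t = 1.5, 0.7, 2.0   # illustrative values

# Riemann-sum integration over a range wide enough that the
# Gaussian tails outside it are negligible.
lo = u - 12 * math.sqrt(s)
hi = u + 12 * math.sqrt(s)
steps = 200_000
dx = (hi - lo) / steps
integral = sum(
    gauss_density(lo + i * dx, u, s) * ((lo + i * dx) ** 2 - (t + s))
    for i in range(steps + 1)
) * dx

print(integral, u * u - t)   # both close to 0.25
```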
Question 5

With $B_t = W_t - t W_1$, we can write, for fixed $b_j$,

$$B = \sum_{j=1}^n b_j B_{t_j} = \sum_{j=1}^n b_j \left(W_{t_j} - t_j W_1\right) = \sum_{j=1}^n b_j W_{t_j} - \left(\sum_{j=1}^n b_j t_j\right) W_1.$$

A sum (and difference) of jointly Gaussian variables is Gaussian, so $B$ is Gaussian. This means that any finite linear combination of the $B_{t_j}$ is Gaussian. So the collection $(B_{t_1}, \dots, B_{t_n})$ is multivariate Gaussian and $B_t$ is a Gaussian process. The mean function is

$$E(B_t) = E(W_t) - t\, E(W_1) = 0,$$

and for $s \le t$ the autocovariance function is given by

$$\text{Cov}(B_t, B_s) = E[(W_t - t W_1)(W_s - s W_1)] = E(W_t W_s) + st\, E(W_1^2) - t\, E(W_1 W_s) - s\, E(W_1 W_t) = s + st - ts - st = s(1 - t).$$

Question 6

For simplicity, let

$$B_k = W_{k 2^{-n} t} - W_{(k-1) 2^{-n} t}.$$

Since $W_t$ is a standard Wiener process, the $B_k$ are iid Gaussian$(0, 2^{-n} t)$. Then by the Law of Large Numbers,

$$\sum_{k=1}^{2^n} |B_k| \approx 2^n\, E|B_k| = 2^n \sqrt{\frac{2 \cdot 2^{-n} t}{\pi}} = \left(\frac{2^{n+1} t}{\pi}\right)^{1/2} \to \infty \quad \text{as } n \to \infty.$$

So it is proven that $\sum_{k=1}^{2^n} |B_k| \to \infty$.
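A simulation sketch of the bridge covariance (not part of the solution; the grid size, sample size, and the pair $s = 0.3$, $t = 0.7$ are illustrative): build $W$ on a grid from Gaussian increments, form $B_t = W_t - t W_1$, and compare the sample covariance with $s(1 - t)$.

```python
import math
import random

random.seed(2)

m = 100                 # grid points on [0, 1]
dt = 1.0 / m
trials = 20_000
s_idx, t_idx = 30, 70   # check Cov(B_s, B_t) at s = 0.3, t = 0.7

acc = 0.0
for _ in range(trials):
    w = 0.0
    ws = wt = 0.0
    for i in range(1, m + 1):
        w += random.gauss(0.0, math.sqrt(dt))   # Wiener increments
        if i == s_idx:
            ws = w
        if i == t_idx:
            wt = w
    w1 = w                                      # W_1, the endpoint
    bs, bt = ws - 0.3 * w1, wt - 0.7 * w1       # bridge values B_s, B_t
    acc += bs * bt

cov_hat = acc / trials
print(cov_hat, 0.3 * (1 - 0.7))   # expect s(1 - t) = 0.09
```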
Now let $\Delta_{nk}$ and $U_{nk}$ be as defined in the assignment. Equation (7) in the assignment can be rewritten as

$$\lim_{n \to \infty} \sum_{k=1}^{2^n} \left(U_{nk} + t 2^{-n}\right) = t, \quad \text{or equivalently} \quad \sum_{k=1}^{2^n} U_{nk} \to 0.$$

The following are true:

$$E(U_{nk}) = E\left(\Delta_{nk}^2 - t 2^{-n}\right) = 0 \quad \text{(recall the } \Delta_{nk} \text{ are iid Gaussian}(0, 2^{-n} t)\text{)},$$

$$E(U_{nk}^2) = E\left(\Delta_{nk}^4 - 2 t 2^{-n} \Delta_{nk}^2 + t^2 2^{-2n}\right) = 3 t^2 2^{-2n} - 2 t^2 2^{-2n} + t^2 2^{-2n} = 2 t^2 2^{-2n} \to 0 \text{ as } n \to \infty,$$

and so $L^2$ convergence follows. The last equality follows from the hint. By Chebyshev's inequality, we have that

$$P\left(\left|\sum_{k=1}^{2^n} U_{nk}\right| > \epsilon\right) \le \frac{1}{\epsilon^2}\, E\left[\left(\sum_{k=1}^{2^n} U_{nk}\right)^2\right] = \frac{2^n \cdot 2 t^2 2^{-2n}}{\epsilon^2} = \frac{2 t^2}{\epsilon^2 2^n} \quad \text{(due to the iid property)},$$

so

$$\sum_{n=1}^{\infty} P\left(\left|\sum_{k=1}^{2^n} U_{nk}\right| > \epsilon\right) \le \sum_{n=1}^{\infty} \frac{2 t^2}{\epsilon^2 2^n} < \infty.$$

Finally, apply the Borel-Cantelli Lemma:

$$P\left(\left|\sum_{k=1}^{2^n} U_{nk}\right| > \epsilon \text{ i.o.}\right) = 0,$$

and so for large $n$,

$$P\left(\left|\sum_{k=1}^{2^n} U_{nk}\right| < \epsilon\right) = 1 \implies P\left(\lim_{n \to \infty} \left|\sum_{k=1}^{2^n} U_{nk}\right| < \epsilon\right) = 1.$$

Then we have that $P\left(\lim_{n \to \infty} \sum_{k=1}^{2^n} U_{nk} = 0\right) = 1$, that is, $\sum_{k=1}^{2^n} U_{nk} \xrightarrow{\text{a.s.}} 0$.
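The $L^2$ bound says the dyadic quadratic variation $\sum_k \Delta_{nk}^2$ concentrates around $t$ quickly as $n$ grows. A one-path seeded sketch (illustrative $t = 1$ and level $n = 12$, not part of the solution):

```python
import math
import random

random.seed(3)

t = 1.0
n = 12          # dyadic level; 2^n increments
k = 2 ** n

# One sample path's dyadic increments Delta_nk ~ Gaussian(0, 2^-n t);
# the sum of U_nk = Delta_nk^2 - t 2^-n should be near 0.
deltas = [random.gauss(0.0, math.sqrt(t / k)) for _ in range(k)]
qv = sum(d * d for d in deltas)   # quadratic variation estimate
u_sum = qv - t                    # = sum over k of U_nk

print(qv, u_sum)   # qv close to 1.0, u_sum close to 0
```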
Question 7

$$P\left(\max_{0 \le s \le T} W_s \ge c\right) = 1 - P\left(\max_{0 \le s \le T} W_s < c\right) = 1 - P(W_T < c \text{ and } W \text{ failed to pass } c) = 1 - \left[P(W_T < c) - P(W_T < c \text{ and } W \text{ passed } c)\right] = 1 - \left[P(W_T < c) - P(W_T > c)\right] = 1 - \left[\Phi\left(\frac{c}{\sqrt{T}}\right) - \left(1 - \Phi\left(\frac{c}{\sqrt{T}}\right)\right)\right] = 2\left[1 - \Phi\left(\frac{c}{\sqrt{T}}\right)\right].$$

The third-to-last equality, $P(W_T < c \text{ and } W \text{ passed } c) = P(W_T > c)$, follows from the reflection principle (reflect the path after it first hits $c$, starting from $W_0 = 0$).

Question 8

For the first case, let $W$ be the waiting time, so that $W \ge 0$. It can easily be shown, using the basic definition of expected value for continuous random variables, that $E(W) = 2$.

For the second case, the mean measure is Leb, and for the waiting time $W$,

$$P(W > t) = P(N(A) = 0 \text{ where } A = (0, t)) = \exp(-\text{Leb}(A)) = \exp(-t).$$

This gives the kernel of an Exponential(1), and so $E(W) = 1$.
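The closing formula for Question 7 can be checked against a discretized Brownian path; note the discrete-time maximum slightly undershoots the continuous one, so the empirical probability sits a bit below $2[1 - \Phi(c/\sqrt{T})]$. A sketch with illustrative $c = T = 1$ (not part of the solution):

```python
import math
import random

random.seed(4)

T, c = 1.0, 1.0
steps = 500
dt = T / steps
trials = 5_000

exceed = 0
for _ in range(trials):
    w, running_max = 0.0, 0.0
    for _ in range(steps):
        w += random.gauss(0.0, math.sqrt(dt))
        running_max = max(running_max, w)
    # Discretization misses excursions between grid points, biasing
    # this count slightly downward relative to the continuous maximum.
    exceed += (running_max >= c)

p_hat = exceed / trials
phi = 0.5 * (1 + math.erf(c / math.sqrt(T) / math.sqrt(2)))  # Phi(c/sqrt(T))
p_formula = 2 * (1 - phi)
print(p_hat, p_formula)   # formula gives about 0.3173
```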
Question 9

Fix some arbitrary point and let $D$ be the distance from the point to the closest tree. Further, let $A_r$ be the disc with radius $r$ centered at the fixed point. Then

$$\lambda\, \text{Leb}(A_r) = \pi r^2 \lambda \implies N(A_r) \sim \text{Poisson}(\pi r^2 \lambda).$$

Then we have that

$$P(D > r) = P(N(A_r) = 0) = \exp(-\pi r^2 \lambda).$$

One-minus it and we get the CDF of $D$.

For the second part, it's typical to treat the line of sight as a rectangle with arbitrary length $l$ and width $2a$. If there is a tree in this rectangular area, call it $A$, the view to the east is blocked. So we have

$$\lambda\, \text{Leb}(A) = 2 a l \lambda \implies N(A) \sim \text{Poisson}(2 a l \lambda).$$

Thus

$$P(D > l) = P(N(A) = 0) = \exp(-2 a l \lambda) \implies D \sim \text{Exponential}(2 a \lambda),$$

and so $E(D) = (2 a \lambda)^{-1}$.

Question 10

$$P(N(A_1) = a_1, \dots, N(A_k) = a_k) = P(N(A_1) = a_1) \prod_{i=2}^k P\left(N(A_i) = a_i \,\middle|\, \bigcap_{j=1}^{i-1} \{N(A_j) = a_j\}\right) = P(N(A_1) = a_1) \prod_{i=2}^k P\left(N(A_i) - N(A_{i-1}) = a_i - a_{i-1}\right) = P(N(A_1 + t) = a_1) \prod_{i=2}^k P\left(N(A_i + t) - N(A_{i-1} + t) = a_i - a_{i-1}\right) = P(N(A_1 + t) = a_1) \prod_{i=2}^k P\left(N(A_i + t) = a_i \,\middle|\, \bigcap_{j=1}^{i-1} \{N(A_j + t) = a_j\}\right) = P(N(A_1 + t) = a_1, \dots, N(A_k + t) = a_k).$$

The second and third equalities follow from independent increments and stationary increments, respectively.
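A sketch of the forest model (plain Python, not part of the solution; $\lambda = 1$, box half-width 3, and $r = 1$ are illustrative, and the box is large enough that edge effects are negligible at $r = 1$): scatter a Poisson number of trees uniformly and compare the empirical $P(D > r)$ with $\exp(-\pi r^2 \lambda)$.

```python
import math
import random

random.seed(5)

lam = 1.0      # trees per unit area
L = 3.0        # half-width of the box around the fixed point (the origin)
r = 1.0        # check P(D > r) = exp(-pi r^2 lam)
trials = 20_000

def sample_poisson(mean):
    # Knuth's algorithm: multiply uniforms until the product
    # drops below exp(-mean); the count is Poisson(mean).
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

far = 0
for _ in range(trials):
    n_trees = sample_poisson(lam * (2 * L) ** 2)
    d2_min = float("inf")
    for _ in range(n_trees):
        x, y = random.uniform(-L, L), random.uniform(-L, L)
        d2_min = min(d2_min, x * x + y * y)
    far += (d2_min > r * r)

p_hat = far / trials
print(p_hat, math.exp(-math.pi * r * r * lam))   # about 0.0432
```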
Question 11

Given regions $A$ and $B$, this measure characterizes the process on overlapping parts of $A$ and $B$. In the stationary case, it is 0. It is also 0 for the Poisson process (with $A$ and $B$ disjoint). To see this, we know that

$$\text{Cov}(N(A), N(B)) = \tfrac{1}{2}\left[\text{Var}(N(A) + N(B)) - \text{Var}(N(A)) - \text{Var}(N(B))\right].$$

We know $N(A)$ and $N(B)$ are Poisson with rates $\lambda\, \text{Leb}(A)$ and $\lambda\, \text{Leb}(B)$ respectively. Now use the mgfs of $N(A)$ and $N(B)$ to derive the mgf of $N(A) + N(B)$; it turns out to be the mgf of a Poisson$(\lambda\, \text{Leb}(A) + \lambda\, \text{Leb}(B))$. Since the variance of a Poisson equals its rate,

$$\text{Cov}(N(A), N(B)) = \tfrac{1}{2}\left[\lambda\, \text{Leb}(A) + \lambda\, \text{Leb}(B) - \lambda\, \text{Leb}(A) - \lambda\, \text{Leb}(B)\right] = 0.$$

Question 12

Suppose the Poisson random measure has mean measure $\lambda\, \text{Leb}(\cdot)$; then the avoidance measure is given by

$$V(A) = P(N(A) = 0) = e^{-\lambda\, \text{Leb}(A)}.$$

Question 13

Define $A_t = \{\text{points } p : \|p\| \le t\}$ and denote the overall intensity by $\lambda$. By the definition of a homogeneous Poisson process, $N(A_t)$ is Poisson with rate $\lambda V_{A_t}$, where $V_{A_t}$ is the volume of $A_t$. But

$$V_{A_t} = v_d t^d.$$

Thus

$$K(t) = \frac{E[N(A_t)]}{\lambda} = \frac{\lambda v_d t^d}{\lambda} = v_d t^d.$$
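The zero-covariance claim for disjoint regions is easy to simulate: generate one Poisson process on $(0, 1)$ per trial, count the points falling in two disjoint intervals, and look at the sample covariance. A sketch (not part of the solution; the rate $\lambda = 5$ and the intervals are illustrative):

```python
import math
import random

random.seed(6)

lam = 5.0
trials = 30_000

def sample_poisson(mean):
    # Knuth's algorithm: multiply uniforms until the product
    # drops below exp(-mean); the count is Poisson(mean).
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Counts of one homogeneous Poisson process on two disjoint intervals
# A = (0, 0.3) and B = (0.5, 0.9): place the points, then count.
na, nb = [], []
for _ in range(trials):
    n_pts = sample_poisson(lam)                   # points of N on (0, 1)
    pts = [random.random() for _ in range(n_pts)]
    na.append(sum(0.0 < x < 0.3 for x in pts))
    nb.append(sum(0.5 < x < 0.9 for x in pts))

mean_a = sum(na) / trials        # expect lam * 0.3 = 1.5
mean_b = sum(nb) / trials        # expect lam * 0.4 = 2.0
cov_hat = sum(x * y for x, y in zip(na, nb)) / trials - mean_a * mean_b
print(cov_hat)   # close to 0 for disjoint A and B
```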