FE 5204 Stochastic Differential Equations

1 Instructor: Jim Zhu zhu/ January 20, 2009

2 Preliminaries for dealing with continuous random processes. Brownian motions. Our main reference for this lecture is Chapter 2 of the textbook.

3 Finite probability space. In the coin-tossing example with finitely many tosses, Ω is a finite sample space. The algebra F_i represents the information one has after the i-th game, and the algebra F = ∨_{i=1}^n F_i represents the information after all n games. Each subset A ∈ F represents a particular event, whose probability is P(A). The triple (Ω, F, P) is called a probability space. In our example F = 2^Ω.

4 σ-algebra. If we toss infinitely many times, Ω becomes an infinite set. We need F = ∨_i F_i to be closed under unions of infinite sequences of sets in F. Such an F is called a σ-algebra. Note that, by de Morgan's law, F is then also closed under intersections of infinite sequences.

5 σ-algebra: definition. Let Ω be a sample space and let F be a collection of subsets of Ω. We say F is a σ-algebra provided that (i) for any A ∈ F, A^c = Ω \ A ∈ F, and (ii) for any sequence A_i ∈ F, ∪_{i=1}^∞ A_i ∈ F. This is a mathematical model for information.

6 Let Ω be a sample space and let F be a σ-algebra of subsets (events) of Ω. We call (Ω, F) a measurable space. A function P : F → [0, 1] is called a probability measure if P(∅) = 0, P(Ω) = 1 and, for any sequence of disjoint sets A_i ∈ F, $P\bigl(\bigcup_{i=1}^\infty A_i\bigr) = \sum_{i=1}^\infty P(A_i)$. The triple (Ω, F, P) is called a probability space.

7 Almost surely. If an event A ∈ F satisfies P(A) = 1, we say A happens almost surely, or a.s. for brevity.

8 σ-algebra generated by a class of sets. Let Ω be a sample space and let U be a collection of subsets of Ω. We use σ(U) to denote the σ-algebra generated by U, i.e. the smallest σ-algebra containing U. This σ-algebra can be constructed as $\sigma(U) = \bigcap\{A \supset U : A \text{ is a } \sigma\text{-algebra}\}$ (Exercise 2.3).

9 Examples. Let Ω = {1, 2, 3}. Then F = {∅, Ω, {1}, {2, 3}} is a σ-algebra, and G = {∅, Ω, {2}} is not. In fact, σ(G) = {∅, Ω, {2}, {1, 3}}. What is σ({{2, 3}})?
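For a finite Ω, the generated σ-algebra can be found by brute force, which makes the definition concrete. A minimal Python sketch (my own illustration, not part of the lecture) that closes a family of subsets under complements and unions:

```python
def generated_sigma_algebra(omega, generators):
    """Smallest family of subsets of a finite omega that contains the
    generators and is closed under complements and unions; for a finite
    omega this is exactly sigma(generators)."""
    omega = frozenset(omega)
    family = {frozenset(), omega} | {frozenset(g) for g in generators}
    changed = True
    while changed:
        changed = False
        for a in list(family):
            if omega - a not in family:              # closure under complement
                family.add(omega - a); changed = True
            for b in list(family):
                if a | b not in family:              # closure under union
                    family.add(a | b); changed = True
    return family

# sigma({{2,3}}) on Omega = {1, 2, 3}
print(sorted(map(sorted, generated_sigma_algebra({1, 2, 3}, [{2, 3}]))))
# -> [[], [1], [1, 2, 3], [2, 3]]
```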

10 Examples: Borel σ-algebra. Let U be the collection of all open sets in R^n. Then B := σ(U) is called the Borel σ-algebra.

11 Random variable. Let (Ω, F, P) be a probability space. A mapping X : Ω → R^n that is measurable with respect to F, i.e., X^{-1}(A) ∈ F for any open set A in R^n, is called a random variable. For a random variable that takes only countably many values a_i, i = 1, 2, ..., one only needs to check that X^{-1}(a_i) ∈ F (Exercise 2.1(a)).

12 Examples. Let Ω = {1, 2, 3} and F = {∅, Ω, {1}, {2, 3}}. Define Y(1) = 1, Y(2) = 0 and Y(3) = 1; then Y is not a random variable on (Ω, F). Define Z(1) = 1 and Z(2) = Z(3) = 0; then Z is a random variable.

13 σ-algebra generated by a random variable. Let X : Ω → R^n be a random variable and let B be the Borel σ-algebra. Then σ(X) = {X^{-1}(F) : F ∈ B} is a σ-algebra, called the σ-algebra generated by X.

14 Doob-Dynkin Lemma. The σ-algebra generated by X describes the information contained in X. The Doob-Dynkin Lemma describes the information relationship between two random variables: Let X, Y : Ω → R^n be two functions. Then Y is σ(X)-measurable if and only if there exists a Borel measurable function g : R^n → R^n (i.e., g^{-1}(A) ∈ B for any A ∈ B) such that Y = g(X).

15 Simple random variables. Let (Ω, F, P) be a probability space. A random variable X on Ω is simple if it has only a finite number of outcomes. Equivalently, $X(\omega) = \sum_{i=1}^k x_i \chi_{F_i}(\omega)$, where the F_i are disjoint sets in F.

16 Probability integration of simple random variables. Let X be a simple random variable on Ω given by $X(\omega) = \sum_{i=1}^k x_i \chi_{F_i}(\omega)$, where the F_i are disjoint sets in F. Then $\int_\Omega X(\omega)\,dP(\omega) := \sum_{i=1}^k x_i P(F_i)$.
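As a concrete illustration of this definition (a toy example of my own, not from the lecture), the integral of a simple random variable on a finite probability space reduces to a finite sum:

```python
from fractions import Fraction

# Sample space: two fair coin tosses (an assumed toy example).
omega = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]
P = {w: Fraction(1, 4) for w in omega}          # uniform probability measure

# X counts the number of heads; it is simple because it takes finitely
# many values x_i in {0, 1, 2}, with F_i = X^{-1}(x_i).
def X(w):
    return sum(1 for toss in w if toss == "H")

values = sorted({X(w) for w in omega})
# Integral of X over Omega = sum_i x_i * P(F_i)
integral = sum(x * sum(P[w] for w in omega if X(w) == x) for x in values)
print(integral)   # 1, i.e. E[X] = 0*1/4 + 1*1/2 + 2*1/4
```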

17 Probability integration of general random variables. Let X be a random variable on Ω. Suppose that there exists a sequence of simple random variables X_j such that X_j → X with probability 1 on Ω and $\int_\Omega X_j(\omega)\,dP(\omega)$ converges. Then we define $\int_\Omega X(\omega)\,dP(\omega) := \lim_{j\to\infty} \int_\Omega X_j(\omega)\,dP(\omega)$. (This gives a way of calculation; use it for Exercise 2.1(b),(c),(d).)

18 Technicalities. There are usually many sequences of simple random variables X_j such that X_j → X with probability 1. To justify the above definition one has to show that if $\int_\Omega X_j(\omega)\,dP(\omega)$ converges for one such sequence, then all the other sequences lead to integrals that converge to the same limit.

19 Expectation. When $\int_\Omega |X(\omega)|\,dP(\omega) < \infty$, the number $E[X] = \int_\Omega X(\omega)\,dP(\omega)$ is called the expectation of X.

20 Expectation. The expectation can also be calculated by $E[X] = \int_{R^n} x\,d\mu_X(x)$, where $\mu_X(B) = P(X^{-1}(B))$ for any Borel set B. This is a generalization of the concept of expectation when Ω is finite. We can also use this formula for Exercise 2.1(b),(c),(d).

21 Variance. To generalize variance we need the space $L^2(P) := \{X : \Omega \to R^n : \int_\Omega |X(\omega)|^2\,dP(\omega) < \infty\}$, with the inner product $\langle X, Y \rangle = E[XY]$, X, Y ∈ L^2(P). For X, Y ∈ L^2(P), their covariance is cov(X, Y) = E[(X − E[X])(Y − E[Y])]. For X ∈ L^2(P), its variance is Var(X) = cov(X, X). Again, note that here we need an additional technical assumption to restrict the range in which the concept is applicable.

22 Independence. (Two events) Events A, B ∈ F are independent if P(A ∩ B) = P(A)P(B). (Multiple events) Events A_i ∈ F, i = 1, 2, ..., are independent if $P(A_{i_1} \cap A_{i_2} \cap \dots \cap A_{i_k}) = P(A_{i_1})P(A_{i_2})\cdots P(A_{i_k})$ for every finite subfamily. (Multiple collections of events) Collections of events $\mathcal{A}_i \subset F$, i = 1, 2, ..., are independent if, for all choices $A_{i_j} \in \mathcal{A}_{i_j}$, $P(A_{i_1} \cap A_{i_2} \cap \dots \cap A_{i_k}) = P(A_{i_1})P(A_{i_2})\cdots P(A_{i_k})$.

23 Independence of random variables. Two random variables X and Y are independent if σ(X) and σ(Y) are independent. For X, Y ∈ L^2(P), this implies that E[XY] = E[X]E[Y], or equivalently cov(X, Y) = 0. (The condition can be weakened, but we will not pursue the technical details.) Random variables X_i, i = 1, 2, ..., are independent if any finite subfamily of σ(X_i), i = 1, 2, ..., is independent.
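A quick Monte Carlo illustration of the last point (parameters and sample sizes are my own choices): independence forces zero covariance, but zero covariance does not force independence.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Independent X, Y: empirical covariance should be close to 0.
X = rng.standard_normal(n)
Y = rng.standard_normal(n)
print(np.cov(X, Y)[0, 1])          # ~ 0

# Uncorrelated but dependent: Z = X^2 - 1 is a function of X,
# yet cov(X, Z) = E[X^3] - E[X]E[X^2 - 1] = 0 for X ~ N(0, 1).
Z = X**2 - 1
print(np.cov(X, Z)[0, 1])          # ~ 0 even though Z = g(X)
```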

24 A continuous stochastic process. Let (Ω, F, P) be a probability space and let [0, T] be an interval. (X_t), t ∈ [0, T], is a stochastic process if, for every t, X_t is a random variable on (Ω, F, P).

25 Path. For each fixed ω ∈ Ω, the map t ↦ X_t(ω) is called a path of the stochastic process (X_t).

26 As a two-variable function. A stochastic process (X_t) can also be viewed as a function of two variables, (t, ω) ↦ X_t(ω) = X(t, ω). Usually we will need certain properties of this function. We often assume that every path is continuous, which ensures the joint measurability of X(t, ω).

27 General filtration. Let (Ω, F, P) be a probability space and let [0, T] be an interval. (F_t), t ∈ [0, T], is a filtration if, for every t, F_t ⊂ F is a σ-algebra and, for any s < t, F_s ⊂ F_t. Again, F_t represents the information available up to time t.

28 Adapted stochastic process. Let (F_t), t ∈ [0, T], be a filtration on a probability space (Ω, F, P). We say a stochastic process (X_t) is (F_t)-adapted provided that, for every t, X_t is F_t-measurable. Note: there is no corresponding concept for "predictable" here.

29 Filtration generated by a stochastic process. Let (X_t), t ∈ [0, T], be a stochastic process on a probability space (Ω, F, P). Define $F_t = \sigma(\{X_s^{-1}(F) : s \in [0, t],\ F \in B\})$, where B is the collection of all Borel sets in R^n. Then (F_t) is a filtration and (X_t) is (F_t)-adapted.

30 History. 1. Named after the Scottish botanist Robert Brown, who in 1828 observed this motion in pollen suspended in liquid. 2. Louis Bachelier used it to model the stock market in 1900, wrongly allowing negative stock prices. 3. Paul Samuelson gave the widely used geometric Brownian motion model for stock price movements in 1965, which does not allow any price jump. 4. All models are wrong. Some are wronger than others.

31 Definition of 1-dimensional Brownian motion (Wiener process). A stochastic process {B_t : t ∈ [0, T]} is called a Brownian motion starting from x if (1) B_0 = x; (2) for 0 ≤ t_1 < t_2 < ... < t_k ≤ T, the random variables B_{t_2} − B_{t_1}, B_{t_3} − B_{t_2}, ..., B_{t_k} − B_{t_{k-1}} are independent; (3) for 0 ≤ s ≤ t ≤ T, B_t − B_s has a Gaussian distribution with mean 0 and variance t − s; (4) for ω in a set of probability one, the path t ↦ B_t(ω) is continuous.
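The definition translates directly into simulation: on a time grid, independent N(0, Δt) increments summed up give an approximate Brownian path. A minimal sketch, with grid size and seed chosen arbitrarily by me:

```python
import numpy as np
import matplotlib.pyplot as plt

# Approximate a standard Brownian motion on [0, T] by cumulative sums of
# independent N(0, dt) increments, per properties (1)-(3) of the definition;
# the piecewise-linear paths drawn here are continuous by construction.
T, n_steps, n_paths = 1.0, 1000, 5
dt = T / n_steps
rng = np.random.default_rng(1)

t = np.linspace(0.0, T, n_steps + 1)
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1)

plt.plot(t, B.T)
plt.xlabel("t"); plt.ylabel("B_t")
plt.title("Simulated standard Brownian motion paths (x = 0)")
plt.show()
```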

32 Definition of multi-dimensional Brownian motion. A vector stochastic process {B_t : t ∈ [0, T]} in R^n is called a Brownian motion starting from x = (x_1, x_2, ..., x_n) if B_t = (B_t^1, B_t^2, ..., B_t^n), where B_t^i, i = 1, 2, ..., n, are independent 1-dimensional Brownian motions starting from x_i. If x = 0, we say (B_t) is a standard Brownian motion. A Brownian motion starting from x can be viewed as a standard Brownian motion shifted by x.

33 Existence. The existence of a Brownian motion needs to be justified. By and large, there are two ways to do it: by construction (pioneered by Wiener; see e.g. Steele's book), or by Kolmogorov's extension theorem (if time permits; otherwise see e.g. [T]).

34 Uniqueness. Brownian motions are not uniquely defined, but their effects are equivalent. We usually pick a convenient version.

35 Gaussian distribution. Let v = [v_1, ..., v_d] be a d-dimensional random vector. We define its mean vector and covariance matrix by $\mu = E[v] = [\mu_1, \dots, \mu_d]$ and $\Sigma = \begin{pmatrix} \sigma_{11} & \sigma_{12} & \cdots & \sigma_{1d} \\ \sigma_{21} & \sigma_{22} & \cdots & \sigma_{2d} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{d1} & \sigma_{d2} & \cdots & \sigma_{dd} \end{pmatrix}$, where $\sigma_{ij} = \mathrm{cov}(v_i, v_j)$.

36 Gaussian distribution. A d-dimensional random vector v is Gaussian with mean μ and covariance matrix Σ if the density of v is given by $f(x) = (2\pi)^{-d/2}(\det\Sigma)^{-1/2}\exp\left(-\tfrac{1}{2}(x-\mu)^\top\Sigma^{-1}(x-\mu)\right)$.
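As a sanity check on the density formula (an illustration of my own), one can evaluate it directly and compare with a library implementation:

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
x = np.array([0.5, -1.0])

d = len(mu)
diff = x - mu
quad = diff @ np.linalg.solve(Sigma, diff)       # (x-mu)^T Sigma^{-1} (x-mu)
f = (2 * np.pi) ** (-d / 2) * np.linalg.det(Sigma) ** (-0.5) * np.exp(-0.5 * quad)

print(f)
print(multivariate_normal(mean=mu, cov=Sigma).pdf(x))   # should match
```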

37 Two simple and important facts about the Gaussian distribution. A Gaussian random vector is completely determined by its mean and covariance. The components of a Gaussian random vector are independent if and only if Σ is diagonal.

38 Characteristic functions. Let v be a d-dimensional random vector. Then $\varphi(\theta) = E[\exp(i\theta^\top v)]$ is the characteristic function of v, which uniquely determines the distribution of the random vector. By direct computation we have the characteristic function of a Gaussian vector: let v be a d-dimensional Gaussian random vector with mean μ and covariance matrix Σ. Then $\varphi(\theta) = E[\exp(i\theta^\top v)] = \exp\left(i\theta^\top\mu - \tfrac{1}{2}\theta^\top\Sigma\theta\right)$.
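The Gaussian characteristic function can likewise be checked numerically (an illustrative sketch; the parameters are my own):

```python
import numpy as np

# Compare the empirical characteristic function of Gaussian samples with
# exp(i*theta'mu - 0.5*theta'Sigma theta).
rng = np.random.default_rng(4)
mu = np.array([1.0, -1.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
samples = rng.multivariate_normal(mu, Sigma, size=500_000)

theta = np.array([0.4, -0.2])
empirical = np.mean(np.exp(1j * (samples @ theta)))
exact = np.exp(1j * (theta @ mu) - 0.5 * theta @ Sigma @ theta)
print(empirical, exact)      # two nearly equal complex numbers
```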

39 Gaussian characterization. Let v be a d-dimensional random vector. Then v is Gaussian if and only if, for every θ ∈ R^d, θ^⊤v is a univariate Gaussian random variable.

40 Gaussian process. Let (X_t) be a stochastic process. If, for any 0 ≤ t_1 < t_2 < ... < t_k, (X_{t_1}, X_{t_2}, ..., X_{t_k}) is a k-dimensional Gaussian random vector, then we say (X_t) is a Gaussian process. A Gaussian process is completely determined by its mean and covariance functions μ(t) = E[X_t] and f(s, t) = cov(X_s, X_t).

41 Covariance of a Brownian motion. A standard 1-dimensional Brownian motion (B_t) is a well-behaved Gaussian process. For s < t, its covariance is cov(B_s, B_t) = E[(B_t − B_s + B_s)B_s] = E[(B_t − B_s)B_s] + E[B_s^2] = s, using the independence of the increments. In general, cov(B_s, B_t) = s ∧ t.
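This covariance identity is easy to check by simulation, using the independent-increment construction (sample size and seed are my own choices):

```python
import numpy as np

# For a standard BM, cov(B_s, B_t) should be close to min(s, t).
rng = np.random.default_rng(2)
n_paths, s, t = 500_000, 0.3, 0.7

B_s = rng.normal(0.0, np.sqrt(s), n_paths)              # B_s ~ N(0, s)
B_t = B_s + rng.normal(0.0, np.sqrt(t - s), n_paths)    # add an independent increment

print(np.mean(B_s * B_t))   # ~ 0.3 = min(s, t)
```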

42 Covariance of a Brownian motion. The converse is also true: Let (X_t) be a Gaussian process with continuous paths and E[X_t] = 0 for all t ∈ [0, T]. Then (X_t) is a standard Brownian motion if and only if cov(X_s, X_t) = s ∧ t. Proof. We need to show that X_{t_2} − X_{t_1}, X_{t_3} − X_{t_2}, ..., X_{t_k} − X_{t_{k-1}} are independent. This can be done by expanding, for i < j, $E[(X_{t_i} - X_{t_{i-1}})(X_{t_j} - X_{t_{j-1}})] = E[X_{t_i}X_{t_j}] - E[X_{t_i}X_{t_{j-1}}] - E[X_{t_{i-1}}X_{t_j}] + E[X_{t_{i-1}}X_{t_{j-1}}] = t_i - t_i - t_{i-1} + t_{i-1} = 0$; since the increments are jointly Gaussian, zero covariance implies independence.

43 Higher moments. For a standard 1-dimensional Brownian motion (B_t), we can also use its density function to calculate higher moments directly: $E[(B_t - B_s)^3] = 0$ and $E[(B_t - B_s)^4] = 3(t - s)^2$. Alternatively, one can use the method in Exercise 2.8.
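A quick numerical check of these moment formulas (illustrative only; the parameters are mine):

```python
import numpy as np

# For s < t, B_t - B_s ~ N(0, t - s), so the third moment is 0 and the
# fourth moment is 3*(t - s)^2.
rng = np.random.default_rng(3)
s, t, n = 0.2, 1.0, 1_000_000
increments = rng.normal(0.0, np.sqrt(t - s), n)

print(np.mean(increments**3))             # ~ 0
print(np.mean(increments**4))             # ~ 3*(t - s)^2 = 1.92
```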

44 Multi-dimensional generalizations. Let (B_t) be an n-dimensional standard Brownian motion. Then cov(B_s, B_t) = E[B_s · B_t] = n(s ∧ t). In particular, $E[|B_t - B_s|^2] = n(t - s)$, and for the higher moments, $E[|B_t - B_s|^4] = n(n + 2)(t - s)^2$.
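The n-dimensional moment formulas can be checked the same way (again an illustration with parameters of my own choosing):

```python
import numpy as np

# For an n-dimensional standard BM, E|B_t - B_s|^2 = n(t - s) and
# E|B_t - B_s|^4 = n(n + 2)(t - s)^2.
rng = np.random.default_rng(5)
n_dim, s, t, n_samples = 3, 0.4, 1.0, 1_000_000
inc = rng.normal(0.0, np.sqrt(t - s), size=(n_samples, n_dim))
sq_norm = np.sum(inc**2, axis=1)

print(np.mean(sq_norm))      # ~ n(t - s) = 1.8
print(np.mean(sq_norm**2))   # ~ n(n + 2)(t - s)^2 = 5.4
```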

45 Finite dimensional distribution. For a stochastic process (X_t) and times t_i, i = 1, 2, ..., k, $\mu_{t_1,t_2,\dots,t_k}(F_1 \times F_2 \times \dots \times F_k) := P(X_{t_1} \in F_1, \dots, X_{t_k} \in F_k)$ defines a measure on R^{nk}, called a finite dimensional distribution of (X_t). Here F_1, ..., F_k are Borel sets in R^n.

46 Kolmogorov's extension theorem: conditions. For all t_i, i = 1, 2, ..., k, let $\nu_{t_1,t_2,\dots,t_k}$ be probability measures on R^{nk} such that
$\nu_{t_{\sigma(1)},t_{\sigma(2)},\dots,t_{\sigma(k)}}(F_1 \times F_2 \times \dots \times F_k) = \nu_{t_1,t_2,\dots,t_k}(F_{\sigma^{-1}(1)} \times F_{\sigma^{-1}(2)} \times \dots \times F_{\sigma^{-1}(k)})$   (1)
for all permutations σ of {1, 2, ..., k}, and
$\nu_{t_1,t_2,\dots,t_k}(F_1 \times F_2 \times \dots \times F_k) = \nu_{t_1,\dots,t_k,t_{k+1},\dots,t_{k+m}}(F_1 \times F_2 \times \dots \times F_k \times R^n \times \dots \times R^n)$   (2)
for all m ≥ 1.

47 Kolmogorov's extension theorem: conclusion. Then there exist a probability space (Ω, F, P) and an R^n-valued stochastic process (X_t) on (Ω, F, P) such that $\nu_{t_1,t_2,\dots,t_k}(F_1 \times F_2 \times \dots \times F_k) = P(X_{t_1} \in F_1, \dots, X_{t_k} \in F_k)$ for all t_i and all Borel sets F_i.

48 Finite dimensional distributions of a Brownian motion. For x, y ∈ R^n and t > 0, define $p(t, x, y) = (2\pi t)^{-n/2}\exp\left(-\frac{|x - y|^2}{2t}\right)$, and set p(0, x, y) = δ_x(y). For increasing t_i, i = 1, 2, ..., k, define
$\nu_{t_1,t_2,\dots,t_k}(F_1 \times F_2 \times \dots \times F_k) = \int_{F_1 \times F_2 \times \dots \times F_k} p(t_1, x, x_1)\,p(t_2 - t_1, x_1, x_2)\cdots p(t_k - t_{k-1}, x_{k-1}, x_k)\,dx_1\cdots dx_k.$
Extend this definition to arbitrary t_i, i = 1, 2, ..., k, using the permutation rule in Kolmogorov's theorem. Rule (2) then holds automatically because $\int_{R^n} p(t, x, y)\,dy = 1$.
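In the 1-dimensional case starting from x = 0, the product of transition densities above is exactly the density of the Gaussian vector (B_{t_1}, ..., B_{t_k}) with covariance matrix min(t_i, t_j), which can be verified numerically (an illustrative sketch with my own choice of times and evaluation point):

```python
import numpy as np
from scipy.stats import multivariate_normal

def p(t, x, y):
    # 1-d Brownian transition density
    return (2 * np.pi * t) ** -0.5 * np.exp(-(x - y) ** 2 / (2 * t))

t = np.array([0.2, 0.5, 1.0])
x = np.array([0.1, -0.3, 0.4])          # a point (x_1, x_2, x_3)

# Density of nu_{t1,t2,t3} at x, built from the product of transition kernels.
density_from_kernels = (p(t[0], 0.0, x[0])
                        * p(t[1] - t[0], x[0], x[1])
                        * p(t[2] - t[1], x[1], x[2]))

# Density of the Gaussian vector with covariance min(t_i, t_j).
cov = np.minimum.outer(t, t)
density_gaussian = multivariate_normal(mean=np.zeros(3), cov=cov).pdf(x)

print(density_from_kernels, density_gaussian)   # should agree numerically
```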

49 Existence of a Brownian motion. Let x ∈ R^n. An n-dimensional Brownian motion starting from x exists. Proof. Apply Kolmogorov's extension theorem to the finite dimensional distributions for a Brownian motion defined above.

50 Homework is an important part of learning SDE. Homework problems are given as exercises following each lecture; those marked with * are optional. The homework for this lecture is due on Jan 27 at the beginning of the lecture. Discussions with me or classmates are encouraged, but the final work should be completed independently. I expect you to submit clear and neatly written work with careful justifications for your conclusions. [T] 2.1, 2.3, 2.8(a)(b), 2.12, 2.16, 2.18, 2.20.

More information