Fundamental Tools - Probability Theory II

MSc Financial Mathematics, The University of Warwick
September 29, 2015

Informal introduction to measurable random variables

In an example of rolling a die with Ω = {1, 2, 3, 4, 5, 6}:
- A random variable maps each outcome in Ω to a real number. E.g.
    X_1(ω) = ω   and   X_2(ω) = 1 if ω ∈ {1, 3, 5}, -1 if ω ∈ {2, 4, 6}
  are both random variables on Ω. X_1 gives the exact outcome of the roll, and X_2 is a binary variable whose value depends on whether the roll is odd or even.
- If we only have information on whether the roll is odd/even (represented by the σ-algebra F = {∅, Ω, {1, 3, 5}, {2, 4, 6}}), we can determine the value of X_2 but not X_1.
- We say X_2 is measurable w.r.t. F, but X_1 is NOT measurable w.r.t. F.

Formal definition of random variables

We wrap up the formal definition of a random variable and measurability as follows:

Definition (Measurable random variables). A random variable is a function X : Ω → R. It is said to be measurable w.r.t. F (or we say that X is a random variable w.r.t. F) if for every Borel set B ∈ B(R),
    X^{-1}(B) := {ω ∈ Ω : X(ω) ∈ B} ∈ F.

Informally, X is measurable w.r.t. F if all preimages under X can be found in F.

Examples

1. Back to our first example of rolling a die:
   - The possible preimages of X_2 are {1, 3, 5}, {2, 4, 6}, Ω and ∅. They are all in F, so X_2 is F-measurable.
   - For X_1, note for example that X_1^{-1}({6}) = {6} ∉ F. X_1 is hence not F-measurable.
2. Let Ω = {-1, 0, 1} and F = {∅, Ω, {-1, 1}, {0}}:
   - X_1(ω) := ω is NOT F-measurable. E.g. X_1^{-1}({1}) = {1} ∉ F.
   - X_2(ω) := ω^2 is F-measurable.
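
These membership checks can be carried out mechanically on a finite sample space: since X takes only finitely many values, it is enough to check that the preimage of every subset of its range lies in F. Below is a minimal Python sketch for the die example; the function and variable names are illustrative, not part of the original slides.

```python
from itertools import chain, combinations

omega = {1, 2, 3, 4, 5, 6}
F = [set(), omega, {1, 3, 5}, {2, 4, 6}]   # the "odd/even" sigma-algebra

def preimages(X, omega):
    """All preimages X^{-1}(B), B ranging over subsets of the values of X."""
    values = sorted({X(w) for w in omega})
    subsets = chain.from_iterable(combinations(values, r) for r in range(len(values) + 1))
    return [{w for w in omega if X(w) in B} for B in subsets]

def is_measurable(X, omega, F):
    return all(pre in F for pre in preimages(X, omega))

X1 = lambda w: w                        # exact outcome of the roll
X2 = lambda w: 1 if w % 2 == 1 else -1  # odd/even indicator

print(is_measurable(X1, omega, F))   # False: e.g. X1^{-1}({6}) = {6} is not in F
print(is_measurable(X2, omega, F))   # True
```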

σ-algebra generated by a random variable

With a given information set (i.e. a σ-algebra F), we have checked whether we can determine the value of a random variable (i.e. whether it is F-measurable). Conversely, given a random variable, we want to extract the information contained in it.

Revisiting the example of rolling a die:
- The value of X_1 gives the information on the exact outcome.
- The value of X_2 gives the information on odd/even.

Definition (σ-algebra generated by a r.v.). The σ-algebra generated by a random variable X, denoted by σ(X), is the smallest σ-algebra with respect to which X is measurable.

Example

It is hard (or too tedious) to write down σ(X) precisely apart from a few simple examples.

Example. In an experiment of flipping a coin twice, let Ω = {HH, HT, TH, TT} and consider the random variables
    X_1(ω) = 1 if ω ∈ {HH, HT}, -1 if ω ∈ {TH, TT};
    X_2(ω) = 2 if ω = HH, 1 if ω = HT, -1 if ω = TH, -2 if ω = TT.
Here, σ(X_1) = {Ω, ∅, {HH, HT}, {TH, TT}} and σ(X_2) = 2^Ω. In particular, σ(X_1) ⊂ σ(X_2), so X_2 is more informative than X_1.
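
For a finite Ω, σ(X) can also be built explicitly: it consists of all unions of the level sets {X = v}. The rough Python sketch below (names are my own, not from the slides) reproduces σ(X_1) and σ(X_2) for the coin-flip example.

```python
from itertools import chain, combinations

def sigma_generated_by(X, omega):
    """sigma(X) on a finite sample space: all unions of level sets {X = v}."""
    levels = [frozenset(w for w in omega if X(w) == v) for v in {X(w) for w in omega}]
    unions = chain.from_iterable(combinations(levels, r) for r in range(len(levels) + 1))
    return {frozenset().union(*blocks) for blocks in unions}

omega = {"HH", "HT", "TH", "TT"}
X1 = lambda w: 1 if w in {"HH", "HT"} else -1
X2 = {"HH": 2, "HT": 1, "TH": -1, "TT": -2}.get

s1 = sigma_generated_by(X1, omega)
s2 = sigma_generated_by(X2, omega)
print(len(s1), len(s2))   # 4 and 16 = 2^|Omega|
print(s1 <= s2)           # True: sigma(X1) is contained in sigma(X2)
```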

From probability space to distribution functions

In practice, we seldom bother working with the abstract concept of a probability space (Ω, F, P), but rather just focus on the distributional properties of a random variable X representing the random phenomenon.

For example, suppose we want to model the number of coin flips required to get the first head:
- Formally, we would write Ω = {1, 2, 3, ...}, F = 2^Ω, and let P be a probability measure satisfying P({ω : ω = k}) = (1-p)^{k-1} p for k = 1, 2, 3, ...
- In practice, we would simply let X be the number of flips required and consider P(X = k) = (1-p)^{k-1} p for k = 1, 2, 3, ...

From now on, whenever we write an expression like P(X ∈ B), imagine there is a probability space in the background: P(X ∈ B) actually means P({ω ∈ Ω : X(ω) ∈ B}).
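
A quick simulation makes this identification concrete: we never build Ω explicitly, we just sample the number of flips to the first head and compare the empirical frequencies with (1-p)^{k-1} p. Purely illustrative Python, assuming numpy is available.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_samples = 0.3, 100_000

def flips_to_first_head(rng, p):
    k = 1
    while rng.random() >= p:   # each flip is heads with probability p
        k += 1
    return k

samples = np.array([flips_to_first_head(rng, p) for _ in range(n_samples)])
for k in range(1, 6):
    empirical = np.mean(samples == k)
    theoretical = (1 - p) ** (k - 1) * p
    print(k, round(empirical, 4), round(theoretical, 4))
```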

Cumulative distribution function

For a random variable X, its cumulative distribution function (CDF) is defined as
    F(x) = P(X ≤ x),   -∞ < x < ∞.

One can check that F has the following properties:
1. F is non-decreasing and right-continuous;
2. lim_{x→∞} F(x) = 1 and lim_{x→-∞} F(x) = 0.

Conversely, if a given function F satisfies the above properties, then it is a CDF of some random variable.

Classes of random variables

We can talk about the CDF of a general random variable. But for random variables belonging to two important subclasses, it is more informative to consider:
- probability mass functions for discrete random variables;
- probability density functions for continuous random variables.

Warning: there are random variables which are neither discrete nor continuous!

A random variable X is discrete if it only takes values in a countable set S = {x_1, x_2, x_3, ...}, which is called the support of X. A discrete random variable is fully characterised by its probability mass function (p.m.f.).

Definition (Probability mass function). The probability mass function of a discrete random variable X is defined as
    p_X(x) = P(X = x),   x ∈ S.

Conversely, if a function p(x) satisfies:
1. p(x) > 0 for all x ∈ S, and p(x) = 0 for all x ∉ S;
2. Σ_{x ∈ S} p(x) = 1,
where S is some countable set, then p(·) defines a probability mass function of some discrete random variable with support S.
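
The two conditions are easy to verify for a candidate pmf stored as a dictionary; a small hedged sketch (names are illustrative only).

```python
def is_valid_pmf(p):
    """p: dict mapping the support points to their probabilities."""
    return all(v > 0 for v in p.values()) and abs(sum(p.values()) - 1.0) < 1e-12

print(is_valid_pmf({0: 0.25, 1: 0.5, 2: 0.25}))   # True
print(is_valid_pmf({0: 0.3, 1: 0.3}))             # False: probabilities sum to 0.6
```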

Expected value and variance

For a discrete r.v. X supported on S, its expected value is defined as
    E(X) = Σ_{x ∈ S} x P(X = x).

More generally, for a given function g(·) we define
    E(g(X)) = Σ_{x ∈ S} g(x) P(X = x).

The variance of X is defined as
    var(X) := E((X - E(X))^2) = E(X^2) - (E(X))^2.
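
These definitions translate directly into sums over the support. A minimal Python helper (illustrative, not part of the slides) computing E(X), E(g(X)) and var(X) from a pmf:

```python
def expectation(pmf, g=lambda x: x):
    """E[g(X)] for a discrete pmf given as {x: P(X = x)}."""
    return sum(g(x) * px for x, px in pmf.items())

def variance(pmf):
    mean = expectation(pmf)
    return expectation(pmf, lambda x: x**2) - mean**2

die = {k: 1 / 6 for k in range(1, 7)}   # fair die
print(expectation(die))                 # 3.5
print(variance(die))                    # 35/12 = 2.9166...
```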

Some useful identities for computing E(X) and var(X)

Geometric series:
    Σ_{k=0}^∞ x^k = 1/(1-x),   Σ_{k=1}^∞ k x^{k-1} = 1/(1-x)^2,   for |x| < 1.

Binomial series (C(n,k) denotes the binomial coefficient):
    Σ_{k=0}^n C(n,k) a^k b^{n-k} = (a+b)^n,   Σ_{k=1}^n k C(n,k) a^{k-1} b^{n-k} = n(a+b)^{n-1}.

Taylor expansion of e^x:
    Σ_{k=0}^∞ x^k / k! = e^x.
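
As an illustration of how the second geometric-series identity is used, here is the standard computation of E(X) for X ~ Geo(p), written out in LaTeX (a worked example added here, not on the original slide):

```latex
\[
  \mathbb{E}(X) \;=\; \sum_{k=1}^{\infty} k\,(1-p)^{k-1} p
              \;=\; p \sum_{k=1}^{\infty} k\,x^{k-1}\Big|_{x=1-p}
              \;=\; p \cdot \frac{1}{(1-x)^2}\Big|_{x=1-p}
              \;=\; \frac{p}{p^2} \;=\; \frac{1}{p}.
\]
```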

Some popular discrete random variables

Bernoulli Ber(p): a binary outcome of success (1) or failure (0), where the probability of success is p.
Binomial Bin(n, p): sum of n independent and identically distributed (i.i.d.) Ber(p) r.v.'s.
Poisson Poi(λ): limiting case of Bin(n, p) on setting p = λ/n and letting n → ∞.
Geometric Geo(p): number of trials required to get the first success in a series of i.i.d. Ber(p) experiments.

Some popular discrete random variables: a summary

Distribution   Support            Pmf P(X = k)                E(X)    var(X)
Ber(p)         {0, 1}             p 1(k=1) + (1-p) 1(k=0)     p       p(1-p)
Bin(n, p)      {0, 1, ..., n}     C(n,k) p^k (1-p)^{n-k}      np      np(1-p)
Poi(λ)         {0, 1, 2, ...}     e^{-λ} λ^k / k!             λ       λ
Geo(p)         {1, 2, 3, ...}     (1-p)^{k-1} p               1/p     (1-p)/p^2

Exercises: for each distribution shown above, verify its E(X) and var(X).
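
The exercise can be sanity-checked numerically by summing each pmf over (a truncation of) its support; a rough Python sketch under those assumptions, not part of the original slides.

```python
from math import comb, exp, factorial

def moments(pmf_pairs):
    """Return (E[X], var(X)) from a list of (k, P(X = k)) pairs."""
    m1 = sum(k * p for k, p in pmf_pairs)
    m2 = sum(k**2 * p for k, p in pmf_pairs)
    return m1, m2 - m1**2

n, p, lam = 10, 0.3, 2.0
ber     = [(0, 1 - p), (1, p)]
binom   = [(k, comb(n, k) * p**k * (1 - p)**(n - k)) for k in range(n + 1)]
poisson = [(k, exp(-lam) * lam**k / factorial(k)) for k in range(100)]   # truncated; tail is negligible
geom    = [(k, (1 - p)**(k - 1) * p) for k in range(1, 2000)]            # truncated; tail is negligible

print(moments(ber))      # ~ (p, p(1-p)) = (0.3, 0.21)
print(moments(binom))    # ~ (np, np(1-p)) = (3.0, 2.1)
print(moments(poisson))  # ~ (lambda, lambda) = (2.0, 2.0)
print(moments(geom))     # ~ (1/p, (1-p)/p^2) = (3.33, 7.78)
```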

Continuous r.v.'s and probability density functions

Definition (Continuous r.v. and probability density function). X is a continuous random variable if there exists a non-negative function f such that
    P(X ≤ x) = F(x) = ∫_{-∞}^x f(u) du   for any x.
f is called the probability density function (p.d.f.) of X.

Conversely, if a given function f satisfies:
1. f(x) ≥ 0 for all x;
2. ∫_{-∞}^∞ f(x) dx = 1,
then f is the probability density function of some continuous random variable.

Remarks:
- f(x) = F'(x). Thus the pdf is determined by the CDF of X.
- The set {x : f(x) > 0} is called the support of X. This is the range of values X can take.

Probability density function

Probabilities can be computed by integration: for any set A,
    P(X ∈ A) = ∫_A f(u) du.

Now let's fix x and consider A = [x, x + δx]. Then
    P(x ≤ X ≤ x + δx) = ∫_x^{x+δx} f(u) du ≈ f(x) δx   for small δx.
Hence f(x) can be interpreted as the probability of X lying in [x, x + δx], normalised by δx.

The second observation is that on setting δx = 0, we have P(X = x) = 0. Thus a continuous random variable has zero probability of taking a particular value.
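
This interpretation is easy to check numerically: for a small δx, the probability mass of [x, x + δx] computed from the CDF should be close to f(x) δx. A throwaway Python check for the Exp(λ) distribution (my own example, not from the slides):

```python
from math import exp

lam, x, dx = 1.5, 0.8, 1e-4

cdf = lambda t: 1 - exp(-lam * t)     # CDF of Exp(lambda)
pdf = lambda t: lam * exp(-lam * t)   # its density

exact = cdf(x + dx) - cdf(x)          # P(x <= X <= x + dx)
approx = pdf(x) * dx
print(exact, approx)                  # agree to several decimal places
```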

Expected value and variance of a continuous r.v.

For discrete random variables, we work out the expected value by summing over the countably many possible outcomes. In the continuous case, the analogue is to use integration.

For a continuous r.v. X, its expected value is defined as
    E(X) = ∫_{-∞}^∞ x f(x) dx.

More generally, for a given function g(·) we define
    E(g(X)) = ∫_{-∞}^∞ g(x) f(x) dx.

The variance of X is again defined as
    var(X) := E((X - E(X))^2) = E(X^2) - (E(X))^2.
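
These integrals can be checked with numerical quadrature. A sketch using scipy.integrate.quad for the Exp(λ) case (assuming scipy and numpy are available; illustrative only):

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
pdf = lambda x: lam * np.exp(-lam * x)        # Exp(lambda) density on [0, inf)

mean, _ = quad(lambda x: x * pdf(x), 0, np.inf)
second, _ = quad(lambda x: x**2 * pdf(x), 0, np.inf)
var = second - mean**2

print(mean, var)   # ~ 1/lambda = 0.5 and 1/lambda^2 = 0.25
```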

Remarks on expectation and variance

We have seen how to define E(X) (and more generally E(g(X))) when X is either discrete or continuous. For more general random variables, it is still possible to define E(g(X)) using notions from measure theory (which we won't discuss here).

Some fundamental properties of expectation and variance:
- E(aX + bY) = a E(X) + b E(Y) for any two random variables X, Y and constants a, b.
- var(aX + b) = a^2 var(X) for any two constants a and b.
- var(X + Y) = var(X) + var(Y) for any two independent random variables X and Y.
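
These identities are easy to sanity-check by Monte Carlo; an informal sketch with independent exponential and uniform samples (numpy assumed, values chosen arbitrarily).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
X = rng.exponential(scale=2.0, size=n)   # E(X) = 2, var(X) = 4
Y = rng.uniform(0.0, 1.0, size=n)        # independent of X; E(Y) = 1/2, var(Y) = 1/12
a, b = 3.0, -1.0

print(np.mean(a * X + b * Y), a * 2.0 + b * 0.5)   # linearity of expectation
print(np.var(a * X + b), a**2 * np.var(X))          # var(aX + b) = a^2 var(X)
print(np.var(X + Y), np.var(X) + np.var(Y))         # independence => variances add
```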

Popular examples of continuous r.v.'s

Distribution          Support   Pdf                                       E(X)   var(X)
Uniform U[0, 1]       [0, 1]    1                                         1/2    1/12
Exponential Exp(λ)    [0, ∞)    λ e^{-λx}                                 1/λ    1/λ^2
Normal N(µ, σ^2)      R         (1/(σ√(2π))) e^{-(x-µ)^2/(2σ^2)}          µ      σ^2

Exercises: for each distribution shown above, verify its E(X) and var(X).

Transformation of random variables

Suppose X is a random variable with known distribution. Let g(·) be a function and define a new random variable via Y = g(X). How do we find the distribution function of Y?

We start from first principles: the cumulative distribution function of Y is defined as F_Y(y) = P(Y ≤ y). Then
    F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y).

If g is strictly increasing (so it has an inverse), then we can write
    F_Y(y) = P(X ≤ g^{-1}(y)) = F_X(g^{-1}(y)),
where F_X is the CDF of X.

If g does not have such an inverse (e.g. g(x) = x^2), then special care has to be taken to work out P(g(X) ≤ y).

Example

Let X ~ U[0, 1]. Find the distribution and density function of Y = √X.

Sketch of answer: For X ~ U[0, 1],
    F_X(x) = 0 for x < 0;  x for 0 ≤ x ≤ 1;  1 for x > 1.
Thus for y ≥ 0,
    F_Y(y) = P(√X ≤ y) = P(X ≤ y^2) = F_X(y^2),
i.e.
    F_Y(y) = 0 for y < 0;  y^2 for 0 ≤ y ≤ 1;  1 for y > 1.
Differentiating F_Y gives the density function f_Y(y) = 2y for y ∈ [0, 1] (and 0 elsewhere).
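
The answer can be cross-checked by simulation: sample X ~ U[0, 1], take square roots, and compare a normalised histogram with the density 2y on [0, 1]. A hedged Python sketch (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
Y = np.sqrt(rng.random(1_000_000))   # Y = sqrt(X) with X ~ U[0, 1]

hist, edges = np.histogram(Y, bins=20, range=(0.0, 1.0), density=True)
midpoints = 0.5 * (edges[:-1] + edges[1:])
for y, h in zip(midpoints[:5], hist[:5]):
    print(round(y, 3), round(h, 3), round(2 * y, 3))   # empirical density vs f_Y(y) = 2y
```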

Applications of transformation of random variables

Log-normal random variable: defined via Y = exp(X) where X ~ N(µ, σ^2). It could serve as a simple model of a stock price (see problem sheet as well).

Simulation of random variables: given a CDF F of a random variable X, define the generalised inverse as F^{-1}(y) = min{x : F(x) ≥ y}. Then for U ~ U[0, 1], the random variable F^{-1}(U) has the same distribution as X.

Consequence: if we want to simulate X on a computer and F^{-1} has an easy expression, we just need to simulate U from U[0, 1] (which is very easy), and then F^{-1}(U) is our sample of X.
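
For instance, for Exp(λ) we have F(x) = 1 - e^{-λx}, so F^{-1}(u) = -log(1-u)/λ, and transforming uniforms this way reproduces exponential samples. A minimal illustration of the inverse-transform method (numpy assumed, parameters arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n = 1.5, 1_000_000

U = rng.random(n)              # U ~ U[0, 1]
X = -np.log(1.0 - U) / lam     # F^{-1}(U) with F(x) = 1 - exp(-lam * x)

print(np.mean(X), 1 / lam)     # sample mean vs 1/lambda
print(np.var(X), 1 / lam**2)   # sample variance vs 1/lambda^2
```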