Continuous Random Variables

Recall: a discrete random variable has only a finite or countably infinite number of possible values with positive probability (> 0). Often, however, we are interested in random variables that can (at least theoretically) take on an uncountable number of possible values, e.g., the weight of a randomly selected person in a population, the length of time that a randomly selected light bulb works, or the error in experimentally measuring the speed of light.

Example (Uniform Spinner, LNp.3-6, 3-18): Ω = (−π, π], and for (a, b] ⊂ Ω, P((a, b]) = (b − a)/(2π). Consider the random variables
- X: Ω → R with X(ω) = ω, whose range is (−π, π];
- Y: Ω → R with Y(ω) = tan(ω), whose range is (−∞, ∞).
Then X and Y are random variables that take on an uncountable number of possible values.

Some properties of the distribution of X (or Y):
- P_X({X = x}) = P({x}) = 0 for any x ∈ R, i.e., the probability that X takes any single value is zero.
- But for a < b, P_X({X ∈ (a, b]}) = P((a, b]) = (b − a)/(2π) > 0, i.e., positive probability is assigned to every interval (a, b].

Q: Can we still define a probability mass function for X?
Q: If not, what can play a role for X similar to that of the pmf?
Recall: the area under a curve is obtained by integration (an "uncountable sum").
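As a quick numerical illustration (not part of the original notes), the spinner can be simulated directly: the empirical frequency of {a < X ≤ b} matches (b − a)/(2π), while any exact single value is essentially never observed. The interval (0.5, 1.5] and the NumPy routines below are illustrative choices.

    # Simulate the uniform spinner: omega ~ Uniform(-pi, pi], X(omega) = omega.
    import numpy as np

    rng = np.random.default_rng(0)
    omega = rng.uniform(-np.pi, np.pi, size=1_000_000)

    a, b = 0.5, 1.5
    print(((omega > a) & (omega <= b)).mean())   # empirical P(a < X <= b)
    print((b - a) / (2 * np.pi))                 # theoretical value, about 0.159
    print((omega == 1.0).mean())                 # a single point has probability 0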

Probability Density Function and Continuous Random Variable

Definition. A function f: R → R is called a probability density function (pdf) if
1. f(x) ≥ 0 for all x ∈ (−∞, ∞), and
2. ∫_{−∞}^{∞} f(x) dx = 1.

Definition. A random variable X is called continuous if there exists a pdf f such that for any set B of real numbers,
P_X({X ∈ B}) = ∫_B f(x) dx.
For example, P_X(a ≤ X ≤ b) = ∫_a^b f(x) dx.

Theorem. If f is a pdf, then there must exist a continuous random variable with pdf f. Sketch of proof.

Some properties:
- P_X({X = x}) = ∫_x^x f(y) dy = 0 for any x ∈ R.
- It does not matter whether the intervals are open or closed, i.e., P(X ∈ [a, b]) = P(X ∈ (a, b]) = P(X ∈ [a, b)) = P(X ∈ (a, b)).
- It is important to remember that the value f(x) of a pdf is NOT itself a probability; it is quite possible for a pdf to take values greater than 1.

Q: How should the value of a pdf f(x) be interpreted? For small dx,
P(x − dx/2 ≤ X ≤ x + dx/2) = ∫_{x−dx/2}^{x+dx/2} f(y) dy ≈ f(x) dx,
so f(x) dx is a measure of how likely it is that X will be near x.

We can characterize the distribution of a continuous random variable in terms of its
1. probability density function (pdf),
2. (cumulative) distribution function (cdf),
3. moment generating function (mgf, Chapter 7).
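A small numerical sketch (an illustrative example, not taken from the notes) of the last two points: the pdf f(x) = 2 on (0, 0.5) takes values greater than 1 yet still integrates to 1, and for small dx the probability that X lands near x is approximately f(x) dx.

    # A pdf whose values exceed 1: f(x) = 2 on (0, 0.5), 0 otherwise.
    from scipy.integrate import quad

    def f(x):
        return 2.0 if 0.0 < x < 0.5 else 0.0

    total, _ = quad(f, -1.0, 1.0, points=[0.0, 0.5])
    print(total)                                  # 1.0: condition 2 of the definition

    x, dx = 0.3, 1e-3
    prob, _ = quad(f, x - dx / 2, x + dx / 2)
    print(prob, f(x) * dx)                        # both about 0.002: P(X near x) ~ f(x) dx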

Relation between the pdf and the cdf

Theorem. If F_X and f_X are the cdf and the pdf of a continuous random variable X, respectively, then
F_X(x) = P(X ≤ x) = ∫_{−∞}^{x} f_X(y) dy for all −∞ < x < ∞, and
f_X(x) = F_X'(x) = (d/dx) F_X(x) at continuity points of f_X.

Some notes:
- For a < b, P(a < X ≤ b) = F_X(b) − F_X(a) = ∫_a^b f_X(x) dx.
- The cdf of a continuous random variable has the same interpretation and properties as discussed in the discrete case. The only difference is in plotting F_X: in the discrete case there are jumps (a step function), whereas in the continuous case F_X is an (absolutely) continuous, non-decreasing function.

Example (Uniform Distributions). If −∞ < α < β < ∞, then
f(x) = 1/(β − α) if α < x ≤ β, and 0 otherwise,
is a pdf, since 1. f(x) ≥ 0 for all x ∈ R, and 2. ∫_{−∞}^{∞} f(x) dx = 1.
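The following sketch, with assumed parameters α = 1 and β = 4, checks both directions of the theorem for the Uniform example: F(b) − F(a) equals the integral of f over (a, b], and a finite-difference derivative of F recovers f at a continuity point.

    # Uniform(alpha, beta): pdf f, cdf F, and the two relations between them.
    from scipy.integrate import quad

    alpha, beta = 1.0, 4.0
    f = lambda x: 1.0 / (beta - alpha) if alpha < x <= beta else 0.0
    F = lambda x: 0.0 if x <= alpha else ((x - alpha) / (beta - alpha) if x <= beta else 1.0)

    a, b = 1.5, 3.0
    print(F(b) - F(a), quad(f, a, b)[0])          # both 0.5
    x, h = 2.0, 1e-6
    print((F(x + h) - F(x - h)) / (2 * h), f(x))  # both about 1/3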

Its corresponding cdf is
F(x) = ∫_{−∞}^{x} f(y) dy = 0 if x ≤ α; (x − α)/(β − α) if α < x ≤ β; 1 if x > β (exercise).
Conversely, it is easily checked that F is a cdf and that f(x) = F'(x) except at x = α and x = β. (The derivative does not exist at x = α and x = β, but this does not matter.)
An example of a Uniform distribution is the r.v. X in the Uniform Spinner example (LNp.6-1), where α = −π and β = π.

Transformation

Q: If Y = g(X), how do we find the distribution of Y?
Suppose that X is a continuous random variable with cdf F_X and pdf f_X. Consider Y = g(X), where g is a strictly monotone (increasing or decreasing) function, and let R_Y be the range of g.
Note. Any strictly monotone function has an inverse, i.e., g⁻¹ exists on R_Y.

The cdf of Y, denoted by F_Y:
1. Suppose that g is strictly increasing. For y ∈ R_Y,
F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y) = P(X ≤ g⁻¹(y)) = F_X(g⁻¹(y)).
2. Suppose that g is strictly decreasing. For y ∈ R_Y,
F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y) = P(X ≥ g⁻¹(y)) = 1 − F_X(g⁻¹(y)).

Theorem. Let X be a continuous random variable whose cdf F_X possesses a unique inverse F_X⁻¹. Let Z = F_X(X); then Z has a uniform distribution on [0, 1].
Proof. For 0 ≤ z ≤ 1, F_Z(z) = P(F_X(X) ≤ z) = P(X ≤ F_X⁻¹(z)) = F_X(F_X⁻¹(z)) = z.
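A short simulation sketch of the last theorem, using an Exponential(1) random variable as an assumed example (its cdf is F_X(x) = 1 − e^(−x)): the transformed values Z = F_X(X) behave like Uniform(0, 1) draws.

    # Probability integral transform: Z = F_X(X) is Uniform(0, 1).
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.exponential(scale=1.0, size=1_000_000)
    z = 1.0 - np.exp(-x)                 # Z = F_X(X) for F_X(x) = 1 - exp(-x)

    for q in (0.1, 0.5, 0.9):
        print(q, (z <= q).mean())        # empirical F_Z(q) is close to q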

Theorem. Let U be a uniform random variable on [0, 1], and let F be a cdf which possesses a unique inverse F⁻¹. Let X = F⁻¹(U); then the cdf of X is F.
Proof. F_X(x) = P(F⁻¹(U) ≤ x) = P(U ≤ F(x)) = F_U(F(x)) = F(x).

The two theorems are useful for pseudo-random number generation in computer simulation. The key is to generate Uniform(0, 1) random numbers:
- If X_1, ..., X_n are r.v.'s with cdf F, then F(X_1), ..., F(X_n) are r.v.'s with distribution Uniform(0, 1).
- If U_1, ..., U_n are r.v.'s with distribution Uniform(0, 1), then F⁻¹(U_1), ..., F⁻¹(U_n) are r.v.'s with cdf F.

The pdf of Y, denoted by f_Y:
1. Suppose that g is a differentiable, strictly increasing function. For y ∈ R_Y,
f_Y(y) = (d/dy) F_X(g⁻¹(y)) = f_X(g⁻¹(y)) · (d g⁻¹(y)/dy).
2. Suppose that g is a differentiable, strictly decreasing function. For y ∈ R_Y,
f_Y(y) = (d/dy) [1 − F_X(g⁻¹(y))] = −f_X(g⁻¹(y)) · (d g⁻¹(y)/dy) = f_X(g⁻¹(y)) · |d g⁻¹(y)/dy|.

Theorem. Let X be a continuous random variable with pdf f_X. Let Y = g(X), where g is differentiable and strictly monotone. Then the pdf of Y, denoted by f_Y, is
f_Y(y) = f_X(g⁻¹(y)) · |d g⁻¹(y)/dy|
for y such that y = g(x) for some x, and f_Y(y) = 0 otherwise.
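A minimal sketch of how the second theorem drives simulation, with Exponential(1) as an assumed target distribution: F(x) = 1 − e^(−x) has inverse F⁻¹(u) = −ln(1 − u), so applying F⁻¹ to Uniform(0, 1) numbers produces numbers with cdf F.

    # Inverse-transform sampling: X = F^{-1}(U) has cdf F.
    import numpy as np

    rng = np.random.default_rng(2)
    u = rng.uniform(0.0, 1.0, size=1_000_000)
    x = -np.log(1.0 - u)                        # F^{-1}(u) for F(x) = 1 - exp(-x)

    t = 1.0
    print((x <= t).mean(), 1.0 - np.exp(-t))    # empirical cdf vs. target cdf at t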

Q: What is the role of |d g⁻¹(y)/dy|? How should it be interpreted?

Some examples. Given the pdf f_X of a random variable X:
- Find the pdf f_Y of Y = aX + b, where a ≠ 0.
- Find the pdf f_Y of Y = 1/X.
- Find the cdf F_Y and the pdf f_Y of Y = X².
(A worked sketch of the first and third examples is given below.)

Q: How can the cdf F_Y and pdf f_Y be found for a general piecewise strictly monotone transformation?
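A worked sketch (not in the transcription) of the first and third examples: the first follows directly from the theorem above, while the third is not monotone and is handled by the cdf method.

    % Y = aX + b, a != 0: g(x) = ax + b is strictly monotone, g^{-1}(y) = (y - b)/a, so
    f_Y(y) = f_X\left(\frac{y-b}{a}\right)\left|\frac{d}{dy}\frac{y-b}{a}\right|
           = \frac{1}{|a|}\, f_X\left(\frac{y-b}{a}\right).

    % Y = X^2 is not monotone; use the cdf method. For y > 0,
    F_Y(y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = F_X(\sqrt{y}) - F_X(-\sqrt{y}),
    \qquad
    f_Y(y) = F_Y'(y) = \frac{f_X(\sqrt{y}) + f_X(-\sqrt{y})}{2\sqrt{y}}.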

Expectation, Mean, and Variance

Definition. If X has a pdf f_X, then the expectation of X is defined by
E(X) = ∫_{−∞}^{∞} x f_X(x) dx,
provided that the integral converges absolutely.

Example (Uniform Distributions). If f_X(x) = 1/(β − α) for α < x ≤ β, and 0 otherwise, then E(X) = (α + β)/2, the midpoint of the interval (a full computation, including the variance, is sketched below).

Some properties of expectation:
- Expectation of a transformation. If Y = g(X), then
E(Y) = ∫_{−∞}^{∞} y f_Y(y) dy = ∫_{−∞}^{∞} g(x) f_X(x) dx,
provided that the integral converges absolutely. (The proof is given in LNp.6-16.)
- Expectation of a linear function. For a, b ∈ R, E(aX + b) = a E(X) + b, since
∫ (ax + b) f_X(x) dx = a ∫ x f_X(x) dx + b ∫ f_X(x) dx = a E(X) + b.

Definition. If X has a pdf f_X, then the expectation of X is also called the mean of X (or of f_X) and is denoted by μ_X, so that
μ_X = E(X) = ∫_{−∞}^{∞} x f_X(x) dx.
The variance of X (or of f_X) is defined as E[(X − μ_X)²] and is denoted by σ²_X.

Some properties of mean and variance:
- σ_X = √(Var(X)) is called the standard deviation.
- The mean and variance of continuous random variables have the same intuitive interpretation as in the discrete case.
- Var(X) = E(X²) − [E(X)]².
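For the Uniform example, the defining integrals can be evaluated explicitly; the following standard computation fills in the mean and variance quoted above.

    E(X) = \int_{\alpha}^{\beta} \frac{x}{\beta-\alpha}\,dx
         = \frac{\beta^{2}-\alpha^{2}}{2(\beta-\alpha)} = \frac{\alpha+\beta}{2},
    \qquad
    E(X^{2}) = \int_{\alpha}^{\beta} \frac{x^{2}}{\beta-\alpha}\,dx
             = \frac{\alpha^{2}+\alpha\beta+\beta^{2}}{3},

    \mathrm{Var}(X) = E(X^{2}) - [E(X)]^{2}
                    = \frac{\alpha^{2}+\alpha\beta+\beta^{2}}{3} - \frac{(\alpha+\beta)^{2}}{4}
                    = \frac{(\beta-\alpha)^{2}}{12}.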

Variance of a linear function. For a, b ∈ R, Var(aX + b) = a² Var(X).

Theorem. For a nonnegative continuous random variable X,
E(X) = ∫_0^∞ [1 − F_X(x)] dx = ∫_0^∞ P(X > x) dx.
Proof.
E(X) = ∫_0^∞ x f_X(x) dx = ∫_0^∞ (∫_0^x 1 dt) f_X(x) dx = ∫_0^∞ ∫_0^x f_X(x) dt dx
     = ∫_0^∞ ∫_t^∞ f_X(x) dx dt = ∫_0^∞ [1 − F_X(t)] dt.
Recall CH4, Theoretical Exercise #5 (textbook): for a nonnegative integer-valued r.v. N, E(N) = Σ_{k=1}^∞ P(N ≥ k).

Proof of the expectation-of-transformation property (LNp.6-13). Let Y = g(X). For nonnegative g, it holds because
E(Y) = ∫_0^∞ P(g(X) > y) dy = ∫_{−∞}^{∞} f_X(x) (∫_0^{g(x)} 1 dy) dx = ∫_{−∞}^{∞} g(x) f_X(x) dx,
and the general case follows by splitting g into its positive and negative parts.

Example (Uniform Distributions).

Reading: textbook, Sec 5.1, 5.2, 5.3, 5.7.
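As a numerical check (an assumed example: an Exponential random variable with rate λ = 2, for which P(X > x) = e^(−λx)), the tail-probability formula and the defining integral give the same value E(X) = 1/λ.

    # Check E(X) = integral of P(X > x) over [0, infinity) for Exponential(lam).
    import numpy as np
    from scipy.integrate import quad

    lam = 2.0
    tail_integral, _ = quad(lambda x: np.exp(-lam * x), 0.0, np.inf)
    direct, _ = quad(lambda x: x * lam * np.exp(-lam * x), 0.0, np.inf)
    print(tail_integral, direct)        # both 0.5 = 1/lam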