Lecture 4: Random variable and expectation


Study objectives: to learn the concepts of
1. Random variables (rv's), including discrete and continuous rv's, and their distribution functions (pmf, pdf and cdf).
2. Conditional distribution and conditional expectation; independence.
3. Expectation, variance and covariance.
4. The weak law of large numbers.
5. Moment generating functions.

Applied Math./NCHU: Statistics

Random variables: examples

Example 1: Let X be the outcome of a fair die. We call X a discrete random variable:
-- Pr(X=1) = ... = Pr(X=6) = 1/6;
-- Σ_x Pr(X=x) = 1, where x ranges over {1,2,3,4,5,6}.

Example 1 (cont.): See textbook pages 89~90 for the joint distribution behind X = X1 + X2, where X1 and X2 are the outcomes of two independent fair dice.
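As a quick numerical check of Example 1, the pmf of a single die and of the sum X = X1 + X2 of two independent dice can be tabulated directly (a minimal sketch; variable names are illustrative):

```python
from fractions import Fraction
from collections import Counter

# pmf of a single fair die: Pr(X=k) = 1/6 for k = 1..6
pmf = {k: Fraction(1, 6) for k in range(1, 7)}
assert sum(pmf.values()) == 1          # Σ_x Pr(X=x) = 1

# distribution of the sum X = X1 + X2 of two independent fair dice:
# each ordered pair (x1, x2) has probability 1/36
sum_pmf = Counter()
for x1 in range(1, 7):
    for x2 in range(1, 7):
        sum_pmf[x1 + x2] += Fraction(1, 36)

print(sum_pmf[7])                      # → 1/6, the most likely sum
```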

Example 2: Two electronic components, each either defective (d) or acceptable (a), with four possible results:
Pr(d,d)=0.09, Pr(d,a)=0.21, Pr(a,d)=0.21, Pr(a,a)=0.49.

(1) Let X = number of acceptable components. Then
Pr(X=0)=0.09, Pr(X=1)=0.42, Pr(X=2)=0.49.

(2) If we further define
I = 1, if X = 1 or 2
    0, if X = 0,
then Pr(I=1)=0.91 and Pr(I=0)=0.09.

Note: If A denotes the event that at least one acceptable component is obtained, the random variable I is called the indicator random variable for the (occurrence of) event A, because I equals 1 or 0 according to whether A occurs.
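The four probabilities above are consistent with each component being defective independently with probability 0.3 (since 0.09 = 0.3²); under that assumption, the pmf of X and the indicator probabilities can be reproduced exactly:

```python
from fractions import Fraction

# assumed per-component defect probability (inferred from Pr(d,d) = 0.09)
p_d = Fraction(3, 10)
p_a = 1 - p_d

# joint probabilities of the two independent components
joint = {('d', 'd'): p_d * p_d, ('d', 'a'): p_d * p_a,
         ('a', 'd'): p_a * p_d, ('a', 'a'): p_a * p_a}

# X = number of acceptable components
p_X = {k: sum(p for pair, p in joint.items() if pair.count('a') == k)
       for k in (0, 1, 2)}

# indicator of A = {at least one acceptable}: I = 1 iff X >= 1
p_I1 = p_X[1] + p_X[2]
print(p_X[0], p_X[1], p_X[2], p_I1)    # → 9/100 21/50 49/100 91/100
```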

Types of random variables

Discrete (or categorical): the possible values of X form a sequence. The probability mass function (pmf) is p(a) = P(X=a); p(a) is positive for at most a countable number of values a. => See Example 4.2a (textbook, page 92).

Continuous: the possible values of X form an interval, a subset of (-∞, ∞).

Cumulative distribution function (cdf)

For a discrete rv, F(a) = Σ_{x: x≤a} p(x). In general, F(x), or F_X(x), = Pr(X ≤ x); we write X ~ F(·), read "X is distributed as F".

Property: P(a < X ≤ b) = P(X ≤ b) - P(X ≤ a) = F(b) - F(a).

Example: Let F(x) = 0 for x ≤ 0 and F(x) = 1 - exp(-x²) for x > 0. Then
P(X > 1) = 1 - P(X ≤ 1) = 1 - {1 - exp(-1)} = exp(-1) ≈ 0.368.
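The cdf example above is easy to verify numerically (a small sketch; F is exactly the cdf defined in the example):

```python
import math

# F(x) = 1 - exp(-x^2) for x > 0, and 0 otherwise
def F(x):
    return 1 - math.exp(-x ** 2) if x > 0 else 0.0

# P(X > 1) = 1 - P(X <= 1) = 1 - F(1) = exp(-1)
p = 1 - F(1)
print(round(p, 3))                     # → 0.368

# the cdf property: P(a < X <= b) = F(b) - F(a)
assert abs((F(2) - F(1)) - (math.exp(-1) - math.exp(-4))) < 1e-12
```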

Probability density function (pdf)

For a continuous random variable, f(x) is called a pdf if ∫_S f(x)dx = 1, where S is the support of X (the range on which f(x) > 0).

For any interval I = [a,b] ⊆ S, Pr(X ∈ I) = ∫_I f(x)dx.

Also, dF(x)/dx |_{x=a} = f(a), and F(a) = ∫_{-∞}^{a} f(x)dx.

Example: [textbook page 95, Example 4.2b]

Joint distribution of two random variables (continuous case)

The joint cdf of X and Y is
F(x,y) = P{X ≤ x, Y ≤ y} = ∫_{-∞}^{x} ∫_{-∞}^{y} f(u,v) dv du,
where f(u,v) is the joint pdf. Of course, ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(u,v) dv du = 1.

The marginal distribution of X can be obtained as:
F_X(x) = P{X ≤ x} = P{X ≤ x, Y < ∞} = F(x, ∞) = ∫_{-∞}^{x} ∫_{-∞}^{∞} f(u,v) dv du.

Here X and Y are both continuous rv's with support (-∞, ∞).
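Differentiating the cdf from the earlier example gives the density f(x) = 2x·exp(-x²) on the support (0, ∞); a crude midpoint Riemann sum (an illustrative hand-rolled routine, not a library call) confirms that f integrates to 1 over its support and that F(a) = ∫_{-∞}^{a} f(x)dx:

```python
import math

# density obtained by differentiating F(x) = 1 - exp(-x^2):  f(x) = 2x exp(-x^2), x > 0
def f(x):
    return 2 * x * math.exp(-x ** 2) if x > 0 else 0.0

def integrate(func, a, b, n=200_000):
    """Midpoint Riemann sum of func over [a, b]."""
    h = (b - a) / n
    return sum(func(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0, 10)            # ∫_S f(x)dx over the support (tail past 10 is negligible)
F1 = integrate(f, 0, 1)                # F(1) = ∫_{-∞}^{1} f(x)dx
print(round(total, 4), round(F1, 4))   # → 1.0 0.6321
```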

Joint distribution of two random variables (discrete case)

The joint pmf is p(x_i, y_j) = P{X = x_i, Y = y_j}, and the marginal pmf of X is
P(X = x_i) = P(∪_j {X = x_i, Y = y_j}) = Σ_j P{X = x_i, Y = y_j} = Σ_j p(x_i, y_j).

Example: [See textbook, pages 97~98, Examples 4.3a,b,c]

Independent random variables

Continuous case: X and Y are said to be independent if
P{X ≤ a, Y ≤ b} = P{X ≤ a} P{Y ≤ b}, i.e. F_{X,Y}(a,b) = F_X(a) F_Y(b), for all a and b;
equivalently, f_{X,Y}(a,b) = f_X(a) f_Y(b).

Discrete case: X and Y are said to be independent if p_{X,Y}(x,y) = p_X(x) p_Y(y).

Example: [Textbook pages 102~104, Examples 4.3d,e]
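In the discrete case, independence is exactly the factorization of the joint pmf into the two marginals. A toy check (the two marginal pmfs here are made up purely for illustration):

```python
from fractions import Fraction
from itertools import product

# hypothetical marginal pmfs for illustration
p_X = {0: Fraction(1, 2), 1: Fraction(1, 2)}
p_Y = {0: Fraction(1, 3), 1: Fraction(2, 3)}

# joint pmf of an independent pair: p_{X,Y}(x,y) = p_X(x) p_Y(y)
p_XY = {(x, y): p_X[x] * p_Y[y] for x, y in product(p_X, p_Y)}

# recover the marginal of X by summing over j: P(X = x_i) = Σ_j p(x_i, y_j)
marg_X = {x: sum(p_XY[x, y] for y in p_Y) for x in p_X}
assert marg_X == p_X

# the factorization test for independence holds for every pair (x, y)
assert all(p_XY[x, y] == p_X[x] * p_Y[y] for (x, y) in p_XY)
```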

Conditional distributions

Discrete case: the conditional pmf is
p_{X|Y}(x|y) = P{X = x | Y = y} = P{X = x, Y = y}/P{Y = y} = p_{X,Y}(x,y)/p_Y(y).

Continuous case: from f_{X|Y}(x|y)dx = f_{X,Y}(x,y)dxdy / (f_Y(y)dy), we get
f_{X|Y}(x|y) = f_{X,Y}(x,y)/f_Y(y).

Note: ∫ f_{X|Y}(x|y) f_Y(y) dy = ∫ f_{X,Y}(x,y) dy = f_X(x).

Example: [Textbook pages 105~107, Examples 4.3f,g,h]

Expectation

E(X), or simply EX, commonly denoted μ, is defined as
EX = Σ_x x P{X = x} for a discrete rv, and EX = ∫ x f_X(x) dx for a continuous rv.

Properties:
-- E[g(X)] = ∫ g(x) f_X(x) dx
-- E(aX) = aEX, for any constant a
-- E(X+Y) = EX + EY

Example: [Textbook pages 112~118, Examples 4.5a~h]
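For the fair die of Example 1, the definitions above give EX = 7/2 and, taking g(x) = x², E[g(X)] = 91/6; the linearity property E(aX) = aEX can be checked the same way (a minimal sketch):

```python
from fractions import Fraction

pmf = {k: Fraction(1, 6) for k in range(1, 7)}     # fair die

EX = sum(x * p for x, p in pmf.items())            # EX = Σ_x x Pr(X=x)
EX2 = sum(x ** 2 * p for x, p in pmf.items())      # E[g(X)] with g(x) = x^2
print(EX, EX2)                                     # → 7/2 91/6

# E(aX) = a EX for any constant a
a = 5
assert sum(a * x * p for x, p in pmf.items()) == a * EX
```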

Variance

Var(X) = E{(X-μ)²}, where μ = E(X).

Property 1: Var(X) = E(X²) - μ² [=> Var(aX) = a² Var(X)]

For the case of two rv's X and Y with some joint distribution, we have:

Property 2: E{E[X|Y]} = EX; and (Exercise) E{E[g(X)|Y]} = E{g(X)}.

Property 3: Var(X) = E[Var(X|Y)] + Var[E(X|Y)]

Justification of Property 3
1. E{Var(X|Y)} = E{E(X²|Y)} - E{[g(Y)]²} by definition, where g(Y) = E(X|Y);
2. Var{E(X|Y)} = E{[g(Y)]²} - (EX)², since E{g(Y)} = EX by Property 2.
Combining 1 and 2, we have
E{Var(X|Y)} + Var{E(X|Y)} = E{E(X²|Y)} - (EX)² = E(X²) - (EX)²,
using Property 2 for the first term; this is simply Var(X) by Property 1.
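Property 3 can also be checked by simulation on a toy hierarchical model (the model is hypothetical, chosen only so that both sides are easy to compute by hand): let Y be a fair coin and, given Y = y, let X ~ Uniform(0, 1+y). Then E[Var(X|Y)] = (1/12 + 4/12)/2 = 5/24 and Var[E(X|Y)] = Var[(1+Y)/2] = 1/16, so Var(X) should be 13/48 ≈ 0.271:

```python
import random

random.seed(0)
N = 200_000

xs = []
for _ in range(N):
    y = random.randint(0, 1)                   # Y ~ Bernoulli(1/2)
    xs.append(random.uniform(0.0, 1.0 + y))    # X | Y=y ~ Uniform(0, 1+y)

mean = sum(xs) / N
var = sum((x - mean) ** 2 for x in xs) / N     # sample estimate of Var(X)

# right-hand side of Property 3, computed exactly for this model
total = 5 / 24 + 1 / 16                        # = 13/48 ≈ 0.2708
print(round(var, 2), round(total, 2))          # the two should nearly agree
```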

Covariance and correlation

The covariance of two rv's X and Y is defined as
cov(X,Y) = E{(X-EX)(Y-EY)},
which can easily be re-expressed as E(XY) - (EX)(EY).

Properties:
-- Var(X+Y) = Var(X) + 2cov(X,Y) + Var(Y) and Var(X-Y) = Var(X) - 2cov(X,Y) + Var(Y)
-- cov(aX,Y) = a cov(X,Y) and cov(X,aY) = a cov(X,Y)
-- cov(X+Z,Y) = cov(X,Y) + cov(Z,Y)
[Propositions 4.7.2 and 4.7.3]

corr(X,Y) = cov(X,Y) / √(Var(X) Var(Y))

Moment generating functions

The moment generating function (mgf) of a random variable X is defined as g(t) = E{exp(tX)}.
For the discrete case, g(t) = Σ_x exp(tx) p(x); for the continuous case, g(t) = ∫ exp(tx) f(x) dx.

Let g^(n)(t) = d^n g(t)/dt^n for n ≥ 1. Then:
-- g^(n)(0) = E(X^n), which explains why g is called an mgf;
-- g_{X+Y}(t) = g_X(t) g_Y(t) for independent X and Y.

Theorem: The mgf uniquely determines the distribution; that is, a specific mgf corresponds to a unique distribution, and vice versa.
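Both mgf facts are easy to check numerically for the fair die: a central finite difference at t = 0 approximates g'(0) = E(X) = 3.5, and the mgf of the sum of two independent dice equals the product of the individual mgfs (an illustrative sketch):

```python
import math

pmf = {k: 1 / 6 for k in range(1, 7)}              # fair die

def g(t):
    """mgf of the die: g(t) = Σ_x exp(t x) p(x)."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

# g'(0) = E(X), approximated by a central finite difference
h = 1e-6
EX = (g(h) - g(-h)) / (2 * h)
print(round(EX, 4))                                # → 3.5

# g_{X+Y}(t) = g_X(t) g_Y(t) for independent X and Y
t = 0.3
g_sum = sum(math.exp(t * (a + b)) / 36 for a in range(1, 7) for b in range(1, 7))
assert math.isclose(g_sum, g(t) ** 2)
```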

The weak law of large numbers (WLLN)

Let X_1, ..., X_n be a set of iid rv's with mean μ and variance σ². Then for any ε > 0,
P{ |(X_1 + ... + X_n)/n - μ| > ε } converges to 0 as n → ∞.

By Chebyshev's inequality, it can be proved that
P{ |(X_1 + ... + X_n)/n - μ| > ε } ≤ Var{(X_1 + ... + X_n)/n}/ε² = σ²/(nε²),
which certainly converges to zero.

We say (X_1 + ... + X_n)/n → μ in probability (in pr.); in other words, the sample mean is a consistent estimator of the parameter μ, the population mean.

Homework and exercises
1. State and prove the Markov inequality and Chebyshev's inequality.
2. Complete the proof of Property 3: Var(X) = E[Var(X|Y)] + Var[E(X|Y)].
3. Let T(X_1, ..., X_n) be unbiased for estimating μ, with Var{T(X_1, ..., X_n)} converging to zero as n → ∞. Show that the statistic T is consistent for μ.

Homework and exercises (cont.)

4. Do the following problems in your textbook, pages 130~140:
[Level 1]: 4, 6, 8, 12, 30, 36, 45, 49, 52
[Level 2]: 10, 11, 16, 28, 29, 35, 40, 50, 54, 57
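As a closing illustration of the WLLN and the Chebyshev bound σ²/(nε²) from the lecture, a small simulation with Uniform(0,1) draws (μ = 1/2, σ² = 1/12; the function and variable names are illustrative):

```python
import random

random.seed(1)
mu, var, eps = 0.5, 1 / 12, 0.1        # Uniform(0,1): μ = 1/2, σ² = 1/12

def deviation_prob(n, trials=2000):
    """Estimate P(|sample mean of n draws - μ| > ε) by simulation."""
    bad = 0
    for _ in range(trials):
        xbar = sum(random.random() for _ in range(n)) / n
        bad += abs(xbar - mu) > eps
    return bad / trials

for n in (10, 100, 1000):
    # Chebyshev: P(|X̄_n - μ| > ε) <= σ²/(n ε²); both sides shrink as n grows
    print(n, deviation_prob(n), "Chebyshev bound:", var / (n * eps ** 2))
```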