Ch3 Operations on one random variable-expectation


1 Ch3 Operations on one random variable - expectation
Previously we defined a random variable $X$ as a mapping from the sample space to the real line. We will now introduce some operations on the random variable. Most of these operations are based on the concept of expectation.

2 Expectation
Expectation is the name given to the process of averaging a random variable $X$. The following terms are equivalent: the expectation or expected value of the random variable $X$, for which we use the notation $E[X]$; the mean value of $X$; and the statistical average of $X$. The following notations are equivalent: $E[X] = \bar{X}$. Example 3.1-1.

3 Expected value of a random variable
The everyday averaging procedure used in the above example carries over directly to a random variable. Example: let $X$ be a random variable that takes the sample-space values $S_X = \{1, 2, 3, 4\}$. If the numbers are equally likely to occur (to be selected), then
$\bar{X} = (1)P(X=1) + (2)P(X=2) + (3)P(X=3) + (4)P(X=4) = (1+2+3+4)\tfrac{1}{4} = 2.5$.

4 In general, then,
$\bar{X} = \sum_{x_i \in S_X} x_i P(X = x_i)$,
where each value $x_i$ of the random variable is weighted by the probability $P(X = x_i)$. This motivates the definition of the expected value, or mean value, of a random variable $X$:
$E[X] = \bar{X} = \sum_i x_i P(x_i)$ if $X$ takes discrete values,
$E[X] = \bar{X} = \int_{-\infty}^{\infty} x f_X(x)\,dx$ if $X$ is continuous with probability density $f_X(x)$.
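To make the two definitions concrete, here is a minimal numerical sketch (the values and the uniform density are assumptions for illustration, not from the text): the discrete mean is the probability-weighted sum, and the continuous mean is approximated by integrating $x f_X(x)$ on a grid.

```python
import numpy as np

# Discrete case: E[X] = sum_i x_i * P(X = x_i)
x_vals = np.array([1, 2, 3, 4])               # sample-space values, as in the example above
p_vals = np.array([0.25, 0.25, 0.25, 0.25])   # equally likely
print(np.sum(x_vals * p_vals))                # 2.5

# Continuous case: E[X] = integral of x * f_X(x) dx
# Assumed density for illustration: uniform on [0, 2], so the exact mean is 1.0
x = np.linspace(0.0, 2.0, 100001)
f_x = np.full_like(x, 0.5)                    # f_X(x) = 1/2 on [0, 2]
print(np.trapz(x * f_x, x))                   # ~1.0
```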

5 For a discrete random variable we use the probability mass $P(X = x_i)$ to weight the values of the random variable. For a continuous random variable we use the density $f_X(x)$ to weight them. Observe that $f_X(x)\,dx$ represents the probability of the random variable falling in the interval $dx$. If the density is symmetrical about a line $x = a$, that is $f_X(x + a) = f_X(-x + a)$, then $E[X] = a$. Example 3.1-2.

6 Expected value of a function of a random variable
Assume a random variable $X$ with the following values and probabilities: $X \in \{1, 2, 3\}$ with $P(X=1) = P(X=2) = P(X=3) = 1/3$. Now define the random variable $Y = X^2$, so $Y \in \{1, 4, 9\}$ with $P(Y=1) = P(Y=4) = P(Y=9) = 1/3$. Then
$E[Y] = (1)P(Y=1) + (4)P(Y=4) + (9)P(Y=9) = \sum_{y_i \in \{1,4,9\}} y_i P(Y = y_i) = \sum_i x_i^2 P(X = x_i)$.

7 In general, then,
$E[g(X)] = \sum_{i=1}^{N} g(x_i) P(x_i)$ for a discrete random variable,
$E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx$ for a continuous random variable.
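A short sketch of the point made in the example above, computed both ways (the equal probabilities of 1/3 are an assumption carried over from that example): averaging the values of $Y$ against $P(Y = y_i)$ and averaging $g(x) = x^2$ against $P(X = x_i)$ give the same number, so the distribution of $Y$ never has to be derived explicitly.

```python
import numpy as np

x_vals = np.array([1, 2, 3])
p_vals = np.array([1/3, 1/3, 1/3])    # assumed equal probabilities, as in the slide example

# Way 1: transform first, then average over the values of Y = X^2
y_vals = x_vals**2                    # {1, 4, 9}, with P(Y = y_i) = P(X = x_i)
print(np.sum(y_vals * p_vals))        # E[Y] = 14/3

# Way 2: average g(x) = x^2 directly against the distribution of X
print(np.sum((x_vals**2) * p_vals))   # same result, E[g(X)] = 14/3
```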

8 Conditional Expectation
We define the conditional density function for a given event $B = \{X \le b\}$ as
$f_X(x \mid B) = \dfrac{f_X(x)}{\int_{-\infty}^{b} f_X(x)\,dx}$ for $x < b$, and $f_X(x \mid B) = 0$ for $x \ge b$.
We now define the conditional expectation in a similar manner:
$E[X \mid B] = \int_{-\infty}^{\infty} x f_X(x \mid B)\,dx = \int_{-\infty}^{b} x f_X(x \mid B)\,dx + \int_{b}^{\infty} x f_X(x \mid B)\,dx$
$= \int_{-\infty}^{b} x\,\dfrac{f_X(x)}{F_X(b)}\,dx + \int_{b}^{\infty} x \cdot 0\,dx = \dfrac{\int_{-\infty}^{b} x f_X(x)\,dx}{\int_{-\infty}^{b} f_X(x)\,dx}$,
where the constant in the denominator is $\int_{-\infty}^{b} f_X(x)\,dx = F_X(b)$.
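A numerical sketch of the conditional mean $E[X \mid X \le b]$ (the exponential density and the value of $b$ are assumptions chosen for illustration): the ratio of integrals above is computed on a grid and cross-checked by averaging only the samples that satisfy the conditioning event.

```python
import numpy as np

# Assumed density for illustration: f_X(x) = exp(-x) for x >= 0 (exponential, rate 1)
b = 2.0
x = np.linspace(0.0, b, 200001)
f_x = np.exp(-x)

numerator = np.trapz(x * f_x, x)      # integral of x f_X(x) over (-inf, b]
denominator = np.trapz(f_x, x)        # F_X(b)
print(numerator / denominator)        # E[X | X <= b]

# Monte Carlo cross-check: average only the samples below b
rng = np.random.default_rng(0)
samples = rng.exponential(scale=1.0, size=1_000_000)
print(samples[samples <= b].mean())   # should be close to the value above
```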

9 Moments
The expected value defined previously as $E[X] = \bar{X} = \int_{-\infty}^{\infty} x f_X(x)\,dx$ (for $X$ continuous with probability density $f_X(x)$) is called the first moment of the random variable $X$. The word moment is used because a similar form exists in statics, where the first moment represents the center of gravity.

10 Example
Assume a mass is distributed along one dimension $x$, where $m(x)$ is the mass density function. Then we can calculate the first moment, or center of gravity $M$, as
$M = \dfrac{\int_a^b x\, m(x)\,dx}{\int_a^b m(x)\,dx}$,
where the denominator is the total mass. In the probability case the total area under the density function is unity (1).

11 The expected value of a function of a random variable was defined as $E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx$. Let the function $g(\cdot)$ be $g(X) = X^n$, $n = 0, 1, 2, \ldots$ Then we can define the $n$th moment $m_n$ as
$m_n = E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$.
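A minimal sketch of this definition (the uniform density on $[0, 1]$ is an assumed example, for which the exact moments are $m_n = 1/(n+1)$):

```python
import numpy as np

# Assumed density for illustration: uniform on [0, 1], so m_n = 1/(n+1) exactly
x = np.linspace(0.0, 1.0, 100001)
f_x = np.ones_like(x)                 # f_X(x) = 1 on [0, 1]

for n in range(4):
    m_n = np.trapz(x**n * f_x, x)     # m_n = E[X^n]
    print(n, m_n)                     # 1, 1/2, 1/3, 1/4
```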

12 Clearly
$m_0 = E[X^0] = E[1] = \int_{-\infty}^{\infty} f_X(x)\,dx = 1$ (the area under $f_X(x)$), and
$m_1 = E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx = \bar{X}$ (the expected value of $X$).

13 Central Moments
Another type of moment of interest is the central moment, defined as
$\mu_n = E[(X - \bar{X})^n] = \int_{-\infty}^{\infty} (x - \bar{X})^n f_X(x)\,dx$.
The moments $m_n$ are expected values about the origin, whereas the central moments $\mu_n$ are expected values about the mean (average). In particular,
$\mu_0 = E[(X - \bar{X})^0] = E[1] = \int_{-\infty}^{\infty} f_X(x)\,dx = 1$ (the area under $f_X(x)$), and
$\mu_1 = E[(X - \bar{X})] = E[X] - E[\bar{X}] = \bar{X} - \bar{X} = 0$,
where we have used the fact that $E[a] = a$ for a constant $a$.

14 Variance and skew
The second central moment $\mu_2$ is so important that it is given the name variance and has the special notation $\sigma_X^2$. Thus the variance is given by
$\sigma_X^2 = \mu_2 = E[(X - \bar{X})^2] = \int_{-\infty}^{\infty} (x - \bar{X})^2 f_X(x)\,dx$.
$\sigma_X$ (the positive square root of the variance) is called the standard deviation of the random variable. It is a measure of the spread of the random variable about its mean (average).

15 $\sigma_X$ is a measure of the spread of the random variable about its mean (average).
[Figure: two densities $f_{X_1}(x)$ and $f_{X_2}(x)$ with standard deviations $\sigma_1 > \sigma_2$; the spread of $f_{X_1}(x)$ is greater than the spread of $f_{X_2}(x)$.]

16 The variance can be found from a knowledge of the first and second moments as follows:
$\sigma_X^2 = \mu_2 = E[(X - \bar{X})^2] = E[X^2 - 2\bar{X}X + \bar{X}^2] = E[X^2] - 2\bar{X}E[X] + \bar{X}^2 = E[X^2] - \bar{X}^2 = m_2 - m_1^2$.
Example.
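A quick sampled check that the shortcut $\sigma_X^2 = m_2 - m_1^2$ agrees with the defining central moment (the normal test distribution is an assumption used only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=3.0, scale=2.0, size=1_000_000)   # assumed test distribution, sigma^2 = 4

m1 = x.mean()                         # first moment
m2 = (x**2).mean()                    # second moment
print(m2 - m1**2)                     # sigma_X^2 from m_2 - m_1^2
print(((x - m1)**2).mean())           # sigma_X^2 from the central-moment definition
```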

17 Properties of the variance $\sigma_X^2$
Let $c$ be a constant and $X$ a random variable.
(1) $\sigma_c^2 = 0$. The probability density function of a constant (deterministic) number is a delta function with zero spread: $f_c(x) = \delta(x - c)$.

18 (2) $\sigma_{X+c}^2 = \sigma_X^2$. The variance does not change when the random variable is shifted; the spread remains the same, while shifting affects only the mean.
(3) $\sigma_{cX}^2 = c^2 \sigma_X^2$.
The third central moment $\mu_3 = E[(X - \bar{X})^3]$ is a measure of the asymmetry of $f_X(x)$ about $x = \bar{X} = m_1$. It is called the skew of the density function. If $f_X(x)$ is symmetric about $x = \bar{X}$, it has zero skew (i.e. $\mu_3 = 0$); in fact $\mu_n = 0$ for all odd $n$. The normalized third central moment $\mu_3 / \sigma_X^3$ is known as the skewness, or coefficient of skewness, of the probability density function.
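A Monte Carlo sketch of properties (2) and (3) and of the skewness coefficient (the exponential test distribution is an assumption; its skewness is known to be 2):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=1.0, size=1_000_000)   # a deliberately skewed test distribution
c = 5.0

print(np.var(x + c), np.var(x))          # (2) shifting leaves the variance unchanged
print(np.var(c * x), c**2 * np.var(x))   # (3) scaling multiplies the variance by c^2

mu, sigma = x.mean(), x.std()
mu3 = ((x - mu)**3).mean()               # third central moment (skew)
print(mu3 / sigma**3)                    # skewness; close to 2 for the exponential density
```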

19 Useful Inequalities
A useful tool in some probability problems is provided by inequalities such as Chebyshev's inequality and Markov's inequality.
Chebyshev's Inequality: it states that for a random variable $X$,
$P\{|X - \bar{X}| \ge \epsilon\} \le \dfrac{\sigma_X^2}{\epsilon^2}$ for any $\epsilon > 0$.

20 Proof of $P\{|X - \bar{X}| \ge \epsilon\} \le \sigma_X^2 / \epsilon^2$:
$P\{|X - \bar{X}| \ge \epsilon\} = P\{X - \bar{X} \ge \epsilon \ \text{or}\ X - \bar{X} \le -\epsilon\} = P\{X \ge \bar{X} + \epsilon \ \text{or}\ X \le \bar{X} - \epsilon\}$
$= \int_{-\infty}^{\bar{X} - \epsilon} f_X(x)\,dx + \int_{\bar{X} + \epsilon}^{\infty} f_X(x)\,dx = \int_{|x - \bar{X}| \ge \epsilon} f_X(x)\,dx$.
But since
$\sigma_X^2 = \int_{-\infty}^{\infty} (x - \bar{X})^2 f_X(x)\,dx \ge \int_{|x - \bar{X}| \ge \epsilon} (x - \bar{X})^2 f_X(x)\,dx \ge \epsilon^2 \int_{|x - \bar{X}| \ge \epsilon} f_X(x)\,dx = \epsilon^2 P\{|X - \bar{X}| \ge \epsilon\}$,
it follows that $P\{|X - \bar{X}| \ge \epsilon\} \le \sigma_X^2 / \epsilon^2$.

21 Another form of Chebyshev's inequality is
$P\{|X - \bar{X}| < \epsilon\} \ge 1 - \dfrac{\sigma_X^2}{\epsilon^2}$,
which follows directly from $P\{|X - \bar{X}| \ge \epsilon\} \le \sigma_X^2 / \epsilon^2$.
A consequence is that if $\sigma_X^2 \to 0$ for a random variable, then $P\{|X - \bar{X}| < \epsilon\} \to 1$, or $P\{X = \bar{X}\} \to 1$. In other words, if the variance of the random variable approaches zero, the probability approaches 1 that $X$ equals its mean.
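A Monte Carlo sketch comparing the actual tail probability with the Chebyshev bound (the standard normal test distribution is an assumption); the bound is loose but never violated:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=0.0, scale=1.0, size=1_000_000)   # assumed test distribution
mean, var = x.mean(), x.var()

for eps in (1.0, 2.0, 3.0):
    tail = np.mean(np.abs(x - mean) >= eps)   # empirical P{|X - mean| >= eps}
    bound = var / eps**2                      # Chebyshev bound sigma^2 / eps^2
    print(eps, tail, bound)                   # the bound always lies above the tail
```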

22 Markov's Inequality
Let $X$ be a nonnegative random variable. Then
$P\{X \ge a\} \le \dfrac{E[X]}{a}$ for any $a > 0$.
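The same kind of sampled check for Markov's inequality, using an assumed nonnegative test distribution with $E[X] = 1$:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(scale=1.0, size=1_000_000)   # nonnegative samples, E[X] = 1

for a in (1.0, 2.0, 5.0):
    print(a, np.mean(x >= a), x.mean() / a)      # empirical P{X >= a} vs the bound E[X]/a
```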

23 3.4 Transformations of a Random Variable
Let $X$ be a random variable with a known distribution $F_X(x)$ and a known density $f_X(x)$. Let $T(\cdot)$ be a transformation (a mapping or function) that maps the random variable $X$ into $Y$ as $Y = T(X)$. The problem is to find $F_Y(y)$ and $f_Y(y)$. You can view the problem as a black-box problem.

24 In general $X$ can be a discrete, continuous, or mixed random variable, and the transformation $T$ can be linear, nonlinear, segmented, or staircase. We will consider three cases:
(1) $X$ is continuous and $T$ is continuous and monotonically increasing or decreasing

25 (2) $X$ is continuous and $T$ is continuous but nonmonotonic
(3) $X$ is discrete and $T$ is continuous

26 Monotonic Increasing Transformations of a Continuous Random Variable
A transformation $T$ is called monotonically increasing if $T(x_1) < T(x_2)$ for any $x_1 < x_2$. Assume that $T$ is continuous and differentiable at all values of $x$ for which $f_X(x) \ne 0$. Write $y = T(x)$ or, equivalently, $x = T^{-1}(y)$.

27 The events $\{Y \le y_0\}$ and $\{X \le x_0\}$, where $x_0 = T^{-1}(y_0)$, are equivalent, so
$F_Y(y_0) = P\{Y \le y_0\} = P\{X \le x_0\} = F_X(x_0)$, that is,
$\int_{-\infty}^{y_0} f_Y(y)\,dy = \int_{-\infty}^{x_0} f_X(x)\,dx$.
We differentiate both sides with respect to $y_0$ using Leibniz's rule: if
$G(u) = \int_{\alpha(u)}^{\beta(u)} H(x, u)\,dx$, then
$\dfrac{dG(u)}{du} = H[\beta(u), u]\dfrac{d\beta(u)}{du} - H[\alpha(u), u]\dfrac{d\alpha(u)}{du} + \int_{\alpha(u)}^{\beta(u)} \dfrac{\partial H(x, u)}{\partial u}\,dx$.

28 For the left-hand side (the last Leibniz term vanishes because the integrand does not depend on $y_0$),
$\dfrac{d}{dy_0} \int_{-\infty}^{y_0} f_Y(y)\,dy = f_Y(y_0)\cdot(1) - f_Y(-\infty)\cdot(0) = f_Y(y_0)$.
For the right-hand side, with $x_0 = T^{-1}(y_0)$,
$\dfrac{d}{dy_0} \int_{-\infty}^{x_0} f_X(x)\,dx = f_X(x_0)\dfrac{dx_0}{dy_0} - f_X(-\infty)\cdot(0) = f_X\!\left(T^{-1}(y_0)\right)\dfrac{dT^{-1}(y_0)}{dy_0}$.
Therefore
$f_Y(y_0) = f_X\!\left(T^{-1}(y_0)\right)\dfrac{dT^{-1}(y_0)}{dy_0}$.

29 Since the result applies for any $y_0$,
$f_Y(y) = f_X\!\left(T^{-1}(y)\right)\dfrac{dT^{-1}(y)}{dy}$, or equivalently $f_Y(y) = f_X(x)\dfrac{dx}{dy}$.

30 Monotonic Decreasing Transformations of a Continuous Random Variable
A transformation $T$ is called monotonically decreasing if $T(x_1) > T(x_2)$ for any $x_1 < x_2$. In this case
$F_Y(y_0) = P\{Y \le y_0\} = P\{X \ge x_0\} = 1 - F_X(x_0)$, with $x_0 = T^{-1}(y_0)$,
so $\int_{-\infty}^{y_0} f_Y(y)\,dy = 1 - \int_{-\infty}^{x_0} f_X(x)\,dx$.
Differentiating with respect to $y_0$,
$f_Y(y_0) = -f_X\!\left[T^{-1}(y_0)\right]\dfrac{dT^{-1}(y_0)}{dy_0}$.

31 Since $\dfrac{dT^{-1}(y_0)}{dy_0}$ is negative (the transformation is monotone decreasing),
$f_Y(y) = f_X\!\left(T^{-1}(y)\right)\left|\dfrac{dT^{-1}(y)}{dy}\right| = f_X(x)\left|\dfrac{dx}{dy}\right|$.
We therefore conclude that
$f_Y(y) = f_X(x)\left|\dfrac{dx}{dy}\right|$
for both increasing and decreasing monotonic transformations.
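A sketch verifying $f_Y(y) = f_X(x)\,|dx/dy|$ on an assumed monotone decreasing example: $X$ uniform on $(0, 1)$ and $Y = T(X) = -\ln X$, for which $x = T^{-1}(y) = e^{-y}$, $|dx/dy| = e^{-y}$, and $f_X(x) = 1$, so the formula predicts $f_Y(y) = e^{-y}$ for $y > 0$.

```python
import numpy as np

# Assumed example: X ~ Uniform(0, 1), Y = -ln(X) (monotone decreasing transformation).
# Predicted density from f_Y(y) = f_X(x) |dx/dy|:  f_Y(y) = exp(-y), y > 0.
rng = np.random.default_rng(5)
x = rng.uniform(0.0, 1.0, size=1_000_000)
y = -np.log(x)

# Compare an empirical histogram of Y against the predicted density at a few points
hist, edges = np.histogram(y, bins=50, range=(0.0, 5.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
for c, h in list(zip(centers, hist))[::10]:
    print(f"y={c:.2f}  empirical={h:.3f}  predicted={np.exp(-c):.3f}")
```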

32 Nonmonotonic Transformations of a Continuous Random Variable
A transformation need not be monotonic; in the more general case it can be both increasing and decreasing over different intervals, as shown below. The event $\{Y \le y_0\}$ may then correspond to more than one event of the random variable $X$.
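As a hedged illustration of the nonmonotonic case (this particular example is not worked in the slides): for $Y = X^2$ the event $\{Y \le y\}$ corresponds to $\{-\sqrt{y} \le X \le \sqrt{y}\}$, so each of the two roots $x = \pm\sqrt{y}$ contributes a term and $f_Y(y) = \big[f_X(\sqrt{y}) + f_X(-\sqrt{y})\big]\big/\big(2\sqrt{y}\big)$ for $y > 0$. The sketch below checks this numerically with an assumed standard normal $X$.

```python
import numpy as np

# Assumed illustration: X ~ N(0, 1), Y = X^2 (a nonmonotonic transformation).
# Summing the contributions of the two roots x = +sqrt(y) and x = -sqrt(y):
#   f_Y(y) = [f_X(sqrt(y)) + f_X(-sqrt(y))] / (2 sqrt(y)),  y > 0
rng = np.random.default_rng(6)
x = rng.normal(size=1_000_000)
y = x**2

def f_x(t):
    return np.exp(-t**2 / 2.0) / np.sqrt(2.0 * np.pi)   # standard normal density

hist, edges = np.histogram(y, bins=60, range=(0.05, 4.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
predicted = (f_x(np.sqrt(centers)) + f_x(-np.sqrt(centers))) / (2.0 * np.sqrt(centers))
print(np.max(np.abs(hist - predicted)))   # discrepancy is Monte Carlo noise plus binning error near y = 0
```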
