Chapter 4 Multiple Random Variables

Chapter 4.1 Joint and Marginal Distributions

Definition 4.1.1: An n-dimensional random vector is a function from a sample space S into R^n, n-dimensional Euclidean space.

Example 4.1.2 (Sample space for dice): Roll two fair dice. Recall that there are 36 equally likely outcomes in this experiment. Define X = sum of the dice and Y = difference of the dice (in absolute value); then (X, Y) is a two-dimensional random vector. To calculate probabilities in terms of (X, Y), we need to go back to the original sample space. For any A ⊂ R^2, we have

P((X, Y) ∈ A) = P({s ∈ S : (X(s), Y(s)) ∈ A}).

If A = I_1 × I_2 = {(x, y) : x ∈ I_1, y ∈ I_2} (which is called a cross-product), we have

P((X, Y) ∈ A) = P(X ∈ I_1, Y ∈ I_2) = P({s ∈ S : X(s) ∈ I_1, Y(s) ∈ I_2}).

Specifically, if A = {x} × {y}, we have

P((X, Y) ∈ A) = P(X = x, Y = y) = P({s ∈ S : X(s) = x and Y(s) = y}).

For example, we can calculate

P((X, Y) ∈ A) = P((1,1)) + P((1,2)) + P((2,1)) + P((2,2)) = 4/36  if A = {(x, y) : x + y ≤ 4},
P(X = 5, Y = 3) = P((4,1)) + P((1,4)) = 2/36.
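These two probabilities can be checked by enumerating the 36-outcome sample space directly. A minimal Python sketch (the variable names are illustrative, not part of the notes):

```python
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of rolling two fair dice.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
p = Fraction(1, 36)              # probability of each outcome

X = lambda s: s[0] + s[1]        # X = sum of the dice
Y = lambda s: abs(s[0] - s[1])   # Y = difference of the dice

# P(X = 5, Y = 3): add P(s) over outcomes s with X(s) = 5 and Y(s) = 3.
p1 = sum(p for s in outcomes if X(s) == 5 and Y(s) == 3)

# P((X, Y) in A) for A = {(x, y): x + y <= 4}.
p2 = sum(p for s in outcomes if X(s) + Y(s) <= 4)

print(p1, p2)   # 1/18 (= 2/36) and 1/9 (= 4/36)
```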

Definition 4.1.3: Let (X, Y) be a discrete bivariate random vector. Then the function f(x, y) from R^2 into R defined by f(x, y) = f_{X,Y}(x, y) = P(X = x, Y = y) is called the joint probability mass function or joint pmf of (X, Y). In this case, f(x, y) must satisfy (1) 0 ≤ f(x, y) ≤ 1 and (2) Σ_{(x,y) ∈ R^2} f(x, y) = 1.

Example: Find f_{X,Y}(x, y) for (X, Y) defined in Example 4.1.2 and calculate P((X, Y) ∈ A). The joint pmf is given by the following table (empty cells are 0):

 y\x     2     3     4     5     6     7     8     9    10    11    12
  0    1/36        1/36        1/36        1/36        1/36        1/36
  1          1/18        1/18        1/18        1/18        1/18
  2                1/18        1/18        1/18        1/18
  3                      1/18        1/18        1/18
  4                            1/18        1/18
  5                                  1/18

Just as with a single random variable, where P(X ∈ A) = Σ_{x ∈ A} f_X(x), we have

P((X, Y) ∈ A) = Σ_{{s ∈ S : (X(s), Y(s)) ∈ A}} P(s) = Σ_{(x,y) ∈ A} f(x, y).

For example, we can calculate P((X, Y) ∈ A) for A = {(x, y) : x = 7 and y ≤ 4} = {(7,0), (7,1), (7,2), (7,3), (7,4)}. We have

P((X, Y) ∈ A) = f_{X,Y}(7,0) + f_{X,Y}(7,1) + f_{X,Y}(7,2) + f_{X,Y}(7,3) + f_{X,Y}(7,4) = f_{X,Y}(7,1) + f_{X,Y}(7,3) = 1/9.

Similarly, we can calculate P((X, Y) ∈ A) for A = {(x, y) : x + y ≤ 4} = {(2,0), (2,1), (3,0), (3,1), (4,0)}. We have

P((X, Y) ∈ A) = f_{X,Y}(2,0) + f_{X,Y}(2,1) + f_{X,Y}(3,0) + f_{X,Y}(3,1) + f_{X,Y}(4,0) = f_{X,Y}(2,0) + f_{X,Y}(3,1) + f_{X,Y}(4,0) = 4/36.

Example (Selecting a committee): An ad hoc committee of three is selected at random from a pool of 10 students consisting of three seniors, three juniors, two sophomores, and two freshmen. Let X be the number of seniors and Y the number of juniors selected. What is the joint pmf of (X, Y)?

The joint pmf of (X, Y) is

f_{X,Y}(x, y) = n(x, y) / C(10, 3) = C(3, x) C(3, y) C(4, 3 − x − y) / C(10, 3),  for x, y = 0, 1, 2, 3 and x + y ≤ 3,

where C(n, k) denotes a binomial coefficient and n(x, y) counts the committees with exactly x seniors, y juniors, and 3 − x − y of the remaining four students.
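This joint pmf is easy to tabulate with binomial coefficients. A short sketch (assuming Python 3.8+ for math.comb; names are illustrative):

```python
from fractions import Fraction
from math import comb

def f(x, y):
    """Joint pmf of (X, Y) = (# seniors, # juniors) on a committee of 3
    drawn from 3 seniors, 3 juniors and 4 other students."""
    if x < 0 or y < 0 or x + y > 3:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(3, y) * comb(4, 3 - x - y), comb(10, 3))

table = {(x, y): f(x, y) for x in range(4) for y in range(4) if f(x, y) > 0}
print(sum(table.values()))    # 1, so this is a valid joint pmf
print(table[(1, 1)])          # e.g. P(X = 1, Y = 1) = 3*3*4/120 = 3/10
```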

How do we compute the expected value of a function g(X, Y)? In the discrete case,

E g(X, Y) = Σ_{(x,y) ∈ R^2} g(x, y) f(x, y).

Example 4.1.4: Find E(XY) for (X, Y) defined in Example 4.1.2. Let g(x, y) = xy; then

E(XY) = Σ_{(x,y) ∈ R^2} g(x, y) f(x, y) = Σ_{(x,y) ∈ R^2} x y f(x, y) = 245/18 = 13 11/18.

Theorem: A function f(x, y) is a pmf of a random vector (X, Y) if and only if

a. f(x, y) ≥ 0 for all x, y;
b. Σ_{(x,y) ∈ R^2} f(x, y) = 1.

Example 4.1.5 (Joint pmf for dice): Define f(x, y) by

f(0, 0) = f(0, 1) = 1/6,  f(1, 0) = f(1, 1) = 1/3,  f(x, y) = 0 for any other (x, y).

For example, we can calculate (without referring to the original sample space)

P(X = Y) = P(X = 0, Y = 0) + P(X = 1, Y = 1) = 1/6 + 1/3 = 1/2.

If we toss two fair dice and define X = 0 if the first die shows at most 2 and X = 1 otherwise, and define Y = 0 if the second die shows an odd number and Y = 1 otherwise, then (X, Y) has the above pmf.

Theorem 4.1.6: Let (X, Y) be a discrete bivariate random vector with joint pmf f_{X,Y}(x, y). Then the marginal pmfs of X and Y, f_X(x) = P(X = x) and f_Y(y) = P(Y = y), are given by

f_X(x) = Σ_{y ∈ R} f_{X,Y}(x, y)  and  f_Y(y) = Σ_{x ∈ R} f_{X,Y}(x, y).

Example 4.1.7 (Marginal pmf for dice example): For the dice of Example 4.1.2, summing each row and each column of the joint pmf table gives the marginals (empty cells are 0):

 y\x     2     3     4     5     6     7     8     9    10    11    12   f_Y(y)
  0    1/36        1/36        1/36        1/36        1/36        1/36    6/36
  1          1/18        1/18        1/18        1/18        1/18         10/36
  2                1/18        1/18        1/18        1/18                8/36
  3                      1/18        1/18        1/18                      6/36
  4                            1/18        1/18                            4/36
  5                                  1/18                                  2/36
f_X(x) 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36      1
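The marginal row and column of this table, and the value E(XY) from Example 4.1.4, can be reproduced by summing the joint pmf. A Python sketch:

```python
from collections import defaultdict
from fractions import Fraction

# Joint pmf of X = sum, Y = |difference| for two fair dice.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(d1 + d2, abs(d1 - d2))] += Fraction(1, 36)

# Theorem 4.1.6: marginal pmfs are obtained by summing the joint pmf
# over the other variable.
fX, fY = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), pr in joint.items():
    fX[x] += pr
    fY[y] += pr

print(fY[1], fX[7])   # 5/18 (= 10/36) and 1/6 (= 6/36), as in the table
print(sum(x * y * pr for (x, y), pr in joint.items()))   # E(XY) = 245/18
```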

Example 4.1.8 (Dice probabilities): Quantities that involve only X or only Y can be calculated using either the joint pmf or the marginal pmf. For example,

P(Y < 3) = P(Y = 0) + P(Y = 1) + P(Y = 2) = (6 + 10 + 8)/36 = 2/3,

and equivalently, using the joint pmf,

P(Y < 3) = Σ_{x=2}^{12} Σ_{y=0}^{2} P(X = x, Y = y) = 2/3.

Note: It is possible for two different joint pmfs to have the same marginals. Hence, the joint pmf cannot be determined from just the marginals.

Example 4.1.9 (Same marginals, different joint pmf): Define two joint pmfs by

(1) f(0, 0) = f(0, 1) = 1/6, f(1, 0) = f(1, 1) = 1/3, and
(2) f(0, 0) = 1/12, f(1, 0) = 5/12, f(0, 1) = f(1, 1) = 3/12.

Then they have the same marginal distributions for X and Y:

(1) f_X(0) = f(0, 0) + f(0, 1) = 1/6 + 1/6 = 1/3,  f_Y(0) = f(0, 0) + f(1, 0) = 1/6 + 1/3 = 1/2;
(2) f_X(0) = f(0, 0) + f(0, 1) = 1/12 + 3/12 = 1/3,  f_Y(0) = f(0, 0) + f(1, 0) = 1/12 + 5/12 = 1/2.
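A small sketch confirming that these two joint pmfs have identical marginals even though they differ as joint distributions (the helper function is illustrative):

```python
from fractions import Fraction as F

joint1 = {(0, 0): F(1, 6),  (0, 1): F(1, 6),  (1, 0): F(1, 3),  (1, 1): F(1, 3)}
joint2 = {(0, 0): F(1, 12), (1, 0): F(5, 12), (0, 1): F(3, 12), (1, 1): F(3, 12)}

def marginals(joint):
    # Sum the joint pmf over the other coordinate to get f_X and f_Y.
    fX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
    fY = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}
    return fX, fY

print(marginals(joint1) == marginals(joint2))   # True: same marginals
print(joint1 == joint2)                         # False: different joint pmfs
```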

Definition 4.1.10: A function f(x, y) from R^2 into R is called a joint probability density function or joint pdf of the continuous bivariate random vector (X, Y) if for every A ⊂ R^2,

P((X, Y) ∈ A) = ∫∫_A f(x, y) dx dy.

Notes:

A valid joint pdf f(x, y) must satisfy (1) f(x, y) ≥ 0 and (2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.

The expected value of a real-valued function g(X, Y) is

E g(X, Y) = ∫∫_{R^2} g(x, y) f(x, y) dx dy.

The marginal pdfs of X and Y are given by

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy  and  f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx.

Definition: The joint cdf of (X, Y) is defined by F(x, y) = P(X ≤ x, Y ≤ y) for all (x, y) ∈ R^2. For the continuous case,

F(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(s, t) ds dt,

which by the Fundamental Theorem of Calculus implies

∂^2 F(x, y) / (∂x ∂y) = f(x, y).

Example 4.1.11 (Calculating joint probabilities): Define the joint pdf by

f(x, y) = 6xy^2 for 0 < x < 1 and 0 < y < 1,  f(x, y) = 0 otherwise.

(1) Show that this is a valid joint pdf.
(2) Find f_X(x) and f_Y(y).
(3) Find P(X + Y ≥ 1).

(1) f(x, y) ≥ 0, and ∫_0^1 ∫_0^1 6xy^2 dx dy = 6 (1/2)(1/3) = 1, so f is a valid joint pdf.

(2) f_X(x) = ∫_0^1 6xy^2 dy = 2x for 0 < x < 1, and f_Y(y) = ∫_0^1 6xy^2 dx = 3y^2 for 0 < y < 1.

(3) P(X + Y ≥ 1) = ∫_0^1 ∫_{1−y}^1 6xy^2 dx dy = ∫_0^1 ∫_{1−x}^1 6xy^2 dy dx = 9/10.
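Parts (1) and (3) can be verified numerically. A sketch assuming scipy is available (dblquad passes the inner variable first, so the integrand is written as f(y, x)):

```python
from scipy.integrate import dblquad

# Joint pdf f(x, y) = 6 x y^2 on the unit square.
f = lambda y, x: 6 * x * y**2

# (1) total probability over the unit square
total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)

# (3) P(X + Y >= 1): for each x in (0, 1), y runs from 1 - x up to 1
prob, _ = dblquad(f, 0, 1, lambda x: 1 - x, lambda x: 1)

print(round(total, 6), round(prob, 6))   # 1.0 and 0.9
```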

Example 4.1.12: Let the continuous random vector (X, Y) have joint pdf

f(x, y) = e^{−y} for 0 < x < y < ∞,  f(x, y) = 0 otherwise;

equivalently, f(x, y) = e^{−y} I_{{(u,v): 0 < u < v < ∞}}(x, y). Find P(X + Y ≥ 1) and the marginal pdfs of X and Y.

P(X + Y ≥ 1) = 1 − P(X + Y < 1) = 1 − ∫_0^{1/2} ∫_x^{1−x} e^{−y} dy dx = 1 − ∫_0^{1/2} (e^{−x} − e^{−(1−x)}) dx = 2e^{−1/2} − e^{−1}.

The marginal pdfs are

f_X(x) = ∫_x^{∞} f(x, y) dy = ∫_x^{∞} e^{−y} dy = e^{−x}  for x > 0,
f_Y(y) = ∫_0^{y} f(x, y) dx = ∫_0^{y} e^{−y} dx = y e^{−y}  for y > 0.
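A numerical check of this probability (a sketch with scipy; 2e^{−1/2} − e^{−1} ≈ 0.8452):

```python
import numpy as np
from scipy.integrate import dblquad

# Joint pdf f(x, y) = exp(-y) on 0 < x < y < infinity.
f = lambda y, x: np.exp(-y)

# P(X + Y >= 1): for each x, y runs from max(x, 1 - x) up to infinity,
# which enforces both y > x and x + y >= 1.
prob, _ = dblquad(f, 0, np.inf, lambda x: max(x, 1 - x), lambda x: np.inf)

exact = 2 * np.exp(-0.5) - np.exp(-1)
print(round(prob, 4), round(exact, 4))   # both approximately 0.8452
```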

Example (Tensile strength): The tensile strengths X and Y of two kinds of nylon fiber have joint density function proportional to exp(−(x + y)/λ) for x > 0, y > 0. Then the joint pdf of (X, Y) is given by f_{X,Y}(x, y) = c exp(−(x + y)/λ) for x > 0, y > 0.

(a) Find c.
(b) Find the cdf of (X, Y).
(c) Find P(X + Y > λ).

(a) We have

1 = ∫_0^∞ ∫_0^∞ f_{X,Y}(x, y) dx dy = ∫_0^∞ ∫_0^∞ c exp(−(x + y)/λ) dx dy = c ∫_0^∞ exp(−x/λ) dx ∫_0^∞ exp(−y/λ) dy = c λ^2,

so c = 1/λ^2.

(b) We have

F_{X,Y}(x, y) = ∫_0^y ∫_0^x f_{X,Y}(s, t) ds dt = ∫_0^y ∫_0^x c exp(−(s + t)/λ) ds dt
             = (1/λ) ∫_0^x exp(−s/λ) ds · (1/λ) ∫_0^y exp(−t/λ) dt
             = [1 − exp(−x/λ)][1 − exp(−y/λ)]  for x > 0, y > 0.

(c) Integrating over the complement of the event,

P(X + Y > λ) = ∫∫_{x+y>λ} f_{X,Y}(x, y) dx dy = 1 − ∫∫_{x+y≤λ} f_{X,Y}(x, y) dx dy
             = 1 − ∫_0^λ ( ∫_0^{λ−y} c exp(−(x + y)/λ) dx ) dy
             = 1 − ∫_0^λ (1/λ) exp(−y/λ) [1 − exp(−(λ − y)/λ)] dy
             = 1 − ∫_0^λ (1/λ) [exp(−y/λ) − exp(−1)] dy
             = 1 − [(1 − exp(−1)) − exp(−1)] = 2 exp(−1).

Alternatively, integrating over the event itself,

P(X + Y > λ) = ∫_0^λ ( ∫_{λ−x}^∞ c exp(−(x + y)/λ) dy ) dx + ∫_λ^∞ ( ∫_0^∞ c exp(−(x + y)/λ) dy ) dx
             = (1/λ) ∫_0^λ exp(−x/λ) exp(−(λ − x)/λ) dx + (1/λ) ∫_λ^∞ exp(−x/λ) dx
             = (1/λ) ∫_0^λ exp(−1) dx + exp(−1)
             = exp(−1) + exp(−1) = 2 exp(−1).
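Because the joint pdf factors as (1/λ) e^{−x/λ} · (1/λ) e^{−y/λ}, X and Y behave like independent exponential random variables with mean λ, so this answer is easy to check by simulation. A sketch, taking λ = 1 without loss of generality:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n = 1.0, 10**6

# X and Y simulated as independent exponentials with mean lam.
x = rng.exponential(lam, n)
y = rng.exponential(lam, n)

print(np.mean(x + y > lam))   # approximately 2*exp(-1) = 0.7358
print(2 * np.exp(-1))
```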

Example (Obtaining the pdf from the cdf): Consider the cdf

F_{X,Y}(x, y) = [1 − exp(−x)][1 − exp(−y)]  for x > 0 and y > 0.

Then the pdf is

f_{X,Y}(x, y) = ∂^2 F_{X,Y}(x, y) / (∂x ∂y) = exp(−x) exp(−y)  for x > 0 and y > 0.

Example (Obtaining the cdf from the pdf): Consider the following pdf:

f(x, y) = 1  for 0 < x < 1, 0 < y < 1.

Then the corresponding cdf is

F(x, y) = 0,   if x < 0 or y < 0;
          xy,  if 0 ≤ x < 1 and 0 ≤ y < 1;
          y,   if 1 ≤ x and 0 ≤ y < 1;
          x,   if 0 ≤ x < 1 and 1 ≤ y;
          1,   if 1 ≤ x and 1 ≤ y.
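Both directions can be mirrored symbolically. A sketch with sympy (assuming it is installed): differentiating the cdf recovers the pdf, and integrating the uniform pdf gives F(x, y) = xy inside the unit square.

```python
import sympy as sp

x, y, s, t = sp.symbols('x y s t', positive=True)

# pdf from cdf: mixed partial derivative of F(x, y) = (1 - e^{-x})(1 - e^{-y}).
F = (1 - sp.exp(-x)) * (1 - sp.exp(-y))
print(sp.diff(F, x, y))                       # exp(-x)*exp(-y)

# cdf from pdf: integrate f = 1 over (0, x) x (0, y), for 0 < x < 1, 0 < y < 1.
print(sp.integrate(1, (s, 0, x), (t, 0, y)))  # x*y
```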

Chapter 4.2 Conditional Distributions and Independence

Definition 4.2.1: Let (X, Y) be a discrete bivariate random vector with joint pmf f(x, y) and marginal pmfs f_X(x) and f_Y(y). For any x such that P(X = x) = f_X(x) > 0, the conditional pmf of Y given that X = x is the function of y denoted by f(y | x):

f(y | x) = P(Y = y | X = x) = f(x, y) / f_X(x).

For any y such that P(Y = y) = f_Y(y) > 0, the conditional pmf of X given that Y = y is the function of x denoted by f(x | y):

f(x | y) = P(X = x | Y = y) = f(x, y) / f_Y(y).

Note: f(y | x) is a valid pmf, since f(y | x) ≥ 0 (because f(x, y) and f_X(x) are joint and marginal pmfs), and

Σ_y f(y | x) = Σ_y f(x, y) / f_X(x) = f_X(x) / f_X(x) = 1.

Example 4.2.2: Define the joint pmf of (X, Y) by

f(0, 1) = f(0, 2) = 2/18,  f(1, 1) = f(1, 3) = 3/18,  f(1, 2) = 4/18,  f(2, 3) = 4/18,

that is (empty cells are 0),

 y\x      0      1      2    f_Y(y)
  1     2/18   3/18           5/18
  2     2/18   4/18           6/18
  3            3/18   4/18    7/18
f_X(x)  4/18  10/18   4/18      1

(a) Obtain the conditional distribution of Y given X = 1.
(b) Find P(Y > 1 | X = 1).

(a) f_X(1) = 10/18, so

f(Y = 1 | X = 1) = 3/10,  f(Y = 2 | X = 1) = 4/10,  f(Y = 3 | X = 1) = 3/10.

(b) P(Y > 1 | X = 1) = f(Y = 2 | X = 1) + f(Y = 3 | X = 1) = 7/10; equivalently, P(Y > 1 | X = 1) = P(Y > 1, X = 1) / P(X = 1) = 7/10.

Example (Coin tossing): Consider tossing a fair coin three times and let X be the number of heads and Y the number of tails before the first head (with Y = 3 if no head appears). The joint pmf of (X, Y) is given in the following table (empty cells are 0):

 y\x       0     1     2     3   P(Y = y)
  0             1/8   2/8   1/8    4/8
  1             1/8   1/8          2/8
  2             1/8                1/8
  3       1/8                      1/8
P(X = x)  1/8   3/8   3/8   1/8      1

The conditional pmf f(x | y) is:

f(x | y = 0) = P(X = x | Y = 0) = P(X = x, Y = 0) / P(Y = 0) = 1/4 if x = 1 or 3;  1/2 if x = 2;  0 otherwise.
f(x | y = 1) = P(X = x | Y = 1) = P(X = x, Y = 1) / P(Y = 1) = 1/2 if x = 1 or 2;  0 otherwise.

The conditional expected value and moments:

E(X | Y = 0) = 1 (1/4) + 2 (1/2) + 3 (1/4) = 2,
E(X^2 | Y = 0) = 1 (1/4) + 4 (1/2) + 9 (1/4) = 4.5.

A conditional probability:

P(X ≤ 2 | Y = 0) = P(X = 0 | Y = 0) + P(X = 1 | Y = 0) + P(X = 2 | Y = 0) = 3/4.
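These conditional quantities can be checked by brute force over the 8 equally likely outcomes. A Python sketch (names are illustrative):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product('HT', repeat=3))       # 8 equally likely outcomes
X = lambda s: s.count('H')                     # number of heads
Y = lambda s: s.index('H') if 'H' in s else 3  # tails before the first head

given = [s for s in outcomes if Y(s) == 0]     # condition on the event Y = 0
f_cond = {x: Fraction(sum(1 for s in given if X(s) == x), len(given))
          for x in range(4)}

print(f_cond)                                        # 1/4, 1/2, 1/4 at x = 1, 2, 3
print(sum(x * p for x, p in f_cond.items()))         # E(X | Y = 0) = 2
print(sum(x**2 * p for x, p in f_cond.items()))      # E(X^2 | Y = 0) = 9/2
print(sum(p for x, p in f_cond.items() if x <= 2))   # P(X <= 2 | Y = 0) = 3/4
```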

Definition 4.2.3: Let (X, Y) be a continuous bivariate random vector with joint pdf f(x, y) and marginal pdfs f_X(x) and f_Y(y). For any x such that f_X(x) > 0, the conditional pdf of Y given that X = x is the function of y denoted by f(y | x):

f(y | x) = f(x, y) / f_X(x).

For any y such that f_Y(y) > 0, the conditional pdf of X given that Y = y is the function of x denoted by f(x | y):

f(x | y) = f(x, y) / f_Y(y).

Example 4.2.4: Let the continuous random vector (X, Y) have joint pdf f(x, y) = e^{−y} I_{{(u,v): 0 < u < v < ∞}}(x, y).

(a) Find the marginal pdf of X.
(b) For any x such that f_X(x) > 0, find f(y | x).

(a) f_X(x) = ∫_x^∞ e^{−y} dy = e^{−x} I_{(0,∞)}(x).

(b) f(y | x) = f(x, y) / f_X(x) = e^{−y} / e^{−x} = e^{−(y − x)}  for y > x > 0.

Calculating expected values using conditional pmfs or pdfs: Let (X, Y) be a discrete (continuous) bivariate random vector with joint pmf (pdf) f(x, y) and marginal pmfs (pdfs) f_X(x) and f_Y(y), and let g(Y) be a function of Y. Then the conditional expected value of g(Y) given that X = x is denoted by E(g(Y) | x) and is given by

E(g(Y) | x) = Σ_y g(y) f(y | x)   [ E(g(Y) | x) = ∫_{−∞}^{∞} g(y) f(y | x) dy in the continuous case ].

Example 4.2.4 (continued):

(c) Find E(Y | x).
(d) Find Var(Y | x).

(c) E(Y | x) = ∫_x^∞ y e^{−(y − x)} dy = 1 + x.

(d) E(Y^2 | x) = ∫_x^∞ y^2 e^{−(y − x)} dy = ∫_0^∞ (z + x)^2 e^{−z} dz = x^2 + 2x + 2; therefore,

Var(Y | x) = E(Y^2 | x) − (E(Y | x))^2 = 1.
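A symbolic check of (c) and (d), using the conditional pdf derived in (b). A sympy sketch:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f_cond = sp.exp(-(y - x))     # conditional pdf of Y given X = x, valid for y > x

EY  = sp.integrate(y * f_cond, (y, x, sp.oo))      # E(Y | x)
EY2 = sp.integrate(y**2 * f_cond, (y, x, sp.oo))   # E(Y^2 | x)

print(sp.simplify(EY))            # x + 1
print(sp.simplify(EY2 - EY**2))   # 1, the conditional variance Var(Y | x)
```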

Example (Soft drink machine): A soft drink machine has a random amount Y_2 in supply at the beginning of a given day and dispenses a random amount Y_1 during the day. It has been observed that they have the joint pdf

f(y_1, y_2) = 1/2  for 0 ≤ y_1 ≤ y_2 ≤ 2.

Then the conditional pdf of Y_1 given Y_2 = y_2 is

f(y_1 | Y_2 = y_2) = f(y_1, y_2) / f_{Y_2}(y_2) = 1/y_2  for 0 ≤ y_1 ≤ y_2,

and, for example, P(Y_1 ≤ 1/2 | Y_2 = 1.5) = (1/2)/1.5 = 1/3.

Definition 4.2.5: Let (X, Y) be a bivariate random vector with joint pdf or pmf f(x, y) and marginal pdfs or pmfs f_X(x) and f_Y(y). Then X and Y are called independent random variables if, for every x ∈ R and y ∈ R,

f(x, y) = f_X(x) f_Y(y).

Consequently, if X and Y are independent, f(y | x) = f_Y(y) and f(x | y) = f_X(x).

Technical note: If f(x, y) is the joint pdf for the continuous random vector (X, Y) and f(x, y) ≠ f_X(x) f_Y(y) only on a set A such that ∫∫_A f(x, y) dx dy = 0, then X and Y are still called independent random variables.

Example 4.2.6: Consider the discrete bivariate random vector (X, Y) with joint pmf given by

f(1, 1) = f(2, 1) = f(2, 2) = 1/10,  f(1, 2) = f(1, 3) = 1/5,  f(2, 3) = 3/10.

Find the marginals of X and Y. Are X and Y independent? No, because, for example, f(1, 3) = 1/5 ≠ f_X(1) f_Y(3) = (1/2)(1/2) = 1/4.
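The marginals and the independence check for Example 4.2.6 in code (a sketch; the joint pmf is entered by hand):

```python
from fractions import Fraction as F

joint = {(1, 1): F(1, 10), (2, 1): F(1, 10), (2, 2): F(1, 10),
         (1, 2): F(1, 5),  (1, 3): F(1, 5),  (2, 3): F(3, 10)}

# Marginal pmfs by summing over the other coordinate.
fX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (1, 2)}
fY = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (1, 2, 3)}

print(fX, fY)                         # fX: 1/2, 1/2;  fY: 1/5, 3/10, 1/2
print(joint[(1, 3)], fX[1] * fY[3])   # 1/5 versus 1/4, so X and Y are not independent
```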

Question: Can we check for independence without knowing the marginals?

Lemma 4.2.7: Let (X, Y) be a bivariate random vector with joint pdf or pmf f(x, y). Then X and Y are independent random variables if and only if there exist functions g(x) and h(y) such that, for every x ∈ R and y ∈ R,

f(x, y) = g(x) h(y).

In this case f_X(x) = c g(x) and f_Y(y) = d h(y), where c and d are constants chosen so that f_X(x) and f_Y(y) are valid pdfs or pmfs.

Example 4.2.8: Consider the joint pdf f(x, y) = (1/384) x^2 y^4 e^{−y − x/2} for x > 0 and y > 0. Since f(x, y) factors into a function of x times a function of y, X and Y are independent random variables by Lemma 4.2.7.

Notes:

1. Consider the set {(x, y) : x ∈ A and y ∈ B}, where A = {x : f_X(x) > 0} and B = {y : f_Y(y) > 0}; this set is called a cross-product and is denoted by A × B.

2. If f(x, y) is a joint pdf or pmf such that the set {(x, y) : f(x, y) > 0} is not a cross-product, then the random variables X and Y with joint pdf or pmf f(x, y) are not independent.

3. If it is known that X and Y are independent random variables with marginal pdfs (or pmfs) f_X(x) and f_Y(y), then the joint pdf (or pmf) of X and Y is given by f(x, y) = f_X(x) f_Y(y). (See Example 4.2.9 for the discrete case.)
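A numerical sketch for Example 4.2.8, checking that the density integrates to 1 and that it equals the product of its marginal pdfs at a test point, as independence requires (it assumes the normalizing constant 1/384 used above):

```python
import numpy as np
from scipy.integrate import quad, dblquad

# Joint pdf of Example 4.2.8 (inner variable y listed first, for dblquad).
f = lambda y, x: x**2 * y**4 * np.exp(-y - x / 2) / 384

total, _ = dblquad(f, 0, np.inf, lambda x: 0, lambda x: np.inf)
print(round(total, 6))                          # 1.0: a valid joint pdf

# Since f(x, y) = g(x) h(y), the joint pdf equals the product of its marginals.
x0, y0 = 1.5, 2.0
fX0, _ = quad(lambda yv: f(yv, x0), 0, np.inf)  # marginal of X evaluated at x0
fY0, _ = quad(lambda xv: f(y0, xv), 0, np.inf)  # marginal of Y evaluated at y0
print(np.isclose(f(y0, x0), fX0 * fY0))         # True
```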

Theorem 4.2.10: Let X and Y be independent random variables.

a. For any A ⊂ R and B ⊂ R, P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B).

b. Let g(x) be a function only of x and h(y) be a function only of y. Then E(g(X) h(Y)) = E(g(X)) E(h(Y)).

Proof:

Example 4.2.11: Let X and Y be independent exponential(1) random variables.

a. Find the joint pdf of X and Y.
b. Find P(X ≥ 4, Y < 3).
c. Find E(XY).
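By independence the joint pdf is f(x, y) = e^{−x} e^{−y}, and by Theorem 4.2.10, P(X ≥ 4, Y < 3) = P(X ≥ 4) P(Y < 3) = e^{−4}(1 − e^{−3}) and E(XY) = E(X) E(Y) = 1. A numerical sketch confirming these values (scipy assumed):

```python
import numpy as np
from scipy.integrate import dblquad

# (a) joint pdf of two independent exponential(1) random variables
f = lambda y, x: np.exp(-x) * np.exp(-y)

# (b) P(X >= 4, Y < 3) = e^{-4} (1 - e^{-3})
prob, _ = dblquad(f, 4, np.inf, lambda x: 0, lambda x: 3)
print(round(prob, 6), round(np.exp(-4) * (1 - np.exp(-3)), 6))

# (c) E(XY) = E(X) E(Y) = 1
exy, _ = dblquad(lambda y, x: x * y * f(y, x), 0, np.inf, lambda x: 0, lambda x: np.inf)
print(round(exy, 6))   # 1.0
```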

Note (Checking independence): Let (X, Y) be a random vector. Then X and Y are called independent random variables if for every C = {(x, y) : x ∈ A, y ∈ B} we have

P((X, Y) ∈ C) = P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B).

Under certain conditions, this definition can be simplified in terms of cdfs: X and Y are independent random variables if

F_{X,Y}(x, y) = F_X(x) F_Y(y)  for all x and y.

Example (Checking independence): Suppose (X, Y) has the joint pdf

f_{X,Y}(x, y) = (1 + xy)/4  for |x| < 1 and |y| < 1.

Then X and Y are not independent, but X^2 and Y^2 are independent. The marginal pdf of X is

f_X(x) = ∫_{−1}^{1} f_{X,Y}(x, y) dy = ∫_{−1}^{1} [(1 + xy)/4] dy = 1/2  for |x| < 1,

and by symmetry the marginal pdf of Y is f_Y(y) = 1/2 for |y| < 1. So f_{X,Y}(x, y) ≠ f_X(x) f_Y(y), and X and Y are not independent.

To get the joint pdf of (X^2, Y^2), we first get the joint cdf of (X^2, Y^2). For 0 < x < 1 and 0 < y < 1,

F_{X^2,Y^2}(x, y) = P(X^2 ≤ x, Y^2 ≤ y) = P(−√x ≤ X ≤ √x, −√y ≤ Y ≤ √y)
                  = ∫_{−√x}^{√x} { ∫_{−√y}^{√y} [(1 + st)/4] dt } ds
                  = ∫_{−√x}^{√x} (√y / 2) ds = √x √y = P(X^2 ≤ x) P(Y^2 ≤ y).

So the joint pdf of (X^2, Y^2) is

f_{X^2,Y^2}(x, y) = ∂^2 F_{X^2,Y^2}(x, y) / (∂x ∂y) = 1 / (4 √x √y)  for 0 < x < 1, 0 < y < 1.

Theorem 4.2.12: Let X and Y be independent random variables with moment generating functions M_X(t) and M_Y(t). Then the moment generating function of the random variable Z = X + Y is given by

M_Z(t) = M_X(t) M_Y(t).

Proof:

Example 4.2.13: Let X ~ n(μ, σ^2) and Y ~ n(γ, τ^2) be independent normal random variables. Find the pdf of Z = X + Y. (This is a very important result!!)
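Applying Theorem 4.2.12 to the normal mgfs gives M_Z(t) = exp((μ + γ)t + (σ^2 + τ^2) t^2 / 2), i.e. Z ~ n(μ + γ, σ^2 + τ^2). A simulation sketch of this conclusion, with illustrative parameter values (numpy and scipy assumed):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu, sigma, gamma, tau = 1.0, 2.0, -0.5, 1.5   # illustrative parameter values
n = 10**6

# Simulate Z = X + Y for independent normals X ~ n(mu, sigma^2), Y ~ n(gamma, tau^2).
z = rng.normal(mu, sigma, n) + rng.normal(gamma, tau, n)
loc, scale = mu + gamma, np.sqrt(sigma**2 + tau**2)

print(z.mean(), z.var())                                 # close to 0.5 and 6.25
print(np.mean(z <= 1), stats.norm.cdf(1, loc, scale))    # empirical vs claimed cdf at 1
```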
