3. General Random Variables, Part IV: Multiple Random Variables
ECE 302, Fall 2009, TR 3-4:15pm
Purdue University, School of ECE
Prof. Ilya Pollak
Joint PDF of two continuous r.v.'s

The joint PDF of continuous r.v.'s X and Y is the function f_{X,Y}(x, y) such that

    P((X, Y) \in A) = \iint_A f_{X,Y}(x, y) \, dx \, dy    for any event A \subset R^2.

(Figure: an event A in the (x, y) plane; a small square of area \delta^2 at (x, y) carries probability mass approximately f_{X,Y}(x, y) \, \delta^2.)

Interpretation: f_{X,Y}(x, y) is the probability mass per unit area,

    P(x \le X \le x + \delta, \; y \le Y \le y + \delta) \approx f_{X,Y}(x, y) \, \delta^2.
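A small numerical check of this interpretation, not part of the slides: the joint density below (independent unit-rate exponentials) and the point (x0, y0) are assumed purely for illustration.

```python
# Minimal sketch: compare the exact probability of a small delta x delta square
# with the approximation f_{X,Y}(x0, y0) * delta^2.
# Assumed joint PDF: X, Y independent Exponential(1), f(x, y) = exp(-x) exp(-y), x, y >= 0.
import numpy as np
from scipy import integrate

def f_xy(x, y):
    """Assumed joint PDF: independent unit-rate exponentials."""
    return np.exp(-x) * np.exp(-y)

x0, y0, delta = 1.0, 0.5, 0.01

# Exact probability of the square [x0, x0+delta] x [y0, y0+delta].
# dblquad integrates func(y, x) with x over the outer limits.
exact, _ = integrate.dblquad(lambda y, x: f_xy(x, y),
                             x0, x0 + delta,
                             lambda x: y0, lambda x: y0 + delta)

approx = f_xy(x0, y0) * delta**2   # probability mass per unit area times area

print(f"P(square) = {exact:.3e},  f(x0, y0)*delta^2 = {approx:.3e}")
```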
Expectations

    E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \, f_{X,Y}(x, y) \, dx \, dy

Recall: for discrete random variables,

    E[g(X, Y)] = \sum_x \sum_y g(x, y) \, p_{X,Y}(x, y).
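As a hedged illustration of this formula (an assumed example rather than lecture material), the sketch below approximates E[g(X, Y)] for independent Exponential(1) variables with g(x, y) = xy, both by numerical double integration and by Monte Carlo; the true value is E[X]E[Y] = 1.

```python
# Approximate E[g(X, Y)] two ways for an assumed joint PDF.
import numpy as np
from scipy import integrate

f_xy = lambda x, y: np.exp(-x) * np.exp(-y)   # assumed joint PDF (independent Exp(1))
g = lambda x, y: x * y

# E[g(X, Y)] = integral of g * f over the support (here the positive quadrant).
val, _ = integrate.dblquad(lambda y, x: g(x, y) * f_xy(x, y),
                           0, np.inf, lambda x: 0, lambda x: np.inf)

# Monte Carlo estimate of the same expectation.
rng = np.random.default_rng(0)
xs = rng.exponential(size=200_000)
ys = rng.exponential(size=200_000)
mc = np.mean(g(xs, ys))

print(f"numerical integral: {val:.4f},  Monte Carlo: {mc:.4f}")   # both ~ 1.0
```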
Joint CDF of two r.v.'s

    F_{X,Y}(x, y) = P(X \le x, Y \le y)
                  = \sum_{k \le x} \sum_{m \le y} p_{X,Y}(k, m)                          if X, Y are discrete r.v.'s
                  = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(a, b) \, da \, db      if X, Y are continuous r.v.'s
From the joint CDF to the joint PDF

If X, Y are continuous r.v.'s, then

    f_{X,Y}(x, y) = \frac{\partial^2}{\partial x \, \partial y} F_{X,Y}(x, y).
From the joint PDF to the marginal PDF

    f_X(x) = \frac{d}{dx} F_X(x)
           = \frac{d}{dx} P(X \le x)
           = \frac{d}{dx} P(X \le x, Y < \infty)
           = \frac{d}{dx} F_{X,Y}(x, \infty)
           = \frac{d}{dx} \int_{-\infty}^{x} \int_{-\infty}^{\infty} f_{X,Y}(a, b) \, db \, da
           = \frac{d}{dx} \int_{-\infty}^{x} \left[ \int_{-\infty}^{\infty} f_{X,Y}(a, b) \, db \right] da
           = \int_{-\infty}^{\infty} f_{X,Y}(x, b) \, db.

Similarly,

    f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(a, y) \, da.
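This marginalization can be checked numerically. The sketch below, an assumed example using a bivariate normal density with correlation 0.6, integrates out y and compares the result with the known standard normal marginal of X.

```python
# Check f_X(x) = integral of f_{X,Y}(x, b) db for an assumed bivariate normal.
import numpy as np
from scipy import integrate

rho = 0.6   # assumed correlation

def f_xy(x, y):
    """Standard bivariate normal PDF with correlation rho."""
    norm = 1.0 / (2 * np.pi * np.sqrt(1 - rho**2))
    quad_form = (x**2 - 2 * rho * x * y + y**2) / (2 * (1 - rho**2))
    return norm * np.exp(-quad_form)

def f_x_marginal(x):
    """Marginal PDF of X obtained by integrating the joint PDF over y."""
    val, _ = integrate.quad(lambda y: f_xy(x, y), -np.inf, np.inf)
    return val

for x in (-1.0, 0.0, 2.0):
    true = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # standard normal PDF
    print(f"x = {x:+.1f}: integrated marginal = {f_x_marginal(x):.5f}, "
          f"standard normal PDF = {true:.5f}")
```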
Obtaining the marginal PDF from the joint PDF: interpretation

    f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, b) \, db

For small \delta,

    P(x \le X \le x + \delta) \approx f_X(x) \, \delta \approx \left[ \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dy \right] \delta.
Independence of two continuous random variables

Continuous r.v.'s X and Y are called independent if

    f_{X,Y}(x, y) = f_X(x) \, f_Y(y)    for all x, y.
Joint PDF of many continuous r.v.'s

The joint PDF of continuous r.v.'s X_1, ..., X_n is the function f_{X_1,...,X_n}(x_1, ..., x_n) such that

    P((X_1, ..., X_n) \in A) = \int_A f_{X_1,...,X_n}(x_1, ..., x_n) \, dx_1 \cdots dx_n    for any event A \subset R^n.
Joint CDF of many r.v.'s

    F_{X_1 \cdots X_n}(x_1, ..., x_n) = P(X_1 \le x_1, ..., X_n \le x_n)
        = \sum_{k_1 \le x_1, ..., k_n \le x_n} p_{X_1 \cdots X_n}(k_1, ..., k_n)                          if X_1, ..., X_n are discrete r.v.'s
        = \int_{-\infty}^{x_n} \cdots \int_{-\infty}^{x_1} f_{X_1 \cdots X_n}(a_1, ..., a_n) \, da_1 \cdots da_n    if X_1, ..., X_n are continuous r.v.'s
Expectations

    E[g(X_1, X_2, ..., X_n)] = \int_{R^n} g(x_1, ..., x_n) \, f_{X_1,...,X_n}(x_1, ..., x_n) \, dx_1 \cdots dx_n

If g(X_1, X_2, ..., X_n) = a_0 + a_1 X_1 + a_2 X_2 + \cdots + a_n X_n, then

    E[g(X_1, X_2, ..., X_n)] = a_0 + a_1 E[X_1] + a_2 E[X_2] + \cdots + a_n E[X_n].
Independence of many continuous random variables

Continuous r.v.'s X_1, ..., X_n are called independent if

    f_{X_1,...,X_n}(x_1, ..., x_n) = f_{X_1}(x_1) \, f_{X_2}(x_2) \cdots f_{X_n}(x_n).
Properties of independent r.v.'s

If X_1, ..., X_n are independent, then
(1) so are g_1(X_1), ..., g_n(X_n);
(2) E[h_1(X_1) \cdots h_n(X_n)] = E[h_1(X_1)] \cdots E[h_n(X_n)];
(3) var(X_1 + \cdots + X_n) = var(X_1) + \cdots + var(X_n).
Covariance and Correlation

The covariance of two random variables X and Y is defined as

    cov(X, Y) = E[(X - E[X])(Y - E[Y])].

It can be shown that

    cov(X, Y) = E[XY] - E[X] E[Y].

The correlation coefficient of two random variables X and Y is defined as

    \rho(X, Y) = \frac{cov(X, Y)}{\sqrt{var(X) \, var(Y)}}.

Random variables X and Y are said to be uncorrelated if cov(X, Y) = 0. For random variables with nonzero variances, this is equivalent to \rho(X, Y) = 0.
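A quick simulation check of these definitions (the linear model Y = 2X + noise is an arbitrary assumption): the sample covariance computed from the definition should match the E[XY] - E[X]E[Y] identity, and the resulting correlation coefficient should match np.corrcoef.

```python
# Estimate cov(X, Y) and rho(X, Y) from simulated data, two ways each.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)          # assumed model: Y depends linearly on X plus noise

cov_def = np.mean((x - x.mean()) * (y - y.mean()))   # definition
cov_id = np.mean(x * y) - x.mean() * y.mean()        # E[XY] - E[X]E[Y]
rho = cov_def / np.sqrt(x.var() * y.var())

print(f"cov (definition) = {cov_def:.4f}, cov (identity) = {cov_id:.4f}")   # both ~ 2
print(f"rho = {rho:.4f}, np.corrcoef = {np.corrcoef(x, y)[0, 1]:.4f}")      # both ~ 0.894
```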
Properties of the correlation coefficient

    -1 \le \rho(X, Y) \le 1    (see Problems 4.20 and 4.21 in the recommended text)

If Y - E[Y] is a positive (resp., negative) multiple of X - E[X], then \rho(X, Y) = 1 (resp., \rho(X, Y) = -1).

If \rho(X, Y) = 1 (resp., \rho(X, Y) = -1), then, with probability 1, Y - E[Y] is a positive (resp., negative) multiple of X - E[X].
The correlation coefficient measures the strength of a linear dependence

(Figures: scatter plots, each showing 10,000 points = 10,000 independent realizations of (X, Y).)

- \rho(X, Y) \approx 0.9: a thin cloud means \rho is close to 1; a thick cloud would mean \rho is close to 0.
- \rho(X, Y) = 1 and \rho(X, Y) = -1: the points fall exactly on a line. Note: a negative slope means \rho < 0 and a positive slope means \rho > 0, but \rho is NOT related to the magnitude of the slope.
- \rho(X, Y) = 0 with var(X) = var(Y), and \rho(X, Y) = 0 with var(X) \ne var(Y): in both cases X and Y are uncorrelated.
- \rho(X, Y) \approx 0.7 and \rho(X, Y) \approx 0.4: progressively thicker clouds, i.e., weaker linear dependence.
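A sketch of how scatter plots like these can be generated; the slide figures themselves are not reproduced, and the construction Y = \rho Z_1 + \sqrt{1 - \rho^2} Z_2 for jointly Gaussian (X, Y) is a standard device assumed here rather than taken from the lecture.

```python
# Draw 10,000 points with a chosen correlation and report the empirical rho.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
for rho in (0.9, 1.0, -1.0, 0.0, 0.7, 0.4):
    z1 = rng.normal(size=n)
    z2 = rng.normal(size=n)
    x = z1
    y = rho * z1 + np.sqrt(1 - rho**2) * z2   # Gaussian pair with correlation rho
    print(f"target rho = {rho:+.1f}, empirical rho = {np.corrcoef(x, y)[0, 1]:+.3f}")
```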
Goldman Sachs and JPMorgan prices

(Figure: daily price series for Goldman Sachs and JPMorgan.)

Goldman Sachs and JPMorgan daily returns

(Figure: scatter plot of daily returns; empirical correlation coefficient \approx 0.71, slope \approx 0.64.)
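For concreteness, a hedged sketch of how such numbers could be computed from price data; the price arrays below are made-up placeholders (not the actual GS/JPM series), and regressing JPM returns on GS returns is an assumption about which slope the slide reports.

```python
# Compute daily returns, the empirical correlation coefficient, and the
# least-squares slope from two (hypothetical) price series.
import numpy as np

# Hypothetical daily closing prices -- placeholders, replace with real data.
gs_prices = np.array([120.0, 122.5, 121.0, 125.0, 124.0, 126.5])
jpm_prices = np.array([40.0, 40.8, 40.2, 41.5, 41.3, 42.0])

# Daily (simple) returns: r_t = (P_t - P_{t-1}) / P_{t-1}.
gs_ret = np.diff(gs_prices) / gs_prices[:-1]
jpm_ret = np.diff(jpm_prices) / jpm_prices[:-1]

# Empirical correlation coefficient and slope of JPM returns regressed on GS returns.
rho_hat = np.corrcoef(gs_ret, jpm_ret)[0, 1]
slope = np.cov(gs_ret, jpm_ret)[0, 1] / np.var(gs_ret, ddof=1)

print(f"empirical correlation = {rho_hat:.2f}, slope = {slope:.2f}")
```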
Correlation or dependence do not imply causation

Historically, the daily returns of Goldman Sachs and JPMorgan have had a correlation coefficient of about 0.71 (estimated over Jan 5, 2005 - Oct 19, 2009). This does not imply that GS has a causal effect on JPM, or that JPM has a causal effect on GS. Since both companies are in the same economy (US) and the same industry (banking), they are exposed to similar business environments, and hence it is natural to expect that their performance will be correlated.
Problem 4.17

Suppose X and Y are r.v.'s with the same variance. Show that X - Y and X + Y are uncorrelated.

Solution:

    cov(X - Y, X + Y) = E[(X - Y)(X + Y)] - E[X - Y] E[X + Y]
                      = E[X^2 - Y^2] - (E[X])^2 + (E[Y])^2
                      = var(X) - var(Y)
                      = 0.
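A Monte Carlo sanity check of this result, not required by the solution: for any joint distribution of (X, Y) with var(X) = var(Y) (the Gaussian construction below is an arbitrary assumption), the sample covariance of X - Y and X + Y should be near zero.

```python
# Check that X - Y and X + Y are uncorrelated when var(X) = var(Y).
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
z1 = rng.normal(size=n)
z2 = rng.normal(size=n)
x = 1.0 + z1                                  # var(X) = 1
y = -2.0 + 0.5 * z1 + np.sqrt(0.75) * z2      # correlated with X, var(Y) = 0.25 + 0.75 = 1

cov_hat = np.cov(x - y, x + y)[0, 1]
print(f"empirical cov(X - Y, X + Y) = {cov_hat:.4f}")   # ~ 0
```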
Example 4.14

Let X and Y be the number of H's and T's, respectively, in n coin tosses. Find \rho(X, Y).

Method 1: since X = n - Y, we immediately have \rho(X, Y) = -1.

Method 2: since X = n - Y, it follows that E[X] = n - E[Y], hence X - E[X] = -(Y - E[Y]), and var(X) = var(Y). Therefore

    cov(X, Y) = E[(X - E[X])(Y - E[Y])] = -E[(X - E[X])^2] = -var(X),

and

    \rho(X, Y) = cov(X, Y) / (\sigma_X \sigma_Y) = -var(X) / (\sigma_X \sigma_X) = -1.
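A simulation check of this example; the fair coin and n = 20 tosses are assumed for illustration, and the result \rho = -1 does not depend on those choices.

```python
# Count heads X and tails Y in repeated blocks of coin tosses and estimate rho(X, Y).
import numpy as np

rng = np.random.default_rng(4)
n_tosses, n_trials = 20, 50_000
tosses = rng.integers(0, 2, size=(n_trials, n_tosses))   # 1 = heads, 0 = tails
x = tosses.sum(axis=1)            # number of heads in each trial
y = n_tosses - x                  # number of tails in each trial

print(f"empirical rho(X, Y) = {np.corrcoef(x, y)[0, 1]:.6f}")   # exactly -1
```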
Independence and Uncorrelatedness

If X and Y are independent, then E[XY] = E[X] E[Y], and therefore cov(X, Y) = 0, which means that X and Y are uncorrelated.

However, if X and Y are uncorrelated, they are not necessarily independent (a problem on HW 8).

If X and Y are correlated, they are dependent.

If X and Y are dependent, they are not necessarily correlated.
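One standard illustration of "uncorrelated does not imply independent" (an assumed example, not necessarily the HW 8 problem): take X uniform on [-1, 1] and Y = X^2, so that Y is a deterministic function of X yet cov(X, Y) = E[X^3] - E[X] E[X^2] = 0. A quick numerical check:

```python
# Dependent but uncorrelated: X uniform on [-1, 1], Y = X^2.
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(-1, 1, size=500_000)
y = x**2                                   # Y is completely determined by X

cov_hat = np.cov(x, y)[0, 1]
print(f"empirical cov(X, Y) = {cov_hat:.4f}")   # ~ 0, despite the dependence
```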
Variance of the sum of r.v.'s

    var(X_1 + X_2) = var(X_1) + var(X_2) + 2 \, cov(X_1, X_2)

More generally,

    var\left( \sum_{i=1}^{n} X_i \right) = \sum_{i=1}^{n} var(X_i) + \sum_{\{(i, j) : i \ne j\}} cov(X_i, X_j).

Derivation: denoting \tilde{X}_i = X_i - E[X_i], we have

    var\left( \sum_{i=1}^{n} X_i \right) = E\left[ \left( \sum_{i=1}^{n} \tilde{X}_i \right)^2 \right]
        = E\left[ \sum_{i=1}^{n} \sum_{j=1}^{n} \tilde{X}_i \tilde{X}_j \right]
        = \sum_{i=1}^{n} E[\tilde{X}_i^2] + \sum_{\{(i, j) : i \ne j\}} E[\tilde{X}_i \tilde{X}_j]
        = \sum_{i=1}^{n} var(X_i) + \sum_{\{(i, j) : i \ne j\}} cov(X_i, X_j).
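A numerical check of the variance-of-a-sum formula; the 3-by-3 covariance matrix and joint Gaussian model below are arbitrary assumptions chosen for the example.

```python
# Verify var(X1 + X2 + X3) = sum of variances + sum of pairwise covariances (i != j).
import numpy as np

rng = np.random.default_rng(6)
cov_true = np.array([[2.0, 0.5, -0.3],
                     [0.5, 1.0,  0.2],
                     [-0.3, 0.2, 1.5]])   # assumed covariance matrix
samples = rng.multivariate_normal(mean=np.zeros(3), cov=cov_true, size=300_000)

lhs = samples.sum(axis=1).var()                 # empirical var(X1 + X2 + X3)
c = np.cov(samples, rowvar=False)               # empirical covariance matrix
sum_of_vars = np.trace(c)                       # sum_i var(X_i)
sum_of_covs = c.sum() - np.trace(c)             # sum over all i != j of cov(X_i, X_j)

print(f"var(sum) = {lhs:.4f},  formula = {sum_of_vars + sum_of_covs:.4f},  "
      f"exact = {cov_true.sum():.4f}")
```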