Homework 5 Solutions

126/DCP126 Probability, Fall 2014
Instructor: Prof. Wen-Guey Tzeng
5-Jan

1. Let the joint probability mass function of the discrete random variables X and Y be given by

p(x, y) = c(x + y) if x = 1, 2, 3 and y = 1, 2;  p(x, y) = 0 otherwise.

Answer.
(a) The value of the constant c: Σ_{x=1}^{3} Σ_{y=1}^{2} c(x + y) = 1 implies that c = 1/21.
(b) The marginal probability mass functions of X and Y:
p_X(x) = Σ_{y=1}^{2} (1/21)(x + y) = (2x + 3)/21, x = 1, 2, 3.
p_Y(y) = Σ_{x=1}^{3} (1/21)(x + y) = (6 + 3y)/21, y = 1, 2.
(c) P(X ≥ 2 | Y = 1) = [p(2, 1) + p(3, 1)] / p_Y(1) = (3/21 + 4/21) / (9/21) = 7/9.
(d) E(X) and E(Y):
E(X) = Σ_{x=1}^{3} Σ_{y=1}^{2} (1/21) x(x + y) = 46/21.
E(Y) = Σ_{x=1}^{3} Σ_{y=1}^{2} (1/21) y(x + y) = 11/7.

2. Two dice are rolled. The sum of the outcomes is denoted by X and the absolute value of their difference by Y. Calculate the joint probability mass function of X and Y and the marginal probability mass functions of X and Y.

Answer. Table 1 gives p(x, y), the joint probability mass function of X and Y; p_X(x), the marginal probability mass function of X; and p_Y(y), the marginal probability mass function of Y.

x\y      0      1      2      3      4      5     p_X(x)
2      1/36    0      0      0      0      0      1/36
3       0     2/36    0      0      0      0      2/36
4      1/36    0     2/36    0      0      0      3/36
5       0     2/36    0     2/36    0      0      4/36
6      1/36    0     2/36    0     2/36    0      5/36
7       0     2/36    0     2/36    0     2/36    6/36
8      1/36    0     2/36    0     2/36    0      5/36
9       0     2/36    0     2/36    0      0      4/36
10     1/36    0     2/36    0      0      0      3/36
11      0     2/36    0      0      0      0      2/36
12     1/36    0      0      0      0      0      1/36
p_Y(y) 6/36  10/36   8/36   6/36   4/36   2/36

Table 1: The joint probability mass function of X and Y in question 2.
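Table 1 can be checked by brute-force enumeration of the 36 equally likely outcomes of the two dice. The following short script is a verification sketch, not part of the original solution; the variable names are arbitrary.

```python
from fractions import Fraction
from collections import defaultdict

# Enumerate the 36 equally likely outcomes of two dice and tabulate
# X = sum of the faces, Y = absolute value of their difference.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(d1 + d2, abs(d1 - d2))] += Fraction(1, 36)

# Marginal pmfs: sum the joint pmf over the other variable.
p_X = defaultdict(Fraction)
p_Y = defaultdict(Fraction)
for (x, y), p in joint.items():
    p_X[x] += p
    p_Y[y] += p

# The printed fractions agree with Table 1 (shown in lowest terms,
# e.g. 10/36 appears as 5/18).
print(dict(sorted(p_X.items())))
print(dict(sorted(p_Y.items())))
```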

3. Two points are placed on a segment of length l independently and at random to divide the line into three parts. What is the probability that the length of none of the three parts exceeds a given value α, l/3 ≤ α ≤ l?

Answer. Let X and Y be the two points that are placed on the segment. Let E be the event that the length of none of the three parts exceeds the given value α. Clearly, P(E | X < Y) = P(E | Y < X) and P(X < Y) = P(Y < X) = 1/2. Therefore,

P(E) = P(E | X < Y)P(X < Y) + P(E | Y < X)P(Y < X) = P(E | X < Y)(1/2) + P(E | X < Y)(1/2) = P(E | X < Y).

This shows that for the calculation of P(E) we may reduce the sample space to the case where X < Y. The reduced sample space is S = {(x, y) : x < y, 0 < x < l, 0 < y < l}. The desired probability is the area of R = {(x, y) ∈ S : x < α, y − x < α, y > l − α} divided by area(S) = l²/2. But

area(R) = (3α − l)²/2 if l/3 ≤ α ≤ l/2,
area(R) = l²/2 − (3l²/2)(1 − α/l)² if l/2 ≤ α ≤ l.

Hence the desired probability is

P(E) = (3α/l − 1)² if l/3 ≤ α ≤ l/2,
P(E) = 1 − 3(1 − α/l)² if l/2 ≤ α ≤ l.
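The formula for P(E) can be sanity-checked numerically. The sketch below (not part of the original solution; the parameter values are arbitrary) simulates the two cut points for l = 1.

```python
import random

# Monte Carlo check of Problem 3.  For l = 1 and alpha = 0.5 the formula
# gives P(E) = 1 - 3*(1 - 0.5)**2 = 0.25; for alpha = 0.4 it gives
# (3*0.4 - 1)**2 = 0.04.
def estimate(alpha, l=1.0, trials=200_000):
    hits = 0
    for _ in range(trials):
        x, y = sorted((random.uniform(0, l), random.uniform(0, l)))
        pieces = (x, y - x, l - y)
        if max(pieces) <= alpha:
            hits += 1
    return hits / trials

print(estimate(0.5))   # close to 0.25
print(estimate(0.4))   # close to 0.04
```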

4. Let the joint probability mass function of random variables X and Y be given by

p(x, y) = (1/7)x²y if (x, y) ∈ {(1, 1), (1, 2), (2, 1)};  p(x, y) = 0 otherwise.

Are X and Y independent? Why or why not?

Answer. Note that
p(1, 1) = 1/7,
p_X(1) = p(1, 1) + p(1, 2) = 1/7 + 2/7 = 3/7,
p_Y(1) = p(1, 1) + p(2, 1) = 1/7 + 4/7 = 5/7.
Since p(1, 1) ≠ p_X(1)p_Y(1), X and Y are dependent.

5. The joint probability mass function p(x, y) of the random variables X and Y is given by the following table. Determine if X and Y are independent.

x\y      0        1        2        3
0     0.1681   0.1804   0.0574   0.0041
1     0.1804   0.1936   0.0616   0.0044
2     0.0574   0.0616   0.0196   0.0014
3     0.0041   0.0044   0.0014   0.0001

Answer. For i, j ∈ {0, 1, 2, 3}, the sum of the numbers in the ith row is p_X(i) and the sum of the numbers in the jth column is p_Y(j). We have that
p_X(0) = 0.41, p_X(1) = 0.44, p_X(2) = 0.14, p_X(3) = 0.01;
p_Y(0) = 0.41, p_Y(1) = 0.44, p_Y(2) = 0.14, p_Y(3) = 0.01.
Since for all x, y ∈ {0, 1, 2, 3}, p(x, y) = p_X(x)p_Y(y), X and Y are independent.

6. A point is selected at random from the disk R = {(x, y) ∈ R² : x² + y² ≤ 1}. Let X be the x-coordinate and Y be the y-coordinate of the point selected. Determine if X and Y are independent random variables.

Answer. The joint probability density function of X and Y is given by

f(x, y) = 1/area(R) = 1/π if (x, y) ∈ R;  f(x, y) = 0 otherwise.

Now
f_X(x) = ∫_{−√(1−x²)}^{√(1−x²)} (1/π) dy = (2/π)√(1 − x²),
f_Y(y) = ∫_{−√(1−y²)}^{√(1−y²)} (1/π) dx = (2/π)√(1 − y²).
Since f(x, y) ≠ f_X(x)f_Y(y), the random variables X and Y are not independent.
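The dependence in Problem 6 can also be seen numerically. The sketch below (not from the original solution) samples points uniformly from the disk by rejection and compares a joint probability with the product of the corresponding marginal probabilities.

```python
import random

# Sample uniformly from the unit disk by rejection and compare
# P(0 < X < 0.5, 0 < Y < 0.5) with P(0 < X < 0.5) * P(0 < Y < 0.5).
def sample_disk():
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

trials = 500_000
joint = in_x = in_y = 0
for _ in range(trials):
    x, y = sample_disk()
    a, b = 0 < x < 0.5, 0 < y < 0.5
    joint += a and b
    in_x += a
    in_y += b

print(joint / trials)                     # about 0.25/pi ~= 0.080
print((in_x / trials) * (in_y / trials))  # about 0.093, so X and Y are dependent
```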

7. Let the joint probability density function of continuous random variables X and Y be given by

f(x, y) = 2 if 0 < x < y < 1;  f(x, y) = 0 otherwise.

Find f_{X|Y}(x|y).

Answer. Since f_Y(y) = ∫_0^y 2 dx = 2y, 0 < y < 1, we have that

f_{X|Y}(x|y) = f(x, y)/f_Y(y) = 2/(2y) = 1/y, 0 < x < y, 0 < y < 1.

8. First a point Y is selected at random from the interval (0, 1). Then another point X is selected at random from the interval (Y, 1). Find the probability density function of X.

Answer. Let f(x, y) be the joint probability density function of X and Y. Clearly, f(x, y) = f_{X|Y}(x|y) f_Y(y). Thus

f_X(x) = ∫ f_{X|Y}(x|y) f_Y(y) dy.

Now

f_Y(y) = 1 if 0 < y < 1;  0 otherwise,

and

f_{X|Y}(x|y) = 1/(1 − y) if 0 < y < 1, y < x < 1;  0 otherwise.

Therefore, for 0 < x < 1,

f_X(x) = ∫_0^x 1/(1 − y) dy = −ln(1 − x),

and hence

f_X(x) = −ln(1 − x) if 0 < x < 1;  f_X(x) = 0 otherwise.

9. A point is selected at random and uniformly from the region R = {(x, y) : |x| + |y| ≤ 1}. Find the conditional probability density function of X given Y = y.

Answer. Let f(x, y) be the joint probability density function of X and Y. Then

f(x, y) = 0.5 if |x| + |y| ≤ 1;  f(x, y) = 0 otherwise,

and f_Y(y) = 1 − |y|, −1 ≤ y ≤ 1. Hence

f_{X|Y}(x|y) = 0.5/(1 − |y|) = 1/(2(1 − |y|)), −1 + |y| ≤ x ≤ 1 − |y|, −1 < y < 1.

10. Let the joint probability density function of random variables X and Y be

f(x, y) = 2e^{−(x+2y)} if x ≥ 0, y ≥ 0;  f(x, y) = 0 elsewhere.

Find E(X), E(Y), and E(X² + Y²).

Answer. Since f(x, y) = e^{−x} · 2e^{−2y}, X and Y are independent exponential random variables with parameters 1 and 2, respectively. Thus E(X) = 1, E(Y) = 1/2,
E(X²) = Var(X) + [E(X)]² = 1 + 1 = 2,
E(Y²) = Var(Y) + [E(Y)]² = 1/4 + 1/4 = 1/2.
Therefore, E(X² + Y²) = 2 + 1/2 = 5/2.

11. Suppose that random digits are generated from the set {0, 1, ..., 9} independently and successively. Find the expected number of digits to be generated until the pattern (a) 007 appears, (b) 156156 appears, (c) 575757 appears.

Answer. Let N_A denote the number of digits generated until the pattern A first appears.
(a) The pattern 007 does not overlap itself, so E(N_{007}) = 10³ = 1,000.
(b) The pattern 156156 overlaps itself in the block 156, so E(N_{156156}) = E(N_{156}) + 10⁶ = 1,000 + 1,000,000 = 1,001,000.
(c) The pattern 575757 overlaps itself in the blocks 57 and 5757, so E(N_{575757}) = E(N_{57}) + 10⁴ + 10⁶ = 100 + 10,000 + 1,000,000 = 1,010,100.
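Part (a) of Problem 11 is easy to check by simulation; the sketch below (not part of the original solution, names arbitrary) estimates E(N_{007}). Parts (b) and (c) can be checked the same way, but each trial then requires on the order of a million digits, so it is slower.

```python
import random

# Estimate the expected number of random decimal digits generated until the
# pattern "007" first appears; the solution above predicts 10**3 = 1000.
def waiting_time(pattern="007"):
    window, count = "", 0
    while True:
        window = (window + str(random.randrange(10)))[-len(pattern):]
        count += 1
        if window == pattern:
            return count

trials = 2000
print(sum(waiting_time() for _ in range(trials)) / trials)  # roughly 1000
```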

12. Suppose that 80 balls are placed into 40 boxes at random and independently. What is the expected number of empty boxes?

Answer. For i = 1, 2, ..., 40, let

X_i = 1 if the ith box is empty;  X_i = 0 otherwise.

The expected number of empty boxes is

E(X_1 + X_2 + ··· + X_40) = 40E(X_1) = 40P(X_1 = 1) = 40(39/40)^80 ≈ 5.28.

13. Let the joint probability mass function of random variables X and Y be given by

p(x, y) = (1/70)x(x + y) if x = 1, 2, 3, y = 3, 4;  p(x, y) = 0 elsewhere.

Find Cov(X, Y).

Answer.
E(X) = Σ_{x=1}^{3} Σ_{y=3}^{4} (1/70)x²(x + y) = 17/7,
E(Y) = Σ_{x=1}^{3} Σ_{y=3}^{4} (1/70)xy(x + y) = 124/35,
E(XY) = Σ_{x=1}^{3} Σ_{y=3}^{4} (1/70)x²y(x + y) = 43/5.
Therefore Cov(X, Y) = E(XY) − E(X)E(Y) = 43/5 − (17/7)(124/35) = −1/245.
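The covariance in Problem 13 can be verified exactly with a short script (a verification sketch, not part of the original solution):

```python
from fractions import Fraction

# Exact computation of E(X), E(Y), E(XY) and Cov(X, Y) for Problem 13,
# where p(x, y) = x*(x + y)/70 on x in {1, 2, 3}, y in {3, 4}.
support = [(x, y) for x in (1, 2, 3) for y in (3, 4)]
p = {(x, y): Fraction(x * (x + y), 70) for (x, y) in support}

assert sum(p.values()) == 1  # the constant 1/70 normalizes the pmf

EX  = sum(x * q for (x, y), q in p.items())
EY  = sum(y * q for (x, y), q in p.items())
EXY = sum(x * y * q for (x, y), q in p.items())

print(EX, EY, EXY)    # 17/7  124/35  43/5
print(EXY - EX * EY)  # -1/245
```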

14. Let X and Y be the coordinates of a random point selected uniformly from the unit disk {(x, y) : x² + y² ≤ 1}. Are X and Y independent? Are they uncorrelated? Why or why not?

Answer. The joint probability density function of X and Y is given by

f(x, y) = 1/π if x² + y² ≤ 1;  f(x, y) = 0 elsewhere.

X and Y are dependent because, for example, P(0 < X < 0.5 | Y = 0) = 1/4, while

P(0 < X < 0.5) = ∫_0^{1/2} ∫_{−√(1−x²)}^{√(1−x²)} (1/π) dy dx = (2/π) ∫_0^{1/2} √(1 − x²) dx = 1/6 + √3/(4π) ≈ 0.30 ≠ P(0 < X < 0.5 | Y = 0).

X and Y are uncorrelated because

E(X) = ∫∫_{x²+y²≤1} (x/π) dx dy = (1/π) ∫_0^1 ∫_0^{2π} r² cos θ dθ dr = 0,
E(Y) = ∫∫_{x²+y²≤1} (y/π) dx dy = (1/π) ∫_0^1 ∫_0^{2π} r² sin θ dθ dr = 0,
and
E(XY) = ∫∫_{x²+y²≤1} (xy/π) dx dy = (1/π) ∫_0^1 ∫_0^{2π} r³ cos θ sin θ dθ dr = 0,

implying that Cov(X, Y) = E(XY) − E(X)E(Y) = 0.

15. Mr. Ingham has invested money in three assets: 18% in the first asset, 40% in the second one, and 42% in the third one. Let r_1, r_2, and r_3 be the annual rates of return for these three investments, respectively. For 1 ≤ i, j ≤ 3, Cov(r_i, r_j) is the ith element in the jth row of the following table. [Note that Var(r_i) = Cov(r_i, r_i).] Find the standard deviation of the annual rate of return for Mr. Ingham's total investment.

        r_1      r_2      r_3
r_1    0.064    0.03     0.015
r_2    0.03     0.0144   0.021
r_3    0.015    0.021    0.01

Answer. Let r = 0.18r_1 + 0.40r_2 + 0.42r_3 be the annual rate of return for Mr. Ingham's total investment. We have

Var(r) = Var(0.18r_1 + 0.40r_2 + 0.42r_3)
= (0.18)²Var(r_1) + (0.40)²Var(r_2) + (0.42)²Var(r_3) + 2(0.18)(0.40)Cov(r_1, r_2) + 2(0.18)(0.42)Cov(r_1, r_3) + 2(0.40)(0.42)Cov(r_2, r_3)
= (0.18)²(0.064) + (0.40)²(0.0144) + (0.42)²(0.01) + 2(0.18)(0.40)(0.03) + 2(0.18)(0.42)(0.015) + 2(0.40)(0.42)(0.021)
≈ 0.0198.

Hence the standard deviation of r is σ_r = √Var(r) ≈ 0.14.

16. Let the joint probability density function of X and Y be given by

f(x, y) = sin x sin y if 0 ≤ x ≤ π/2, 0 ≤ y ≤ π/2;  f(x, y) = 0 elsewhere.

Calculate the correlation coefficient of X and Y.

Answer. Since f(x, y) = sin x sin y factors into a function of x times a function of y on the rectangle [0, π/2] × [0, π/2], X and Y are independent random variables. [This can also be shown directly by verifying the relation f(x, y) = f_X(x)f_Y(y).] Hence Cov(X, Y) = 0, and therefore ρ(X, Y) = 0.
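The variance calculation in Problem 15 is the quadratic form wᵀΣw, where w is the vector of portfolio weights and Σ is the covariance table. The sketch below (not part of the original solution) reproduces the number using the weights and covariance entries as tabulated above.

```python
import numpy as np

# Portfolio variance Var(r) = w^T Sigma w for Problem 15, using the
# investment weights and the covariance table given above.
w = np.array([0.18, 0.40, 0.42])
Sigma = np.array([[0.064, 0.03,   0.015],
                  [0.03,  0.0144, 0.021],
                  [0.015, 0.021,  0.01 ]])

var_r = w @ Sigma @ w
print(var_r, np.sqrt(var_r))  # approximately 0.0198 and 0.14
```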
