Continuous r.v. practice problems

SDS 321 Intro to Probability and Statistics

1. ( pts) The annual rainfall (in inches) in a certain region is normally distributed with mean 40 and standard deviation 4.

(a) What is the probability that, starting with this year, it will take over 10 years before a year occurs having a rainfall of over 50 inches?

Let R denote the amount of rainfall. P(R ≥ 50) = P((R − 40)/4 ≥ 2.5) = 1 − Φ(2.5) ≈ 0.006. Let X be the number of years before we see over 50 inches of rainfall. P(X > 10) = P(none of the first 10 years have more than 50 inches of rain) = (1 − 0.006)^10. 1 pt for getting P(R ≥ 50). 1 pt for the correct calculation with the geometric.

(b) What is the probability that at least 4 out of the next 50 years will have a rainfall of over 50 inches?

This is a binomial probability with n = 50, p = 0.006 and np = 0.3. It may be better to use X ~ Poisson(0.3), where X denotes the number of years with rainfall of more than 50 inches. Then P(X ≥ 4) = 1 − Σ_{i=0}^{3} P(X = i). 1 pt for understanding this is a binomial. 1 pt for the expression. OK if they don't evaluate it.

(c) What is the expected number of years with over 50 inches of rainfall in the next 50 years?

Expectation of the binomial (equivalently, of the approximating Poisson): np = 0.3. 1 pt for using the expectation of a binomial.

(d) What assumptions are you making?

The rainfall in each year is independent of the rainfall in the other years. 1 pt for the correct answer.

2. ( pts) The joint pdf of two random variables X and Y is given by:

f_{X,Y}(x, y) = 24xy for x, y ∈ [0, 1] with x + y ≤ 1, and 0 otherwise.

(a) Show that f_{X,Y}(x, y) is a valid joint probability density function.

∫₀¹ ∫₀^{1−x} 24xy dy dx = ∫₀¹ 12x(1 − x)² dx = 12 ∫₀¹ (x − 2x² + x³) dx = 12(1/2 − 2/3 + 1/4) = 1. 1 pt for correct limits on y (or x) and 1 pt for seeing it integrates to one.
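Before continuing to part (b) of Problem 2, the numerical answers so far can be sanity-checked in a few lines. This is a minimal sketch: it assumes scipy is available and uses the Problem 1 parameters reconstructed above (mean 40, standard deviation 4, 50-inch threshold).

```python
from scipy.stats import norm, binom, poisson
from scipy.integrate import dblquad

# Problem 1: P(R > 50) for R ~ N(40, 4^2), then the derived quantities.
p = 1 - norm.cdf(50, loc=40, scale=4)     # 1 - Phi(2.5), about 0.006
print((1 - p) ** 10)                      # (a) no year over 50 inches in the next 10 years
print(1 - binom.cdf(3, n=50, p=p))        # (b) at least 4 of the next 50 years, exact binomial
print(1 - poisson.cdf(3, mu=50 * p))      # (b) Poisson(0.3) approximation
print(50 * p)                             # (c) expected number of such years, about 0.3

# Problem 2(a): the joint density integrates to 1 over the triangle x, y >= 0, x + y <= 1.
# dblquad integrates func(y, x), with the inner (y) limits depending on x.
total, _ = dblquad(lambda y, x: 24 * x * y, 0, 1, lambda x: 0, lambda x: 1 - x)
print(total)                              # about 1.0
```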

(b) Find f_X(x).

f_X(x) = ∫ f_{X,Y}(x, y) dy = ∫₀^{1−x} 24xy dy = 12x(1 − x)² when 0 ≤ x ≤ 1, and zero otherwise. Same rubric as before.

(c) Find E[X] and Var(X).

E[X] = ∫₀¹ x f_X(x) dx = 12 ∫₀¹ x²(1 − x)² dx = 12 ∫₀¹ (x² − 2x³ + x⁴) dx = 12(1/3 − 1/2 + 1/5) = 2/5.

Var(X) = E[X²] − E[X]² = ∫₀¹ x² f_X(x) dx − (2/5)² = 12 ∫₀¹ x³(1 − 2x + x²) dx − 4/25 = 12 ∫₀¹ (x³ − 2x⁴ + x⁵) dx − 4/25 = 1/5 − 4/25 = 1/25 = 0.04.

Any consistent rubric would do as long as they know the definitions of E and Var and know how to integrate x^i.

(d) Find f_Y(y).

By symmetry, this is f_Y(y) = 12y(1 − y)² when 0 ≤ y ≤ 1, and zero otherwise. Like the corresponding rubric for X.

(e) Find E[Y] and Var(Y).

Since the pdfs are the same, the expectation and variances are too.

3. ( pts) The random variables X and Y have joint density function

f_{X,Y}(x, y) = cxy(1 − x) for 0 < x < 1, 0 < y < 1, and 0 otherwise.

(a) Find c.

1 = c ∫₀¹ ∫₀¹ xy(1 − x) dx dy = c ∫₀¹ x(1 − x) dx ∫₀¹ y dy = c/12, so c = 12.

(b) Find E[X].

f_X(x) = ∫₀¹ f_{X,Y}(x, y) dy = 6x(1 − x). So E[X] = ∫₀¹ 6x²(1 − x) dx = 0.5.

(c) Find E[Y].

f_Y(y) = ∫₀¹ f_{X,Y}(x, y) dx = 2y. So E[Y] = ∫₀¹ 2y² dy = 2/3.

(d) Find Var(X).

Var(X) = E[X²] − 0.25 = ∫₀¹ 6x³(1 − x) dx − 0.25 = 3/10 − 1/4 = 1/20.

(e) Find Var(Y).

Var(Y) = E[Y²] − 4/9 = ∫₀¹ 2y³ dy − 4/9 = 1/2 − 4/9 = 1/18.
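The one-dimensional integrals in Problems 2 and 3 are quick to confirm numerically. A minimal sketch using scipy.integrate.quad:

```python
from scipy.integrate import quad

# Problem 2: marginal f_X(x) = 12*x*(1-x)**2 on [0, 1]
fX = lambda x: 12 * x * (1 - x) ** 2
EX = quad(lambda x: x * fX(x), 0, 1)[0]
EX2 = quad(lambda x: x ** 2 * fX(x), 0, 1)[0]
print(EX, EX2 - EX ** 2)          # 0.4 and 0.04, i.e. 2/5 and 1/25

# Problem 3: f_X(x) = 6*x*(1-x) and f_Y(y) = 2*y on [0, 1]
EX3 = quad(lambda x: x * 6 * x * (1 - x), 0, 1)[0]
VarY = quad(lambda y: y ** 2 * 2 * y, 0, 1)[0] - (2 / 3) ** 2
print(EX3, VarY)                  # 0.5 and about 0.0556 = 1/18
```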

4. The random variables X and Y have a joint density function given by:

f(x, y) = 2e^{−2x}/x for 0 ≤ x < ∞, 0 ≤ y ≤ x, and 0 otherwise.

(a) What are f_X(x) and f_Y(y)?
(b) What are E[X] and E[Y]?

We have:

f_X(x) = ∫ f_{X,Y}(x, y) dy = ∫₀^x (2e^{−2x}/x) dy = 2e^{−2x}.

So E[X] = ∫₀^∞ 2x e^{−2x} dx = 1/2.

And f_Y(y) = ∫_y^∞ (2e^{−2x}/x) dx.

So E[Y] = ∫₀^∞ y f_Y(y) dy = ∫₀^∞ y (∫_y^∞ (2e^{−2x}/x) dx) dy = ∫₀^∞ (2e^{−2x}/x) (∫₀^x y dy) dx = ∫₀^∞ x e^{−2x} dx = (1/4) ∫₀^∞ v e^{−v} dv = 1/4.

In the last step, we changed the order of the integrals. Originally the outer integral was over 0 ≤ y < ∞ and the inner was over y ≤ x < ∞; for ease of integration, the outer integral is now over 0 ≤ x < ∞ and the inner is over 0 ≤ y ≤ x.

5. f_X(x) = C(2x − x³) for 0 < x < 5/2 and 0 otherwise. Could f be a probability density function? If so find C.

We would want 1 = ∫₀^{5/2} C(2x − x³) dx = C(2.5² − 2.5⁴/4) = C · 2.5² (1 − 25/16) ≈ −3.51C. Now say we use C = −1/3.51 so that the integral is one. For a valid pdf we need f(x) ≥ 0 for all x as well as integration to one; if you can find an x where f is negative then it is not a pdf. Here P(0 < X ≤ 1) = ∫₀¹ C(2x − x³) dx = C(x² − x⁴/4)|₀¹ = 3C/4, which is negative when C < 0, and that is not okay. So no, this cannot be a probability density function.

6. Consider the density function f_X(x) = C(2 − x) for 0 < x < 2 and 0 otherwise. Could f be a probability density function? If so find C.

We know that ∫₀² C(2 − x) dx = 1, so C(4 − 2²/2) = 2C = 1 and C = 1/2. Also (2 − x)/2 ≥ 0 for 0 < x < 2. So this is a valid pdf.
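The order-of-integration step in Problem 4 and the sign problem in Problem 5 can both be checked directly. A sketch, with dblquad integrating over the original region 0 ≤ y ≤ x without reordering anything:

```python
import numpy as np
from scipy.integrate import dblquad, quad

# Problem 4: E[Y] computed over the original region 0 <= y <= x < infinity.
# dblquad takes func(y, x): here y is the inner variable, running from 0 to x.
EY, _ = dblquad(lambda y, x: y * 2 * np.exp(-2 * x) / x, 0, np.inf, lambda x: 0, lambda x: x)
print(EY)   # about 0.25, matching the change-of-order calculation

# Problem 5: the integral of 2x - x**3 over (0, 5/2) is negative, so C would have
# to be negative to normalize it, which makes f negative somewhere.
print(quad(lambda x: 2 * x - x ** 3, 0, 2.5)[0])   # about -3.52
```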

7. Let U be a Uniform([0, 1]) r.v. and let a < b be constants. Show that:

(a) If b > 0 then bU ~ Uniform([0, b]).
(b) a + U ~ Uniform([a, a + 1]).
(c) What function of U is distributed as Uniform([a, b])?
(d) Show that min(U, 1 − U) ~ Uniform([0, 1/2]).

(a) Let X = bU. First note the values X = bU can take: X ∈ [0, b]. Then F_X(x) = P(X ≤ x) = P(bU ≤ x), which is 0 for x < 0; equals P(U ≤ x/b) = ∫₀^{x/b} du = x/b for 0 ≤ x/b ≤ 1; and equals 1 for x/b > 1. So f_X(x) = 1/b when x ∈ [0, b] and 0 otherwise. But this is the pdf of a Uniform([0, b]).

(b) First note that a + U ∈ [a, a + 1]. Now, F_{a+U}(x) = P(a + U ≤ x) = P(U ≤ x − a) = x − a when x ∈ [a, a + 1], 0 when x < a, and 1 when x > a + 1. So f_{a+U}(x) = 1 when x ∈ [a, a + 1] and 0 otherwise. This is Uniform([a, a + 1]).

(c) From the last two exercises we see that adding a constant shifts the uniform distribution, and multiplying by a constant stretches it. To convert Uniform([0, 1]) to Uniform([a, b]) we need both shifting and stretching. Let X = αU + β ~ Uniform([a, b]). Then X ∈ [β, α + β], so β = a and α + β = b, i.e. α = b − a. So (b − a)U + a ~ Uniform([a, b]).

(d) Let X = min(U, 1 − U). First note that X has to lie in [0, 1/2]. F_X(x) = P(min(U, 1 − U) ≤ x) = 1 − P(U > x, 1 − U > x) = 1 − P(x < U < 1 − x) = 1 − (1 − 2x) = 2x if 0 ≤ x ≤ 1/2 (and 0 below, 1 above). And f_X(x) = 2 if x ∈ [0, 1/2]. This is Uniform([0, 1/2]).

8. The joint density of X and Y is given by:

f_{X,Y}(x, y) = x e^{−(x+y)} for x > 0, y > 0, and 0 otherwise.

(a) Are X and Y independent?
(b) What are f_X(x) and f_Y(y)?
(c) What is P(X + Y ≤ 2)?

(a) Check if the joint factorizes for all x, y: yes, so they are independent. Always remember to check whether the bounds on x involve y; that can lead to dependence. Here they do not.

(b) f_X(x) = ∫₀^∞ x e^{−(x+y)} dy = x e^{−x}, and f_Y(y) = ∫₀^∞ x e^{−(x+y)} dx = e^{−y} ∫₀^∞ x e^{−x} dx = e^{−y}.
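Before part (c), here are two quick checks of the results above: a simulation of the transformation in 7(d) and an integration check of the marginals in 8(b). A minimal sketch, assuming numpy and scipy are available:

```python
import numpy as np
from scipy.integrate import quad

# Problem 7(d): min(U, 1-U) should be Uniform(0, 1/2): mean 1/4, variance 1/48, max near 1/2.
rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)
m = np.minimum(u, 1 - u)
print(m.mean(), m.var(), m.max())   # about 0.25, 0.0208, 0.5

# Problem 8(b): both marginals integrate to 1, and their product x*exp(-x)*exp(-y)
# is exactly the joint density, confirming independence.
print(quad(lambda x: x * np.exp(-x), 0, np.inf)[0])   # about 1.0
print(quad(lambda y: np.exp(-y), 0, np.inf)[0])       # about 1.0
```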

(c) Since the random variables are independent, this is a convolution.

P(X + Y ≤ 2) = ∫∫_{x+y ≤ 2} f_X(x) f_Y(y) dx dy = ∫₀² ∫₀^{2−x} x e^{−x} e^{−y} dy dx = ∫₀² x e^{−x} (1 − e^{−(2−x)}) dx = ∫₀² x e^{−x} dx − e^{−2} ∫₀² x dx = (1 − 3e^{−2}) − 2e^{−2} = 1 − 5e^{−2}.

Note that after you plug in f_Y(y), you are requiring 2 − x ≥ 0, i.e. x ≤ 2, and that changes the limits on x.

9. The joint density function of X and Y is:

f_{X,Y}(x, y) = x + y for 0 < x < 1, 0 < y < 1, and 0 otherwise.

(a) Are X and Y independent?
(b) Find f_X(x).
(c) Find P(X + Y < 1).

(a) No, the joint density does not factorize into a product of a function of x and a function of y.

(b) f_X(x) = ∫₀¹ (x + y) dy = x + 1/2, and f_Y(y) = y + 1/2.

(c) P(X + Y ≤ 1) = ∫∫_{(x,y): x+y ≤ 1} f_{X,Y}(x, y) dx dy = ∫₀¹ ∫₀^{1−x} (x + y) dy dx = ∫₀¹ (x(1 − x) + (1 − x)²/2) dx = ∫₀¹ (x − x² + 1/2 − x + x²/2) dx = ∫₀¹ (1/2 − x²/2) dx = 1/2 − 1/6 = 1/3.
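Both of these probabilities are integrals over a triangle, so the same dblquad check as before applies. A minimal sketch:

```python
import numpy as np
from scipy.integrate import dblquad

# Problem 8(c): P(X + Y <= 2) for f(x, y) = x*exp(-(x+y)) on x, y > 0
p8, _ = dblquad(lambda y, x: x * np.exp(-(x + y)), 0, 2, lambda x: 0, lambda x: 2 - x)
print(p8, 1 - 5 * np.exp(-2))    # both about 0.3233

# Problem 9(c): P(X + Y < 1) for f(x, y) = x + y on the unit square
p9, _ = dblquad(lambda y, x: x + y, 0, 1, lambda x: 0, lambda x: 1 - x)
print(p9)                        # about 0.3333 = 1/3
```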

10. The running time in seconds of an algorithm on a medium-sized data set is approximately normally distributed with mean 30 and variance 25.

(a) What is the probability that the running time of a run selected at random will exceed 25 seconds?
(b) What is the probability that the running time of at least one of four randomly selected runs will exceed 25 seconds?
(c) What is the probability that the running times of all four runs will exceed 25 seconds, given that at least one of the four randomly selected runs exceeds 25 seconds?

(a) P(T > 25) = P(Z > (25 − 30)/5) = P(Z > −1) = Φ(1) = 0.8413.
(b) 1 − (1 − 0.8413)⁴.
(c) 0.8413⁴ / (1 − (1 − 0.8413)⁴).

11. ( pts) The joint density of X and Y is given by

f_{X,Y}(x, y) = c for 0 < x < y, 0 < y < 1, and 0 otherwise.

(a) What is c?
(b) What is f_Y(y)?
(c) What is E[Y]?
(d) What is the conditional pdf f_{X|Y}(x|y)?
(e) Are X and Y independent?

(a) The pdf has to normalize to one: ∫∫ f_{X,Y}(x, y) dy dx = ∫₀¹ ∫_x^1 c dy dx = c ∫₀¹ (1 − x) dx = c/2 = 1, so c = 2.

(b) f_Y(y) = ∫ f_{X,Y}(x, y) dx = ∫₀^y 2 dx = 2y when 0 < y < 1, and 0 otherwise.

(c) E[Y] = ∫₀¹ y f_Y(y) dy = ∫₀¹ 2y² dy = 2/3.

(d) f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y) = 2/(2y) = 1/y when 0 < x < y, 0 < y < 1, and 0 otherwise.

(e) f_X(x) = ∫ f_{X,Y}(x, y) dy = ∫_x^1 2 dy = 2(1 − x) when 0 < x < 1, and 0 otherwise. No, they are not independent: since the density is supported on 0 < x < y < 1, there are points such as x = 0.5, y = 0.1 where the joint pdf is zero even though f_X(x) and f_Y(y) are both nonzero.
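Finally, Problems 10 and 11 can be confirmed numerically as well. A minimal sketch using scipy; the dblquad calls integrate over the triangle 0 < x < y < 1:

```python
from scipy.stats import norm
from scipy.integrate import dblquad

# Problem 10: running time ~ N(30, 25), i.e. standard deviation 5.
p = 1 - norm.cdf(25, loc=30, scale=5)
print(p)                               # (a) about 0.8413
print(1 - (1 - p) ** 4)                # (b) at least one of four runs exceeds 25 s
print(p ** 4 / (1 - (1 - p) ** 4))     # (c) all four exceed, given at least one does

# Problem 11: the triangle 0 < x < y < 1 has area 1/2, so c = 2, and E[Y] = 2/3.
# In these calls x is the inner variable (0 to y) and y the outer (0 to 1).
area, _ = dblquad(lambda x, y: 1, 0, 1, lambda y: 0, lambda y: y)
print(1 / area)                        # c = 2
EY, _ = dblquad(lambda x, y: 2 * y, 0, 1, lambda y: 0, lambda y: y)
print(EY)                              # about 0.6667
```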
