ECS 315: Probability and Random Processes 2015/1
HW Solution 12 Due: Dec 2, 9:19 AM
Lecturer: Prapun Suksompong, Ph.D.

Problem 1. Let X ~ E(3).

(a) For each of the following functions g(x), indicate whether the random variable Y = g(X) is a continuous random variable.

    (i) g(x) = x².

    (ii) g(x) = 1 for x ≥ 0, and 0 for x < 0.

    (iii) g(x) = 4e^(−4x) for x ≥ 0, and 0 for x < 0.

    (iv) g(x) = x for x ≤ 5, and 5 for x > 5.

(b) Repeat part (a), but now check whether the random variable Y = g(X) is a discrete random variable.

Hint: As shown in class, to check whether Y is a continuous random variable, we may check whether P[Y = y] = 0 for every real number y. Because Y = g(X), the probability under consideration is

    P[Y = y] = P[g(X) = y] = P[X ∈ {x : g(x) = y}].

At a particular y value, if there are at most countably many x values that satisfy g(x) = y, then

    P[Y = y] = Σ_{x : g(x) = y} P[X = x].

When X is a continuous random variable, P[X = x] = 0 for any x. Therefore, P[Y = y] = 0.

Conclusion: Suppose X is a continuous random variable and, for every y, there are at most countably many x values satisfying g(x) = y. Then Y is a continuous random variable.
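To make the countability condition concrete, here is a tiny illustration (an added sketch, not part of the original solution) for g(x) = x², whose preimage of any y contains at most two points:

```python
import math

# Preimage of y under g(x) = x^2: empty for y < 0, {0} for y = 0,
# and {-sqrt(y), +sqrt(y)} for y > 0 -- always at most two points.
def preimage_sq(y):
    if y < 0:
        return []
    if y == 0:
        return [0.0]
    r = math.sqrt(y)
    return [-r, r]
```

Because each preimage is finite (hence countable) and X is continuous, every P[Y = y] is a countable sum of zeros.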

Suppose that at a particular y value there are uncountably many x values satisfying g(x) = y. Then we can't rewrite P[X ∈ {x : g(x) = y}] as Σ_{x : g(x) = y} P[X = x], because axiom P3 of probability applies only to countably many disjoint sets (cases). When X is a continuous random variable, we can instead find the probability by integrating its pdf:

    P[Y = y] = P[X ∈ {x : g(x) = y}] = ∫_{ {x : g(x) = y} } f_X(x) dx.

If we can find a y value for which this integral is > 0, we can conclude right away that Y is not a continuous random variable.

(a) Solution:

(i) YES. When y < 0, there is no x that satisfies g(x) = y. When y = 0, there is exactly one such x (x = 0). When y > 0, there are exactly two such x values (x = ±√y). Therefore, for any y, there are at most countably many x values that satisfy g(x) = y. Because X is a continuous random variable, we conclude that Y is also a continuous random variable.

(ii) NO. An easy way to see this is that g(·) can output only two values: 0 or 1. So Y = g(X) is a discrete random variable. Alternatively, consider y = 1. Any x ≥ 0 makes g(x) = 1. Therefore, P[Y = 1] = P[X ≥ 0]. For X ~ E(3), P[X ≥ 0] = 1 > 0. Because we have found a y with P[Y = y] > 0, Y cannot be a continuous random variable.

(iii) YES. A plot of the function g(x) may help you see the following facts: When y > 4 or y < 0, there is no x that gives y = g(x). When 0 < y ≤ 4, there is exactly one x that satisfies y = g(x). Because X is a continuous random variable, we can conclude that P[Y = y] = 0 for y ≠ 0. When y = 0, any x < 0 would satisfy g(x) = y, so P[Y = 0] = P[X < 0]. However, because X ~ E(3) is always positive, P[X < 0] = 0.
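The point-mass probabilities used in parts (ii) and (iii) can be checked numerically. This is a sketch added to this write-up, not part of the original solution; all it needs is the survival function of X ~ E(3), P[X ≥ x] = e^(−3x) for x ≥ 0:

```python
import math

# Survival function of X ~ E(3): P[X >= x] = exp(-3x) for x >= 0, else 1.
def surv(x, rate=3.0):
    return math.exp(-rate * x) if x >= 0 else 1.0

# (ii) g(x) = 1 for x >= 0: the preimage of y = 1 is the uncountable set
# [0, inf), so P[Y = 1] = P[X >= 0] = 1 > 0 and Y cannot be continuous.
P_Y1 = surv(0)

# (iii) the only candidate point mass is y = 0, with preimage (-inf, 0);
# P[Y = 0] = P[X < 0] = 1 - P[X >= 0] = 0, so no point mass appears.
P_Y0 = 1.0 - surv(0)
```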

(iv) NO. Consider y = 5. Any x ≥ 5 makes g(x) = 5. Therefore, P[Y = 5] = P[X ≥ 5]. For X ~ E(3),

    P[Y = 5] = ∫_5^∞ 3e^(−3x) dx = e^(−15) > 0.

Because P[Y = 5] > 0, we conclude that Y can't be a continuous random variable.

(b) To check whether a random variable is discrete, we simply check whether its support is countable. Also, if we have already checked that a random variable is continuous, then it can't also be discrete.

(i) NO. We checked above that it is a continuous random variable.

(ii) YES, as discussed in part (a).

(iii) NO. We checked above that it is a continuous random variable.

(iv) NO. Because X is positive, Y = g(X) can be any number in the interval (0, 5]. The interval is uncountable; therefore, Y is not discrete. We have also shown that Y is not a continuous random variable. Knowing that it is neither discrete nor continuous means that it is of the last type: a mixed random variable.

Problem 2. The input X and output Y of a system subject to random perturbations are described probabilistically by the following joint pmf matrix:

                 y
              2      4      5
    x  1    0.02   0.10   0.08
       3    0.08   0.32   0.40

Evaluate the following quantities:

(a) The marginal pmf p_X(x)

(b) The marginal pmf p_Y(y)

(c) EX

(d) Var X

(e) EY

(f) Var Y

(g) P[XY < 6]

(h) P[X = Y]

Solution: The MATLAB codes are provided in the file P_XY_marginal.m.

(a) The marginal pmf p_X(x) is found by summing along the rows of the pmf matrix:

    p_X(x) = 0.2 for x = 1; 0.8 for x = 3; 0 otherwise.

(b) The marginal pmf p_Y(y) is found by summing along the columns of the pmf matrix:

    p_Y(y) = 0.1 for y = 2; 0.42 for y = 4; 0.48 for y = 5; 0 otherwise.

(c) EX = Σ_x x p_X(x) = 1(0.2) + 3(0.8) = 0.2 + 2.4 = 2.6.

(d) E[X²] = Σ_x x² p_X(x) = 1²(0.2) + 3²(0.8) = 0.2 + 7.2 = 7.4. So, Var X = E[X²] − (EX)² = 7.4 − (2.6)² = 7.4 − 6.76 = 0.64.

(e) EY = Σ_y y p_Y(y) = 2(0.1) + 4(0.42) + 5(0.48) = 0.2 + 1.68 + 2.4 = 4.28.

(f) E[Y²] = Σ_y y² p_Y(y) = 2²(0.1) + 4²(0.42) + 5²(0.48) = 19.12. So, Var Y = E[Y²] − (EY)² = 19.12 − 4.28² = 0.8016.

(g) Among the six possible (x, y) pairs shown in the joint pmf matrix, only the pairs (1, 2), (1, 4), and (1, 5) satisfy xy < 6. Therefore, [XY < 6] = [X = 1], which implies P[XY < 6] = P[X = 1] = 0.2.

(h) Among the six possible (x, y) pairs shown in the joint pmf matrix, no pair has x = y. Therefore, P[X = Y] = 0.
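The course supplies MATLAB code for this problem (P_XY_marginal.m, not reproduced here). As an assumption of this write-up, an equivalent sketch in Python reproduces parts (a)-(h) directly from the joint pmf:

```python
# Joint pmf of (X, Y) from the problem; keys are (x, y) pairs.
p = {(1, 2): 0.02, (1, 4): 0.10, (1, 5): 0.08,
     (3, 2): 0.08, (3, 4): 0.32, (3, 5): 0.40}

# Marginals: sum the joint pmf over the other variable.
p_X, p_Y = {}, {}
for (x, y), pr in p.items():
    p_X[x] = p_X.get(x, 0) + pr
    p_Y[y] = p_Y.get(y, 0) + pr

# Moments from the marginals.
EX = sum(x * pr for x, pr in p_X.items())
var_X = sum(x**2 * pr for x, pr in p_X.items()) - EX**2
EY = sum(y * pr for y, pr in p_Y.items())
var_Y = sum(y**2 * pr for y, pr in p_Y.items()) - EY**2

# Event probabilities: sum the joint pmf over the pairs in the event.
P_XY_lt_6 = sum(pr for (x, y), pr in p.items() if x * y < 6)
P_X_eq_Y = sum(pr for (x, y), pr in p.items() if x == y)
```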

Problem 3. The time until a chemical reaction is complete (in milliseconds) is approximated by the cumulative distribution function

    F_X(x) = 1 − e^(−0.01x) for x ≥ 0; 0 otherwise.

(a) Determine the probability density function of X.

(b) What proportion of reactions is complete within 200 milliseconds?

Solution: See the handwritten solution.

Problem 4. Let a continuous random variable X denote the current measured in a thin copper wire in milliamperes. Assume that the probability density function of X is

    f_X(x) = 5 for 4.9 ≤ x ≤ 5.1; 0 otherwise.

(a) Find the probability that a current measurement is less than 5 milliamperes.

(b) Find and plot the cumulative distribution function of the random variable X.

(c) Find the expected value of X.

(d) Find the variance and the standard deviation of X.

(e) Find the expected value of the power when the resistance is 100 ohms.

Solution: See the handwritten solution.
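The handwritten solutions themselves are not transcribed, but the key closed-form answers can be sketched in Python. This check is an addition to the write-up, not the original solution; in particular, part 4(e) assumes power = 100·X² with X in the units given:

```python
import math

# Problem 3: X has cdf F(x) = 1 - exp(-0.01 x) for x >= 0.
# Differentiating gives the pdf f(x) = 0.01 * exp(-0.01 x) for x >= 0.
def f3(x):
    return 0.01 * math.exp(-0.01 * x) if x >= 0 else 0.0

P_within_200 = 1 - math.exp(-0.01 * 200)   # F(200) = 1 - e^(-2) ~ 0.8647

# Problem 4: X uniform on [4.9, 5.1] with constant density 5.
a, b = 4.9, 5.1
P_lt_5 = 5 * (5 - a)            # (a) P[X < 5] = 0.5
EX = (a + b) / 2                # (c) expected value = 5
var_X = (b - a) ** 2 / 12       # (d) uniform variance = 1/300
sd_X = math.sqrt(var_X)
# (e) E[100 X^2] = 100 (Var X + (EX)^2) = 2500 + 1/3
E_power = 100 * (var_X + EX ** 2)
```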

[Handwritten solution page: Q3 (pdf and cdf: chemical reaction)]

[Handwritten solution pages: Q4 (pdf, cdf, expected value, variance: current and power)]


Extra Question

Problem 5. The input X and output Y of a system subject to random perturbations are described probabilistically by the joint pmf p_{X,Y}(x, y), where x = 1, 2, 3 and y = 1, 2, 3, 4, 5. Let P denote the joint pmf matrix whose (i, j) entry is p_{X,Y}(i, j), and suppose that

    P = (1/71) × [ 7  2  8  5  4
                   4  2  5  5  9
                   2  4  8  5  1 ]

(a) Find the marginal pmfs p_X(x) and p_Y(y).

(b) Find EX.

(c) Find EY.

(d) Find Var X.

(e) Find Var Y.

Solution: All of the calculations in this question amount to plugging numbers into the appropriate formulas. The MATLAB codes are provided in the file P_XY_marginal_2.m.

(a) The marginal pmf p_X(x) is found by summing along the rows of the pmf matrix:

    p_X(x) = 26/71 ≈ 0.3662 for x = 1; 25/71 ≈ 0.3521 for x = 2; 20/71 ≈ 0.2817 for x = 3; 0 otherwise.

The marginal pmf p_Y(y) is found by summing along the columns of the pmf matrix:

    p_Y(y) = 13/71 ≈ 0.1831 for y = 1; 8/71 ≈ 0.1127 for y = 2; 21/71 ≈ 0.2958 for y = 3; 15/71 ≈ 0.2113 for y = 4; 14/71 ≈ 0.1972 for y = 5; 0 otherwise.

(b) EX = 136/71 ≈ 1.9155.

(c) EY = 222/71 ≈ 3.1268.

(d) Var X = 3230/5041 ≈ 0.6407.

(e) Var Y = 9220/5041 ≈ 1.8290.
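Because every entry of P is a multiple of 1/71, exact rational arithmetic confirms the answers. The sketch below uses Python's fractions module as a stand-in for the MATLAB file mentioned above (an assumption of this write-up, not the course's code):

```python
from fractions import Fraction

# Joint pmf matrix: entry (i, j) is p_{X,Y}(i+1, j+1), as a count over 71.
M = [[7, 2, 8, 5, 4],
     [4, 2, 5, 5, 9],
     [2, 4, 8, 5, 1]]
P = [[Fraction(e, 71) for e in row] for row in M]

p_X = [sum(row) for row in P]        # row sums give the marginal of X
p_Y = [sum(col) for col in zip(*P)]  # column sums give the marginal of Y

# Moments; x values are 1..3 and y values are 1..5.
EX = sum((i + 1) * p for i, p in enumerate(p_X))
EY = sum((j + 1) * p for j, p in enumerate(p_Y))
var_X = sum((i + 1) ** 2 * p for i, p in enumerate(p_X)) - EX ** 2
var_Y = sum((j + 1) ** 2 * p for j, p in enumerate(p_Y)) - EY ** 2
```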