ECSE 304-305B Solutions to Assignment 8 Fall 2008

Problem 8.1

A manufacturing system is governed by a Poisson counting process {N_t ; 0 ≤ t < ∞} with rate parameter λ > 0. If a counting event occurs at an instant t, 0 ≤ t < ∞, the system generates an item of product P. The value of any product item is described by an R-valued Gaussian random variable V with mean 5 and variance 3. (Yes, the value can be negative if the product is defective!) Each item is stored in a buffer B when it is produced. B is empty at t = 0, and the counting process and all the generated random variables {V_1, V_2, ...} are mutually independent.

(a) The buffer is observed at an instant t, t > 0. What is the mean value of the number of items N_t in the buffer B? Is this mean value finite as t → ∞? If yes, what is the mean value in the limit?

(b) The total value W_t = Σ_{k=1}^{N_t} V_k of the items in B obviously has a Gaussian distribution when conditioned on the number of items N_t in B at an instant t, t > 0; find the conditional mean and variance of W_t.

(c) Use (b) to find the unconditional mean and variance of the total value W_t of the items in B at t, t > 0.

(d) Use the conditional and unconditional probability distributions in (b) and (c) to give a formula (involving two integrals) for the conditional probability P(N_t = k | W_t ≤ v) for 0 ≤ k < ∞, v ∈ R.

Solution 8.1

(a) At the instant t, the PMF of the number N_t of items in the buffer B is

    P(N_t = n) = e^{−λt} (λt)^n / n!,   n = 0, 1, ...

Hence

    E{N_t} = Σ_{n=0}^∞ n e^{−λt} (λt)^n / n!
           = e^{−λt} (λt) Σ_{n=1}^∞ (λt)^{n−1} / (n−1)!
           = e^{−λt} (λt) e^{λt}
           = λt.

Thus the mean is finite at every finite t but goes to ∞ as t → ∞.

(b) Conditional mean:

    E{W_t | N_t} = Σ_{k=1}^{N_t} E{V_k | N_t} = 5N_t.

Conditional variance:

    var{W_t | N_t} = E{(W_t − 5N_t)² | N_t}
                   = E{(Σ_{k=1}^{N_t} (V_k − 5))² | N_t}
                   = Σ_{k=1}^{N_t} E{(V_k − 5)² | N_t}
                   = 3N_t,

since the cross terms vanish by the mutual independence of {V_1, V_2, ...}.

(c) Unconditional mean of W_t:

    E{W_t} = E{E{W_t | N_t}} = E{5N_t} = 5 E{N_t} = 5λt.

Unconditional variance of W_t, by the law of total variance:

    var{W_t} = E{var{W_t | N_t}} + var{E{W_t | N_t}}
             = E{3N_t} + var{5N_t}
             = 3λt + 25 var{N_t}
             = 3λt + 25λt
             = 28λt,

since var{N_t} = λt for a Poisson random variable.
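The moments above can be sanity-checked by Monte Carlo, drawing W_t directly from its conditional Gaussian law given N_t as in (b). A minimal sketch, assuming NumPy is available; the value λt = 2 and the sample size are arbitrary choices for the check:

```python
import numpy as np

rng = np.random.default_rng(0)
lam_t = 2.0          # lambda * t, an arbitrary value for the check
trials = 200_000

# N_t ~ Poisson(lambda t); given N_t = n, W_t ~ Normal(5n, 3n) by part (b).
n = rng.poisson(lam_t, size=trials)
w = rng.normal(loc=5.0 * n, scale=np.sqrt(3.0 * n))

print(n.mean())      # close to E{N_t}   = lambda t     = 2
print(w.mean())      # close to E{W_t}   = 5 lambda t   = 10
print(w.var())       # close to var{W_t} = 28 lambda t  = 56
```

With 200,000 trials the estimates typically land within a few percent of the exact values E{N_t} = λt, E{W_t} = 5λt, and var{W_t} = (3 + 5²)λt = 28λt.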

(d) By Bayes' rule,

    P(N_t = k | W_t ≤ v) = P(W_t ≤ v | N_t = k) P(N_t = k) / P(W_t ≤ v).

Now define

    α_{t,k}(v) = P(W_t ≤ v | N_t = k)
               = ∫_{−∞}^{v} (1/√(2π · 3k)) e^{−(x−5k)²/(2 · 3k)} dx

and

    β_t(v) = P(W_t ≤ v)
           = Σ_{k=0}^∞ P(W_t ≤ v | N_t = k) P(N_t = k)
           = Σ_{k=0}^∞ e^{−λt} ((λt)^k / k!) ∫_{−∞}^{v} (1/√(2π · 3k)) e^{−(x−5k)²/(2 · 3k)} dx.

Thus

    P(N_t = k | W_t ≤ v) = (α_{t,k}(v) / β_t(v)) e^{−λt} (λt)^k / k!.

Problem 8.2

For the bivariate random variables (X, Y) distributed as

    f_{X,Y}(x, y) = c e^{−3x} e^{−2y},  0 ≤ y ≤ x < ∞,
                  = 0,                  otherwise,

use the formulas for the marginal densities in the lectures to find EX, EY, and hence find the covariance

    Cov(X, Y) = E(X − EX)(Y − EY) = EXY − EX·EY

and the correlation coefficient ρ = Cov(X, Y)/(σ_X σ_Y).

Solution 8.2

Normalizing,

    ∫₀^∞ ∫₀^x c e^{−3x} e^{−2y} dy dx = 1,
    (c/2) ∫₀^∞ (e^{−3x} − e^{−5x}) dx = 1,
    (c/2)(1/3 − 1/5) = 1,
    c = 15.

The marginal densities are

    f_X(x) = ∫₀^x 15 e^{−3x} e^{−2y} dy = (15/2)(e^{−3x} − e^{−5x}),  0 ≤ x < ∞,
    f_Y(y) = ∫_y^∞ 15 e^{−3x} e^{−2y} dx = 5 e^{−5y},  0 ≤ y < ∞.

Hence, by using integration by parts, we have

    EX  = ∫₀^∞ x (15/2)(e^{−3x} − e^{−5x}) dx = (15/2)(1/9 − 1/25) = 8/15,
    EY  = ∫₀^∞ y (5 e^{−5y}) dy = 1/5,
    EXY = 15 ∫₀^∞ ∫₀^x x y e^{−3x} e^{−2y} dy dx = 11/75,
    EX² = ∫₀^∞ x² (15/2)(e^{−3x} − e^{−5x}) dx = (15/2)(2/27 − 2/125) = 98/225,
    EY² = ∫₀^∞ y² (5 e^{−5y}) dy = 2/25.

Then

    Cov(X, Y) = EXY − EX·EY = 11/75 − (8/15)(1/5) = 1/25,

and

    ρ = Cov(X, Y)/(σ_X σ_Y)
      = (EXY − EX·EY) / (√(EX² − (EX)²) √(EY² − (EY)²))
      = (1/25) / (√(34/225) √(1/25))
      = 3/√34 ≈ 0.51.

Problem 8.3

Assume the random variables X and Y representing marks obtained in Midterm I and Midterm II in a course are jointly normally distributed, with

    μ_X = 65,  σ_X = 18,  μ_Y = 60,  σ_Y = 20,
    ρ = (σ_X σ_Y)^{−1} E[(X − μ_X)(Y − μ_Y)] = (σ_X σ_Y)^{−1} σ_{X,Y} = 0.75.

(i) Verify that ρ σ_Y/σ_X = σ_{X,Y}/σ_X². (Use the expression on the left or the right of this equality in the work below depending upon which you find most convenient.)

(ii) Assume that the conditional mean of Y given that X = x is given by the formula

    μ_{Y|X} = E(Y | X = x) = μ_Y + ρ (σ_Y/σ_X)(x − μ_X).

Find the value of

    E_X[E(Y | X = x)] = ∫_{−∞}^∞ E(Y | X = x) p(x) dx.

(iii) (a) Assume a student's mark in Midterm I is X = 75. Use the formula in (ii) to find the conditional mean E(Y | X = 75) of the student's mark in Midterm II.

(b) Assume a student's mark in Midterm I is X = 81; find also in this case the conditional mean of the student's mark in Midterm II.

Solution 8.3

(i)

    ρ σ_Y/σ_X = σ_{X,Y} (σ_X σ_Y)^{−1} σ_Y/σ_X = σ_{X,Y}/σ_X².

(ii)

    E_X[E(Y | X = x)] = ∫_{−∞}^∞ E(Y | X = x) p(x) dx
                      = ∫_{−∞}^∞ (μ_Y + ρ (σ_Y/σ_X)(x − μ_X)) p(x) dx
                      = μ_Y + ρ (σ_Y/σ_X) E_X(X − μ_X)
                      = μ_Y + 0
                      = μ_Y = 60.

(iii) (a) From (ii),

    E(Y | X = 75) = μ_Y + ρ (σ_Y/σ_X)(75 − μ_X) = 60 + 0.75 · (20/18) · (75 − 65) ≈ 68.3.

(b) Similarly,

    E(Y | X = 81) = μ_Y + ρ (σ_Y/σ_X)(81 − μ_X) = 60 + 0.75 · (20/18) · (81 − 65) ≈ 73.3.
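Both preceding solutions admit quick numerical checks. For Problem 8.2, the joint density factors conveniently: f_Y(y) = 5e^{−5y}, and f_{X|Y}(x|y) = 15e^{−3x}e^{−2y}/(5e^{−5y}) = 3e^{−3(x−y)} for x ≥ y, so one can draw Y ~ Exp(5) and set X = Y + Z with Z ~ Exp(3) independent. For Problem 8.3, the regression line is evaluated directly. A rough sketch using only the Python standard library; sample size and seed are arbitrary choices:

```python
import math
import random

random.seed(0)
trials = 200_000

# Problem 8.2: draw (X, Y) via Y ~ Exp(5), then X = Y + Z with Z ~ Exp(3).
xs, ys = [], []
for _ in range(trials):
    y = random.expovariate(5.0)
    xs.append(y + random.expovariate(3.0))
    ys.append(y)

ex = sum(xs) / trials                              # close to 8/15  ~ 0.533
ey = sum(ys) / trials                              # close to 1/5   = 0.2
exy = sum(x * y for x, y in zip(xs, ys)) / trials  # close to 11/75 ~ 0.147
cov = exy - ex * ey                                # close to 1/25  = 0.04
sx = math.sqrt(sum(x * x for x in xs) / trials - ex ** 2)
sy = math.sqrt(sum(y * y for y in ys) / trials - ey ** 2)
rho = cov / (sx * sy)                              # close to 3/sqrt(34) ~ 0.51

# Problem 8.3: conditional mean of Y given X = x for the bivariate normal.
mu_x, sig_x, mu_y, sig_y, r = 65.0, 18.0, 60.0, 20.0, 0.75

def cond_mean(x):
    """E(Y | X = x) = mu_Y + rho * (sigma_Y / sigma_X) * (x - mu_X)."""
    return mu_y + r * (sig_y / sig_x) * (x - mu_x)

print(cond_mean(75.0))   # 68.33...
print(cond_mean(81.0))   # 73.33...
```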

Problem 8.4

A Poisson RV with parameter λ > 0 takes values in the nonnegative integers N = {0, 1, 2, ...}; its probability mass function is given by

    P(X = n) = e^{−λ} λ^n / n!,   n ∈ N.

The number of times that a person contracts a cold in a given year is described by a Poisson random variable with parameter λ = 4. Suppose that a new wonder drug (based on large quantities of vitamin C) has just been marketed that reduces the Poisson parameter to λ = 3 for 60% of the population. For the other 40% of the population the drug has no appreciable effect on colds. A certain individual tries the drug for a year and has 2 colds in that time; use Bayes' theorem and the Poisson distribution to determine how likely it is that the drug is beneficial for him or her (i.e. find the probability that the individual falls in the first group).

Solution 8.4

Let P(A) = P(drug is beneficial) = 0.6 and let the random variable X be the number of colds the person contracts in a year. Then

    P(individual falls in the first group) = P(A | X = 2).

Using Bayes' theorem, we have

    P(A | X = 2) = P(X = 2 | A) P(A) / P(X = 2),

where

    P(X = 2 | A) = e^{−3} 3² / 2!

and

    P(X = 2) = P(X = 2 | A) P(A) + P(X = 2 | A^c) P(A^c)
             = (e^{−3} 3² / 2!)(0.6) + (e^{−4} 4² / 2!)(0.4).

Thus, we have P(A | X = 2) ≈ 0.6964.
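The arithmetic in Solution 8.4 is easy to verify; a minimal check using only the standard library:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

prior_a = 0.6                        # P(A): drug is beneficial
like_a = poisson_pmf(2, 3.0)         # P(X = 2 | A)
like_not_a = poisson_pmf(2, 4.0)     # P(X = 2 | A^c)

evidence = like_a * prior_a + like_not_a * (1 - prior_a)  # P(X = 2)
posterior = like_a * prior_a / evidence                   # P(A | X = 2)
print(round(posterior, 4))           # 0.6964
```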