
UC Berkeley Department of Electrical Engineering and Computer Science
EE 126: Probability and Random Processes
Problem Set 8, Fall 2007
Issued: Thursday, October 25, 2007; Due: Friday, November 2, 2007
Reading: Bertsekas & Tsitsiklis, 4.4-4.7

Problem 8.1
Let $X_1, X_2, \ldots$ be independent and identically distributed random variables, where each $X_i$ is distributed according to the logarithmic PMF with parameter $p$, i.e.,
\[
p_X(k) = \frac{(1-p)^k}{k \ln(1/p)}, \qquad k = 1, 2, 3, \ldots,
\]
where $0 < p < 1$. The discrete random variable $N$ has the Poisson PMF with parameter $\lambda$, i.e.,
\[
p_N(k) = e^{-\lambda} \frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots,
\]
where $\lambda > 0$.

(a) Determine the transform $M_X(s)$ associated with each random variable $X_i$. Hint: you may find the following identity useful: provided $-1 < a \le 1$,
\[
\ln(1+a) = a - \frac{a^2}{2} + \frac{a^3}{3} - \frac{a^4}{4} + \cdots
\]

(b) Defining $Y = \sum_{i=1}^{N} X_i$, determine the transform $M_Y(s)$.

Solution:

1. The transform of $p_X(k) = \frac{(1-p)^k}{k \ln(1/p)}$ is
\[
M_X(s) = E[e^{sX}] = \sum_{k=1}^{\infty} e^{sk} \frac{(1-p)^k}{k \ln(1/p)}
       = \frac{1}{\ln(1/p)} \sum_{k=1}^{\infty} \frac{\left(e^s(1-p)\right)^k}{k}
       = \frac{-\ln\!\left(1 - e^s(1-p)\right)}{\ln(1/p)}
       = \frac{\ln\!\left(1 - e^s(1-p)\right)}{\ln p},
\]
where we used $-\ln(1-\alpha) = \alpha + \frac{\alpha^2}{2} + \cdots = \sum_{k=1}^{\infty} \frac{\alpha^k}{k}$ for $-1 \le \alpha < 1$.

The transform of a Poisson random variable with PMF $p_N(k) = e^{-\lambda} \frac{\lambda^k}{k!}$ is given by
\[
M_N(s) = E[e^{sN}] = e^{\lambda(e^s - 1)}.
\]
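As a quick numerical sanity check (not part of the original solution), the closed form for $M_X(s)$ can be compared against a truncated version of the defining sum. The short Python sketch below assumes numpy is available; the particular values of $p$ and $s$ are arbitrary choices.

```python
import numpy as np

# Numerical sanity check (not part of the original solution): compare a
# truncated version of the defining sum E[e^{sX}] against the closed form
# M_X(s) = ln(1 - e^s (1 - p)) / ln(p).  The values of p and s are arbitrary.
p, s = 0.3, -0.5                       # any 0 < p < 1 and s with e^s (1 - p) < 1
k = np.arange(1, 2000)                 # truncation of the infinite sum
pmf = (1 - p) ** k / (k * np.log(1 / p))
direct = np.sum(np.exp(s * k) * pmf)
closed_form = np.log(1 - np.exp(s) * (1 - p)) / np.log(p)
print(direct, closed_form)             # both approximately 0.459
```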

2. Using the law of iterated expectations, the transform of $Y = \sum_{i=1}^{N} X_i$ is
\[
M_Y(s) = E[e^{sY}] = E\!\left[E[e^{sY} \mid N]\right]
       = E\!\left[E\!\left[e^{s \sum_{i=1}^{N} X_i} \mid N\right]\right]
       = E\!\left[M_X(s)^N\right]
       = \left. M_N(s) \right|_{e^s = M_X(s)}
       = e^{\lambda (M_X(s) - 1)}.
\]
Substituting the expression for $M_X(s)$,
\[
M_Y(s) = \exp\!\left\{ \lambda \left( \frac{\ln(1 - e^s(1-p))}{\ln p} - 1 \right) \right\}
       = \exp\!\left\{ \frac{\lambda}{\ln p} \Big( \ln(1 - e^s(1-p)) - \ln p \Big) \right\}
       = \left[ e^{\ln(1 - e^s(1-p))} e^{-\ln p} \right]^{\lambda/\ln p}
       = \left[ \frac{1 - e^s(1-p)}{p} \right]^{\lambda/\ln p}.
\]

Problem 8.2
$N$ male-female couples at a dinner party play the following game. Each member of a couple writes his/her name on a piece of paper (so there are a total of $2N$ pieces of paper). The men throw their papers into hat A, while the women throw their papers into hat B. Once all the papers are in, the host draws a piece of paper from each hat, and the two people chosen are dance partners for the rest of the night. The host continues in this fashion until all $2N$ people are paired up for the night. Let $M$ be the number of couples that are reunited by this game. Find $E[M]$ and $\mathrm{var}(M)$. Hint: set up indicator variables and use covariance.

Solution: We will use indicator variables to solve this problem. Define $\{X_i\}$ such that $X_i = 1$ if couple $i$ is reunited, and $X_i = 0$ otherwise. Clearly $M = \sum_{i=1}^{N} X_i$. We know that
\[
E[M] = \sum_{i=1}^{N} E[X_i] = N E[X_1].
\]
Now we easily see that $E[X_i] = 1/N$, and therefore $E[M] = 1$.

Finding the variance is a bit more tricky, since the $X_i$ are not independent. We have
\[
\mathrm{var}(M) = \sum_{i=1}^{N} \mathrm{var}(X_i) + 2 \sum_{i < j} \mathrm{Cov}(X_i, X_j).
\]
Now, $\mathrm{var}(X_i) = E[X_i^2] - E[X_i]^2 = E[X_i] - E[X_i]^2 = \frac{1}{N} - \frac{1}{N^2} = \frac{N-1}{N^2}$. Furthermore,
\[
\mathrm{Cov}(X_i, X_j) = E[X_i X_j] - E[X_i] E[X_j] = \frac{1}{N(N-1)} - \frac{1}{N^2} = \frac{1}{N^2(N-1)}.
\]
Thus we find
\[
\mathrm{var}(M) = N \cdot \frac{N-1}{N^2} + 2 \binom{N}{2} \cdot \frac{1}{N^2(N-1)}
               = \frac{N-1}{N} + \frac{N(N-1)}{N^2(N-1)}
               = \frac{N-1}{N} + \frac{1}{N} = 1.
\]
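A small simulation (not part of the original solution, assuming Python with numpy) can confirm that both the mean and the variance of $M$ come out close to 1, independent of $N$; the choices of $N$, the trial count, and the seed below are arbitrary.

```python
import numpy as np

# Simulation sketch (not part of the original solution): pairing man i with
# the woman in position i of a random permutation is equivalent to the
# two-hat drawing, so the fixed points of the permutation are the reunited
# couples.  N, the trial count and the seed are arbitrary choices.
rng = np.random.default_rng(0)
N, trials = 10, 200_000
matches = np.array([np.sum(rng.permutation(N) == np.arange(N))
                    for _ in range(trials)])
print(matches.mean(), matches.var())   # both should be close to 1
```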

Thus we find: V ar(x) N N ( ) N N + N (N ) N N (N ) + N N (N ) N N + N Problem 8.3 Suppose that the random variables X, X have joint density function: [ ] f X,X (x, x ) γ exp (x 6x x + 0x ), for < x, x <. (a) Find µ x, µ x, Cov(x, x ) and the correlation coefficient ρ. (b) Find E[x x 3]. (c) For y ax + bx and y cx find a, b, c such that ρ 0, σ y σ y. Solution:. To find the means, the covariance, and the correlation coefficient, we need to compare given pdf to the generic bivariate normal PDF: { [ πσ x σ exp x ρ (x µ x ) ρ σx ρ x µ x x µ x + (x µ x ) ]} σ x σ x σx and we find: µ x µ x 0. (For more background on the bivariate normal random variable, please refer to chapter 4 of the course notes.) Therefore: [ (x µ x ) ρ σx ρ x µ x x µ x + (x µ x ) ] σ x σ x σx x 6x x + x ρ ρ σx, ( ρ 6, )σ x σ x ρ σx 0 σ x 0, σ x, ρ 3 0. and Cov(x, x ) σ x σ x ρ 3.. To find the conditional mean of x given x 3 we use the formula: E[x x ] µ x + ρσ x σ x (x µ x ) and we find: E[x x 3] 9 0. 3

3. We want to find $a$, $b$, $c$ such that $\rho_{y_1 y_2} = 0$ and $\sigma_{y_1} = \sigma_{y_2} = 1$. We can see that $E[y_1] = E[y_2] = 0$. Thus $\sigma_{y_2}^2 = E[y_2^2] = c^2$, and for this to equal 1 we need $c = \pm 1$. We also have
\[
\sigma_{y_1}^2 = E[y_1^2] = E[a^2 x_1^2 + 2ab\, x_1 x_2 + b^2 x_2^2] = 10a^2 + 6ab + b^2,
\]
and
\[
\mathrm{Cov}(y_1, y_2) = c\left( a E[x_1 x_2] + b E[x_2^2] \right) = c(3a + b).
\]
Setting $\mathrm{Cov}(y_1, y_2) = 0$ gives $b = -3a$, and then
\[
\sigma_{y_1}^2 = 10a^2 - 18a^2 + 9a^2 = a^2 = 1 \quad\Longrightarrow\quad a = \pm 1.
\]
The four solutions are
\[
(a, b, c) = (1, -3, 1), \quad (1, -3, -1), \quad (-1, 3, 1), \quad (-1, 3, -1).
\]
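A quick check (not part of the original solution, assuming numpy) of one of these solutions: with $a = 1$, $b = -3$, $c = 1$, the pair $(y_1, y_2)$ should have the identity covariance matrix, i.e. $A \Sigma A^T = I$ for $A = \left(\begin{smallmatrix} 1 & -3 \\ 0 & 1 \end{smallmatrix}\right)$.

```python
import numpy as np

# Check (not part of the original solution) of the solution a = 1, b = -3,
# c = 1: with y = A x for A = [[1, -3], [0, 1]], the covariance of (y1, y2)
# is A Sigma A^T, which should be the identity (unit variances, zero
# correlation).
Sigma = np.array([[10.0, 3.0], [3.0, 1.0]])
A = np.array([[1.0, -3.0], [0.0, 1.0]])
print(A @ Sigma @ A.T)                 # expect the 2x2 identity matrix
```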

Problem 8.4
The receiver in an optical communications system uses a photodetector that counts the number of photons that arrive during the communication session. (The duration of the communication session is 1 time unit.) The sender conveys information by either transmitting or not transmitting photons to the photodetector. The probability of transmitting is $p$. If she transmits, the number of photons $X$ that she transmits during the session has a Poisson PMF with mean $\lambda$ per time unit. If she does not transmit, she generates no photons. Unfortunately, regardless of whether or not she transmits, there may still be photons arriving at the photodetector because of a phenomenon called shot noise. The number $N$ of photons that arrive because of the shot noise has a Poisson PMF with mean $\mu$. $N$ and $X$ are independent. The total number of photons counted by the photodetector is equal to the sum of the transmitted photons and the photons generated by the shot noise effect.

(a) What is the probability that the sender transmitted, given that the photodetector counted $k$ photons?

(b) Before you know anything else about a particular communication session, what is your least squares estimate of the number of photons transmitted?

(c) What is the least squares estimate of the number of photons transmitted by the sender if the photodetector counted $k$ photons?

(d) What is the best linear predictor of the number of photons transmitted by the sender as a function of $k$, the number of detected photons?

Solution:

1. Let $A$ be the event that the sender transmitted, and let $K$ be the number of photons counted by the photodetector. Using Bayes' rule,
\[
P(A \mid K = k) = \frac{p_{K \mid A}(k)\, P(A)}{p_K(k)}
               = \frac{p_{X+N}(k)\, p}{p_N(k)(1-p) + p_{X+N}(k)\, p}.
\]
The discrete random variables $X$ and $N$ have the following PMFs:
\[
p_X(x) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x \ge 0, \qquad\qquad
p_N(n) = \frac{\mu^n e^{-\mu}}{n!}, \quad n \ge 0.
\]
The sum of two independent Poisson random variables is also Poisson, with mean equal to the sum of the means of the individual random variables. This fact can be derived by looking at the product of the transforms of $X$ and $N$. Therefore
\[
p_{X+N}(k) = \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!}, \quad k \ge 0.
\]
Thus,
\[
P(A \mid K = k) = \frac{p\, \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!}}
                      {p\, \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!} + (1-p)\, \frac{\mu^k e^{-\mu}}{k!}}
               = \frac{1}{1 + \frac{1-p}{p} \left( \frac{\mu}{\lambda+\mu} \right)^k e^{\lambda}}.
\]

2. Let $S$ be the number of photons transmitted by the sender. Then with probability $p$, $S = X$, and with probability $1-p$, $S = 0$. The least squares estimate of the number of photons transmitted by the sender is simply its mean, in the absence of any additional information:
\[
\hat S_1 = E[S] = p \lambda + (1-p) \cdot 0 = p\lambda.
\]
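The posterior probability derived in part 1 can be checked by simulation (not part of the original solution, assuming Python with numpy); the values of $p$, $\lambda$, $\mu$, the trial count, and the seed are arbitrary choices.

```python
import numpy as np

# Simulation sketch (not part of the original solution): estimate
# P(transmitted | K = k) empirically and compare it with the formula
# 1 / (1 + ((1 - p)/p) (mu/(lam + mu))^k e^lam).  Parameters are arbitrary.
rng = np.random.default_rng(1)
p, lam, mu, trials = 0.4, 3.0, 2.0, 500_000
sent = rng.random(trials) < p
X = np.where(sent, rng.poisson(lam, trials), 0)   # photons from the sender
K = X + rng.poisson(mu, trials)                   # add shot-noise photons
for k in range(7):
    empirical = np.mean(sent[K == k])
    formula = 1.0 / (1.0 + (1 - p) / p * (mu / (lam + mu)) ** k * np.exp(lam))
    print(k, round(empirical, 3), round(formula, 3))
```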

3. The least squares predictor has the form $\hat S_2(k) = E[S \mid K = k]$. Using Bayes' rule,
\[
\hat S_2(k) = \sum_{s=0}^{k} s\, p_{S \mid K}(s \mid k)
           = \sum_{s=0}^{k} s\, \frac{p_{K \mid S}(k \mid s)\, p_S(s)}{p_K(k)}
           = \frac{1}{p_K(k)} \sum_{s=0}^{k} s\, p_{K \mid S}(k \mid s)\, p_S(s).
\]
From the definitions of $S$ and $K$, the following are true:
\[
p_S(s) = \begin{cases} (1-p) + p e^{-\lambda}, & s = 0 \\ p\, \frac{\lambda^s e^{-\lambda}}{s!}, & s = 1, 2, \ldots \end{cases}
\qquad\quad
p_{K \mid S}(k \mid s) = p_N(k-s) = \frac{\mu^{k-s} e^{-\mu}}{(k-s)!},
\]
\[
p_K(k) = p\, \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!} + (1-p)\, \frac{\mu^k e^{-\mu}}{k!}.
\]
To obtain the last expression, we observe that $K = S + N$ with probability $p$, and $K = N$ with probability $1-p$. Substituting into the formula above,
\[
\hat S_2(k) = \frac{1}{p_K(k)} \left[ 0 \cdot \big( (1-p) + p e^{-\lambda} \big) \frac{\mu^{k} e^{-\mu}}{k!}
            + \sum_{s=1}^{k} s\, \frac{p \lambda^s e^{-\lambda}}{s!} \cdot \frac{\mu^{k-s} e^{-\mu}}{(k-s)!} \right]
= \frac{p\, e^{-(\lambda+\mu)} (\lambda+\mu)^k}{p_K(k)} \sum_{s=0}^{k} \frac{s}{s!\,(k-s)!} \left( \frac{\lambda}{\lambda+\mu} \right)^{s} \left( \frac{\mu}{\lambda+\mu} \right)^{k-s}.
\]
Recognizing the binomial mean, $\sum_{s=0}^{k} s \binom{k}{s} q^s (1-q)^{k-s} = kq$ with $q = \lambda/(\lambda+\mu)$, this simplifies to
\[
\hat S_2(k) = \frac{1}{p_K(k)} \cdot p\, \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!} \cdot \frac{k\lambda}{\lambda+\mu}
            = \frac{k\lambda}{\lambda+\mu} \cdot \frac{1}{1 + \frac{1-p}{p} \left( \frac{\mu}{\lambda+\mu} \right)^k e^{\lambda}}.
\]
Note that as $k$ increases, the estimator can be approximated by $\hat S_2(k) \approx \frac{k\lambda}{\lambda+\mu}$.

4. The linear least squares predictor has the form
\[
\hat S_3(k) = E[S] + \frac{\mathrm{Cov}(S, K)}{\sigma_K^2} \big( k - E[K] \big).
\]
Note that since $X$ and $N$ are independent, $S$ and $N$ are also independent.
\[
E[S] = p\lambda, \qquad
E[S^2] = p E[X^2] + (1-p) \cdot 0 = p(\lambda^2 + \lambda), \qquad
\sigma_S^2 = E[S^2] - (E[S])^2 = p(\lambda^2 + \lambda) - (p\lambda)^2 = p(1-p)\lambda^2 + p\lambda,
\]
\[
E[K] = E[S] + E[N] = p\lambda + \mu, \qquad
\sigma_K^2 = \sigma_S^2 + \sigma_N^2 = p(1-p)\lambda^2 + p\lambda + \mu.
\]
Finally, we need to find $\mathrm{Cov}(S, K)$:
\[
\mathrm{Cov}(S, K) = E\big[(S - E[S])(K - E[K])\big]
= E\big[(S - E[S])(S - E[S] + N - E[N])\big]
= \sigma_S^2 + E\big[(S - E[S])(N - E[N])\big]
= \sigma_S^2 = p(1-p)\lambda^2 + p\lambda,
\]
where we have used the fact that $(S - E[S])$ and $(N - E[N])$ are independent and $E[S - E[S]] = 0 = E[N - E[N]]$.

Therefore, substituting all the numbers into the equation above, we get the linear predictor
\[
\hat S_3(k) = p\lambda + \frac{p(1-p)\lambda^2 + p\lambda}{p(1-p)\lambda^2 + p\lambda + \mu} \big( k - p\lambda - \mu \big).
\]
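The moments used in part 4 can likewise be checked by simulation (not part of the original solution, assuming numpy); in particular, the sample covariance of $S$ and $K$ should match $\sigma_S^2 = p(1-p)\lambda^2 + p\lambda$. Parameter values and the seed are arbitrary.

```python
import numpy as np

# Simulation sketch (not part of the original solution): check the moments
# used in the linear predictor, E[K] = p*lam + mu and
# cov(S, K) = var(S) = p(1-p)*lam^2 + p*lam.  Parameters are arbitrary.
rng = np.random.default_rng(2)
p, lam, mu, trials = 0.4, 3.0, 2.0, 500_000
sent = rng.random(trials) < p
S = np.where(sent, rng.poisson(lam, trials), 0)   # photons from the sender
K = S + rng.poisson(mu, trials)                   # photons counted
print(K.mean(), p * lam + mu)                               # E[K]
print(np.cov(S, K)[0, 1], p * (1 - p) * lam**2 + p * lam)   # cov(S, K)
print(K.var(), p * (1 - p) * lam**2 + p * lam + mu)         # var(K)
```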

Problem 8.5
Let $X$ and $Y$ be two random variables with positive variances, associated with the same experiment.

(a) Show that $\rho$, the correlation coefficient of $X$ and $Y$, equals $-1$ if and only if there exists a constant $b$ and a negative constant $a$ such that $Y = aX + b$.

(b) Let $\hat X_L$ be the linear least mean squares estimator of $X$ based on $Y$. Show that $E[(X - \hat X_L) Y] = 0$. Use this property to show that the correlation of the estimation error $X - \hat X_L$ with $Y$ is zero.

(c) Let $\hat X = E[X \mid Y]$ be the least mean squares estimator of $X$ given $Y$. Show that
\[
E[(X - \hat X) h(Y)] = 0
\]
for any function $h$.

(d) Is it true that the estimation error $X - E[X \mid Y]$ is independent of $Y$?

Solution:

(a) Write $U = \frac{X - E[X]}{\sigma_X}$ and $V = \frac{Y - E[Y]}{\sigma_Y}$, so that
\[
\rho = E[UV] = E\!\left[ \frac{X - E[X]}{\sigma_X} \cdot \frac{Y - E[Y]}{\sigma_Y} \right].
\]
First suppose there exists a constant $b$ and a negative constant $a$ such that $Y = aX + b$. Then $E[Y] = aE[X] + b$ and $\mathrm{var}(Y) = a^2 \mathrm{var}(X)$, so $\sigma_Y = -a \sigma_X$ (since $a < 0$). We obtain
\[
\rho = E\!\left[ \frac{X - E[X]}{\sigma_X} \cdot \frac{aX + b - aE[X] - b}{-a\sigma_X} \right]
     = E\!\left[ \frac{X - E[X]}{\sigma_X} \cdot \left( -\frac{X - E[X]}{\sigma_X} \right) \right]
     = -E[U^2] = -1.
\]
Now we do the proof in the other direction. Suppose that $\rho = E[UV] = -1$. First we will show that $E\big[(U - E[UV]V)^2\big] = 0$. We observe that $E[U] = E[V] = 0$ and $E[U^2] = E[V^2] = 1$. Then
\[
E\big[(U - E[UV]V)^2\big] = E[U^2] - 2(E[UV])^2 + (E[UV])^2 E[V^2] = 1 - (E[UV])^2 = 0.
\]

However, by the definition of expectation, we have
\[
E\big[(U - E[UV]V)^2\big] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} \big(u - E[UV]v\big)^2 f_{U,V}(u, v)\, du\, dv.
\]
The integrand is nonnegative because it is a squared term, so the integrand must be zero (with probability one) in order for the integral to be zero. Therefore $U = E[UV]V = -V$. Thus
\[
\frac{X - E[X]}{\sigma_X} = -\frac{Y - E[Y]}{\sigma_Y},
\]
which means $Y = aX + b$ where $a = -\frac{\sigma_Y}{\sigma_X}$ and $b = E[Y] + \frac{\sigma_Y}{\sigma_X} E[X]$. This completes the proof.

(b) $\hat X_L = E[X] + \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2}(Y - E[Y])$, so we have
\[
E[(X - \hat X_L)Y]
= E\!\left[ XY - \Big( E[X] + \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2}(Y - E[Y]) \Big) Y \right]
= E[XY] - E[X]E[Y] - \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2} \big( E[Y^2] - (E[Y])^2 \big)
= \mathrm{cov}(X,Y) - \mathrm{cov}(X,Y) = 0.
\]
Now, let us examine the correlation of the estimation error $X - \hat X_L$ with $Y$:
\[
\rho = \frac{\mathrm{cov}(X - \hat X_L, Y)}{\sigma_{X - \hat X_L}\, \sigma_Y},
\]
so we must show that $\mathrm{cov}(X - \hat X_L, Y) = 0$. Now
\[
\mathrm{cov}(X - \hat X_L, Y) = E[(X - \hat X_L)Y] - E[X - \hat X_L]E[Y] = -E[X - \hat X_L]E[Y],
\]
and
\[
E[X - \hat X_L] = E\!\left[ X - E[X] - \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2}(Y - E[Y]) \right]
= E[X] - E[X] - \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2} E[Y - E[Y]] = 0.
\]
Hence, we have proven the desired property.

(c) $E[(X - \hat X)h(Y)] = E[(X - E[X \mid Y])h(Y)]$, by the definition of $\hat X$. Applying the linearity of expectation, we get
\[
E[(X - E[X \mid Y])h(Y)] = E[Xh(Y)] - E\big[E[X \mid Y]\, h(Y)\big] = E[Xh(Y)] - E\big[E[Xh(Y) \mid Y]\big] = E[Xh(Y)] - E[Xh(Y)] = 0,
\]
where the second equality is obtained by noticing that $E[X \mid Y]$ is an expectation over $X$ only, which allows $h(Y)$ to be pulled inside the conditional expectation, and the third equality follows from the law of iterated expectations.
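A numerical illustration (not part of the original solution) of the orthogonality properties in parts (b) and (c), assuming Python with numpy: the example takes $X = Y^3 + W$ with $W$ independent of $Y$, so that $E[X \mid Y] = Y^3$ is known exactly; the cubic model, the choice $h(y) = \cos y$, and the seed are arbitrary.

```python
import numpy as np

# Numerical illustration (not part of the original solution) of parts (b)
# and (c).  Take Y ~ N(0,1), W ~ N(0,1) independent and X = Y**3 + W, so
# that E[X | Y] = Y**3 is known exactly; the cubic model, h(y) = cos(y)
# and the seed are arbitrary choices.
rng = np.random.default_rng(3)
n = 1_000_000
Y = rng.standard_normal(n)
X = Y**3 + rng.standard_normal(n)
X_L = X.mean() + np.cov(X, Y)[0, 1] / Y.var() * (Y - Y.mean())   # LLSE of X from Y
print(np.mean((X - X_L) * Y))           # part (b): approximately 0
print(np.mean((X - Y**3) * np.cos(Y)))  # part (c): approximately 0
```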

(d) No. Consider the following counterexample (as in Problem 8.3). Let $X$ and $Y$ be discrete random variables with the joint PMF
\[
p_{X,Y}(x, y) = \begin{cases} \frac{1}{4} & \text{for } (x, y) \in \{(1, 0), (0, 1), (-1, 0), (0, -1)\} \\ 0 & \text{otherwise.} \end{cases}
\]
Notice that $E[X \mid Y = y] = 0$ for every feasible $y$, so $E[X \mid Y] = 0$; more precisely, $E[X \mid Y]$ is a random variable that equals zero with probability one. So we have $X - E[X \mid Y] = X$ (where the equality refers to equality in distribution). $X$ and $Y$ are not independent, so $X - E[X \mid Y]$ is not independent of $Y$.
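The counterexample can be verified by direct enumeration (not part of the original solution); the sketch below uses Python's fractions module to keep the probabilities exact.

```python
from fractions import Fraction

# Enumeration check (not part of the original solution): under the four
# equally likely points, P(X = 1, Y = 0) = 1/4 while
# P(X = 1) P(Y = 0) = (1/4)(1/2) = 1/8, so X and Y are dependent even
# though E[X | Y] = 0.
points = [(1, 0), (0, 1), (-1, 0), (0, -1)]
prob = Fraction(1, 4)
p_x1 = sum(prob for (x, y) in points if x == 1)
p_y0 = sum(prob for (x, y) in points if y == 0)
p_joint = sum(prob for (x, y) in points if x == 1 and y == 0)
print(p_joint, p_x1 * p_y0)            # 1/4 versus 1/8
```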

Problem 8.6
Let random variables $X$ and $Y$ have the bivariate normal PDF
\[
f_{X,Y}(x, y) = \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\!\left\{ -\frac{x^2 - 2\rho x y + y^2}{2(1-\rho^2)} \right\},
\qquad -\infty < x, y < \infty,
\]
where $\rho$ denotes the correlation coefficient between $X$ and $Y$.

(a) Determine the numerical values of $E[X]$, $\mathrm{var}(X)$, $E[Y]$ and $\mathrm{var}(Y)$.

(b) Show that $X$ and $Z = (Y - \rho X)/\sqrt{1-\rho^2}$ are independent normal random variables, and determine the numerical values of $E[Z]$ and $\mathrm{var}(Z)$.

(c) Deduce that
\[
P\big( \{X > 0\} \cap \{Y > 0\} \big) = \frac{1}{4} + \frac{1}{2\pi} \sin^{-1}\rho.
\]

Solution:

1. Two random variables $X$ and $Y$ are said to be jointly normal if they can be expressed in the form
\[
X = aU + bV, \qquad Y = cU + dV,
\]
where $U$ and $V$ are independent normal random variables (Section 4.7 in the text). The bivariate normal PDF of $X$ and $Y$ has the form
\[
f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}
\exp\!\left( -\frac{1}{2(1-\rho^2)} \left[
\frac{(x - E[X])^2}{\sigma_X^2}
- 2\rho \frac{(x - E[X])(y - E[Y])}{\sigma_X \sigma_Y}
+ \frac{(y - E[Y])^2}{\sigma_Y^2}
\right] \right), \qquad -\infty < x, y < \infty,
\]
where $E[X]$, $E[Y]$, $\sigma_X^2$, $\sigma_Y^2$ are the means and variances of $X$ and $Y$. By inspection, $E[X] = E[Y] = 0$ and $\mathrm{var}(X) = \mathrm{var}(Y) = 1$ for the given PDF.

2. From part (a), $X$ is normal since it is a linear combination of two independent normal random variables. Since $Z$ is a linear combination of $X$ and $Y$, it can also be expressed as a linear combination of the independent random variables $U$ and $V$; therefore $Z$ is also normal. To show that $X$ and $Z$ are independent, it suffices to show that $\mathrm{cov}(X, Z) = 0$. (Independence implies zero covariance, but the converse is not true except for jointly normal random variables.)
\[
E[Z] = \frac{E[Y]}{\sqrt{1-\rho^2}} - \frac{\rho E[X]}{\sqrt{1-\rho^2}} = 0,
\]
\[
\sigma_Z^2 = \mathrm{var}\!\left( \frac{Y - \rho X}{\sqrt{1-\rho^2}} \right)
= \frac{\sigma_Y^2 - 2\rho\, \mathrm{cov}(X, Y) + \rho^2 \sigma_X^2}{1-\rho^2}
= \frac{1 - 2\rho^2 + \rho^2}{1-\rho^2} = \frac{1-\rho^2}{1-\rho^2} = 1.
\]
Solving for the covariance of $X$ and $Z$, we have
\[
\mathrm{cov}(X, Z) = E\big[(X - E[X])(Z - E[Z])\big] = E[XZ]
= E\!\left[ X \cdot \frac{Y - \rho X}{\sqrt{1-\rho^2}} \right]
= \frac{E[XY] - \rho E[X^2]}{\sqrt{1-\rho^2}}
= \frac{\rho - \rho}{\sqrt{1-\rho^2}} = 0.
\]
Therefore, $X$ and $Z$ are independent.

3. Since $Z = (Y - \rho X)/\sqrt{1-\rho^2}$, if $Y > 0$ then $Z > -\rho X/\sqrt{1-\rho^2}$. Therefore we have
\[
P(X > 0, Y > 0) = P\!\left( X > 0,\ Z > \frac{-\rho X}{\sqrt{1-\rho^2}} \right)
= \int_{x > 0} \int_{z > -\rho x/\sqrt{1-\rho^2}} \frac{1}{2\pi} \exp\!\left( -\frac{x^2 + z^2}{2} \right) dz\, dx.
\]
Changing to polar coordinates yields
\[
P(X > 0, Y > 0) = \int_{\theta = -\alpha}^{\pi/2} \int_{r=0}^{\infty} \frac{1}{2\pi} \exp\!\left( -\frac{r^2}{2} \right) r\, dr\, d\theta
= \frac{1}{2\pi} \int_{-\alpha}^{\pi/2} d\theta
= \frac{1}{4} + \frac{1}{2\pi} \sin^{-1}\rho,
\]
where $\alpha = \tan^{-1}\!\left( \frac{\rho}{\sqrt{1-\rho^2}} \right) = \sin^{-1}\rho$.
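The orthant-probability formula can be checked by Monte Carlo simulation (not part of the original solution, assuming numpy); the value of $\rho$, the sample size, and the seed are arbitrary choices.

```python
import numpy as np

# Monte Carlo check (not part of the original solution) of
# P(X > 0, Y > 0) = 1/4 + arcsin(rho)/(2*pi) for a standard bivariate
# normal pair; rho, the sample size and the seed are arbitrary choices.
rng = np.random.default_rng(4)
rho, n = 0.6, 1_000_000
X = rng.standard_normal(n)
Y = rho * X + np.sqrt(1 - rho**2) * rng.standard_normal(n)   # corr(X, Y) = rho
print(np.mean((X > 0) & (Y > 0)))             # empirical probability
print(0.25 + np.arcsin(rho) / (2 * np.pi))    # closed form, approx. 0.352
```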