UCSD ECE250 Handout #24 Prof. Young-Han Kim Wednesday, June 6, 2018 Solutions to Exercise Set #7


1. Polya's urn. An urn initially has one red ball and one white ball. Let $X_1$ denote the color of the first ball drawn from the urn. Replace that ball together with another ball of the same color. Let $X_2$ denote the color of the next ball drawn. Replace it together with another ball of the same color. Continue drawing and replacing in this fashion.

(a) Argue that the probability of drawing $k$ reds followed by $n-k$ whites is

$$\frac{1}{2}\cdot\frac{2}{3}\cdots\frac{k}{k+1}\cdot\frac{1}{k+2}\cdot\frac{2}{k+3}\cdots\frac{n-k}{n+1} = \frac{k!\,(n-k)!}{(n+1)!} = \frac{1}{(n+1)\binom{n}{k}}.$$

(b) Let $P_n$ be the proportion of red balls in the urn after the $n$th drawing. Argue that $\Pr\{P_n = k/(n+2)\} = 1/(n+1)$ for $k = 1, 2, \ldots, n+1$. Thus all proportions are equally probable. This shows that $P_n$ tends to a uniformly distributed random variable in distribution, i.e., $\lim_{n\to\infty}\Pr\{P_n \le t\} = t$ for $0 \le t \le 1$.

(c) What can you say about the behavior of the proportion $P_n$ if you started initially with one red ball and two white balls in the urn? Specifically, what is the limiting distribution of $P_n$? Can you show $\Pr\{P_n = (k+1)/(n+3)\} = 2(n-k+1)/((n+1)(n+2))$ for $k = 0, 1, \ldots, n$?

Solution:

(a) Let $r$ be the number of red balls, $w$ the number of white balls, and $t = r + w$ the total number of balls in the urn. The composition of the urn before each drawing is tabulated below; the ratio $r/t$ or $w/t$ gives the probability of that drawing.

    drawing       probability of that draw
    1st           r/t = 1/2
    2nd           r/t = 2/3
    3rd           r/t = 3/4
    ...
    kth           r/t = k/(k+1)
    (k+1)th       w/t = 1/(k+2)
    (k+2)th       w/t = 2/(k+3)
    (k+3)th       w/t = 3/(k+4)
    ...
    nth           w/t = (n-k)/(n+1)

The probability of drawing $k$ reds followed by $n-k$ whites is therefore

$$\frac{1}{2}\cdot\frac{2}{3}\cdots\frac{k}{k+1}\cdot\frac{1}{k+2}\cdots\frac{n-k}{n+1} = \frac{k!\,(n-k)!}{(n+1)!} = \frac{1}{(n+1)\binom{n}{k}}.$$

In fact, regardless of the ordering, the probability of drawing $k$ reds in $n$ draws is given by this expression as well. Note that after the $n$th drawing there are $k+1$ red balls and $n-k+1$ white balls, for a total of $n+2$ balls in the urn.

(b) After the $n$th drawing, where $k$ red balls have been drawn, the proportion of red balls is

$$P_n = \frac{\text{number of red balls in the urn}}{\text{total number of balls in the urn}} = \frac{k+1}{n+2}.$$

There are $\binom{n}{k}$ orderings of outcomes with $k$ red balls and $n-k$ white balls, and each ordering has the same probability: in the expression for the probability of drawing $k$ reds followed by $n-k$ whites, a different sequence of drawings merely permutes the numerators. Therefore

$$\Pr\Big\{P_n = \frac{k+1}{n+2}\Big\} = \binom{n}{k}\,\frac{1}{(n+1)\binom{n}{k}} = \frac{1}{n+1}$$

for $k = 0, 1, \ldots, n$. All the proportions are equally probable (the probability does not depend on $k$), and $P_n$ tends to a $\mathrm{Unif}[0,1]$ random variable in distribution.

(c) If there are one red ball and two white balls to start with:

    drawing       probability of that draw
    1st           r/t = 1/3
    2nd           r/t = 2/4
    3rd           r/t = 3/5
    ...
    kth           r/t = k/(k+2)
    (k+1)th       w/t = 2/(k+3)
    (k+2)th       w/t = 3/(k+4)
    (k+3)th       w/t = 4/(k+5)
    ...
    nth           w/t = (n-k+1)/(n+2)

The probability of drawing $k$ reds followed by $n-k$ whites is

$$\frac{1}{3}\cdot\frac{2}{4}\cdots\frac{k}{k+2}\cdot\frac{2}{k+3}\cdot\frac{3}{k+4}\cdots\frac{n-k+1}{n+2} = \frac{2\,k!\,(n-k+1)!}{(n+2)!}.$$

After the $n$th drawing,

$$P_n = \frac{\text{number of red balls in the urn}}{\text{total number of balls in the urn}} = \frac{k+1}{n+3}.$$

Similarly to part (b),

$$\Pr\Big\{P_n = \frac{k+1}{n+3}\Big\} = \binom{n}{k}\,\frac{2\,k!\,(n-k+1)!}{(n+2)!} = \frac{n!}{k!\,(n-k)!}\cdot\frac{2\,k!\,(n-k+1)!}{(n+2)!} = \frac{2(n-k+1)}{(n+1)(n+2)}.$$

The distribution is linearly decreasing in $p = \frac{k+1}{n+3}$:

$$\Pr\{P_n = p\} = -\frac{2(n+3)}{(n+1)(n+2)}\,p + \frac{2}{n+1}.$$

Therefore the limiting distribution of $P_n$ as $n$ goes to infinity is a triangular distribution with density

$$f_{P}(p) = 2(1-p), \qquad 0 \le p \le 1.$$

Note: the Polya urn for this problem generates a process that is identical to the following mixture of Bernoullis. Let $\Theta \sim f_\Theta(\theta) = 2(1-\theta)$, $0 \le \theta < 1$, and let $X_i \mid \{\Theta = \theta\}$ be i.i.d. $\mathrm{Bern}(\theta)$. Then $P_n$ converges to $\Theta$ as $n$ tends to infinity. Considering the general case of $\lambda_1$ red balls and $\lambda_2$ white balls to start with, the limiting distribution is

$$f_\Theta(\theta) = C(\lambda_1,\lambda_2)\,\theta^{\lambda_1-1}(1-\theta)^{\lambda_2-1},$$

where $C(\lambda_1,\lambda_2)$ is a normalizing constant depending on $\lambda_1$ and $\lambda_2$, i.e., $\Theta$ has a $\mathrm{Beta}(\lambda_1,\lambda_2)$ distribution.
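As a quick numerical check on parts (b) and (c), here is a short Python sketch (our own addition, not part of the original handout; the function name polya_proportion and the sample sizes are arbitrary choices):

    import numpy as np

    def polya_proportion(n_draws, red=1, white=1, rng=None):
        """Simulate one Polya urn and return the final proportion of red balls."""
        rng = rng if rng is not None else np.random.default_rng()
        r, w = red, white
        for _ in range(n_draws):
            if rng.random() < r / (r + w):
                r += 1      # drew red: replace it and add another red
            else:
                w += 1      # drew white: replace it and add another white
        return r / (r + w)

    rng = np.random.default_rng(0)
    n, trials = 200, 20000

    # (b) one red, one white: P_n should be roughly Unif[0,1], so each of
    # five equal-width bins should hold about 20% of the samples.
    p = np.array([polya_proportion(n, 1, 1, rng) for _ in range(trials)])
    print(np.histogram(p, bins=5, range=(0, 1))[0] / trials)

    # (c) one red, two white: density f(p) = 2(1-p), so the five bins
    # should hold about 0.36, 0.28, 0.20, 0.12, 0.04 of the samples.
    p = np.array([polya_proportion(n, 1, 2, rng) for _ in range(trials)])
    print(np.histogram(p, bins=5, range=(0, 1))[0] / trials)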

2. Symmetric random walk. Let $X_n$ be a random walk defined by

$$X_0 = 0, \qquad X_n = \sum_{i=1}^{n} Z_i,$$

where $Z_1, Z_2, \ldots$ are i.i.d. with $\Pr\{Z_1 = 1\} = \Pr\{Z_1 = -1\} = 1/2$.

(a) Find $\Pr\{X_{10} = 10\}$.

(b) Approximate $\Pr\{-10 \le X_{100} \le 10\}$ using the central limit theorem.

(c) Find $\Pr\{X_n = k\}$.

Solution:

(a) Since the event $\{X_{10} = 10\}$ is equivalent to $\{Z_1 = Z_2 = \cdots = Z_{10} = 1\}$, we have $\Pr\{X_{10} = 10\} = 2^{-10}$.

(b) Since $\mathrm{E}(Z_j) = 0$ and $\mathrm{E}(Z_j^2) = 1$, by the central limit theorem

$$\Pr\{-10 \le X_{100} \le 10\} = \Pr\Big\{-1 \le \frac{1}{10}\sum_{i=1}^{100} Z_i \le 1\Big\} \approx 1 - 2Q(1) = 2\Phi(1) - 1 \approx 0.68.$$

(c) $\Pr\{X_n = k\} = \Pr\{(n+k)/2 \text{ heads in } n \text{ independent coin tosses}\} = \binom{n}{(n+k)/2}\,2^{-n}$ for $-n \le k \le n$ with $n+k$ even, and $0$ otherwise.
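All three answers are easy to check by simulation. The sketch below is ours (sample sizes arbitrary); note that the crude CLT value 0.68 understates the true probability, which is closer to 0.73 because $X_{100}$ is even-valued and a continuity correction absorbs most of the gap:

    import numpy as np
    from math import comb

    rng = np.random.default_rng(1)
    steps = rng.choice([-1, 1], size=(100000, 100))
    X = steps.cumsum(axis=1)            # X[:, n-1] holds X_n

    # (a) P{X_10 = 10} = 2^-10: all of the first ten steps must be +1.
    print((X[:, 9] == 10).mean(), 2.0 ** -10)

    # (b) CLT gives ~0.68; the simulated value is closer to 0.73 since
    # X_100 lives on the even integers (continuity correction effect).
    print((np.abs(X[:, 99]) <= 10).mean())

    # (c) check the formula at n = 100, k = 4: C(100, 52) 2^-100.
    print((X[:, 99] == 4).mean(), comb(100, 52) * 2.0 ** -100)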

3. Absolute-value random walk. Consider the symmetric random walk $X_n$ in the previous problem. Define the absolute-value random process $Y_n = |X_n|$.

(a) Find $\Pr\{Y_n = k\}$.

(b) Find $\Pr\{\max_{i<20} Y_i = 10 \mid Y_{20} = 0\}$.

Solution:

(a) If $k > 0$ then $\Pr\{Y_n = k\} = \Pr\{X_n = +k \text{ or } X_n = -k\} = 2\Pr\{X_n = k\}$, while $\Pr\{Y_n = 0\} = \Pr\{X_n = 0\}$. Thus

$$\Pr\{Y_n = k\} = \begin{cases} \binom{n}{(n+k)/2}\,2^{-n+1}, & 0 < k \le n,\ n-k \text{ even}, \\[4pt] \binom{n}{n/2}\,2^{-n}, & k = 0,\ n \text{ even}, \\[4pt] 0, & \text{otherwise}. \end{cases}$$

(b) If $Y_{20} = |X_{20}| = 0$, then there are only two sample paths with $\max_{i<20} Y_i = 10$, namely $Z_1 = \cdots = Z_{10} = +1,\ Z_{11} = \cdots = Z_{20} = -1$, or $Z_1 = \cdots = Z_{10} = -1,\ Z_{11} = \cdots = Z_{20} = +1$. Since the total number of sample paths with $X_{20} = 0$ is $\binom{20}{10}$ and all paths are equally likely,

$$\Pr\Big\{\max_{i<20} Y_i = 10 \,\Big|\, Y_{20} = 0\Big\} = \frac{2}{\binom{20}{10}} = \frac{2}{184756} = \frac{1}{92378}.$$
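Part (b) can be verified exactly (no sampling error) by enumerating all $2^{20}$ equally likely step sequences. A brute-force sketch of ours; the enumeration allocates on the order of 100 MB:

    import numpy as np
    from math import comb

    n = 20
    codes = np.arange(2 ** n, dtype=np.uint32)
    bits = ((codes[:, None] >> np.arange(n, dtype=np.uint32)) & 1).astype(np.int8)
    steps = 2 * bits - 1                        # each row: one +-1 step sequence
    X = steps.cumsum(axis=1, dtype=np.int8)     # |X_i| <= 20 fits in int8
    Y = np.abs(X)

    returned = Y[:, -1] == 0                    # condition {Y_20 = 0}
    hit10 = Y[:, :-1].max(axis=1) == 10         # event {max_{i<20} Y_i = 10}
    print(hit10[returned].mean())               # exact conditional probability
    print(2 / comb(20, 10))                     # 1/92378, as derived above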

4. Discrete-time Wiener process. Let $Z_n$, $n \ge 1$, be a discrete-time white Gaussian noise (WGN) process, i.e., $Z_1, Z_2, \ldots$ are i.i.d. $N(0,1)$. Define the process $X_n$, $n \ge 0$, as $X_0 = 0$ and $X_n = X_{n-1} + Z_n$ for $n \ge 1$.

(a) Is $X_n$ an independent increment process? Justify your answer.

(b) Is $X_n$ a Gaussian process? Justify your answer.

(c) Find the mean and autocorrelation functions of $X_n$.

(d) Specify the first-order pdf of $X_n$.

(e) Specify the joint pdf of $X_3$, $X_5$, and $X_8$.

(f) Find $\mathrm{E}(X_{20} \mid X_1, X_2, \ldots, X_{10})$.

(g) Given $X_1 = 4$, $X_2 = 2$, and $0 \le X_3 \le 4$, find the minimum MSE estimate of $X_4$.

Solution:

(a) Yes. The increments $X_{n_1},\ X_{n_2} - X_{n_1},\ \ldots,\ X_{n_k} - X_{n_{k-1}}$ are sums of non-overlapping sets of the $Z_i$ random variables, and the $Z_i$ are i.i.d.

(b) Yes. Any set of samples of $X_n$, $n \ge 1$, is obtained by a linear transformation of i.i.d. $N(0,1)$ random variables, and therefore all $n$th-order distributions of $X_n$ are jointly Gaussian. (It is not sufficient to show that the random variable $X_n$ is Gaussian for each $n$.)

(c) We have

$$\mathrm{E}[X_n] = \mathrm{E}\Big[\sum_{i=1}^{n} Z_i\Big] = \sum_{i=1}^{n} \mathrm{E}[Z_i] = 0$$

and

$$R_X(n_1,n_2) = \mathrm{E}[X_{n_1} X_{n_2}] = \mathrm{E}\Big[\sum_{i=1}^{n_1} Z_i \sum_{j=1}^{n_2} Z_j\Big] = \sum_{i=1}^{n_1}\sum_{j=1}^{n_2} \mathrm{E}[Z_i Z_j] = \sum_{i=1}^{\min(n_1,n_2)} \mathrm{E}[Z_i^2] = \min(n_1,n_2),$$

since $\mathrm{E}[Z_i Z_j] = 0$ for $i \ne j$ (independence) and $\mathrm{E}[Z_i^2] = 1$.

(d) As shown above, $X_n$ is Gaussian with mean zero and variance $\mathrm{Var}(X_n) = \mathrm{E}[X_n^2] - \mathrm{E}^2[X_n] = R_X(n,n) - 0 = n$. Thus $X_n \sim N(0,n)$. More generally, $\mathrm{Cov}(X_{n_1}, X_{n_2}) = \mathrm{E}(X_{n_1}X_{n_2}) - \mathrm{E}(X_{n_1})\mathrm{E}(X_{n_2}) = \min(n_1,n_2)$. Therefore $X_{n_1}$ and $X_{n_2}$ are jointly Gaussian random variables with mean $\mu = [0\ \ 0]^T$ and covariance matrix

$$\Sigma = \begin{pmatrix} n_1 & \min(n_1,n_2) \\ \min(n_1,n_2) & n_2 \end{pmatrix}.$$

(e) $X_n$, $n \ge 1$, is a zero-mean Gaussian random process. Thus

$$\begin{pmatrix} X_3 \\ X_5 \\ X_8 \end{pmatrix} \sim N\left( \begin{pmatrix} \mathrm{E}[X_3] \\ \mathrm{E}[X_5] \\ \mathrm{E}[X_8] \end{pmatrix},\ \begin{pmatrix} R_X(3,3) & R_X(3,5) & R_X(3,8) \\ R_X(5,3) & R_X(5,5) & R_X(5,8) \\ R_X(8,3) & R_X(8,5) & R_X(8,8) \end{pmatrix} \right).$$

Substituting, we get

$$\begin{pmatrix} X_3 \\ X_5 \\ X_8 \end{pmatrix} \sim N\left( \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix},\ \begin{pmatrix} 3 & 3 & 3 \\ 3 & 5 & 5 \\ 3 & 5 & 8 \end{pmatrix} \right).$$

(f) Since $X_n$ is an independent increment process,

$$\mathrm{E}(X_{20} \mid X_1,\ldots,X_{10}) = \mathrm{E}(X_{20} - X_{10} + X_{10} \mid X_1,\ldots,X_{10}) = \mathrm{E}(X_{20} - X_{10} \mid X_1,\ldots,X_{10}) + \mathrm{E}(X_{10} \mid X_1,\ldots,X_{10})$$
$$= \mathrm{E}(X_{20} - X_{10}) + X_{10} = 0 + X_{10} = X_{10}.$$

(g) The MMSE estimate of $X_4$ given $X_1 = x_1$, $X_2 = x_2$, and $a \le X_3 \le b$ is

$$\mathrm{E}[X_4 \mid X_1 = x_1,\, X_2 = x_2,\, a \le X_3 \le b] = \mathrm{E}[x_2 + Z_3 + Z_4 \mid X_1 = x_1,\, X_2 = x_2,\, a \le x_2 + Z_3 \le b]$$
$$= x_2 + \mathrm{E}[Z_3 \mid a - x_2 \le Z_3 \le b - x_2] + \mathrm{E}[Z_4] = x_2 + \mathrm{E}[Z_3 \mid a - x_2 \le Z_3 \le b - x_2],$$

since $Z_3$ is independent of $(X_1, X_2)$ and $Z_4$ is independent of $(X_1, X_2, Z_3)$. Plugging in the values $x_2 = 2$, $a = 0$, $b = 4$, the required MMSE estimate is

$$\hat{X}_4 = 2 + \mathrm{E}\big[Z_3 \mid Z_3 \in [-2,2]\big].$$

We have

$$f_{Z_3 \mid Z_3 \in [-2,2]}(z_3) = \begin{cases} \dfrac{f_{Z_3}(z_3)}{\Pr(Z_3 \in [-2,2])}, & z_3 \in [-2,2], \\[6pt] 0, & \text{otherwise}, \end{cases}$$

which is symmetric about $z_3 = 0$. Thus $\mathrm{E}[Z_3 \mid Z_3 \in [-2,2]] = 0$ and $\hat{X}_4 = 2$.
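Parts (e) and (g) lend themselves to a quick simulation check. In the sketch below (ours; sample sizes arbitrary), the conditioning in (g) is simulated by fixing $X_2 = 2$, drawing fresh $Z_3, Z_4$, and keeping only trials with $0 \le X_3 \le 4$:

    import numpy as np

    rng = np.random.default_rng(2)
    trials = 200000

    # (e) empirical covariance of (X_3, X_5, X_8); should be near
    # [[3, 3, 3], [3, 5, 5], [3, 5, 8]].
    X = rng.standard_normal((trials, 8)).cumsum(axis=1)
    print(np.cov(X[:, [2, 4, 7]], rowvar=False).round(2))

    # (g) X_4 = X_2 + Z_3 + Z_4 with X_2 fixed at 2, keeping 0 <= X_3 <= 4;
    # the conditional mean should come out near 2.
    Z3 = rng.standard_normal(trials)
    Z4 = rng.standard_normal(trials)
    X3 = 2 + Z3
    keep = (X3 >= 0) & (X3 <= 4)
    print((2 + Z3[keep] + Z4[keep]).mean())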

5. A random process. Let $X_n = Z_n + Z_{n-1}$ for $n \ge 1$, where $Z_0, Z_1, Z_2, \ldots$ are i.i.d. $N(0,1)$.

(a) Find the mean and autocorrelation function of $\{X_n\}$.

(b) Is $\{X_n\}$ Gaussian? Justify your answer.

(c) Find $\mathrm{E}(X_3 \mid X_1, X_2)$.

(d) Find $\mathrm{E}(X_3 \mid X_2)$.

(e) Is $\{X_n\}$ Markov? Justify your answer.

(f) Does $\{X_n\}$ have independent increments? Justify your answer.

Solution:

(a) $\mathrm{E}(X_n) = \mathrm{E}(Z_n) + \mathrm{E}(Z_{n-1}) = 0$, and

$$R_X(m,n) = \mathrm{E}(X_m X_n) = \mathrm{E}\big[(Z_m + Z_{m-1})(Z_n + Z_{n-1})\big] = \begin{cases} \mathrm{E}[Z_n^2] + \mathrm{E}[Z_{n-1}^2] = 2, & m = n, \\ \mathrm{E}[Z_{\min(m,n)}^2] = 1, & |m - n| = 1, \\ 0, & \text{otherwise}. \end{cases}$$

(b) Since $(X_1,\ldots,X_n)$ is a linear transformation of the Gaussian random vector $(Z_0, Z_1, \ldots, Z_n)$, the process is Gaussian.

(c) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence

$$\mathrm{E}(X_3 \mid X_1, X_2) = \begin{bmatrix} 0 & 1 \end{bmatrix}\begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}^{-1}\begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \frac{1}{3}(2X_2 - X_1).$$

(d) Similarly, $\mathrm{E}(X_3 \mid X_2) = \dfrac{R_X(3,2)}{R_X(2,2)}\,X_2 = \dfrac{1}{2}X_2$.

(e) Since $\mathrm{E}(X_3 \mid X_1, X_2) \ne \mathrm{E}(X_3 \mid X_2)$, the process is not Markov.

(f) Since the process is not Markov, it does not have independent increments (an independent increment process is always Markov).
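Because the conditional expectations in (c) and (d) are linear, ordinary least squares on simulated data should recover the coefficients $(-1/3,\ 2/3)$ and $1/2$. A short sketch of ours:

    import numpy as np

    rng = np.random.default_rng(3)
    Z = rng.standard_normal((200000, 4))       # columns: Z_0, Z_1, Z_2, Z_3
    X = Z[:, 1:] + Z[:, :-1]                   # columns: X_1, X_2, X_3

    # (c) regress X_3 on (X_1, X_2): coefficients -> (-1/3, 2/3)
    coef, *_ = np.linalg.lstsq(X[:, :2], X[:, 2], rcond=None)
    print(coef)

    # (d) regress X_3 on X_2 alone: coefficient -> 1/2
    print((X[:, 1] * X[:, 2]).mean() / (X[:, 1] ** 2).mean())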

6. Autoregressive process. Let $X_0 = 0$ and $X_n = \frac{1}{2}X_{n-1} + Z_n$ for $n \ge 1$, where $Z_1, Z_2, \ldots$ are i.i.d. $N(0,1)$. Find the mean and autocorrelation function of $X_n$.

Solution: The mean is

$$\mathrm{E}[X_n] = \tfrac{1}{2}\mathrm{E}[X_{n-1}] + \mathrm{E}[Z_n] = \tfrac{1}{4}\mathrm{E}[X_{n-2}] + \tfrac{1}{2}\mathrm{E}[Z_{n-1}] + \mathrm{E}[Z_n] = \cdots = \big(\tfrac{1}{2}\big)^{n-1}\mathrm{E}[Z_1] = 0.$$

For $n > m$ we can write

$$X_n = \big(\tfrac{1}{2}\big)^{n-m} X_m + \big(\tfrac{1}{2}\big)^{n-m-1} Z_{m+1} + \cdots + \tfrac{1}{2} Z_{n-1} + Z_n = \big(\tfrac{1}{2}\big)^{n-m} X_m + \sum_{i=0}^{n-m-1} \big(\tfrac{1}{2}\big)^{i} Z_{n-i}.$$

Therefore

$$R_X(n,m) = \mathrm{E}(X_n X_m) = \big(\tfrac{1}{2}\big)^{n-m}\,\mathrm{E}[X_m^2],$$

since $X_m$ and $Z_{n-i}$, $i = 0, \ldots, n-m-1$, are independent and $\mathrm{E}[X_m] = \mathrm{E}[Z_{n-i}] = 0$. To find $\mathrm{E}[X_m^2]$, consider

$$\mathrm{E}[X_1^2] = 1, \qquad \mathrm{E}[X_2^2] = \tfrac{1}{4}\mathrm{E}[X_1^2] + \mathrm{E}[Z_2^2] = \tfrac{1}{4} + 1,$$

and in general

$$\mathrm{E}[X_n^2] = \big(\tfrac{1}{4}\big)^{n-1} + \cdots + \tfrac{1}{4} + 1 = \tfrac{4}{3}\Big(1 - \big(\tfrac{1}{4}\big)^{n}\Big).$$

Thus in general,

$$R_X(n,m) = \mathrm{E}(X_n X_m) = \big(\tfrac{1}{2}\big)^{|n-m|}\,\tfrac{4}{3}\Big(1 - \big(\tfrac{1}{4}\big)^{\min\{n,m\}}\Big).$$
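The closed form for $R_X$ can be compared against sample averages. A sketch of ours, with arbitrary index pairs:

    import numpy as np

    rng = np.random.default_rng(4)
    trials, N = 100000, 12
    X = np.zeros((trials, N + 1))              # X[:, 0] = X_0 = 0
    for n in range(1, N + 1):
        X[:, n] = 0.5 * X[:, n - 1] + rng.standard_normal(trials)

    def r_theory(n, m):
        return 0.5 ** abs(n - m) * (4 / 3) * (1 - 0.25 ** min(n, m))

    for n, m in [(5, 5), (8, 5), (10, 9)]:     # sample vs. theoretical R_X(n, m)
        print((X[:, n] * X[:, m]).mean(), r_theory(n, m))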

7. Moving average process. Let $Z_0, Z_1, Z_2, \ldots$ be i.i.d. $N(0,1)$.

(a) Let $X_n = Z_n + \frac{1}{2}Z_{n-1}$ for $n \ge 1$. Find the mean and autocorrelation function of $X_n$.

(b) Is $\{X_n\}$ wide-sense stationary? Justify your answer.

(c) Is $\{X_n\}$ Gaussian? Justify your answer.

(d) Is $\{X_n\}$ strict-sense stationary? Justify your answer.

(e) Find $\mathrm{E}(X_3 \mid X_1, X_2)$.

(f) Find $\mathrm{E}(X_3 \mid X_2)$.

(g) Is $\{X_n\}$ Markov? Justify your answer.

(h) Does $\{X_n\}$ have independent increments? Justify your answer.

(i) Let $Y_n = Z_n + Z_{n-1}$ for $n \ge 1$. Find the mean and autocorrelation function of $Y_n$.

(j) Is $\{Y_n\}$ wide-sense stationary? Justify your answer.

(k) Is $\{Y_n\}$ Gaussian? Justify your answer.

(l) Is $\{Y_n\}$ strict-sense stationary? Justify your answer.

(m) Find $\mathrm{E}(Y_3 \mid Y_1, Y_2)$.

(n) Find $\mathrm{E}(Y_3 \mid Y_2)$.

(o) Is $\{Y_n\}$ Markov? Justify your answer.

(p) Does $\{Y_n\}$ have independent increments? Justify your answer.

Solution:

(a) $\mathrm{E}(X_n) = \mathrm{E}(Z_n) + \frac{1}{2}\mathrm{E}(Z_{n-1}) = 0$, and

$$R_X(m,n) = \mathrm{E}(X_m X_n) = \mathrm{E}\big[(Z_m + \tfrac{1}{2}Z_{m-1})(Z_n + \tfrac{1}{2}Z_{n-1})\big] = \begin{cases} \mathrm{E}[Z_n^2] + \tfrac{1}{4}\mathrm{E}[Z_{n-1}^2] = \tfrac{5}{4}, & m = n, \\ \tfrac{1}{2}\mathrm{E}[Z_{\min(m,n)}^2] = \tfrac{1}{2}, & |m - n| = 1, \\ 0, & \text{otherwise}. \end{cases}$$

(b) Since the mean and autocorrelation functions are time invariant, the process is WSS.

(c) Since $(X_1,\ldots,X_n)$ is a linear transformation of the Gaussian random vector $(Z_0, Z_1, \ldots, Z_n)$, the process is Gaussian.

(d) Since the process is WSS and Gaussian, it is SSS.

(e) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence

$$\mathrm{E}(X_3 \mid X_1, X_2) = \begin{bmatrix} 0 & \tfrac{1}{2} \end{bmatrix}\begin{bmatrix} \tfrac{5}{4} & \tfrac{1}{2} \\ \tfrac{1}{2} & \tfrac{5}{4} \end{bmatrix}^{-1}\begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \frac{1}{21}(10X_2 - 4X_1).$$

(f) Similarly, $\mathrm{E}(X_3 \mid X_2) = \dfrac{1/2}{5/4}\,X_2 = \dfrac{2}{5}X_2$.

(g) Since $\mathrm{E}(X_3 \mid X_1, X_2) \ne \mathrm{E}(X_3 \mid X_2)$, the process is not Markov.

(h) Since the process is not Markov, it does not have independent increments.

(i) The mean function is $\mu_n = \mathrm{E}(Y_n) = 0$, and the autocorrelation function is

$$R_Y(n,m) = \begin{cases} 2, & n = m, \\ 1, & |n - m| = 1, \\ 0, & \text{otherwise}. \end{cases}$$

(j) Since the mean and autocorrelation functions are time invariant, the process is WSS.

(k) Since $(Y_1,\ldots,Y_n)$ is a linear transformation of the Gaussian random vector $(Z_0, Z_1, \ldots, Z_n)$, the process is Gaussian.

(l) Since the process is WSS and Gaussian, it is SSS.

(m) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence

$$\mathrm{E}(Y_3 \mid Y_1, Y_2) = \begin{bmatrix} 0 & 1 \end{bmatrix}\begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}^{-1}\begin{bmatrix} Y_1 \\ Y_2 \end{bmatrix} = \frac{1}{3}(2Y_2 - Y_1).$$

(n) Similarly, $\mathrm{E}(Y_3 \mid Y_2) = \dfrac{1}{2}Y_2$.

(o) Since $\mathrm{E}(Y_3 \mid Y_1, Y_2) \ne \mathrm{E}(Y_3 \mid Y_2)$, the process is not Markov.

(p) Since the process is not Markov, it does not have independent increments.
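The same regression check as in Problem 5 applies here: the least-squares coefficients should approach $(-4/21,\ 10/21)$ for part (e) and $2/5$ for part (f). A sketch of ours:

    import numpy as np

    rng = np.random.default_rng(5)
    Z = rng.standard_normal((200000, 4))       # columns: Z_0, Z_1, Z_2, Z_3
    X = Z[:, 1:] + 0.5 * Z[:, :-1]             # X_n = Z_n + Z_{n-1}/2

    # (e) regress X_3 on (X_1, X_2): coefficients -> (-4/21, 10/21)
    coef, *_ = np.linalg.lstsq(X[:, :2], X[:, 2], rcond=None)
    print(coef, (-4 / 21, 10 / 21))

    # (f) regress X_3 on X_2 alone: coefficient -> 2/5
    print((X[:, 1] * X[:, 2]).mean() / (X[:, 1] ** 2).mean())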

8. Gauss-Markov process. Let $X_0 = 0$ and $X_n = \frac{1}{2}X_{n-1} + Z_n$ for $n \ge 1$, where $Z_1, Z_2, \ldots$ are i.i.d. $N(0,1)$. Find the mean and autocorrelation function of $X_n$.

Solution: Using the method of Lecture Notes 8 (or the direct calculation in Problem 6 above), we can easily verify that $\mathrm{E}(X_n) = 0$ for every $n$ and that

$$R_X(n,m) = \mathrm{E}(X_n X_m) = \big(\tfrac{1}{2}\big)^{|n-m|}\,\tfrac{4}{3}\Big(1 - \big(\tfrac{1}{4}\big)^{\min\{n,m\}}\Big).$$

9. QAM random process. Consider the random process

$$X(t) = Z_1 \cos \omega t + Z_2 \sin \omega t, \qquad -\infty < t < \infty,$$

where $Z_1$ and $Z_2$ are i.i.d. discrete random variables such that $p_{Z_i}(+1) = p_{Z_i}(-1) = \frac{1}{2}$.

(a) Is $X(t)$ wide-sense stationary? Justify your answer.

(b) Is $X(t)$ strict-sense stationary? Justify your answer.

Solution:

(a) We first check the mean:

$$\mathrm{E}(X(t)) = \mathrm{E}(Z_1)\cos \omega t + \mathrm{E}(Z_2)\sin \omega t = 0 \cdot \cos \omega t + 0 \cdot \sin \omega t = 0.$$

The mean is independent of $t$. Next we consider the autocorrelation function:

$$\mathrm{E}(X(t+\tau)X(t)) = \mathrm{E}\Big(\big(Z_1\cos(\omega(t+\tau)) + Z_2\sin(\omega(t+\tau))\big)\big(Z_1\cos \omega t + Z_2 \sin \omega t\big)\Big)$$
$$= \mathrm{E}(Z_1^2)\cos(\omega(t+\tau))\cos(\omega t) + \mathrm{E}(Z_2^2)\sin(\omega(t+\tau))\sin(\omega t)$$
$$= \cos(\omega(t+\tau))\cos(\omega t) + \sin(\omega(t+\tau))\sin(\omega t) = \cos(\omega(t+\tau) - \omega t) = \cos \omega\tau,$$

where the cross terms vanish since $\mathrm{E}(Z_1 Z_2) = \mathrm{E}(Z_1)\mathrm{E}(Z_2) = 0$. The autocorrelation function is also time invariant. Therefore $X(t)$ is WSS.

(b) Note that $X(0) = Z_1\cos 0 + Z_2 \sin 0 = Z_1$, so $X(0)$ has the same pmf as $Z_1$. On the other hand,

$$X\Big(\frac{\pi}{4\omega}\Big) = Z_1\cos(\pi/4) + Z_2\sin(\pi/4) = \frac{1}{\sqrt{2}}(Z_1 + Z_2) = \begin{cases} \sqrt{2} & \text{w.p. } 1/4, \\ 0 & \text{w.p. } 1/2, \\ -\sqrt{2} & \text{w.p. } 1/4. \end{cases}$$

This shows that $X(\pi/(4\omega))$ does not have the same pmf (or even the same range) as $X(0)$. Therefore $X(t)$ is not first-order stationary and, as a result, is not SSS.
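Both conclusions are visible in a few lines of simulation (ours; $\omega = 1$ is an arbitrary choice): the sample autocorrelation matches $\cos \omega\tau$ for several values of $t$, while the first-order pmf changes between $t = 0$ and $t = \pi/(4\omega)$:

    import numpy as np

    rng = np.random.default_rng(6)
    Z = rng.choice([-1.0, 1.0], size=(2, 100000))
    w = 1.0                                     # arbitrary fixed frequency

    def X(t):
        return Z[0] * np.cos(w * t) + Z[1] * np.sin(w * t)

    # (a) sample autocorrelation ~ cos(w tau), for several base times t
    tau = 0.5
    for t in [0.0, 2.0, 5.0]:
        print((X(t + tau) * X(t)).mean(), np.cos(w * tau))

    # (b) supports differ: {-1, +1} at t = 0 vs {-sqrt(2), 0, sqrt(2)}
    print(np.unique(X(0.0)), np.unique(X(np.pi / (4 * w)).round(6)))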

10. Mixture of two WSS processes. Let $X(t)$ and $Y(t)$ be two zero-mean WSS processes with autocorrelation functions $R_X(\tau)$ and $R_Y(\tau)$, respectively. Define the process

$$Z(t) = \begin{cases} X(t), & \text{with probability } \tfrac{1}{2}, \\ Y(t), & \text{with probability } \tfrac{1}{2}, \end{cases}$$

where the same choice is used for all $t$. Find the mean and autocorrelation functions for $Z(t)$. Is $Z(t)$ a WSS process? Justify your answer.

Solution: To show that $Z(t)$ is WSS, we show that its mean and autocorrelation functions are time invariant. Consider

$$\mu_Z(t) = \mathrm{E}[Z(t)] = \mathrm{E}(Z(t) \mid Z \equiv X)\Pr\{Z \equiv X\} + \mathrm{E}(Z(t) \mid Z \equiv Y)\Pr\{Z \equiv Y\} = \tfrac{1}{2}(\mu_X + \mu_Y) = 0,$$

and similarly

$$R_Z(t+\tau, t) = \mathrm{E}[Z(t+\tau)Z(t)] = \tfrac{1}{2}\big(R_X(\tau) + R_Y(\tau)\big).$$

Since $\mu_Z(t)$ is independent of time and $R_Z(t+\tau,t)$ depends only on $\tau$, $Z(t)$ is WSS.
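To see the result concretely, one can pick two specific zero-mean WSS processes, say $X(t) = A_1\cos t + A_2\sin t$ with $R_X(\tau) = \cos\tau$ and $Y(t) = B_1\cos 2t + B_2\sin 2t$ with $R_Y(\tau) = \cos 2\tau$, where all amplitudes are i.i.d. $N(0,1)$, and flip one fair coin per sample path. A sketch of ours, not from the handout; the choice of processes is purely illustrative:

    import numpy as np

    rng = np.random.default_rng(7)
    trials = 200000
    A = rng.standard_normal((4, trials))        # amplitudes for X and Y
    pick = rng.random(trials) < 0.5             # one coin flip per sample path

    def Zt(t):
        X = A[0] * np.cos(t) + A[1] * np.sin(t)          # R_X(tau) = cos(tau)
        Y = A[2] * np.cos(2 * t) + A[3] * np.sin(2 * t)  # R_Y(tau) = cos(2 tau)
        return np.where(pick, X, Y)

    # R_Z(t + tau, t) should equal (cos(tau) + cos(2 tau)) / 2 for every t
    tau = 0.7
    for t in [0.0, 1.0, 4.0]:
        print((Zt(t + tau) * Zt(t)).mean(), (np.cos(tau) + np.cos(2 * tau)) / 2)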