UCSD ECE250 Handout #20 Prof. Young-Han Kim Monday, February 26, Solutions to Exercise Set #7

1. Minimum waiting time. Let X_1, X_2, ... be i.i.d. exponentially distributed random variables with parameter λ, i.e., f_{X_i}(x) = λ e^{−λx} for x ≥ 0.

(a) Does Y_n = min{X_1, X_2, ..., X_n} converge in probability as n approaches infinity?

(b) If it converges, what is the limit?

(c) What about Z_n = nY_n?

Solution:

(a) For any realization of the X_i's, the sequence Y_n is monotonically nonincreasing in n. Since the random variables are nonnegative, it is reasonable to guess that Y_n converges to 0. Now Y_n converges in probability to 0 if and only if for every ε > 0, lim_{n→∞} P{|Y_n| > ε} = 0. We have

P{|Y_n| > ε} = P{Y_n > ε}
            = P{X_1 > ε, X_2 > ε, ..., X_n > ε}
            = P{X_1 > ε} P{X_2 > ε} ⋯ P{X_n > ε}
            = (1 − F_X(ε))^n
            = (1 − (1 − e^{−λε}))^n
            = e^{−λεn}.

As n goes to infinity (for any fixed ε > 0), this converges to zero. Therefore Y_n converges to 0 in probability.

(b) The limit to which Y_n converges in probability is 0.

(c) Does Z_n = nY_n converge to 0 in probability? No, it does not. In fact,

P{|Z_n| > ε} = P{nY_n > ε} = P{Y_n > ε/n} = e^{−λ(ε/n)n} = e^{−λε},

which does not depend on n, so Z_n does not converge to 0 in probability. Note that the distribution of Z_n is exponential with parameter λ, the same as the distribution of each X_i:

F_{Z_n}(z) = P{Z_n ≤ z} = 1 − e^{−λz}.
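As a quick numerical sanity check (a sketch added here, not part of the handout; NumPy assumed, and λ = 2, ε = 0.1 are arbitrary choices), the empirical P{Y_n > ε} should fall to 0 as n grows while P{Z_n > ε} stays pinned near e^{−λε}:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, eps, trials = 2.0, 0.1, 10_000

for n in [10, 100, 1000]:
    Y = rng.exponential(1 / lam, size=(trials, n)).min(axis=1)  # Y_n = min of n Exp(lam) draws
    Z = n * Y                                                   # Z_n = n * Y_n
    print(n,
          (Y > eps).mean(),        # tracks exp(-lam*eps*n): 0.135, ~0, ~0
          (Z > eps).mean(),        # stays near exp(-lam*eps)
          np.exp(-lam * eps))      # = 0.8187...
```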

In conclusion, if the X_i are i.i.d. Exp(λ), then Y_n = min_{1≤i≤n} X_i ~ Exp(nλ) and Z_n = nY_n ~ Exp(λ). Thus

P{Y_n > ε} = e^{−λεn} → 0, so Y_n → 0 in probability,

but

P{Z_n > ε} = e^{−λε} does not go to 0, so Z_n does not converge to 0 in probability.

2. Roundoff errors. The sum of a list of 100 real numbers is to be computed. Suppose that these numbers are rounded off to the nearest integer so that each number has an error that is uniformly distributed in the interval (−0.5, 0.5). Use the central limit theorem to estimate the probability that the total error in the sum of the 100 numbers exceeds 6.

Solution: The errors are independent and uniformly distributed in (−0.5, 0.5), i.e., e_i ~ Unif(−0.5, 0.5). Using the central limit theorem, the total error e = Σ_{i=1}^{100} e_i can be approximated as a Gaussian random variable with mean

E(e) = Σ_{i=1}^{100} E(e_i) = 0

and variance

σ_e² = Σ_{i=1}^{100} Var(e_i) = 100/12 ≈ 8.33.

Therefore the probability of the given event is

P{e > 6} = P{ (e − E(e))/σ_e > 6/√8.33 } = Q(6/√8.33) ≈ Q(2.08) ≈ 0.0188.

If the error is interpreted as an absolute difference, then P{|e| > 6} = 2 P{e > 6} ≈ 0.0376.
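The CLT figure can be cross-checked by Monte Carlo; a minimal sketch (an addition, assuming NumPy and SciPy are available):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
# 100 roundoff errors per trial, summed
total_error = rng.uniform(-0.5, 0.5, size=(100_000, 100)).sum(axis=1)
print("empirical:   ", (total_error > 6).mean())         # ~0.019
print("CLT estimate:", norm.sf(6 / np.sqrt(100 / 12)))   # Q(2.08) ~ 0.0188
```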

3. The signal received over a wireless communication channel can be represented by the two sums

X_n = (1/√n) Σ_{j=1}^n Z_j cos Θ_j  and  X̃_n = (1/√n) Σ_{j=1}^n Z_j sin Θ_j,

where Z_1, Z_2, ... are i.i.d. with mean μ and variance σ², and Θ_1, Θ_2, ... are i.i.d. Unif[0, 2π], independent of Z_1, Z_2, .... Find the distribution of the pair (X_n, X̃_n) as n approaches infinity.

Solution: Since Z_1, Z_2, ... are i.i.d. and Θ_1, Θ_2, ... are also i.i.d. and independent of Z_1, Z_2, ..., the random vectors

Y_j = [Z_j cos Θ_j; Z_j sin Θ_j], j = 1, 2, ...,

are i.i.d. We have

E[Z_j cos Θ_j] = E[Z_j] E[cos Θ_j] = μ · (1/2π) ∫_0^{2π} cos θ dθ = 0.

Similarly, E[Z_j sin Θ_j] = 0. Thus E[Y_j] = 0. Moreover,

E[(Z_j cos Θ_j)²] = E[Z_j²] E[cos² Θ_j] = (μ² + σ²) · (1/2π) ∫_0^{2π} cos² θ dθ = (μ² + σ²)/2,

and similarly E[(Z_j sin Θ_j)²] = (μ² + σ²)/2. Also,

E[(Z_j cos Θ_j)(Z_j sin Θ_j)] = E[Z_j²] E[cos Θ_j sin Θ_j] = (μ² + σ²) · (1/2π) ∫_0^{2π} cos θ sin θ dθ = 0.

Thus

Cov(Y_j) = E[Y_j Y_j^T] − E[Y_j] E[Y_j]^T = E[Y_j Y_j^T] = ((μ² + σ²)/2) I.

Now [X_n; X̃_n] = (1/√n) Σ_{j=1}^n Y_j, and thus, by the central limit theorem, as n → ∞,

[X_n; X̃_n] → N(0, ((μ² + σ²)/2) I)

in distribution.
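A simulation sketch of this vector CLT (an addition; NumPy assumed, and the Gaussian choice for Z_j and the values μ = 1, σ = 0.5 are arbitrary, since only the mean and variance matter):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, trials = 1.0, 0.5, 1000, 5000
Z = rng.normal(mu, sigma, size=(trials, n))          # any dist with mean mu, var sigma^2 works
Theta = rng.uniform(0, 2 * np.pi, size=(trials, n))
X = (Z * np.cos(Theta)).sum(axis=1) / np.sqrt(n)
Xt = (Z * np.sin(Theta)).sum(axis=1) / np.sqrt(n)
print(np.cov(X, Xt))    # ~ ((mu^2 + sigma^2)/2) * I = 0.625 * I
```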

4. Polya urn. An urn initially has one red ball and one white ball. Let X_1 denote the name of the first ball drawn from the urn. Replace that ball and one like it. Let X_2 denote the name of the next ball drawn. Replace it and one like it. Continue, drawing and replacing.

(a) Argue that the probability of drawing k reds followed by n − k whites is

(1/2)(2/3) ⋯ (k/(k+1)) · (1/(k+2))(2/(k+3)) ⋯ ((n−k)/(n+1)) = k!(n−k)!/(n+1)! = 1/((n+1) C(n,k)).

(b) Let P_n be the proportion of red balls in the urn after the nth drawing. Argue that

Pr{P_n = k/(n+2)} = 1/(n+1), for k = 1, 2, ..., n+1.

Thus all proportions are equally probable. This shows that P_n tends to a uniformly distributed random variable in distribution, i.e.,

lim_{n→∞} Pr{P_n ≤ t} = t, 0 ≤ t ≤ 1.

(c) What can you say about the behavior of the proportion P_n if you started initially with one red ball in the urn and two white balls? Specifically, what is the limiting distribution of P_n? Can you show

Pr{P_n = k/(n+3)} = 2(n + 2 − k)/((n+1)(n+2)), for k = 1, 2, ..., n+1?

Solution:

(a) Let r be the number of red balls, w the number of white balls, and t = r + w the total number of balls in the urn. The contents of the urn before each drawing are listed below; the ratio r/t or w/t gives the probability for the next drawing.

1st drawing:     r/t = 1/2
2nd drawing:     r/t = 2/3
3rd drawing:     r/t = 3/4
...
kth drawing:     r/t = k/(k+1)
(k+1)th drawing: w/t = 1/(k+2)
(k+2)th drawing: w/t = 2/(k+3)
(k+3)th drawing: w/t = 3/(k+4)
...
nth drawing:     w/t = (n−k)/(n+1)

The probability of drawing k reds followed by n − k whites is therefore

(1/2)(2/3) ⋯ (k/(k+1)) · (1/(k+2))(2/(k+3)) ⋯ ((n−k)/(n+1)) = k!(n−k)!/(n+1)! = 1/((n+1) C(n,k)).

In fact, regardless of the ordering, the probability of drawing k reds in n draws is given by this expression as well. Note that after the nth drawing there are k + 1 red balls and n − k + 1 white balls, for a total of n + 2 balls in the urn.

(b) After the nth drawing, where k red balls have been drawn, the proportion P_n of red balls is

P_n = (number of red balls in the urn)/(total number of balls in the urn) = (k+1)/(n+2).

There are C(n,k) orderings of outcomes with k red balls and n − k white balls, and each ordering has the same probability: for a different sequence of drawings, the numerators in the product above are merely permuted. Therefore

Pr{P_n = (k+1)/(n+2)} = C(n,k) · 1/((n+1) C(n,k)) = 1/(n+1), for k = 0, 1, ..., n.

All the proportions are equally probable (the probability does not depend on k), and P_n tends to a Unif[0,1] random variable in distribution.

(c) If there are one red ball and two white balls to start with:

1st drawing:     r/t = 1/3
2nd drawing:     r/t = 2/4
3rd drawing:     r/t = 3/5
...
kth drawing:     r/t = k/(k+2)
(k+1)th drawing: w/t = 2/(k+3)
(k+2)th drawing: w/t = 3/(k+4)
(k+3)th drawing: w/t = 4/(k+5)
...
nth drawing:     w/t = (n−k+1)/(n+2)

The probability of drawing k reds followed by n − k whites is

(1/3)(2/4) ⋯ (k/(k+2)) · (2/(k+3))(3/(k+4)) ⋯ ((n−k+1)/(n+2)) = 2 k!(n−k+1)!/(n+2)!.

After the nth drawing,

P_n = (number of red balls in the urn)/(total number of balls in the urn) = (k+1)/(n+3).

Similarly as in part (b),

Pr{P_n = (k+1)/(n+3)} = C(n,k) · 2 k!(n−k+1)!/(n+2)! = 2 n!(n−k+1)!/((n−k)!(n+2)!) = 2(n−k+1)/((n+1)(n+2)).

The distribution is linearly decreasing in p = (k+1)/(n+3):

Pr{P_n = p} = (2(n+3)/((n+1)(n+2))) (−p + (n+2)/(n+3)).

Therefore the limiting distribution of P_n as n goes to infinity is a triangular distribution with density

f_{P_n}(p) = 2(1 − p), 0 ≤ p ≤ 1.

Note: the Polya urn for this problem generates a process that is identical to the following mixture of Bernoullis: let Θ have density f_Θ(θ) = 2(1 − θ), 0 ≤ θ < 1, and let the X_i be i.i.d. Bernoulli(Θ) given Θ; then P_n converges to Θ as n tends to infinity. Considering the general case of λ_1 red balls and λ_2 white balls to start with, the limiting distribution is

f_Θ(θ) = C(λ_1, λ_2) θ^{λ_1−1} (1 − θ)^{λ_2−1},

where C(λ_1, λ_2) is a normalizing constant depending only on λ_1 and λ_2 (the Beta(λ_1, λ_2) density).
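Both limit laws can be seen empirically with a small vectorized urn simulation (a sketch added here, NumPy assumed): the histogram of final proportions should come out roughly flat for the 1-red/1-white urn and roughly linearly decreasing for the 1-red/2-white urn.

```python
import numpy as np

rng = np.random.default_rng(3)

def final_proportions(red, white, n, trials):
    r = np.full(trials, float(red))             # red-ball counts, one urn per trial
    t = float(red + white)                      # total count (same for all trials)
    for _ in range(n):
        drew_red = rng.random(trials) < r / t   # P(red) = current proportion
        r += drew_red                           # replace the ball plus one like it
        t += 1.0
    return r / t

print(np.histogram(final_proportions(1, 1, 500, 20_000), bins=5)[0])  # roughly equal counts
print(np.histogram(final_proportions(1, 2, 500, 20_000), bins=5)[0])  # roughly like 2(1-p)
```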

5. Symmetric random walk. Let X_n be a random walk defined by

X_0 = 0,  X_n = Σ_{i=1}^n Z_i,

where Z_1, Z_2, ... are i.i.d. with P{Z_1 = 1} = P{Z_1 = −1} = 1/2.

(a) Find P{X_10 = 10}.

(b) Approximate P{−10 ≤ X_100 ≤ 10} using the central limit theorem.

(c) Find P{X_n = k}.

Solution:

(a) Since the event {X_10 = 10} is equivalent to {Z_1 = Z_2 = ⋯ = Z_10 = 1}, we have P{X_10 = 10} = 2^{−10}.

(b) Since E(Z_i) = 0 and E(Z_i²) = 1, by the central limit theorem

P{−10 ≤ X_100 ≤ 10} = P{ −1 ≤ (1/10) Σ_{i=1}^{100} Z_i ≤ 1 } ≈ 1 − 2Q(1) = 2Φ(1) − 1 ≈ 0.68.

(c) P{X_n = k} = P{(n+k)/2 heads in n independent fair coin tosses} = C(n, (n+k)/2) 2^{−n}, for −n ≤ k ≤ n with n + k even.
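A simulation sketch for part (b) (an addition, NumPy assumed). Note that the simulated value lands near 0.73 rather than 0.68: the plain CLT estimate ignores the discreteness of X_100, and a continuity correction (P{|X_100| ≤ 11} ≈ 2Φ(1.1) − 1 ≈ 0.73) matches the empirical figure better.

```python
import numpy as np

rng = np.random.default_rng(4)
X100 = rng.choice([-1, 1], size=(100_000, 100)).sum(axis=1)   # 100-step walks
print((np.abs(X100) <= 10).mean())   # ~0.729; plain CLT gives 0.683,
                                     # continuity-corrected CLT gives 2*Phi(1.1) - 1 = 0.729
```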

6. Absolute-value random walk. Consider the symmetric random walk X_n in the previous problem. Define the absolute-value random process Y_n = |X_n|.

(a) Find P{Y_n = k}.

(b) Find P{max_{i<20} Y_i = 10 | Y_20 = 0}.

Solution:

(a) If k > 0, then P{Y_n = k} = P{X_n = +k or X_n = −k} = 2 P{X_n = k}, while P{Y_n = 0} = P{X_n = 0}. Thus

P{Y_n = k} =
  C(n, (n+k)/2) (1/2)^{n−1},  k > 0, n − k even, n ≥ k,
  C(n, n/2) (1/2)^n,          k = 0, n even,
  0,                          otherwise.

(b) If Y_20 = |X_20| = 0, then there are only two sample paths with max_{i<20} |X_i| = 10, namely

Z_1 = ⋯ = Z_10 = +1, Z_11 = ⋯ = Z_20 = −1,  or  Z_1 = ⋯ = Z_10 = −1, Z_11 = ⋯ = Z_20 = +1.

Since the total number of sample paths with X_20 = 0 is C(20, 10) and all paths are equally likely,

P{ max_{i<20} Y_i = 10 | Y_20 = 0 } = 2 / C(20, 10) = 2/184756 = 1/92378.
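Part (b) can also be verified exactly by enumerating all 2^20 equally likely step sequences (a brute-force sketch, NumPy assumed; it allocates a few hundred MB of intermediates):

```python
import numpy as np

n = 20
codes = np.arange(2 ** n, dtype=np.uint32)
# bit j of each code -> step +1 or -1 at time j+1
steps = np.where((codes[:, None] >> np.arange(n)) & 1, 1, -1).astype(np.int8)
walks = steps.cumsum(axis=1, dtype=np.int8)        # columns hold X_1, ..., X_20
at_zero = walks[:, -1] == 0                        # condition on X_20 = 0
hit_ten = np.abs(walks[:, :-1]).max(axis=1) == 10  # max_{i<20} |X_i| = 10
print(hit_ten[at_zero].mean(), 2 / 184756)         # both 1.0824...e-05
```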

7. Discrete-time Wiener process. Let Z_n, n ≥ 1, be a discrete-time white Gaussian noise (WGN) process, i.e., Z_1, Z_2, ... are i.i.d. N(0, 1). Define the process X_n, n ≥ 0, by X_0 = 0 and X_n = X_{n−1} + Z_n for n ≥ 1.

(a) Is X_n an independent increment process? Justify your answer.

(b) Is X_n a Gaussian process? Justify your answer.

(c) Find the mean and autocorrelation functions of X_n.

(d) Specify the first-order pdf of X_n.

(e) Specify the joint pdf of X_3, X_5, and X_8.

(f) Find E(X_20 | X_1, X_2, ..., X_10).

(g) Given X_1 = 4, X_2 = 2, and 0 ≤ X_3 ≤ 4, find the minimum MSE estimate of X_4.

Solution:

(a) Yes. For n_1 < n_2 < ⋯ < n_k, the increments X_{n_1}, X_{n_2} − X_{n_1}, ..., X_{n_k} − X_{n_{k−1}} are sums over disjoint sets of the Z_i random variables, and the Z_i are i.i.d.

(b) Yes. Any set of samples (X_{n_1}, ..., X_{n_k}) is obtained by a linear transformation of i.i.d. N(0, 1) random variables, and therefore all kth-order distributions of X_n are jointly Gaussian. (It is not sufficient to show that the random variable X_n is Gaussian for each n.)

(c) We have

E[X_n] = E[Σ_{i=1}^n Z_i] = Σ_{i=1}^n E[Z_i] = 0

and

R_X(n_1, n_2) = E[X_{n_1} X_{n_2}] = E[(Σ_{i=1}^{n_1} Z_i)(Σ_{j=1}^{n_2} Z_j)] = Σ_{i=1}^{n_1} Σ_{j=1}^{n_2} E[Z_i Z_j] = Σ_{i=1}^{min(n_1,n_2)} E[Z_i²] = min(n_1, n_2),

since E[Z_i Z_j] = 0 for i ≠ j (independent, zero mean) and E[Z_i²] = 1.

(d) As shown above, X_n is Gaussian with mean zero and variance Var(X_n) = E[X_n²] − (E[X_n])² = R_X(n, n) − 0 = n. Thus X_n ~ N(0, n). Moreover, Cov(X_{n_1}, X_{n_2}) = E(X_{n_1} X_{n_2}) − E(X_{n_1}) E(X_{n_2}) = min(n_1, n_2), so X_{n_1} and X_{n_2} are jointly Gaussian with mean μ = [0 0]^T and covariance matrix

Σ = [ n_1, min(n_1, n_2); min(n_1, n_2), n_2 ].

(e) X_n, n ≥ 1, is a zero-mean Gaussian random process. Thus (X_3, X_5, X_8) is jointly Gaussian with mean (E[X_3], E[X_5], E[X_8]) = (0, 0, 0) and covariance matrix

[ R_X(3,3) R_X(3,5) R_X(3,8); R_X(5,3) R_X(5,5) R_X(5,8); R_X(8,3) R_X(8,5) R_X(8,8) ] = [ 3 3 3; 3 5 5; 3 5 8 ].

(f) Since X_n is an independent increment process,

E(X_20 | X_1, X_2, ..., X_10) = E(X_20 − X_10 | X_1, ..., X_10) + E(X_10 | X_1, ..., X_10)
                              = E(X_20 − X_10) + X_10
                              = 0 + X_10 = X_10.

(g) The MMSE estimate of X_4 given X_1 = x_1, X_2 = x_2, and a ≤ X_3 ≤ b is

E[X_4 | X_1 = x_1, X_2 = x_2, a ≤ X_3 ≤ b]
  = E[X_2 + Z_3 + Z_4 | X_1 = x_1, X_2 = x_2, a ≤ X_2 + Z_3 ≤ b]
  = x_2 + E[Z_3 | a − x_2 ≤ Z_3 ≤ b − x_2] + E[Z_4]
  = x_2 + E[Z_3 | a − x_2 ≤ Z_3 ≤ b − x_2],

since Z_3 is independent of (X_1, X_2) and Z_4 is independent of (X_1, X_2, Z_3).
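A quick empirical check of parts (c) and (e) (a sketch added here, NumPy assumed): the sample autocorrelation matrix of (X_3, X_5, X_8) should approach the covariance matrix above.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(100_000, 8)).cumsum(axis=1)   # X_1, ..., X_8 with X_n = X_{n-1} + Z_n
emp = X.T @ X / X.shape[0]                         # emp[i, j] ~ E[X_{i+1} X_{j+1}]
idx = np.array([2, 4, 7])                          # pick out X_3, X_5, X_8
print(np.round(emp[np.ix_(idx, idx)], 2))          # ~ [[3 3 3], [3 5 5], [3 5 8]]
```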

Plugging in the values, the required MMSE estimate is

X̂_4 = 2 + E[Z_3 | Z_3 ∈ [−2, 2]].

The conditional density

f_{Z_3 | Z_3 ∈ [−2,2]}(z_3) = f_{Z_3}(z_3) / P{Z_3 ∈ [−2, 2]} for z_3 ∈ [−2, 2], and 0 otherwise,

is symmetric about z_3 = 0. Thus E[Z_3 | Z_3 ∈ [−2, 2]] = 0, and we have X̂_4 = 2.

8. A random process. Let X_n = Z_n + Z_{n−1} for n ≥ 1, where Z_0, Z_1, Z_2, ... are i.i.d. N(0, 1).

(a) Find the mean and autocorrelation function of {X_n}.

(b) Is {X_n} Gaussian? Justify your answer.

(c) Find E(X_3 | X_1, X_2).

(d) Find E(X_3 | X_2).

(e) Is {X_n} Markov? Justify your answer.

(f) Is {X_n} an independent increment process? Justify your answer.

Solution:

(a) E(X_n) = E(Z_n) + E(Z_{n−1}) = 0, and

R_X(m, n) = E(X_m X_n) = E[(Z_m + Z_{m−1})(Z_n + Z_{n−1})] =
  E[Z_n²] + E[Z_{n−1}²] = 2,  n = m,
  E[Z_{min(m,n)}²] = 1,       |n − m| = 1,
  0,                          otherwise.

(b) Since (X_1, ..., X_n) is a linear transformation of the Gaussian random vector (Z_0, Z_1, ..., Z_n), the process is Gaussian.

(c) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence

E(X_3 | X_1, X_2) = [0 1] [2 1; 1 2]^{−1} [X_1; X_2] = (1/3)(2X_2 − X_1).

(d) Similarly, E(X_3 | X_2) = (1/2) X_2.

(e) Since E(X_3 | X_1, X_2) ≠ E(X_3 | X_2), the process is not Markov.

(f) Since the process is not Markov, it does not have independent increments (an independent increment process is always Markov).
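Part (c) can be checked by regressing X_3 on (X_1, X_2) over simulated data (a sketch, NumPy assumed); the least-squares coefficients should approach (−1/3, 2/3):

```python
import numpy as np

rng = np.random.default_rng(6)
Z = rng.normal(size=(200_000, 4))     # Z_0, Z_1, Z_2, Z_3
X = Z[:, 1:] + Z[:, :-1]              # columns are X_1, X_2, X_3
coef, *_ = np.linalg.lstsq(X[:, :2], X[:, 2], rcond=None)
print(coef)                           # ~ [-1/3, 2/3], i.e. (2*X_2 - X_1)/3
```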

9. Moving average process. Let X_n = Z_n + (1/2) Z_{n−1} for n ≥ 1, where Z_0, Z_1, Z_2, ... are i.i.d. N(0, 1). Find the mean and autocorrelation function of X_n.

Solution: E(X_n) = E(Z_n) + (1/2) E(Z_{n−1}) = 0, and

R_X(m, n) = E(X_m X_n) = E[(Z_m + (1/2) Z_{m−1})(Z_n + (1/2) Z_{n−1})] =
  E[Z_n²] + (1/4) E[Z_{n−1}²] = 5/4,  n = m,
  (1/2) E[Z_{min(m,n)}²] = 1/2,       |n − m| = 1,
  0,                                  otherwise.

10. Autoregressive process. Let X_0 = 0 and X_n = (1/2) X_{n−1} + Z_n for n ≥ 1, where Z_1, Z_2, ... are i.i.d. N(0, 1). Find the mean and autocorrelation function of X_n.

Solution: Iterating the recursion,

E[X_n] = (1/2) E[X_{n−1}] + E[Z_n] = (1/4) E[X_{n−2}] + (1/2) E[Z_{n−1}] + E[Z_n] = ⋯ = (1/2)^n E[X_0] = 0.

For n > m we can write

X_n = (1/2)^{n−m} X_m + (1/2)^{n−m−1} Z_{m+1} + ⋯ + (1/2) Z_{n−1} + Z_n = (1/2)^{n−m} X_m + Σ_{i=0}^{n−m−1} (1/2)^i Z_{n−i}.

Therefore

R_X(n, m) = E(X_n X_m) = (1/2)^{n−m} E[X_m²],

since X_m and Z_{n−i}, i = 0, ..., n−m−1, are independent and E[X_m] = E[Z_{n−i}] = 0. To find E[X_m²], consider

E[X_1²] = 1,  E[X_2²] = (1/4) E[X_1²] + E[Z_2²] = 1/4 + 1, ...,

so in general

E[X_n²] = (1/4)^{n−1} + ⋯ + (1/4) + 1 = (4/3)(1 − (1/4)^n).

Thus

R_X(n, m) = (1/2)^{|n−m|} (4/3)(1 − (1/4)^{min(n,m)}).
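A simulation sketch checking the closed forms for both processes (an addition, NumPy assumed; the particular indices below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
trials, n = 200_000, 6
Z = rng.normal(size=(trials, n + 1))              # Z_0, ..., Z_6

X_ma = Z[:, 1:] + 0.5 * Z[:, :-1]                 # Problem 9: X_n = Z_n + Z_{n-1}/2
print((X_ma[:, 2] ** 2).mean(),                   # R_X(3,3) ~ 5/4
      (X_ma[:, 2] * X_ma[:, 3]).mean())           # R_X(3,4) ~ 1/2

X_ar = np.zeros((trials, n + 1))                  # Problem 10: X_0 = 0
for k in range(1, n + 1):
    X_ar[:, k] = 0.5 * X_ar[:, k - 1] + Z[:, k]   # X_n = X_{n-1}/2 + Z_n
m, nn = 2, 5
print((X_ar[:, m] * X_ar[:, nn]).mean(),                  # empirical R_X(5, 2)
      0.5 ** (nn - m) * (4 / 3) * (1 - 0.25 ** m))        # closed form = 0.15625
```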

11. Random binary waveform. In a digital communication channel the symbol 1 is represented by the fixed-duration rectangular pulse

g(t) = 1 for 0 ≤ t < 1, and 0 otherwise,

and the symbol 0 is represented by −g(t). The data transmitted over the channel are represented by the random process

X(t) = Σ_{k=0}^∞ A_k g(t − k), for t ≥ 0,

where A_0, A_1, ... are i.i.d. random variables with

A_i = +1 w.p. 1/2 and −1 w.p. 1/2.

(a) Find its first- and second-order pmfs.

(b) Find the mean and the autocorrelation function of the process X(t).

Solution:

(a) Since X(t) = A_{⌊t⌋}, the first-order pmf is

p_{X(t)}(x) = P{X(t) = x} = P{A_{⌊t⌋} = x} = P{A_0 = x} (i.i.d.) = 1/2 for x = ±1, and 0 otherwise.

Now note that X(t_1) and X(t_2) are dependent only if t_1 and t_2 fall within the same unit time interval, i.e., ⌊t_1⌋ = ⌊t_2⌋; otherwise they are independent. Thus the second-order pmf is

p_{X(t_1), X(t_2)}(x, y) = P{X(t_1) = x, X(t_2) = y} = P{A_{⌊t_1⌋} = x, A_{⌊t_2⌋} = y} =
  1/2,  ⌊t_1⌋ = ⌊t_2⌋ and (x, y) = (1, 1) or (−1, −1),
  1/4,  ⌊t_1⌋ ≠ ⌊t_2⌋ and (x, y) ∈ {(1, 1), (1, −1), (−1, 1), (−1, −1)},
  0,    otherwise.

(b) For t ≥ 0,

E[X(t)] = E[Σ_{k=0}^∞ A_k g(t − k)] = Σ_{k=0}^∞ g(t − k) E[A_k] = 0.

For the autocorrelation R_X(t_1, t_2), note once again that X(t_1) and X(t_2) are dependent only if t_1 and t_2 fall within the same unit interval; otherwise they are independent and each has zero mean. Since E[A_k A_l] = 0 for k ≠ l and E[A_k²] = 1,

R_X(t_1, t_2) = E[X(t_1) X(t_2)] = Σ_{k=0}^∞ g(t_1 − k) g(t_2 − k) E[A_k²] =
  1,  ⌊t_1⌋ = ⌊t_2⌋,
  0,  otherwise.
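A final sketch (an addition, NumPy assumed) checking that R_X(t_1, t_2) is 1 when t_1 and t_2 share a symbol interval and 0 otherwise:

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.choice([-1, 1], size=(100_000, 10))   # one waveform realization per row, 10 symbols

def X(t):                                     # X(t) = A_{floor(t)} for 0 <= t < 10
    return A[:, int(t)]

for t1, t2 in [(0.2, 0.9), (0.2, 1.1), (3.5, 3.7), (3.5, 7.2)]:
    print(t1, t2, (X(t1) * X(t2)).mean())     # ~1 when floor(t1) == floor(t2), else ~0
```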