UCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, 2018 Practice Final Examination (Winter 2017)


Practice Final Examination (Winter 2017)

There are 6 problems, each problem with multiple parts. Your answer should be as clear and readable as possible. Please justify any claim that you make.

1. Conditional expectations (40 pts). Let $X$ and $Y$ be i.i.d. $\mathrm{Exp}(1)$ random variables. Let $Z = X + Y$ and $W = X - Y$.

(a) Find the joint pdf $f_{X,Z}(x,z)$ of $X$ and $Z$.
(b) Find the joint pdf $f_{Z,W}(z,w)$ of $Z$ and $W$.
(c) Find $E[Z \mid X]$.
(d) Find $E[X \mid Z]$.

2. MMSE estimation (30 pts). Let $X \sim \mathrm{Exp}(1)$ and $Y = \min\{X, 1\}$.

(a) Find $E[Y]$.
(b) Find the estimate $\hat{X} = g(Y)$ of $X$ given $Y$ that minimizes the mean square error $E[(X - \hat{X})^2] = E[(X - g(Y))^2]$, and plot $g(y)$ as a function of $y$.
(c) Find the mean square error of the estimate found in part (b).

3. Is the grass always greener on the other side? (30 pts). Let $X$ and $Y$ be two i.i.d. continuous nonnegative random variables with invertible common cdf $F$, i.e., $P\{X \le x\} = P\{Y \le x\} = F(x)$.

(a) Find $P\{X > Y\}$ and $P\{X < Y\}$.

Suppose now that we observe the value of $X$ and make a decision on whether $X$ is larger or smaller than $Y$.

(b) Find the optimal decision rule $d(x)$ that minimizes the error probability. Your answer should be in terms of the common cdf $F$.
(c) Find the probability of error for the decision rule found in part (b).
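[Editor's note, not part of the original handout: a minimal Monte Carlo sketch for sanity-checking answers to Problems 2 and 3. The sample size and seed are arbitrary choices; the expected values quoted in the comments follow from $E[Y] = 1 - e^{-1}$, memorylessness of the exponential, and symmetry.]

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed
n = 10**6

# Problem 2: X ~ Exp(1), Y = min{X, 1}.
x = rng.exponential(1.0, n)
y = np.minimum(x, 1.0)
print("E[Y] =", y.mean())                  # analytically 1 - e^{-1} ~ 0.632
print("E[X | Y = 1] =", x[x >= 1].mean())  # memorylessness suggests 1 + 1 = 2

# Problem 3(a): for i.i.d. continuous X, Y, symmetry gives P{X > Y} = 1/2.
x2 = rng.exponential(1.0, n)
print("P{X > Y} =", np.mean(x > x2))
```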

4. Sampled Wiener process (60 pts). Let $\{W(t),\ t \ge 0\}$ be the standard Brownian motion. For $n = 1, 2, \ldots$, let
$$X_n = \sqrt{n}\, W\!\left(\frac{1}{n}\right).$$

(a) Find the mean and autocorrelation functions of $\{X_n\}$.
(b) Is $\{X_n\}$ WSS? Justify your answer.
(c) Is $\{X_n\}$ Markov? Justify your answer.
(d) Is $\{X_n\}$ independent increment? Justify your answer.
(e) Is $\{X_n\}$ Gaussian? Justify your answer.
(f) For $n = 1, 2, \ldots$, let $S_n = X_n/n$. Find the limit in probability
$$\lim_{n \to \infty} S_n.$$

5. Poisson process (40 pts). Let $\{N(t),\ t \ge 0\}$ be a Poisson process with arrival rate $\lambda > 0$. Let $s \le t$.

(a) Find the conditional pmf of $N(t)$ given $N(s)$.
(b) Find $E[N(t) \mid N(s)]$ and its pmf.
(c) Find the conditional pmf of $N(s)$ given $N(t)$.
(d) Find $E[N(s) \mid N(t)]$ and its pmf.
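[Editor's note, not part of the original handout: the radical in Problem 4 is garbled in the scan, so the sketch below assumes the reconstruction $X_n = \sqrt{n}\,W(1/n)$. It estimates $E[X_n X_m]$ for one arbitrary pair $(n, m)$ and compares it with $\sqrt{nm}\,\min(1/n, 1/m)$, which depends on more than $n - m$, a hint toward part (b).]

```python
import numpy as np

rng = np.random.default_rng(1)  # arbitrary seed
trials = 200_000
n, m = 2, 8                     # arbitrary pair with 1/m < 1/n

# Sample (W(1/m), W(1/n)) jointly via independent Gaussian increments.
w_small = rng.normal(0.0, np.sqrt(1/m), trials)                # W(1/m)
w_big = w_small + rng.normal(0.0, np.sqrt(1/n - 1/m), trials)  # W(1/n)

x_n = np.sqrt(n) * w_big
x_m = np.sqrt(m) * w_small
print("empirical E[X_n X_m]   =", np.mean(x_n * x_m))
print("sqrt(nm)*min(1/n, 1/m) =", np.sqrt(n * m) * min(1/n, 1/m))
```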

6. Hidden Markov process (60 pts). Let $X_0 \sim \mathrm{N}(0, \sigma^2)$ and
$$X_n = \tfrac{1}{2} X_{n-1} + Z_n, \quad n \ge 1,$$
where $Z_1, Z_2, \ldots$ are i.i.d. $\mathrm{N}(0,1)$, independent of $X_0$. Let $Y_n = X_n + V_n$, where the $V_n$ are i.i.d. $\mathrm{N}(0,1)$, independent of $\{X_n\}$.

(a) Find the variance $\sigma^2$ such that $\{X_n\}$ and $\{Y_n\}$ are jointly WSS.

Under the value of $\sigma^2$ found in part (a), answer the following.

(b) Find $R_Y(n)$.
(c) Find $R_{XY}(n)$.
(d) Find the MMSE estimate of $X_n$ given $Y_n$.
(e) Find the MMSE estimate of $X_n$ given $(Y_n, Y_{n-1})$.
(f) Find the MMSE estimate of $X_n$ given $(Y_n, Y_{n+1})$.
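[Editor's note, not part of the original handout: for the recursion in Problem 6, $\mathrm{Var}(X_n) = \tfrac{1}{4}\mathrm{Var}(X_{n-1}) + 1$, so a stationary variance must satisfy $\sigma^2 = \sigma^2/4 + 1$, i.e., $\sigma^2 = 4/3$. The sketch below, with arbitrary simulation parameters, checks that this choice keeps the variance flat.]

```python
import numpy as np

rng = np.random.default_rng(2)  # arbitrary seed
trials, steps = 100_000, 20
sigma2 = 4/3  # candidate from solving sigma^2 = sigma^2/4 + 1

x = rng.normal(0.0, np.sqrt(sigma2), trials)    # X_0 ~ N(0, sigma^2)
for _ in range(steps):
    x = 0.5 * x + rng.normal(0.0, 1.0, trials)  # X_n = X_{n-1}/2 + Z_n
print("Var(X_n) after", steps, "steps:", x.var())  # should stay near 4/3
```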

Practice Final Examination (Winter 2016)

There are 4 problems, each problem with multiple parts, each part worth 10 points. Your answer should be as clear and readable as possible. Justify any claim that you make.

1. Additive exponential noise channel (60 pts). A device has two equally likely states $S = 0$ and $S = 1$. When it is inactive ($S = 0$), it transmits $X = 0$. When it is active ($S = 1$), it transmits $X \sim \mathrm{Exp}(1)$. Now suppose the signal is observed through the additive exponential noise channel with output $Y = X + Z$, where $Z \sim \mathrm{Exp}(2)$ is independent of $(X, S)$. One wishes to decide whether the device is active or not.

(a) Find $f_{Y|S}(y \mid 0)$.
(b) Find $f_{Y|S}(y \mid 1)$.
(c) Find $f_Y(y)$.
(d) Find $p_{S|Y}(0 \mid y)$ and $p_{S|Y}(1 \mid y)$.
(e) Find the decision rule $d(Y)$ that minimizes the probability of error $P(S \ne d(Y))$.
(f) Find the corresponding probability of error.

(Hint: Recall that $Z \sim \mathrm{Exp}(\lambda)$ means that its pdf is $f_Z(z) = \lambda e^{-\lambda z}$, $z \ge 0$.)

2. Brownian bridge (40 pts). Let $\{W(t)\}_{t \ge 0}$ be the standard Brownian motion (Wiener process). Recall that the process is independent-increment with $W(0) = 0$ and $W(t) - W(s) \sim \mathrm{N}(0, t - s)$, $0 \le s < t$. In the following, we investigate several properties of the process conditioned on $\{W(1) = 0\}$.

(a) Find the conditional distribution of $W(1/2)$ given $W(1) = 0$.
(b) Find $E[W(t) \mid W(1) = 0]$ for $t \in [0, 1]$.
(c) Find $E[(W(t))^2 \mid W(1) = 0]$ for $t \in [0, 1]$.
(d) Find $E[W(t_1) W(t_2) \mid W(1) = 0]$ for $t_1, t_2 \in [0, 1]$.
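[Editor's note, not part of the original handout: an empirical cross-check for Problem 2 using the standard bridge representation $B(t) = W(t) - t\,W(1)$, which has the law of $W$ conditioned on $W(1) = 0$. The covariance $\min(t_1, t_2) - t_1 t_2$ quoted in the comments is the standard Brownian-bridge covariance, offered as a cross-check rather than as the official solution; the times and seed are arbitrary.]

```python
import numpy as np

rng = np.random.default_rng(3)  # arbitrary seed
trials = 200_000
t1, t2 = 0.3, 0.7               # arbitrary evaluation times in [0, 1]

# Sample (W(t1), W(t2), W(1)) via independent increments, then form the bridge.
w_t1 = rng.normal(0, np.sqrt(t1), trials)
w_t2 = w_t1 + rng.normal(0, np.sqrt(t2 - t1), trials)
w_1 = w_t2 + rng.normal(0, np.sqrt(1 - t2), trials)
b_t1, b_t2 = w_t1 - t1 * w_1, w_t2 - t2 * w_1  # bridge values B(t1), B(t2)

print("E[B(t1)]     =", b_t1.mean())  # cf. part (b): should be near 0
print("E[B(t1)^2]   =", (b_t1**2).mean(), "vs t1*(1-t1) =", t1 * (1 - t1))
print("E[B(t1)B(t2)]=", (b_t1 * b_t2).mean(),
      "vs min(t1,t2)-t1*t2 =", min(t1, t2) - t1 * t2)
```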

3. Convergence of random processes (30 pts). Let $\{N(t)\}_{t \ge 0}$ be a Poisson process with rate $\lambda$. Recall that the process is independent increment and $N(t) - N(s)$, $0 \le s < t$, has the pmf
$$p_{N(t)-N(s)}(n) = \frac{e^{-\lambda(t-s)} \big(\lambda(t-s)\big)^n}{n!}, \quad n = 0, 1, \ldots.$$
Define
$$M(t) = \frac{N(t)}{t}, \quad t > 0.$$

(a) Find the mean and autocorrelation function of $\{M(t)\}_{t > 0}$.
(b) Does $\{M(t)\}_{t > 0}$ converge in mean square as $t \to \infty$, that is,
$$\lim_{t \to \infty} E\big[(M(t) - M)^2\big] = 0$$
for some random variable (or constant) $M$? If so, what is the limit?

Now consider
$$L(t) = \frac{1}{t} \int_0^t \frac{N(s)}{s}\, ds, \quad t > 0.$$

(c) Does $\{L(t)\}_{t > 0}$ converge in mean square as $t \to \infty$? If so, what is the limit?

(Hint: $\int \frac{1}{x}\, dx = \ln x + C$, $\int \ln x\, dx = x \ln x - x + C$, and $\lim_{x \to 0} x \ln x = 0$.)
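[Editor's note, not part of the original handout: a quick numerical look at Problem 3(b). Since $N(t) \sim \mathrm{Poisson}(\lambda t)$, we have $E[M(t)] = \lambda$ and $E[(M(t) - \lambda)^2] = \lambda/t \to 0$, suggesting mean-square convergence to $\lambda$; the rate, horizons, and seed below are arbitrary.]

```python
import numpy as np

rng = np.random.default_rng(4)  # arbitrary seed
lam, trials = 2.0, 100_000

for t in (10.0, 100.0, 1000.0):
    m_t = rng.poisson(lam * t, trials) / t  # M(t) = N(t)/t, N(t) ~ Poisson(lam*t)
    # analytically E[(M(t) - lam)^2] = lam/t, which shrinks as t grows
    print(f"t = {t}: E[(M(t) - lam)^2] =", np.mean((m_t - lam) ** 2))
```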

4. Random binary waveform (40 pts). Let $\{N(t)\}_{t \ge 0}$ be a Poisson process with rate $\lambda$, and let $Z$ be independent of $\{N(t)\}$ with $P(Z = 1) = P(Z = -1) = 1/2$. Define
$$X(t) = Z \cdot (-1)^{N(t)}, \quad t \ge 0.$$

(a) Find the mean and autocorrelation function of $\{X(t)\}_{t \ge 0}$.
(b) Is $\{X(t)\}_{t \ge 0}$ wide-sense stationary?
(c) Find the first-order pmf $p_{X(t)}(x) = P(X(t) = x)$.
(d) Find the second-order pmf $p_{X(t_1),X(t_2)}(x_1, x_2) = P(X(t_1) = x_1,\ X(t_2) = x_2)$.

(Hint: $\sum_{k\ \mathrm{even}} x^k/k! = (e^x + e^{-x})/2$ and $\sum_{k\ \mathrm{odd}} x^k/k! = (e^x - e^{-x})/2$.)
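[Editor's note, not part of the original handout: a simulation sketch for Problem 4(a). The comparison value $e^{-2\lambda\tau}$ is the standard random-telegraph autocorrelation, which follows from $E[(-1)^K] = e^{-2\mu}$ for $K \sim \mathrm{Poisson}(\mu)$; it is offered as a cross-check, with arbitrary parameters.]

```python
import numpy as np

rng = np.random.default_rng(5)  # arbitrary seed
lam, trials = 1.0, 200_000
t, tau = 2.0, 0.5               # arbitrary time and lag

z = rng.choice([-1, 1], trials)          # random initial sign Z
n_t = rng.poisson(lam * t, trials)       # N(t)
n_incr = rng.poisson(lam * tau, trials)  # N(t+tau) - N(t), independent increment
x_t = z * (-1.0) ** n_t
x_t_tau = z * (-1.0) ** (n_t + n_incr)
print("empirical R_X(tau) =", np.mean(x_t * x_t_tau))
print("e^{-2*lam*tau}     =", np.exp(-2 * lam * tau))  # standard telegraph result
```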

Practice Final Examination (Winter 2018)

There are 6 problems, each problem with multiple parts. Your answer should be as clear and readable as possible. Please justify any claim that you make.

1. Nonlinear and linear MMSE estimation (30 pts). Let $X$ and $Y$ be two random variables with joint pdf
$$f_{X,Y}(x,y) = \begin{cases} x + y, & 0 \le x \le 1,\ 0 \le y \le 1,\\ 0, & \text{otherwise.} \end{cases}$$

(a) (10 points) Find the linear MMSE estimator of $X$ given $Y$.
(b) (10 points) Find the corresponding MSE.
(c) (10 points) Find the MMSE estimator of $X$ given $Y$. Is it the same as the linear MMSE estimator?

2. Convergence (30 pts). Consider the sequence of i.i.d. random variables $X_1, X_2, \ldots$ with
$$X_i = \begin{cases} 0 & \text{w.p. } 1/2,\\ 2 & \text{w.p. } 1/2, \end{cases} \quad \text{for all } i.$$
Define the sequence
$$Y_n = \begin{cases} 2 X_n & \text{for all } n, \text{ w.p. } 1/3,\\ X_n^2 & \text{for all } n, \text{ w.p. } 1/3,\\ 0 & \text{for all } n, \text{ w.p. } 1/3. \end{cases}$$
Let
$$M_n = \frac{1}{n} \sum_{i=1}^n Y_i.$$

(a) (10 points) Determine the probability mass function (pmf) of $Y_n$.
(b) (10 points) Determine the random variable (or constant) that $M_n$ converges to (in probability) as $n$ approaches infinity. Justify your answer.
(c) (10 points) Use the central limit theorem to estimate the probability that the random variable $M_{84}$ exceeds $2/3$.
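[Editor's note, not part of the original handout: a rejection-sampling cross-check for Problem 1(a). The analytic values in the final comment come from direct integration ($E[X] = E[Y] = 7/12$, $\mathrm{Cov}(X,Y) = -1/144$, $\mathrm{Var}(Y) = 11/144$, hence slope $-1/11$ and intercept $7/11$) and are given only for comparison; sample size and seed are arbitrary.]

```python
import numpy as np

rng = np.random.default_rng(6)  # arbitrary seed
n = 2_000_000

# Rejection sampling from f(x,y) = x + y on the unit square (max f = 2):
# propose uniformly, accept with probability (x + y)/2.
x, y, u = rng.random(n), rng.random(n), rng.random(n)
keep = 2.0 * u <= x + y
x, y = x[keep], y[keep]

cov = np.cov(x, y)
a = cov[0, 1] / cov[1, 1]    # slope of the linear MMSE estimator
b = x.mean() - a * y.mean()  # intercept
print("linear MMSE: Xhat =", a, "* Y +", b)  # integration gives -1/11 and 7/11
```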

3. Poisson process (40 pts). Let $\{N(t),\ t \ge 0\}$ be a Poisson process with arrival rate $\lambda > 0$.

(a) (10 points) Let $T_M$ be the time of the $M$-th arrival. Find $E[T_M]$ and $\mathrm{Var}(T_M)$.
(b) (10 points) Let $s \le t$. Assume $k$ arrivals occur in $t$ seconds, that is, $N(t) = k$. Show that the conditional distribution of $N(s)$ given $N(t) = k$ satisfies
$$N(s) \mid \{N(t) = k\} \sim \mathrm{Binom}\left(k, \frac{s}{t}\right).$$
(c) (10 points) Let $s \le t$. Determine the conditional expectation $E[N(s) \mid N(t)]$ and give its probability mass function (pmf).
(d) (10 points) Assume $N(t) = k$. Determine the probability that all $k$ arrivals occur in the first $t/2$ seconds.

4. Moving average process (40 pts). Let $Z_0, Z_1, Z_2, \ldots$ be i.i.d. $\mathrm{N}(0,1)$. Let $Y_n = Z_n + Z_{n-1}$ for $n \ge 1$.

(a) (10 points) Find the mean function and autocorrelation function of $\{Y_n\}$.
(b) (5 points) Is $\{Y_n\}$ wide-sense stationary? Justify your answer.
(c) (10 points) Is $\{Y_n\}$ Gaussian? Justify your answer.
(d) (5 points) Is $\{Y_n\}$ strict-sense stationary? Justify your answer.
(e) (10 points) Is $\{Y_n\}$ Markov? Justify your answer. [Hint: Compare $E(Y_3 \mid Y_1, Y_2)$ to $E(Y_3 \mid Y_2)$.]
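[Editor's note, not part of the original handout: an empirical take on the hint in Problem 4(e). Since the $Y_n$ are jointly Gaussian, both conditional expectations are linear, so they can be estimated by least squares; differing coefficients on $Y_2$ would suggest the process is not Markov. Sample size and seed are arbitrary.]

```python
import numpy as np

rng = np.random.default_rng(7)  # arbitrary seed
n = 1_000_000
z = rng.normal(size=(n, 4))     # Z_0, Z_1, Z_2, Z_3
y1, y2, y3 = z[:, 1] + z[:, 0], z[:, 2] + z[:, 1], z[:, 3] + z[:, 2]

# Regress Y3 on (Y1, Y2), then on Y2 alone; zero-mean, so no intercepts needed.
A = np.column_stack([y1, y2])
coef2, *_ = np.linalg.lstsq(A, y3, rcond=None)
coef1 = (y2 @ y3) / (y2 @ y2)
print("E[Y3 | Y1, Y2] ~", coef2, "· (Y1, Y2)")  # both coordinates nonzero
print("E[Y3 | Y2]     ~", coef1, "· Y2")
```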

5. WSS process through linear filter (40 pts). Let $Y(t)$ be a short-term integration of a WSS process $X(t)$:
$$Y(t) = \frac{1}{T} \int_{t-T}^{t} X(u)\, du.$$
The frequency response $H(f)$ of this linear integration system is
$$H(f) = e^{-j \pi f T}\, \frac{\sin(\pi f T)}{\pi f T}.$$
Suppose the input $X(t)$ has mean $E[X(t)]$ and autocorrelation function
$$R_X(\tau) = \begin{cases} 1 - \dfrac{|\tau|}{T}, & |\tau| \le T,\\ 0, & \text{otherwise.} \end{cases}$$

(a) (10 points) Determine the constant $a$ such that $E[Y(t)] = a E[X(t)]$.
(b) (10 points) Find $S_Y(f)$.
(c) (10 points) Find $R_Y(\tau)$. (You can leave your answer in the form of a convolution.)
(d) (10 points) Determine explicitly the average power of the output, $E[Y^2(t)]$.

(Hint: You may use the transform pair $R_X(\tau) \leftrightarrow T \left(\dfrac{\sin \pi f T}{\pi f T}\right)^2$ and Fourier transform relationships from the tables provided.)

6. Optimal linear estimation (20 pts). Let $X(t)$ be a zero-mean WSS process with autocorrelation function $R_X(\tau) = e^{-|\tau|}$.

(a) (10 points) Find the MMSE estimator for $X(t)$ of the form
$$\hat{X}(t) = a X(t - t_1) + b X(t - t_2),$$
where $t_1 = t_0$ and $t_2 = 2 t_0$, with $t_0 > 0$.
(b) (10 points) Find the MSE of this estimator.
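[Editor's note, not part of the original handout: a numerical sketch for Problem 6 under the orthogonality principle. It solves the normal equations $R\,c = r$ with $R_{ij} = R_X(t_i - t_j)$ and $r_i = R_X(t_i)$ for the weights on $X(t - t_0)$ and $X(t - 2t_0)$, and evaluates the MSE as $R_X(0) - c^\top r$; the lag $t_0 = 0.8$ is an arbitrary choice, and this is a cross-check rather than the official solution.]

```python
import numpy as np

r_x = lambda tau: np.exp(-abs(tau))  # R_X(tau) = e^{-|tau|}
t0 = 0.8                             # arbitrary positive lag

# Normal equations for the weights (a, b) on X(t - t0) and X(t - 2*t0).
R = np.array([[r_x(0.0), r_x(t0)],
              [r_x(t0),  r_x(0.0)]])
r = np.array([r_x(t0), r_x(2 * t0)])
a, b = np.linalg.solve(R, r)
mse = r_x(0.0) - np.array([a, b]) @ r  # MSE = R_X(0) - c^T r by orthogonality
print("a =", a, " b =", b, " MSE =", mse)  # b comes out ~0 for this R_X
```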