Random Signals and Noise, Winter Semester 2017, Problem Set 12: Wiener Filter Continuation


Problem 1 (Spring, Exam A)

Given is the signal W(t), a Gaussian white noise with zero expectation and power spectral density function S_WW(ω) = N₀/2. W(t) is passed through the filter h(t) = u(t) − u(t − T), given in the figure. Let us denote by X(t) the output of the filter, namely X(t) = W(t) * h(t).

1. a. Find the expectation of the process X(t).
   b. Find the autocorrelation function R_XX(τ) of X(t).
   c. Draw R_XX(τ).

2. We would like to predict the future of the process from observing its momentary value.
   a. For a delay Δ > 0, find the optimal MMSE estimator of X(t + Δ) from X(t).
   b. What is the achieved mean squared error?

3. Now, we would like to predict the future of the process from observing its momentary value and an additional value from the past.
   a. For delays Δ and D, find the optimal MMSE estimator of X(t + Δ) from the pair of samples X(t), X(t − D).
   b. Are there values of D for which the estimators from sections 2 and 3 are identical? If so, what are they? Explain your answer.
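The triangular shape of R_XX(τ) in part 1 can be checked numerically. The following is a minimal sketch (not part of the problem), assuming illustrative values N₀ = 2, T = 1 and a simple discrete approximation of the convolution:

```python
import numpy as np

# Sketch: passing white noise through h(t) = u(t) - u(t - T) yields a
# triangular autocorrelation supported on |tau| < T.  dt, T, N0 are
# illustrative choices, not values from the problem.
rng = np.random.default_rng(0)
dt, T, N0 = 0.01, 1.0, 2.0
n, L = 200_000, int(T / dt)                        # samples, filter length
w = rng.normal(0.0, np.sqrt(N0 / 2 / dt), size=n)  # discrete white noise
x = np.convolve(w, np.ones(L) * dt, mode="valid")  # X = W * h

lags = [0, L // 2, L, 2 * L]                       # tau = 0, T/2, T, 2T
r = [np.mean(x[:-lag or None] * x[lag:]) for lag in lags]
print([round(v, 3) for v in r])
```

For |τ| < T the theoretical value is R_XX(τ) = (N₀/2)(T − |τ|), which the printed estimates approximate: about 1 at τ = 0, about 0.5 at τ = T/2, and roughly 0 at and beyond τ = T.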

4. A bright engineer found the optimal estimator of X(t + Δ) from all the samples of the past together, {X(s) : s ≤ t}. For what values of Δ is this estimator:
   a. identical to the estimator from section 2? Explain!
   b. identical to the estimator from section 3? Explain!
   There is no need to find the estimator.

Problem 2

Two JWSS random processes N₁(t), N₂(t) are given, each with zero expectation, with autocorrelation and cross-correlation functions R_{N₁N₁}(τ), R_{N₂N₂}(τ), R_{N₁N₂}(τ) and spectra S_{N₁N₁}(ω), S_{N₂N₂}(ω), S_{N₁N₂}(ω), respectively.

1. Find the optimal linear estimator of N₁(t) that uses all the samples of the signal N₂(t):
   N̂₁(t) = ∫ h(t − τ) N₂(τ) dτ.
   Write down the optimal H(ω).

2. Now, it is given that N₁(t) = V(t) * g₁(t) and N₂(t) = V(t) * g₂(t), where V(t) is a WSS process with zero expectation and a given spectrum S_V(ω), and the frequency responses G₁(ω), G₂(ω) of the filters are given. Explain why indeed N₁(t), N₂(t) are JWSS with zero expectation, and calculate S_{N₁N₁}(ω), S_{N₂N₂}(ω), S_{N₁N₂}(ω).

3. Calculate the estimator you found in section 1 and its mean squared error.

4. For what values of the parameters is the estimation error zero? Prove mathematically and explain intuitively.

5. For what values of the parameters is the estimation error maximal? What is the estimator in this case and what is the estimation error?
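Part 1 of Problem 2 leads to the non-causal Wiener filter H(ω) = S_{N₁N₂}(ω)/S_{N₂N₂}(ω), with error spectrum S_{N₁N₁} − |S_{N₁N₂}|²/S_{N₂N₂}. A sketch with illustrative choices of S_V, G₁, G₂ (not the ones from the problem's figure) shows why the error vanishes when both processes are filtered versions of a common source:

```python
import numpy as np

# Sketch: with N1 = V*g1 and N2 = V*g2 driven by a common source V,
#   S12 = S_V G1 conj(G2),  S22 = S_V |G2|^2,  so  H = S12/S22 = G1/G2
# and the error spectrum S11 - |S12|^2/S22 is identically zero wherever
# S_V |G2|^2 > 0.  S_V, G1, G2 below are illustrative stand-ins.
w = np.linspace(-np.pi, np.pi, 1001)
S_V = np.ones_like(w)                         # white source, illustrative
G1 = np.exp(-1j * w) * (1 + 0.5 * np.cos(w))
G2 = 1.0 + 0.3 * np.exp(-1j * w)              # nonzero at every frequency

S11 = S_V * np.abs(G1) ** 2
S22 = S_V * np.abs(G2) ** 2
S12 = S_V * G1 * np.conj(G2)

H = S12 / S22                                 # optimal non-causal filter
S_err = S11 - np.abs(S12) ** 2 / S22
print(np.max(np.abs(S_err)))                  # ~0 up to float round-off
```

This is the intuition behind part 4: when N₁ can be obtained from N₂ by an LTI operation (here H = G₁/G₂), the estimation error is zero.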

Problem 3

Given is a discrete-time WSS Gaussian random process X[n], with expectation m_X and autocorrelation function R_X[k]. Let us assume the signal Y[n] = X[n] + Z[n] is given, where Z[n] is Gaussian white noise with zero expectation and PSD S_Z(e^{jθ}), independent of X.

1. What is the optimal MMSE estimator of X[n] from Y[n]?
2. What is the optimal MMSE estimator of X[n] from the whole process Y?

Problem 4

Given is a discrete-time WSS random process X[n] with zero expectation and PSD S_X(e^{jθ}), shown in the figure, and it is given that Y[n] = X[n] + Z[n], where Z[n] is white noise with PSD N₀, independent of X.

1. Calculate R_X[k].
2. Calculate the optimal linear estimator of X[n + Δ] from Y[n], and the mean squared error, as a function of the delay Δ. Find the value of Δ for which the error is minimal and explain the behavior of the error as a function of Δ.
3. Calculate the mean squared error of the optimal linear estimator of X[n] from the pair Y[n], Y[n − 5].
4. Find the optimal filter for estimating X[n] from the whole process Y and calculate its mean squared error.
5. Compare the errors of the estimators from sections 2, 3, 4.
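For the single-sample case in Problem 3, joint Gaussianity makes the MMSE estimator linear: X̂ = m_X + C_X/(C_X + σ_Z²)(Y − m_X), where C_X = R_X[0] − m_X², with MMSE equal to C_X σ_Z²/(C_X + σ_Z²). A sketch with illustrative numbers m_X = 2, C_X = 4, σ_Z² = 1:

```python
import numpy as np

# Sketch: single-sample MMSE estimation of a Gaussian X from Y = X + Z.
# m_X, C_X, N0 are illustrative, not values from the problem.
rng = np.random.default_rng(1)
m_X, C_X, N0 = 2.0, 4.0, 1.0
n = 200_000
x = rng.normal(m_X, np.sqrt(C_X), n)
y = x + rng.normal(0.0, np.sqrt(N0), n)

gain = C_X / (C_X + N0)                  # Wiener gain for a single sample
x_hat = m_X + gain * (y - m_X)
mse = np.mean((x - x_hat) ** 2)
print(mse, C_X * N0 / (C_X + N0))        # empirical vs. theoretical MMSE
```

With these numbers the theoretical MMSE is 4·1/5 = 0.8, and the empirical average matches it closely.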

Problem 5

Two independent WSS Gaussian random processes X[n], Z[n] are given. The two processes have zero expectation and given power spectral densities S_X(e^{jθ}), S_Z(e^{jθ}). We are interested in estimating X[n] out of Y[n] = X[n] + Z[n].

1. Are X[n] and Y[n] JWSS?
2. Calculate the optimal MMSE estimator of the process X[n] from the process Y[n].
3. What is the mean squared error (MSE) of the estimator from section 2?

We now define two further processes: Y'[n], built from X[n] and Z[n], and Y''[n], built from Y'[n].

4. Are X[n] and Y''[n] JWSS?
5. What is the optimal estimator (not necessarily LTI) of the process X[n] from the process Y''[n]?
6. Are X[n] and Y'[n] JWSS?
7. What is the optimal estimator (not necessarily LTI) of the process X[n] from the process Y'[n]?

Problem 6

Given are three independent random processes X(t), N₁(t) and N₂(t). The three processes are stationary, with zero expectation and spectra S_X(ω), S_{N₁}(ω) and S_{N₂}(ω), respectively. Also given is a random variable P, independent of the three processes X(t), N₁(t) and N₂(t), with the following distribution:

P = +1 with probability p,  P = −1 with probability 1 − p,

where p is constant.
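For part 2 of Problem 5, the standard non-causal result for Y = X + Z with independent zero-mean processes is H(e^{jθ}) = S_X/(S_X + S_Z) and MSE = (1/2π)∫ S_X S_Z/(S_X + S_Z) dθ. A numerical sketch with illustrative stand-in PSDs (an AR(1)-like S_X and white S_Z, not the ones from the problem), evaluating the MSE integral as a grid average:

```python
import numpy as np

# Sketch: non-causal Wiener filter for Y = X + Z and its MSE, evaluated
# numerically.  The PSDs are illustrative stand-ins.
theta = np.linspace(-np.pi, np.pi, 100_001)
S_X = 1.0 / (1.25 - np.cos(theta))     # AR(1)-like spectrum (pole at 0.5)
S_Z = np.ones_like(theta)              # white noise, illustrative

H = S_X / (S_X + S_Z)                  # optimal non-causal filter, 0 < H < 1
# (1/2pi) * integral over one period == average over a uniform grid
mse = np.mean(S_X * S_Z / (S_X + S_Z))
print(round(mse, 4))
```

Note that H suppresses frequencies where the noise dominates (S_Z large relative to S_X) and passes those where the signal dominates.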

The observed processes are

Y₁(t) = P · X(t) + N₁(t),
Y₂(t) = P · X(t) + N₂(t).

1. Show that the three processes X(t), Y₁(t) and Y₂(t) are JWSS in pairs (meaning that each pair of them is JWSS).
2. Calculate X̂₁(t), which is the optimal linear MMSE estimator of the process X(t) from the process Y₁(t).
3. Calculate X̂₂(t), which is the optimal linear MMSE estimator of the process X(t) from the process Y₂(t).

We now define a new estimator X̂(t) = X̂₁(t) + X̂₂(t) and the estimation error ε(t) = X(t) − X̂(t).

4. Show that the cross-correlation between ε(t) and any one of the two processes Y₁(t), Y₂(t) is zero.
5. Deduce that X̂(t) is the optimal linear estimator of the process X(t) from both processes Y₁(t), Y₂(t).
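Parts 4 and 5 of Problem 6 rest on the orthogonality principle: a linear estimator is optimal exactly when its error is uncorrelated with every observation. A scalar Gaussian sketch (illustrative stand-ins, not the processes of the problem) shows the principle in action:

```python
import numpy as np

# Sketch of the orthogonality principle: solve the normal equations for the
# jointly optimal linear estimator of X from (Y1, Y2), then verify that the
# error is uncorrelated with each observation.  Variances are illustrative.
rng = np.random.default_rng(2)
n = 300_000
x = rng.normal(0.0, 1.0, n)
y1 = x + rng.normal(0.0, 1.0, n)         # noisy observation 1
y2 = x + rng.normal(0.0, 2.0, n)         # noisy observation 2

# Normal equations:  E[Y Y^T] a = E[X Y]  (empirical second moments)
M = np.array([[np.mean(y1 * y1), np.mean(y1 * y2)],
              [np.mean(y1 * y2), np.mean(y2 * y2)]])
c = np.array([np.mean(x * y1), np.mean(x * y2)])
a = np.linalg.solve(M, c)
err = x - (a[0] * y1 + a[1] * y2)

# Error orthogonal to both observations, up to float round-off
print(np.mean(err * y1), np.mean(err * y2))
```

Conversely, once one shows ε(t) is uncorrelated with Y₁ and Y₂, any other linear estimator can only add a component orthogonal to X, which is exactly the deduction asked for in part 5.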

Wiener Process

Problem 7 (based on a question from an exam):

X(t) is a Wiener random process with parameter α. Let us define constant times t₁, t₂, and assume t₁ < t₂. We will now define the random variable (for a real constant c):

Y = X(t₂) − c · X(t₁).

1. Find a constant c such that Y will be independent of the sample X(t₁).
2. Let us now consider two more samples of the process, X(t₀) and X(t₃), such that t₀ < t₁ < t₂ < t₃. Based on the constant c found in section 1:
   a. Is Y independent of X(t₃)?
   b. Is Y independent of X(t₀)?

Problem 8

W(t) is a Wiener random process with parameter α. We define its sign process Y(t) = sign(W(t)).

1. Calculate Y(t)'s expectation and autocorrelation functions. Express your answer using the Q-function, Q(x) = ∫ₓ^∞ (1/√(2π)) e^{−u²/2} du.
2. Prove that R_Y(t₁, t₂) depends on t₁, t₂ only through their ratio, i.e. R_Y(t₁, t₂) = R_Y(a·t₁, a·t₂) for every a > 0.
3. Prove that lim_{t₂→∞} R_Y(t₁, t₂) = 0 for every fixed t₁.
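In Problem 7, Cov(X(t₂) − c·X(t₁), X(t₁)) = α·t₁·(1 − c), so c = 1 turns Y into an increment, which is independent of X(t₁) because zero covariance implies independence for jointly Gaussian variables. A simulation sketch with illustrative α, t₁, t₂:

```python
import numpy as np

# Sketch: sample (X(t1), X(t2)) of a Wiener process via independent
# increments, then estimate Cov(X(t2) - c*X(t1), X(t1)) for two values of c.
# alpha, t1, t2 are illustrative values.
rng = np.random.default_rng(3)
alpha, t1, t2 = 2.0, 1.0, 3.0
n = 400_000
x_t1 = rng.normal(0.0, np.sqrt(alpha * t1), n)
x_t2 = x_t1 + rng.normal(0.0, np.sqrt(alpha * (t2 - t1)), n)  # increment

covs = {}
for c in (0.5, 1.0):
    y = x_t2 - c * x_t1
    covs[c] = np.mean(y * x_t1) - np.mean(y) * np.mean(x_t1)
print(covs)   # covariance ~ alpha*t1*(1 - c): nonzero for c = 0.5, ~0 for c = 1
```

With α = 2 and t₁ = 1 the theoretical covariances are 1.0 for c = 0.5 and 0 for c = 1, which the estimates reproduce.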