Lecture 19: Bayesian Linear Estimators
ECE 830 Fall 2010 Statistical Signal Processing
instructor: R. Nowak, scribe: I. Rosado-Mendez

1 Linear Minimum Mean-Square Estimator

Suppose our data set $X \in \mathbb{R}^n$ is a random vector governed by a distribution $p(x|\theta)$ that depends on the parameter $\theta$. Moreover, the parameter $\theta \in \mathbb{R}^k$ is treated as a random variable with $E[\theta] = 0$ and $E[\theta\theta^T] = \Sigma_{\theta\theta}$. Also, assume that $E[x] = 0$, and let $\Sigma_{xx} := E[xx^T]$ and $\Sigma_{\theta x} := E[\theta x^T]$. Then, as we saw in the previous lecture, the best linear estimator of $\theta$ is given by
$$\hat{A} = \arg\min_{A \in \mathbb{R}^{n \times k}} E\big[\|\theta - A^T x\|_2^2\big] = \Sigma_{xx}^{-1}\Sigma_{x\theta}.$$
As a consequence, our Linear Minimum Mean Square Error (LMMSE) estimator becomes
$$\hat{\theta} = \hat{A}^T x = \Sigma_{\theta x}\Sigma_{xx}^{-1} x.$$

2 Orthogonality Principle

Let $\hat{\theta} = \Sigma_{\theta x}\Sigma_{xx}^{-1} x$ be the LMMSE estimator defined above. Then
$$E\big[(\theta - \hat{\theta})x^T\big] = \Sigma_{\theta x} - \Sigma_{\theta x}\Sigma_{xx}^{-1}\Sigma_{xx} = 0.$$
In other words, the error $\theta - \hat{\theta}$ is orthogonal to the data $x$. This is shown graphically in Fig. 1. It is important to note that if $\hat{\theta} = A^T x$ satisfies the orthogonality principle, then
$$0 = E\big[(\theta - \hat{\theta})x^T\big] = \Sigma_{\theta x} - A^T\Sigma_{xx} \implies A^T = \Sigma_{\theta x}\Sigma_{xx}^{-1},$$
which is the previously derived LMMSE estimator.
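As a numerical sanity check (not part of the original notes; a minimal sketch assuming NumPy, with illustrative dimensions and seed), the LMMSE estimator built from empirical second moments satisfies the orthogonality principle up to floating-point error:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 5, 2, 10_000               # data dim, parameter dim, sample count

# Zero-mean (theta, x) pairs generated through an arbitrary linear model.
H = rng.standard_normal((n, k))
theta = rng.standard_normal((m, k))
x = theta @ H.T + 0.5 * rng.standard_normal((m, n))

# Empirical second moments.
Sigma_xx = x.T @ x / m               # n x n
Sigma_thetax = theta.T @ x / m       # k x n

# LMMSE estimator theta_hat = A^T x with A^T = Sigma_thetax Sigma_xx^{-1}.
A_T = Sigma_thetax @ np.linalg.inv(Sigma_xx)
theta_hat = x @ A_T.T

# Orthogonality principle: the error is uncorrelated with the data.
cross = (theta - theta_hat).T @ x / m
print(np.abs(cross).max())           # numerically zero
```

Because the filter is computed from the same empirical moments used in the check, the cross-correlation vanishes exactly (up to round-off), mirroring the algebraic cancellation $\Sigma_{\theta x} - \Sigma_{\theta x}\Sigma_{xx}^{-1}\Sigma_{xx} = 0$.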
Figure 1: Orthogonality between the estimator $\hat{\theta}$ and its error $\theta - \hat{\theta}$.

2.1 Linear signal model

Suppose we model our detected signal as $X = H\theta + W$, where $X, W \in \mathbb{R}^n$, $\theta \in \mathbb{R}^k$, $H$ is a known $n \times k$ linear transformation, and $W$ is a noise process. Furthermore, we know that
$$E[w] = 0, \quad E[ww^T] = \sigma_w^2 I_{n \times n}, \qquad E[\theta] = 0, \quad E[\theta\theta^T] = \sigma_\theta^2 I_{k \times k}.$$
In addition, we know that the parameter and the noise process are uncorrelated, i.e., $E[\theta w^T] = E[w\theta^T] = 0$. As demonstrated before, the LMMSE estimator is $\hat{\theta} = \Sigma_{\theta x}\Sigma_{xx}^{-1}x$, where $\Sigma_{\theta x}$ and $\Sigma_{xx}$ can be obtained as follows:
$$\Sigma_{\theta x} = E[\theta x^T] = E[\theta(H\theta + w)^T] = \sigma_\theta^2 H^T,$$
$$\Sigma_{xx} = E[xx^T] = E[(H\theta + w)(H\theta + w)^T] = \sigma_\theta^2 HH^T + \sigma_w^2 I_{n \times n}.$$
Therefore, the LMMSE estimator is given by
$$\hat{\theta} = \sigma_\theta^2 H^T\big(\sigma_\theta^2 HH^T + \sigma_w^2 I_{n \times n}\big)^{-1}x = H^T\Big(HH^T + \frac{\sigma_w^2}{\sigma_\theta^2} I_{n \times n}\Big)^{-1}x.$$
When does the LMMSE estimator minimize the Bayes MSE among all possible estimators? The LMMSE estimator is optimal, i.e., it is the minimum Bayesian MSE estimator, when the posterior mean $E[\theta|x]$ (the minimum Bayes MSE estimator) is itself a linear function of the data.
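The two expressions for $\hat{\theta}$ above are algebraically identical (factor $\sigma_\theta^2$ out of the inverse). A quick numerical check of that equivalence (not from the notes; a sketch assuming NumPy, with illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 3
sigma_theta2, sigma_w2 = 2.0, 0.5
H = rng.standard_normal((n, k))     # arbitrary known n x k transformation
x = rng.standard_normal(n)          # an arbitrary data vector

I = np.eye(n)
# Form 1: sigma_theta^2 H^T (sigma_theta^2 H H^T + sigma_w^2 I)^{-1} x
form1 = sigma_theta2 * H.T @ np.linalg.inv(
    sigma_theta2 * H @ H.T + sigma_w2 * I) @ x
# Form 2: H^T (H H^T + (sigma_w^2/sigma_theta^2) I)^{-1} x
form2 = H.T @ np.linalg.inv(H @ H.T + (sigma_w2 / sigma_theta2) * I) @ x

print(np.allclose(form1, form2))    # True
```

The second form makes clear that only the noise-to-signal ratio $\sigma_w^2/\sigma_\theta^2$ matters, not the two variances individually.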
3 Gauss-Markov Theorem

Let $X$ and $Y$ be jointly Gaussian random vectors, whose joint distribution can be expressed as
$$\begin{bmatrix} X \\ Y \end{bmatrix} \sim \mathcal{N}\left(\begin{bmatrix} \mu_x \\ \mu_y \end{bmatrix}, \begin{bmatrix} \Sigma_{xx} & \Sigma_{xy} \\ \Sigma_{yx} & \Sigma_{yy} \end{bmatrix}\right).$$
Then the conditional distribution of $Y$ given $X$ is
$$Y|X \sim \mathcal{N}\big(\mu_y + \Sigma_{yx}\Sigma_{xx}^{-1}(x - \mu_x),\; \Sigma_{yy} - \Sigma_{yx}\Sigma_{xx}^{-1}\Sigma_{xy}\big).$$

3.1 Application to the Linear Signal Model

We model the detected signal as $X = H\theta + W$, where $W \sim \mathcal{N}(0, \sigma_w^2 I_{n \times n})$ and $\theta \sim \mathcal{N}(0, \sigma_\theta^2 I_{k \times k})$. Then the vector $[X^T\; \theta^T]^T$ is a multivariate Gaussian random vector. As we saw in previous lectures, the Bayesian MSE is minimized by the posterior mean $E[\theta|X]$, which in this case, using the Gauss-Markov theorem, is
$$E[\theta|x] = \mu_\theta + \Sigma_{\theta x}\Sigma_{xx}^{-1}(x - \mu_x) = 0 + \sigma_\theta^2 H^T\big(\sigma_\theta^2 HH^T + \sigma_w^2 I_{n \times n}\big)^{-1}(x - 0) = \sigma_\theta^2 H^T\big(\sigma_\theta^2 HH^T + \sigma_w^2 I_{n \times n}\big)^{-1}x,$$
which is the previously derived LMMSE estimator. Therefore, linear estimators are optimal in the Gaussian case.

3.2 Proof of the Gauss-Markov theorem

Without loss of generality, assume that $X \in \mathbb{R}^n$ and $Y \in \mathbb{R}^k$ are zero-mean random vectors. Then
$$P(Y|X) = \frac{P(X, Y)}{P(X)} = \frac{(2\pi)^{-(n+k)/2}\,|\Sigma|^{-1/2}\exp\Big\{-\frac{1}{2}\begin{bmatrix} x \\ y \end{bmatrix}^T\Sigma^{-1}\begin{bmatrix} x \\ y \end{bmatrix}\Big\}}{(2\pi)^{-n/2}\,|\Sigma_{xx}|^{-1/2}\exp\big\{-\frac{1}{2}x^T\Sigma_{xx}^{-1}x\big\}},$$
where
$$\Sigma = \begin{bmatrix} \Sigma_{xx} & \Sigma_{xy} \\ \Sigma_{yx} & \Sigma_{yy} \end{bmatrix}.$$
To simplify the formula we need to determine $\Sigma^{-1}$. The inverse can be written as
$$\begin{bmatrix} \Sigma_{xx} & \Sigma_{xy} \\ \Sigma_{yx} & \Sigma_{yy} \end{bmatrix}^{-1} = \begin{bmatrix} \Sigma_{xx}^{-1} & 0 \\ 0 & 0 \end{bmatrix} + \begin{bmatrix} -\Sigma_{xx}^{-1}\Sigma_{xy} \\ I \end{bmatrix} Q^{-1}\begin{bmatrix} -\Sigma_{yx}\Sigma_{xx}^{-1} & I \end{bmatrix},$$
where
$$Q := \Sigma_{yy} - \Sigma_{yx}\Sigma_{xx}^{-1}\Sigma_{xy}.$$
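The block-matrix inverse identity above is the crux of the proof; it can be verified numerically for a random symmetric positive definite $\Sigma$ (a sketch, not from the notes, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 4, 3
M = rng.standard_normal((n + k, n + k))
Sigma = M @ M.T + (n + k) * np.eye(n + k)   # random SPD block matrix

Sxx, Sxy = Sigma[:n, :n], Sigma[:n, n:]
Syx, Syy = Sigma[n:, :n], Sigma[n:, n:]

Sxx_inv = np.linalg.inv(Sxx)
Q = Syy - Syx @ Sxx_inv @ Sxy               # Schur complement of Sxx

# Assemble the claimed inverse: top-left block plus rank-k correction.
top = np.block([[Sxx_inv, np.zeros((n, k))],
                [np.zeros((k, n)), np.zeros((k, k))]])
B = np.vstack([-Sxx_inv @ Sxy, np.eye(k)])  # (n+k) x k factor
Sigma_inv = top + B @ np.linalg.inv(Q) @ B.T

print(np.allclose(Sigma_inv @ Sigma, np.eye(n + k)))   # True
```

This is exactly the "multiply by $\Sigma$ and get the identity" verification the notes describe, carried out in floating point.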
This formula for the inverse is easily verified by multiplying it by $\Sigma$ to get the identity matrix. Substituting the inverse into $P(Y|X)$ yields
$$P(Y|X) = (2\pi)^{-k/2}\,|Q|^{-1/2}\exp\Big\{-\frac{1}{2}\big(y - \Sigma_{yx}\Sigma_{xx}^{-1}x\big)^T Q^{-1}\big(y - \Sigma_{yx}\Sigma_{xx}^{-1}x\big)\Big\},$$
which shows that $Y|X \sim \mathcal{N}(\Sigma_{yx}\Sigma_{xx}^{-1}x, Q)$. For the general case, when $E[X] = \mu_x$ and $E[Y] = \mu_y$,
$$(Y - \mu_y)\,\big|\,(X - \mu_x) \sim \mathcal{N}\big(\Sigma_{yx}\Sigma_{xx}^{-1}(x - \mu_x), Q\big) \implies Y|X \sim \mathcal{N}\big(\mu_y + \Sigma_{yx}\Sigma_{xx}^{-1}(x - \mu_x), Q\big).$$

4 The Wiener Filter

When the expected values of the parameter $\theta \in \mathbb{R}^k$ and the data $x \in \mathbb{R}^n$ are zero, the Wiener filter $A_{opt}$ is obtained by minimizing the mean square error between the parameter and its estimator:
$$A_{opt} = \arg\min_{A:\,\hat{\theta}=Ax} E\big[\|\theta - Ax\|_2^2\big],$$
which results in $A_{opt} = \Sigma_{\theta x}\Sigma_{xx}^{-1}$. This involves only second-order moments, and it becomes the optimal estimator when the data and the parameter are jointly Gaussian.

4.1 Signal + Noise Model

We model our detected signal as $X = S + W$, where the noiseless signal $S$ (our parameter) follows a Gaussian distribution $\mathcal{N}(0, \Sigma_{ss})$ and $W \sim \mathcal{N}(0, \Sigma_{ww})$. In addition, $S$ and $W$ are uncorrelated. Therefore the data vector $X \sim \mathcal{N}(0, \Sigma_{ss} + \Sigma_{ww})$ and $E[sx^T] = E[s(s + w)^T] = \Sigma_{ss}$. From here, the LMMSE estimator $\hat{s}$ becomes
$$\hat{s} = \Sigma_{ss}(\Sigma_{ss} + \Sigma_{ww})^{-1}x.$$

4.2 Linear Signal + Noise Model

Now we assume that the detected signal can be modeled as $X = H\theta + W$, where now $\theta \sim \mathcal{N}(0, \Sigma_{\theta\theta})$ and $W \sim \mathcal{N}(0, \Sigma_{ww})$, with $\theta \in \mathbb{R}^k$ and $W \in \mathbb{R}^n$ (uncorrelated), and $H$ is an $n \times k$ linear transformation matrix. Therefore $X \sim \mathcal{N}(0, H\Sigma_{\theta\theta}H^T + \Sigma_{ww})$. In addition, $E[\theta x^T] = E[\theta(H\theta + w)^T] = \Sigma_{\theta\theta}H^T$, and the estimator becomes
$$\hat{\theta} = \Sigma_{\theta\theta}H^T\big(H\Sigma_{\theta\theta}H^T + \Sigma_{ww}\big)^{-1}x.$$
Now suppose that $\Sigma_{\theta\theta} = \sigma_\theta^2 I_{k \times k}$ and $\Sigma_{ww} = \sigma_w^2 I_{n \times n}$; then
$$\hat{\theta} = \sigma_\theta^2 H^T\big(\sigma_\theta^2 HH^T + \sigma_w^2 I_{n \times n}\big)^{-1}x,$$
and the LMMSE estimator of the noiseless signal becomes
$$\hat{s} = H\hat{\theta} = \sigma_\theta^2 HH^T\big(\sigma_\theta^2 HH^T + \sigma_w^2 I_{n \times n}\big)^{-1}x.$$
In some cases $HH^T$ can be diagonalized by an orthonormal transformation $U$ (its columns are orthonormal) in such a way that only the first $k$ elements of the diagonal are nonzero. As a consequence,
$$HH^T = U\begin{bmatrix} \lambda_1 & & & \\ & \ddots & & \\ & & \lambda_k & \\ & & & 0 \end{bmatrix}U^T = UDU^T,$$
and
$$\hat{s} = \sigma_\theta^2 UDU^T\big(\sigma_\theta^2 UDU^T + \sigma_w^2 UU^T\big)^{-1}x = \sigma_\theta^2 UDU^T\big(U[\sigma_\theta^2 D + \sigma_w^2 I_{n \times n}]U^T\big)^{-1}x = U\big(\sigma_\theta^2 D[\sigma_\theta^2 D + \sigma_w^2 I_{n \times n}]^{-1}\big)U^T x.$$
Note that the term in parentheses reduces to a diagonal matrix of the form
$$\mathrm{diag}\Big(\frac{\sigma_\theta^2\lambda_1}{\sigma_\theta^2\lambda_1 + \sigma_w^2},\; \ldots,\; \frac{\sigma_\theta^2\lambda_k}{\sigma_\theta^2\lambda_k + \sigma_w^2},\; 0,\; \ldots,\; 0\Big),$$
which, as $\sigma_w^2/\sigma_\theta^2$ tends to zero, converges to $\mathrm{diag}(1, \ldots, 1, 0, \ldots, 0)$, and $\hat{s} \to P_H x$, the projection of $x$ onto the range of $H$.
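The direct and spectral forms of this denoiser agree: the estimator simply shrinks each eigen-component of $x$ by the factor $\sigma_\theta^2\lambda_i/(\sigma_\theta^2\lambda_i + \sigma_w^2)$. A small check (not from the notes; a sketch assuming NumPy, with illustrative values):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 6, 3
sigma_theta2, sigma_w2 = 1.0, 0.1
H = rng.standard_normal((n, k))
x = rng.standard_normal(n)

# Direct form: sigma_theta^2 H H^T (sigma_theta^2 H H^T + sigma_w^2 I)^{-1} x.
direct = sigma_theta2 * H @ H.T @ np.linalg.inv(
    sigma_theta2 * H @ H.T + sigma_w2 * np.eye(n)) @ x

# Spectral form: shrink each eigen-component of x.
lam, U = np.linalg.eigh(H @ H.T)          # n-k eigenvalues are (numerically) zero
shrink = sigma_theta2 * lam / (sigma_theta2 * lam + sigma_w2)
spectral = U @ (shrink * (U.T @ x))

print(np.allclose(direct, spectral))      # True
```

As $\sigma_w^2/\sigma_\theta^2 \to 0$ the shrinkage factors on the nonzero eigenvalues tend to 1 and the estimator approaches the projection onto the range of $H$, as stated above.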
4.3 Frequency Domain Wiener Filter

In this part we revisit the signal + noise example (4.1), but approach the problem from the Fourier domain. Again, our model is $X = S + W$, and now we take the DFT of both sides:
$$U^T X = U^T S + U^T W \iff \tilde{X} = \tilde{S} + \tilde{W},$$
where $\tilde{S} \sim \mathcal{N}(0, \Lambda_s)$ and $\tilde{W} \sim \mathcal{N}(0, \Lambda_w)$. Equivalently, $X = UU^T S + UU^T W = S + W$, so $S \sim \mathcal{N}(0, U\Lambda_s U^T)$ and $W \sim \mathcal{N}(0, U\Lambda_w U^T)$. Therefore, the Wiener filter estimator becomes
$$\hat{s} = \Sigma_{ss}(\Sigma_{ss} + \Sigma_{ww})^{-1}x = U\Lambda_s U^T\big(U[\Lambda_s + \Lambda_w]U^T\big)^{-1}x = U\Lambda_s[\Lambda_s + \Lambda_w]^{-1}U^T x = U\,\mathrm{diag}\Big(\frac{\sigma_1^2}{\sigma_1^2 + \gamma_1^2},\; \ldots,\; \frac{\sigma_j^2}{\sigma_j^2 + \gamma_j^2},\; \ldots,\; \frac{\sigma_n^2}{\sigma_n^2 + \gamma_n^2}\Big)U^T x,$$
where $\sigma_j^2$ and $\gamma_j^2$ are the $j$-th diagonal elements of $\Lambda_s$ and $\Lambda_w$, respectively. Therefore the filtering process can be synthesized by the following algorithm:

1. Take the DFT of the measured signal.
2. Attenuate each frequency component $\omega_j$ according to $\frac{1}{1 + \mathrm{SNR}_j^{-1}}$, where $\mathrm{SNR}_j = \sigma_j^2/\gamma_j^2$.
3. Take the inverse DFT of the attenuated spectrum.

4.4 Classical derivation of the Wiener Filter

Again, we start with the model $X = S + W$, where $X$, $S$, and $W$ are wide-sense stationary processes. We re-express them as time series $x[n] = s[n] + w[n]$. We aim at designing a filter $h[n]$ that will be convolved with $x[n]$ to estimate $s[n]$:
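The three-step algorithm above can be sketched directly with an FFT (not from the notes; a minimal sketch assuming NumPy, where the signal and the oracle spectra $\sigma_j^2$, $\gamma_j^2$ are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 128
t = np.arange(n)
s = np.sin(2 * np.pi * 3 * t / n)        # low-frequency signal (3 cycles)
w = 0.5 * rng.standard_normal(n)         # white noise
x = s + w

# Assumed (oracle) per-bin spectra of signal and noise.
S_ss = np.abs(np.fft.fft(s)) ** 2 / n    # concentrated at bins 3 and n-3
S_ww = np.full(n, 0.25)                  # flat white-noise level

# Step 1: DFT.  Step 2: attenuate by S_ss/(S_ss+S_ww) = 1/(1+SNR_j^{-1}).
# Step 3: inverse DFT.
X = np.fft.fft(x)
gain = S_ss / (S_ss + S_ww)
s_hat = np.real(np.fft.ifft(gain * X))

mse_raw = np.mean((x - s) ** 2)
mse_wiener = np.mean((s_hat - s) ** 2)
print(mse_wiener < mse_raw)              # filtering reduces the MSE
```

Because the signal energy sits in only two DFT bins, the gain is near 1 there and near 0 elsewhere, so almost all of the noise is suppressed while the signal passes through nearly untouched.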
$$\hat{s}[n] = \sum_k h[k]\,x[n-k].$$
Our filter should minimize the MSE:
$$\mathrm{MSE}(\hat{s}[n]) = E\big[(s[n] - \hat{s}[n])^2\big] = E\Big[s[n]^2 - 2s[n]\sum_k h[k]x[n-k] + \Big(\sum_k h[k]x[n-k]\Big)^2\Big].$$
Differentiating with respect to $h[m]$ and setting the derivative equal to zero,
$$\frac{\partial\,\mathrm{MSE}(\hat{s}[n])}{\partial h[m]} = E\Big[-2s[n]x[n-m] + 2\Big(\sum_k h[k]x[n-k]\Big)x[n-m]\Big] = -2R_{sx}[m] + 2\sum_k h[k]R_{xx}[m-k] = -2R_{ss}[m] + 2\sum_k h[k]\big(R_{ss}[m-k] + R_{ww}[m-k]\big) = 0,$$
where the last step uses the fact that $s$ and $w$ are uncorrelated, so $R_{sx} = R_{ss}$ and $R_{xx} = R_{ss} + R_{ww}$. Therefore the optimal filter satisfies
$$R_{ss}[m] = \sum_k h[k]\big(R_{ss}[m-k] + R_{ww}[m-k]\big),$$
which is just the familiar Wiener-Hopf equation. Taking the Fourier transform of both sides, we get
$$S_{ss}(\omega) = H(\omega)\big(S_{ss}(\omega) + S_{ww}(\omega)\big),$$
where $S_{ss}(\omega)$ and $S_{ww}(\omega)$ are the power spectra of the signal and the noise process, respectively. Therefore, the filter becomes
$$H(\omega) = \frac{S_{ss}(\omega)}{S_{ss}(\omega) + S_{ww}(\omega)}.$$

5 Deconvolution

The final topic of this lecture is deconvolution. We model the detected signal as $X = GS + W$, where $G$ is a circular convolution operator (a blurring transformation, shown in Fig. 2). As in the previous sections, $S \sim \mathcal{N}(0, U\Lambda_s U^T)$ and $W \sim \mathcal{N}(0, U\Lambda_w U^T)$. Furthermore, since $G$ is circulant, $G = UDU^T$, where $D$ is a diagonal matrix that is essentially the frequency response of $G$. In this case, the Wiener filter solution is computed as follows:
$$\hat{s} = \Sigma_{ss}G^T\big(G\Sigma_{ss}G^T + \Sigma_{ww}\big)^{-1}x = U\Lambda_s U^T UD^T U^T\big(UDU^T U\Lambda_s U^T UD^T U^T + U\Lambda_w U^T\big)^{-1}x = U\Lambda_s D^T\big(D\Lambda_s D^T + \Lambda_w\big)^{-1}U^T x.$$
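A finite (length-$L$ FIR) version of the Wiener-Hopf equation becomes a Toeplitz linear system and can be solved directly (not from the notes; a sketch assuming NumPy, with an assumed AR(1)-type signal autocorrelation and white noise):

```python
import numpy as np

L = 8
rho, sigma_w2 = 0.9, 0.5
lags = np.arange(L)
R_ss = rho ** lags                  # assumed signal autocorrelation, R_ss[0] = 1
R_ww = np.zeros(L)
R_ww[0] = sigma_w2                  # white-noise autocorrelation

# Truncated Wiener-Hopf system: sum_k h[k] (R_ss + R_ww)[m-k] = R_ss[m],
# for m = 0, ..., L-1, i.e. a symmetric Toeplitz system R_xx h = r.
c = R_ss + R_ww
R_xx = np.array([[c[abs(i - j)] for j in range(L)] for i in range(L)])
h = np.linalg.solve(R_xx, R_ss)

# Resulting mean-square error of the length-L filter.
mse = R_ss[0] - h @ R_ss
print(mse)
```

The filter satisfies the truncated Wiener-Hopf equation by construction, and the residual MSE $R_{ss}[0] - \sum_k h[k]R_{ss}[k]$ is strictly between 0 and the raw signal variance $R_{ss}[0]$, reflecting that the filter helps but cannot remove the noise entirely.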
Figure 2: Blurring process. (a) Original impulse signal. (b) Blurring function. (c) Blurred signal.

$$\hat{s} = U\tilde{D}U^T x, \quad \text{where} \quad \tilde{D} = \frac{D^T}{|D|^2 + P^{-1}} \quad \text{and} \quad P = \frac{\Lambda_s(k,k)}{\Lambda_w(k,k)},$$
with the divisions understood elementwise along the diagonals. Do not forget that the transpose operator works as the conjugate transpose when the matrix has complex elements.

5.1 Classical Wiener Filter

Following a derivation similar to that of Section 4.4, in the case of a blurred, noisy time series modeled as $x[n] = g[n] * s[n] + w[n]$, we aim at obtaining a filter $h[n]$ such that the estimator of the deblurred, noiseless signal is computed from
$$\hat{s}[n] = \sum_k h[k]\,x[n-k].$$
The resulting filter in the Fourier domain is
$$H(\omega) = \frac{G^*(\omega)S_{ss}(\omega)}{|G(\omega)|^2 S_{ss}(\omega) + S_{ww}(\omega)},$$
where $G(\omega)$ is the transfer function of the blurring filter $g[n]$ and $G^*$ is its complex conjugate.
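The deconvolution filter can be applied bin-by-bin in the DFT domain, since the DFT diagonalizes circular convolution (not from the notes; a sketch assuming NumPy, where the blur, the noise level, and the oracle signal spectrum are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 128
t = np.arange(n)
s = np.sin(2 * np.pi * 4 * t / n) + 0.5 * np.cos(2 * np.pi * 7 * t / n)

# Circular moving-average blur followed by additive white noise.
g = np.zeros(n)
g[:5] = 1 / 5
x = np.real(np.fft.ifft(np.fft.fft(g) * np.fft.fft(s)))  # circular g * s
x += 0.05 * rng.standard_normal(n)

G = np.fft.fft(g)                          # frequency response of the blur
S_ss = np.abs(np.fft.fft(s)) ** 2 / n      # assumed (oracle) signal spectrum
S_ww = np.full(n, 0.05 ** 2)               # white-noise level

# Wiener deconvolution: H(w) = conj(G) S_ss / (|G|^2 S_ss + S_ww).
H = np.conj(G) * S_ss / (np.abs(G) ** 2 * S_ss + S_ww)
s_hat = np.real(np.fft.ifft(H * np.fft.fft(x)))

print(np.mean((s_hat - s) ** 2) < np.mean((x - s) ** 2))  # deblurring helps
```

Where the blur response $G$ is strong and the signal spectrum dominates the noise, $H \approx 1/G$ (inverse filtering); where the signal spectrum is negligible, $H \approx 0$, avoiding the noise amplification of a naive inverse filter.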
ECE276A: Sensing & Estimation in Robotics Lecture 10: Gaussian Mixture and Particle Filtering Lecturer: Nikolay Atanasov: natanasov@ucsd.edu Teaching Assistants: Siwei Guo: s9guo@eng.ucsd.edu Anwesan Pal:
More informationECE Lecture Notes: Matrix Computations for Signal Processing
ECE Lecture Notes: Matrix Computations for Signal Processing James P. Reilly c Department of Electrical and Computer Engineering McMaster University October 17, 2005 2 Lecture 2 This lecture discusses
More informationECE531 Screencast 5.5: Bayesian Estimation for the Linear Gaussian Model
ECE53 Screencast 5.5: Bayesian Estimation for the Linear Gaussian Model D. Richard Brown III Worcester Polytechnic Institute Worcester Polytechnic Institute D. Richard Brown III / 8 Bayesian Estimation
More informationFactor Analysis and Kalman Filtering (11/2/04)
CS281A/Stat241A: Statistical Learning Theory Factor Analysis and Kalman Filtering (11/2/04) Lecturer: Michael I. Jordan Scribes: Byung-Gon Chun and Sunghoon Kim 1 Factor Analysis Factor analysis is used
More informationKalman Filter. Man-Wai MAK
Kalman Filter Man-Wai MAK Dept. of Electronic and Information Engineering, The Hong Kong Polytechnic University enmwmak@polyu.edu.hk http://www.eie.polyu.edu.hk/ mwmak References: S. Gannot and A. Yeredor,
More informationManifold Learning for Signal and Visual Processing Lecture 9: Probabilistic PCA (PPCA), Factor Analysis, Mixtures of PPCA
Manifold Learning for Signal and Visual Processing Lecture 9: Probabilistic PCA (PPCA), Factor Analysis, Mixtures of PPCA Radu Horaud INRIA Grenoble Rhone-Alpes, France Radu.Horaud@inria.fr http://perception.inrialpes.fr/
More informationAdvanced Signal Processing Minimum Variance Unbiased Estimation (MVU)
Advanced Signal Processing Minimum Variance Unbiased Estimation (MVU) Danilo Mandic room 813, ext: 46271 Department of Electrical and Electronic Engineering Imperial College London, UK d.mandic@imperial.ac.uk,
More informationE190Q Lecture 11 Autonomous Robot Navigation
E190Q Lecture 11 Autonomous Robot Navigation Instructor: Chris Clark Semester: Spring 013 1 Figures courtesy of Siegwart & Nourbakhsh Control Structures Planning Based Control Prior Knowledge Operator
More informationUNIT-2: MULTIPLE RANDOM VARIABLES & OPERATIONS
UNIT-2: MULTIPLE RANDOM VARIABLES & OPERATIONS In many practical situations, multiple random variables are required for analysis than a single random variable. The analysis of two random variables especially
More informationESE 524 Detection and Estimation Theory
ESE 524 Detection and Estimation heory Joseh A. O Sullivan Samuel C. Sachs Professor Electronic Systems and Signals Research Laboratory Electrical and Systems Engineering Washington University 2 Urbauer
More information4 Derivations of the Discrete-Time Kalman Filter
Technion Israel Institute of Technology, Department of Electrical Engineering Estimation and Identification in Dynamical Systems (048825) Lecture Notes, Fall 2009, Prof N Shimkin 4 Derivations of the Discrete-Time
More information5 Operations on Multiple Random Variables
EE360 Random Signal analysis Chapter 5: Operations on Multiple Random Variables 5 Operations on Multiple Random Variables Expected value of a function of r.v. s Two r.v. s: ḡ = E[g(X, Y )] = g(x, y)f X,Y
More informationA6523 Modeling, Inference, and Mining Jim Cordes, Cornell University
A6523 Modeling, Inference, and Mining Jim Cordes, Cornell University Lecture 19 Modeling Topics plan: Modeling (linear/non- linear least squares) Bayesian inference Bayesian approaches to spectral esbmabon;
More informationECE 636: Systems identification
ECE 636: Systems identification Lectures 9 0 Linear regression Coherence Φ ( ) xy ω γ xy ( ω) = 0 γ Φ ( ω) Φ xy ( ω) ( ω) xx o noise in the input, uncorrelated output noise Φ zz Φ ( ω) = Φ xy xx ( ω )
More informationGaussian Processes. Le Song. Machine Learning II: Advanced Topics CSE 8803ML, Spring 2012
Gaussian Processes Le Song Machine Learning II: Advanced Topics CSE 8803ML, Spring 01 Pictorial view of embedding distribution Transform the entire distribution to expected features Feature space Feature
More informationContinuous Random Variables
1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables
More information