ELEG-636: Statistical Signal Processing


1 ELEG-636: Statistical Signal Processing
Gonzalo R. Arce
Department of Electrical and Computer Engineering, University of Delaware
Spring 2010

2 Course Objectives & Structure
Objective: Given a discrete-time sequence {x(n)}, develop
- statistical and spectral signal representations
- filtering, prediction, and system identification algorithms
- optimization methods that are statistical and adaptive
Course Structure:
- Weekly lectures [notes: Arce]
- Periodic homework (theory & Matlab implementations) [15%]
- Midterm & Final examinations [85%]
Textbook: Haykin, Adaptive Filter Theory.

3 Course Objectives & Structure
Broad applications in communications, imaging, and sensors; emerging applications in brain-imaging techniques, brain-machine interfaces, and implantable devices.
Neurofeedback presents real-time physiological signals derived from MRI in a visual or auditory form to provide information about brain activity, and these signals are used to train the patient to alter neural activity in a desired direction. Traditionally, feedback based on EEG or other mechanisms has not focused on the brain because the resolution is not good enough.

4 Wiener Filtering
Problem Statement: Produce an estimate of a desired process that is statistically related to a set of observations.
Historical Notes: The linear filtering problem was solved by
- Andrey Kolmogorov for discrete time; his 1938 paper established the basic theorems for smoothing and predicting stationary stochastic processes
- Norbert Wiener in 1941 for continuous time; not published until the 1949 monograph Extrapolation, Interpolation, and Smoothing of Stationary Time Series

5 Wiener Filtering
System restrictions and considerations:
- The filter is linear
- The filter is discrete time
- The filter is finite impulse response (FIR)
- The processes are WSS
- Statistical optimization is employed

6 " # Wiener (MSE) Filtering Theory For the discrete time case Wiener Filtering "!! % $ The filter impulse response is finite and given by { w h k = k for k = 0, 1,, M 1 0 otherwise The output ˆd(n) is an estimate of the desired signal d(n) x(n) and d(n) are statistically related ˆd(n) and d(n) are statistically related Gonzalo R. Arce (ECE, Univ. of Delaware) ELEG-636: Statistical Signal Processing Spring / 34

7 Wiener Filtering
In convolution and vector form,
$\hat{d}(n) = \sum_{k=0}^{M-1} w_k^* x(n-k) = \mathbf{w}^H \mathbf{x}(n)$
where
$\mathbf{w} = [w_0, w_1, \ldots, w_{M-1}]^T$   [filter coefficient vector]
$\mathbf{x}(n) = [x(n), x(n-1), \ldots, x(n-M+1)]^T$   [observation vector]
The error can now be written as
$e(n) = d(n) - \hat{d}(n) = d(n) - \mathbf{w}^H \mathbf{x}(n)$
Question: Under what criterion should the error be minimized?
Selected Criterion: Mean squared error (MSE),
$J(\mathbf{w}) = E\{e(n)e^*(n)\}$   (*)
Result: The $\mathbf{w}$ that minimizes $J(\mathbf{w})$ is the optimal (Wiener) filter.
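A minimal sketch of the vector form above (illustrative; the signal and weights below are placeholders, not from the notes): build the observation vectors $\mathbf{x}(n)$ and form the estimate $\hat{d}(n) = \mathbf{w}^H \mathbf{x}(n)$.

import numpy as np

# Form d_hat(n) = sum_k conj(w_k) x(n-k) = w^H x(n) for an M-tap FIR filter.
def fir_estimate(w, x):
    M = len(w)
    # Rows of X are the observation vectors [x(n), x(n-1), ..., x(n-M+1)]
    X = np.column_stack([x[M - 1 - k : len(x) - k] for k in range(M)])
    return X @ np.conj(w)

x = np.random.default_rng(0).standard_normal(10)   # placeholder input signal
w = np.array([0.5, 0.3, -0.2])                     # placeholder weights (M = 3)
print(fir_estimate(w, x))                          # estimates for n = 2, ..., 9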

8 Wiener Filtering
Utilizing $e(n) = d(n) - \mathbf{w}^H\mathbf{x}(n)$ in (*) and expanding,
$J(\mathbf{w}) = E\{e(n)e^*(n)\}$
$= E\{(d(n) - \mathbf{w}^H\mathbf{x}(n))(d^*(n) - \mathbf{x}^H(n)\mathbf{w})\}$
$= E\{|d(n)|^2 - d(n)\mathbf{x}^H(n)\mathbf{w} - \mathbf{w}^H\mathbf{x}(n)d^*(n) + \mathbf{w}^H\mathbf{x}(n)\mathbf{x}^H(n)\mathbf{w}\}$
$= E\{|d(n)|^2\} - E\{d(n)\mathbf{x}^H(n)\}\mathbf{w} - \mathbf{w}^H E\{\mathbf{x}(n)d^*(n)\} + \mathbf{w}^H E\{\mathbf{x}(n)\mathbf{x}^H(n)\}\mathbf{w}$   (**)
Let
$\mathbf{R} = E\{\mathbf{x}(n)\mathbf{x}^H(n)\}$   [autocorrelation matrix of x(n)]
$\mathbf{p} = E\{\mathbf{x}(n)d^*(n)\}$   [cross-correlation between x(n) and d(n)]
Then (**) can be compactly expressed as
$J(\mathbf{w}) = \sigma_d^2 - \mathbf{p}^H\mathbf{w} - \mathbf{w}^H\mathbf{p} + \mathbf{w}^H\mathbf{R}\mathbf{w}$
where we have assumed x(n) and d(n) are zero mean and WSS.
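A small check of the quadratic form above (illustrative; the signals are placeholders and the statistics are replaced by sample estimates): with $\mathbf{R}$, $\mathbf{p}$, and $\sigma_d^2$ estimated from data, $J(\mathbf{w})$ agrees with the sample MSE for any candidate $\mathbf{w}$.

import numpy as np

rng = np.random.default_rng(0)
N = 100_000
x = rng.standard_normal(N)
d = 0.9 * x + 0.4 * np.roll(x, 1) + 0.1 * rng.standard_normal(N)  # toy desired signal

X = np.column_stack([x, np.roll(x, 1)])[1:]      # observation vectors [x(n), x(n-1)]
dn = d[1:]

R = X.T @ X / len(X)                             # sample autocorrelation matrix
p = X.T @ dn / len(X)                            # sample cross-correlation vector
sigma_d2 = np.mean(dn ** 2)

w = np.array([0.5, 0.1])                         # arbitrary candidate filter
J_quadratic = sigma_d2 - 2 * p @ w + w @ R @ w   # real data: p^H w + w^H p = 2 p.w
J_sample = np.mean((dn - X @ w) ** 2)            # sample MSE E{|e(n)|^2}
print(J_quadratic, J_sample)                     # essentially equal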

9 Wiener Filtering
The MSE criterion as a function of the filter weight vector $\mathbf{w}$:
$J(\mathbf{w}) = \sigma_d^2 - \mathbf{p}^H\mathbf{w} - \mathbf{w}^H\mathbf{p} + \mathbf{w}^H\mathbf{R}\mathbf{w}$
Observation: The error is a quadratic function of $\mathbf{w}$.
Consequence: The error is an M-dimensional bowl-shaped function of $\mathbf{w}$ with a unique minimum.
Result: The optimal weight vector $\mathbf{w}_0$ is determined by differentiating $J(\mathbf{w})$ and setting the result to zero,
$\nabla_{\mathbf{w}} J(\mathbf{w})\,\big|_{\mathbf{w}=\mathbf{w}_0} = 0$
A closed-form solution exists.

10 Wiener Filtering
Example: Consider a two-dimensional case, i.e., an M = 2 tap filter. Plot the error surface and error contours.
[Figures: error surface (left) and error contours (right)]
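A sketch of how such a plot can be generated (illustrative; $\mathbf{R}$, $\mathbf{p}$, and $\sigma_d^2$ below are placeholder values): evaluate the quadratic $J(\mathbf{w})$ on a grid of two-tap weight vectors and draw its contours.

import numpy as np
import matplotlib.pyplot as plt

R = np.array([[1.1, 0.5],
              [0.5, 1.1]])
p = np.array([0.5, -0.4])
sigma_d2 = 1.0

w0g, w1g = np.meshgrid(np.linspace(-2, 2, 201), np.linspace(-2, 2, 201))
W = np.stack([w0g, w1g], axis=-1)                 # grid of weight vectors
J = (sigma_d2
     - 2 * (W @ p)                                # -p^H w - w^H p (real case)
     + np.einsum("...i,ij,...j->...", W, R, W))   # w^H R w

plt.contour(w0g, w1g, J, levels=20)
plt.xlabel("w_0"); plt.ylabel("w_1"); plt.title("MSE error contours")
plt.show()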

11 Wiener Filtering
Aside (Matrix Differentiation): For complex data, $w_k = a_k + jb_k$, $k = 0, 1, \ldots, M-1$, the gradient with respect to $w_k$ is
$\nabla_k(J) = \frac{\partial J}{\partial a_k} + j\frac{\partial J}{\partial b_k}, \quad k = 0, 1, \ldots, M-1$
The complete gradient is thus given by
$\nabla_{\mathbf{w}}(J) = \left[\nabla_0(J), \nabla_1(J), \ldots, \nabla_{M-1}(J)\right]^T = \left[\frac{\partial J}{\partial a_0} + j\frac{\partial J}{\partial b_0}, \; \frac{\partial J}{\partial a_1} + j\frac{\partial J}{\partial b_1}, \; \ldots, \; \frac{\partial J}{\partial a_{M-1}} + j\frac{\partial J}{\partial b_{M-1}}\right]^T$

12 Wiener Filtering
Example: Let $\mathbf{c}$ and $\mathbf{w}$ be $M \times 1$ complex vectors. For $g = \mathbf{c}^H\mathbf{w}$, find $\nabla_{\mathbf{w}}(g)$.
Note
$g = \mathbf{c}^H\mathbf{w} = \sum_{k=0}^{M-1} c_k^* w_k = \sum_{k=0}^{M-1} c_k^*(a_k + jb_k)$
Thus
$\nabla_k(g) = \frac{\partial g}{\partial a_k} + j\frac{\partial g}{\partial b_k} = c_k^* + j(jc_k^*) = 0, \quad k = 0, 1, \ldots, M-1$
Result: For $g = \mathbf{c}^H\mathbf{w}$,
$\nabla_{\mathbf{w}}(g) = \left[\nabla_0(g), \nabla_1(g), \ldots, \nabla_{M-1}(g)\right]^T = \mathbf{0}$

13 Wiener Filtering
Example: Now suppose $g = \mathbf{w}^H\mathbf{c}$. Find $\nabla_{\mathbf{w}}(g)$.
In this case,
$g = \mathbf{w}^H\mathbf{c} = \sum_{k=0}^{M-1} w_k^* c_k = \sum_{k=0}^{M-1} c_k(a_k - jb_k)$
and
$\nabla_k(g) = \frac{\partial g}{\partial a_k} + j\frac{\partial g}{\partial b_k} = c_k + j(-jc_k) = 2c_k, \quad k = 0, 1, \ldots, M-1$
Result: For $g = \mathbf{w}^H\mathbf{c}$,
$\nabla_{\mathbf{w}}(g) = \left[\nabla_0(g), \nabla_1(g), \ldots, \nabla_{M-1}(g)\right]^T = [2c_0, 2c_1, \ldots, 2c_{M-1}]^T = 2\mathbf{c}$

14 Wiener Filtering
Example: Lastly, suppose $g = \mathbf{w}^H\mathbf{Q}\mathbf{w}$. Find $\nabla_{\mathbf{w}}(g)$.
In this case,
$g = \sum_{i=0}^{M-1}\sum_{j=0}^{M-1} w_i^* w_j q_{i,j} = \sum_{i=0}^{M-1}\sum_{j=0}^{M-1} (a_i - jb_i)(a_j + jb_j) q_{i,j}$
and
$\nabla_k(g) = \frac{\partial g}{\partial a_k} + j\frac{\partial g}{\partial b_k} = 2\sum_{j=0}^{M-1}(a_j + jb_j) q_{k,j} + 0 = 2\sum_{j=0}^{M-1} w_j q_{k,j}$

15 Wiener Filtering
Result: For $g = \mathbf{w}^H\mathbf{Q}\mathbf{w}$,
$\nabla_{\mathbf{w}}(g) = \left[\nabla_0(g), \nabla_1(g), \ldots, \nabla_{M-1}(g)\right]^T = 2\left[\sum_{i=0}^{M-1} q_{0,i}w_i, \; \sum_{i=0}^{M-1} q_{1,i}w_i, \; \ldots, \; \sum_{i=0}^{M-1} q_{M-1,i}w_i\right]^T = 2\mathbf{Q}\mathbf{w}$
Observation: The differentiation result depends on the ordering of the vectors in the product.
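A finite-difference check of the three gradient identities above (illustrative; the vectors and matrix are random placeholders), using the definition $\nabla_k = \partial/\partial a_k + j\,\partial/\partial b_k$:

import numpy as np

rng = np.random.default_rng(0)
M = 4
c = rng.standard_normal(M) + 1j * rng.standard_normal(M)
Q = rng.standard_normal((M, M)) + 1j * rng.standard_normal((M, M))
Q = Q + Q.conj().T                       # Hermitian, like a correlation matrix
w = rng.standard_normal(M) + 1j * rng.standard_normal(M)

def num_grad(g, w, eps=1e-6):
    # Central differences of g with respect to a_k (real part) and b_k (imaginary part)
    grad = np.zeros(len(w), dtype=complex)
    for k in range(len(w)):
        e = np.zeros(len(w)); e[k] = 1.0
        grad[k] = (g(w + eps * e) - g(w - eps * e)) / (2 * eps) \
                + 1j * (g(w + 1j * eps * e) - g(w - 1j * eps * e)) / (2 * eps)
    return grad

print(np.allclose(num_grad(lambda v: c.conj() @ v, w), 0, atol=1e-5))              # grad(c^H w) = 0
print(np.allclose(num_grad(lambda v: v.conj() @ c, w), 2 * c, atol=1e-5))          # grad(w^H c) = 2c
print(np.allclose(num_grad(lambda v: v.conj() @ Q @ v, w), 2 * Q @ w, atol=1e-4))  # grad(w^H Q w) = 2Qw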

16 Wiener Filtering
Returning to the MSE performance criterion,
$J(\mathbf{w}) = \sigma_d^2 - \mathbf{p}^H\mathbf{w} - \mathbf{w}^H\mathbf{p} + \mathbf{w}^H\mathbf{R}\mathbf{w}$
Approach: Minimize the error by differentiating with respect to $\mathbf{w}$ and setting the result to zero:
$\nabla_{\mathbf{w}}(J) = \mathbf{0} \;\Rightarrow\; -2\mathbf{p} + 2\mathbf{R}\mathbf{w} = \mathbf{0} \;\Rightarrow\; \mathbf{R}\mathbf{w}_0 = \mathbf{p}$   [normal equation]
Result: The Wiener filter coefficients are defined by
$\mathbf{w}_0 = \mathbf{R}^{-1}\mathbf{p}$
Question: Does $\mathbf{R}^{-1}$ always exist? Recall $\mathbf{R}$ is positive semi-definite, and usually positive definite.
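In practice the normal equation is solved directly rather than by forming $\mathbf{R}^{-1}$ explicitly. A one-line sketch (the statistics below are placeholder values, not from the notes):

import numpy as np

R = np.array([[1.0, 0.4],
              [0.4, 1.0]])      # placeholder autocorrelation matrix
p = np.array([0.6, -0.2])       # placeholder cross-correlation vector
w0 = np.linalg.solve(R, p)      # solves R w0 = p; assumes R is nonsingular
print(w0)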

17 Orthogonality Principle
Consider again the normal equation that defines the optimal solution:
$\mathbf{R}\mathbf{w}_0 = \mathbf{p} \;\Leftrightarrow\; E\{\mathbf{x}(n)\mathbf{x}^H(n)\}\mathbf{w}_0 = E\{\mathbf{x}(n)d^*(n)\}$
Rearranging,
$E\{\mathbf{x}(n)d^*(n)\} - E\{\mathbf{x}(n)\mathbf{x}^H(n)\}\mathbf{w}_0 = \mathbf{0}$
$E\{\mathbf{x}(n)[d^*(n) - \mathbf{x}^H(n)\mathbf{w}_0]\} = \mathbf{0}$
$E\{\mathbf{x}(n)e_0^*(n)\} = \mathbf{0}$
Note: $e_0(n)$ is the error when the optimal weights are used, i.e.,
$e_0^*(n) = d^*(n) - \mathbf{x}^H(n)\mathbf{w}_0$

18 Orthogonality Principle
Thus
$E\{\mathbf{x}(n)e_0^*(n)\} = E\left\{\left[x(n)e_0^*(n), \; x(n-1)e_0^*(n), \; \ldots, \; x(n-M+1)e_0^*(n)\right]^T\right\} = \mathbf{0}$
Orthogonality Principle: A necessary and sufficient condition for a filter to be optimal is that the estimation error, $e^*(n)$, be orthogonal to each input sample in $\mathbf{x}(n)$.
Interpretation: The observation samples and the error are orthogonal and carry no mutual information.
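An empirical illustration of the orthogonality principle (a sketch with placeholder real-valued signals, not from the notes): for $\mathbf{w}_0 = \mathbf{R}^{-1}\mathbf{p}$ computed from sample statistics, the sample cross-correlation between the observation vector and the optimal error is essentially zero.

import numpy as np

rng = np.random.default_rng(1)
N, M = 200_000, 3
x = rng.standard_normal(N)
d = np.convolve(x, [0.7, -0.3, 0.2], mode="same") + 0.1 * rng.standard_normal(N)  # toy desired signal

# Observation vectors x(n) = [x(n), x(n-1), ..., x(n-M+1)]^T (drop edge samples)
X = np.column_stack([np.roll(x, k) for k in range(M)])[M:]
dn = d[M:]

R = X.T @ X / len(X)
p = X.T @ dn / len(X)
w0 = np.linalg.solve(R, p)

e0 = dn - X @ w0                 # error with the optimal weights
print(X.T @ e0 / len(X))         # approximately the zero vector: E{x(n) e0(n)} = 0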

19 Minimum MSE
Objective: Determine the minimum MSE.
Approach: Use the optimal weights $\mathbf{w}_0 = \mathbf{R}^{-1}\mathbf{p}$ in the MSE expression $J(\mathbf{w}) = \sigma_d^2 - \mathbf{p}^H\mathbf{w} - \mathbf{w}^H\mathbf{p} + \mathbf{w}^H\mathbf{R}\mathbf{w}$:
$J_{\min} = \sigma_d^2 - \mathbf{p}^H\mathbf{w}_0 - \mathbf{w}_0^H\mathbf{p} + \mathbf{w}_0^H\mathbf{R}(\mathbf{R}^{-1}\mathbf{p}) = \sigma_d^2 - \mathbf{p}^H\mathbf{w}_0 - \mathbf{w}_0^H\mathbf{p} + \mathbf{w}_0^H\mathbf{p} = \sigma_d^2 - \mathbf{p}^H\mathbf{w}_0$
Result:
$J_{\min} = \sigma_d^2 - \mathbf{p}^H\mathbf{R}^{-1}\mathbf{p}$
where the substitution $\mathbf{w}_0 = \mathbf{R}^{-1}\mathbf{p}$ has been employed.

20 Excess MSE
Objective: Consider the excess MSE introduced by using a weight vector that is not optimal:
$J(\mathbf{w}) - J_{\min} = (\sigma_d^2 - \mathbf{p}^H\mathbf{w} - \mathbf{w}^H\mathbf{p} + \mathbf{w}^H\mathbf{R}\mathbf{w}) - (\sigma_d^2 - \mathbf{p}^H\mathbf{w}_0 - \mathbf{w}_0^H\mathbf{p} + \mathbf{w}_0^H\mathbf{R}\mathbf{w}_0)$
Using the fact that $\mathbf{p} = \mathbf{R}\mathbf{w}_0$ and $\mathbf{p}^H = \mathbf{w}_0^H\mathbf{R}$ yields
$J(\mathbf{w}) - J_{\min} = -\mathbf{p}^H\mathbf{w} - \mathbf{w}^H\mathbf{p} + \mathbf{w}^H\mathbf{R}\mathbf{w} + \mathbf{p}^H\mathbf{w}_0 + \mathbf{w}_0^H\mathbf{p} - \mathbf{w}_0^H\mathbf{R}\mathbf{w}_0$
$= -\mathbf{w}_0^H\mathbf{R}\mathbf{w} - \mathbf{w}^H\mathbf{R}\mathbf{w}_0 + \mathbf{w}^H\mathbf{R}\mathbf{w} + \mathbf{w}_0^H\mathbf{R}\mathbf{w}_0 + \mathbf{w}_0^H\mathbf{R}\mathbf{w}_0 - \mathbf{w}_0^H\mathbf{R}\mathbf{w}_0$
$= -\mathbf{w}_0^H\mathbf{R}\mathbf{w} - \mathbf{w}^H\mathbf{R}\mathbf{w}_0 + \mathbf{w}^H\mathbf{R}\mathbf{w} + \mathbf{w}_0^H\mathbf{R}\mathbf{w}_0$
$= (\mathbf{w} - \mathbf{w}_0)^H\mathbf{R}(\mathbf{w} - \mathbf{w}_0)$
Thus
$J(\mathbf{w}) = J_{\min} + (\mathbf{w} - \mathbf{w}_0)^H\mathbf{R}(\mathbf{w} - \mathbf{w}_0)$

21 Excess MSE
Finally, using the eigen-decomposition $\mathbf{R} = \mathbf{Q}\mathbf{\Omega}\mathbf{Q}^H$,
$J(\mathbf{w}) = J_{\min} + (\mathbf{w} - \mathbf{w}_0)^H\mathbf{Q}\mathbf{\Omega}\mathbf{Q}^H(\mathbf{w} - \mathbf{w}_0)$
or, defining the eigenvector-transformed difference
$\mathbf{v} = \mathbf{Q}^H(\mathbf{w} - \mathbf{w}_0)$   (***)
Result:
$J(\mathbf{w}) = J_{\min} + \mathbf{v}^H\mathbf{\Omega}\mathbf{v} = J_{\min} + \sum_{k=1}^{M} \lambda_k v_k v_k^* = J_{\min} + \sum_{k=1}^{M} \lambda_k |v_k|^2$
Note: (***) shows that $v_k$ is the difference $(\mathbf{w} - \mathbf{w}_0)$ projected onto eigenvector $\mathbf{q}_k$.
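A numerical confirmation of the eigen-decomposed excess-MSE expression (a sketch with placeholder statistics, real-valued case):

import numpy as np

R = np.array([[1.1, 0.5],
              [0.5, 1.1]])       # placeholder autocorrelation matrix
p = np.array([0.6, -0.2])        # placeholder cross-correlation vector
sigma_d2 = 1.0

w0 = np.linalg.solve(R, p)
J_min = sigma_d2 - p @ w0

lam, Q = np.linalg.eigh(R)       # R = Q diag(lam) Q^T
w = np.array([0.3, 0.1])         # an arbitrary, suboptimal weight vector

J_direct = sigma_d2 - 2 * p @ w + w @ R @ w       # J(w) from the quadratic form
v = Q.T @ (w - w0)                                # eigenvector-transformed difference
J_eigen = J_min + np.sum(lam * np.abs(v) ** 2)    # J_min + sum_k lambda_k |v_k|^2
print(np.isclose(J_direct, J_eigen))              # True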

22 Communications Example
Example: Consider the following system.
[Block diagram: white noise $v_1(n)$ drives $H_1(z)$ to produce the desired signal $d(n)$; $d(n)$ passes through the channel $H_2(z)$ to give $x(n)$, which is corrupted by additive noise $v_2(n)$ to form the observation $u(n)$ processed by the Wiener filter.]
Objective: Determine the optimal filter for a given system and channel.

23 Communications Example
Specific Objective: Determine the optimal order-two filter weights, $\mathbf{w}_0$, for
$H_1(z) = \dfrac{1}{1 + 0.8458z^{-1}}$   [AR process]
$H_2(z) = \dfrac{1}{1 - 0.9458z^{-1}}$   [communication channel]
where $v_1(n)$ and $v_2(n)$ are zero-mean white noise processes with $\sigma_1^2 = 0.27$ and $\sigma_2^2 = 0.1$.
Note: To determine $\mathbf{w}_0$ we need
- $\mathbf{R}_u$, the autocorrelation matrix of the received signal $u(n)$
- $\mathbf{p}$, the cross-correlation between the received signal $u(n)$ and the desired signal $d(n)$

24 Communications Example
[System block diagram repeated.]
Procedure: Consider $\mathbf{R}_u$ first.
Since $u(n) = x(n) + v_2(n)$, where $v_2(n)$ is white with $\sigma_2^2 = 0.1$ and is uncorrelated with $x(n)$,
$\mathbf{R}_u = \mathbf{R}_x + \mathbf{R}_{v_2} = \mathbf{R}_x + 0.1\,\mathbf{I}$

25 Communications Example
Next, consider the relation between $x(n)$ and $v_1(n)$, where for the given systems
$X(z) = H_1(z)H_2(z)V_1(z), \qquad H_1(z)H_2(z) = \dfrac{1}{(1 + 0.8458z^{-1})(1 - 0.9458z^{-1})}$
Converting to the time domain, we see $x(n)$ is an order-2 AR process:
$x(n) - 0.1x(n-1) - 0.8x(n-2) = v_1(n)$
$x(n) + a_1 x(n-1) + a_2 x(n-2) = v_1(n)$

26 Communications Example
Since $x(n)$ is a real-valued order-two AR process, the Yule-Walker equations are given by
$\begin{bmatrix} r(0) & r(1) \\ r^*(1) & r(0) \end{bmatrix}\begin{bmatrix} a_1 \\ a_2 \end{bmatrix} = -\begin{bmatrix} r^*(1) \\ r^*(2) \end{bmatrix}$
$\begin{bmatrix} r(0) & r(1) \\ r(1) & r(0) \end{bmatrix}\begin{bmatrix} a_1 \\ a_2 \end{bmatrix} = -\begin{bmatrix} r(1) \\ r(2) \end{bmatrix}$
Solving for the coefficients:
$a_1 = -\dfrac{r(1)[r(0) - r(2)]}{r^2(0) - r^2(1)}, \qquad a_2 = -\dfrac{r(0)r(2) - r^2(1)}{r^2(0) - r^2(1)}$
Question: What are the known and unknown terms in this system?
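A minimal sketch of solving this 2x2 system numerically (the lag values below follow from the closed-form relations on the next slides and are part of the reconstruction, so treat them as assumptions):

import numpy as np

r0, r1, r2 = 1.0, 0.5, 0.85          # r(0), r(1), r(2) for this example (assumed)

A = np.array([[r0, r1],
              [r1, r0]])
a1, a2 = np.linalg.solve(A, -np.array([r1, r2]))
print(a1, a2)                        # approx. -0.1 and -0.8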

27 Communications Example
Note: We must solve the system to obtain the unknown $r(\cdot)$ values.
Noting $r(0) = \sigma_x^2$ and rearranging to solve for $r(1)$ and $r(2)$:
$r(1) = -\dfrac{a_1}{1 + a_2}\,\sigma_x^2$   (*)
$r(2) = \left(\dfrac{a_1^2}{1 + a_2} - a_2\right)\sigma_x^2$   (**)
The Yule-Walker equations also stipulate
$\sigma_{v_1}^2 = r(0) + a_1 r(1) + a_2 r(2)$   (***)
Next, utilize the determined $r(1)$, $r(2)$ and the given $a_1$, $a_2$, $\sigma_{v_1}^2$ values to determine $r(0) = \sigma_x^2$.

28 Communications Example
Substituting (*) and (**) into (***), utilizing $r(0) = \sigma_x^2$, and rearranging:
$\sigma_{v_1}^2 = r(0) + a_1 r(1) + a_2 r(2) = \sigma_x^2\left(1 - \dfrac{a_1^2}{1 + a_2} + \dfrac{a_1^2 a_2}{1 + a_2} - a_2^2\right) = \sigma_x^2\,\dfrac{(1 - a_2)\left[(1 + a_2)^2 - a_1^2\right]}{1 + a_2}$
$\Rightarrow\quad \sigma_x^2 = \left(\dfrac{1 + a_2}{1 - a_2}\right)\dfrac{\sigma_{v_1}^2}{(1 + a_2)^2 - a_1^2}$
Note: All correlation terms have been determined since $r(0) = \sigma_x^2$.
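The closed-form chain above in a few lines (sign conventions as reconstructed here, so treat the coefficient values as assumptions): compute $\sigma_x^2 = r(0)$, then $r(1)$ and $r(2)$, and check that they reproduce $\sigma_{v_1}^2$.

# AR(2) coefficients and driving-noise variance for this example (assumed values)
a1, a2, var_v1 = -0.1, -0.8, 0.27

var_x = ((1 + a2) / (1 - a2)) * var_v1 / ((1 + a2) ** 2 - a1 ** 2)
r0 = var_x
r1 = -a1 / (1 + a2) * var_x
r2 = (a1 ** 2 / (1 + a2) - a2) * var_x
print(r0, r1, r2)                    # 1.0, 0.5, 0.85
print(r0 + a1 * r1 + a2 * r2)        # recovers var_v1 = 0.27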

29 Communications Example
Using $a_1 = -0.1$, $a_2 = -0.8$, and $\sigma_{v_1}^2 = 0.27$,
$\sigma_x^2 = \left(\dfrac{1 + a_2}{1 - a_2}\right)\dfrac{\sigma_{v_1}^2}{(1 + a_2)^2 - a_1^2} = \left(\dfrac{0.2}{1.8}\right)\dfrac{0.27}{0.04 - 0.01} = 1$
Thus $r(0) = 1$. Similarly,
$r(1) = -\dfrac{a_1}{1 + a_2}\,\sigma_x^2 = 0.5$
and finally, $\mathbf{R}_x$ is
$\mathbf{R}_x = \begin{bmatrix} 1 & 0.5 \\ 0.5 & 1 \end{bmatrix}$

30 Communications Example
Recall the overall system [block diagram repeated].
Putting the pieces of $\mathbf{R}_u$ together:
$\mathbf{R}_u = \mathbf{R}_x + \mathbf{R}_{v_2} = \begin{bmatrix} 1 & 0.5 \\ 0.5 & 1 \end{bmatrix} + \begin{bmatrix} 0.1 & 0 \\ 0 & 0.1 \end{bmatrix} = \begin{bmatrix} 1.1 & 0.5 \\ 0.5 & 1.1 \end{bmatrix}$

31 Communications Example
Recall: The Wiener solution is given by $\mathbf{w}_0 = \mathbf{R}^{-1}\mathbf{p}$.
Note: $\mathbf{R}$ is known, but $\mathbf{p}$ is still to be determined:
$\mathbf{p} = E\left\{\begin{bmatrix} u(n)d(n) \\ u(n-1)d(n) \end{bmatrix}\right\}$
Recall
$X(z) = H_2(z)D(z) = \dfrac{D(z)}{1 - 0.9458z^{-1}}$
or, in the time domain,
$x(n) - 0.9458\,x(n-1) = d(n)$
Lastly, the observation is corrupted by additive noise:
$u(n) = x(n) + v_2(n)$

32 Communications Example
Thus
$E\{u(n)d(n)\} = E\{[x(n) + v_2(n)][x(n) - 0.9458\,x(n-1)]\}$
$= E\{x^2(n)\} + E\{x(n)v_2(n)\} - 0.9458\,E\{x(n)x(n-1)\} - 0.9458\,E\{v_2(n)x(n-1)\}$
$= \sigma_x^2 - 0.9458\,r(1) + 0 - 0 = 1 - 0.9458(0.5) = 0.5272$
Similarly,
$E\{u(n-1)d(n)\} = E\{[x(n-1) + v_2(n-1)][x(n) - 0.9458\,x(n-1)]\}$
$= r(1) - 0.9458\,r(0) = 0.5 - 0.9458 = -0.4458$

33 Communications Example
Thus
$\mathbf{p} = \begin{bmatrix} 0.5272 \\ -0.4458 \end{bmatrix}$
Final Solution:
$\mathbf{w}_0 = \mathbf{R}^{-1}\mathbf{p} = \begin{bmatrix} 1.1 & 0.5 \\ 0.5 & 1.1 \end{bmatrix}^{-1}\begin{bmatrix} 0.5272 \\ -0.4458 \end{bmatrix} \approx \begin{bmatrix} 0.836 \\ -0.785 \end{bmatrix}$
The optimal filter weights are a function of the source signal statistics and the communications channel.
Question: How do we optimize a filter when the statistics are not known in closed form or a priori?
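An end-to-end sketch of this example (the system coefficients are the reconstructed values above, so treat them as assumptions): simulate $v_1 \to H_1 \to d \to H_2 \to x$, add $v_2$, and compare the closed-form Wiener solution with one computed from sample statistics.

import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(2)
N = 500_000
v1 = np.sqrt(0.27) * rng.standard_normal(N)
v2 = np.sqrt(0.10) * rng.standard_normal(N)

d = lfilter([1.0], [1.0, 0.8458], v1)     # H1(z) = 1/(1 + 0.8458 z^-1), desired signal
x = lfilter([1.0], [1.0, -0.9458], d)     # H2(z) = 1/(1 - 0.9458 z^-1), channel output
u = x + v2                                # noisy observation

# Closed-form solution from the derived statistics
R = np.array([[1.1, 0.5], [0.5, 1.1]])
p = np.array([0.5272, -0.4458])
print("closed form :", np.linalg.solve(R, p))

# Solution from sample statistics of the simulated signals
U = np.column_stack([u[1:], u[:-1]])      # observation vectors [u(n), u(n-1)]
dn = d[1:]
R_hat = U.T @ U / len(U)
p_hat = U.T @ dn / len(U)
print("from samples:", np.linalg.solve(R_hat, p_hat))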
