Fourier Analysis: Linear Transformations and Filters. 3. Fourier Analysis. Alex Sheremet. April 11, 2007


Stochastic processes review. Data Analysis Techniques in Oceanography, OCP668, April 27.

Definition. A stochastic process X(t, ζ) is a function of two variables:
- Fixed ζ = ζ₀: a function of time, X(t) = X(t, ζ₀) (one realization);
- Fixed t = t₀: a random variable, X(ζ) = X(t₀, ζ).

Stochastic processes review: summary. Names and definitions given for two RVs and for two processes X(t), Y(t). To obtain the definitions of the auto-moments, set Y = X in the second column.

Two RVs X, Y:
- Correlation (inner product): R = E{XY*}
- Covariance: C = E{(X − μ_X)(Y − μ_Y)*}
- Variance: σ² = E{|X − μ_X|²} = C_XX
- Correlation coefficient: ρ = C_XY/(σ_X σ_Y)

Two processes X(t), Y(t) (RVs X(t₁), Y(t₂)):
- Cross- (auto-) correlation: R_XY(t₁, t₂) = E{X(t₁) Y*(t₂)}
- Cross- (auto-) covariance: C_XY(t₁, t₂) = E{(X(t₁) − μ_X(t₁)) (Y(t₂) − μ_Y(t₂))*}
- Variance: σ²(t) = E{|X(t) − μ(t)|²}
- Correlation coefficient: ρ_XY(t₁, t₂) = C_XY(t₁, t₂)/(σ_X(t₁) σ_Y(t₂))

The auto-correlation R(t₁, t₂) has a very specific structure (likewise the auto-covariance and the correlation coefficient); Δt = t₂ − t₁ is the time lag.

Stochastic processes review: harmonic series. Definitions:
- Harmonic process: H(t) = A cos(ωt + T); A, T mutually uncorrelated, T uniformly distributed in [−π, π].
- Harmonic series: H(t) = Σ_{n=1}^N A_n cos(ω_n t + T_n).
- Mean: μ_H = 0.
- Auto-correlation (= auto-covariance): R_HH(t₁, t₂) = Σ_{n=1}^N ½ E{A_n²} cos(ω_n(t₁ − t₂)), i.e. R_HH(Δt) = Σ_{n=1}^N ½ E{A_n²} cos(ω_n Δt). They depend only on the lag Δt!
- Variance / average power: σ_H² = R_HH(0) = Σ_{n=1}^N ½ E{A_n²} = Σ_{n=1}^N σ_n².

2nd-order stationary processes.
Definition. A process X(t) is 2nd-order stationary if:
- Mean: μ = E{X(t)} = constant, independent of t;
- Auto-correlation: R(t₁, t₂) = R(t₂ − t₁) = R(Δt), independent of t;
- Average power: R(0) = E{|X(t)|²} = constant, independent of t;
- Auto-covariance: C(Δt) = R(Δt) − μ² depends only on the time lag;
- Correlation coefficient: ρ(Δt) = C(Δt)/C(0) depends only on the time lag.
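The variance formula σ_H² = Σ_n E{A_n²}/2 can be checked by ensemble simulation; this is an illustrative sketch (NumPy; the fixed amplitudes and frequencies are choices made for the example, not taken from the text):

```python
import numpy as np

# Ensemble check: for the harmonic series H(t) = sum_n A_n cos(w_n t + T_n)
# with uniform random phases T_n, the variance is sum_n E{A_n^2}/2,
# independent of the time t at which it is evaluated.
rng = np.random.default_rng(0)
amps = np.array([1.0, 0.5])                    # fixed amplitudes A_n (example values)
omegas = np.array([0.6, 1.7])                  # radian frequencies w_n (example values)

t0 = 3.0                                       # any fixed time
phases = rng.uniform(-np.pi, np.pi, size=(20000, 2))
H = (amps * np.cos(omegas * t0 + phases)).sum(axis=1)

var_est = H.var()
var_theory = 0.5 * (amps**2).sum()             # sigma_H^2 = sum_n E{A_n^2}/2 = 0.625
print(abs(var_est - var_theory) < 0.05)        # True
```

Changing t0 leaves the estimate unchanged, which is exactly the 2nd-order stationarity claimed above.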

Stochastic processes review: harmonic analysis. Harmonic series processes are 2nd-order stationary. Can any 2nd-order stationary process be represented as a harmonic series process? If not, then maybe: can some 2nd-order stationary processes be represented as harmonic series processes? What kind? Can the representation be modified to an integral instead of a sum? Would such a representation be unique?
- Represent = approximate arbitrarily closely (e.g. in MS distance) a given process.
- Series = a sum with an infinite number of terms; integral = a limiting process applied to a sum. What would that mean for stochastic processes?

Basic idea. Functions here are deterministic, and periodic.
- Real representation: X(t) = a₀/2 + Σ_{n=1}^∞ (a_n cos nt + b_n sin nt), with a_n, b_n ∈ R.
- Complex representation: X(t) = Σ_{n=−∞}^∞ c_n exp(int).
What is so special about (sin nt, cos nt), or exp(int)? Is any 2π-periodic function a series of sinusoids? What conditions does X(t) have to satisfy?

Orthogonality of sin, cos, exp. It might be possible, because:
∫_{−π}^{π} cos nt cos mt dt = 0 for m ≠ n,
∫_{−π}^{π} sin nt sin mt dt = 0 for m ≠ n,
∫_{−π}^{π} cos nt sin mt dt = 0 for all m, n,
and, in complex form,
∫_{−π}^{π} exp(int) exp(−imt) dt = 2π δ_nm = { 0 for n ≠ m; 2π for n = m }.
This is a scalar product! Maybe there is a vector space; then sin, cos, exp might be a basis.

Fourier-Euler formulae.
Real Fourier series: if X(t) is a deterministic, 2π-periodic function, seek a representation
X(t) = a₀/2 + Σ_{n=1}^∞ (a_n cos nt + b_n sin nt), with a_n, b_n ∈ R,
a_n = (1/π) ∫_{−π}^{π} X(t) cos nt dt,  b_n = (1/π) ∫_{−π}^{π} X(t) sin nt dt.
Complex Fourier series:
X(t) = Σ_{n=−∞}^∞ c_n exp(int), with c_n ∈ C and c_{−n} = c_n*,
c_n = (1/2π) ∫_{−π}^{π} X(t) exp(−int) dt.
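The complex Fourier-Euler formula can be verified numerically on a simple test function; X(t) = cos t and the grid size below are choices made for this sketch:

```python
import numpy as np

# Numerical check of c_n = (1/2pi) * integral_{-pi}^{pi} X(t) exp(-i n t) dt
# for X(t) = cos t, whose only nonzero coefficients are c_{+1} = c_{-1} = 1/2.
N = 20000
t = np.linspace(-np.pi, np.pi, N, endpoint=False)   # uniform grid over one period
dt = 2 * np.pi / N
X = np.cos(t)

def c(n):
    # Riemann sum over a full period (accurate for periodic integrands)
    return np.sum(X * np.exp(-1j * n * t)) * dt / (2 * np.pi)

print(abs(c(1) - 0.5) < 1e-9)   # True
print(abs(c(2)) < 1e-9)         # True
```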

The vector space: class L²(−π, π). To admit a Fourier series representation, X must satisfy ∫_{−π}^{π} |X(t)|² dt < ∞. The space is called the L²(−π, π) class.

Definitions.
1. L²(−π, π) = ∞-dimensional vector space + inner product = Hilbert space.
2. If X(t) and Y(t) are two functions of class L²(−π, π), the inner product is
⟨X, Y⟩ = (1/2π) ∫_{−π}^{π} X(t) Y*(t) dt.
3. The norm in L²(−π, π):
‖X‖² = ⟨X, X⟩ = (1/2π) ∫_{−π}^{π} |X(t)|² dt.

Class L²(−π, π) as a vector space.
- Functions X(t) of class L²(−π, π) are vectors;
- the set of vectors {U_n} = {exp(int)} is a basis;
- the unique (Fourier) representation = decomposition on the basis: X = Σ_{n=−∞}^∞ c_n U_n;
- the c_n are the components (projections) of X in this basis;
- length of a vector (norm): for X = Σ x_n U_n and Y = Σ y_n U_n,
‖X‖² = Σ_{n=−∞}^∞ |x_n|² ‖U_n‖² and ⟨X, Y⟩ = Σ_{n=−∞}^∞ x_n y_n* ‖U_n‖².
- If the basis is orthonormal (‖U_n‖ = 1), then ⟨X, Y⟩ = Σ x_n y_n* and ‖X‖² = Σ |x_n|².

General periodicity. To generalize from 2π-periodic functions to T-periodic, T ∈ R: change variables s = (2π/T) t, t = (T/2π) s. Define the frequency f_n = n/T and the radian frequency ω_n = 2πn/T.

General-periodicity Fourier-Euler relations:
X(t) = Σ_{n=−∞}^∞ c_n exp(2πi f_n t),
c_n = (1/T) ∫_{−T/2}^{T/2} X(t) exp(−2πi f_n t) dt.
Inner product:
⟨X, Y⟩ = (1/T) ∫_{−T/2}^{T/2} X(t) Y*(t) dt.

Properties of Fourier coefficients. Let X = Σ x_n U_n, Y = Σ y_n U_n, Z = Σ z_n U_n, with U_n = exp(iω_n t) and ω_n = 2πn/T.
- Linearity: if Z(t) = aX(t) + bY(t), then z_n = a x_n + b y_n. If X and Y have Fourier representations, so does Z.
- Even function: X is even ⇔ the c_n are all real, c_{−n} = c_n.
- Odd function: X is odd ⇔ the c_n are all imaginary.

Parseval relation. If X = Σ_n x_n U_n (X is of class L², U_n = exp(2πi f_n t)), then
‖X‖² = Σ_{n=−∞}^∞ |x_n|².

Remember stochastic processes?
- Total power of a 2nd-order stationary process X(t): σ_X² = R(0) = E{|X(t)|²}.
- Total power of a harmonic process H(t): σ_H² = R(0) = E{|H(t)|²} = Σ σ_n².
- Calculate the mean as a time average: R(0) = (1/(b−a)) ∫_a^b |X(t)|² dt.

Spectral analysis of periodic functions.
- Total energy: ∫_{−T/2}^{T/2} |X(t)|² dt = T Σ_{n=−∞}^∞ |c_n|².
- Total power = total energy / period = (1/T) ∫_{−T/2}^{T/2} |X(t)|² dt = Σ_{n=−∞}^∞ |c_n|².
- Single-frequency oscillation X(t) = c_n exp(2πi f_n t), with f_n = n/T: total power = |c_n|².
- Spectral representation: Fourier-Euler relations for the (periodic) function X.
- Power spectrum (Parseval): total power = sum of the individual harmonic powers.
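The Parseval relation has an exact discrete analogue that is easy to check with the DFT; the test signal below is an arbitrary choice for illustration:

```python
import numpy as np

# Discrete Parseval check: for a T-periodic signal sampled at N points,
# the mean power (1/T) * integral |X|^2 dt equals sum_n |c_n|^2, where
# c_n = fft(x)/N are the discrete Fourier coefficients.
N = 1024
t = np.arange(N) / N                         # one period, T = 1
x = 1.5 * np.cos(2*np.pi*3*t) + 0.7 * np.sin(2*np.pi*10*t)

c = np.fft.fft(x) / N                        # discrete Fourier coefficients
power_time = np.mean(np.abs(x)**2)           # total power, time domain
power_freq = np.sum(np.abs(c)**2)            # total power, frequency domain
print(np.allclose(power_time, power_freq))   # True
```

For this signal the power splits as (1.5²)/2 + (0.7²)/2 between the two harmonics, consistent with the harmonic-process variance formula above.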

Example: ramp function. X(t) = t, with t ∈ [−T/2, T/2]. Since X(−t) = −X(t), X is odd:
c_n = (1/T) ∫_{−T/2}^{T/2} t exp(−iω_n t) dt,
c_n = (−1)ⁿ i/(2πf_n);  |c_n| = 1/(2π|f_n|);  i = exp(iπ/2).
[Figure: the ramp X(t); the magnitudes |c_n| and the phases arg(c_n) (in units of π rad) versus n.]

Fourier integrals: basic idea. X(t) is not periodic. Define a T-periodic X_T(t) ≡ X(t) for t ∈ [−T/2, T/2]; use the previous theory for X_T:
X_T(t) = Σ_{n=−∞}^∞ c_n exp(2πi f_n t),
c_n = (1/T) ∫_{−T/2}^{T/2} X(t) exp(−2πi f_n t) dt.
Substitute c_n and define Δf = f_{n+1} − f_n = 1/T:
X(t) = Σ_{n=−∞}^∞ [ ∫_{−T/2}^{T/2} X(t') exp(−2πi f_n t') dt' ] exp(2πi f_n t) Δf.
Let T → ∞, Δf = 1/T → 0, and Σ → ∫:
X(t) = ∫ [ ∫ X(t') exp(−2πi f t') dt' ] exp(2πi f t) df.
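The ramp coefficients c_n = (−1)ⁿ i/(2πf_n) can be confirmed by numerical integration; T = 2 and the grid size are choices made for this sketch:

```python
import numpy as np

# Numerical check of the ramp-function coefficients
# c_n = (-1)^n * i / (2*pi*f_n), f_n = n/T, for X(t) = t on [-T/2, T/2].
T = 2.0
t, dt = np.linspace(-T/2, T/2, 40001, retstep=True)

def c(n):
    g = t * np.exp(-2j * np.pi * (n / T) * t)
    # trapezoid rule over one period, divided by T
    return dt * (g.sum() - 0.5 * (g[0] + g[-1])) / T

for n in (1, 2, 3):
    analytic = (-1)**n * 1j / (2 * np.pi * n / T)
    print(abs(c(n) - analytic) < 1e-6)   # True for each n
```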

Fourier pair. The Fourier transform pair:
X̂(f) = ∫ X(t) exp(−2πift) dt  (direct FT),
X(t) = ∫ X̂(f) exp(2πift) df  (inverse FT).
- X̂(f) is the Fourier transform of X(t); X̂(f) and X(t) are called a Fourier pair.
- Fourier transform operator: F[·] = ∫ [·] exp(−2πift) dt.
- Inverse Fourier transform operator: F⁻¹[·] = ∫ [·] exp(2πift) df.
- X̂(f) = F[X(t)], X(t) = F⁻¹[X̂(f)]: the dual nature of the FT.

Existence of the Fourier transform pair. Existence of a Fourier pair X(t), X̂(f) is more problematic than for Fourier series:
1. If X(t) is such that ∫ |X(t)| dt < ∞, it has a FT X̂(f).
2. The inverse of X̂(f) exists if additional conditions of good behavior are imposed on X(t) (e.g. X(t) decays fast enough: X(t) → 0 as t → ±∞).
[Figure: an example of a function X(t) decaying as t → ±∞.]

Existence of the Fourier transform pair (continued). However:
exp(2πif₀t) = ∫ δ(f − f₀) exp(2πift) df,
so δ(f − f₀) and exp(2πif₀t) are like a Fourier pair. Generalization of the Fourier integral concept → generalized functions (also called distributions).

Properties of the FT (1). Let X(t) and X̂(f) = F[X(t)] be a FT pair.
- Linearity: F[aX + bY] = aF[X] + bF[Y].
- Shift 1: F[X(t − a)] = X̂(f) exp(−2πifa).
- Shift 2: F[X(t) exp(2πif₀t)] = X̂(f − f₀). (Duality: a shift in the argument ↔ an oscillation of the transform.)
- Scale: F[X(at)] = (1/|a|) X̂(f/a). Consequence: F[X(−t)] = X̂(−f). (Duality: stretching in t ↔ compression in f.)

Properties of the FT (2).
- Derivative 1: dⁿX̂(f)/dfⁿ = (−2πi)ⁿ F[tⁿ X(t)].
- Derivative 2: F[dⁿX(t)/dtⁿ] = (2πif)ⁿ X̂(f).
- Double F: F[F[X(t)]] = X(−t). Duality: F[X(t)] = X̂(f), and F[X̂(f)] = X(−t).

Properties of the FT (3). Symmetry of the function and its FT:
X(t) real, even → X̂(f) real, even;
X(t) real, odd → X̂(f) imaginary, odd;
X(t) imaginary, even → X̂(f) imaginary, even;
X(t) imaginary, odd → X̂(f) real, odd.

Examples: rectangular pulse.
X(t) = 1 for |t| ≤ a, 0 otherwise;
X̂(f) = 2a sin(2πfa)/(2πfa) = 2a sinc(2πfa).
[Figure: (a) the rectangular pulse; (b) its transform.]

Examples: Gaussian. The Gaussian is invariant under the FT:
F[exp(−πt²)] = exp(−πf²).
[Figure: symmetry examples based on the Gaussian (real and imaginary parts): real, even → real, even; real, odd → imaginary, odd; imaginary, even → imaginary, even; imaginary, odd → real, odd.]
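The FT-invariance of the Gaussian can be checked by direct numerical integration; the grid limits and spacing below are illustration choices:

```python
import numpy as np

# Check that the Gaussian exp(-pi t^2) is invariant under the FT:
# Xhat(f) = integral X(t) exp(-2*pi*i f t) dt = exp(-pi f^2).
t, dt = np.linspace(-10, 10, 8001, retstep=True)
X = np.exp(-np.pi * t**2)          # decays to ~0 well inside the grid

def ft(f):
    # Riemann sum; endpoint terms are negligible for this decaying integrand
    return np.sum(X * np.exp(-2j * np.pi * f * t)) * dt

for f in (0.0, 0.5, 1.0):
    print(abs(ft(f) - np.exp(-np.pi * f**2)) < 1e-9)   # True
```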

Examples: shifted Gaussian. Think about a wave group. The IFT of a Gaussian centered at f₀, of width σ, is
X(t) = A exp(−2π²σ²t²) exp(2πif₀t).
[Figure: (a), (c) Gaussian spectra of two different widths centered at f₀; (b), (d) the corresponding wave groups.]
The length σ_t of the group is inversely proportional to the spectral width σ_f: σ_t ∝ 1/σ_f.

Convolution and correlation. Definitions:
1. Convolution product: Z(t) = (X ∗ Y)(t) = ∫ X(ξ) Y(t − ξ) dξ.
2. Correlation product: Z(t) = (X ⋆ Y)(t) = ∫ X(ξ) Y(t + ξ) dξ.
Theorem.
1. Convolution: a) F[X ∗ Y] = F[X] F[Y]; b) F[X Y] = F[X] ∗ F[Y].
2. Correlation: a) F[X ⋆ Y] = (F[X])* F[Y]; b) F[X ⋆ X] = |F[X]|².
2b) is a significant result!
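The convolution theorem, part 1a), can be verified in its discrete form, where circular convolution and the DFT play the roles of ∗ and F; the random test vectors are arbitrary:

```python
import numpy as np

# Discrete check of F[X * Y] = F[X] F[Y]: compute the circular
# convolution once via the DFT and once directly, and compare.
rng = np.random.default_rng(1)
N = 256
x = rng.standard_normal(N)
y = rng.standard_normal(N)

# via the theorem: inverse DFT of the product of the DFTs
conv = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

# direct circular convolution: (x*y)[k] = sum_m x[m] y[(k-m) mod N]
direct = np.array([np.sum(x * np.roll(y[::-1], k + 1)) for k in range(N)])

print(np.allclose(conv, direct))   # True
```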

Convolution properties. Another way to write it:
(X ∗ Y)(t) = ∬ X(ξ₁) Y(ξ₂) δ(t − ξ₁ − ξ₂) dξ₁ dξ₂.
Product-like properties:
1. Commutativity: X ∗ Y = Y ∗ X;
2. Associativity: X ∗ (Y ∗ Z) = (X ∗ Y) ∗ Z;
3. Distributivity (over addition): X ∗ (Y + Z) = (X ∗ Y) + (X ∗ Z);
4. Multiplication by a constant a: a(X ∗ Y) = (aX) ∗ Y = X ∗ (aY);
5. Derivative: d(X ∗ Y)/dt = (dX/dt) ∗ Y = X ∗ (dY/dt);
6. Integral: ∫ (X ∗ Y) dt = (∫ X(t) dt)(∫ Y(t) dt).

Parseval relation, power spectrum.
Parseval relation: ∫ |X(t)|² dt = ∫ |X̂(f)|² df.
- Total energy: ∫ |X(t)|² dt;
- total energy in the interval [f, f + df]: |X̂(f)|² df;
- energy spectrum: |X̂(f)|².
Fourier transform = superposition of harmonics f, each with energy |X̂(f)|² df.

Correlation and power spectrum (deterministic functions).
For a stochastic process, the auto-correlation is defined as R(Δt) = E{X(t) X(t + Δt)}. For a deterministic function, replace E{·} by ∫ [·] dt: the auto-correlation is
R(Δt) = ∫ X(t) X(t + Δt) dt.
The Fourier transform of the auto-correlation is the energy spectrum:
F[R(τ)] = ∫ R(τ) e^{−2πifτ} dτ = |X̂(f)|².

Uncertainty principle. Question: can a function be well localized in time and well localized in frequency?
- Transform of a sinusoid: F[exp(2πif₀t)] = δ(f − f₀).
- Transform of a Gaussian: F[(A/√(2πσ_t²)) exp(−t²/(2σ_t²))] = A exp(−f²/(2σ_f²)), with σ_t = 1/(2πσ_f).
One cannot reduce σ_f and σ_t simultaneously (σ_f σ_t = 1/(2π)).

Heisenberg theorem. Define the mean spread σ_t for X (similar construction for σ_f, using X̂):
1. Assume that ‖X‖² = ∫ |X(t)|² dt < ∞;
2. F_T(t) = |X(t)|²/‖X‖², with ∫ F_T(t) dt = 1: the PDF of a time RV T;
3. Following the definitions, μ_T = ∫ t F_T(t) dt, σ_T² = ∫ (t − μ_T)² F_T(t) dt.
Heisenberg uncertainty principle:
σ_t² σ_f² ≥ 1/(2π)²  (equality for the Gaussian).
No pair X, X̂ is such that both vanish outside a finite interval.

(Review) Fourier series and transform:
- Conditions: T-periodic, ∫_{−T/2}^{T/2} |X(t)|² dt < ∞ (series); ∫ |X(t)| dt < ∞, decays as t → ±∞ (transform).
- Synthesis: X(t) = Σ_{n=−∞}^∞ c_n e^{2πi(n/T)t} (series); X(t) = ∫ X̂(f) e^{2πift} df (transform).
- Analysis: c_n = (1/T) ∫_{−T/2}^{T/2} X(t) e^{−2πi(n/T)t} dt; X̂(f) = ∫ X(t) e^{−2πift} dt.
- Energy spectrum: |c_n|²; |X̂(f)|².
- Total energy: T Σ_{n=−∞}^∞ |c_n|²; ∫ |X̂(f)|² df.

(Review) Fourier transform properties:
- Convolution: (X ∗ Y)(t) = ∫ X(ξ) Y(t − ξ) dξ; F[X ∗ Y] = X̂ Ŷ.
- Correlation: (X ⋆ Y)(t) = ∫ X(ξ) Y(ξ + t) dξ; F[X ⋆ Y] = X̂* Ŷ.
- Auto-correlation / energy spectrum: F[X ⋆ X] = |X̂|².
- Heisenberg uncertainty: σ_t² σ_f² ≥ 1/(2π)², with μ_T = ∫ t F_T(t) dt, σ_T² = ∫ (t − μ_T)² F_T(t) dt.

Spectral analysis of stochastic processes.
Fixed ζ = ζ₀: a function X(t) = X(t, ζ₀). Fixed t = t₀: a random variable X(ζ) = X(t₀, ζ).

Two approaches to derive the spectrum of a process.
Goal: estimate the spectrum of a stochastic process X(t, ζ). Means: the Fourier theory developed for deterministic functions.
- Approach 1 (simple): process = family of realizations; realization = function X(t). Apply Fourier theory to a realization; average over all realizations. Physical meaning of the spectrum?
- Approach 2 (rigorous): represent the process X(t) as X(t) = Σ_α H_α, where H_α = harmonic process A_α e^{2πif_α t}. Spectrum of X(t): σ² = Σ_α σ_α².
X(t) should be 2nd-order stationary (as harmonic processes are): constant mean and variance; auto-correlation R(t₁, t₂) = E{X(t₁) X(t₂)} = R(t₂ − t₁).

Approach 1, simple (1).
1. Treat one realization as a deterministic function X(t);
2. Calculate its spectrum using the Fourier theory above.
However, if the process X(t) is 2nd-order stationary, one realization
1. does not have a Fourier series representation (X is not periodic), and
2. does not have a Fourier integral representation, since ∫ |X(t)| dt = ∞.
Try to circumvent this difficulty by using a limit process.

Approach 1, simple (2). Limit process:
a) window the function X(t);
b) define the spectrum of the windowed function;
c) let the window width go to ∞.
a) The windowed version X_T of X(t),
X_T(t) = X(t) for −T/2 ≤ t ≤ T/2, 0 otherwise,
has a Fourier transform
X̂_T(f) = ∫ X_T(t) exp(−2πift) dt = ∫_{−T/2}^{T/2} X(t) exp(−2πift) dt.
[Figure: a realization and its windowed version.]

Approach 1, simple (3). b) Apply the Parseval relation to X_T:
- total energy: ∫_{−T/2}^{T/2} |X_T(t)|² dt = ∫ |X̂_T(f)|² df;
- energy in [f, f + Δf]: |X̂_T(f)|² Δf;
- energy spectrum: |X̂_T(f)|².
But the energy of X is infinite: if T → ∞, both the total energy and the spectrum diverge.

Approach 1, simple (4). However, X is 2nd-order stationary (its variance is constant in time), so energy (and spectrum) grow ∝ window length T. Normalizing by T should keep the values bounded. Energy/time = power, so:
- total power: (1/T) ∫_{−T/2}^{T/2} |X_T(t)|² dt = ∫ (1/T) |X̂_T(f)|² df;
- power in [f, f + Δf]: (1/T) |X̂_T(f)|² Δf.
If lim_{T→∞} (1/T) |X̂_T(f)|² < ∞, then the power of X(t) in [f, f + Δf] is lim_{T→∞} (1/T) |X̂_T(f)|² Δf.
Note: this is a time average!

Approach 1, simple (5). c) Derive the power spectrum of the process by averaging over the ensemble.
Definition. The power spectrum (power spectral density) of the process X(t) is
h(f) = average power in [f, f + df] = lim_{T→∞} (1/T) E{|X̂_T(f)|²}.

Power spectrum ↔ FT of the auto-correlation.
Theorem. If X(t) is a 2nd-order stationary process with auto-covariance R(τ), then h(f) exists for all f, and
h(f) = ∫ R(τ) e^{−2πifτ} dτ,  R(τ) = ∫ h(f) e^{2πifτ} df.
The auto-correlation of a deterministic function, (X ⋆ X)(τ) = ∫ X(t) X(t + τ) dt, whose FT is |X̂(f)|², is replaced for a stochastic process by
R(τ) = E{X(t) X(t + τ)}, whose FT is h(f) = lim_{T→∞} (1/T) E{|X̂_T(f)|²}.
This is equivalent to replacing the ensemble average by a time average:
E{F(t)} ↔ lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} F(t) dt.
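The definition h(f) = lim (1/T) E{|X̂_T(f)|²} suggests an ensemble-averaged periodogram estimate. The sketch below applies it to a single-frequency harmonic process (the frequency and sample counts are illustration choices) and checks that the estimated power concentrates at f₀:

```python
import numpy as np

# Ensemble-averaged periodogram (1/T)|X_T(f)|^2 of a harmonic process
# X(t) = cos(2*pi*f0*t + phase), phase uniform on [-pi, pi].
rng = np.random.default_rng(2)
N, dt, f0 = 1024, 1.0, 0.1
freqs = np.fft.rfftfreq(N, dt)

pgram = np.zeros(len(freqs))
for _ in range(200):
    phase = rng.uniform(-np.pi, np.pi)
    x = np.cos(2 * np.pi * f0 * np.arange(N) * dt + phase)
    pgram += np.abs(np.fft.rfft(x) * dt)**2 / (N * dt)   # |X_T(f)|^2 / T
pgram /= 200

print(abs(freqs[np.argmax(pgram)] - f0) < 0.005)   # True: peak sits at f0
```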

Approach 1: summary. Steps to build the power spectrum of a stochastic process:
1. Take one realization X(t) (no FS/FT representation);
2. Build the truncated version X_T(t) (has a FT representation);
3. Define for X_T(t) the power spectrum (1/T) |X̂_T(f)|², and let T → ∞;
4. Average over the ensemble: h(f) = E{lim_{T→∞} (1/T) |X̂_T(f)|²}.
Meaning: X_T(t) = sum of harmonic oscillations f of power (1/T) |X̂_T(f)|²; for the stochastic process, X(t) = sum of harmonic processes of power h(f). We built this as a limiting process. But is there such a decomposition for a given process X(t)?

Approach 2, rigorous.
Theorem. Let X(t) be a 2nd-order stationary process. There exists an orthogonal process Z(f) such that
X(t) = ∫ e^{2πift} dZ(f),
with the properties:
1. E[dZ(f)] = 0;
2. E[|dZ(f)|²] = dH(f), with dH(f) = power contributed by [f, f + df];
3. for any f₁ ≠ f₂, E[dZ(f₁) dZ*(f₂)] = 0.
(e.g. Priestley, M.B., Spectral Analysis and Time Series.) The integral is a Stieltjes integral; convergence is in the MS sense. Continuous spectrum: dH(f) = H'(f) df = h(f) df, where h(f) = power spectrum.
Any 2nd-order stationary process = sum of harmonic processes with uncorrelated amplitudes and phases.

Example 1: white noise. For the white-noise process, C(τ) = q δ(τ), with q ≥ 0. Power spectrum:
h(f) = F[C] = q ∫ δ(τ) exp(−2πifτ) dτ = q.
All frequencies have equal power.

Example 2: harmonic process. Definition: H(t) = Σ_{n=1}^N A_n cos(ω_n t + T_n); A_n and T_n mutually uncorrelated, T_n uniformly distributed in the interval [−π, π].
Auto-covariance: C(τ) = Σ_n ½ E{A_n²} cos(2πf_n τ);
power spectral density: h(f) = Σ_n ¼ E{A_n²} [δ(f + f_n) + δ(f − f_n)];
one-sided spectrum: h(f) = Σ_n ½ E{A_n²} δ(f − f_n), with f_n ≥ 0.
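Example 1 can be illustrated numerically: the ensemble-averaged periodogram of discrete white noise with variance q is approximately flat at level q (the sizes below are illustration choices):

```python
import numpy as np

# The averaged periodogram of zero-mean white noise with variance q
# is flat: h(f) ~ q at every frequency.
rng = np.random.default_rng(3)
q, N, reps = 2.0, 512, 4000

pgram = np.zeros(N)
for _ in range(reps):
    x = rng.normal(0.0, np.sqrt(q), N)
    pgram += np.abs(np.fft.fft(x))**2 / N   # periodogram of one realization
pgram /= reps

print(abs(pgram.mean() - q) < 0.1)   # True: level ~ q
print(pgram.std() < 0.2)             # True: roughly flat across frequencies
```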

One realization vs. the process: (1/T)|X̂(f)|² vs. h(f).
[Figure: power density versus frequency (Hz): (a) the raw periodogram (1/T)|X̂(f)|² of one realization; (b) the power spectral density h(f).]

Note: covariance-ergodic processes. Previously, we replaced the ensemble average by the time average:
M = E{G(x)} → M = lim_{T→∞} (1/T) ∫ G(x) dt.
Definition. Processes with the property that E{G(x)} = lim_{T→∞} (1/T) ∫ G(x) dt are called M-ergodic.
Theorem. A zero-mean stochastic process is covariance-ergodic if lim_{τ→∞} C(τ) = 0.

Systems and process transformation. A system is a procedure that transforms the process Y(t) into the process X(t). In the process, some characteristics of the input Y are changed: the output X is different; the spectra of Y and X are different; some frequencies have been filtered out.

Linear system/transformation. X(t) and Y(t) are two stochastic processes.
Definition. A system (or transformation) L maps input Y(t) into output X(t):
X(t) = L[Y(t)].
L is an operator acting only on functions of t.
- L is linear if, for all a, b ∈ C, L[aY₁(t) + bY₂(t)] = aL[Y₁(t)] + bL[Y₂(t)].
- L is shift-invariant if, for any s ∈ R, X(t) = L[Y(t)] ⇒ X(t − s) = L[Y(t − s)].
Transformations discussed here are assumed linear and shift-invariant.

Examples: general systems. Let Y(t) and X(t) be discrete-time processes: Y(t) = Y(nΔt) and X(t) = X(nΔt). Three basic ways to combine them:
1. Autoregressive model AR(n):
a₀X(t) + a₁X(t − Δt) + a₂X(t − 2Δt) + ... + a_nX(t − nΔt) = Y(t).
2. Moving-average model MA(n):
X(t) = b₀Y(t) + b₁Y(t − Δt) + b₂Y(t − 2Δt) + ... + b_nY(t − nΔt).
3. Autoregressive moving average ARMA(m, n):
Σ_{k=0}^n a_k X(t − kΔt) = Σ_{k=0}^m b_k Y(t − kΔt).

Examples: autoregressive model AR. Second-order discrete AR(2):
a₀X(t) + a₁X(t − Δt) + a₂X(t − 2Δt) = Y(t);
the output is given through an implicit relationship. Physically realizable condition: only delayed values X(t − nΔt) enter.
Second-order continuous AR(2): use ΔX(t) = X(t) − X(t − Δt),
Δ²X(t) = ΔX(t) − ΔX(t − Δt) = X(t) − 2X(t − Δt) + X(t − 2Δt).
Let Δt → 0:
d²X(t)/dt² + α₁ dX(t)/dt + α₂ X(t) = Z(t)
(bumpy road, i.e. forced-oscillation model).

Examples: AR(n). Autoregressive process of arbitrary order AR(n):
a₀X(t) + a₁X(t − Δt) + a₂X(t − 2Δt) + ... + a_nX(t − nΔt) = Y(t),
and, in the continuous limit,
α₀ dⁿX(t)/dtⁿ + α₁ d^{n−1}X(t)/dt^{n−1} + ... + α_{n−1} dX(t)/dt + α_n X(t) = Z(t).

Examples: moving-average model (MA). Discrete moving average MA(n):
X(t) = b₀Y(t) + b₁Y(t − Δt) + b₂Y(t − 2Δt) + ... + b_nY(t − nΔt).
Continuous moving average MA: replace the b_j by a function w(t) (a window), e.g.
w(t) = 1/T for |t| ≤ T/2, 0 otherwise.
MA is the convolution of Y(t) with w:
X(t) = ∫ Y(s) w(t − s) ds = (Y ∗ w)(t).
Note: convolution with the window w also has the effect of smearing (smoothing) the function Y.
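The smearing effect of an MA window shows up in a short sketch: a discrete rectangular window (an arbitrary choice here) smooths a white-noise input and reduces its variance:

```python
import numpy as np

# Discrete MA as convolution: X(t) = sum_k b_k Y(t - k*dt).
# A rectangular window of length 5 smooths white noise and cuts the
# output variance to sum_k b_k^2 = 1/5 of the input variance.
rng = np.random.default_rng(4)
y = rng.standard_normal(1000)            # white-noise input process
b = np.ones(5) / 5                       # rectangular MA(4) window, sums to 1
x = np.convolve(y, b, mode="valid")      # output process

print(x.var() < 0.4 * y.var())           # True: variance strongly reduced
```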

Examples: general ARMA (continuous). For continuous-time processes, the general form of a linear system is
α₀ dⁿX(t)/dtⁿ + α₁ d^{n−1}X(t)/dt^{n−1} + ... + α_{n−1} dX(t)/dt + α_n X(t) = ∫ w(t − s) Y(s) ds,
where α_j ∈ C (j = 0, 1, ..., n) and w(t) is a window function, ∫ w(t) dt = 1.

Fundamental theorem.
Theorem. If L is a shift-invariant linear transformation, then
X(t) = (Y ∗ G)(t) = ∫ Y(s) G(t − s) ds,  G(t) = L[δ(t)] = the impulse response of L.
A linear system is completely characterized by its impulse response. Steps:
1. Obtain the function G(t) = response of L to a delta impulse;
2. Calculate the output X as the convolution of the input Y with G.

Linear transformations of moments.
Mean: L and E{·} are linear operators and commute (they act on independent variables), therefore
μ_X(t) = E[X(t)] = E[L[Y(t)]] = L[E[Y(t)]] = L[μ_Y(t)].
Auto-correlation: let L_{t₂} = the linear system operating on the variable t₂; regard t₁ as a parameter. Then
R_YX(t₁, t₂) = L_{t₂}[R_YY(t₁, t₂)] = R_YY(t₁, t₂) ∗ G(t₂);
R_XX(t₁, t₂) = L_{t₁}[R_YX(t₁, t₂)] = R_YX(t₁, t₂) ∗ G(t₁);
where, for example, R_YY(t₁, t₂) ∗ G(t₂) = ∫ R_YY(t₁, t₂ − s) G(s) ds.
Similar relations hold for the auto-/cross-covariances.

Example: white noise. The input Y(t) is a zero-mean white-noise process:
C_YY(t₁, t₂) = q(t₁) δ(t₂ − t₁), q(t₁) ≥ 0.
Power of the output X(t):
E[|X(t)|²] = q(t) ∗ |G(t)|² = ∫ q(t − s) |G(s)|² ds.
If Y(t) is stationary, q(t) = q, then
E[|X(t)|²] = qE, where E = ∫ |G(s)|² ds.

Goal: power spectrum transformation.
Time domain: input R_YY → R_YX = R_YY ∗ G → output R_XX = R_YX ∗ G.
Frequency domain (each quantity connected to the time domain by the FT): h_YY → ? → h_YX → ? → h_XX.

Power spectrum transformation (1). Assume that X(t) and Y(t) are 2nd-order stationary: constant mean and variance; auto-correlation R(t₁, t₂) = R(t₂ − t₁) = R(τ).
Theorem. The following statements are true for a shift-invariant linear system:
R_YX(τ) = R_YY(τ) ∗ G(−τ),  R_XX(τ) = R_YX(τ) ∗ G(τ),
h_YX(f) = h_YY(f) Ĝ*(f),  h_XX(f) = h_YX(f) Ĝ(f),
where Ĝ(f) = F[G(t)]; h_YX(f) = F[R_YX(τ)] = the cross-spectrum of X and Y; h_XX = F[R_XX(τ)] = the (auto-)spectrum of X.

Power spectrum transformation (2).
Time domain: R_YY → R_YX = R_YY ∗ G(−τ) → R_XX = R_YX ∗ G(τ).
Frequency domain: h_YY → h_YX = h_YY Ĝ* → h_XX = h_YX Ĝ.

Power spectrum theorem. Combining the equations:
R_XX(τ) = R_YY(τ) ∗ G(τ) ∗ G(−τ) = R_YY(τ) ∗ [G(τ) ⋆ G(τ)],
h_XX(f) = h_YY(f) |Ĝ(f)|².
Conclusions:
1. Spectrum of the output = spectrum of the input × the factor function |Ĝ(f)|²;
2. the factor function is defined by the impulse response function of L.
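The power spectrum theorem h_XX = h_YY |Ĝ|² can be checked in discrete form by filtering white noise with a short impulse response (the coefficients of G below are arbitrary illustration values):

```python
import numpy as np

# Filter white noise (h_YY = q) with a short FIR impulse response G and
# compare the averaged output periodogram with q * |Ghat(f)|^2.
rng = np.random.default_rng(5)
q, N, reps = 1.0, 1024, 2000
G = np.array([0.5, 0.3, 0.2])            # illustrative impulse response
Ghat = np.fft.fft(G, N)                  # transfer function on the DFT grid

pgram = np.zeros(N)
for _ in range(reps):
    y = rng.normal(0.0, np.sqrt(q), N)
    x = np.real(np.fft.ifft(np.fft.fft(y) * Ghat))   # circular filtering
    pgram += np.abs(np.fft.fft(x))**2 / N
pgram /= reps

# statistically, within a few percent of the theoretical spectrum
print(np.allclose(pgram, q * np.abs(Ghat)**2, rtol=0.2))   # True
```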

Power spectrum: transfer function H(f) = Ĝ(f).

Examples: white-noise input. The input is zero-mean white noise, with δ auto-correlation and constant spectral density:
R_YY(τ) = q δ(τ),  h_YY(f) = q.
Output spectrum = q × square of the window spectrum:
h_XX(f) = q |Ĝ(f)|²,  R_XX(τ) = q [G(τ) ⋆ G(τ)].

Examples: continuous AR(1). A continuous AR(1) process is described by the equation
dX(t)/dt + aX(t) = Y(t).
The impulse response function G(t) satisfies
dG(t)/dt + aG(t) = δ(t),
and
h_XX(f) = h_YY(f) |Ĝ(f)|²,  |Ĝ(f)|² = 1/(a² + ω²),  ω = 2πf.
If the input is white noise, h_YY(f) = q, and the output spectrum is
h_XX(f) = q/(a² + ω²).

Examples: continuous MA. The system is a moving average with window w:
X(t) = ∫ Y(s) w(t − s) ds, with ∫ w(t) dt = 1.
Impulse response: G(t) = L[δ(t)] = ∫ δ(s) w(t − s) ds = w(t), so
h_XX(f) = h_YY(f) |ŵ(f)|².
If w is the rectangular window w(t) = 1/T for |t| ≤ T/2, 0 otherwise, its FT is a cardinal sine:
ŵ(f) = sin(πfT)/(πfT) = sinc(πfT).
If the input is white noise: h_XX(f) = q |sinc(πfT)|².
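For the continuous AR(1), the causal solution of dG/dt + aG = δ(t) is G(t) = e^{−at} for t ≥ 0 (a standard result), which can be used to confirm |Ĝ(f)|² = 1/(a² + ω²) numerically; a = 1.5 and the grid are illustration choices:

```python
import numpy as np

# Transform the AR(1) impulse response G(t) = exp(-a t), t >= 0,
# and check |Ghat(f)|^2 = 1 / (a^2 + (2*pi*f)^2).
a = 1.5
t, dt = np.linspace(0, 40, 400001, retstep=True)   # e^{-a*40} is negligible
G = np.exp(-a * t)

def ghat(f):
    g = G * np.exp(-2j * np.pi * f * t)
    return dt * (g.sum() - 0.5 * (g[0] + g[-1]))   # trapezoid rule

for f in (0.0, 0.2, 1.0):
    print(abs(abs(ghat(f))**2 - 1/(a**2 + (2*np.pi*f)**2)) < 1e-5)   # True
```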

AR(1) & MA spectral windows.
AR(1): h_XX(f) = h_YY(f) |Ĝ(f)|², with |Ĝ(f)|² = 1/(a² + ω²).
MA: h_XX(f) = h_YY(f) |ŵ(f)|², with |ŵ(f)|² = |sinc(πfT)|².
MA and AR behave like low-pass filters.

Filters. Relations used as filters:
X(t) = ∫ Y(s) G(t − s) ds,  h_XX(f) = h_YY(f) |Ĝ(f)|².
Motivation: design G(t) so that |Ĝ(f)|² ≈ 0 for some frequencies.
- Impulse response: G(t) completely characterizes the filter.
- Transfer function: H(f) = Ĝ(f), complex; H(f) = ρ(f) e^{iφ(f)}.
- Gain: the modulus ρ(f) = amplification at frequency f.
- Phase: the angle φ(f) = phase shift at frequency f.