Fourier Analysis and Power Spectral Density

Chapter 4  Fourier Analysis and Power Spectral Density

4.1 Fourier Series and Transforms

Recall the Fourier series for a periodic function with period T, x(t + T) = x(t):

x(t) = (1/2) a_0 + Σ_{n=1}^{∞} [ a_n cos(2πnt/T) + b_n sin(2πnt/T) ],   (4.1)

where

a_0 = (2/T) ∫_0^T x(t) dt   (so that a_0/2 is the mean value of x),
a_n = (2/T) ∫_0^T x(t) cos(nωt) dt,
b_n = (2/T) ∫_0^T x(t) sin(nωt) dt,   with ω = 2π/T.   (4.2)

Dirichlet Theorem: For x(t) periodic on 0 ≤ t < T, if x(t) is bounded and has a finite number of maxima, minima, and discontinuities, then the Fourier series Eq. (4.1) converges at every t to (1/2)[x(t+) + x(t-)].
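
As a quick numerical illustration of Eqs. (4.1)-(4.2) and the Dirichlet theorem, the following Matlab sketch (an illustrative addition to these notes; the square-wave example, the grid size, and the number of harmonics are arbitrary choices) approximates the coefficients of a unit square wave by trapezoidal quadrature and evaluates the truncated series at the discontinuity, where it should settle near the midpoint value 1/2.

T  = 1;                          % period
t  = linspace(0, T, 4001);       % quadrature grid over one period
x  = double(t < T/2);            % unit square wave: 1 on [0, T/2), 0 on [T/2, T)
N  = 50;                         % number of harmonics kept in the partial sum
a0 = (2/T) * trapz(t, x);
xs = (a0/2) * ones(size(t));     % running partial sum of Eq. (4.1)
for n = 1:N
    an = (2/T) * trapz(t, x .* cos(2*pi*n*t/T));   % coefficients from Eq. (4.2)
    bn = (2/T) * trapz(t, x .* sin(2*pi*n*t/T));
    xs = xs + an*cos(2*pi*n*t/T) + bn*sin(2*pi*n*t/T);
end
[~, k] = min(abs(t - T/2));      % grid point closest to the jump at t = T/2
fprintf('Partial sum at the discontinuity: %.3f (Dirichlet value 0.5)\n', xs(k));

Increasing N sharpens the reconstructed edges, but the value at the jump itself stays near 1/2, as the theorem predicts.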

The complex form of Eq. (4.1) is better suited to experimental applications. Using Euler's (or de Moivre's) formulas,

cos ωt = (1/2)( e^{iωt} + e^{-iωt} ),   sin ωt = (1/(2i))( e^{iωt} - e^{-iωt} ),   (4.3)

Eq. (4.1) can be rewritten as

x(t) = Σ_{n=-∞}^{∞} X_n e^{inωt},   (4.4)

where

X_0 = a_0/2,   X_{±n} = (1/2)(a_n ∓ i b_n).   (4.5)

Note also that X_{-n} = X_n^*. Therefore

X_n = (1/T) ∫_0^T x(t) e^{-inωt} dt = (1/T) ∫_{-T/2}^{T/2} x(t) e^{-inωt} dt.   (4.6)

If the signal x(t) is not periodic, we let f_n = n/T (i.e., in Eq. (4.6), nω = 2πf_n). We then define a function X(f) by X(f_n) = T X_n (i.e., X_n = X(f_n)/T) to get

x(t) = Σ_{n=-∞}^{∞} X_n e^{inωt} = Σ_{n=-∞}^{∞} (1/T) X(f_n) e^{i2πf_n t} = Σ_{n=-∞}^{∞} X(f_n) e^{i2πf_n t} Δf_n,   (4.7)

where we used the fact that Δf_n = f_{n+1} - f_n = (n+1)/T - n/T = 1/T. Therefore, in the limit as T → ∞ and Δf_n → 0 in Eq. (4.7), we get our signal in the time domain as

x(t) = ∫_{-∞}^{∞} X(f) e^{i2πft} df.   (4.8)

Now, assuming that everything converges and using Eq. (4.6), we get the corresponding frequency-domain expression

X(f) = ∫_{-∞}^{∞} x(t) e^{-i2πft} dt.   (4.9)

Therefore, x(t) ↔ X(f) form a Fourier transform pair, where x(t) lives in the time domain and X(f) in the frequency domain.
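
The transform pair (4.8)-(4.9) can be sanity-checked numerically. The Matlab sketch below (an illustrative addition; the rectangular pulse, its half-width a, and the grids are arbitrary choices) evaluates Eq. (4.9) by quadrature for a rectangular pulse and compares the result with the known closed form sin(2πfa)/(πf).

a  = 0.5;                        % half-width of the rectangular pulse (arbitrary)
t  = linspace(-5, 5, 20001);     % time grid; the pulse vanishes outside [-a, a]
x  = double(abs(t) <= a);
f  = linspace(-4, 4, 401);       % frequencies at which X(f) is evaluated
Xf = zeros(size(f));
for k = 1:numel(f)
    Xf(k) = trapz(t, x .* exp(-1i*2*pi*f(k)*t));   % Eq. (4.9) by quadrature
end
Xexact     = 2*a*ones(size(f));                    % value at f = 0
nz         = (f ~= 0);
Xexact(nz) = sin(2*pi*a*f(nz)) ./ (pi*f(nz));      % analytic transform of the pulse
fprintf('Largest |numerical - analytic| difference: %.2e\n', max(abs(Xf - Xexact)));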

4.1.1 Several Important Properties of Fourier Transforms

We denote a Fourier transform (FT) by X(f) = F[x(t)] and its inverse by x(t) = F^{-1}[X(f)]. Several properties of the FT follow:

1. Linearity: F[αx(t) + βy(t)] = αX(f) + βY(f).

2. Duality: x(t) ↔ X(f) implies X(t) ↔ x(-f).

3. Conjugation: x(t) ↔ X(f) implies x^*(t) ↔ X^*(-f). Therefore, for a real signal x(t), X(f) = X^*(-f). This in turn gives

|X(f)|^2 = X(f) X^*(f) = X^*(-f) X(-f) = |X(-f)|^2,   (4.10)

i.e., for real x(t), |X(f)| is symmetric.

4. Convolution (a numerical check of this property follows at the end of this subsection):

F[ ∫_{-∞}^{∞} x(τ) y(t - τ) dτ ] = X(f) Y(f) ≡ F[x * y],

where x * y denotes the time convolution of x(t) and y(t). In addition,

F[x y] = ∫_{-∞}^{∞} X(φ) Y(f - φ) dφ ≡ X * Y,

where X * Y denotes the frequency convolution of X(f) and Y(f) (note also that X * Y = Y * X).

5. Differentiation:

F[ d^k x / dt^k ] = (i2πf)^k X(f).

6. Time Scaling and Shifting:

x(at + b) ↔ ( e^{i2πfb/a} / |a| ) X(f/a).

Theorem: Provided x(t) ∈ L^1 (i.e., ∫ |x(t)| dt < ∞) and x(t) has a finite number of maxima, minima, and discontinuities, X(f) exists and

F^{-1}[X(f)] = x(t) where x is continuous at t, and = (1/2)[x(t+) + x(t-)] where x is discontinuous at t.

There is a problem with the above theorem if we consider signals such as sin t, for which ∫_{-∞}^{∞} |sin t| dt diverges; this can be fixed using the theory of generalized functions (or distributions), together with duality and the other basic properties.
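
As a sanity check on the convolution property (item 4 above), the Matlab sketch below (an illustrative addition; the two test signals and the grid are arbitrary choices) convolves two sampled signals directly and compares the result with the inverse FFT of the product of their FFTs, using the grid spacing dt to approximate the continuous-time integral.

dt = 0.01;  t = (0:dt:10)';            % common time grid
x  = exp(-t);                          % decaying exponential
y  = double(t <= 1);                   % rectangular pulse of width 1
N  = 2^nextpow2(2*numel(t));           % zero-pad so circular convolution equals linear
conv_direct = conv(x, y) * dt;         % Riemann-sum approximation of the integral
conv_fft    = ifft(fft(x, N) .* fft(y, N)) * dt;   % product of the transforms
conv_fft    = real(conv_fft(1:numel(conv_direct)));
fprintf('Largest difference between the two convolutions: %.2e\n', ...
        max(abs(conv_direct - conv_fft)));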

4.1.2 Basic Fourier Transform Pairs

1. Delta (δ) function: This is actually a generalized function, or distribution, defined by

∫_{-∞}^{∞} δ(t) dt = 1.   (4.11)

Now, by the definition of δ(t),

X(f) = ∫_{-∞}^{∞} e^{-i2πft} δ(t - t_0) dt = e^{-i2πft_0},

which is also called the sifting property. Note that e^{-i2πft_0} is a complex constant with |e^{-i2πft_0}| = 1. Therefore

δ(t - t_0) ↔ e^{-i2πft_0},   (4.12)

and, in particular, δ(t) ↔ 1. Therefore, by the duality property,

e^{i2πf_0 t} ↔ δ(f - f_0),   (4.13)

and, in particular, 1 ↔ δ(f).

2. Trigonometric functions:

cos(2πf_0 t) = (1/2)( e^{i2πf_0 t} + e^{-i2πf_0 t} ),   (4.14)

so

F[cos(2πf_0 t)] = [ δ(f - f_0) + δ(f + f_0) ] / 2.   (4.15)

Similarly,

F[sin(2πf_0 t)] = [ δ(f - f_0) - δ(f + f_0) ] / (2i).   (4.16)

3. Modulated trigonometric functions: As an example, consider x(t) cos(2πf_c t) ↔ X(f) * C(f), where f_c is called the carrier frequency and C(f) is given by Eq. (4.15). Then

x(t) cos(2πf_c t) ↔ ∫_{-∞}^{∞} X(s) [ δ(f - f_c - s) + δ(f + f_c - s) ] / 2 ds,   (4.17)

where the right-hand side is a convolution integral, which gives

x(t) cos(2πf_c t) ↔ [ X(f - f_c) + X(f + f_c) ] / 2.   (4.18)

Therefore, if we already know X(f), modulation scales it by 1/2 and shifts it to ±f_c, as shown in Fig. 4.1 and in the numerical sketch below.

Figure 4.1: Signal modulation in the frequency domain.
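
The shift to ±f_c in Eq. (4.18) is easy to reproduce numerically. In the Matlab sketch below (an illustrative addition; the Gaussian baseband pulse, the carrier frequency fc, and the sampling rate are arbitrary choices) the FFT, scaled by the sample spacing, stands in for X(f).

fs = 1000;  t = (0:1/fs:2-1/fs)';      % 2 s of data sampled at 1 kHz
x  = exp(-((t - 1)/0.1).^2);           % smooth low-pass pulse (baseband signal)
fc = 100;                              % carrier frequency in Hz
xm = x .* cos(2*pi*fc*t);              % modulated signal
N  = numel(t);
f  = (-N/2:N/2-1)' * fs/N;             % two-sided frequency axis
X  = fftshift(fft(x))  / fs;           % FFT scaled to approximate X(f)
Xm = fftshift(fft(xm)) / fs;
[~, k] = max(abs(Xm(f > 0)));  fpos = f(f > 0);
fprintf('Peak of the modulated spectrum at %.1f Hz (carrier fc = %d Hz)\n', fpos(k), fc);
fprintf('Amplitude ratio max|Xm| / max|X| = %.2f (Eq. 4.18 predicts about 0.5)\n', ...
        max(abs(Xm)) / max(abs(X)));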

4.2 Power Spectral Density

The autocorrelation of a real, stationary signal x(t) is defined by R_x(τ) = E[x(t) x(t + τ)]. The Fourier transform of R_x(τ) is called the Power Spectral Density (PSD), S_x(f). Thus

S_x(f) = ∫_{-∞}^{∞} R_x(τ) e^{-i2πfτ} dτ.   (4.19)

The question is: what is the PSD? What does it mean? What is a spectral density, and why is S_x called a power spectral density? To answer these questions, recall that

X(f) = ∫_{-∞}^{∞} x(t) e^{-i2πft} dt.   (4.20)

To avoid convergence problems, we consider only a version of the signal observed over a finite time,¹ x_T(t) = x(t) w_T(t), where

w_T(t) = 1 for |t| ≤ T/2,   w_T(t) = 0 for |t| > T/2.   (4.21)

Then x_T has the Fourier transform

X_T(f) = ∫_{-∞}^{∞} x_T(t) e^{-i2πft} dt   (4.22)
       = ∫_{-T/2}^{T/2} x(t) e^{-i2πft} dt,   (4.23)

and so

X_T X_T^* = [ ∫_{-T/2}^{T/2} x(t) e^{-i2πft} dt ] [ ∫_{-T/2}^{T/2} x(s) e^{i2πfs} ds ]   (4.24)
          = ∫_{-T/2}^{T/2} ∫_{-T/2}^{T/2} x(t) x(s) e^{-i2πf(t-s)} dt ds,   (4.25)

where the star denotes complex conjugation and, for compactness, the frequency argument of X_T has been suppressed. Taking the expectation of both sides of Eq. (4.25) gives²

E[X_T X_T^*] = ∫_{-T/2}^{T/2} ∫_{-T/2}^{T/2} E[x(t) x(s)] e^{-i2πf(t-s)} dt ds.   (4.26)

Letting s = t + τ, one sees that E[x(t) x(s)] = E[x(t) x(t + τ)] = R_x(τ), and thus

E[X_T X_T^*] = ∫_{-T/2}^{T/2} ∫_{-T/2}^{T/2} R_x(τ) e^{-i2πf(t-s)} dt ds.   (4.27)

To actually evaluate the above integral, both variables of integration must be changed. Let

τ = f(t, s) = s - t   (as introduced above)   (4.28)
η = g(t, s) = s + t.   (4.29)

Then the integral of Eq. (4.27) is transformed (except for the limits of integration) using the change-of-variables formula:³

∫_{-T/2}^{T/2} ∫_{-T/2}^{T/2} R_x(τ) e^{-i2πf(t-s)} dt ds = ∫∫ R_x(τ) e^{i2πfτ} (1/|J|) dτ dη,   (4.30)

where |J| is the absolute value of the Jacobian of the change of variables Eqs. (4.28)-(4.29), given by

J = det [ ∂f/∂t   ∂f/∂s ; ∂g/∂t   ∂g/∂s ] = det [ -1   1 ; 1   1 ] = -2,   so |J| = 2.   (4.31)

¹ This restriction is necessary because not all of our signals will be square integrable. However, they will be mean square integrable, which is what we will take advantage of here.
² To understand what this means, remember that Eq. (4.25) holds for any x(t). So imagine computing Eq. (4.26) for different x(t) obtained from different experiments on the same system (each one of these is called a sample function). The expectation is over all possible sample functions. Since the exponential kernel inside the integral of Eq. (4.26) is the same for each sample function, it can be pulled outside of the expectation.
³ This is a basic result from multivariable calculus. See, for example, I.S. Sokolnikoff and R.M. Redheffer, Mathematics of Physics and Modern Engineering, 2nd edition, McGraw-Hill, New York, 1966.

Figure 4.2: The domain of integration (gray regions) for the Fourier transform of the autocorrelation, Eq. (4.27): (left) in the original variables t and s; (right) in the transformed variables η and τ obtained from the change of variables Eqs. (4.28)-(4.29). Notice that the square region on the left is not only rotated (and flipped about the t axis), but its area is increased by a factor of |J| = 2. The circled numbers show where the sides of the square on the left are mapped by the change of variables. The lines into which the t and s axes are mapped are also shown.

To determine the limits of integration needed for the right-hand side of Eq. (4.30), we refer to Fig. 4.2, in which the domain of integration is plotted both in the original (t, s) variables and in the transformed (τ, η) variables. Since we wish to integrate over η first, we hold τ fixed. For τ > 0, a vertical cut through the diamond-shaped region in Fig. 4.2 (right) shows that -T + τ ≤ η ≤ T - τ, whereas for τ < 0 one finds that -T - τ ≤ η ≤ T + τ. Putting this all together yields

E[X_T X_T^*] = (1/2) ∫_{-T}^{T} [ ∫_{-(T-|τ|)}^{T-|τ|} dη ] R_x(τ) e^{i2πfτ} dτ = ∫_{-T}^{T} [ T - |τ| ] R_x(τ) e^{i2πfτ} dτ.   (4.32)

Finally, dividing both sides of Eq. (4.32) by T and taking the limit as T → ∞ gives

lim_{T→∞} (1/T) E[X_T X_T^*] = lim_{T→∞} ∫_{-T}^{T} [ 1 - |τ|/T ] R_x(τ) e^{i2πfτ} dτ
                             = lim_{T→∞} ∫_{-T}^{T} R_x(τ) e^{i2πfτ} dτ
                             = ∫_{-∞}^{∞} R_x(τ) e^{i2πfτ} dτ
                             = S_x(f),   (4.33)

where the last equality uses the fact that R_x(τ) is even for a real, stationary signal, so this integral coincides with the definition in Eq. (4.19).

Thus, in summary, the above demonstrates that

S_x(f) = lim_{T→∞} (1/T) E[ |X_T(f)|^2 ].   (4.34)
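
Eq. (4.34) suggests a direct numerical experiment: average (1/T)|X_T(f)|^2 over many sample functions and compare the level with the known PSD. The Matlab sketch below (an illustrative addition; the sampling rate, record length, and noise level are arbitrary choices) does this for sampled white Gaussian noise, whose two-sided PSD in these units should be flat at sigma^2/fs.

fs = 100;  N = 1024;  T = N/fs;        % sampling rate, samples per record, duration
sigma = 2;  M = 500;                   % noise standard deviation; number of records
P = zeros(N, 1);
for m = 1:M
    x  = sigma * randn(N, 1);          % one sample function x_T(t)
    XT = fft(x) / fs;                  % approximates X_T(f) = integral of x e^{-i2*pi*f*t} dt
    P  = P + abs(XT).^2 / T;           % accumulate (1/T)|X_T(f)|^2
end
P = P / M;                             % average over sample functions (the expectation)
fprintf('Mean estimated PSD level: %.4f, expected sigma^2/fs = %.4f\n', ...
        mean(P), sigma^2/fs);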

Recalling that X_T(f) has units of SU/Hz (where SU stands for "signal units," i.e., whatever units the signal x_T(t) has), it is clear that E[ |X_T(f)|^2 ] has units of (SU/Hz)^2. However, 1/T has units of Hz, so Eq. (4.33) shows that the PSD has units of SU^2/Hz.⁴ Although it is not always literally true, in many cases the mean square of the signal is proportional to the amount of power in the signal.⁵ The fact that S_x can therefore be interpreted as having units of power per unit frequency explains the name Power Spectral Density.

⁴ Of course, the units can also be determined by examining the definition Eq. (4.19).
⁵ This comes primarily from the fact that, in electrical circuits, the power can be written in terms of the voltage as V^2/Z, or in terms of the current as I^2 Z, where Z is the circuit impedance. Thus, for electrical signals, it is precisely true that the mean square of the signal is proportional to the power. Be forewarned, however, that the mean square of a scaled signal, expressed in terms of the actual measured variable (such as displacement or acceleration), will not in general equal the average mechanical power in the structure being measured.

Notice that power at a frequency f_0 that does not repeatedly reappear in x_T(t) as T → ∞ will result in S_x(f_0) → 0, because of the division by T in Eq. (4.34). In fact, based on this idealized mathematical definition, any signal of finite duration (or, more generally, any mean square integrable signal) will have a power spectrum identically equal to zero! In practice, however, we do not let T extend much past the support [T_min, T_max] of x_T(t) (T_min and T_max being the smallest and largest t for which x_T(t) ≠ 0). Since all signals that we measure in the laboratory have the form y(t) = x(t) + n(t), where n(t) is broadband noise, extending T to infinity for any signal with finite support will end up giving us the noise spectrum S_n rather than S_x.

We conclude by mentioning some important properties of S_x. First, since S_x is an average of the magnitude squared of a Fourier transform, S_x(f) is real and S_x(f) ≥ 0 for all f. A simple change of variables in the definition Eq. (4.19) shows that S_x(-f) = S_x(f). Given the definition Eq. (4.19), we also have the dual relationship

R_x(τ) = ∫_{-∞}^{∞} S_x(f) e^{i2πfτ} df.   (4.35)

Setting τ = 0 in the above gives

R_x(0) = E[ x(t)^2 ] = ∫_{-∞}^{∞} S_x(f) df,   (4.36)

which, for a mean-zero signal, gives

σ_x^2 = ∫_{-∞}^{∞} S_x(f) df.   (4.37)
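
Eqs. (4.36)-(4.37) say that the area under the PSD equals the mean square of the signal (its variance when the mean is zero). The Matlab sketch below (an illustrative addition; it assumes the Signal Processing Toolbox for pwelch and hann, and the test signal and window settings are arbitrary choices) checks this with a Welch estimate.

fs = 200;                               % sampling rate in Hz
t  = (0:1/fs:100-1/fs)';
x  = randn(size(t));                    % mean-zero test signal
x  = filter(ones(5,1)/5, 1, x);         % mildly correlated via a 5-point moving average
[Pxx, f] = pwelch(x, hann(1024), 512, 1024, fs);   % one-sided PSD estimate, SU^2/Hz
area = trapz(f, Pxx);                   % integral of the estimated S_x(f)
fprintf('Integrated PSD = %.4f, sample variance = %.4f\n', area, var(x));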

Finally, if we assume that x(t) is ergodic in the autocorrelation, that is, that

R_x(τ) = E[x(t) x(t + τ)] = lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} x(t) x(t + τ) dt,

where the last equality holds for any sample function x(t), then Eq. (4.37) can be rewritten as

lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} x(t)^2 dt = ∫_{-∞}^{∞} S_x(f) df.   (4.38)

The above relationship is known as Parseval's identity. This last identity makes it clear that, given any two frequencies f_1 and f_2, the quantity

∫_{f_1}^{f_2} S_x(f) df

represents the portion of the average signal power contained in signal frequencies between f_1 and f_2, and hence S_x is indeed a spectral density.

4.3 Sample Power Spectra

1. White noise: S_xx(f) = 1, i.e., there is power at all frequencies. The corresponding autocorrelation is R_xx(τ) = δ(τ); see Fig. 4.3.

Figure 4.3: A white noise signal has power at all frequencies and is uncorrelated for τ ≠ 0.

2. Band-limited noise: here

S_xx(f) = W(f) = 1 for |f| ≤ f_BW, and 0 otherwise,

so that

R_xx(τ) = ∫_{-∞}^{∞} W(f) e^{i2πfτ} df
        = ∫_{-f_BW}^{f_BW} e^{i2πfτ} df
        = [ e^{i2πf_BW τ} - e^{-i2πf_BW τ} ] / (i2πτ)
        = sin(2πf_BW τ) / (πτ).   (4.39)

For the corresponding graphs, refer to Fig. 4.4; a numerical check of Eq. (4.39) follows below.

Figure 4.4: Band-limited noise has a correlation time of 1/(2 f_BW).
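
One way to see Eq. (4.39) and the 1/(2 f_BW) correlation time in practice is to synthesize band-limited noise by zeroing the FFT of white noise outside |f| ≤ f_BW and then estimate its autocorrelation. The Matlab sketch below (an illustrative addition; it assumes the Signal Processing Toolbox for xcorr, and the sampling rate, record length, and bandwidth are arbitrary choices) checks that the first zero of the estimated R(τ) falls near 1/(2 f_BW).

fs = 1000;  N = 2^16;  fBW = 25;        % sampling rate, samples, bandwidth (arbitrary)
f  = (-N/2:N/2-1)' * fs/N;
W  = double(abs(f) <= fBW);             % ideal band-limiting mask W(f)
X  = fftshift(fft(randn(N, 1)));        % white Gaussian noise in the frequency domain
x  = real(ifft(ifftshift(X .* W)));     % band-limited noise sample function
maxlag = round(0.1*fs);                 % look at lags up to 0.1 s
[R, lags] = xcorr(x, maxlag, 'unbiased');
R   = R / max(R);                       % normalize so that R(0) = 1
tau = lags(:) / fs;
k0  = find(R(lags >= 0) < 0, 1);        % first sign change at non-negative lag
fprintf('First zero of R(tau) near %.4f s; 1/(2*fBW) = %.4f s\n', ...
        tau(maxlag + k0), 1/(2*fBW));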

Problems

Problem 4.1 Create a sample {x_n}, n = 1, ..., 1024 + 14, of uncorrelated Gaussian random variables (command randn in Matlab). Now apply the moving-average filter

s_n = (1/15) Σ_{i=-7}^{7} x_{n+i}

to obtain 1024 correlated Gaussian variates. Estimate the power spectrum (type >> help pwelch in Matlab) for both data sequences and observe the differences.

Problem 4.2 Create two time series: (1) {x_n}, n = 1, ..., 4096, of uncorrelated Gaussian random variables (command randn in Matlab), and (2) the deterministic evolution of the Ulam map {y_n}, n = 1, ..., 4096, which follows the rule y_1 = 0.1 and y_{n+1} = 1 - 2 y_n^2. The values of y_n are measured through the nonlinear observation function s_n = arccos(-y_n)/π. Compare the mean, variance, and power spectra of the two time series.
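
One possible way to set up both problems is sketched below; this is only a starting point, not a prescribed solution (it assumes the Signal Processing Toolbox for pwelch and hann, and the window and segment lengths are arbitrary choices).

% Problem 4.1: white noise versus its 15-point moving average.
x  = randn(1024 + 14, 1);                       % uncorrelated Gaussian samples
s  = conv(x, ones(15,1)/15, 'valid');           % centered moving average -> 1024 samples
[Pxx, f] = pwelch(x, hann(256), 128, 256, 1);   % PSD versus normalized frequency (fs = 1)
[Pss, ~] = pwelch(s, hann(256), 128, 256, 1);
figure; semilogy(f, Pxx, f, Pss);
legend('white noise', 'moving average'); xlabel('frequency'); ylabel('PSD');

% Problem 4.2: Gaussian noise versus the Ulam map seen through arccos(-y)/pi.
N  = 4096;
x2 = randn(N, 1);
y  = zeros(N, 1);  y(1) = 0.1;
for n = 1:N-1
    y(n+1) = 1 - 2*y(n)^2;                      % Ulam map iteration
end
s2 = acos(-y)/pi;                               % nonlinear observation function
fprintf('means: %.3f vs %.3f; variances: %.3f vs %.3f\n', ...
        mean(x2), mean(s2), var(x2), var(s2));
[P2, f2] = pwelch(s2, hann(256), 128, 256, 1);
figure; semilogy(f2, P2); xlabel('frequency'); ylabel('PSD of the Ulam observable');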