Signals and Spectra - Review


SIGNALS
DETERMINISTIC - no uncertainty with respect to the value of the signal at any time; modeled by explicit mathematical expressions.
RANDOM - some degree of uncertainty before the signal occurs; described using the theory of random processes.

Periodic and Nonperiodic Signals

A signal x(t) is periodic in time if there exists a constant T₀ > 0 such that
x(t) = x(t + T₀), −∞ < t < +∞.
The smallest such value of T₀ is called the period of x(t).

Analog and Discrete Signals

An analog signal x(t) is a continuous function of time (e.g. speech). A discrete signal x(kT) exists only at discrete times (e.g. a sampled continuous signal).

Energy and Power Signals

Let x(t) be a real- or complex-valued deterministic continuous-time signal.

The signal is called an energy signal if its energy E_x is finite, 0 < E_x < ∞:
E_x = ∫_{−∞}^{∞} |x(t)|² dt

The signal is called a power signal if its energy is infinite but its mean (average) power P_x is finite, 0 < P_x < ∞:
P_x = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |x(t)|² dt
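The energy integral can be checked numerically. A minimal sketch, using a hypothetical decaying pulse x(t) = e^{−t} for t ≥ 0, whose exact energy is ∫₀^∞ e^{−2t} dt = 0.5:

```python
import numpy as np

# Hypothetical energy signal: x(t) = exp(-t), t >= 0. Its exact energy is 0.5.
t = np.linspace(0.0, 20.0, 200_001)   # the tail beyond t = 20 is negligible
x = np.exp(-t)
dt = t[1] - t[0]
E = np.sum(x**2) * dt                 # Riemann approximation of the energy integral
print(round(E, 3))                    # ~ 0.5, so x(t) is an energy signal
```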

Spectral Density

PARSEVAL'S THEOREM FOR ENERGY SIGNALS
E_x = ∫_{−∞}^{∞} x²(t) dt = (1/2π) ∫_{−∞}^{∞} |X(ω)|² dω = ∫_{−∞}^{∞} |X(f)|² df
ψ_x(f) = |X(f)|² - energy spectral density (ESD), the distribution of energy with respect to frequency f.

PARSEVAL'S THEOREM FOR POWER SIGNALS (periodic signal with period T₀, Fourier coefficients c_n, f₀ = 1/T₀)
P_x = (1/T₀) ∫_{−T₀/2}^{T₀/2} x²(t) dt = Σ_{n=−∞}^{∞} |c_n|²
G_x(f) = Σ_{n=−∞}^{∞} |c_n|² δ(f − n f₀) - power spectral density (PSD), the distribution of power with respect to frequency f.

Spectral Density - continued

E_x = ∫_{−∞}^{∞} ψ_x(f) df = 2 ∫_0^{∞} ψ_x(f) df
P_x = ∫_{−∞}^{∞} G_x(f) df = 2 ∫_0^{∞} G_x(f) df

For non-periodic power signals, the power spectral density is defined in the limiting sense. Let x_T(t) be a truncated version of the signal (i.e. an energy signal) with Fourier spectrum X_T(f); then
G_x(f) = lim_{T→∞} (1/T) |X_T(f)|²
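Parseval's theorem has an exact discrete analogue that is easy to verify. A minimal sketch (not from the slides): for the DFT X = fft(x), Σ|x[n]|² = (1/N) Σ|X[k]|², so the energy can be read off either the time samples or the spectrum.

```python
import numpy as np

# Discrete Parseval check: energy in the time domain equals energy in the
# DFT domain up to the 1/N normalization used by np.fft.fft.
rng = np.random.default_rng(0)
x = rng.standard_normal(1024)          # arbitrary finite-energy sequence
X = np.fft.fft(x)
E_time = np.sum(np.abs(x)**2)
E_freq = np.sum(np.abs(X)**2) / len(x)
print(np.isclose(E_time, E_freq))      # True
```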

Eample. a) Find the average normalized power of the waveform (t) = A cos( π f 0 t) using time averaging b) Repeat part (a) using the summation of spectral coefficients

Autocorrelation of an Energy Signal

R_x(τ) = ∫_{−∞}^{∞} x(t) x(t + τ) dt, −∞ < τ < ∞

R_x(τ) is a function of the time difference between the waveform and its shifted copy. Properties:
R_x(τ) = R_x(−τ) - an even function
|R_x(τ)| ≤ R_x(0) for all τ - maximum value at the origin
R_x(τ) ↔ ψ_x(f) - the autocorrelation and the ESD form a Fourier pair
R_x(0) = ∫_{−∞}^{∞} x²(t) dt - the value at the origin is equal to the energy of the signal
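These properties carry over to discrete sequences. A minimal sketch with a hypothetical finite-energy sequence, using np.correlate:

```python
import numpy as np

# Discrete autocorrelation of a finite-energy sequence. R[0] equals the
# signal energy, and R is an even function of the lag, as stated above.
x = np.array([1.0, 2.0, 3.0, 2.0, 1.0])    # hypothetical energy-signal samples
R = np.correlate(x, x, mode="full")         # lags from -(N-1) to +(N-1)
mid = len(R) // 2                           # index of the zero lag
print(np.isclose(R[mid], np.sum(x**2)))     # True: R(0) is the energy
print(np.allclose(R, R[::-1]))              # True: even in the lag
```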

Autocorrelation of a Power Signal

R_x(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) x(t + τ) dt, −∞ < τ < ∞

For periodic signals:
R_x(τ) = (1/T₀) ∫_{−T₀/2}^{T₀/2} x(t) x(t + τ) dt

Properties:
R_x(τ) = R_x(−τ) - an even function
|R_x(τ)| ≤ R_x(0) for all τ - maximum value at the origin
R_x(τ) ↔ G_x(f) - the autocorrelation and the PSD form a Fourier pair
R_x(0) = (1/T₀) ∫_{−T₀/2}^{T₀/2} x²(t) dt - the value at the origin is equal to the average power of the signal

Random Signals

Random signals are encountered in all areas of signal processing. They appear as disturbances in the transmission of signals. Even the transmitted, and consequently also the received, signals in telecommunications are random in nature, because only random signals carry information.

Random Variables

A random variable X(A) represents the functional relationship between a random event A and a real number.

The distribution function F_X(x) is the probability that the value taken by the random variable X is less than or equal to a real number x:
F_X(x) = P(X ≤ x)

The probability density function (pdf) is its derivative:
p_X(x) = dF_X(x)/dx
P(x₁ ≤ X ≤ x₂) = ∫_{x₁}^{x₂} p_X(x) dx

Ensemble Averages

mean value: m_X = E{X} = ∫_{−∞}^{∞} x p_X(x) dx
n-th moment: E{Xⁿ} = ∫_{−∞}^{∞} xⁿ p_X(x) dx
mean square value: E{X²} = ∫_{−∞}^{∞} x² p_X(x) dx
variance: var(X) = σ_X² = E{(X − m_X)²} = ∫_{−∞}^{∞} (x − m_X)² p_X(x) dx
standard deviation: σ_X = √(E{(X − m_X)²})
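Each of these ensemble averages can be estimated from samples. A minimal sketch, assuming a hypothetical Gaussian random variable X with mean 1 and variance 4:

```python
import numpy as np

# Sample-based estimates of the ensemble averages defined above.
rng = np.random.default_rng(42)
X = rng.normal(loc=1.0, scale=2.0, size=1_000_000)   # hypothetical X

m_X   = X.mean()                   # mean value, ~ 1
msq   = np.mean(X**2)              # mean square value, ~ var + mean^2 = 5
var_X = np.mean((X - m_X)**2)      # variance, ~ 4
sigma = np.sqrt(var_X)             # standard deviation, ~ 2
print(round(m_X, 1), round(msq, 1), round(var_X, 1), round(sigma, 1))
```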

Random Processes

A random process X(A, t) can be viewed as a function of an event A and time t. An ensemble consists of k sample functions of time: x₁(t), x₂(t), ..., x_k(t).

t fixed, A fixed: X(A, t) is a number
t fixed, A variable: X(A, t) is a random variable
t variable, A fixed: X(A, t) is a sample function
t variable, A variable: X(A, t) is a random process

Random Processes - continued

The autocorrelation function of the random process X(t) is a function of two variables, t₁ and t₂:
R_X(t₁, t₂) = E{X(t₁) X(t₂)}
X(t₁) and X(t₂) are random variables obtained by observing X(t) at times t₁ and t₂ respectively. The autocorrelation function is a measure of the degree to which two time samples of the same random process are related.

Stationarity of a Random Process

A random process is said to be wide-sense stationary (WSS) if its mean and autocorrelation function do not vary with a shift in the time origin:
m_X = const
R_X(t₁, t₂) = R_X(t₁ − t₂) = R_X(τ)

Most of the useful results in communication theory are predicated on random information signals and noise being wide-sense stationary. From a practical point of view, it is not necessary for a random process to be stationary for all time, but only for some observation interval of interest.

Autocorrelation of a Wide-Sense Stationary Random Process

R_X(τ) = E{X(t) X(t + τ)}, −∞ < τ < ∞

For a zero-mean WSS process, R_X(τ) indicates the extent to which the random values of the process separated by τ seconds in time are statistically correlated. If R_X(τ) changes slowly as τ increases from 0 to some value, then sample values of X(t) taken at t = t₁ and t = t₁ + τ are nearly the same (X(t) contains mostly low frequencies). If R_X(τ) decreases rapidly as τ increases, then sample values of X(t) taken at t = t₁ and t = t₁ + τ quickly become uncorrelated (X(t) contains mostly high frequencies).
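The link between correlation decay and frequency content can be illustrated numerically. A sketch under assumed parameters: low-pass filtering white noise with a 50-tap moving average produces a slow (low-frequency) process whose lag-10 correlation is still large, while the raw white noise decorrelates immediately.

```python
import numpy as np

# Slowly vs. rapidly decaying autocorrelation, estimated from sample functions.
rng = np.random.default_rng(2)
w = rng.standard_normal(200_000)                          # white (fast) process
slow = np.convolve(w, np.ones(50) / 50.0, mode="valid")   # low-pass: slow process

def corr_at_lag(x, k):
    """Normalized autocorrelation estimate at lag k."""
    return np.mean(x[:-k] * x[k:]) / np.mean(x * x)

print(corr_at_lag(slow, 10) > 0.5)      # True: low frequencies, slow decay
print(abs(corr_at_lag(w, 10)) < 0.01)   # True: high frequencies, immediate decay
```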

Properties of R_X(τ) of a Real-Valued WSS Process

R_X(τ) = R_X(−τ) - an even function
|R_X(τ)| ≤ R_X(0) for all τ - maximum value at the origin
R_X(τ) ↔ G_X(f) - the autocorrelation and the PSD form a Fourier pair
R_X(0) = E{X²(t)} - the value at the origin is equal to the average power of the signal

Time Averaging and Ergodicity

A process is ergodic if its statistics can be determined by time averaging over a single sample function.

The process is ergodic in the mean if
m_X = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) dt

The process is ergodic in the autocorrelation function if
R_X(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) x(t + τ) dt

In most communication systems the waveforms are assumed to be ergodic in the mean and in the autocorrelation function.
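Ergodicity in the mean can be demonstrated with a simple assumed model: x(t) = m + n(t), with n(t) zero-mean white noise. The time average of one long sample function approaches the ensemble mean m.

```python
import numpy as np

# One long sample function of a hypothetical ergodic process x(t) = m + n(t).
rng = np.random.default_rng(1)
m = 2.0                                      # ensemble mean (dc level)
x = m + rng.standard_normal(1_000_000)       # single finely-sampled realization
time_avg = x.mean()
print(abs(time_avg - m) < 0.01)              # True: time average ~ ensemble mean
```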

Time Averaging and Ergodicity - continued

Engineering parameters related to the moments of an ergodic random process:
1. m_X - the dc level of the signal
2. m_X² - the normalized power in the dc component
3. E{X²(t)} - the total average normalized power
4. √(E{X²(t)}) - the rms value of the voltage or current signal
5. σ_X² - the average normalized power in the ac component of the signal

PSD and the Wiener-Khintchine Theorem

A random process can generally be classified as a power signal. Its PSD describes the distribution of the signal's power in the frequency domain. It makes it possible to evaluate the signal power that will pass through a network with known frequency characteristics.

G_X(f) ≥ 0
G_X(f) = G_X(−f)
R_X(τ) ↔ G_X(f) - the Wiener-Khintchine theorem
P_X = ∫_{−∞}^{∞} G_X(f) df
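The Fourier-pair relation has an exact discrete counterpart that can be checked on one sample function: the inverse DFT of the periodogram |X[k]|²/N equals the circular autocorrelation estimate R[k] = (1/N) Σ_n x[n] x[(n+k) mod N]. A minimal sketch:

```python
import numpy as np

# Discrete Wiener-Khintchine check: periodogram and circular autocorrelation
# estimate are a DFT pair.
rng = np.random.default_rng(7)
N = 512
x = rng.standard_normal(N)                       # one sample function
G = np.abs(np.fft.fft(x))**2 / N                 # periodogram (PSD estimate)
R_from_G = np.fft.ifft(G).real                   # back to the lag domain
R_direct = np.array([np.mean(x * np.roll(x, -k)) for k in range(N)])
print(np.allclose(R_from_G, R_direct))           # True
```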

Example 3. Consider a random process given by
x(t) = A cos(2π f₀ t + φ)
where A and f₀ are constants and φ is a random variable that is uniformly distributed over (0, 2π). If x(t) is an ergodic process, the time averages of x(t) in the limit as T → ∞ are equal to the corresponding ensemble averages of x(t).
a) Use time averaging over an integer number of periods to calculate the approximations to the first and second moments of x(t).
b) Calculate the ensemble-average approximations to the first and second moments of x(t).
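Part (b) can be approximated by sampling the phase. With φ uniform on (0, 2π), the ensemble averages at any fixed t are E{x(t)} = 0 and E{x(t)²} = A²/2. A sketch with hypothetical values A = 2, f₀ = 1 and an arbitrary fixed t:

```python
import numpy as np

# Ensemble averages of x(t) = A*cos(2*pi*f0*t + phi) over random phases phi.
rng = np.random.default_rng(3)
A, f0, t = 2.0, 1.0, 0.37                        # hypothetical constants, fixed t
phi = rng.uniform(0.0, 2 * np.pi, size=1_000_000)
x_t = A * np.cos(2 * np.pi * f0 * t + phi)       # one random variable per phase
print(abs(x_t.mean()) < 0.01)                    # True: first moment ~ 0
print(abs(np.mean(x_t**2) - A**2 / 2) < 0.01)    # True: second moment ~ A^2/2
```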

Noise in Communication Systems

The term noise refers to unwanted electrical signals present in electrical systems. The presence of noise superimposed on a signal tends to obscure or mask the signal, and it limits the rate of information transmission. Good engineering design can eliminate much of the noise through filtering, the choice of modulation, and the selection of an optimum receiver site. However, there is one natural source of noise, called thermal noise, that cannot be eliminated. It is caused by the thermal motion of electrons in resistors, wires, etc.

Thermal Noise

Thermal noise n(t) can be described as a zero-mean Gaussian random process: a random function whose value n at an arbitrary time t is statistically characterized by the Gaussian pdf
p(n) = (1/(σ√(2π))) exp(−n²/(2σ²))
where σ² is the variance of n. The normalized Gaussian pdf is obtained with σ = 1.

For a random signal z = a + n (dc component a plus noise n), the pdf is
p(z) = (1/(σ√(2π))) exp(−(z − a)²/(2σ²))

Gaussian Distribution as the System Noise Model

Central limit theorem: the probability distribution of the sum of j statistically independent random variables approaches the Gaussian distribution as j → ∞, no matter what the individual distribution functions may be. Even though individual noise mechanisms might have other than Gaussian distributions, the aggregate of many such mechanisms will tend toward the Gaussian distribution.
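The theorem is easy to see empirically. A sketch using sums of j = 30 uniform random variables (decidedly non-Gaussian individually): after normalizing to unit variance, the fraction of sums within one standard deviation matches the Gaussian value of about 0.6827.

```python
import numpy as np

# CLT demonstration: sums of j independent uniforms look Gaussian.
rng = np.random.default_rng(0)
j = 30
s = rng.uniform(-0.5, 0.5, size=(1_000_000, j)).sum(axis=1)
s = s / np.sqrt(j / 12.0)                  # each uniform has variance 1/12
# For a standard Gaussian, P(|S| <= 1) ~ 0.6827; compare with the empirical value.
frac = np.mean(np.abs(s) <= 1.0)
print(abs(frac - 0.6827) < 0.01)           # True
```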

White Noise

When the noise power has a uniform spectral density (i.e. its PSD is the same for all frequencies of interest in most communication systems) we refer to it as white noise:
G_n(f) = N₀/2 [W/Hz]
R_n(τ) = F⁻¹{G_n(f)} = (N₀/2) δ(τ)

Any two different samples of white noise, no matter how close together in time they are taken, are uncorrelated. The average power of white noise is infinite:
P_n = ∫_{−∞}^{∞} (N₀/2) df = ∞

We assume that the system is corrupted by additive zero-mean white Gaussian noise.
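The "uncorrelated at every nonzero lag" property can be checked on discrete white Gaussian noise. A minimal sketch: the estimated autocorrelation is the variance at lag 0 and approximately zero elsewhere.

```python
import numpy as np

# Autocorrelation estimates of discrete zero-mean white Gaussian noise.
rng = np.random.default_rng(5)
n = rng.standard_normal(1_000_000)
R0 = np.mean(n * n)                        # lag 0: the noise variance, ~ 1
R1 = np.mean(n[:-1] * n[1:])               # lag 1: ~ 0
R5 = np.mean(n[:-5] * n[5:])               # lag 5: ~ 0
print(abs(R0 - 1.0) < 0.01, abs(R1) < 0.01, abs(R5) < 0.01)   # True True True
```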