Signals and Spectra - Review

SIGNALS
- DETERMINISTIC: no uncertainty with respect to the value of the signal at any time; modeled by mathematical expressions.
- RANDOM: some degree of uncertainty before the signal occurs; described using the theory of random processes.
Periodic and Nonperiodic Signals

A signal x(t) is periodic in time if there exists a constant T_0 > 0 such that

x(t) = x(t + T_0), \quad -\infty < t < \infty

The smallest such value of T_0 is called the period of x(t).
Analog and Discrete Signals

An analog signal is a continuous function of time (e.g. speech). A discrete signal exists only at discrete times (e.g. a sampled continuous signal).
Energy and Power Signals

Let x(t) be a real- or complex-valued deterministic continuous-time signal.

The signal is called an energy signal if its energy E_x is finite, 0 < E_x < \infty:

E_x = \int_{-\infty}^{\infty} x^2(t)\,dt

The signal is called a power signal if its energy is infinite but its mean (average) power P_x is finite, 0 < P_x < \infty:

P_x = \lim_{T\to\infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\,dt
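As a numeric sanity check (a sketch added here, not part of the original slides), both definitions can be verified on simple signals; the decaying exponential and the unit-frequency cosine below are illustrative choices:

```python
import numpy as np

dt = 1e-4

# Energy signal: x(t) = e^{-t} for t >= 0 has energy
# E_x = integral of e^{-2t} from 0 to infinity = 1/2 (finite).
t = np.arange(0.0, 20.0, dt)
x = np.exp(-t)
E = np.sum(x**2) * dt            # ~ 0.5

# Power signal: x(t) = cos(2 pi t) has infinite energy but finite
# average power P_x = 1/2; approximate the limit with a long window T.
T = 100.0
t = np.arange(-T / 2, T / 2, dt)
x = np.cos(2 * np.pi * t)
P = np.sum(x**2) * dt / T        # ~ 0.5
```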
Spectral Density

PARSEVAL'S THEOREM FOR ENERGY SIGNALS

E_x = \int_{-\infty}^{\infty} x^2(t)\,dt = \frac{1}{2\pi} \int_{-\infty}^{\infty} |X(\omega)|^2\,d\omega = \int_{-\infty}^{\infty} |X(f)|^2\,df

\psi_x(f) = |X(f)|^2 — the energy spectral density (ESD): the distribution of energy with respect to frequency f.

PARSEVAL'S THEOREM FOR POWER SIGNALS

P_x = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x^2(t)\,dt = \sum_{n=-\infty}^{\infty} |c_n|^2

G_x(f) = \sum_{n=-\infty}^{\infty} |c_n|^2\,\delta(f - n f_0) — the power spectral density (PSD): the distribution of power with respect to frequency f.
Spectral Density - continued

E_x = \int_{-\infty}^{\infty} \psi_x(f)\,df = 2 \int_{0}^{\infty} \psi_x(f)\,df

P_x = \int_{-\infty}^{\infty} G_x(f)\,df = 2 \int_{0}^{\infty} G_x(f)\,df

Non-periodic power signals: the power spectral density is defined in the limiting sense. Let x_T(t) be a truncated version of the signal (i.e. an energy signal) with Fourier spectrum X_T(f). Then

G_x(f) = \lim_{T\to\infty} \frac{1}{T} |X_T(f)|^2
Example.

a) Find the average normalized power of the waveform x(t) = A\cos(2\pi f_0 t) using time averaging.
b) Repeat part (a) using the summation of spectral coefficients.
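A numerical sketch of both parts (the amplitude and frequency values are illustrative choices, not taken from the slides); each approach should give P = A^2/2:

```python
import numpy as np

A, f0 = 2.0, 5.0               # illustrative amplitude and frequency
T0 = 1.0 / f0

# (a) Time averaging over one full period:
#     P = (1/T0) * integral of A^2 cos^2(2 pi f0 t) over [0, T0] = A^2/2
t = np.linspace(0.0, T0, 10_000, endpoint=False)
P_time = np.mean((A * np.cos(2 * np.pi * f0 * t))**2)

# (b) Spectral coefficients: the only nonzero coefficients are
#     c_1 = c_{-1} = A/2, so P = sum |c_n|^2 = 2 * (A/2)^2 = A^2/2
P_spec = 2 * (A / 2)**2
```

With A = 2 both estimates equal A^2/2 = 2.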
Autocorrelation of an Energy Signal

R_x(\tau) = \int_{-\infty}^{\infty} x(t)\,x(t+\tau)\,dt, \quad -\infty < \tau < \infty

R_x(\tau) is a function of the time difference between the waveform and its shifted copy.

Properties:
- R_x(\tau) = R_x(-\tau) — an even function
- |R_x(\tau)| \le R_x(0) for all \tau — maximum value at the origin
- R_x(\tau) \leftrightarrow \psi_x(f) — autocorrelation and ESD form a Fourier pair
- R_x(0) = \int_{-\infty}^{\infty} x^2(t)\,dt — the value at the origin equals the energy of the signal
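These properties can be checked numerically; a minimal sketch using a unit rectangular pulse (an assumption made here for illustration), whose autocorrelation is the well-known triangle function:

```python
import numpy as np

# Rectangular pulse x(t) = 1 on [0, 1): its autocorrelation is the
# triangle R_x(tau) = 1 - |tau| for |tau| <= 1, with R_x(0) = E_x = 1.
dt = 1e-3
t = np.linspace(-2.0, 2.0, 4000, endpoint=False)
x = ((t >= 0.0) & (t < 1.0)).astype(float)

# R_x(tau) = integral of x(t) x(t+tau) dt, as a discrete correlation
R = np.correlate(x, x, mode="full") * dt
k0 = len(x) - 1                    # index of lag tau = 0

R0 = R[k0]                         # ~ 1.0, the pulse energy
is_even = np.allclose(R, R[::-1])  # True: R_x(-tau) = R_x(tau)
```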
Autocorrelation of a Power Signal

R_x(\tau) = \lim_{T\to\infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,x(t+\tau)\,dt, \quad -\infty < \tau < \infty

For periodic signals:

R_x(\tau) = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x(t)\,x(t+\tau)\,dt

Properties:
- R_x(\tau) = R_x(-\tau) — an even function
- |R_x(\tau)| \le R_x(0) for all \tau — maximum value at the origin
- R_x(\tau) \leftrightarrow G_x(f) — autocorrelation and PSD form a Fourier pair
- R_x(0) = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x^2(t)\,dt — the value at the origin equals the average power of the signal
Random Signals Random signals are encountered in all areas of signal processing. They appear as disturbances in the transmission of signals. Even the transmitted and consequently also the received signals in telecommunications are of random nature, because only random signals carry information.
Random Variables

A random variable X(A) represents the functional relationship between a random event A and a real number.

The distribution function F_X(x) is the probability that the value taken by the random variable X is less than or equal to a real number x:

F_X(x) = P(X \le x)

The probability density function (pdf) is its derivative:

p_X(x) = \frac{dF_X(x)}{dx}, \qquad P(x_1 \le X \le x_2) = \int_{x_1}^{x_2} p_X(x)\,dx
Ensemble Averages

mean value: m_X = E\{X\} = \int_{-\infty}^{\infty} x\,p_X(x)\,dx

n-th moment: E\{X^n\} = \int_{-\infty}^{\infty} x^n\,p_X(x)\,dx

mean-square value: E\{X^2\} = \int_{-\infty}^{\infty} x^2\,p_X(x)\,dx

variance: \sigma_X^2 = \mathrm{var}(X) = E\{(X - m_X)^2\} = \int_{-\infty}^{\infty} (x - m_X)^2\,p_X(x)\,dx

standard deviation: \sigma_X = \sqrt{E\{(X - m_X)^2\}}
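These ensemble averages can be estimated from samples; a sketch (the Gaussian distribution and its parameters below are illustrative assumptions) showing the relation E{X^2} = m_X^2 + sigma_X^2:

```python
import numpy as np

rng = np.random.default_rng(0)
# 10^6 samples of a Gaussian X with m_X = 1 and sigma_X = 2
X = rng.normal(loc=1.0, scale=2.0, size=1_000_000)

m = X.mean()                 # mean value            ~ 1
ms = np.mean(X**2)           # mean-square value     ~ m^2 + sigma^2 = 5
var = np.mean((X - m)**2)    # variance              ~ sigma^2 = 4
sigma = np.sqrt(var)         # standard deviation    ~ 2
```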
Random Processes

A random process X(A, t) can be viewed as a function of an event A and time t, with k sample functions of time x_1(t), x_2(t), ..., x_k(t):

- A fixed, t fixed: X(A, t) is a number
- A variable, t fixed: X(A, t) is a random variable
- A fixed, t variable: X(A, t) is a sample function
- A variable, t variable: X(A, t) is a random process
Random Processes - continued

The autocorrelation function of the random process X(t) is a function of two variables, t_1 and t_2, given by

R_X(t_1, t_2) = E\{X(t_1)\,X(t_2)\}

where X(t_1) and X(t_2) are random variables obtained by observing X(t) at times t_1 and t_2 respectively. The autocorrelation function is a measure of the degree to which two time samples of the same random process are related.
Stationarity of a Random Process

A random process is said to be wide-sense stationary (WSS) if its mean and autocorrelation function do not vary with a shift in the time origin:

m_X = \text{const}, \qquad R_X(t_1, t_2) = R_X(t_1 - t_2) = R_X(\tau)

Most useful results in communication theory are predicated on the random information signals and noise being wide-sense stationary. From a practical point of view, the random process need not be stationary for all time, but only for some observation interval of interest.
Autocorrelation of a Wide-Sense Stationary Random Process

R_X(\tau) = E\{X(t)\,X(t+\tau)\}, \quad -\infty < \tau < \infty

For a zero-mean WSS process, R_X(\tau) indicates the extent to which random values of the process separated by \tau seconds are statistically correlated. If R_X(\tau) changes slowly as \tau increases from 0 to some value, then sample values of X(t) taken at t = t_1 and t = t_1 + \tau are nearly the same (X(t) contains mostly low frequencies). If R_X(\tau) decreases rapidly as \tau increases, then those sample values are almost completely different (X(t) contains mostly high frequencies).
Properties of R_X(\tau) of a Real-Valued WSS Process

- R_X(\tau) = R_X(-\tau) — an even function
- |R_X(\tau)| \le R_X(0) for all \tau — maximum value at the origin
- R_X(\tau) \leftrightarrow G_X(f) — autocorrelation and PSD form a Fourier pair
- R_X(0) = E\{X^2(t)\} — the value at the origin equals the average power of the signal
Time Averaging and Ergodicity

A process is ergodic if its statistics can be determined by time averaging over a single sample function. The process is ergodic in the mean if

m_X = \lim_{T\to\infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,dt

and ergodic in the autocorrelation function if

R_X(\tau) = \lim_{T\to\infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,x(t+\tau)\,dt

In most communication systems the waveforms are assumed to be ergodic in the mean and in the autocorrelation function.
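A small sketch of ergodicity in the mean (the process below, a dc level plus white Gaussian noise, is an illustrative assumption): the time average over a single long realization recovers the ensemble mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# One sample function of X(t) = m + n(t), where n(t) is zero-mean
# white Gaussian noise and the ensemble mean is m = 3.
m_true = 3.0
x = m_true + rng.normal(0.0, 1.0, size=1_000_000)

# The time average over this single realization approaches m,
# so the process is ergodic in the mean.
m_time = x.mean()            # ~ 3.0
```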
Time Averaging and Ergodicity - continued

Engineering parameters related to the moments of an ergodic random process:
1. m_X — the dc level of the signal
2. m_X^2 — the normalized power in the dc component
3. E\{X^2(t)\} — the total average normalized power
4. \sqrt{E\{X^2(t)\}} — the rms value of the voltage or current signal
5. \sigma_X^2 — the average normalized power in the ac component of the signal
PSD and the Wiener-Khintchine Theorem

A random process can generally be classified as a power signal. Its PSD describes the distribution of the signal's power in the frequency domain, and it enables us to evaluate the signal power that will pass through a network having known frequency characteristics.

G_X(f) \ge 0, \qquad G_X(f) = G_X(-f)

R_X(\tau) \leftrightarrow G_X(f) — the Wiener-Khintchine theorem

P_X = \int_{-\infty}^{\infty} G_X(f)\,df
Example 3.

Consider a random process given by

x(t) = A\cos(2\pi f_0 t + \varphi)

where A and f_0 are constants and \varphi is a random variable uniformly distributed over (0, 2\pi). If x(t) is an ergodic process, the time averages of x(t) in the limit as T \to \infty are equal to the corresponding ensemble averages of x(t).

a) Use time averaging over an integer number of periods to calculate the approximations to the first and second moments of x(t).
b) Calculate the ensemble-average approximations to the first and second moments of x(t).
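A numerical sketch of both parts (the values of A, f_0, the window length and the observation time t_1 are illustrative choices): for this process the first moment is 0 and the second moment is A^2/2, whichever way it is computed.

```python
import numpy as np

rng = np.random.default_rng(2)
A, f0 = 1.0, 10.0

# (a) Time averages over one sample function (one fixed phase phi),
#     taken over an integer number of periods (10 here):
phi = rng.uniform(0, 2 * np.pi)
t = np.linspace(0.0, 1.0, 100_000, endpoint=False)
x = A * np.cos(2 * np.pi * f0 * t + phi)
m1_time = x.mean()            # first moment   ~ 0
m2_time = np.mean(x**2)       # second moment  ~ A^2/2

# (b) Ensemble averages at a fixed time t1, over many realizations of phi:
t1 = 0.123
phis = rng.uniform(0, 2 * np.pi, size=1_000_000)
xs = A * np.cos(2 * np.pi * f0 * t1 + phis)
m1_ens = xs.mean()            # ~ 0
m2_ens = np.mean(xs**2)       # ~ A^2/2, matching the time averages
```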
Noise in Communication Systems

The term noise refers to unwanted electrical signals present in electrical systems. The presence of noise superimposed on a signal tends to obscure or mask the signal, and it limits the rate of information transmission. Good engineering design can eliminate much of the noise through filtering, the choice of modulation and the selection of an optimum receiver site. However, there is one natural source of noise, called thermal noise, that cannot be eliminated. It is caused by the thermal motion of electrons in resistors, wires, etc.
Thermal noise

Thermal noise n(t) can be described as a zero-mean Gaussian random process: a random function whose value n at an arbitrary time t is statistically characterized by the Gaussian pdf

p(n) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left[-\frac{1}{2}\left(\frac{n}{\sigma}\right)^2\right]

where \sigma^2 is the variance of n. The normalized Gaussian pdf is obtained with \sigma = 1.

For a random signal z = a + n (a dc component a plus noise n):

p(z) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left[-\frac{1}{2}\left(\frac{z - a}{\sigma}\right)^2\right]
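The Gaussian pdf above can be checked numerically; a minimal sketch (sigma = 2 is an arbitrary illustrative value) verifying that it integrates to one and that its second moment equals the variance:

```python
import numpy as np

def gaussian_pdf(n, sigma):
    """Zero-mean Gaussian pdf: p(n) = exp(-n^2 / (2 sigma^2)) / (sigma sqrt(2 pi))."""
    return np.exp(-0.5 * (n / sigma)**2) / (sigma * np.sqrt(2 * np.pi))

sigma = 2.0
n = np.linspace(-20.0, 20.0, 200_001)   # +/- 10 sigma covers the tails
dn = n[1] - n[0]

area = np.sum(gaussian_pdf(n, sigma)) * dn        # ~ 1: pdf integrates to one
var = np.sum(n**2 * gaussian_pdf(n, sigma)) * dn  # ~ sigma^2 = 4
```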
Gaussian distribution as the system noise model

Central limit theorem: the probability distribution of the sum of j statistically independent random variables approaches the Gaussian distribution as j \to \infty, no matter what the individual distribution functions may be. Even though individual noise mechanisms might have other than Gaussian distributions, the aggregate of many such mechanisms will tend toward the Gaussian distribution.
White noise

When the noise power has a uniform spectral density (i.e. its PSD is the same for all frequencies of interest in most communication systems) we refer to it as white noise:

G_n(f) = \frac{N_0}{2} \quad [\mathrm{W/Hz}]

R_n(\tau) = \mathcal{F}^{-1}\{G_n(f)\} = \frac{N_0}{2}\,\delta(\tau)

Any two different samples of white noise, no matter how close together in time they are taken, are uncorrelated. The average power of white noise is infinite:

P_n = \int_{-\infty}^{\infty} \frac{N_0}{2}\,df = \infty

We assume that the system is corrupted by additive zero-mean white Gaussian noise.
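A discrete-time sketch of these properties (a simulation assumption: samples of bandlimited white Gaussian noise with per-sample power N_0/2, with N_0 = 2 chosen for illustration) showing that distinct samples are uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(3)
N0 = 2.0

# Discrete-time white Gaussian noise with variance (power) N0/2 per sample.
n = rng.normal(0.0, np.sqrt(N0 / 2), size=1_000_000)

R0 = np.mean(n * n)             # lag-0 autocorrelation ~ N0/2 = 1
R1 = np.mean(n[:-1] * n[1:])    # lag-1 autocorrelation ~ 0: samples uncorrelated
```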