Lecture 7 Random Signal Analysis


7.1 Introduction to Probability
7.2 Amplitude Distributions
7.3 Uniform, Gaussian, and Other Distributions
7.4 Power and Power Density Spectra
7.5 Properties of the Power Spectrum
7.6 Power Spectral Estimation
7.7 Data Windows in Spectral Estimation
7.8 The Cross-Power Spectrum
7.9 Algorithms
7.10 Exercises

Note: For this chapter, the material in the textbook is not enough. Please refer to Chapter 3 of reference [1] or the attached chapter from Kay (2006).

Random Signals

In the first part of the course we discussed deterministic signals, which are precisely determined by their signal models (functions). The future values of a deterministic signal can be predicted exactly from its past values. In practice, most measurements are random, evolving in an unpredictable manner. A signal is called a random signal if its values cannot be predicted exactly from its past values. The study of random signals requires background in probability and statistics.

[Figure: an ensemble of waveforms X(t, ζ1), ..., X(t, ζ5), one for each outcome ζ in the sample space S.]

(a) Waveform of F-16 noise. (b) Histogram and theoretical probability density function.

1. Basics of Probability

Probability describes the uncertainty in the outcome of an event. Intuitively, probabilities are numbers assigned to events that indicate how likely it is that the event will occur when a random experiment is performed. For example: P(fever is brought on by a cold) = 0.6. A single performance of the experiment is called a trial. The sample space S is the set of all possible outcomes of a trial. Particular subsets of S are called events. The probability law for a random experiment is the rule that assigns probabilities to the events of the experiment.

Examples (from reference [1])

Two dice are cast and the total number of spots on the sides that are up is counted. The sample space is S = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12} and contains a finite number of outcomes.

A fair coin is flipped successively at random until the first head is observed. If we let ζ denote the number of flips of the coin required, then the sample space is S = {1, 2, 3, ...}, which consists of an infinite, but countable, number of outcomes.

In general, if an event A occurs N(A) times in N trials of an experiment, the relative frequency of the event is given by

$$f_N(A) = \frac{N(A)}{N} = \frac{\text{number of occurrences of event } A}{\text{total number of trials}}$$
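The relative-frequency definition is easy to check numerically. The script below is a minimal sketch (my own, not from the text) that simulates N rolls of two dice and compares the relative frequency of a total of 7 with the exact probability 6/36:

    % Relative frequency of the event A = "total of two dice is 7".
    N = 100000;                      % number of trials
    d = randi(6, N, 2);              % each row is one roll of two dice
    A = (sum(d, 2) == 7);            % indicator of event A in each trial
    fN = sum(A) / N;                 % relative frequency f_N(A)
    fprintf('f_N(A) = %.4f, exact P(A) = %.4f\n', fN, 6/36);

As N grows, f_N(A) converges to P(A) = 6/36 ≈ 0.1667.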

1.1 Probability axioms

Given a finite sample space S and an event A in S, define P(A) as the probability of A. Then:

a. $P(A) \ge 0$ for each event A in S.
b. $P(S) = 1$.
c. $P(A \cup B) = P(A) + P(B)$ if A and B are mutually exclusive events in S.

1.2 More properties on probability

Joint probability: if A and B are random variables, then the joint probability function is

$$P(a, b) = P(A = a, B = b)$$

Conditional probability: the probability of A conditioned on B is defined as

$$P(A \mid B) = \frac{P(A, B)}{P(B)}$$

Product rule: from the definition of conditional probability, the product rule is

$$P(A, B) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A)$$

Chain rule: the chain rule is an extension of the product rule, which we can write in more generic form as

$$P(a_1, a_2, \ldots, a_n) = P(a_n \mid a_{n-1}, \ldots, a_1)\,P(a_{n-1} \mid a_{n-2}, \ldots, a_1) \cdots P(a_2 \mid a_1)\,P(a_1)$$

1.3 Bayes Rule

Bayes' rule is an alternative method to calculate the conditional probability when the joint probability P(A, B) is unknown. Bayes' rule is

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$
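A small numeric check with illustrative numbers (my own, not from the text): given a prior P(A), a likelihood P(B|A), and a false-alarm probability P(B|~A), the denominator P(B) follows from the law of total probability and the posterior from Bayes' rule:

    % Bayes' rule with illustrative numbers.
    PA    = 0.01;                     % prior P(A)
    PBgA  = 0.95;                     % likelihood P(B|A)
    PBgnA = 0.05;                     % false alarm P(B|~A)
    PB    = PBgA*PA + PBgnA*(1 - PA); % total probability P(B)
    PAgB  = PBgA*PA / PB;             % posterior P(A|B)
    fprintf('P(A|B) = %.4f\n', PAgB); % about 0.16 despite the 0.95 likelihood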

1.4 Cumulative Distribution Function

In probability and statistics, a random variable is a variable whose value results from a measurement. A random variable X can be defined by its cumulative distribution function F_X(x), which is the probability that a realization of X is smaller than or equal to x:

$$F_X(x) = P(X \le x)$$

or, in terms of the probability density,

$$F_X(a) = P(X \le a) = \int_{-\infty}^{a} p_X(x)\,dx$$

As a consequence, if X is a real variable, $F_X(+\infty) = 1$ and $F_X(-\infty) = 0$.

1.5 Probability Density Function

A random variable is fully defined by its probability density function (pdf). A random variable X has a probability density p_X(x) if the probability that a realization of X lies in an interval [x, x+dx] is equal to p_X(x) dx. The probability density is the derivative of the cumulative distribution function:

$$F_X(a) = P(X \le a) = \int_{-\infty}^{a} p_X(x)\,dx, \qquad p_X(x) = \frac{dF_X(x)}{dx}, \qquad \int_{-\infty}^{+\infty} p_X(x)\,dx = 1$$
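The CDF/pdf relationship can be visualized directly; the sketch below (my own, using only built-in Matlab functions) compares the empirical CDF of Gaussian samples with the theoretical N(0, 1) CDF written in terms of erf:

    % Empirical CDF of N(0,1) samples vs. the theoretical CDF.
    N  = 10000;
    x  = randn(1, N);                     % N(0,1) samples
    xs = sort(x);
    Femp = (1:N) / N;                     % empirical CDF at the sorted samples
    Fth  = 0.5*(1 + erf(xs / sqrt(2)));   % theoretical N(0,1) CDF
    plot(xs, Femp, 'b', xs, Fth, 'r--');
    xlabel('x'); ylabel('F_X(x)'); legend('empirical', 'theoretical');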

Probability Function P(x) and Distribution Function F(x)

[Figure: P_X(x) and F_X(x) for a discrete random variable; density p(x) for a continuous random variable.]

2. Statistical Properties of Signals

Two kinds of properties describe a signal: how the signal's amplitude is distributed, and how its spectral content is distributed. All signals can be described in terms of statistical properties; random signals can be described only in terms of statistical properties. A signal is stationary if its statistical properties (amplitude and frequency distributions) do not change with time.

Continuous and Discrete Amplitude Distributions

[Figure: a continuous waveform x(t) and its integer samples x_k; T = 1.]

Continuous: density p(x), with $\int_{-\infty}^{\infty} p(x)\,dx = 1$.
Discrete: probabilities $P_n = \Pr\{x = n\}$, with $\sum_{n=-\infty}^{\infty} P_n = 1$.

Frequency Function

$$f_n = \frac{\#\{\text{samples equal to } n\}}{N} \approx P_n$$

Frequency distribution of [x_n] on the previous slide (N = 100):
x=[-3,,,,3,,-,-,3,5,4,3,3,,-3,-,,-,-4,-,,,-3,-3,-,-,-,-,-4,-,,,-,,,-,-3,-,-3,-5,-4,-,,,-,-5,-3,,,,-,-,-3,,,,-,-3,-,,,3,,,,-,-5,-3,,,-5,-,5,5,-,-,,,,,-,,4,4,,,,,,,4,3,-,-,,,-,-3,,3,]

[Figure: amplitude distribution f(x) of x.]

Functions Used to Display f_n = #{samples equal to n}/N

stem(-5:5,f) was used to make the plot on the previous slide.
hist(y,x) plots a histogram of y with bins centered over x.
bar(x,f) plots a bar chart of f with bars centered over x.

[Figure: hist example (# occurrences of y) and bar example (f(x)).]

Mean, Variance, and Standard Deviation

Mean value of any function y(x) of x, including x itself:

Continuous: $E[y] = \int_{-\infty}^{\infty} y(x)\,p(x)\,dx$; $\quad \mu_x = E[x] = \int_{-\infty}^{\infty} x\,p(x)\,dx$

Discrete: $E[y] = \sum_{n=-\infty}^{\infty} y(x_n)\,P_n$; $\quad \mu_x = E[x] = \sum_{n=-\infty}^{\infty} x_n P_n$

Variance of x:

$$\sigma_x^2 = E[(x - \mu_x)^2] = E[x^2] - \mu_x^2$$

The standard deviation, σ_x, is the standard measure of the average deviation of x from its mean value, μ_x.

Properties of μ and σ:

$$\mu_{ax+b} = a\,\mu_x + b, \qquad \sigma_{ax+b}^2 = a^2\,\sigma_x^2$$
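These linearity properties are easy to confirm on random data; a minimal sketch (my own, not from the text):

    % Verify mu_{ax+b} = a*mu_x + b and sigma^2_{ax+b} = a^2*sigma_x^2.
    x = randn(1, 100000);             % sample vector
    a = 3; b = -2;
    y = a*x + b;
    fprintf('mean(y) = %.3f, a*mean(x)+b = %.3f\n', mean(y), a*mean(x) + b);
    fprintf('var(y)  = %.3f, a^2*var(x)  = %.3f\n', var(y), a^2*var(x));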

Properties of a Uniform Random Function

[Figure: continuous uniform density p(x) = 1/(b-a) on [a, b]; discrete uniform probabilities P_n = 1/N.]

$$\mu_x = \frac{b + a}{2}, \qquad \sigma_x^2 = \frac{(b - a)^2}{12}$$

Properties of a Normal Random Function

[Figure: continuous density p(x) with peak 0.4/σ at μ and points μ-σ, μ+σ marked; discrete probabilities P_n.]

$$p(x) = N(\mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}\,e^{-\frac{(x - \mu)^2}{2\sigma^2}}$$

Matlab function erf computes

$$\operatorname{erf}(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-y^2}\,dy$$

Gaussian Probability Computation

It is usually easier to use function erfx: y = erfx(mu,sigma,x). erfx is the probability that a normally distributed random variate with mean mu and standard deviation sigma lies in the range {mu, mu+x}.

Amplitude distribution of a -bit A-D converter: Pr = erfx(,6,5) = .977

[Figure: density p(x) of the converter output.]

Matlab Vectors with Random Elements

Vector length = N; mean = 0; standard deviation = s.

Uniform: x = s*sqrt(12)*(rand(1,N) - 0.5);
Normal: x = s*randn(1,N);

[Figure: example frequency functions f_n for a uniform sample (σ = 4.33) and a Gaussian sample, N = 10,000.]
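The factor sqrt(12) appears because a uniform variate on [-0.5, 0.5] has standard deviation 1/sqrt(12); both generators can be checked numerically (a minimal sketch, my own):

    % Both vectors should have mean ~0 and standard deviation ~s.
    N = 100000;  s = 4;
    xu = s*sqrt(12)*(rand(1,N) - 0.5);   % uniform with std s
    xg = s*randn(1,N);                   % Gaussian with std s
    fprintf('uniform:  mean = %.3f, std = %.3f\n', mean(xu), std(xu));
    fprintf('gaussian: mean = %.3f, std = %.3f\n', mean(xg), std(xg));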

More Examples of Amplitude Distributions

[Figure: x(t) = 7.5 sin 4πt and its amplitude distribution f_n; a grey-scale image and its amplitude distribution.]

3. Power and Power Density

Suppose x is the sample vector of a long stationary waveform. The average power in x is

$$\overline{x^2} = \frac{1}{N}\sum_{n=0}^{N-1} x_n^2 = \frac{1}{N^2}\sum_{m=0}^{N-1} |X_m|^2$$

where X is the DFT of x. The relationship

$$\sum_{n=0}^{N-1} x_n^2 = \frac{1}{N}\sum_{m=0}^{N-1} |X_m|^2$$

is known as Parseval's theorem. It proves that the periodogram,

$$P_{xx}(m) = \frac{|X_m|^2}{N},$$

is a measure of the power density of x.
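Parseval's theorem, and hence the power interpretation of the periodogram, can be verified in a few lines (a minimal sketch, my own):

    % Verify Parseval's theorem and the periodogram power relation.
    N = 1024;
    x = randn(1, N);                 % any sample vector
    X = fft(x);
    Pxx = abs(X).^2 / N;             % periodogram
    fprintf('time-domain avg power: %.6f\n', mean(x.^2));
    fprintf('freq-domain avg power: %.6f\n', sum(Pxx) / N);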

Periodogram of a Periodic Signal

$$\overline{x^2} = \frac{1}{N}\sum_{n=0}^{N-1} x_n^2 = \frac{1}{N}\sum_{m=0}^{N-1} P_{xx}(m) = .93$$

in this example.

[Figure: x(t) and x_n vs. time index n; P_m vs. frequency index m.]

Power Density Spectrum

[Figure: P(ν) vs. frequency ν (Hz-s).]

$$\text{Average power} = \int_{-0.5}^{0.5} P(\nu)\,d\nu = .93$$

If the power density is plotted versus Hz instead of Hz-s, then the frequency axis is scaled by 1/T, and so P must be scaled by T in order to preserve the correct average power (integral). Similar changes apply with other frequency measures.

Power Density Spectrum (Cont.)

Scales change depending on units: power per Hz-s versus power per Hz.

[Figure: P(ν) vs. ν (Hz-s); P(f) vs. f (Hz).]

Total power in both plots:

$$\int_{-0.5}^{0.5} P(\nu)\,d\nu = \int_{-f_s/2}^{f_s/2} P(f)\,df = .93$$

If the power density is plotted versus Hz instead of Hz-s, then the frequency axis is scaled by 1/T, and so P must be scaled by T in order to preserve the correct average power (integral). Similar changes are necessary with other frequency measures.

Properties of the Power Spectrum

The periodogram is the DFT of the autocorrelation function. Given a sample vector x,

$$\text{DFT}\{\varphi_{xx}(k)\} = P_{xx}(m); \qquad m = 0, 1, \ldots, N-1$$

Thus, the autocorrelation function and the power spectrum convey the same information about a signal in different forms. The amplitude distribution conveys a different kind of information about the signal.
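For finite N this identity holds exactly with the circular (periodic) autocorrelation, which is how the sketch below (my own) checks it:

    % DFT of the circular autocorrelation equals the periodogram.
    N = 256;
    x = randn(1, N);
    X = fft(x);
    Pxx = abs(X).^2 / N;             % periodogram
    phi = ifft(abs(X).^2) / N;       % circular autocorrelation phi_xx(k)
    err = max(abs(fft(phi) - Pxx));  % should be near machine precision
    fprintf('max |DFT{phi_xx} - P_xx| = %.2e\n', err);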

Examples of Autocorrelation Functions and Power Spectra

[Figure: x(t), φ_xx(k), and P(ν) for a sinusoidal waveform and for a white Gaussian waveform.]

Estimating the Power Spectrum

Suppose x(t) is a stationary random signal, x is a sample vector of length N, and P_xx is the periodogram of x. Since P_xx has N/2 independent components, increasing N increases the detail in the estimated power spectrum but does not decrease the variance of the estimate.

The next slide illustrates how the variance of the estimated power spectrum may be reduced by averaging periodograms of different segments of x(t).

White Gaussian Signal, Avg. Power = 1 v²/Hz-s

Single periodograms are useless, but the average of 8 disjoint periodograms (blue plot) is quite accurate.

[Figure: single periodograms P(ν) for two record lengths and the averaged P(ν); power density (v²/Hz-s) vs. ν (Hz-s).]

(red) Complete sampled waveform; N = 1024.
(blue) Partitioned into two disjoint segments; N = 512.
(green) Partitioned into five overlapping segments; N = 512.

[Figure: the waveform x_k and the segments x_1k through x_5k; k = 128, 256, ..., 1024.]

Rules for Estimating Power Spectra

1. Decide on the largest acceptable frequency resolution, Δf Hz.
2. The corresponding segment length, N = 1/(Δf T) = 1/Δν, should be << the overall length of the signal sample vector.
3. Use overlapping segments in the periodogram average. The more overlap, the better the estimate.

You may use function [P,nsgmts,v]=pds(x,N,windo,overlap) to implement periodogram averaging with overlapping segments (see the sketch below for the idea behind it). Enter help pds to see how to use the function. An example illustrating the use of pds is shown next.
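pds is a course-supplied function; in case it is not at hand, the core of periodogram averaging with overlapping segments looks roughly like this sketch (my own simplified version, without the data-window argument):

    % Averaged periodogram over overlapping segments (simplified pds).
    function [P, nsgmts, v] = pds_sketch(x, N, overlap)
      x = x(:).';                               % work with a row vector
      step   = max(1, round(N*(1 - overlap)));  % hop size between segments
      starts = 1:step:(length(x) - N + 1);
      nsgmts = numel(starts);
      P = zeros(1, N);
      for s = starts
        seg = x(s:s+N-1);
        P = P + abs(fft(seg)).^2 / N;           % accumulate periodograms
      end
      P = P(1:N/2+1) / nsgmts;                  % average; keep v in [0, 0.5]
      v = (0:N/2) / N;                          % frequency axis in Hz-s
    end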

Examples of the Use of Overlapping Segments

White Gaussian noise x = N(0,1) → bandpass filter → y

1024 samples of signal y are generated by passing a white Gaussian random signal through a bandpass filter. Thus, theoretically, the power spectrum of y is

$$P_y(\nu) = |H(j2\pi\nu)|^2,$$

where H(j2πν) is the amplitude gain of the bandpass filter. On the next slide, the theoretical spectrum is the blue plot. The segment size is constant at N = 64.

Examples of the Use of Overlapping Segments (Cont.)

[Figure: estimated power density P_y(ν) vs. ν (Hz-s) for overlap = 0% (Ns = 16), 50% (Ns = 31), and 95% (Ns = 321).]

Increasing the Accuracy of the Estimate by Decreasing the Segment Size (N)

(Signal length K = 1024; overlap = 95%)

[Figure: estimated power density P_y(ν) vs. ν (Hz-s) for N = 64, 32, and 16.]

Using Data Windows to Improve Spectral Estimates

These data windows are available in pds and pds2: Boxcar, Tapered, Hanning, Hamming, Blackman, and Tent.

[Figure: the six window shapes w_k.]
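A data window tapers each segment toward zero at its ends to reduce spectral leakage; to keep the power estimate unbiased, the window should be normalized to unit mean-square gain. A minimal sketch (my own, building the Hanning window directly rather than relying on a toolbox):

    % Windowed periodogram of one segment, Hanning window.
    N  = 64;
    k  = (0:N-1)';
    w  = 0.5*(1 - cos(2*pi*k/N));    % Hanning window
    w  = w / sqrt(mean(w.^2));       % normalize to unit mean-square gain
    seg = randn(N, 1);               % one data segment
    Pw  = abs(fft(w .* seg)).^2 / N; % windowed periodogram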

Examples of the Use of Data Windows (K = 4096; N = 64; 95% overlap)

[Figure: estimated power density P_y(ν) vs. ν (Hz-s) with rectangular, Hanning, and Blackman windows.]

The Cross-Power Spectrum

The cross-power spectrum of two signals is similar to the power spectrum. It is given in terms of the cross-periodogram,

$$P_{xy}(m) = \frac{1}{N} X_m^* Y_m; \qquad m = 0, 1, \ldots, N-1$$

As before, P_xy is the DFT of the cross-correlation vector, that is,

$$\text{DFT}\{\varphi_{xy}\} = P_{xy}$$

Furthermore, just as $\varphi_{xx}(0) = E[x^2]$ is the average power in x,

$$\varphi_{xy}(0) = \frac{1}{N}\sum_{m=0}^{N-1} P_{xy}(m)$$

is the average cross-power in x and y.
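The cross-periodogram and the average cross-power identity can be checked directly (a minimal sketch, my own; the 0.7/0.3 mixing is illustrative):

    % Cross-periodogram and average cross-power.
    N = 256;
    x = randn(1, N);
    y = 0.7*x + 0.3*randn(1, N);       % y partly coherent with x
    Pxy  = conj(fft(x)) .* fft(y) / N; % cross-periodogram P_xy(m)
    phi0 = mean(x .* y);               % phi_xy(0): average cross-power
    fprintf('phi_xy(0) = %.4f, (1/N)*sum(P_xy) = %.4f\n', ...
            phi0, real(sum(Pxy)) / N);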

The Cross-Power Spectrum (Cont.)

Sometimes the real part of P_xy provides the distribution of physical power, as when x is voltage and y is current, or when x is force and y is speed. For example, when x and y are sinusoidal and in phase, like A sin ωt and B sin ωt, P_xy is real and represents real physical power. When x and y are sinusoidal and out of phase, like A sin ωt and B cos ωt, P_xy is imaginary.

The cross-power spectrum is also useful in the coherence function, which we examine next.

Magnitude-Squared Coherence (MSC)

The MSC function is a normalized version of P_xy:

$$\Gamma_{xy}(\nu) = \frac{|P_{xy}(\nu)|^2}{P_{xx}(\nu)\,P_{yy}(\nu)}$$

Sometimes it is used to decide whether two detectors are receiving the same signal s, as in the following situation:

[Diagram: a source s feeds two channels; H_1(ν) plus noise n_1 gives x, and H_2(ν) plus noise n_2 gives y.]

Magnitude-Squared Coherence (Cont.)

If n_1 and n_2 are independent, an MSC value Γ_xy(ν) close to 1 indicates signal power from the same source arriving at detectors x and y at frequency ν. Furthermore, if the MSC is close to 1 at all frequencies, then the cross-power spectrum magnitude, |P_xy|, may be used as a scaled measure of source power density.

An example is given on the next slide (text p. 93) in which $H_1(z) = 1$ and $H_2(z) = z^{-d}$; that is, the source is simply delayed on one channel relative to the other. The signal-to-noise ratio (SNR) in the example is

$$\text{SNR} = \frac{\text{source power}}{\text{noise power}} = .3$$
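Estimating the MSC requires averaging the spectra over segments; with single periodograms the estimate is identically 1. A minimal sketch of the delayed-channel setup (my own; segment length, delay, and noise levels are illustrative):

    % MSC estimate for x = s + n1, y = delayed s + n2.
    K = 8192;  N = 64;  d = 5;
    s = randn(1, K);
    x = s + 1.8*randn(1, K);                       % detector 1
    y = [zeros(1, d), s(1:K-d)] + 1.8*randn(1, K); % detector 2: delayed s
    Pxx = zeros(1, N);  Pyy = zeros(1, N);  Pxy = zeros(1, N);
    for k0 = 1:N:(K - N + 1)                       % disjoint segments
      X = fft(x(k0:k0+N-1));  Y = fft(y(k0:k0+N-1));
      Pxx = Pxx + abs(X).^2;  Pyy = Pyy + abs(Y).^2;
      Pxy = Pxy + conj(X).*Y;
    end
    msc = abs(Pxy).^2 ./ (Pxx .* Pyy);             % Gamma_xy in [0, 1]
    plot((0:N/2)/N, msc(1:N/2+1));
    xlabel('\nu (Hz-s)');  ylabel('\Gamma_{xy}(\nu)');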

Power Spectra Based on 5 Samples

[Figure: power densities P_ss(ν), P_xx(ν), P_yy(ν), and |P_xy(ν)| vs. ν (Hz-s).]

Matlab Functions

[f,xmin,xmax]=freq(x) computes the amplitude distribution of an integer vector or array x.
[P,nsgmts,v]=pds(x,N,windo,overlap) computes the power density spectrum of x in v = [0, .5]. N is the segment size, windo designates the data window, and overlap is in the range (0, 1).
[P,nsgmts,v]=pds2(x,y,N,windo,overlap) computes the cross-pds of x and y in v = [0, .5]. N, windo, and overlap are as above.
hist(x,b) plots a histogram of x with bins centered at the elements of b.
stem(x,y) is like plot(x,y), with vertical stems.
bar2(v,P,color) is the same as Matlab function bar, except that the bars touch each other, as on the previous slide.

Exercises