3F1 Random Processes Examples Paper (for all 6 lectures)


1. Three factories make the same electrical component. Factory A supplies half of the total number of components to the central depot, while factories B and C each supply a quarter of the total number. It is known that 5% of components from factory A are defective, while 4% and 2% are defective from factories B and C, respectively.
Define a suitable Sample Space (set of possible outcomes) for this problem and mark these outcomes as regions on a Venn Diagram. Identify the regions "Made at Factory C" and "Component is faulty" on the Venn Diagram.
A randomly selected component at the depot is found to be faulty. What is the probability that the component was made by factory C?

2. In a digital communication system, the messages are encoded into the binary symbols 0 and 1. Because of noise in the system, the incorrect symbol is sometimes received. Suppose the probability of a 0 being transmitted is 0.4 and the probability of a 1 being transmitted is 0.6. Further suppose that the probability of a transmitted 0 being received as a 1 is 0.08 and the probability of a transmitted 1 being received as a 0 is 0.05. Find:
(a) The probability that a received 0 was transmitted as a 0.
(b) The probability that a received 1 was transmitted as a 1.
(c) The probability that any symbol is received in error.

3. A random variable has a Cumulative Distribution Function (CDF) given by:
F_X(x) = 0 for x < 0;  F_X(x) = 1 − e^(−x) for 0 ≤ x < ∞
Find:
(a) The probability that X > 0.5
(b) The probability that X ≤ 0.25
(c) The probability that 0.3 < X ≤ 0.7
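Questions 1 and 2 are direct applications of the total probability theorem and Bayes' rule, so they are easy to sanity-check numerically. The sketch below assumes the defect rates (5%, 4%, 2%) and channel probabilities (priors 0.4/0.6, crossover probabilities 0.08 and 0.05); these are the values consistent with the printed answers.

```python
# Sanity check of Q1 and Q2 via the total probability theorem and Bayes' rule.
# Numeric values are taken from the problem statements (assumed here).

# Q1: prior P(factory) and conditional P(faulty | factory)
p_factory = {"A": 0.50, "B": 0.25, "C": 0.25}
p_fault   = {"A": 0.05, "B": 0.04, "C": 0.02}

p_f = sum(p_fault[k] * p_factory[k] for k in p_factory)    # total P(faulty) = 0.04
p_c_given_f = p_fault["C"] * p_factory["C"] / p_f          # Bayes: P(C | faulty) = 1/8

# Q2: binary channel, priors and crossover probabilities
p_t0, p_t1 = 0.4, 0.6
p_r1_t0, p_r0_t1 = 0.08, 0.05

p_t0_r0 = (1 - p_r1_t0) * p_t0 / ((1 - p_r1_t0) * p_t0 + p_r0_t1 * p_t1)  # approx 0.925
p_t1_r1 = (1 - p_r0_t1) * p_t1 / ((1 - p_r0_t1) * p_t1 + p_r1_t0 * p_t0)  # approx 0.947
p_err   = p_r1_t0 * p_t0 + p_r0_t1 * p_t1                                 # 0.062

print(p_c_given_f, p_t0_r0, p_t1_r1, p_err)
```

The four printed values match the answers quoted at the end of the paper.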

4. Sketch the Probability Density Function (PDF) and Cumulative Distribution Function (CDF) for each of the following distributions (in parts (b) and (c) you will first need to calculate the normalising constants c, and you may use a computer for these plots if you wish). Determine the mean µ and standard deviation σ of each, and mark µ and µ ± σ on the PDF sketches. Determine also the modes (locations of maximum probability) for the continuous pdfs (b) and (c).
(a) Discrete Random Variable with three possible values: Pr{X = −0.5} = 1/5, Pr{X = 0} = 2/5 and Pr{X = +1} = 2/5
(b) Laplace Distribution: f_X(x) = c exp(−|x|), −∞ < x < +∞
(c) Rayleigh Distribution: f_X(x) = c x exp(−x²/2), 0 ≤ x < +∞

5. A real-valued random variable U is distributed as a Gaussian with mean zero and variance σ². A new random variable X = g(U) is defined as a function of U. Determine and sketch the probability density function of X when g(·) takes the following forms:
(a) g(U) = |U|
(b) g(U) = U²
(c) g(U) = U + a for U < −a; g(U) = 0 for −a ≤ U ≤ a; g(U) = U − a for U > a (a is a positive constant)

6. A darts player, aiming at the bullseye, measures the accuracy of shots in terms of x and y coordinates relative to the centre of the dartboard. It is found from many measurements that the x- and y-values of the shots are independent and normally distributed with mean zero and standard deviation σ. Write down the joint probability density function for the x and y measurements, and write down an integral expression which gives the probability that x lies between a and b and y lies between c and d.
Show that the Cumulative Distribution Function (CDF) for R, the radial distance of shots from the centre of the dartboard, is given by:
F_R(r) = 1 − exp(−r²/(2σ²)), 0 ≤ r < ∞
Determine the Probability Density Function for R. What type of distribution is this? If σ = 3 cm, what is the probability that the player's next shot hits the bullseye, which has a diameter of 2 cm?
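The normalising constants in Q4(b) and (c) follow from forcing unit area under each pdf, and the moments can be confirmed without calculus by a crude midpoint-rule integration. This is a sketch for checking purposes only; the integration range and step count are arbitrary choices.

```python
import math

def midpoint(f, a, b, n=100_000):
    """Midpoint-rule estimate of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Q4(b): Laplace f(x) = c exp(-|x|).  Unit area forces c = 0.5.
area_lap = midpoint(lambda x: math.exp(-abs(x)), -30.0, 30.0)              # = 2
c_lap = 1.0 / area_lap                                                     # = 0.5
var_lap = c_lap * midpoint(lambda x: x * x * math.exp(-abs(x)), -30.0, 30.0)  # = 2, so sigma = sqrt(2)

# Q4(c): Rayleigh f(x) = c x exp(-x^2/2).  Unit area forces c = 1.
area_ray = midpoint(lambda x: x * math.exp(-x * x / 2), 0.0, 10.0)         # = 1
mean_ray = midpoint(lambda x: x * x * math.exp(-x * x / 2), 0.0, 10.0)     # = sqrt(pi/2)
```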

7. Two random variables X and Y have a joint Probability Density Function given by:
f_XY(x, y) = kxy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 otherwise
Determine:
(a) The value of k which makes this a valid Probability Density Function.
(b) The probability of the event X ≤ 1/2 AND Y > 1/2.
(c) The marginal densities f_X(x) and f_Y(y).
(d) The conditional density f_{Y|X}(y|x).
(e) Whether X and Y are independent.

8. A particular Random Process {X(t)} has an ensemble which consists of four sample functions:
X(t, α1) = 1,  X(t, α2) = −1,  X(t, α3) = sin(2πt),  X(t, α4) = cos(2πt)
where {α1, α2, α3, α4} is the set of possible outcomes in some underlying Random Experiment, each of which occurs with equal probability. Determine the first order Probability Density Function (PDF) f_{X(t)}(x) and the expected value of the process E{X(t)}. Is the process Strict Sense Stationary?

9. For each of the following random processes, sketch a few members of the ensemble, determine the expected values and autocorrelation functions of the processes, and state which of them are wide sense stationary (WSS):
(a) X(t) = A, where A is uniformly distributed between 0 and 1.
(b) X(t) = cos(2πf₀t + Φ), where f₀ is fixed and Φ is uniformly distributed between 0 and φmax. This process is in general non-stationary. Are there any values of φmax for which the process is WSS?
(c) X(t) = A cos(2πf₀t + Φ), where f₀ is fixed, Φ is uniformly distributed between 0 and 2π, A is uniformly distributed between 0 and 1, and Φ and A are statistically independent.
Is the process in part (a) Mean Ergodic?

10. Show the following results for a wide-sense stationary (WSS), real-valued random process {X(t)} with autocorrelation function r_XX(τ) and power spectrum S_X(ω):
(a) r_XX(−τ) = r_XX(+τ);
(b) If {X(t)} represents a random voltage across a 1 Ω resistance, the average power dissipated is P_av = r_XX(0);
(c) S_X(ω) = S_X(−ω);
(d) S_X(ω) is real-valued;
(e) P_av = (1/(2π)) ∫ S_X(ω) dω.

11. A white noise process {X(t)} is a Wide Sense Stationary, zero mean process with autocorrelation function:
r_XX(τ) = σ² δ(τ)
where δ(τ) is the delta-function centred on τ = 0 whose area is unity and width is zero. Sketch the Power Spectrum for this process.
A sample function X(t) from such a white noise process is applied as the input to a linear system whose impulse response is h(t). The output is Y(t). Derive expressions for the output autocorrelation function r_YY(τ) = E[Y(t) Y(t + τ)] and the cross-correlation function between the input and output, r_XY(τ) = E[X(t) Y(t + τ)]. Hence obtain an expression for the output power spectrum S_Y(ω) in terms of σ² and the frequency response H(ω) of the linear system.
If the process is correlation ergodic, suggest in block diagram form a scheme for measurement of r_XY(τ). What is a possible application for such a scheme?

12. It is desired to predict a future value X(t + T) of a WSS random process {X(t)} from the current value X(t) using the formula:
X̂(t + T) = c X(t)
where c is a constant to be determined.
If the process has autocorrelation function r_XX(τ), show that the value of c which leads to minimum mean squared error between X(t + T) and X̂(t + T) is
c = r_XX(T) / r_XX(0)
Hence obtain an expression for the expected mean squared error in this case. If {X(t)} has non-zero mean, suggest an improved formula for X̂(t + T), giving reasons.

Nick Kingsbury

Tripos questions: This 3F1 course has been given since 2003 and is similar in content to the Random Signals course from the old third-year paper E4; questions on random processes and probability from E4 Tripos papers up to 2002 will also be relevant.

Answers:
1. 1/8
2. (a) 0.925 (b) 0.947 (c) 0.062
3. (a) 0.607 (b) 0.221 (c) 0.244
4. (a) µ = 0.3; σ = 0.6. (b) c = 0.5; µ = 0; σ = √2 ≈ 1.414; mode 0. (c) c = 1; µ = √(π/2) ≈ 1.2533; σ = √(2 − π/2) ≈ 0.655; mode 1.
5. (a) f_X(x) = √(2/(πσ²)) exp(−x²/(2σ²)) for 0 ≤ x < +∞; 0 elsewhere.
(b) f_X(x) = (1/(σ√(2πx))) exp(−x/(2σ²)) for 0 < x < +∞; 0 elsewhere.
(c) f_X(x) = (1/√(2πσ²)) exp(−(x − a)²/(2σ²)) for −∞ < x < 0; (2Φ(a/σ) − 1) δ(x) for x = 0; (1/√(2πσ²)) exp(−(x + a)²/(2σ²)) for 0 < x < ∞; where Φ(x) = ∫_{−∞}^{x} (1/√(2π)) exp(−u²/2) du.
6. f_R(r) = (r/σ²) exp(−r²/(2σ²)) for 0 ≤ r < ∞; 0 elsewhere. Pr(bullseye) ≈ 0.054.
7. (a) 4 (b) 3/16 (c) f_X(x) = 2x for 0 ≤ x ≤ 1, 0 elsewhere; f_Y(y) = 2y for 0 ≤ y ≤ 1, 0 elsewhere (d) f_{Y|X}(y|x) = 2y for 0 ≤ y ≤ 1, 0 elsewhere
8. f_{X(t)}(x) = (1/4)[δ(x − 1) + δ(x + 1) + δ(x − sin(2πt)) + δ(x − cos(2πt))];
E[X(t)] = (1/4)[sin(2πt) + cos(2πt)]

9. (a) E[X(t)] = 1/2; E[X(t1)X(t2)] = 1/3. WSS.
(b) E[X(t)] = [sin(ω₀t + φmax) − sin(ω₀t)] / φmax;
E[X(t1)X(t2)] = [sin(ω₀(t1 + t2) + 2φmax) − sin(ω₀(t1 + t2)) + 2φmax cos(ω₀(t1 − t2))] / (4φmax).
Not WSS unless φmax = 2nπ.
(c) E[X(t)] = 0; E[X(t1)X(t2)] = (1/6) cos(ω₀(t1 − t2)). WSS.
11. S_X(ω) = σ²; r_YY(τ) = σ² h(τ) ∗ h(−τ); r_XY(τ) = σ² h(τ); S_Y(ω) = σ² |H(ω)|² (∗ is convolution).
12. Minimum MSE = r_XX(0) − r²_XX(T)/r_XX(0).

3F1 Random Processes Examples Paper — Solutions

1. Sample space: {AF, AF′, BF, BF′, CF, CF′}, where {A, B, C} are the factories and {F, F′} are the faulty / not-faulty states. On the Venn diagram the six outcomes appear as disjoint regions (which may be drawn with areas proportional to their probabilities); {CF, CF′} together form the region "Made at Factory C", and {AF, BF, CF} form the region "Component is faulty".
Probability that a faulty component was made at C:
P(C|F) = P(F|C) P(C) / P(F) = P(F|C) P(C) / [P(F|A) P(A) + P(F|B) P(B) + P(F|C) P(C)]
= (0.02/4) / (0.05/2 + 0.04/4 + 0.02/4) = 0.005 / (0.025 + 0.01 + 0.005) = 0.005/0.04 = 1/8

2. Let {t0, t1} and {r0, r1} be the pairs of transmit and receive states:
P(t0) = 0.4, P(t1) = 0.6 and P(r1|t0) = 0.08, P(r0|t1) = 0.05
(a) P(t0|r0) = P(r0|t0) P(t0) / P(r0) = (1 − P(r1|t0)) P(t0) / [(1 − P(r1|t0)) P(t0) + P(r0|t1) P(t1)]
= (1 − 0.08)(0.4) / [(1 − 0.08)(0.4) + (0.05)(0.6)] = 0.368 / (0.368 + 0.03) = 0.925
(b) P(t1|r1) = P(r1|t1) P(t1) / P(r1) = (1 − P(r0|t1)) P(t1) / [(1 − P(r0|t1)) P(t1) + P(r1|t0) P(t0)]
= (1 − 0.05)(0.6) / [(1 − 0.05)(0.6) + (0.08)(0.4)] = 0.57 / (0.57 + 0.032) = 0.947

(c) P(error) = P(r1|t0) P(t0) + P(r0|t1) P(t1) = 0.08 × 0.4 + 0.05 × 0.6 = 0.062

3. (a) Pr{X > 0.5} = 1 − F_X(0.5) = e^(−0.5) = 0.607
(b) Pr{X ≤ 0.25} = F_X(0.25) = 1 − e^(−0.25) = 0.221
(c) Pr{0.3 < X ≤ 0.7} = F_X(0.7) − F_X(0.3) = e^(−0.3) − e^(−0.7) = 0.244

4. (a) E[X] = Σᵢ xᵢ pᵢ = (−0.5)(1/5) + (0)(2/5) + (1)(2/5) = 0.3
E[X²] = Σᵢ xᵢ² pᵢ = (0.25)(1/5) + (0)(2/5) + (1)(2/5) = 0.45
σ² = E[X²] − (E[X])² = 0.45 − 0.09 = 0.36, so σ = 0.6
[Figures: pdf and cdf of the discrete RV, with µ − σ, µ and µ + σ marked on the pdf.]
(b) The normalising constant c must make the area under the pdf unity:
∫ f_X(x) dx = ∫ c e^(−|x|) dx = 2c ∫₀^∞ e^(−x) dx = 2c [−e^(−x)]₀^∞ = 2c  ∴ c = 0.5
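The three probabilities in Q3 follow in one line each from the CDF. The snippet assumes F_X(x) = 1 − e^(−x) for x ≥ 0, the form consistent with the printed answers.

```python
import math

# CDF of Q3 (assumed form: unit-rate exponential)
def F(x):
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

p_a = 1.0 - F(0.5)         # P(X > 0.5)        = e^(-0.5) = 0.607
p_b = F(0.25)              # P(X <= 0.25)      = 0.221
p_c = F(0.7) - F(0.3)      # P(0.3 < X <= 0.7) = 0.244
```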

[Figures: pdf and cdf of the Laplace distribution, with µ − σ, µ and µ + σ marked on the pdf.]
E[X] = 0, from the symmetry of f_X.
σ² = E[X²] = ∫ x² f_X(x) dx = 2c ∫₀^∞ x² e^(−x) dx. Using integration by parts twice:
∫₀^∞ x² e^(−x) dx = [−x² e^(−x)]₀^∞ + 2 ∫₀^∞ x e^(−x) dx = 0 + 2([−x e^(−x)]₀^∞ + ∫₀^∞ e^(−x) dx) = 2 [−e^(−x)]₀^∞ = 2
∴ σ² = 2c × 2 = 2, so σ = √2. The mode is where f_X is maximum, i.e. at x = 0 (by inspection).

(c) The normalising constant c must make the area under the pdf unity:
∫ f_X(x) dx = ∫₀^∞ c x e^(−x²/2) dx = c ∫₀^∞ e^(−u) du (with u = x²/2) = c [−e^(−u)]₀^∞ = c  ∴ c = 1
[Figures: pdf and cdf of the Rayleigh distribution, with µ − σ, µ and µ + σ marked on the pdf.]

E[X] = ∫₀^∞ x f_X(x) dx = ∫₀^∞ x (x e^(−x²/2)) dx = √2 ∫₀^∞ u^(1/2) e^(−u) du = √2 Γ(3/2) = √(π/2) ≈ 1.2533
E[X²] = ∫₀^∞ x² (x e^(−x²/2)) dx = 2 ∫₀^∞ u e^(−u) du = 2 Γ(2) = 2
σ² = E[X²] − (E[X])² = 2 − π/2, so σ ≈ 0.655
The mode is where the gradient of the pdf is zero:
d f_X / dx = e^(−x²/2) − x (x e^(−x²/2)) = 0 when x² = 1. Hence the mode is at x = 1.

5. pdf of U: f_U(u) = (1/√(2πσ²)) e^(−u²/(2σ²))
cdf of U: F_U(u) = ∫_{−∞}^{u} f_U(α) dα = 1 − Q(u/σ), where Q(x) = ∫_x^∞ (1/√(2π)) e^(−α²/2) dα is the Gaussian Error Integral function.
(a) X = g(U) = |U|:
F_X(x) = P({u : |u| ≤ x}) = P({u : −x ≤ u ≤ x}) = F_U(x) − F_U(−x) for 0 ≤ x < ∞
= F_U(x) − [1 − F_U(x)] = 2 F_U(x) − 1
∴ f_X(x) = (d/dx) F_X(x) = 2 (d/dx) F_U(x) = 2 f_U(x) = √(2/(πσ²)) e^(−x²/(2σ²)) for 0 ≤ x < ∞
[Figures: pdfs of X = g(U) = |U| and X = g(U) = U², plotted against x/σ.]

(b) X = g(U) = U²:
F_X(x) = P({u : u² ≤ x}) = P({u : −√x ≤ u ≤ √x}) = F_U(√x) − F_U(−√x) = 2 F_U(√x) − 1 for 0 ≤ x < ∞
∴ f_X(x) = (d/dx) F_X(x) = 2 (d/dx) F_U(√x) = (1/√x) f_U(√x) = (1/(σ√(2πx))) e^(−x/(2σ²)) for 0 < x < ∞

(c) X = g(U) = U + a for U < −a; 0 for −a ≤ U ≤ a; U − a for U > a.
Consider the three regions separately:
i. U < −a, X < 0:
F_X(x) = P({u : u + a ≤ x}) = P({u : u ≤ x − a}) = F_U(x − a)
∴ f_X(x) = f_U(x − a) = (1/√(2πσ²)) e^(−(x − a)²/(2σ²)) for x < 0
ii. −a ≤ U ≤ a, X = 0:
Pr{X = 0} = P({u : −a ≤ u ≤ a}) = F_U(a) − F_U(−a) = 2 F_U(a) − 1
This is a probability mass at x = 0, so f_X(x) = [2 F_U(a) − 1] δ(x) = [1 − 2 Q(a/σ)] δ(x) at x = 0
iii. U > a, X > 0:
F_X(x) = P({u : u − a ≤ x}) = P({u : u ≤ x + a}) = F_U(x + a)
∴ f_X(x) = f_U(x + a) = (1/√(2πσ²)) e^(−(x + a)²/(2σ²)) for x > 0
[Figure: pdf of X = g(U) for the limiter of part (c), plotted against x/σ.]
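The probability mass at x = 0 in part (c) is 2F_U(a) − 1, which equals erf(a/(σ√2)). A quick Monte Carlo draw confirms this; the value a = 1.5σ below is an assumed example, not from the paper.

```python
import math, random

# Monte Carlo check of Pr{X = 0} = Pr{-a <= U <= a} = 2*Phi(a/sigma) - 1
# for the dead-zone limiter of Q5(c).  a = 1.5*sigma is an assumed example.
random.seed(4)
sigma, a, n = 1.0, 1.5, 200_000
hits = sum(1 for _ in range(n) if -a <= random.gauss(0.0, sigma) <= a)
mass_mc = hits / n
mass_exact = math.erf(a / (sigma * math.sqrt(2)))   # 2*Phi(1.5) - 1 = 0.8664
```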

6. X and Y each have a zero-mean Gaussian pdf given by
f_X(x) = (1/√(2πσ²)) e^(−x²/(2σ²)) and f_Y(y) = (1/√(2πσ²)) e^(−y²/(2σ²))
Since X and Y are independent, the joint pdf is given by
f_X,Y(x, y) = f_X(x) f_Y(y) = (1/(2πσ²)) e^(−(x² + y²)/(2σ²))
Hence
Pr{a < X ≤ b, c < Y ≤ d} = ∫_{x=a}^{b} ∫_{y=c}^{d} f_X,Y(x, y) dy dx = ∫_a^b ∫_c^d (1/(2πσ²)) e^(−(x² + y²)/(2σ²)) dy dx
In polar coordinates x = r cos θ and y = r sin θ, so the above expression converts to
Pr{r1 < R ≤ r2, θ1 < θ ≤ θ2} = ∫_{r=r1}^{r2} ∫_{θ=θ1}^{θ2} (1/(2πσ²)) e^(−r²/(2σ²)) r dθ dr
To get the cdf for R, we integrate over all angles (θ1 = −π to θ2 = π) and from r1 = 0 to r2 = r:
F_R(r) = Pr{0 < R ≤ r, −π < θ ≤ π} = ∫₀^r (2π/(2πσ²)) r′ e^(−r′²/(2σ²)) dr′ = [−e^(−r′²/(2σ²))]₀^r = 1 − e^(−r²/(2σ²)) for 0 ≤ r < ∞
Differentiating gives f_R(r) = (r/σ²) e^(−r²/(2σ²)). This is a Rayleigh distribution.
For a 2 cm diameter bullseye, the maximum radius is 1 cm, and if σ = 3 cm:
Pr{bullseye} = Pr{R ≤ 1 cm} = F_R(1) = 1 − e^(−1/(2 × 9)) = 0.054
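The Rayleigh result is easy to verify by simulation: draw independent Gaussian x and y offsets and count how often the radial distance falls inside the bullseye. A hedged sketch using the paper's σ = 3 cm and 1 cm bullseye radius:

```python
import math, random

# Monte Carlo check of Q6: x, y ~ N(0, sigma^2) with sigma = 3 cm,
# bullseye radius 1 cm; compare against F_R(1) = 1 - exp(-1/(2*9)).
random.seed(0)
sigma, radius, n = 3.0, 1.0, 200_000
hits = sum(1 for _ in range(n)
           if math.hypot(random.gauss(0, sigma), random.gauss(0, sigma)) <= radius)
estimate = hits / n
exact = 1 - math.exp(-radius**2 / (2 * sigma**2))   # 0.054
```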

7. (a) For a valid pdf:
∫₀¹ ∫₀¹ kxy dx dy = k [x²/2]₀¹ [y²/2]₀¹ = k/4 = 1  ∴ k = 4
(b) Pr{X ≤ 0.5, Y > 0.5} = ∫_{x=0}^{0.5} ∫_{y=0.5}^{1} 4xy dy dx = ∫₀^{0.5} x [2y²]_{0.5}^{1} dx = ∫₀^{0.5} (3x/2) dx = (3/2)[x²/2]₀^{0.5} = 3/16
(c) f_X(x) = ∫₀¹ 4xy dy = [2xy²]_{y=0}^{1} = 2x for 0 ≤ x ≤ 1; 0 for x < 0 and x > 1
f_Y(y) = 2y for 0 ≤ y ≤ 1 (by symmetry); 0 for y < 0 and y > 1
(d) f_{Y|X}(y|x) = f_XY(x, y)/f_X(x) = 4xy/(2x) = 2y for 0 ≤ y ≤ 1
(e) f_X(x) f_Y(y) = 2x × 2y = 4xy = f_XY(x, y). Therefore X and Y are independent, by definition.

8. For any particular t, X(t) can take one of the 4 values X(t, αᵢ), each with equal probability. Hence:
f_{X(t)}(x) = (1/4)[δ(x − 1) + δ(x + 1) + δ(x − sin(2πt)) + δ(x − cos(2πt))]
∴ E[X(t)] = ∫ x f_{X(t)}(x) dx = (1/4)[1 − 1 + sin(2πt) + cos(2πt)] = (1/4)[sin(2πt) + cos(2πt)]
Since f_{X(t)}(x) and E[X(t)] depend on t, the process is not SSS.
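The double integrals in Q7(a) and (b) can be checked with a midpoint Riemann sum over the unit square. This is a verification sketch only; k = 4 is taken from the answer above.

```python
# Midpoint double Riemann sum over [0,1] x [0,1] for f(x,y) = 4xy.
n = 400
h = 1.0 / n
pts = [(i + 0.5) * h for i in range(n)]

total = sum(4 * x * y for x in pts for y in pts) * h * h                 # should be 1
p_event = sum(4 * x * y for x in pts if x <= 0.5
                        for y in pts if y > 0.5) * h * h                 # should be 3/16
```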

9. (a) X(t) = A:
E[X(t)] = ∫ X(t, a) f_A(a) da = ∫₀¹ a da = [a²/2]₀¹ = 1/2
r_XX(t1, t2) = E[X(t1) X(t2)] = ∫ X(t1, a) X(t2, a) f_A(a) da = ∫₀¹ a² da = 1/3
These are independent of t, hence the process is WSS.
(b) X(t) = cos(ω₀t + Φ), with ω₀ = 2πf₀:
E[X(t)] = ∫ X(t, φ) f_Φ(φ) dφ = (1/φmax) ∫₀^{φmax} cos(ω₀t + φ) dφ = [sin(ω₀t + φ)]₀^{φmax} / φmax
= [sin(ω₀t + φmax) − sin(ω₀t)] / φmax
r_XX(t1, t2) = E[X(t1) X(t2)] = (1/φmax) ∫₀^{φmax} cos(ω₀t1 + φ) cos(ω₀t2 + φ) dφ
= (1/(2φmax)) ∫₀^{φmax} [cos(ω₀(t1 + t2) + 2φ) + cos(ω₀(t1 − t2))] dφ
= (1/(2φmax)) [ (1/2) sin(ω₀(t1 + t2) + 2φ) ]₀^{φmax} + (1/2) cos(ω₀(t1 − t2))
= [sin(ω₀(t1 + t2) + 2φmax) − sin(ω₀(t1 + t2)) + 2φmax cos(ω₀(t1 − t2))] / (4φmax)
The mean and r_XX depend on t, so the process is not in general WSS. However if φmax = 2nπ (integer n), the mean is zero and hence independent of t. If φmax = nπ, r_XX = (1/2) cos(ω₀(t1 − t2)), which depends only on τ = t1 − t2. If both of the above are satisfied then the process is WSS. Hence the process is not WSS unless φmax = 2nπ (integer n).

(c) X(t) = A cos(ω₀t + Φ): the ensemble is now any sample function from (a) times any from (b). Think now of a multivariate sample space which represents pairs {A, Φ}. Since A and Φ are independent:
f_{A,Φ}(a, φ) = f_A(a) f_Φ(φ)
Hence
E[X(t)] = ∫∫ X(t, a, φ) f_{A,Φ}(a, φ) da dφ = ∫∫ a cos(ω₀t + φ) f_A(a) f_Φ(φ) da dφ
= (∫₀¹ a da) (1/(2π)) ∫₀^{2π} cos(ω₀t + φ) dφ = (1/2)(1/(2π))[sin(ω₀t + 2π) − sin(ω₀t)] = 0
Similarly, using the integrals from (a) and (b):
E[X(t1) X(t2)] = ∫∫ a² cos(ω₀t1 + φ) cos(ω₀t2 + φ) f_A(a) f_Φ(φ) da dφ
= (∫₀¹ a² da) (1/(2π)) ∫₀^{2π} cos(ω₀t1 + φ) cos(ω₀t2 + φ) dφ = (1/3)(1/2) cos(ω₀(t1 − t2)) = (1/6) cos(ω₀τ), τ = t1 − t2
Hence this process is WSS.

Is the process in (a) Mean Ergodic? The time average is the mean over t for a single realisation a:
E_t[X(t, a)] = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} X(t, a) dt = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} a dt = lim_{T→∞} (2Ta/(2T)) = a
The ensemble average is the mean over A for a single time instant t:
E_A[X(t, a)] = ∫ X(t, a) f_A(a) da = ∫₀¹ a da = 1/2, from (a)
These two averages are not the same, so the process in (a) is not ergodic (even though it is stationary).
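The ergodicity conclusion can be illustrated numerically: for X(t) = A with A uniform on [0, 1], the time average of a single realisation is just its own constant a, while the ensemble average is E[A] = 1/2. A minimal sketch:

```python
import random

# Q9(a): X(t) = A, with A ~ Uniform[0, 1].
random.seed(1)
a = random.random()        # one realisation: X(t) = a for all t
time_average = a           # X(t) is constant in t, so its time average is a itself

# Ensemble average: mean of A over many independent realisations
ensemble_average = sum(random.random() for _ in range(100_000)) / 100_000
```

With this seed the single realisation's time average sits well away from 1/2, so the two averages disagree: the process is stationary but not mean ergodic.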

3F Random Processes Examples Paper Solutions. (a) r XX (τ) E[X(t) X(t + τ)] E[X(t + τ) X(t)] Let t t + τ: E[X(t ) X(t τ)] r XX ( τ) (b) Instantaneous power V /R X (t)/ Average power, P av E[X (t)] r XX () (c) S X (ω) Let τ τ: from (a): S X ( ω) r XX (τ) e jωτ dτ r XX ( τ ) e jωτ dτ r XX (τ ) e jωτ dτ (d) S X (ω) τ τ in st integral: from (a): r XX (τ) e jωτ dτ r XX (τ) e jωτ dτ + r XX ( τ) e jωτ dτ + r XX (τ) [e jωτ + e jωτ ] dτ r XX (τ) cos(ωτ) dτ All terms in the integral are real, so the result is real. r XX (τ) e jωτ dτ r XX (τ) e jωτ dτ Real (e) From (b): P av r XX () S X (ω) e jω dω S X (ω) dω π π

3F Random Processes Examples Paper Solutions. Power Spectrum: S X (ω) r XX (τ) e jωτ dτ σ δ(τ) e jωτ dτ σ e jω σ Hence the power spectrum is a constant, σ, over all frequencies ω. For the linear system: Y (t) h(t) X(t) h(β) X(t β) dβ (convolution) Autocorrelation function: r Y Y (τ) E[Y (t)y (t + τ)] [( ) ( E h(β )X(t β ) dβ )] h(β )X(t + τ β ) dβ [ E ] h(β ) h(β ) X(t β ) X(t + τ β ) dβ dβ h(β ) h(β ) E [X(t β ) X(t + τ β )] dβ dβ h(β ) h(β ) r XX (τ + β β ) dβ dβ h(β ) h(β ) σ δ(τ + β β ) dβ dβ σ h(β τ) h(β ) dβ σ h( τ) h(τ) Cross-correlation function: r XY (τ) E[X(t)Y (t + τ)] [ ( )] E X(t) h(β)x(t + τ β) dβ [ ] E h(β) X(t) X(t + τ β) dβ h(β) E [X(t) X(t + τ β)] dβ h(β) r XX (τ β) dβ h(β) σ δ(τ β) dβ σ h(τ) Taking Fourier Transforms of the above expression for r Y Y (τ): since H (ω) is the transform of h( τ). S Y (ω) σ H (ω) H(ω) σ H(ω)

If the process is correlation ergodic, the correlations may be estimated from averages over time:
r_XY(τ) ≈ lim_{T→∞} (1/(2T)) ∫_{−T}^{T} X(t) Y(t + τ) dt ≈ (1/(2T)) ∫_{−T}^{T} X(t − τ) Y(t) dt
for some suitably large choice of T.
The integration over 2T can be approximated by a lowpass filter with an impulse response whose main lobe is of duration roughly 2T between its half-amplitude points. The block diagram should therefore show X(t) being passed through a delay of τ, then multiplied by Y(t), and the result being passed through the lowpass filter to produce r̂_XY(τ), which is the estimate of r_XY(τ). This is known as a correlometer. If the delay τ is varied slowly compared with the 2T response time of the filter, then the full function r̂_XY(τ) can be obtained. The main application of this scheme is system identification, in which the impulse response of the system is estimated from ĥ(τ) = r̂_XY(τ)/σ².

12. Instantaneous error: ε(t) = X(t + T) − X̂(t + T) = X(t + T) − c X(t)
Mean square error (MSE):
P_ε = E[ε²(t)] = E[X²(t + T) − 2c X(t + T) X(t) + c² X²(t)]
= (1 + c²) E[X²(t)] − 2c E[X(t + T) X(t)] (since X is WSS)
= (1 + c²) r_XX(0) − 2c r_XX(T)
Differentiating to find the minimum:
dP_ε/dc = 2c r_XX(0) − 2 r_XX(T) = 0 at the minimum  ∴ c_min = r_XX(T)/r_XX(0)
So the minimum MSE is:
P_ε,min = (1 + c_min²) r_XX(0) − 2 c_min r_XX(T)
= (1 + r²_XX(T)/r²_XX(0)) r_XX(0) − 2 r²_XX(T)/r_XX(0)
= r_XX(0) + r²_XX(T)/r_XX(0) − 2 r²_XX(T)/r_XX(0) = r_XX(0) − r²_XX(T)/r_XX(0)
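A discrete-time sketch of the correlometer / system-identification idea: pass pseudo-white noise through a short FIR filter, estimate r_XY(τ) by a time average of x(t) y(t + τ), and recover ĥ(τ) = r̂_XY(τ)/σ². The filter taps below are an illustrative assumption, not from the paper.

```python
import random

# White noise (sigma^2 = 1) through an assumed FIR filter h; the
# input-output cross-correlation should approximate sigma^2 * h[tau].
random.seed(2)
n = 200_000
h = [1.0, 0.5, -0.25]                     # assumed example impulse response
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [0.0] * n
for t in range(len(h) - 1, n):
    y[t] = sum(h[k] * x[t - k] for k in range(len(h)))

def r_xy(tau):
    """Time-average estimate of E[x(t) y(t + tau)] -- the correlometer output."""
    ts = range(len(h) - 1, n - tau)
    return sum(x[t] * y[t + tau] for t in ts) / len(ts)

h_est = [r_xy(tau) for tau in range(len(h))]   # estimated impulse response
```

Because σ² = 1 here, the cross-correlation estimate is directly the impulse-response estimate, which is exactly the system-identification application described above.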

To see the effect of a non-zero mean, assume that X(t) is zero-mean, and that the non-zero-mean process is Y(t) = X(t) + µ. Hence
r_YY(τ) = E[Y(t + τ) Y(t)] = E[(X(t + τ) + µ)(X(t) + µ)] = r_XX(τ) + µ²
Using the given formula to predict Y(t + T) from Y(t) results in an MSE of
P_ε,min = r_YY(0) − r²_YY(T)/r_YY(0) = (r_YY(0) + r_YY(T))(r_YY(0) − r_YY(T))/r_YY(0)
= (r_XX(0) + µ² + r_XX(T) + µ²)(r_XX(0) − r_XX(T))/(r_XX(0) + µ²)
= (1 + (r_XX(T) + µ²)/(r_XX(0) + µ²)) (r_XX(0) − r_XX(T))
Because r_XX(T) < r_XX(0), we see that the addition of µ² to r_XX will always increase P_ε,min if the original predictor is used. Hence we get the minimum MSE if our predictor is applied to a zero-mean process. To optimally predict the non-zero-mean process Y(t), we should therefore subtract the mean µ and then apply the predictor to the zero-mean remainder X(t), to give:
Ŷ(t + T) − µ = c_min (Y(t) − µ), or Ŷ(t + T) = c_min Y(t) + (1 − c_min) µ
where c_min = r_XX(T)/r_XX(0) = (r_YY(T) − µ²)/(r_YY(0) − µ²)
This will then leave P_ε,min unaffected by changes in µ, the mean component of Y.
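The Q12 predictor can be checked by simulation on a discrete first-order autoregressive process (an assumed example, not from the paper): x[t+1] = ρ x[t] + w[t] has r(τ) proportional to ρ^|τ|, so the optimal one-step gain should come out as c = r(1)/r(0) = ρ, and the empirical prediction error should match r(0) − r²(1)/r(0).

```python
import random

# AR(1) test process: x[t+1] = rho*x[t] + w[t], w ~ N(0, 1) (assumed example).
random.seed(3)
rho, n = 0.8, 200_000
x = [0.0]
for _ in range(n):
    x.append(rho * x[-1] + random.gauss(0.0, 1.0))
x = x[1000:]                                   # discard start-up transient

r0 = sum(v * v for v in x) / len(x)                                   # r_XX(0)
r1 = sum(x[t] * x[t + 1] for t in range(len(x) - 1)) / (len(x) - 1)   # r_XX(1)
c = r1 / r0                                    # optimal gain, should be close to rho

mse = sum((x[t + 1] - c * x[t]) ** 2 for t in range(len(x) - 1)) / (len(x) - 1)
predicted_mse = r0 - r1 * r1 / r0              # formula derived in Q12
```

For this process the minimum MSE is just the innovation variance (1 here), which both the empirical and formula values reproduce.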