Homework 3 (Stochastic Processes)

In the name of GOD.
Sharif University of Technology
Stochastic Processes (CE 695), Dr. H.R. Rabiee
Homework 3 (Stochastic Processes)

1. Explain why each of the following is NOT a valid autocorrelation function.

(a) R_X(τ) = e^{−τ} for τ ≥ 0 and R_X(τ) = e^{2τ} for τ < 0.
A real autocorrelation function must be even, but here R_X(−τ) ≠ R_X(τ) (for example R_X(1) = e^{−1} while R_X(−1) = e^{−2}).

(b) R_X(τ) = e^{τ} · I{τ ≥ 0}, i.e. e^{τ} for τ ≥ 0 and 0 for τ < 0.
There exists τ with R_X(0) < R_X(τ) (e.g. R_X(1) = e > 1 = R_X(0)), violating R_X(0) ≥ |R_X(τ)|.

(c) R_X(τ) = (1 + τ²)/(1 + τ⁴).
Again there exists τ with R_X(0) < R_X(τ): for 0 < |τ| < 1 we have τ² > τ⁴, so e.g. R_X(1/2) = 1.25/1.0625 ≈ 1.18 > 1 = R_X(0), violating R_X(0) ≥ |R_X(τ)|.

All of the above are considered as candidates for a continuous-time WSS process.

2. The discrete-time linear system shown in the figure below consists of one unit delay and a constant multiplier a (|a| < 1). The input to this system is white noise with average power σ². Find the spectral density and the average power of the output.

The delay-and-multiplier loop implements y(n) = x(n) + a y(n − 1), so h(n) = a^n u(n) and

H(e^{jω}) = 1/(1 − a e^{−jω}),   R_XX(n) = σ² δ(n)  ⟹  S_XX(e^{jω}) = σ²,

S_YY(e^{jω}) = σ² |H(e^{jω})|² = σ²/|1 − a e^{−jω}|² = σ²/(1 + a² − 2a cos ω),

R_YY(0) = E[Y²(n)] = σ² Σ_{n=0}^{∞} |h(n)|² = σ² Σ_{n=0}^{∞} a^{2n} = σ²/(1 − a²).
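A quick numerical sanity check of this result (a minimal Python sketch; the recursion y(n) = x(n) + a·y(n−1) is assumed to be the system in the figure, and a = 0.6, σ = 1 are illustrative values, not given in the problem):

    import numpy as np

    # Assumed system from the figure: y[n] = x[n] + a*y[n-1]  (unit delay + multiplier a, |a| < 1)
    a, sigma = 0.6, 1.0                    # illustrative values
    rng = np.random.default_rng(0)

    N = 200_000
    x = rng.normal(0.0, sigma, N)          # white noise with average power sigma^2

    y = np.empty(N)
    prev = 0.0
    for n in range(N):                     # first-order recursion through the unit delay
        prev = x[n] + a * prev
        y[n] = prev

    emp_power = np.mean(y[1000:] ** 2)     # discard the start-up transient
    theory = sigma**2 / (1 - a**2)         # R_YY(0) = sigma^2 / (1 - a^2)
    print(f"empirical output power approx {emp_power:.3f}, theory = {theory:.3f}")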

3. Let φ_1, φ_2, …, φ_n be iid samples from the uniform distribution U(−π, π), and consider the stochastic process

X(t) = Σ_{i=1}^{n} a_i sin((2πi/n) t + φ_i),

where the a_i are constant coefficients.

(a) Find R_X(t_1, t_2).
Because the phases are independent and uniform over a full period, the cross terms have zero mean and E[sin(A + φ_i) sin(B + φ_i)] = (1/2) cos(A − B), so

R_X(t_1, t_2) = Σ_{i=1}^{n} (a_i²/2) cos((2πi/n)(t_1 − t_2)).

(b) Is X(t) a WSS process?
E[X(t)] = E[ Σ_{i=1}^{n} a_i sin((2πi/n) t + φ_i) ] = 0, since each E[sin((2πi/n) t + φ_i)] = 0 for φ_i ~ U(−π, π). The mean is constant and, by part (a), R_X(t_1, t_2) depends only on t_1 − t_2, so X(t) is WSS.

(c) Consider an LTI system with impulse response h(t) = e^{−2t} u(t). If we apply X(t) as input to this system, find R_XY and R_YY.

H(ω) = 1/(jω + 2),

S_XX(ω) = Σ_{i=1}^{n} (π a_i²/2) [ δ(ω − 2πi/n) + δ(ω + 2πi/n) ],

S_XY(ω) = S_XX(ω) H(ω) = Σ_{i=1}^{n} (π a_i²/2) [ δ(ω − 2πi/n)/(j(2πi/n) + 2) + δ(ω + 2πi/n)/(−j(2πi/n) + 2) ],

S_YY(ω) = S_XY(ω) H(−ω) = S_XX(ω) |H(ω)|² = Σ_{i=1}^{n} (π a_i²/2) [ δ(ω − 2πi/n) + δ(ω + 2πi/n) ] / ((2πi/n)² + 4),

R_XY(τ) = F^{−1}{ S_XY(ω) },   R_YY(τ) = F^{−1}{ S_YY(ω) }.
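A small Monte Carlo check of the autocorrelation found in part (a) (a sketch; n = 4, the coefficients a_i, the time instants and the sample size are illustrative choices):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 4
    a = np.array([1.0, 0.5, 2.0, 0.3])     # illustrative coefficients a_1,...,a_n
    t1, t2 = 0.7, 0.2                      # arbitrary time instants
    M = 200_000                            # number of Monte Carlo realizations of the phases

    i = np.arange(1, n + 1)
    phi = rng.uniform(-np.pi, np.pi, size=(M, n))   # iid phases, U(-pi, pi)

    X_t1 = np.sum(a * np.sin(2 * np.pi * i / n * t1 + phi), axis=1)
    X_t2 = np.sum(a * np.sin(2 * np.pi * i / n * t2 + phi), axis=1)

    emp = np.mean(X_t1 * X_t2)                                       # estimate of R_X(t1, t2)
    theory = np.sum(a**2 / 2 * np.cos(2 * np.pi * i / n * (t1 - t2)))
    print(f"Monte Carlo estimate = {emp:.3f}, closed form = {theory:.3f}")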

(b) V ar(x) E[X 2 ] E[X] 2 E[ a a r ( R X (s r)dsdr m 2 C X (s r)dsdr a e bx dx + r X(s)X(r)dsds] m 2 e bx dx)dr ( b (e br + e b(t r) ))dr 2a b 2 (e bt ) + 2a bt C X (s r) + m 2 dsdr e b s r dsdr 6. Let X n (X+na mod ), n, ±, ±2,..., where X is an RV uniformly distributed on [, ] and a is a fixed irrational number. Show that X n is strictly stationary. Define I(p) where p is a composition as I(p) if p is true and I(p) if p is false. We have: F X (x,..., x n ; t,..., t n ) P (X(t ) x,..., X(t n ) x n ) P ((X + t a) mod x,..., (X + t n a) mod x n ) s s + I((X + t a) mod x,..., (X + t n a) mod x n )dx I((x + t a) mod x,..., (x + t n a) mod x n )dx I((x + t a) mod x,..., (x + t n a) mod x n )dx For any s. Setting s (ra) mod we have: F X (x,..., x n ; t,..., t n ) s + s I((x + ra + t a) mod x,..., (x + ra + t n a) mod x n )dx I((x + ra + t a) mod x,..., (x + ra + t n a) mod x n )dx I((x + (t + r)a) mod x,..., (x + (t n + r)a) mod x n )dx F X (x,..., x n ; t + r,..., t n + r) 7. Consider the random process X(t) as a stock index value at time t. Then S(t) X(t) X(t 5) is representative of the return if someone buys the stock at time t 5 and sells it at time t. Suppose S(t) is a normal WSS process with mean and autocorrelation function R(τ) 5exp( τ ). We 3

6. Let X_n = (X + na) mod 1, n = 0, ±1, ±2, …, where X is an RV uniformly distributed on [0, 1] and a is a fixed irrational number. Show that X_n is strictly stationary.

Define I(p) = 1 if the proposition p is true and I(p) = 0 if p is false. We have

F_X(x_1, …, x_n; t_1, …, t_n) = P(X_{t_1} ≤ x_1, …, X_{t_n} ≤ x_n)
= P( (X + t_1 a) mod 1 ≤ x_1, …, (X + t_n a) mod 1 ≤ x_n )
= ∫_0^1 I( (x + t_1 a) mod 1 ≤ x_1, …, (x + t_n a) mod 1 ≤ x_n ) dx
= ∫_s^{1+s} I( (x + t_1 a) mod 1 ≤ x_1, …, (x + t_n a) mod 1 ≤ x_n ) dx   for any s,

because the integrand is a periodic function of x with period 1. Now fix an integer shift r and set s = (ra) mod 1. Replacing the integration variable x by x + s (so the limits become 0 and 1) gives

F_X(x_1, …, x_n; t_1, …, t_n)
= ∫_0^1 I( (x + ra + t_1 a) mod 1 ≤ x_1, …, (x + ra + t_n a) mod 1 ≤ x_n ) dx
= ∫_0^1 I( (x + (t_1 + r) a) mod 1 ≤ x_1, …, (x + (t_n + r) a) mod 1 ≤ x_n ) dx
= F_X(x_1, …, x_n; t_1 + r, …, t_n + r).

All finite-dimensional distributions are invariant under time shifts, so X_n is strictly stationary.

7. Consider the random process X(t) as a stock index value at time t. Then S(t) = X(t) − X(t − 5) represents the return if someone buys the stock at time t − 5 and sells it at time t. Suppose S(t) is a normal WSS process with mean 0 and autocorrelation function R(τ) = 5 exp(−|τ|). We have observed that S(6) and S(7) are below −3; that is, we know there was a negative return for buying at times 6 − 5 = 1 and 7 − 5 = 2 and selling at times 6 and 7, respectively. We want to estimate the probability that S(10) is positive; in other words, whether we have any chance of profit if we buy the stock at time 10 − 5 = 5 and sell it at time 10. We have to compute P(S(10) > 0 | S(6) < −3, S(7) < −3).

(S(6), S(7), S(10)) ~ N( [0, 0, 0], Σ ),   where

Σ = 5 · [ 1       e^{−1}   e^{−4}
          e^{−1}  1        e^{−3}
          e^{−4}  e^{−3}   1      ]

P(S(10) > 0 | S(6) < −3, S(7) < −3)
= [ ∫_{−∞}^{−3} ∫_{−∞}^{−3} ∫_0^{∞} f_{S(6),S(7),S(10)}(x_1, x_2, x_3) dx_3 dx_2 dx_1 ] / [ ∫_{−∞}^{−3} ∫_{−∞}^{−3} f_{S(6),S(7)}(x_1, x_2) dx_2 dx_1 ].
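The conditional probability above has no simple closed form, but it can be estimated by Monte Carlo sampling from the trivariate normal distribution just derived (a sketch; the seed and sample size are arbitrary):

    import numpy as np

    rng = np.random.default_rng(2)

    # (S(6), S(7), S(10)) ~ N(0, Sigma) with Sigma[i, j] = 5 * exp(-|t_i - t_j|)
    t = np.array([6.0, 7.0, 10.0])
    Sigma = 5.0 * np.exp(-np.abs(t[:, None] - t[None, :]))

    M = 1_000_000
    samples = rng.multivariate_normal(mean=np.zeros(3), cov=Sigma, size=M)

    cond = (samples[:, 0] < -3) & (samples[:, 1] < -3)   # observed: S(6) < -3 and S(7) < -3
    prob = np.mean(samples[cond, 2] > 0)                 # fraction of those with S(10) > 0
    print(f"P(S(10) > 0 | S(6) < -3, S(7) < -3) approx {prob:.3f} "
          f"({cond.sum()} conditioning samples)")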

8. Suppose V is a random variable uniformly distributed on (0, 1). Define two stochastic processes X(t) = u(t − V) and Y(t) = δ(t − V). (Assume 0 ≤ t ≤ 1.)

a. Explain intuitively what X(t) and Y(t) describe.
X(t) indicates whether V ≤ t or V > t: if V ≤ t then X(t) = 1, otherwise X(t) = 0, so X(t) is a unit step that switches on at the random time V. Y(t) is a unit impulse located at the random time V: it is zero for every t ≠ V.

b. Evaluate the mean of X(t) and Y(t). (Assume 0 ≤ t ≤ 1.)
For a fixed t, X(t) is a discrete random variable with pmf

f_{X(t)}(α) = 1 − t for α = 0,   t for α = 1,   0 otherwise,

so E[X(t)] = 0·(1 − t) + 1·t = t.
For a fixed t, Y(t) is a random variable that is a function of V, so write it as Y_t(V). Then

E[Y_t] = ∫_0^1 Y_t(α) f_V(α) dα = ∫_0^1 δ(t − α) dα.

Setting β = t − α gives E[Y_t] = ∫_{t−1}^{t} δ(β) dβ = 1, since the unit impulse at β = 0 lies inside the integration interval.

c. Evaluate R_XX(t_1, t_2), R_XY(t_1, t_2) and R_YY(t_1, t_2). (Assume 0 ≤ t_1 ≤ 1 and 0 ≤ t_2 ≤ 1.)

R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = P{X(t_1) = 1, X(t_2) = 1} = P{V ≤ min{t_1, t_2}} = min{t_1, t_2}.

R_XY(t_1, t_2) = E[X(t_1) Y(t_2)] = E[u(t_1 − V) δ(t_2 − V)] = ∫_0^1 u(t_1 − α) δ(t_2 − α) dα.

Setting β = t_1 − α,

R_XY(t_1, t_2) = ∫_{t_1−1}^{t_1} u(β) δ(β − (t_1 − t_2)) dβ = ∫_0^{t_1} δ(β − (t_1 − t_2)) dβ = 1 if t_1 ≥ t_2 and 0 otherwise, i.e. R_XY(t_1, t_2) = u(t_1 − t_2).

If t_1 ≠ t_2 then R_YY(t_1, t_2) = E[δ(t_1 − V) δ(t_2 − V)] = E[0] = 0. If t_1 = t_2 then

R_YY(t_1, t_1) = E[δ²(t_1 − V)] = ∫_0^1 δ²(t_1 − α) dα = ∫_{t_1−1}^{t_1} δ²(β) dβ,

which is unbounded; formally, R_YY(t_1, t_2) = δ(t_1 − t_2). (A short simulation check of the first- and second-order moments of X(t) is sketched after Problem 9 below.)

9. Show that if the processes x(t) and y(t) are WSS and E{|x(0) − y(0)|²} = 0, then R_xx(τ) = R_xy(τ) = R_yy(τ). Hint: set z = x(t + τ), w = x*(t) − y*(t) and use Schwarz's inequality.

Writing z = |z| e^{jθ_z} and w = |w| e^{jθ_w},

|E[zw]| = | ∫∫ |z||w| e^{j(θ_z + θ_w)} f(z, w) dz dw | ≤ ∫∫ |z||w| |e^{j(θ_z + θ_w)}| f(z, w) dz dw = ∫∫ |z||w| f(z, w) dz dw = E[|z||w|],

so |E[zw]|² ≤ E²[|z||w|] ≤ E[|z|²] E[|w|²] (Cauchy-Schwarz inequality).

With z = x(t + τ) and w = x*(t) − y*(t),

E[zw] = E[x(t + τ) x*(t)] − E[x(t + τ) y*(t)] = R_xx(τ) − R_xy(τ),
E[|z|²] E[|w|²] = E[|x(t + τ)|²] E[|x*(t) − y*(t)|²] = E[|x(t + τ)|²] E[|x(t) − y(t)|²].

Setting t = 0, E[|x(t) − y(t)|²] = E[|x(0) − y(0)|²] = 0, therefore

|R_xx(τ) − R_xy(τ)|² ≤ 0  ⟹  R_xx(τ) = R_xy(τ).

The same argument applied to E[(x(t + τ) − y(t + τ)) y*(t)] shows R_xy(τ) = R_yy(τ).
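Returning to Problem 8 (as noted there), a short simulation check of the step-process moments E[X(t)] = t and R_XX(t_1, t_2) = min{t_1, t_2} (a sketch; the impulse process Y(t) is omitted since δ(·) cannot be sampled directly, and the points t_1, t_2 are arbitrary):

    import numpy as np

    rng = np.random.default_rng(3)
    M = 1_000_000
    V = rng.uniform(0.0, 1.0, M)            # V ~ U(0, 1)

    t1, t2 = 0.35, 0.8                      # arbitrary points in [0, 1]
    X_t1 = (V <= t1).astype(float)          # X(t) = u(t - V)
    X_t2 = (V <= t2).astype(float)

    print(f"E[X(t1)]     approx {X_t1.mean():.3f}  (theory: t1 = {t1})")
    print(f"E[X(t2)]     approx {X_t2.mean():.3f}  (theory: t2 = {t2})")
    print(f"R_XX(t1,t2)  approx {(X_t1 * X_t2).mean():.3f}  (theory: min(t1,t2) = {min(t1, t2)})")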