Communication Theory II


Lecture 8: Stochastic Processes
Ahmed Elnakib, PhD
Assistant Professor, Mansoura University, Egypt
March 5th, 2015

Lecture Outlines
o Stochastic processes
  What is a stochastic process?
  Types:
    Stationary vs. nonstationary processes
    Strictly stationary vs. weakly stationary processes (e.g., ergodic processes)
  Parameters:
    Mean, correlation, covariance
    Power spectral density
o Transmission of a weakly stationary process through an LTI system
o Poisson and Gaussian processes

Introduction to Stochastic Processes
o The received signal at the output of a wireless channel varies randomly with time
  Processes of this kind are said to be random, or stochastic
  [Block diagram: Source → Encoder → Channel → Decoder, carrying a signal with random changes]
o It is not possible to predict the exact value of a signal drawn from a stochastic process
  However, it is possible to characterize the process in terms of statistical parameters such as average power, correlation, and power spectra

Stochastic Processes
o A stochastic process may be represented by a sample space, or ensemble, composed of functions of time
o Each sample point of the sample space pertaining to a stochastic process is a function of time
o The totality of sample points, corresponding to the aggregate of all possible realizations of the stochastic process, forms this ensemble
o A single realization of a process is a random waveform that evolves across time
  Each realization of the process is associated with a sample point
[Figure: set (ensemble) of sample functions]

Stochastic Processes (cont'd)
o Consider a stochastic process specified by:
  outcomes s observed from some sample space S
  events defined on the sample space S
  probabilities of these events
o Suppose we assign to each sample point s a function of time in accordance with the rule
  X(t, s_j), -T ≤ t ≤ T
  where 2T is the total observation interval
o For a fixed sample point s_j, the graph of the function X(t, s_j) is called a realization or sample function of the stochastic process
  To simplify notation, we denote this sample function by X_j(t) = X(t, s_j), -T ≤ t ≤ T

Stochastic Processes and R.V.s
o A realization or sample function of the stochastic process: X_j(t) = X(t, s_j), -T ≤ t ≤ T
o At a particular instant of time, we deal with a random variable sampled (observed) at that instant
o A random variable is constituted by the set of numbers observed at a fixed time t_k inside the observation interval
  A stochastic process X(t, s), or X(t) for short, is represented by the time-indexed ensemble (family) of random variables {X(t_k, s)}, or {X(t_k)}
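This view can be sketched numerically (a Python illustration under assumed parameters: a random-phase cosine ensemble on a discrete time grid, chosen only for demonstration). Each row of the array is one sample function X_j(t); fixing an instant t_k and reading down the column yields samples of the random variable X(t_k).

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of sample functions on a discrete grid over [-T, T]
# (the grid and the cosine model are assumptions for illustration only).
T = 1.0
t = np.linspace(-T, T, 201)
n_realizations = 1000
theta = rng.uniform(-np.pi, np.pi, size=(n_realizations, 1))
ensemble = np.cos(2 * np.pi * 5 * t + theta)   # row j holds X_j(t)

# Fixing an instant t_k and reading across the ensemble gives a R.V.:
k = 100
X_tk = ensemble[:, k]        # one number per sample point s_j
print(X_tk.shape)            # -> (1000,)
```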

Stochastic Processes and R.V.s (cont'd)
o Stochastic process: the outcome of a stochastic experiment is mapped into a waveform (a function of time)
o R.V.: the outcome of a stochastic experiment is mapped to a number
  A stochastic process X(t) is an ensemble of time functions which, together with a probability rule, assigns a probability to any meaningful event associated with an observation of one of the sample functions of the process

Important Types of Stochastic Processes
o Stationary and nonstationary processes
o Strictly stationary and weakly stationary (wide-sense stationary) processes
  Ergodic processes (a subset of the weakly stationary processes)

Stationary Processes
o In dealing with stochastic processes encountered in the real world, we often find that the statistical characterization of a process is independent of the time at which observation of the process is initiated
  That is, if such a process is divided into a number of time intervals, the various sections of the process exhibit essentially the same statistical properties
  Such a process is said to be stationary; otherwise, it is said to be nonstationary
o A stationary process arises from a stable phenomenon that has evolved into a steady-state mode of behavior, whereas a nonstationary process arises from an unstable phenomenon

Strictly Stationary Processes
o A process X(t) is strictly stationary if, for every k and every set of sampling times t_1, ..., t_k, the joint distribution of X(t_1 + τ), ..., X(t_k + τ) is the same for every time shift τ

Strictly Stationary Processes (cont'd)
o A stochastic process X(t), initiated at time t = -∞, is strictly stationary if the joint distribution of any set of random variables obtained by observing the process X(t) is invariant with respect to the location of the origin t = 0
o In what sense is such a process still random?
  The finite-dimensional distributions depend on the relative time separation between the random variables, not on their absolute times
  Nevertheless, the stochastic process has the same probabilistic behavior throughout the global time t

Jointly Strictly Stationary Processes
o Two processes X(t) and Y(t) are jointly strictly stationary if all of their joint finite-dimensional distributions are invariant with respect to the location of the origin t = 0

Properties of Strictly Stationary Processes
1. For k = 1, the first-order distribution of X(t) is independent of the time t
2. For k = 2, the second-order distribution of X(t_1) and X(t_2) depends only on the time difference t_2 - t_1

Example (Multiple Spatial Windows)

Weakly (Wide-Sense) Stationary Processes
o A stochastic process X(t) is said to be weakly stationary if its second-order moments satisfy the following two conditions:
  1. The mean of the process X(t) is constant for all time t
  2. The autocorrelation function of the process X(t) depends solely on the difference between any two times at which the process is sampled
  ("auto" in autocorrelation refers to the correlation of the process with itself)

Statistical Parameters of Stochastic Processes
o Mean
o Correlation
o Cross-correlation
o Covariance
o Power spectral density
o Cross-spectral density

Mean of Stochastic Processes
o The mean of a real-valued stochastic process X(t) is the expectation of the random variable obtained by sampling the process at some time t:
  μ_X(t) = E[X(t)] = ∫ x f_X(t)(x) dx
  where f_X(t)(x) is the first-order probability density function of the process X(t), observed at time t
o Note also that the use of the single subscript X in μ_X(t) is intended to emphasize the fact that μ_X(t) is a first-order moment

Mean of Weakly Stationary Processes
o The mean of the process is given by μ_X(t) = E[X(t)]
o For a process X(t) to satisfy the first condition of weak stationarity, the mean must be a constant for all time t (independent of t):
  μ_X(t) = μ_X for all t

Autocorrelation of a Stochastic Process
o The autocorrelation function of the stochastic process X(t) is the expectation of the product of two random variables, X(t_1) and X(t_2), obtained by sampling the process X(t) at times t_1 and t_2:
  R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = ∫∫ x_1 x_2 f_X(t_1),X(t_2)(x_1, x_2) dx_1 dx_2
  where f_X(t_1),X(t_2)(x_1, x_2) is the second-order probability density function of the process X(t), observed at times t_1 and t_2

Autocorrelation of a Weakly Stationary Process
o The autocorrelation function of the process X(t) depends solely on the difference between any two times at which the process is sampled:
  R_XX(t_1, t_2) = R_XX(t_2 - t_1)
o Equivalently, setting t_1 = t and t_2 = t_1 + τ:
  R_XX(τ) = E[X(t) X(t + τ)]
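This lag-only dependence can be checked numerically. The sketch below assumes a random-phase sinusoid as the weakly stationary process (an illustrative choice, not part of the lecture) and estimates the expectation by an ensemble average over many realizations: estimates at the same lag but different absolute start times come out nearly equal.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed illustrative process: X(t) = A cos(2*pi*fc*t + Theta),
# Theta uniform on [-pi, pi]; any weakly stationary process would do.
A, fc = 1.0, 2.0
t = np.arange(0.0, 1.0, 0.01)                   # 100 sampling instants
theta = rng.uniform(-np.pi, np.pi, size=(20_000, 1))
X = A * np.cos(2 * np.pi * fc * t + theta)      # one row per realization

def R_hat(i, j):
    """Ensemble-average estimate of R_XX(t_i, t_j) = E[X(t_i) X(t_j)]."""
    return np.mean(X[:, i] * X[:, j])

# Same lag (10 samples = 0.1 s) starting at different absolute times:
print(R_hat(0, 10), R_hat(30, 40))   # nearly equal -> depends on t_2 - t_1 only
```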

Autocovariance of a Weakly Stationary Process
o The autocovariance function subtracts the mean before correlating:
  C_XX(τ) = E[(X(t) - μ_X)(X(t + τ) - μ_X)] = R_XX(τ) - μ_X²
o For a zero-mean process, the autocovariance and autocorrelation functions coincide

Mean and Autocorrelation of Weakly Stationary Processes
o In summary, the two conditions for the mean and the autocorrelation of a weakly stationary process are:
  μ_X(t) = μ_X for all t
  R_XX(t_1, t_2) = R_XX(t_2 - t_1)

Properties of the Autocorrelation Function
o Mean-square value: R_XX(0) = E[X²(t)]
o Symmetry: R_XX(τ) = R_XX(-τ)
  Thus the autocorrelation may also be defined as R_XX(τ) = E[X(t) X(t - τ)]
o Maximum magnitude at zero shift: |R_XX(τ)| ≤ R_XX(0), or -R_XX(0) ≤ R_XX(τ) ≤ R_XX(0)
o Normalized autocorrelation function: ρ_XX(τ) = R_XX(τ) / R_XX(0), with -1 ≤ ρ_XX(τ) ≤ 1
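These properties can be verified numerically. The sketch below uses an assumed first-order autoregressive noise process (chosen only because its autocorrelation is easy to estimate from a single long record) and checks symmetry, the maximum at zero shift, and the range of the normalized autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed process for illustration: X[n] = a*X[n-1] + W[n], white Gaussian W.
a, n = 0.9, 50_000
w = rng.normal(size=n)
x = np.empty(n)
x[0] = w[0]
for i in range(1, n):
    x[i] = a * x[i - 1] + w[i]

def R(k):
    """Time-average estimate of R_XX at lag k samples (symmetric in k)."""
    k = abs(k)
    return np.mean(x[: n - k] * x[k:]) if k else np.mean(x * x)

print(R(5) == R(-5))          # True: symmetry, R(tau) = R(-tau)
print(abs(R(5)) <= R(0))      # True: maximum magnitude at zero shift
rho = R(5) / R(0)             # normalized autocorrelation, within [-1, 1]
```

For this model the normalized autocorrelation at lag k is approximately a^|k|, so `rho` lands near 0.9^5 ≈ 0.59.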

Autocorrelation Function and Decorrelation Time
o The autocorrelation function R_XX(τ) provides a means of describing the interdependence of two random variables obtained by sampling the stochastic process X(t) at times τ seconds apart
o The more rapidly the stochastic process X(t) changes with time, the more rapidly the autocorrelation function R_XX(τ) decreases from its maximum R_XX(0) as τ increases
o A decorrelation time τ_dec is a time such that, for τ > τ_dec, the magnitude of the autocorrelation function R_XX(τ) remains below some prescribed value
  The one-percent decorrelation time τ_dec of a zero-mean weakly stationary process X(t) is the time taken for the magnitude of the autocorrelation function R_XX(τ) to decrease to, for example, 1% of its maximum value R_XX(0)
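For a concrete instance, assume an exponential autocorrelation R_XX(τ) = R_XX(0)·e^(-τ/τ_c) (a common idealization, used here only as an illustration, with an arbitrary correlation time τ_c). The one-percent decorrelation time then follows in closed form:

```python
import numpy as np

# Assumed exponential autocorrelation: R_XX(tau) = R0 * exp(-tau / tau_c).
# Solve |R_XX(tau_dec)| = 0.01 * R_XX(0)  =>  tau_dec = tau_c * ln(100).
tau_c = 2.0                          # correlation time of the model, seconds
tau_dec = tau_c * np.log(100.0)
print(round(tau_dec, 2))             # -> 9.21: R_XX has fallen to 1% of R_XX(0)
```

A faster-varying process (smaller τ_c) gives a proportionally smaller τ_dec, matching the slide's qualitative statement.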

Example 1: Sinusoidal Wave with Random Phase
o Consider a sinusoidal signal with random phase, defined by
  X(t) = A cos(2πf_c t + Θ)
o The random variable Θ is equally likely to have any value in the interval [-π, π]
o Each value of Θ corresponds to a point in the sample space S of the stochastic process X(t)
o The process X(t) represents a locally generated carrier in the receiver of a communication system, which is used in the demodulation of a received signal
o Find and plot the autocorrelation function
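A sketch of the expected answer: assuming, as in the classic form of this example, X(t) = A cos(2πf_c t + Θ) with Θ uniform over [-π, π], the autocorrelation works out to R_XX(τ) = (A²/2) cos(2πf_c τ). The code below (with illustrative parameter values) checks this against a time average over one long realization.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed model for the example; A, fc, and the record length are illustrative.
A, fc, dt = 1.0, 5.0, 1e-3
t = np.arange(0.0, 200.0, dt)
x = A * np.cos(2 * np.pi * fc * t + rng.uniform(-np.pi, np.pi))

def r_time_avg(lag):
    """Time-average autocorrelation estimate at a lag given in samples."""
    return np.mean(x[: x.size - lag] * x[lag:]) if lag else np.mean(x * x)

tau = 0.02                                       # 20 samples of lag
analytic = (A ** 2 / 2) * np.cos(2 * np.pi * fc * tau)
print(r_time_avg(20), analytic)                  # close for a long record
```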

Homework
o Write a Matlab program showing that, as the number of samples drawn from a uniformly distributed R.V. increases, the normalized histogram of these samples gets closer to the uniform probability mass function:
  normalized histogram at X = x_i = (frequency of X = x_i) / (summation of all frequencies) = (frequency of X = x_i) / (number of samples)
o Groups of 3-5 students are allowed, using the oral grouping
o Post your solution by e-mail to eng.nakib@gmail.com
o You may be asked individually about your solution
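The assignment asks for Matlab; as an illustration of the same idea, here is a Python sketch (the number of levels K and the sample sizes are arbitrary choices, not part of the assignment): the maximum deviation of the normalized histogram from the flat pmf 1/K shrinks as the sample count grows.

```python
import numpy as np

rng = np.random.default_rng(4)

K = 10                                       # number of equally likely values
for n in (100, 10_000, 1_000_000):
    samples = rng.integers(0, K, size=n)     # n draws of a uniform discrete R.V.
    freq = np.bincount(samples, minlength=K)
    hist = freq / n                          # normalized histogram (sums to 1)
    print(n, np.abs(hist - 1.0 / K).max())   # deviation from the uniform pmf
```

The deviation decreases roughly as 1/sqrt(n), which is the convergence the homework asks you to demonstrate.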

Solution

Questions