Random data

Deterministic. Deterministic data are those that can be described by an explicit mathematical relationship.

Deterministic. Example: $x(t) = X \cos\left(\sqrt{k/m}\; t\right)$ (e.g., the free vibration of a mass $m$ on a spring of stiffness $k$).

Nondeterministic. There is no way to predict an exact value at a future instant of time. These data are random in character and must be described in terms of probability statements and statistical averages.

In practical terms. The decision whether physical data are deterministic or random is usually based on the ability to reproduce the data in controlled experiments.

In practical terms. If the experiment can be repeated and produces identical data, within the limits of experimental error, the data are deterministic. If no experiment can be designed that produces identical results when repeated, the data are nondeterministic (random).

Terminology. A single time history representing a random phenomenon is called a sample function or a sample record. The collection of all sample functions that a random phenomenon might have produced is called a random process or stochastic process. A sample record of data may be thought of as one physical realization of a random process.

Classification of random data

Random data classification. Random data are first classified as stationary or nonstationary; stationary data are further classified as ergodic or nonergodic.

Statistical properties. Considering a collection of sample functions: the mean value of the random process at some time $t_i$ can be computed by averaging the instantaneous values of all the sample functions at $t_i$; the correlation between values at two different times is the average of the products of the instantaneous values at times $t_i$ and $t_i + \tau$.

Statistical properties.
$$\mu_x(t_i) = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} x_k(t_i)$$
$$R_{xx}(t_i, t_i + \tau) = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} x_k(t_i)\, x_k(t_i + \tau)$$
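A minimal numerical sketch of these ensemble averages, assuming a hypothetical process (random-phase sinusoids plus white noise) chosen purely for illustration:

```python
import numpy as np

# Sketch: ensemble estimates of mu_x(t_i) and R_xx(t_i, t_i + tau) from
# N sample functions. The process (random-phase sinusoids plus noise)
# is a hypothetical stand-in, not from the text.
rng = np.random.default_rng(0)
N, n_samples, dt = 500, 1000, 0.01
t = np.arange(n_samples) * dt
phases = rng.uniform(0, 2 * np.pi, size=(N, 1))
ensemble = np.sin(2 * np.pi * 2 * t + phases) + 0.3 * rng.standard_normal((N, n_samples))

i, lag = 100, 25                                       # t_i and tau, in samples
mu_ti = ensemble[:, i].mean()                          # mean across the ensemble at t_i
R_ti = (ensemble[:, i] * ensemble[:, i + lag]).mean()  # R_xx(t_i, t_i + tau)
print(mu_ti, R_ti)
```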

Stationary vs nonstationary. When the mean value and the autocorrelation vary as the time $t_i$ varies, the random process is said to be nonstationary. When the mean value and the autocorrelation do not vary as $t_i$ varies, the random process is said to be weakly stationary, or stationary in the wide sense.

Weakly stationary. For a weakly stationary random process, the mean value is constant and the autocorrelation function depends only on the time displacement $\tau$:
$$\mu_x(t_i) = \mu_x \qquad R_{xx}(t_i, t_i + \tau) = R_{xx}(\tau)$$

Weakly stationary. In most cases it is possible to describe the properties of a stationary random process by computing time averages over a specific sample function:
$$\mu_x(k) = \lim_{T \to \infty} \frac{1}{T} \int_0^T x_k(t)\, dt$$
$$R_{xx}(\tau, k) = \lim_{T \to \infty} \frac{1}{T} \int_0^T x_k(t)\, x_k(t + \tau)\, dt$$
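By contrast, a sketch of the corresponding time averages over a single sample record, again on an assumed stationary test signal:

```python
import numpy as np

# Sketch: time-average estimates of mu_x(k) and R_xx(tau, k) from one
# sample record x_k(t). The record (sinusoid plus white noise) is a
# hypothetical stationary example.
rng = np.random.default_rng(1)
dt, n = 0.01, 20000
t = np.arange(n) * dt
x = np.sin(2 * np.pi * 2 * t + rng.uniform(0, 2 * np.pi)) + 0.3 * rng.standard_normal(n)

mu_k = x.mean()                        # time-average mean of the record
lag = 25                               # tau = 25 samples
R_k = np.mean(x[:-lag] * x[lag:])      # time-average autocorrelation at tau
print(mu_k, R_k)
```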

Ergodic random data. If these time-averaged mean values and autocorrelation functions do not differ from one sample function to another, the random process is said to be ergodic:
$$\mu_x(k) = \mu_x \qquad R_{xx}(\tau, k) = R_{xx}(\tau)$$
Only stationary random processes can be ergodic.

Ergodic random data. Ergodic random processes are an important class of random processes. All properties of an ergodic random process can be determined from a single sample function. Fortunately, in practice, random data representing stationary physical phenomena are generally ergodic.

Nonstationary random data. The properties of a nonstationary random process are generally time-varying functions. In practice it is often not feasible to obtain a sufficient number of sample records to permit an accurate measurement of the properties of the ensemble. This has tended to impede the development of practical techniques for measuring and analyzing nonstationary random data.

Stationary sample records. Data in the form of individual sample records may also be described as stationary or nonstationary, in terms of the short-time averages
$$\mu_x(t_i, k) = \frac{1}{T} \int_{t_i}^{t_i + T} x_k(t)\, dt$$
$$R_{xx}(t_i, t_i + \tau, k) = \frac{1}{T} \int_{t_i}^{t_i + T} x_k(t)\, x_k(t + \tau)\, dt$$

Stationary sample records. A single time series is said to be stationary if the properties computed over short time intervals do not vary significantly from one interval to the next. If the sample properties vary significantly as the starting time $t_i$ varies, the individual sample record is said to be nonstationary.

Stationary sample records. A sample record obtained from an ergodic random process will be stationary. Sample records from a nonstationary random process will be nonstationary. Hence, if an ergodicity assumption is justified, verifying the stationarity of a single sample record justifies an assumption of stationarity and ergodicity for the random process.
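A rough sketch of such a single-record stationarity check: split the record into short intervals and compare interval averages. The test signal and the way the spread is summarized are illustrative choices, not a formal statistical test.

```python
import numpy as np

# Sketch: split one record into short intervals and compare interval
# means and mean square values; large variation across intervals
# suggests nonstationarity.
rng = np.random.default_rng(2)
x = 0.5 * rng.standard_normal(20000)   # stationary test record
# Nonstationary alternative: multiply by np.linspace(1, 4, 20000)

segments = np.array_split(x, 20)       # 20 consecutive short intervals
means = np.array([s.mean() for s in segments])
msqs = np.array([np.mean(s**2) for s in segments])
print("spread of interval means:", means.std())
print("relative spread of interval mean squares:", msqs.std() / msqs.mean())
```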

Analysis of random data

Analysis of random data. Since no explicit mathematical equation can be written, statistical procedures must be used to define the descriptive properties of the data.

Basic descriptive properties: mean and mean square values, probability density functions, autocorrelation functions, power spectral density functions.

Joint statistical properties: joint probability density functions, cross-correlation functions, cross-spectral density functions, frequency response functions, coherence functions.

Probability density function. A function that describes the relative likelihood for a random variable to take on a given value. The probability of the random variable falling within a particular range of values is given by the integral of the variable's density over that range.

Probability density function.
$$\Pr[a \le X \le b] = \int_a^b f_X(x)\, dx$$
$$\mathrm{CDF}_X(x) = P(X \le x) = \int_{-\infty}^{x} f_X(u)\, du$$

[Figure: bell-shaped probability density function plotted over x from -5 to 5.]
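A small sketch of these two definitions for a standard Gaussian density like the one pictured above, with scipy assumed available:

```python
import numpy as np
from scipy import stats, integrate

# Sketch: Pr[a <= X <= b] as the integral of the pdf over [a, b],
# checked against the CDF, for a standard Gaussian.
a, b = -1.0, 2.0
prob_int, _ = integrate.quad(stats.norm.pdf, a, b)   # integral of f_X over [a, b]
prob_cdf = stats.norm.cdf(b) - stats.norm.cdf(a)     # CDF(b) - CDF(a)
print(prob_int, prob_cdf)                            # both ~0.8186
```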

Time domain analysis

Expected value.
$$E[X] = \sum_{i=1}^{\infty} x_i\, p_i \qquad E[X] = \int_{-\infty}^{\infty} x f(x)\, dx$$
The arithmetic mean is an estimator of the expected value of a random process:
$$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i \qquad \operatorname{Var}(\bar{x}) = \frac{\sigma^2}{n}$$
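A quick sketch of the mean as an estimator, checking $\operatorname{Var}(\bar{x}) = \sigma^2/n$ by repeated sampling from an assumed exponential distribution:

```python
import numpy as np

# Sketch: the arithmetic mean estimates E[X], and its variance over
# repeated experiments is sigma^2 / n. Exponential(scale=2): E[X] = 2,
# Var(X) = 4.
rng = np.random.default_rng(3)
n, trials = 100, 50000
samples = rng.exponential(scale=2.0, size=(trials, n))
xbar = samples.mean(axis=1)
print(xbar.mean())            # ~2.0
print(xbar.var(), 4.0 / n)    # the two should agree
```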

Expected value.
$$E[X + c] = E[X] + c \qquad E[X + Y] = E[X] + E[Y] \qquad E[aX] = a\, E[X]$$
$$E[XY] = \iint xy\, j(x, y)\, dx\, dy \qquad \operatorname{Cov}(X, Y) = E[XY] - E[X]E[Y]$$
If $X$ and $Y$ are independent, the joint density factorizes as $j(x, y) = f(x)\, g(y)$, so
$$E[XY] = \iint xy\, f(x)\, g(y)\, dy\, dx = \left[ \int x f(x)\, dx \right] \left[ \int y g(y)\, dy \right] = E[X]\, E[Y]$$

Variance.
$$\operatorname{Var}(X) = E\big[(X - \mu)^2\big] = E\big[X^2 - 2X E[X] + (E[X])^2\big] = E[X^2] - 2E[X]E[X] + (E[X])^2 = E[X^2] - (E[X])^2$$
Two sample estimators:
$$S_n^2(x) = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})^2 \qquad s_n^2(x) = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})^2$$

Variance. For Gaussian samples,
$$E[S_n^2] = \frac{n-1}{n} \sigma^2 \qquad \operatorname{Var}(S_n^2) = \frac{2(n-1)}{n^2} \sigma^4 \qquad \text{(biased, smaller variance)}$$
$$E[s_n^2] = \sigma^2 \qquad \operatorname{Var}(s_n^2) = \frac{2 \sigma^4}{n-1} \qquad \text{(unbiased, greater variance)}$$

[Figure: variance of the biased and unbiased variance estimators as a function of the sample size n, for σ = 2.]
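A Monte Carlo sketch of the bias/variance trade-off between the two estimators, assuming Gaussian samples with σ = 2:

```python
import numpy as np

# Sketch: compare the biased (1/n) and unbiased (1/(n-1)) variance
# estimators on many small Gaussian samples.
rng = np.random.default_rng(4)
sigma, n, trials = 2.0, 10, 100000
samples = sigma * rng.standard_normal((trials, n))
S2 = samples.var(axis=1, ddof=0)   # biased estimator S_n^2
s2 = samples.var(axis=1, ddof=1)   # unbiased estimator s_n^2
print("E[S_n^2] ~", S2.mean(), " theory:", (n - 1) / n * sigma**2)
print("E[s_n^2] ~", s2.mean(), " theory:", sigma**2)
print("Var(S_n^2) ~", S2.var(), "< Var(s_n^2) ~", s2.var())
```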

Markov's inequality. For a nonnegative random variable $X$,
$$\forall a > 0 \quad P(X \ge a) \le \frac{1}{a} E[X]$$
More generally, for any $g : \mathbb{R} \to \mathbb{R}_{\ge 0}$,
$$P(g(X) \ge a) \le \frac{1}{a} E[g(X)]$$

Chebyshev's inequality.
$$\forall a > 0 \quad P(|X - \mu| \ge a) \le \frac{\sigma^2}{a^2} \qquad P(|X - \mu| \ge a\sigma) \le \frac{1}{a^2}$$
No more than $1/a^2$ of the distribution's values can be more than $a$ standard deviations away from the mean.
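An empirical sketch of Chebyshev's bound on an assumed exponential distribution:

```python
import numpy as np

# Sketch: check P(|X - mu| >= a*sigma) <= 1/a^2 empirically.
rng = np.random.default_rng(5)
x = rng.exponential(scale=1.0, size=1_000_000)   # mu = sigma = 1
mu, sigma = x.mean(), x.std()
for a in (1.5, 2.0, 3.0):
    frac = np.mean(np.abs(x - mu) >= a * sigma)
    print(f"a = {a}: observed {frac:.4f} <= bound {1 / a**2:.4f}")
```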

Autocorrelation.
$$C_{xx}(\tau) = E[(X_t - \mu)(X_{t+\tau} - \mu)] \qquad R_{xx}(\tau) = \frac{E[(X_t - \mu)(X_{t+\tau} - \mu)]}{\sigma^2}$$
Often the autocovariance is called autocorrelation even when this normalization has not been performed, and vice versa.
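A sketch estimating both quantities from a single record of correlated noise (a moving-average filter is an arbitrary way to introduce correlation):

```python
import numpy as np

# Sketch: autocovariance C_xx(tau) and normalized autocorrelation
# R_xx(tau) = C_xx(tau) / sigma^2, estimated from one record.
rng = np.random.default_rng(6)
n = 50000
x = np.convolve(rng.standard_normal(n), np.ones(10) / 10, mode="same")
xc = x - x.mean()
var = xc.var()
for lag in (0, 5, 10, 20):
    C = np.mean(xc[: n - lag] * xc[lag:])   # autocovariance at this lag
    print(f"tau = {lag}: C_xx = {C:.4f}, R_xx = {C / var:.4f}")
```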

Frequency domain analysis

Fourier transform. The Fourier transform is given by
$$X(f) = \lim_{T \to \infty} \int_{-T/2}^{T/2} e^{-i 2\pi f t}\, x(t)\, dt$$
and the inverse transform is given by
$$x(t) = \lim_{F \to \infty} \int_{-F/2}^{F/2} e^{i 2\pi f t}\, X(f)\, df$$
The Fourier transform of a random signal is itself a random variable.

Power spectral density. The average value of the squared magnitude of the Fourier transform:
$$S(f) = \langle |X(f)|^2 \rangle = \langle X(f)\, X^*(f) \rangle = \lim_{T \to \infty} \frac{1}{T} \left\langle \int_{-T/2}^{T/2} e^{-i 2\pi f t}\, x(t)\, dt \int_{-T/2}^{T/2} e^{i 2\pi f t'}\, x(t')\, dt' \right\rangle$$
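A sketch of a PSD estimate using Welch's method (averaged periodograms) on white noise, where the one-sided density should be flat at σ²/(fs/2); scipy is assumed available:

```python
import numpy as np
from scipy import signal

# Sketch: Welch PSD estimate of white noise. For variance sigma^2
# sampled at fs, the one-sided PSD is flat at sigma^2 / (fs / 2).
rng = np.random.default_rng(7)
fs, sigma = 1000.0, 0.5
x = sigma * rng.standard_normal(200000)
f, Sxx = signal.welch(x, fs=fs, nperseg=4096)
print(Sxx.mean(), sigma**2 / (fs / 2))   # both ~5e-4
```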

Wiener-Khinchin theorem.
$$\int_{-\infty}^{\infty} S(f)\, e^{i 2\pi f \tau}\, df = \int_{-\infty}^{\infty} \langle X(f)\, X^*(f) \rangle\, e^{i 2\pi f \tau}\, df$$
$$= \lim_{T \to \infty} \frac{1}{T} \int_{-\infty}^{\infty} \left\langle \int_{-T/2}^{T/2} e^{-i 2\pi f t}\, x(t)\, dt \int_{-T/2}^{T/2} e^{i 2\pi f t'}\, x(t')\, dt' \right\rangle e^{i 2\pi f \tau}\, df$$
$$= \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} \int_{-T/2}^{T/2} \left[ \int_{-\infty}^{\infty} e^{-i 2\pi f (t - t' - \tau)}\, df \right] \langle x(t)\, x(t') \rangle\, dt\, dt'$$
$$= \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} \int_{-T/2}^{T/2} \delta(t - t' - \tau)\, \langle x(t)\, x(t') \rangle\, dt\, dt'$$
$$= \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} \langle x(t)\, x(t - \tau) \rangle\, dt = \langle x(t)\, x(t - \tau) \rangle$$
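A numerical sketch of the theorem in its discrete, circular form: the inverse FFT of |X(f)|² equals the circular autocorrelation computed directly in the time domain.

```python
import numpy as np

# Sketch: discrete Wiener-Khinchin check. ifft(|fft(x)|^2) reproduces
# the circular autocorrelation sum_t x[t] x[(t+k) mod n].
rng = np.random.default_rng(8)
n = 1024
x = np.convolve(rng.standard_normal(n), np.ones(8) / 8, mode="same")

acf_fft = np.fft.ifft(np.abs(np.fft.fft(x)) ** 2).real / n             # via the spectrum
acf_dir = np.array([np.dot(x, np.roll(x, -k)) for k in range(n)]) / n  # direct
print(np.allclose(acf_fft, acf_dir))   # True
```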

Parseval's theorem.
$$\langle x(t)\, x(t - \tau) \rangle = \int_{-\infty}^{\infty} S(f)\, e^{i 2\pi f \tau}\, df \;\Rightarrow\; \langle x^2(t) \rangle = \int_{-\infty}^{\infty} S(f)\, df$$
The average value of the square of the signal (its variance, if the signal has zero mean) is equal to the integral of the power spectral density.
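A sketch of the Parseval check on colored noise (an AR(1) filter is an arbitrary way to get a non-flat spectrum): the integrated PSD estimate should reproduce the mean square value.

```python
import numpy as np
from scipy import signal

# Sketch: integral of the one-sided PSD vs the mean square <x^2>.
rng = np.random.default_rng(9)
fs = 1000.0
x = signal.lfilter([1.0], [1.0, -0.9], rng.standard_normal(500000))  # colored noise
f, Sxx = signal.welch(x, fs=fs, nperseg=4096)
power_from_psd = np.sum(Sxx) * (f[1] - f[0])   # numerical integral over f
print(power_from_psd, np.mean(x**2))           # should be close
```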

Shot noise. Generated by the discrete arrival of electrons in a wire, like rain on a roof. Interactions can be ignored and the arrivals are independent: a Poisson process.

Shot noise.
$$\langle I \rangle = \frac{qN}{T} \qquad I(t) = q \sum_{n=1}^{N} \delta(t - t_n)$$
$$I(f) = \lim_{T \to \infty} \int_{-T/2}^{T/2} e^{-i 2\pi f t}\, q \sum_{n=1}^{N} \delta(t - t_n)\, dt = q \sum_{n=1}^{N} e^{-i 2\pi f t_n}$$
$$S_I(f) = \langle I(f)\, I^*(f) \rangle = \lim_{T \to \infty} \frac{q^2}{T} \left\langle \sum_{n=1}^{N} e^{-i 2\pi f t_n} \sum_{m=1}^{N} e^{i 2\pi f t_m} \right\rangle = \lim_{T \to \infty} \frac{q^2 N}{T} = q \langle I \rangle$$
Since the arrival times are independent, only the $n = m$ terms survive the average, leaving $N$ terms. Over a measurement bandwidth $\Delta f$ this gives the Schottky formula
$$\langle I^2_{\text{noise}} \rangle = 2 q \langle I \rangle\, \Delta f$$
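A Monte Carlo sketch of the Schottky formula, with arbitrary rate and sampling parameters: Poisson arrivals are binned into a current record whose one-sided PSD should be flat at 2q⟨I⟩.

```python
import numpy as np
from scipy import signal

# Sketch: shot noise from Poisson arrivals. The one-sided PSD of the
# current fluctuations should equal 2*q*<I> (Schottky).
rng = np.random.default_rng(10)
q = 1.602e-19                 # electron charge [C]
rate = 1e12                   # mean arrivals per second, <I> = q * rate
fs = 1e9                      # sampling rate [Hz]
n = 2**20
counts = rng.poisson(rate / fs, size=n)   # electrons per time bin
I = q * counts * fs                       # current record I(t) [A]
f, S = signal.welch(I - I.mean(), fs=fs, nperseg=8192)
print(S[10:].mean(), 2 * q * (q * rate))  # measured vs 2 q <I>
```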

Johnson noise. Relaxation of thermal fluctuations in a resistor: small voltage fluctuations associated with the thermal motion of electrons. Over a bandwidth $\Delta f$,
$$\langle V^2_{\text{noise}} \rangle = 4 k T R\, \Delta f$$
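A one-line numeric sketch, with illustrative values (room temperature, a 1 MΩ resistor, 10 kHz bandwidth):

```python
import numpy as np

# Sketch: RMS Johnson noise voltage, <V^2> = 4 k T R * df.
k = 1.380649e-23              # Boltzmann constant [J/K]
T, R, df = 300.0, 1e6, 1e4    # temperature [K], resistance [ohm], bandwidth [Hz]
print(np.sqrt(4 * k * T * R * df))   # ~1.3e-5 V, about 13 microvolts
```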