Random Process Review

Consider a random process X(t), and take k samples. For simplicity, we will set k = 3; however, it should mean any number of samples. Sampling at instants t1, t2, t3 gives the random vector X = (X(t1), X(t2), X(t3)). If we find the joint pdf of X,

    f_X(x1, x2, x3) = f_{X(t1), X(t2), X(t3)}(x1, x2, x3),

then we know everything about the random process X(t).

Stationary random processes

To show X(t) is stationary, we must show time-shift invariance, that is, for any τ,

    f_{X(t1+τ), X(t2+τ), X(t3+τ)}(x1, x2, x3) = f_{X(t1), X(t2), X(t3)}(x1, x2, x3).

Typically we will show that by expanding the joint pdf,

    f_{X(t1), X(t2), X(t3)}(x1, x2, x3)
        = f_{X(t1)}(x1) f_{X(t2)|X(t1)}(x2 | x1) f_{X(t3)|X(t2), X(t1)}(x3 | x2, x1),

and showing that all three terms are time-shift invariant:

    f_{X(t1)}(x1) does not depend on t1,
    f_{X(t2)|X(t1)}(x2 | x1) depends only on t2 - t1,
    f_{X(t3)|X(t2), X(t1)}(x3 | x2, x1) depends only on t3 - t2 and/or t3 - t1.

Recall that the random telegraph signal satisfies these conditions.
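The claim about the random telegraph signal can be illustrated numerically. Below is a minimal numpy sketch, assuming the usual model (switching instants from a Poisson process of rate `rate`, initial value +1 or -1 with equal probability); these modeling details are assumptions, not taken from the notes. It checks only the first-order pdf: the estimated P(X(t) = +1) should be close to 1/2 at every sampling instant, i.e., independent of t.

```python
import numpy as np

rng = np.random.default_rng(0)

def telegraph_sample(t_samples, rate=1.0, t_max=10.0):
    """Draw one path of a random telegraph signal (values +/-1, Poisson
    switching at `rate`) and return it at the requested times."""
    x0 = rng.choice([-1.0, 1.0])                        # equally likely initial value
    n_switch = rng.poisson(rate * t_max)                # number of flips in [0, t_max]
    switch_times = np.sort(rng.uniform(0.0, t_max, n_switch))
    t_samples = np.asarray(t_samples)
    flips = (switch_times[None, :] <= t_samples[:, None]).sum(axis=1)
    return x0 * (-1.0) ** flips                         # sign flips at each switch

# First-order pdf check: P(X(t) = +1) should be about 1/2 for every t.
times = [1.0, 3.0, 7.0]
paths = np.array([telegraph_sample(times) for _ in range(20000)])
print("P(X(t)=+1) at t =", times, ":", (paths > 0).mean(axis=0))
```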

Wide-sense stationary random processes

X(t) is wide-sense stationary if

    Mean: E[X(t)] = m for all t,
    Auto-correlation: R_X(t1, t2) = E[X(t1) X(t2)] = R_X(τ), τ = t1 - t2, for any t1 and t2.

If X(t) is wide-sense stationary, the auto-covariance is

    C_X(t1, t2) = E[(X(t1) - m)(X(t2) - m)] = R_X(τ) - m^2.

Therefore

    C_X(t1, t2) = C_X(τ)   and   C_X(t, t) = C_X(0) = R_X(0) - m^2.

Mean Vector and Covariance Matrix

The mean vector is

    m_X = ( E[X(t1)], E[X(t2)], E[X(t3)] ),

and the covariance matrix of the random vector X is

    Λ_X = [ C_X(t1, t1)  C_X(t1, t2)  C_X(t1, t3) ]
          [ C_X(t2, t1)  C_X(t2, t2)  C_X(t2, t3) ]
          [ C_X(t3, t1)  C_X(t3, t2)  C_X(t3, t3) ].

If X(t) is wide-sense stationary,

    Λ_X = [ C_X(0)        C_X(t1 - t2)  C_X(t1 - t3) ]
          [ C_X(t2 - t1)  C_X(0)        C_X(t2 - t3) ]
          [ C_X(t3 - t1)  C_X(t3 - t2)  C_X(0)       ].
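As a small illustration of this structure, here is a hedged numpy sketch that builds the covariance matrix of (X(t1), X(t2), X(t3)) directly from an autocovariance function; the exponential form C_X(τ) = σ² e^{-a|τ|} is only an illustrative assumption, not something specified in the notes.

```python
import numpy as np

# Covariance matrix of the samples of a WSS process: entry (i, j) is C_X(t_i - t_j).
def C_X(tau, sigma2=1.0, a=0.5):
    # Illustrative (assumed) exponential autocovariance of a WSS process.
    return sigma2 * np.exp(-a * np.abs(tau))

t = np.array([0.0, 1.0, 2.5])                  # sampling instants t1, t2, t3
Lambda_X = C_X(t[:, None] - t[None, :])        # depends only on the differences t_i - t_j
print(Lambda_X)                                # the diagonal is C_X(0)
```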

Gaussian Random Vector

X is a Gaussian random vector if and only if its joint characteristic function is

    Φ_X(ω) = exp( -(1/2) ω^T Λ_X ω + j ω^T m_X ),

where m_X is the mean vector and Λ_X is the covariance matrix. The pdf f_X(x) can be found by the inverse Fourier transform:

    f_X(x) = 1 / ( (2π)^(k/2) |Λ_X|^(1/2) ) exp( -(1/2) (x - m_X)^T Λ_X^(-1) (x - m_X) ).

X1, X2, ..., Xk are said to be jointly Gaussian random variables if and only if X = (X1, X2, ..., Xk) is a Gaussian random vector.

Weighted Sum of Gaussian Random Variables

Let X be a Gaussian random vector and define Y as a linear transformation of X,

    Y = A X + b,

where dim X = m, dim Y = k, A is a k × m matrix, and b is a k-dimensional constant vector. Then Y is also a Gaussian random vector, with

    m_Y = A m_X + b   and   Λ_Y = A Λ_X A^T.

Homework

Suppose X = (X1, X2, X3) is a 3-dim Gaussian random vector. Show that (X1, X2) is a 2-dim Gaussian random vector.

Proof. Since X = (X1, X2, X3) is a 3-dim Gaussian random vector, its joint characteristic function is of the form

    Φ_X(ω) = exp( -(1/2) ω^T Λ_X ω + j ω^T m_X ),   ω = (ω1, ω2, ω3),

where m_X is the 3-dimensional mean vector and Λ_X is the covariance matrix. Let

    Λ_X = [ λ11  λ12  λ13 ]
          [ λ21  λ22  λ23 ]
          [ λ31  λ32  λ33 ]   and   m_X = (m1, m2, m3).

On the other hand, the joint characteristic function of the 2-dim random vector (X1, X2) is

    Φ_{X1, X2}(ω1, ω2) = E[ e^{j ω1 X1} e^{j ω2 X2} ]
                       = ∫∫ e^{j ω1 x1 + j ω2 x2} f_{X1, X2}(x1, x2) dx1 dx2
                       = ∫∫∫ e^{j ω1 x1 + j ω2 x2} f_{X1, X2, X3}(x1, x2, x3) dx1 dx2 dx3
                       = Φ_{X1, X2, X3}(ω1, ω2, 0),

where the third equality holds because the integrand does not involve x3, so integrating over x3 yields the marginal f_{X1, X2}. In the last equality, with ω̃ = (ω1, ω2, 0),

    ω̃^T Λ_X ω̃ = ω'^T Λ' ω',   where ω' = (ω1, ω2) and Λ' = [ λ11  λ12 ]
                                                               [ λ21  λ22 ].

Likewise,

    ω̃^T m_X = ω'^T m',   where m' = (m1, m2).

We have shown

    Φ_{X1, X2}(ω') = exp( -(1/2) ω'^T Λ' ω' + j ω'^T m' ),

and thus we have proved (X1, X2) is a 2-dim Gaussian random vector.
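The weighted-sum rule stated above (Y = AX + b with m_Y = A m_X + b and Λ_Y = A Λ_X A^T) can be checked by simulation. Below is a minimal numpy sketch with arbitrary A, b, m_X, Λ_X (all illustrative values, not from the notes); choosing A = [[1,0,0],[0,1,0]] and b = 0 would reproduce the marginal (X1, X2) discussed in the proof.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative 3-dim Gaussian X (assumed mean vector and covariance matrix).
m_X = np.array([1.0, -2.0, 0.5])
Lambda_X = np.array([[2.0, 0.6, 0.2],
                     [0.6, 1.5, 0.4],
                     [0.2, 0.4, 1.0]])

A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])              # k x m matrix with k = 2, m = 3
b = np.array([4.0, -1.0])

X = rng.multivariate_normal(m_X, Lambda_X, size=200000)   # rows are samples of X
Y = X @ A.T + b                                           # Y = A X + b, sample by sample

print("m_Y theory:", A @ m_X + b, " empirical:", Y.mean(axis=0))
print("Lambda_Y theory:\n", A @ Lambda_X @ A.T)
print("Lambda_Y empirical:\n", np.cov(Y.T))
```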

Homework

Suppose X = (X1, X2, X3) is a 3-dim Gaussian random vector with covariance matrix Λ_X and mean vector m_X. Find the covariance matrix and the mean vector of the 2-dim random vector (X1, X2).

Solution. X1, X2 are jointly Gaussian with covariance matrix

    Λ' = [ λ11  λ12 ]
         [ λ21  λ22 ]

and mean vector m' = (m1, m2). Furthermore, X1 is Gaussian with variance λ11 and mean m1.
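A quick numerical check of this solution, with illustrative (assumed) Λ_X and m_X: the empirical mean and covariance of (X1, X2) should match (m1, m2) and the top-left 2×2 block of Λ_X, and the variance of X1 alone should match λ11.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative 3-dim Gaussian (the numbers are assumptions, not from the notes).
m_X = np.array([0.0, 1.0, -1.0])
Lambda_X = np.array([[1.0, 0.3, 0.1],
                     [0.3, 2.0, 0.5],
                     [0.1, 0.5, 1.5]])

X = rng.multivariate_normal(m_X, Lambda_X, size=200000)

# Marginal of (X1, X2): mean (m1, m2), covariance = top-left 2x2 block of Lambda_X.
print("mean theory:", m_X[:2], " empirical:", X[:, :2].mean(axis=0))
print("cov theory:\n", Lambda_X[:2, :2])
print("cov empirical:\n", np.cov(X[:, :2].T))

# X1 alone: Gaussian with mean m1 and variance lambda_11.
print("var(X1) theory:", Lambda_X[0, 0], " empirical:", X[:, 0].var())
```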

Gaussian Random Process

X(t) is referred to as a Gaussian random process if (X(t1), X(t2), ..., X(tk)) is a Gaussian random vector for any sampling instants t1, t2, ..., tk.

Wide-sense Stationary Gaussian Random Process

X(t) is a wide-sense stationary Gaussian random process if (X(t1), X(t2), X(t3)) is a Gaussian random vector with m_X = (m, m, m) and

    Λ_X = [ C_X(0)        C_X(t1 - t2)  C_X(t1 - t3) ]
          [ C_X(t2 - t1)  C_X(0)        C_X(t2 - t3) ]
          [ C_X(t3 - t1)  C_X(t3 - t2)  C_X(0)       ]

for any sampling instants t1, t2, t3. A wide-sense stationary Gaussian random process is a stationary Gaussian random process.

Recall the multi-dimensional central limit theorem: when the donor process is WSS, the sum process is a stationary Gaussian random process, and the two have the same covariance matrix.

Gaussian Input/Output

The output of an LTI filter is a Gaussian random process when the input is Gaussian.

Wide-Sense Stationary Input/Output

The output of an LTI filter is a WSS random process when the input is WSS.
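Here is a hedged discrete-time sketch of the two input/output statements: white Gaussian noise (a WSS Gaussian process) is passed through an FIR filter standing in for the LTI system; the filter taps h are an arbitrary assumption. The output variance and lag-1 autocovariance are compared with the values predicted for a WSS output, and a one-sigma probability check suggests the output is still Gaussian.

```python
import numpy as np

rng = np.random.default_rng(3)

h = np.array([0.5, 0.3, 0.2, -0.1])           # illustrative FIR (LTI) filter taps
w = rng.standard_normal(500000)               # input: IID N(0, 1), i.e., WSS Gaussian
y = np.convolve(w, h, mode="valid")           # output of the LTI filter

# For unit-variance white input, the WSS output has C_Y(k) = sum_n h[n] h[n+k].
print("var theory:", np.sum(h**2), " empirical:", y.var())
lag = 1
print("C_Y(1) theory:", np.sum(h[:-lag] * h[lag:]),
      " empirical:", np.mean(y[:-lag] * y[lag:]))

# Gaussianity check: about 68.3% of samples should fall within one standard deviation.
sd = y.std()
print("P(|Y| < sd):", np.mean(np.abs(y) < sd), "(about 0.683 for a Gaussian)")
```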

Gaussian Random Vector (Ref. Gallager, Stochastic Processes)

Definition. W = (W1, W2, ..., Wk) is referred to as a normalized IID Gaussian k-rv if the Wj are independent N(0, 1) random variables.

Definition. Z1, Z2, ..., Zk is a set of jointly Gaussian zero-mean random variables if each Zj can be expressed as a linear combination of some finite set of normalized IID Gaussian random variables W1, W2, ..., Wm:

    Zj = Σ_{l=1}^{m} a_{jl} W_l,   j = 1, 2, ..., k.

The above definition can be written in the vector form below.

Definition. Z is a zero-mean Gaussian k-rv if, for some normalized IID Gaussian m-rv W, Z can be expressed as

    Z = A W,

where A = [a_{jl}], 1 ≤ j ≤ k, 1 ≤ l ≤ m, is a given k × m matrix of real numbers.

More generally:

Definition. U1, U2, ..., Uk are jointly Gaussian, or equivalently, U is a Gaussian k-rv, if U = Z + μ, where Z is a zero-mean Gaussian k-rv and μ is a real k-vector.

Theorem. Let Z be a zero-mean Gaussian n-rv with covariance matrix Λ_Z. Then the joint characteristic function of Z is completely determined by Λ_Z:

    Φ_Z(ω) = e^{ -(1/2) ω^T Λ_Z ω }.

Theorem (extension). Let U be a Gaussian n-rv with an arbitrary mean, U = Z + μ for some zero-mean Gaussian n-rv Z; μ is the mean value vector. Then the joint characteristic function of U is completely determined by μ and Λ_Z:

    Φ_U(ω) = e^{ -(1/2) ω^T Λ_Z ω + j ω^T μ }.

U and Z have the same covariance matrix Λ_Z.
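A short numpy sketch of this construction, with an illustrative A and μ (my choices, not Gallager's): Z = AW is built from a normalized IID Gaussian m-rv W, then U = Z + μ; the sample covariance is compared with A A^T, and the empirical characteristic function at one test ω with the formula in the theorem.

```python
import numpy as np

rng = np.random.default_rng(4)

# W is a normalized IID Gaussian m-rv, Z = A W is a zero-mean Gaussian k-rv, U = Z + mu.
A = np.array([[1.0, 0.5, 0.0],
              [0.2, 1.0, 0.7]])               # k x m matrix with k = 2, m = 3 (assumed)
mu = np.array([3.0, -1.0])                    # assumed mean value vector

W = rng.standard_normal((3, 500000))          # columns are samples of W ~ N(0, I)
U = A @ W + mu[:, None]                       # each column is a sample of U = A W + mu

Lambda = A @ A.T                              # covariance of Z (and of U)
print("Lambda theory:\n", Lambda, "\nempirical:\n", np.cov(U))

# Check the characteristic function at one test omega against the theorem.
omega = np.array([0.4, -0.3])
cf_emp = np.mean(np.exp(1j * (omega @ U)))
cf_thy = np.exp(-0.5 * omega @ Lambda @ omega + 1j * omega @ mu)
print("CF empirical:", cf_emp, " theory:", cf_thy)
```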