VECTOR RANDOM PROCESSES


ECE5530: Multivariable Control Systems II

2.1: Scalar random variables

We will be working with systems whose state is affected by noise and whose output measurements are also corrupted by noise.

[Diagram: disturbance $w(t)$ enters the system, which has state $x(t)$; a sensor with noise $v(t)$ produces the measurement $y(t)$.]

By definition, noise is not deterministic: it is random in some sense. So, to discuss the impact of noise on the system dynamics, we must review the concept of a random variable (RV) $X$.

- We cannot predict exactly what we will get each time we measure or sample the random variable, but
- We can characterize the probability of each sample value by the probability density function (pdf).

Probability density functions

We denote the probability density function (pdf) of RV $X$ as $f_X(x)$. $f_X(x_0)\,dx$ is the probability that the random variable $X$ is in the interval $[x_0, x_0 + dx]$.

[Figure: a pdf $f_X(x)$ with the interval from $x_0$ to $x_0 + dx$ marked.]

Properties that are true of all pdfs:

1. $f_X(x) \geq 0 \;\forall\; x$.
2. $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$.
3. $\Pr(X \leq x_0) = \int_{-\infty}^{x_0} f_X(x)\,dx \triangleq F_X(x_0)$, which is the RV's cumulative distribution function (cdf).¹

Problem: Apart from simple examples, it is often difficult to determine $f_X(x)$ accurately.

- Use approximations to capture the key behavior.
- Need to define key characteristics of $f_X(x)$.

EXPECTATION: Describes the expected outcome of a random trial.

$\bar{x} = E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$.

- Expectation is a linear operator (very important for working with it). So, for example, the first moment about the mean: $E[X - \bar{x}] = 0$.

STATISTICAL AVERAGE: Different from expectation.

- Consider a (discrete) RV $X$ that can assume $n$ values $x_1, x_2, \ldots, x_n$. Define the average by making many measurements, $N \to \infty$. Then, if $m_i$ is the number of times the measured value is $x_i$,

$\bar{x} = \dfrac{1}{N}(m_1 x_1 + m_2 x_2 + \cdots + m_n x_n) = \dfrac{m_1}{N}x_1 + \dfrac{m_2}{N}x_2 + \cdots + \dfrac{m_n}{N}x_n$.

- In the limit, $m_1/N \to \Pr(X = x_1)$ and so forth (assuming ergodicity), so

$\bar{x} = \sum_{i=1}^{n} x_i \Pr(X = x_i)$.

¹ Some call this a probability distribution function, with acronym PDF (uppercase), as different from a probability density function, which has acronym pdf (lowercase).

- So, statistical means can converge to expectation in the limit. (A similar property can be shown for continuous RVs... but it is harder to do.)

VARIANCE: Second moment about the mean.

$\operatorname{var}(X) = E[(X - \bar{x})^2] = \int_{-\infty}^{\infty} (x - \bar{x})^2 f_X(x)\,dx$

or $\operatorname{var}(X) = E[X^2] - (E[X])^2$; that is, the mean-square minus the squared mean.

STANDARD DEVIATION: Measure of dispersion about the mean of the samples of $X$: $\sigma_X = \sqrt{\operatorname{var}(X)}$.

- The expectation and variance capture key features of the actual pdf. Higher-order moments are available, but we won't need them!

KEY POINT FOR UNDERSTANDING VARIANCE: Chebyshev's inequality

Chebyshev's inequality states (for positive $\varepsilon$)

$\Pr(|X - \bar{x}| \geq \varepsilon) \leq \dfrac{\sigma_X^2}{\varepsilon^2}$,

which implies that probability is concentrated around the mean. It may be proven as follows:

$\Pr(|X - \bar{x}| \geq \varepsilon) = \int_{-\infty}^{\bar{x}-\varepsilon} f_X(x)\,dx + \int_{\bar{x}+\varepsilon}^{\infty} f_X(x)\,dx$.

For the two regions of integration, $|x - \bar{x}|/\varepsilon \geq 1$, or $(x - \bar{x})^2/\varepsilon^2 \geq 1$. So,

$\Pr(|X - \bar{x}| \geq \varepsilon) \leq \int_{-\infty}^{\bar{x}-\varepsilon} \dfrac{(x - \bar{x})^2}{\varepsilon^2} f_X(x)\,dx + \int_{\bar{x}+\varepsilon}^{\infty} \dfrac{(x - \bar{x})^2}{\varepsilon^2} f_X(x)\,dx$.

Since $f_X(x)$ is positive, we also have

$\Pr(|X - \bar{x}| \geq \varepsilon) \leq \int_{-\infty}^{\infty} \dfrac{(x - \bar{x})^2}{\varepsilon^2} f_X(x)\,dx = \dfrac{\sigma_X^2}{\varepsilon^2}$.
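As a quick numerical illustration (a minimal sketch, assuming $X \sim N(0, 1)$ so that $\sigma_X = 1$), we can compare the Chebyshev bound with the exact Gaussian tail probability using only base MATLAB:

% Compare the Chebyshev bound sigma^2/eps^2 with the exact two-sided
% Gaussian tail probability Pr(|X - xbar| >= eps), assuming X ~ N(0,1).
sigmaX = 1;
epsv = [1, 1.5, 2, 2.5, 3];
chebBound = sigmaX^2 ./ epsv.^2;           % Chebyshev upper bound
exactTail = erfc(epsv/(sqrt(2)*sigmaX));   % exact Gaussian tail
disp([epsv; chebBound; exactTail]);        % bound is always >= exact tail

The bound holds for any distribution with this variance, so it is loose for the Gaussian: at $\varepsilon = 2\sigma_X$ it gives 0.25 versus the true value of about 0.046.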

This inequality shows that probability is clustered around the mean, and that the variance is an indication of the dispersion of the pdf.

- That is, variance (later on, covariance too) informs us of how uncertain we are about the value of a random variable.
- Low variance means that we are very certain of its value; high variance means that we are very uncertain of its value.
- The mean and variance give us an estimate of the value of a random variable, and how certain we are of that estimate.

The most important distribution for this course

The Gaussian (normal) distribution is of key importance to us. (We will explain why this is true later; see main point #7.) Its pdf is defined as:

$f_X(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma_X} \exp\left( -\dfrac{(x - \bar{x})^2}{2\sigma_X^2} \right)$.

- Symmetric about $\bar{x}$.
- Peak proportional to $1/\sigma_X$, located at $\bar{x}$.
- Notation: $X \sim N(\bar{x}, \sigma_X^2)$.
- The probability that $X$ is within $\sigma_X$ of $\bar{x}$ is 68%; the probability that $X$ is within $2\sigma_X$ of $\bar{x}$ is 95%; the probability that $X$ is within $3\sigma_X$ of $\bar{x}$ is 99.7%. A $\pm 3\sigma_X$ range almost certainly covers observed samples.
- Narrow distribution: sharp peak, high confidence in predicting $X$. Wide distribution: poor knowledge of what to expect for $X$.
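These interval probabilities can be checked directly, since $\Pr(|X - \bar{x}| \leq k\sigma_X) = \operatorname{erf}(k/\sqrt{2})$ for a Gaussian (a minimal sketch in base MATLAB; the plotting range of $\pm 4\sigma_X$ is an arbitrary choice):

% Evaluate the Gaussian pdf and the probability of being within
% k standard deviations of the mean, using only base MATLAB.
xbar = 0; sigmaX = 1;
x = linspace(xbar - 4*sigmaX, xbar + 4*sigmaX, 401);
fX = exp(-(x - xbar).^2/(2*sigmaX^2))/(sqrt(2*pi)*sigmaX);
plot(x, fX); xlabel('x'); ylabel('f_X(x)');
for k = 1:3
    fprintf('Pr(|X - xbar| <= %d sigma) = %.4f\n', k, erf(k/sqrt(2)));
end

This prints 0.6827, 0.9545, and 0.9973, matching the 68%/95%/99.7% rule above.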

2.2: Vector random variables

With very little change in the preceding, we can also handle vectors of random variables.

$X = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{bmatrix}$; let $x_0 = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$.

- $X$ is described by the (scalar function) joint pdf $f_X(x)$ of the vector $X$.
- $f_X(x_0)$ means $f_X(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n)$. That is, $f_X(x_0)\,dx_1\,dx_2\cdots dx_n$ is the probability that $X$ is between $x_0$ and $x_0 + dx$.

Properties of the joint pdf $f_X(x)$:

1. $f_X(x) \geq 0 \;\forall\; x$. Same as before.
2. $\int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} f_X(x)\,dx_1\,dx_2\cdots dx_n = 1$. Basically the same.
3. $\bar{x} = E[X] = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} x\,f_X(x)\,dx_1\,dx_2\cdots dx_n$. Basically the same.
4. Correlation matrix: Different.

$R_X = E[XX^T]$ (outer product) $= \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} xx^T f_X(x)\,dx_1\,dx_2\cdots dx_n$.

5. Covariance matrix: Different. Define $\tilde{X} = X - \bar{x}$. Then,

$\Sigma_X = R_{\tilde{X}} = E[(X - \bar{x})(X - \bar{x})^T] = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} (x - \bar{x})(x - \bar{x})^T f_X(x)\,dx_1\,dx_2\cdots dx_n$.

- $\Sigma_X$ is symmetric and positive semi-definite (psd). This means $y^T \Sigma_X y \geq 0 \;\forall\; y$.

PROOF: For all $y \neq 0$,

$0 \leq E[(y^T(X - \bar{x}))^2] = y^T E[(X - \bar{x})(X - \bar{x})^T]\,y = y^T \Sigma_X y$.

- Notice that correlation and covariance are the same for zero-mean random vectors.
- The covariance entries have specific meaning:
  - $(\Sigma_X)_{ii} = \sigma_{X_i}^2$;
  - $(\Sigma_X)_{ij} = \rho_{ij}\,\sigma_{X_i}\sigma_{X_j} = (\Sigma_X)_{ji}$.
- The diagonal entries are the variances of each vector component; the correlation coefficient $\rho_{ij}$ is a measure of the linear dependence between $X_i$ and $X_j$: $|\rho_{ij}| \leq 1$.

The most important multivariable distribution for this course

The multivariable Gaussian is of key importance for Kalman filtering.

[Figure: a two-dimensional Gaussian pdf, centered at $(\bar{x}_1, \bar{x}_2)$.]

$f_X(x) = \dfrac{1}{(2\pi)^{n/2}\,|\Sigma_X|^{1/2}} \exp\left( -\tfrac{1}{2}(x - \bar{x})^T \Sigma_X^{-1} (x - \bar{x}) \right)$.

- $|\Sigma_X| = \det(\Sigma_X)$; $\Sigma_X^{-1}$ requires positive-definite $\Sigma_X$.
- Notation: $X \sim N(\bar{x}, \Sigma_X)$.
- Contours of constant $f_X(x)$ are hyper-ellipsoids, centered at $\bar{x}$, with directions governed by $\Sigma_X$. The principal axes decouple $\Sigma_X$ (they are its eigenvectors).

Two-dimensional zero-mean case: (Let $\sigma_1 = \sigma_{X_1}$ and $\sigma_2 = \sigma_{X_2}$.)

$\Sigma_X = \begin{bmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{bmatrix}$, $\qquad |\Sigma_X| = \sigma_1^2\sigma_2^2(1 - \rho^2)$.

$f_{X_1,X_2}(x_1, x_2) = \dfrac{1}{2\pi\sigma_1\sigma_2\sqrt{1 - \rho^2}} \exp\left( -\dfrac{1}{2(1 - \rho^2)}\left[ \dfrac{x_1^2}{\sigma_1^2} - \dfrac{2\rho\,x_1 x_2}{\sigma_1\sigma_2} + \dfrac{x_2^2}{\sigma_2^2} \right] \right)$.
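The elliptical contours are easy to visualize numerically (a minimal sketch; the values $\sigma_1 = 1$, $\sigma_2 = 0.5$, $\rho = 0.8$ are arbitrary choices for illustration):

% Contours of a two-dimensional zero-mean Gaussian pdf; the ellipse
% axes are governed by the eigenvectors of the covariance SigmaX.
s1 = 1; s2 = 0.5; rho = 0.8;
SigmaX = [s1^2, rho*s1*s2; rho*s1*s2, s2^2];
[x1, x2] = meshgrid(linspace(-3, 3, 101), linspace(-2, 2, 101));
X = [x1(:), x2(:)];
q = sum((X/SigmaX).*X, 2);                % quadratic form x'*inv(SigmaX)*x
fX = exp(-q/2)/(2*pi*sqrt(det(SigmaX)));
contour(x1, x2, reshape(fX, size(x1))); axis equal;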

2.3: Uncorrelated versus independent

INDEPENDENCE: If jointly-distributed RVs are independent, then

$f_X(x_1, x_2, \ldots, x_n) = f_{X_1}(x_1) f_{X_2}(x_2) \cdots f_{X_n}(x_n)$.

- The joint distribution can be split up into the product of the individual distributions for each RV.
- Equivalent condition: For all functions $f(\cdot)$ and $g(\cdot)$,

$E[f(X_1)g(X_2)] = E[f(X_1)]\,E[g(X_2)]$.

- The particular value of the random variable $X_1$ has no impact on what value we would obtain for the random variable $X_2$.

UNCORRELATED: Two jointly-distributed RVs $X_1$ and $X_2$ are uncorrelated if their second moments are finite and

$\operatorname{cov}(X_1, X_2) = E[(X_1 - \bar{x}_1)(X_2 - \bar{x}_2)] = 0$,

which implies $\rho_{12} = 0$.

- Uncorrelated means that there is no linear relationship between $X_1$ and $X_2$.

MAIN POINT #1: If jointly-distributed RVs $X_1$ and $X_2$ are independent, then they are uncorrelated. Independence implies uncorrelatedness. To see this, notice that independence means $E[X_1 X_2] = E[X_1]E[X_2]$. Therefore,

$\operatorname{cov}(X_1, X_2) = E[X_1 X_2] - E[X_1]E[X_2] = 0$,

and the RVs are therefore uncorrelated.

Does uncorrelatedness imply independence? Consider the example $Y_1 = \sin(X)$ and $Y_2 = \cos(X)$ with $X$ uniformly distributed on $[0, 2\pi]$.

- We can show that $E[Y_1] = E[Y_2] = E[Y_1 Y_2] = 0$. So, $\operatorname{cov}(Y_1, Y_2) = 0$ and $Y_1$ and $Y_2$ are uncorrelated.
- But, $Y_1^2 + Y_2^2 = 1$, and therefore $Y_1$ and $Y_2$ are clearly not independent.

Note: $\operatorname{cov}(X_1, X_2) = 0$ (uncorrelated) implies that

$E[X_1 X_2] = E[X_1]E[X_2]$,

but independence requires

$E[f(X_1)g(X_2)] = E[f(X_1)]E[g(X_2)]$

for all $f(\cdot)$ and $g(\cdot)$. Therefore, independence is much stronger than uncorrelatedness.

COROLLARY: Consider a RV $X$ with uncorrelated components. The covariance matrix $\Sigma_X = E[(X - \bar{x})(X - \bar{x})^T]$ is diagonal.

PROOF: Notice that:

- The diagonal elements are $(\Sigma_X)_{ii} = E[(X_i - \bar{x}_i)^2] = \sigma_i^2$.
- The off-diagonal elements are

$(\Sigma_X)_{ij} = E[(X_i - \bar{x}_i)(X_j - \bar{x}_j)] = E[X_i X_j - X_i\bar{x}_j - \bar{x}_i X_j + \bar{x}_i\bar{x}_j] = E[X_i]E[X_j] - E[X_i]E[X_j] - E[X_i]E[X_j] + E[X_i]E[X_j] = 0$.

MAIN POINT #2: If jointly normally distributed RVs are uncorrelated, then they are independent. (This is a special case.)
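The sine/cosine example is easy to confirm by Monte Carlo (a quick sketch; the sample size of $10^6$ is arbitrary):

% Y1 = sin(X), Y2 = cos(X) with X ~ uniform on [0, 2*pi]: the sample
% covariance is near zero, yet Y1^2 + Y2^2 = 1 exactly (dependence).
N = 1e6; X = 2*pi*rand(1, N);
Y1 = sin(X); Y2 = cos(X);
sampleCov = mean(Y1.*Y2) - mean(Y1)*mean(Y2)   % approximately 0
constraintErr = max(abs(Y1.^2 + Y2.^2 - 1))    % 0 to rounding error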

2.4: Functions of random variables

MAIN POINT #3: Suppose we are given two random variables $Y$ and $X$ with $Y = g(X)$. Also assume that $g^{-1}$ exists, and that $g$ and $g^{-1}$ are continuously differentiable. Then,

$f_Y(y) = f_X(g^{-1}(y))\,\left\| \dfrac{\partial g^{-1}(y)}{\partial y} \right\|$,

where $\|\cdot\|$ means to take the absolute value of the determinant.

So what? Say we know the pdf of $X_1$ and $X_2$ and

$Y_1 = X_1 + X_2$, $\quad Y_2 = X_2 \quad\Rightarrow\quad Y = AX$;

we can now form the pdf of $Y$. Handy!

EXAMPLE: $Y = kX$ and $X \sim N(0, \sigma_X^2)$.

$X = \frac{1}{k}Y$, and so $g^{-1}(y) = \frac{1}{k}y$. Then, $\dfrac{\partial g^{-1}(y)}{\partial y} = \dfrac{1}{k}$, and

$f_Y(y) = f_X\!\left(\dfrac{y}{k}\right)\left|\dfrac{1}{k}\right| = \dfrac{1}{\sqrt{2\pi}\,\sigma_X |k|} \exp\left( -\dfrac{(y/k)^2}{2\sigma_X^2} \right) = \dfrac{1}{\sqrt{2\pi}\,\sigma_X |k|} \exp\left( -\dfrac{y^2}{2(\sigma_X k)^2} \right)$.

Therefore, multiplication by a gain $k$ results in a normally-distributed RV with a scale change in standard deviation: $Y \sim N(0, k^2\sigma_X^2)$.

EXAMPLE: $Y = AX + B$, where $A$ is a constant (non-singular) matrix, $B$ is a constant vector, and $X \sim N(\bar{x}, \Sigma_X)$.

$X = A^{-1}Y - A^{-1}B$, so $g^{-1}(y) = A^{-1}y - A^{-1}B$ and $\dfrac{\partial g^{-1}(y)}{\partial y} = A^{-1}$.

$f_X(x) = \dfrac{1}{(2\pi)^{n/2}|\Sigma_X|^{1/2}} \exp\left( -\tfrac{1}{2}(x - \bar{x})^T \Sigma_X^{-1} (x - \bar{x}) \right)$.

Therefore,

$f_Y(y) = \dfrac{1}{(2\pi)^{n/2}|\Sigma_X|^{1/2}|A|} \exp\left( -\tfrac{1}{2}(A^{-1}(y - B) - \bar{x})^T \Sigma_X^{-1} (A^{-1}(y - B) - \bar{x}) \right)$

$\phantom{f_Y(y)} = \dfrac{1}{(2\pi)^{n/2}(|A|\,|\Sigma_X|\,|A^T|)^{1/2}} \exp\left( -\tfrac{1}{2}(y - \bar{y})^T (A^{-1})^T \Sigma_X^{-1} A^{-1} (y - \bar{y}) \right)$

$\phantom{f_Y(y)} = \dfrac{1}{(2\pi)^{n/2}|\Sigma_Y|^{1/2}} \exp\left( -\tfrac{1}{2}(y - \bar{y})^T \Sigma_Y^{-1} (y - \bar{y}) \right)$,

if $\Sigma_Y = A\Sigma_X A^T$ and $\bar{y} = A\bar{x} + B$. That is, $Y \sim N(A\bar{x} + B,\; A\Sigma_X A^T)$.

CONCLUSION: A sum of Gaussians is Gaussian: a very special case.

RANDOM NOTE: How can we use randn.m to simulate non-zero-mean Gaussian noise with covariance $\Sigma_Y$? We want $Y \sim N(\bar{y}, \Sigma_Y)$, but randn.m returns $X \sim N(0, I)$.

- Try $y = \bar{y} + A^T x$, where $A$ is square with the same dimension as $\Sigma_Y$ and $A^T A = \Sigma_Y$. ($A$ is the Cholesky factor of the positive-definite symmetric matrix $\Sigma_Y$.)

ybar = [1; 2]; covar = [1, 0.5; 0.5, 1];
A = chol(covar);       % upper-triangular factor, A'*A = covar
x = randn([2, 1]);     % one standard-normal sample
y = ybar + A'*x;       % sample with mean ybar, covariance covar

- When $\Sigma_Y$ is not positive definite (but is non-negative definite), chol fails; an LDL decomposition works instead:

[L, D] = ldl(covar);   % covar = L*D*L'
x = randn([2, 5000]);
y = ybar(:, ones([1, 5000])) + (L*sqrt(D))*x;

[Figure: scatter plot of 5000 samples (x coordinate versus y coordinate) with mean [1; 2] and covariance [1, 0.5; 0.5, 1].]
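A quick check that the samples have the requested statistics (a sketch; the sample count is arbitrary):

% Verify: the sample mean and covariance approach ybar and covar.
ybar = [1; 2]; covar = [1, 0.5; 0.5, 1];
A = chol(covar);
y = ybar + A'*randn(2, 100000);   % implicit expansion (R2016b or later)
sampleMean = mean(y, 2)           % approximately [1; 2]
sampleCov = cov(y')               % approximately covar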

MAIN POINT #4: Normality is preserved in a linear transformation (an extension of the above example). Let $X$ and $W$ be independent vector random variables and $a$ a constant vector:

$X \sim N(\bar{x}, \Sigma_X)$, $\quad W \sim N(\bar{w}, \Sigma_W)$.

Let $Z = AX + BW + a$. Then,

$E[Z] = \bar{z} = A\bar{x} + B\bar{w} + a$
$\Sigma_Z = A\Sigma_X A^T + B\Sigma_W B^T$.

$Z$ is also a normal RV, and $Z \sim N(\bar{z}, \Sigma_Z)$.

2.5: Conditioning

MAIN POINT #5: Conditional probabilities.

Given jointly-distributed RVs, it is often of extreme interest to find the pdf of some of the RVs given known values for the rest.

- For example, given the joint pdf $f_{X,Y}(x, y)$ for RVs $X$ and $Y$, we want the conditional pdf $f_{X|Y}(x \mid y)$, which is the pdf of $X$ for a known value $Y = y$. We can also think of it as $f_{X|Y}(X = x \mid Y = y)$.

DEFINE: The conditional pdf

$f_{X|Y}(x \mid y) = \dfrac{f_{X,Y}(x, y)}{f_Y(y)}$

describes the probability that $X = x$ given that $Y = y$ has happened.

NOTE I: The marginal probability $f_Y(y)$ may be calculated as

$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx$.

For each $y$, integrate out the effect of $X$.

NOTE II: If $X$ and $Y$ are independent, $f_{X,Y}(x, y) = f_X(x)f_Y(y)$. Therefore,

$f_{X|Y}(x \mid y) = \dfrac{f_{X,Y}(x, y)}{f_Y(y)} = \dfrac{f_X(x)f_Y(y)}{f_Y(y)} = f_X(x)$.

Knowing that $Y = y$ has occurred provides no information about $X$. (Is this what you would expect?)

DIRECT EXTENSION:

$f_{X,Y}(x, y) = f_{X|Y}(x \mid y)f_Y(y) = f_{Y|X}(y \mid x)f_X(x)$.

Therefore,

$f_{X|Y}(x \mid y) = \dfrac{f_{Y|X}(y \mid x)\,f_X(x)}{f_Y(y)}$.

This is known as Bayes' rule.

- It relates the a posteriori probability to the a priori probability.
- It forms a key step in the Kalman filter derivation.

MAIN POINT #6: Conditional expectation.

Now that we have a way of expressing a conditional probability, we can also express a conditional mean. What do we expect the value of $X$ to be, given that $Y = y$ has happened?

$E[X \mid Y = y] = \int_{-\infty}^{\infty} x\,f_{X|Y}(x \mid y)\,dx$.

- Conditional expectation is not a constant (as expectation is) but a random variable: it is a function of the conditioning random variable (i.e., $Y$).
- Note: Conditional expectation is critical. The Kalman filter is an algorithm to compute $E[x_k \mid \mathbb{Z}_k]$, where we define $\mathbb{Z}_k$ later.

MAIN POINT #7: Central limit theorem.

If $Y = \sum_i X_i$, where the $X_i$ are independent and identically distributed (IID) and have finite mean and variance, then $Y$ will be approximately normally distributed. The approximation improves as the number of summed RVs gets large.

Since the state of our dynamic system adds up the effects of many independent random inputs, it is reasonable to assume that the distribution of the state tends to the normal distribution.

This leads to the key assumptions for the derivation of the Kalman filter, as we will see:

- We will assume that the state $x_k$ is a normally-distributed RV;
- We will assume that the process noise $w_k$ is a normally-distributed RV;
- We will assume that the sensor noise $v_k$ is a normally-distributed RV;
- We will assume that $w_k$ and $v_k$ are uncorrelated with each other.

Even when these assumptions are broken in practice, the Kalman filter often works quite well. The exceptions tend to be very highly nonlinear systems, for which particle filters must sometimes be employed to get good estimates.

2.6: Vector random (stochastic) processes

A stochastic (random) process is a family of random vectors indexed by a parameter set ("time" in our case). For example, the random process $X(t)$.

- The value of the random process at any time $t_0$ is the RV $X(t_0)$.
- We usually assume stationarity: the pdfs of the random variables are time-shift invariant. Therefore, $E[X(t)] = \bar{x}$ for all $t$, and $E[X(t_1)X^T(t_2)] = R_X(t_1 - t_2)$.

Properties

1. Autocorrelation: $R_X(t_1, t_2) = E[X(t_1)X^T(t_2)]$. If stationary, $R_X(\tau) = E[X(t)X^T(t + \tau)]$.
   - Provides a measure of the correlation between elements of the process having time displacement $\tau$.
   - $R_X(0) = \Sigma_X$ for zero-mean $X$.
   - $R_X(0)$ is always the maximum value of $R_X(\tau)$.
2. Autocovariance: $C_X(t_1, t_2) = E[(X(t_1) - E[X(t_1)])(X(t_2) - E[X(t_2)])^T]$. If stationary, $C_X(\tau) = E[(X(t) - \bar{x})(X(t + \tau) - \bar{x})^T]$.

White noise

Correlation from one time instant to the next is a key property of a random process, so $R_X(\tau)$ and $C_X(\tau)$ are very important. Some processes have a unique autocorrelation:

- Zero mean, and

$R_X(\tau) = E[X(t)X(t + \tau)^T] = S_X\,\delta(\tau)$,

where $\delta(\tau)$ is the Dirac delta. Therefore, the process is uncorrelated in time. This is clearly an abstraction, but it proves to be a very useful one.

[Figure: sample paths of white noise versus correlated noise, value versus time.]

Power spectral density (PSD)

Consider stationary random processes with autocorrelation defined as $R_X(\tau) = E[X(t)X(t + \tau)^T]$.

- If a process varies slowly, time-adjacent samples are highly correlated, and $R_X(\tau)$ will drop off slowly: mostly low-frequency content.
- If a process varies quickly, time-adjacent samples aren't very correlated, and $R_X(\tau)$ drops off quickly: more high-frequency content.
- Therefore, the autocorrelation function tells us about the frequency content of the random signal.

Consider the scalar case. We define the power spectral density (PSD) as the Fourier transform of the autocorrelation function:

$S_X(\omega) = \int_{-\infty}^{\infty} R_X(\tau)\,e^{-j\omega\tau}\,d\tau$
$R_X(\tau) = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} S_X(\omega)\,e^{j\omega\tau}\,d\omega$.

- $S_X(\omega)$ gives a frequency-domain interpretation of the power (energy) in the random process $X(t)$, and is real, symmetric, and non-negative for real random processes.

EXAMPLE: $R_X(\tau) = \sigma^2 e^{-\beta|\tau|}$ for $\beta > 0$. Then,

$S_X(\omega) = \int_{-\infty}^{\infty} \sigma^2 e^{-\beta|\tau|}\,e^{-j\omega\tau}\,d\tau = \dfrac{2\sigma^2\beta}{\omega^2 + \beta^2}$.

[Figure: $R_X(\tau)$ with peak $\sigma^2$, and $S_X(\omega)$ with peak $2\sigma^2/\beta$; $\beta$ is a bandwidth measure.]

INTERPRETATION:

1. The area under the PSD $S_X(\omega)$ for $\omega_1 \leq \omega \leq \omega_2$ provides a measure of the energy in the signal in that frequency range.
2. White noise can be thought of as a limiting process as $\beta \to \infty$. From our prior example, let $\beta \to \infty$ (scaling $\sigma^2$ with $\beta$ so that the area under $R_X$ stays constant). As $\beta \to \infty$, $R_X(\tau) \to \delta(\tau)$: therefore white. The bandwidth of the spectral content is approximately $\beta$, so it goes to infinity. This is an abstraction, since it implies that the signal has infinite energy!
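The transform pair in the example can be verified by splitting the integral at $\tau = 0$ (a short worked derivation, using the same notation):

$S_X(\omega) = \sigma^2\int_{-\infty}^{0} e^{(\beta - j\omega)\tau}\,d\tau + \sigma^2\int_{0}^{\infty} e^{-(\beta + j\omega)\tau}\,d\tau = \dfrac{\sigma^2}{\beta - j\omega} + \dfrac{\sigma^2}{\beta + j\omega} = \dfrac{2\sigma^2\beta}{\omega^2 + \beta^2}$.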

2.7: Discrete-time dynamic systems with random inputs

What are the statistics of the state of a system driven by a random process? That is, how do the system dynamics influence the time-propagation of the statistics of $x(t)$?

- We do not know $x(t)$ exactly, but we can develop a pdf for $x(t)$.

Primary example: A linear system driven by white noise $w(t)$, possibly in addition to a deterministic input $u(t)$.

[Block diagram: inputs $u(t)$ and $w(t)$ entering the system $\{A, B, C, D\}$, with state $x(t)$ and output $y(t)$.]

Contrast this to a deterministic simulation, which can be easily simulated:

$\dot{x}(t) = Ax(t) + Bu(t)$, $\quad x(0) = x_0$, with $x(0)$ and $u(t)$ known.

The system response is completely specified; therefore, there is no uncertainty.

The stochastic propagation is different, since there are inputs that are not known exactly. The best we can do is to say how the uncertainty in the state changes with time.

[Figure: a deterministic state trajectory from $x(t_0)$ to $x(t_f)$, versus a stochastic trajectory where the pdf $\Pr(x)$ spreads as it propagates from $x(t_0)$ to $x(t_f)$.]

- Deterministic: $x(t_0)$ and inputs known, so the state trajectory is deterministic.

- Stochastic: the pdf for $x(t_0)$ and the inputs are known, so we can find only a pdf for $x(t)$.
- We will work with Gaussian noises to a large extent, which are uniquely defined by the first and second central moments of the statistics. The Gaussian assumption is not essential: we will only ever track the first two moments.

NOTATION: Until now, we have always used capital letters for random variables. The state of a system driven by a random process is a random vector, so we could now call it $X(t)$ or $X_k$. However, it is more common to retain the standard notation $x(t)$ or $x_k$ and understand from the context that we are now discussing an RV.

Discrete-time systems

We will start with a discrete system and then look at how to get an equivalent answer for continuous systems. Model:

$x_k = A_{d,k-1}x_{k-1} + B_{d,k-1}u_{k-1} + w_{k-1}$,

where

$x_k$: state vector; a random process;
$u_k$: deterministic control inputs;
$w_k$: noise that drives the process (process noise); a random process.

- $A_d$ and $B_d$ are assumed known. They can be time-varying; for now, assume they are fixed. Generalize later if necessary.

Key assumptions about the driving noise:

1. Zero mean: $E[w_k] = 0 \;\forall\; k$.
2. White: $E[w_{k_1}w_{k_2}^T] = \Sigma_w\,\delta(k_1 - k_2)$, where $\delta(\cdot)$ is here the Kronecker delta. $\Sigma_w$ is called the spectral density of the noise signal $w_k$.

Uncertainty at startup: statistics of the initial condition.

$E[x_0] = \bar{x}_0$; $\quad E[(x_0 - \bar{x}_0)w_k^T] = 0 \;\forall\; k$; $\quad \Sigma_{x,0} = E[(x_0 - \bar{x}_0)(x_0 - \bar{x}_0)^T]$.

Key question: How do we propagate the statistics of this system?

Mean value: The mean value is

$E[x_k] = \bar{x}_k = E[A_d x_{k-1} + B_d u_{k-1} + w_{k-1}] = A_d\bar{x}_{k-1} + B_d u_{k-1}$.

Therefore, the mean propagation is

$\bar{x}_0$: given; $\quad \bar{x}_k = A_d\bar{x}_{k-1} + B_d u_{k-1}$.

Deterministic simulation and mean values are treated the same way.

Variations about the mean

To study the random variations about the mean, we need to form the second central moment of the statistics:

$\Sigma_{x,k} = E[(x_k - \bar{x}_k)(x_k - \bar{x}_k)^T]$.

This is easiest to study if we note that

$x_k - \bar{x}_k = A_d x_{k-1} + B_d u_{k-1} + w_{k-1} - A_d\bar{x}_{k-1} - B_d u_{k-1} = A_d(x_{k-1} - \bar{x}_{k-1}) + w_{k-1}$.

Thus,

$\Sigma_{x,k} = E[(A_d(x_{k-1} - \bar{x}_{k-1}) + w_{k-1})(A_d(x_{k-1} - \bar{x}_{k-1}) + w_{k-1})^T]$.

Three terms:

1. $E[A_d(x_{k-1} - \bar{x}_{k-1})(x_{k-1} - \bar{x}_{k-1})^T A_d^T] = A_d\Sigma_{x,k-1}A_d^T$.
2. $E[w_{k-1}w_{k-1}^T] = \Sigma_w$.
3. $E[A_d(x_{k-1} - \bar{x}_{k-1})w_{k-1}^T] = \,?$

The third term is a cross-correlation term. But $x_{k-1}$ depends only on $x_0$ and the inputs $w_m$ for $m = 0 \ldots k-2$, and $w_{k-1}$ is white noise, uncorrelated with $x_0$. Therefore, the third term is zero.

Therefore, the covariance propagation is

$\Sigma_{x,0}$: given; $\quad \Sigma_{x,k} = A_d\Sigma_{x,k-1}A_d^T + \Sigma_w$.

In this equation, $A_d\Sigma_{x,k-1}A_d^T$ is the homogeneous part; $\Sigma_w$ is the driving term.

Note: If $A_d$ and $\Sigma_w$ are constant and $A_d$ is stable, there is a steady-state solution. As $k \to \infty$, $\Sigma_{x,k} = \Sigma_{x,k+1} = \Sigma_x$.

Then,

$\Sigma_x = A_d\Sigma_x A_d^T + \Sigma_w$.

This form is called a discrete Lyapunov equation. In MATLAB, dlyap.m.

EXAMPLE: We can solve by hand in the scalar case. Consider:

$x_k = a\,x_{k-1} + w_{k-1}$, $\quad E[w_k] = 0$, $\quad \Sigma_w = 1$, $\quad \bar{x}_0 = 0$.

Then,

$\sigma_x^2 = a^2\sigma_x^2 + 1 \quad\Rightarrow\quad \sigma_x^2(1 - a^2) = 1 \quad\Rightarrow\quad \sigma_x^2 = \dfrac{1}{1 - a^2}$.

This is valid for $|a| < 1$; otherwise, the system is unstable.

In the example below, we plot 100 random trajectories for $a = 0.75$ and $\Sigma_{x,0} = 50$, and compare the propagation of $\Sigma_{x,k}$ with the steady-state dlyap.m solution.

[Figure: left, propagation of $\Sigma_{x,k}$ versus time; right, 100 simulated state trajectories versus time, with error bounds. The error bounds plotted are $3\sigma$ bounds (i.e., $\pm 3\sqrt{\Sigma_{x,k}}$).]
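A sketch of this experiment (assuming $a = 0.75$, $\Sigma_w = 1$, and $\Sigma_{x,0} = 50$ as above; dlyap.m requires the Control System Toolbox):

% Propagate the covariance of x_k = a*x_{k-1} + w_{k-1}, compare with
% the steady-state discrete Lyapunov solution, and simulate trajectories.
a = 0.75; Sigw = 1; N = 40; M = 100;
Sigx = zeros(1, N+1); Sigx(1) = 50;
for k = 1:N
    Sigx(k+1) = a*Sigx(k)*a' + Sigw;     % covariance recursion
end
SigxSS = dlyap(a, Sigw)                  % steady state: 1/(1-a^2) = 2.286
X = zeros(M, N+1);
X(:,1) = sqrt(Sigx(1))*randn(M, 1);      % M initial states, variance 50
for k = 1:N
    X(:,k+1) = a*X(:,k) + sqrt(Sigw)*randn(M, 1);
end
plot(0:N, X', 'b', 0:N, 3*sqrt(Sigx), 'k', 0:N, -3*sqrt(Sigx), 'k');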

2.8: Continuous-time dynamic systems with random inputs

For continuous-time systems we have

$\dot{x}(t) = A(t)x(t) + B_u(t)u(t) + B_w(t)w(t)$,

where

$x(t)$: state vector; a random process;
$u(t)$: deterministic control inputs;
$w(t)$: noise driving the process (process noise); a random process.

Further:

$E[w(t)] = 0$; $\quad E[w(t)w(\tau)^T] = S_w\,\delta(t - \tau)$;
$E[x(0)] = \bar{x}(0)$; $\quad E[(x(0) - \bar{x}(0))(x(0) - \bar{x}(0))^T] = \Sigma_x(0)$.

- $A$, $B_u$, and $B_w$ are assumed known. They can be time-varying; for now, assume they are fixed. Generalize later if necessary.
- $S_w$ is the spectral density of $w(t)$.

The easiest analysis is to discretize the model, use the results obtained earlier, and let $\Delta t \to 0$. We drop the deterministic inputs $u(t)$ for simplicity. (They affect only the mean, and in known ways.)

Continuous-time propagation of statistics

Mean value

Starting with the discrete case, we have

$\bar{x}_k = A_d\bar{x}_{k-1} + B_d u_{k-1}$, $\quad A_d = e^{A\Delta t} \approx I + A\Delta t + O(\Delta t^2)$.

Let $\Delta t \to 0$ and consider the case when $u_k = 0$ for all $k$:

$\bar{x}_k = (A\Delta t + I)\bar{x}_{k-1} \quad\Rightarrow\quad \dfrac{\bar{x}_k - \bar{x}_{k-1}}{\Delta t} = A\bar{x}_{k-1}$.

As $\Delta t \to 0$,

$\dot{\bar{x}}(t) = A\bar{x}(t)$, as expected.

Therefore, the mean propagation is

$\bar{x}(0)$: given; $\quad \dot{\bar{x}}(t) = A\bar{x}(t) + B_u u(t)$.

Deterministic simulation and mean values are treated the same way.

Variations about the mean

The result for discrete-time systems was:

$\Sigma_{x,k} = A_d\Sigma_{x,k-1}A_d^T + \Sigma_w$.

But, we need a way to relate the discrete $\Sigma_w$ to a continuous spectral density $S_w$ before we can proceed. Recall the discrete system response in terms of the continuous system matrices:

$x_k = e^{A\Delta t}x_{k-1} + \int_{(k-1)\Delta t}^{k\Delta t} e^{A(k\Delta t - \tau)}B_w w(\tau)\,d\tau = e^{A\Delta t}x_{k-1} + w_{k-1}$.

The integral explicitly accounts for variations in the noise during $\Delta t$.

RECALL: Discrete white noise is of the form

$E[w_k w_l^T] = \begin{cases} \Sigma_w, & k = l; \\ 0, & k \neq l. \end{cases}$

But the noise input term of our converted state dynamics is defined as an integral:

$w_{k-1} = \int_{(k-1)\Delta t}^{k\Delta t} e^{A(k\Delta t - \tau)}B_w w(\tau)\,d\tau$.

Form the outer product to get the equivalent $\Sigma_w$:

$\Sigma_w = E\left[\left(\int_{(k-1)\Delta t}^{k\Delta t} e^{A(k\Delta t - \tau)}B_w w(\tau)\,d\tau\right)\left(\int_{(k-1)\Delta t}^{k\Delta t} e^{A(k\Delta t - \xi)}B_w w(\xi)\,d\xi\right)^T\right]$

$\phantom{\Sigma_w} = E\left[\int_{(k-1)\Delta t}^{k\Delta t}\int_{(k-1)\Delta t}^{k\Delta t} e^{A(k\Delta t - \tau)}B_w w(\tau)w(\xi)^T B_w^T e^{A^T(k\Delta t - \xi)}\,d\tau\,d\xi\right]$.

This is a big ugly mess, but it has two saving graces:

1. The expectation can go inside the integrals.
2. $E[w(\tau)w(\xi)^T] = S_w\,\delta(\tau - \xi)$, so one of the integrals drops out!

So, we have

$\Sigma_w = \int_{(k-1)\Delta t}^{k\Delta t} e^{A(k\Delta t - \tau)}B_w S_w B_w^T e^{A^T(k\Delta t - \tau)}\,d\tau$.

KEY POINT: While $S_w$ may have a simple form, $\Sigma_w$ will be a full matrix in general.

To solve the integral, we can approximate: as $\Delta t \to 0$, $e^{A(k\Delta t - \tau)} = I + A(k\Delta t - \tau) + \cdots \approx I$. Then,

$\Sigma_w \approx B_w S_w B_w^T\,\Delta t$.

We will see a better method to evaluate $\Sigma_w$ when $\Delta t \not\approx 0$, but for now we continue with this result to determine the continuous-time system covariance propagation.

Start with $\Sigma_{x,k} = A_d\Sigma_{x,k-1}A_d^T + \Sigma_w$, and substitute $\Sigma_w \approx B_w S_w B_w^T\,\Delta t$. Also, use $A_d \approx I + A\Delta t$:

$\Sigma_{x,k} \approx (I + A\Delta t)\Sigma_{x,k-1}(I + A\Delta t)^T + B_w S_w B_w^T\,\Delta t$
$\phantom{\Sigma_{x,k}} = \Sigma_{x,k-1} + \Delta t\,(A\Sigma_{x,k-1} + \Sigma_{x,k-1}A^T + B_w S_w B_w^T) + O(\Delta t^2)$.

$\dfrac{\Sigma_{x,k} - \Sigma_{x,k-1}}{\Delta t} = A\Sigma_{x,k-1} + \Sigma_{x,k-1}A^T + B_w S_w B_w^T + O(\Delta t)$.

As $\Delta t \to 0$,

$\dot{\Sigma}_x(t) = A\Sigma_x(t) + \Sigma_x(t)A^T + B_w S_w B_w^T$,

initialized with $\Sigma_x(0)$.

- This is a matrix differential equation. It is symmetric, so we don't need to solve for every element.

Two effects:

- $A\Sigma_x(t) + \Sigma_x(t)A^T$: Homogeneous part. Contractive for stable $A$; reduces covariance.
- $B_w S_w B_w^T$: Impact of process noise. Tends to increase covariance.

Steady-state solution: The effects balance for systems with constant dynamics ($A$, $B_w$, $S_w$) and stable $A$:

$A\Sigma_{x,ss} + \Sigma_{x,ss}A^T + B_w S_w B_w^T = 0$.

This is a continuous-time Lyapunov equation. In MATLAB, lyap.m.

The covariance propagation is

$\Sigma_x(0)$: given; $\quad \dot{\Sigma}_x(t) = A\Sigma_x(t) + \Sigma_x(t)A^T + B_w S_w B_w^T$.

In this equation, $A\Sigma_x(t) + \Sigma_x(t)A^T$ is the homogeneous part; $B_w S_w B_w^T$ is the driving term.

EXAMPLE: $\dot{x}(t) = Ax(t) + B_w w(t)$, where $A$ and $B_w$ are scalars. Then,

$\dot{\sigma}_x^2 = 2A\sigma_x^2 + B_w^2 S_w$.

This can be solved in closed form to find

$\sigma_x^2(t) = \dfrac{B_w^2 S_w}{2A}\left(e^{2At} - 1\right) + \sigma_x^2(0)\,e^{2At}$.

If $A < 0$ (stable), then the initial-condition contribution goes to zero:

$\sigma_{x,ss}^2 = -\dfrac{B_w^2 S_w}{2A}$, independent of $\sigma_x^2(0)$.

Example: Let $A = -2$, $B_w = 1$, $S_w = 2$, and $\sigma_x^2(0) = 5$. Find $\sigma_x^2(t)$.

Solution: $\sigma_{x,ss}^2 = -\dfrac{B_w^2 S_w}{2A} = \dfrac{1}{2}$.

- $\sigma_{x,ss}^2$ increases as more noise is added: the $B_w^2 S_w$ term.
- $\sigma_{x,ss}^2$ decreases as $A$ becomes more stable.

[Figure: $\sigma_x^2(t)$ decaying from 5 toward its steady-state value of 1/2 over time.]
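A sketch of this example (using the values $A = -2$, $B_w = 1$, $S_w = 2$, $\sigma_x^2(0) = 5$ above; lyap.m requires the Control System Toolbox):

% Closed-form variance propagation for xdot = A*x + Bw*w, plus the
% steady-state solution of A*X + X*A' + Bw*Sw*Bw' = 0 from lyap.m.
A = -2; Bw = 1; Sw = 2; Sigx0 = 5;
t = 0:0.01:3;
Sigx = (Bw^2*Sw/(2*A))*(exp(2*A*t) - 1) + Sigx0*exp(2*A*t);
SigxSS = lyap(A, Bw*Sw*Bw')      % returns 0.5
plot(t, Sigx); xlabel('Time'); ylabel('\sigma_x^2(t)');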

2.9: Relating $\Sigma_w$ to $S_w$ precisely: A little trick

We want to find $\Sigma_w$ precisely, regardless of $\Delta t$. Assume we know $S_w$ and the plant dynamics:

$\Sigma_w = \int_0^{\Delta t} e^{A(\Delta t - \tau)}B_w S_w B_w^T e^{A^T(\Delta t - \tau)}\,d\tau$.

Define

$Z = \begin{bmatrix} -A & B_w S_w B_w^T \\ 0 & A^T \end{bmatrix}$; $\qquad C = e^{Z\Delta t} = \begin{bmatrix} c_{11} & c_{12} \\ 0 & c_{22} \end{bmatrix}$.

We can compute $C = \mathcal{L}^{-1}[(sI - Z)^{-1}]\big|_{t = \Delta t}$ and find that

$c_{11} = \mathcal{L}^{-1}[(sI + A)^{-1}]\big|_{t = \Delta t} = e^{-A\Delta t}$
$c_{12} = \mathcal{L}^{-1}[(sI + A)^{-1}B_w S_w B_w^T(sI - A^T)^{-1}]\big|_{t = \Delta t}$
$c_{22} = \mathcal{L}^{-1}[(sI - A^T)^{-1}]\big|_{t = \Delta t} = e^{A^T\Delta t} = A_d^T$.

So, $c_{22}^T = A_d$. Also, $c_{22}^T c_{12} = \Sigma_w$. With one expm.m command, we can quickly switch from continuous time to discrete time.

To show the second identity, recognize that $c_{12}$ is a convolution of two impulse responses: one due to the system $(sI + A)^{-1}B_w$ and the other due to the system $S_w B_w^T(sI - A^T)^{-1}$. The first of these has impulse response $e^{-At}B_w$ and the second has impulse response $S_w B_w^T e^{A^T t}$. Convolving them, we get

$c_{12} = \int_0^{\Delta t} e^{-A\tau}B_w S_w B_w^T e^{A^T(\Delta t - \tau)}\,d\tau$.

Finally, consider

$c_{22}^T c_{12} = \int_0^{\Delta t} e^{A(\Delta t - \tau)}B_w S_w B_w^T e^{A^T(\Delta t - \tau)}\,d\tau = \Sigma_w$.

EXAMPLE: Consider a two-state system

$\dot{x}(t) = Ax(t) + \begin{bmatrix} 0 \\ 1 \end{bmatrix}w(t)$

with $S_w = 0.06$ and $\Delta t = 2/16$. Forming $Z$ as above and computing $C = e^{Z\Delta t}$ with expm.m yields

$A_d = \begin{bmatrix} 0.93 & 0.3 \\ 0.3 & 0.6 \end{bmatrix}$; $\quad \Sigma_w = \begin{bmatrix} 0.0009 & 0.0030 \\ 0.0030 & 0.0156 \end{bmatrix}$; $\quad B_w S_w B_w^T = \begin{bmatrix} 0 & 0 \\ 0 & 0.06 \end{bmatrix}$.

The exact $\Sigma_w$ and the approximate $\Sigma_w \approx B_w S_w B_w^T\,\Delta t$ are very different, due to the large $\Delta t$.

Compare predictions of steady-state performance:

Continuous: $A\Sigma_{x,ss} + \Sigma_{x,ss}A^T + B_w S_w B_w^T = 0$ gives $\Sigma_{x,ss} = \begin{bmatrix} 0.03 & 0 \\ 0 & 0.03 \end{bmatrix}$.

Discrete: $\Sigma_{x,ss} = A_d\Sigma_{x,ss}A_d^T + \Sigma_w$ gives $\Sigma_{x,ss} = \begin{bmatrix} 0.03 & 0 \\ 0 & 0.03 \end{bmatrix}$.

Because we used discrete white noise scaled to be equivalent to the continuous noise, the steady-state predictions are the same. The conversion is now complete:
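This computation is one expm.m call (a sketch; the $A$ matrix below is an assumed stand-in for illustration, not necessarily the matrix of the example above):

% Van-Loan-style conversion from (A, Bw, Sw) to (Ad, Sigw) for any dt.
A = [0, 1; -2, -2]; Bw = [0; 1];      % assumed example system
Sw = 0.06; dt = 2/16;
n = size(A, 1);
Z = [-A, Bw*Sw*Bw'; zeros(n), A'];
C = expm(Z*dt);
Ad = C(n+1:2*n, n+1:2*n)';            % Ad = c22'
Sigw = Ad*C(1:n, n+1:2*n);            % Sigw = c22'*c12

For small $\Delta t$, Sigw approaches Bw*Sw*Bw'*dt; for larger $\Delta t$ it becomes a full matrix, as in the example above.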

DISCRETE: $A_d$, $\Sigma_w$, $B_d$:

$\Sigma_{x,0}$ given: $\quad \Sigma_{x,k} = A_d\Sigma_{x,k-1}A_d^T + \Sigma_w$
$\bar{x}_0$ given: $\quad \bar{x}_k = A_d\bar{x}_{k-1} + B_d u_{k-1}$.

CONTINUOUS: $A$, $S_w$, $B_u$, $B_w$:

$\Sigma_x(0)$ given: $\quad \dot{\Sigma}_x(t) = A\Sigma_x(t) + \Sigma_x(t)A^T + B_w S_w B_w^T$
$\bar{x}(0)$ given: $\quad \dot{\bar{x}}(t) = A\bar{x}(t) + B_u u(t)$.

All matrices may be time varying. An equivalent $\Sigma_w$ for a given $S_w$ is available.

2.10: Shaping filters

We have assumed that the inputs to the linear dynamic systems are white (Gaussian) noises. Therefore, the PSD is flat: the input has content at all frequencies.

This is a pretty limiting assumption, but one that can be easily fixed: we can use a second linear system to shape the noise and modify the PSD as desired.

Previous picture: white noise $w(t) \to G(s) \to y(t)$.
New picture: white noise $w(t) \to H(s) \to$ shaped noise $w_1(t) \to G(s) \to y(t)$.

Therefore, we can drive our linear system with noise that has a desired PSD by introducing a shaping filter $H(s)$ that is itself driven by white noise. The combined system $G(s)H(s)$ looks exactly the same as before, but the system $G(s)$ is not driven by pure white noise any more.

The analysis is quite simple: augment the original model with the filter states. The original system has

$\dot{x}(t) = Ax(t) + B_w w_1(t)$
$y(t) = Cx(t)$.

The new shaping filter with white input and desired-PSD output has

$\dot{x}_s(t) = A_s x_s(t) + B_s w(t)$
$w_1(t) = C_s x_s(t)$.

Combine into one system:

$\begin{bmatrix} \dot{x}(t) \\ \dot{x}_s(t) \end{bmatrix} = \begin{bmatrix} A & B_w C_s \\ 0 & A_s \end{bmatrix}\begin{bmatrix} x(t) \\ x_s(t) \end{bmatrix} + \begin{bmatrix} 0 \\ B_s \end{bmatrix}w(t)$

$y(t) = \begin{bmatrix} C & 0 \end{bmatrix}\begin{bmatrix} x(t) \\ x_s(t) \end{bmatrix}$.

The augmented system is just a larger-order linear system driven by white noise.

KEY QUESTION: Given a PSD for $w_1$, how do we find $H(s)$ (or $A_s$, $B_s$, $C_s$)?

Fact: If $w$ is unit-intensity white noise, then the PSD of $w_1$ is

$S_{w_1}(\omega) = H(j\omega)H^T(-j\omega)$.

- Find $H(j\omega)$ by spectral factorization of $S_{w_1}(\omega)$. Take the minimum-phase part (keep the poles and zeros that have negative real part).

EXAMPLE: Let

$S_{w_1}(\omega) = \dfrac{2\beta}{\omega^2 + \beta^2} = \dfrac{\sqrt{2\beta}}{\beta + j\omega}\cdot\dfrac{\sqrt{2\beta}}{\beta - j\omega}$.

Then $H(s) = \dfrac{\sqrt{2\beta}}{s + \beta}$; that is,

$\dot{x}_s(t) = -\beta x_s(t) + \sqrt{2\beta}\,w(t)$, $\quad w_1(t) = x_s(t)$.

EXAMPLE: Wind model.

Wind gusts have some high-frequency content, but typically are dominated by low-frequency disturbances. Is white noise a good model? We would typically think of a PSD for the wind that puts more emphasis on the low-frequency component of the signal. Think of this as the output of a shaping filter

$H(s) = \dfrac{1}{\tau_w s + 1}$.

NOTE: If the bandwidth of the wind filter ($1/\tau_w$) is greater than the bandwidth of the plant dynamics, then white noise is a good approximation to the system input. Otherwise, we must use shaping filters in the plant model and drive them with white noise.

- The size of the $A$ matrix becomes a limitation: use the simplest noise model possible.
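A sketch of the augmentation (the one-state plant and the time constant $\tau_w = 5$ are assumed values for illustration; lyap.m requires the Control System Toolbox):

% Augment a plant with the shaping filter H(s) = 1/(tauw*s + 1) driven
% by unit-intensity white noise, then compute the steady-state state
% covariance of the augmented system from the Lyapunov equation.
A = -1; Bw = 1; C = 1;               % simple one-state plant (assumed)
tauw = 5;
As = -1/tauw; Bs = 1/tauw; Cs = 1;   % H(s) = 1/(tauw*s + 1)
Aaug = [A, Bw*Cs; 0, As];
Baug = [0; Bs]; Sw = 1;
SigSS = lyap(Aaug, Baug*Sw*Baug')    % steady-state covariance of [x; xs]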
