Stochastic Signals
J. McNames, Portland State University, ECE 538/638
Overview
- Definitions
- Second-order statistics
- Stationarity and ergodicity
- Random signal variability
- Power spectral density
- Linear systems with stationary inputs
- Random signal memory
- Correlation matrices

Introduction
- Discrete-time stochastic processes provide a mathematical framework for working with non-deterministic signals
- Signals that have an exact functional relationship are often called predictable or deterministic, though some stochastic processes are predictable
- I am going to use the term deterministic to refer to signals that are not affected by the outcome of a random experiment
- I will use the terms stochastic process and random process interchangeably

Probability Space
- Conceptually we should imagine a sample space Ω with some number (possibly infinite) of outcomes: Ω = {ζ_1, ζ_2, ...}
- Each outcome has a probability Pr{ζ_k}
- By some rule, each outcome generates a sequence x(n, ζ_k)
- We can think of x(n, ζ_k) as a vector of (possibly) infinite duration
- Note that the entire sequence is generated from a single outcome of the underlying experiment
- x(n, ζ) is called a discrete-time stochastic process or a random sequence

Definitions and Interpretations
- Random variable: x(n, ζ) with n = n_0 fixed and ζ treated as a variable
- Sample sequence: x(n, ζ) with ζ = ζ_k fixed and n treated as an independent (non-random) variable
- Number: x(n, ζ) with both ζ = ζ_k and n = n_0 fixed
- Stochastic process: x(n, ζ) with both ζ and n treated as variables
- Realization: a sample sequence
- Ensemble: the set of all possible sequences, {x(n, ζ)}
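The four interpretations above can be made concrete with a small simulation. This is a minimal sketch, assuming a hypothetical generating rule (a cosine whose phase is set by the outcome ζ); the rule and sizes are illustrative, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical random experiment: each outcome zeta picks a random phase, and
# the rule x(n, zeta) = cos(0.2*pi*n + phase(zeta)) generates a whole sequence.
n = np.arange(100)                 # time index
num_outcomes = 5                   # a few outcomes zeta_1, ..., zeta_5
phases = rng.uniform(-np.pi, np.pi, size=num_outcomes)

# Each row is one realization x(n, zeta_k); the rows together form (a piece of)
# the ensemble.
ensemble = np.cos(0.2 * np.pi * n + phases[:, None])

x_fixed_time = ensemble[:, 10]    # random variable: n = n_0 fixed, zeta varies
x_fixed_outcome = ensemble[2, :]  # sample sequence: zeta = zeta_3 fixed, n varies
x_number = ensemble[2, 10]        # a single number: both fixed
```

Note that the entire row is determined by a single draw of the phase, matching the point that one outcome of the experiment generates the whole sequence.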
Probability Functions
- To fully characterize a stochastic process, we must consider the cdf or pdf

    F_x(x_1, ..., x_k; n_1, ..., n_k) = Pr{x(n_1) ≤ x_1, ..., x(n_k) ≤ x_k}
    f_x(x_1, ..., x_k; n_1, ..., n_k) = ∂^k F_x(x_1, ..., x_k; n_1, ..., n_k) / (∂x_1 ··· ∂x_k)

  for every k ≥ 1 and any set of sample times {n_1, n_2, ..., n_k}
- Without additional sweeping assumptions, estimation of f_x(·) from a realization is impossible
- Many stochastic processes can be characterized accurately, or at least usefully, by much less information
- To simplify notation, from here on we will mostly use x(n) to denote both random processes and single realizations
- In most cases we will assume x(n) is complex-valued

Second-Order Statistics
- At any time n, we can specify the mean and variance of x(n):

    μ_x(n) ≜ E[x(n)]        σ_x²(n) ≜ E[|x(n) − μ_x(n)|²]

- μ_x(n) and σ_x²(n) are both deterministic sequences
- The expectation is taken over the ensemble
- In general, the second-order statistics at two different times are given by the autocorrelation or autocovariance sequences

    Autocorrelation sequence: r_xx(n_1, n_2) = E[x(n_1) x*(n_2)]
    Autocovariance sequence:  γ_xx(n_1, n_2) = E[(x(n_1) − μ_x(n_1)) (x(n_2) − μ_x(n_2))*]
                                             = r_xx(n_1, n_2) − μ_x(n_1) μ_x*(n_2)

Cross-Correlation and Cross-Covariance

    Cross-correlation: r_xy(n_1, n_2) = E[x(n_1) y*(n_2)]
    Cross-covariance:  γ_xy(n_1, n_2) = E[(x(n_1) − μ_x(n_1)) (y(n_2) − μ_y(n_2))*]
                                      = r_xy(n_1, n_2) − μ_x(n_1) μ_y*(n_2)
    Normalized cross-correlation: ρ_xy(n_1, n_2) = γ_xy(n_1, n_2) / (σ_x(n_1) σ_y(n_2))

More Definitions
- Independent: iff f_x(x_1, ..., x_k; n_1, ..., n_k) = Π_{l=1}^{k} f_l(x_l; n_l)
- Uncorrelated: if γ_x(n_1, n_2) = σ_x²(n_1) for n_1 = n_2 and 0 for n_1 ≠ n_2
- Orthogonal: if r_x(n_1, n_2) = σ_x²(n_1) + |μ_x(n_1)|² for n_1 = n_2 and 0 for n_1 ≠ n_2
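Since the expectation is taken over the ensemble, these quantities can be estimated by Monte Carlo averaging over many realizations. A minimal sketch, using an illustrative nonstationary process x(n) = n·w(n) with w(n) IID N(0, 1), for which μ_x(n) = 0 and r_xx(n_1, n_2) = n_1² δ(n_1 − n_2); the process and sample sizes are my choices, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble of realizations of x(n) = n * w(n), w(n) ~ IID N(0, 1).
num_realizations, N = 200_000, 8
w = rng.standard_normal((num_realizations, N))
x = np.arange(N) * w

# Average over the ensemble (axis 0 indexes zeta), not over time.
mu_hat = x.mean(axis=0)                  # estimate of mu_x(n); should be ~0
r_hat = (x.T @ x) / num_realizations     # estimate of r_xx(n1, n2)
# Diagonal entries approach n^2; off-diagonal entries approach 0.
```

Note that r_hat[3, 3] ≈ 9 while r_hat[2, 5] ≈ 0, consistent with an uncorrelated but nonstationary process.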
Still More Definitions
- Wide-sense periodic: if for all n,

    μ_x(n) = μ_x(n + N)
    r_x(n_1, n_2) = r_x(n_1 + N, n_2) = r_x(n_1, n_2 + N) = r_x(n_1 + N, n_2 + N)

- Statistically independent: iff for every n_1 and n_2, f_xy(x, y; n_1, n_2) = f_x(x; n_1) f_y(y; n_2)
- Uncorrelated: if for every n_1 and n_2, γ_xy(n_1, n_2) = 0
- Orthogonal: if for every n_1 and n_2, r_xy(n_1, n_2) = 0

Stationarity
- Stationarity of order N: a stochastic process x(n) such that

    f_x(x_1, ..., x_N; n_1, ..., n_N) = f_x(x_1, ..., x_N; n_1 + k, ..., n_N + k)

  for any set of sample times and any shift k
- Any process that is stationary of order N is also stationary of order M for all M ≤ N
- Strict-sense stationary (SSS): a stochastic process that is stationary of all orders N

Wide-Sense Stationary
- Stationarity of order 2 means f_x(x_1, x_2; n_1, n_2) = f_x(x_1, x_2; n_1 + k, n_2 + k)
- Wide-sense stationary (WSS): a stochastic process with a constant mean and an autocorrelation that depends only on the delay between the two sample times:

    E[x(n)] = μ_x
    r_x(n_1, n_2) = r_x(l) = r_x(n_1 − n_2) = E[x(n + l) x*(n)]
    γ_x(l) = r_x(l) − |μ_x|²

- This implies the variance is also constant: var[x(n)] = σ_x²
- All processes that are stationary of order 2 are WSS
- Not all WSS processes are stationary of order 2
- Note this is slightly different from the text

Example 1: Stationarity
- Describe a random process that is stationary
- Describe a second random process that is not stationary
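One possible pair of answers to the example can be checked numerically. This sketch contrasts an IID sequence (stationary) with a random walk (nonstationary, since its variance grows with n); the choice of processes and sizes is mine, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(7)

# Ensemble of realizations: IID noise vs. its running sum (a random walk).
num_realizations, N = 100_000, 50
w = rng.standard_normal((num_realizations, N))

iid = w                        # stationary: var[x(n)] = 1 for every n
walk = np.cumsum(w, axis=1)    # nonstationary: var[x(n)] = n + 1 grows with n

var_iid = iid.var(axis=0)      # approximately flat across n
var_walk = walk.var(axis=0)    # approximately linear in n
```

The second moments alone already distinguish the two: the walk fails even the WSS conditions, let alone stationarity of order 2.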
Stationarity Notes
- SSS implies WSS
- If the signal is Gaussian for all n, then WSS implies SSS
- The book states that most WSS processes are SSS. True?
- Jointly wide-sense stationary: two random signals x(n) and y(n) are jointly WSS if they are both WSS and

    r_xy(l) = r_xy(n_1 − n_2) = E[x(n) y*(n − l)]
    γ_xy(l) = γ_xy(n_1 − n_2) = r_xy(l) − μ_x μ_y*

- WSS is a very useful property because it enables us to consider a spectral description
- In practice, we only need the signal to be WSS long enough to estimate the autocorrelation or cross-correlation

Autocorrelation Sequence Properties

    r_x(0) = σ_x² + |μ_x|²
    r_x(0) ≥ |r_x(l)|
    r_x(l) = r_x*(−l)
    Σ_{k=1}^{M} Σ_{m=1}^{M} α_k r_x(k − m) α_m* ≥ 0 for all α

- Average DC power: |μ_x|²
- Average AC power: σ_x²
- Nonnegative definite: a sequence is said to be nonnegative definite if it satisfies the last property
- Positive definite: any sequence that satisfies the last inequality strictly for all α ≠ 0

Comments on Stationarity
- Many real processes are nonstationary
  - Best case: can determine from domain knowledge of the process
  - Else: must rely on statistical methods
- Many nonstationary processes are approximately locally stationary (stationary over short periods of time)
- Much of time-frequency analysis is dedicated to this type of signal
- There is no general mathematical framework for analyzing nonstationary signals
- However, many nonstationary stochastic processes can be understood through linear estimation (e.g., Kalman filters)
- Note that nonstationary is a negative definition: not stationary

Introduction to Ergodicity
- In most practical situations we can only observe one or a few realizations
- If the process is ergodic, we can obtain all statistical information from a single realization
- Ensemble averages: repeat the experiment many times
- Time averages:

    ⟨·⟩ ≜ lim_{N→∞} (1/(2N+1)) Σ_{n=−N}^{N} (·)
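The distinction between the two kinds of average can be sketched numerically. This assumes an illustrative process that happens to be ergodic in the mean (IID Gaussian noise with mean 3); all values are my choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(2)

# IID Gaussian noise with mean 3: ergodic in the mean, so the two averages agree.
mu_true = 3.0
num_realizations, N = 1000, 5000
x = mu_true + rng.standard_normal((num_realizations, N))

# Ensemble average: fix a time n = 0, average over outcomes zeta.
ensemble_avg = x[:, 0].mean()

# Time average: fix one outcome zeta_1, average over time n.
time_avg = x[0, :].mean()
```

Both averages approach μ_x = 3. For a non-ergodic process, such as x(n) = A with A a random constant, the time average of one realization would instead return that realization's value of A rather than E[A].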
Time Averages of Interest
- Mean value: ⟨x(n)⟩
- Mean square: ⟨|x(n)|²⟩
- Variance: ⟨|x(n) − ⟨x(n)⟩|²⟩
- Autocorrelation: ⟨x(n) x*(n − l)⟩
- Autocovariance: ⟨[x(n) − ⟨x(n)⟩][x(n − l) − ⟨x(n)⟩]*⟩
- Cross-correlation: ⟨x(n) y*(n − l)⟩
- Cross-covariance: ⟨[x(n) − ⟨x(n)⟩][y(n − l) − ⟨y(n)⟩]*⟩
- Similar to correlation sequences for deterministic power signals; both quantities have the same properties
- Difference: time averages are random variables (functions of the experiment outcome), while in the deterministic case the quantities are fixed numbers

Ergodic Random Processes
- Ergodic random process: a random signal for which the ensemble averages equal the corresponding time averages
- Like stationarity, there are various degrees
- Ergodic in the mean: a random process such that ⟨x(n)⟩ = E[x(n)] = μ_x
- Ergodic in correlation: a random process such that ⟨x(n) x*(n − l)⟩ = E[x(n) x*(n − l)] = r_x(l)
- If a process is ergodic in both mean and correlation, it is also WSS
- Only stationary signals can be ergodic
- WSS does not imply any type of ergodicity
- Text: almost all stationary processes are also ergodic. True?
- Our usage: ergodic = ergodic in both the mean and correlation

More on Ergodicity
- Joint ergodicity: two random signals are jointly ergodic iff they are individually ergodic and ⟨x(n) y*(n − l)⟩ = E[x(n) y*(n − l)]
- Stationarity ensures time invariance of the statistics
- Ergodicity implies the statistics can be obtained from a single realization with time averaging
- In words: one realization (a single ζ_k) is sufficient to estimate any statistic of the underlying random process

Problems with Ergodicity
- Problem: we never know x(n) for n = −∞ to +∞
- In all real situations, we only have finite records
- The most common estimator is then

    (1/(2N+1)) Σ_{n=−N}^{N} (·)

- Note that it is a random variable. How good is it?
  - Bias
  - Variance
  - Consistency
  - Confidence intervals
  - Distribution
- This is one of the key topics of this class
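The point that a finite-record estimate is itself a random variable is easy to see by repeating the experiment. A sketch using unit-variance white noise, for which r_x(0) = 1; the record length, estimator form, and number of trials are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)

def r_hat(x, l):
    """Finite-record estimate of r_x(l) from one real-valued realization."""
    N = len(x)
    return np.dot(x[l:], x[:N - l]) / N

# Repeat the whole experiment: each trial draws a fresh realization and
# produces a different estimate of r_x(0).
estimates_r0 = [r_hat(rng.standard_normal(512), 0) for _ in range(2000)]

mean_r0 = np.mean(estimates_r0)   # near the true value r_x(0) = 1 (small bias)
std_r0 = np.std(estimates_r0)     # nonzero: the estimator has variance
```

The spread std_r0 shrinks as the record length grows, which is the consistency question raised above.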
Ergodic Processes vs. Deterministic Signals

    r_x(l) = lim_{N→∞} (1/(2N+1)) Σ_{n=−N}^{N} x(n) x*(n − l)

- The autocorrelation of a deterministic power signal and an ergodic process can be calculated with the same infinite summation
- What's the difference then?
- With deterministic signals there is only one signal
- With stochastic signals, we assume it was generated from an underlying random experiment ζ_k
- This enables us to consider the ensemble of possible signals: r_x(l) = E[x(n) x*(n − l)]
- We can therefore draw inferences and make predictions about the population of possible outcomes, not merely this one signal
- Whether you define a given signal as deterministic or as a single realization of a random process depends largely on the application

Random Processes in the Frequency Domain
- Power spectral density (PSD):

    R_x(e^{jω}) ≜ F{r_x(l)} = Σ_{l=−∞}^{∞} r_x(l) e^{−jωl}
    r_x(l) = F^{−1}{R_x(e^{jω})} = (1/2π) ∫_{−π}^{π} R_x(e^{jω}) e^{jωl} dω

- Stationary random processes have deterministic correlation sequences with a single index (independent variable)
- Note again that the power spectral density can be calculated with the same equation for deterministic and ergodic signals

Periodic and Non-Periodic Processes
- If r_x(l) is periodic, the DTFS is most appropriate (line spectrum)
- If we allow impulses in the PSD, then the PSD of a periodic r_x(l) consists of an impulse train
- If the process x(n) has nonzero mean (i.e., nonzero average DC power), the PSD will contain an impulse at ω = 0
- More generally, a random process can be composed of both deterministic components and non-periodic components

Power Spectral Density Properties
- R_x(e^{jω}) is real-valued
- R_x(e^{jω}) is periodic with period 2π
- R_x(e^{jω}) ≥ 0 (equivalently, r_x(l) is nonnegative definite)
- R_x(e^{jω}) has nonnegative area and

    (1/2π) ∫_{−π}^{π} R_x(e^{jω}) dω = r_x(0) = E[|x(n)|²]

- If x(n) is real-valued, r_x(l) is real and even and R_x(e^{jω}) is an even function of ω
- What if x(n) is complex-valued?
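The PSD properties can be verified numerically for a process with a known, finite-length autocorrelation. This sketch uses illustrative MA(1)-type values r_x(0) = 1.25, r_x(±1) = 0.5, r_x(l) = 0 otherwise (my choices, not from the slides).

```python
import numpy as np

# Known autocorrelation: r_x(0) = 1.25, r_x(+-1) = 0.5, zero elsewhere.
r0, r1 = 1.25, 0.5
omega = np.linspace(-np.pi, np.pi, 100_001)

# R_x(e^{jw}) = sum_l r_x(l) e^{-jwl} = r0 + 2*r1*cos(w): real and even
# because r_x(l) is real and even.
R = r0 + 2 * r1 * np.cos(omega)

psd_nonneg = np.all(R >= 0)   # PSD is nonnegative everywhere

# (1/2pi) * integral of R over [-pi, pi]: approximated by the grid mean,
# should recover r_x(0) = E[|x(n)|^2] = 1.25.
area = R.mean()
```

The recovered area matches r_x(0), illustrating the total-power property above.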
White Noise
- White noise process: a WSS random sequence w(n) such that

    E[w(n)] = μ_w
    r_w(l) = (σ_w² + |μ_w|²) δ(l)

- Specifically, this is a second-order white process
- Notation: w(n) ~ WN(μ_w, σ_w²)
- Not a complete characterization of w(n): the marginal pdf could be anything
- If w(n) is Gaussian, then the white Gaussian process is denoted by w(n) ~ WGN(μ_w, σ_w²)
- The term white comes from properties of white light

Harmonic Process
- Harmonic process: any process defined by

    x(n) = Σ_{k=1}^{M} a_k cos(ω_k n + φ_k),    ω_k ≠ 0

  where M, {a_k}_1^M, and {ω_k}_1^M are constant, and the random variables {φ_k}_1^M are pairwise independent and uniformly distributed in the interval [−π, π]
- x(n) is stationary and ergodic with zero mean and autocorrelation

    r_x(l) = (1/2) Σ_{k=1}^{M} a_k² cos(ω_k l)

- Note the cosines in the autocorrelation are in phase

Harmonic Process PSD
- The PSD consists of pairs of impulses (line spectrum) of area πa_k²/2 located at frequencies ±ω_k:

    R_x(e^{jω}) = (π/2) Σ_{k=1}^{M} a_k² [δ(ω − ω_k) + δ(ω + ω_k)],    −π ≤ ω ≤ π

Harmonic Process Comments
- If all ω_k/(2π) are rational numbers, x(n) is periodic and the impulses are equally spaced apart
- This never happens, unless there is a single periodic (perhaps non-sinusoidal) component; otherwise the process is almost periodic (which always happens)
- It is only stationary if all of the random phases are equally likely (uniformly distributed over all possible angles)
- This is an unusual circumstance where the signal is stationary but is parameterized by one or more random variables that are constant over all n
- In general, x(n) is non-Gaussian
- It is a predictable random sequence! (also highly unusual)
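The harmonic-process autocorrelation can be checked against a time average over a single realization, which is legitimate here because the process is ergodic. A sketch with one component; the amplitude, frequency, and record length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Single-component harmonic process x(n) = a*cos(w0*n + phi), phi ~ U[-pi, pi].
# Theory: r_x(l) = (a^2 / 2) * cos(w0 * l).
a, w0, N = 2.0, 0.3 * np.pi, 200_000
phi = rng.uniform(-np.pi, np.pi)
n = np.arange(N)
x = a * np.cos(w0 * n + phi)

def time_avg_autocorr(x, l):
    """Time-average estimate of r_x(l) from one realization."""
    return np.dot(x[l:], x[:len(x) - l]) / (len(x) - l)

r_hat_0 = time_avg_autocorr(x, 0)   # should approach a^2/2 = 2.0
r_hat_3 = time_avg_autocorr(x, 3)   # should approach (a^2/2)*cos(3*w0)
```

Note the estimates do not depend on the particular phase drawn, consistent with ergodicity.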
Cross-Power Spectral Density
- Cross-power spectral density: if x(n) and y(n) are jointly stationary stochastic processes,

    R_xy(e^{jω}) ≜ F{r_xy(l)} = Σ_{l=−∞}^{∞} r_xy(l) e^{−jωl}
    r_xy(l) = (1/2π) ∫_{−π}^{π} R_xy(e^{jω}) e^{jωl} dω
    R_xy(e^{jω}) = R_yx*(e^{jω})

- Also known as the cross-spectrum
- Note that unlike the PSD, it is not real-valued, in general

Normalized Cross-Spectrum

    G_xy(e^{jω}) ≜ R_xy(e^{jω}) / sqrt(R_x(e^{jω}) R_y(e^{jω}))

- Also known as the coherency spectrum or simply coherency
- Similar to the correlation coefficient, but in frequency

Coherence Function

    |G_xy(e^{jω})|² ≜ |R_xy(e^{jω})|² / (R_x(e^{jω}) R_y(e^{jω}))

- Also known as the coherence and magnitude squared coherence
- If y(n) = h(n) * x(n), then |G_xy(e^{jω})|² = 1 for all ω
- If r_xy(l) = 0, then |G_xy(e^{jω})|² = 0 for all ω
- 0 ≤ |G_xy|² ≤ 1

Linear Transforms and Coherence
- Linear transforms have no effect on coherence
- Similar to the case of random variables: if y = mx + b, then x and y are perfectly correlated: ρ = ±1
- Consider y(n) generated by passing x(n) through H(z), adding noise w(n) shaped by G(z), and filtering the sum by F(z):

    |G_xy(e^{jω})|² = |H(e^{jω})|² R_x(e^{jω}) / (|H(e^{jω})|² R_x(e^{jω}) + |G(e^{jω})|² R_w(e^{jω}))

- Noise w(n) decreases coherence
- The final linear transform F(z) has no effect!
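The filtered-signal-plus-noise coherence formula can be evaluated numerically to see both bounds and the effect of noise. This is a sketch with illustrative stand-ins for the filters and spectra (a first-order H, white input and noise, identity G); none of these particular choices come from the slides.

```python
import numpy as np

# Evaluate |G_xy|^2 = |H|^2 Rx / (|H|^2 Rx + |G|^2 Rw) on a frequency grid.
omega = np.linspace(-np.pi, np.pi, 1001)
H = 1.0 / (1.0 - 0.5 * np.exp(-1j * omega))   # example first-order filter
Rx = np.ones_like(omega)                      # white input, unit PSD
Rw = 0.25 * np.ones_like(omega)               # white additive noise
G = np.ones_like(omega)                       # noise shaping filter = identity

num = np.abs(H) ** 2 * Rx
coh = num / (num + np.abs(G) ** 2 * Rw)       # noise pulls coherence below 1

# With the noise removed, coherence returns to exactly 1 at every frequency,
# matching the pure linear-filtering case y = h * x.
coh_no_noise = num / (num + 0.0)
```

Note that coherence is lowest where |H(e^{jω})|² R_x is smallest relative to the noise PSD, i.e., at frequencies with poor signal-to-noise ratio.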
Complex Spectral Density Functions
- Complex spectral density:

    R_x(z) = Σ_l r_x(l) z^{−l}        R_y(z) = Σ_l r_y(l) z^{−l}

- Complex cross-spectral density:

    R_xy(z) = Σ_l r_xy(l) z^{−l}

Random Processes and Linear Systems
- If the input to an LTI system is a random process, so is the output:

    y(n, ζ) = Σ_k h(k) x(n − k, ζ)

- If the system is BIBO stable and the input process is stationary with E[|x(n, ζ)|] < ∞, then the output converges absolutely with probability one. In English: the output is stationary
- If E[|x(n, ζ)|²] < ∞, then E[|y(n, ζ)|²] < ∞
- If h(n) has finite energy, the output converges in the mean square sense

Linear System Statistics
- Let x(n) be a random process that is the input to an LTI system h(n) with output y(n):

    μ_y = Σ_k h(k) E[x(n − k)] = μ_x Σ_k h(k) = μ_x H(e^{j0})
    r_xy(l) = Σ_k h*(k) r_x(l + k) = Σ_m h*(−m) r_x(l − m)
    r_xy(l) = h*(−l) * r_x(l)
    r_yx(l) = h(l) * r_x(l)
    r_y(l) = h(l) * r_xy(l) = h(l) * h*(−l) * r_x(l) = r_h(l) * r_x(l)

Output Power
- Let x(n) be a random process that is the input to an LTI system h(n) with output y(n):

    P_y = r_y(0) = [r_h(l) * r_x(l)]|_{l=0} = Σ_k r_h(k) r_x(−k) = Σ_k r_h(k) r_x*(k)

- If the system is FIR, then P_y = h^H R_x h
- If μ_x = 0, then μ_y = 0 and σ_y² = P_y
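The FIR output-power formula P_y = h^H R_x h can be checked against a time average over a long simulated realization. A minimal sketch with zero-mean unit-variance white input, so that R_x = I and P_y = ||h||²; the filter taps and record length are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(5)

# FIR filter and white-input correlation matrix R_x = I (sigma^2 = 1).
h = np.array([1.0, -0.5, 0.25])
M = len(h)
R_x = np.eye(M)

# Quadratic-form output power: P_y = h^H R_x h = 1 + 0.25 + 0.0625 = 1.3125.
P_y_formula = h.conj() @ R_x @ h

# Monte Carlo check: filter a long white-noise realization and time-average |y|^2.
x = rng.standard_normal(1_000_000)
y = np.convolve(x, h, mode="valid")
P_y_est = np.mean(np.abs(y) ** 2)
```

Since μ_x = 0 here, P_y is also the output variance σ_y², as noted above.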
Output Distribution
- In general, it is very difficult to solve for the output pdf (even when x(n) is WSS)
- If x(n) is a Gaussian process, the output is a Gaussian process
- If x(n) is IID, the output is a weighted sum of IID random variables
  - If the distribution of x(n) is stable, then y(n) has the same distribution (even if the mean and variance differ)
  - If many of the largest weights are approximately equal, so that many elements of the input signal have an equal effect on the output, then the CLT applies (approximately) and the output will be approximately Gaussian

z-Domain Analysis

    Z{h*(−n)} = H*(1/z*)
    R_xy(z) = Z{h*(−l) * r_x(l)} = H*(1/z*) R_x(z)
    R_yx(z) = Z{h(l) * r_x(l)} = H(z) R_x(z)
    R_y(z) = H(z) H*(1/z*) R_x(z)

- Note that if h(n) is real, then h*(−n) = h(−n) and h(−n) ↔ H(z^{−1})

Frequency-Domain Analysis
- If the system is stable, z = e^{jω} lies in the ROC and the following relations hold:

    R_xy(e^{jω}) = H*(e^{jω}) R_x(e^{jω})
    R_yx(e^{jω}) = H(e^{jω}) R_x(e^{jω})
    R_y(e^{jω}) = |H(e^{jω})|² R_x(e^{jω})

Random Signal & System Memory
- Zero memory: a process for which r_x(l) = σ_x² δ(l). Examples: white noise, IID processes
- We can create a signal with memory (dependence) by passing a zero-memory process through an LTI system
- The extent and degree of imposed dependence depend on h(n)
- Knowing r_y(l) and r_x(l), or the input and output PSDs, is sufficient to determine |H(e^{jω})|²
- We cannot estimate the phase of H(e^{jω}) from this information (the second-order statistics)
- Only r_xy(l) or R_xy(e^{jω}) can provide phase information
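The relation R_y(e^{jω}) = |H(e^{jω})|² R_x(e^{jω}) can be checked for a first-order recursive system, where the predicted output power has a closed form. A sketch, assuming a unit-variance white input and an illustrative pole location a = 0.8.

```python
import numpy as np

# First-order system y(n) = a*y(n-1) + x(n): H(z) = 1 / (1 - a z^{-1}).
# For unit-variance white input (R_x = 1), theory gives r_y(0) = 1 / (1 - a^2).
a = 0.8
omega = np.linspace(-np.pi, np.pi, 200_001)

H2 = 1.0 / np.abs(1.0 - a * np.exp(-1j * omega)) ** 2   # |H(e^{jw})|^2
Ry = H2 * 1.0                                           # R_y = |H|^2 * R_x

# Output power via the inverse-DTFT relation r_y(0) = (1/2pi) * integral of R_y;
# approximated by the grid mean over one full period.
power_freq = Ry.mean()
power_theory = 1.0 / (1.0 - a ** 2)
```

The two numbers agree, confirming that the output PSD relation reproduces the known output variance of this system.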
Correlation Length
- Correlation length: given a WSS process,

    L_c = (1/r_x(0)) Σ_{l=0}^{∞} r_x(l) = Σ_{l=0}^{∞} ρ_x(l)

- Equal to the area under the normalized autocorrelation curve
- Undesirable properties:
  - Why is it one-sided?
  - Lengths should not be negative, in general. Could this be negative?

Short Memory Processes
- Short memory: a WSS process x(n) such that

    Σ_{l=−∞}^{∞} |ρ_x(l)| < ∞

- For example, an autocorrelation that decays exponentially, ρ_x(l) ∼ a^{|l|} for large l
- Example values: r(l) = [1.0000, 0.3214, 0.7538] for l = 1, 2, 3
- Zero-memory processes have a nonzero correlation length (L_c = 1)

Long Memory Processes
- Long memory: for a WSS signal x(n) with finite variance, if there exist 0 < α < 1 and C_r > 0 such that

    lim_{l→∞} (1/(C_r σ_x²)) r_x(l) l^α = 1

- Equivalently, there exist 0 ≤ β < 1 and C_r > 0 such that

    lim_{ω→0} (1/(C_r σ_x²)) R_x(e^{jω}) |ω|^β = 1

- Implies:
  - The autocorrelation has heavy tails
  - The autocorrelation decays as a power law: ρ_x(l) ≈ C_r l^{−α} as l → ∞
  - Σ_{l=−∞}^{∞} ρ_x(l) = ∞: the process has infinite correlation length

Correlation Matrices
- Let the random vector x(n) be related to the (possibly nonstationary) random process x(n) as follows:

    x(n) ≜ [x(n) x(n−1) ··· x(n−M+1)]^T
    E[x(n)] = [μ_x(n) μ_x(n−1) ··· μ_x(n−M+1)]^T
    R_x(n) = E[x(n) x(n)^H] =
        [ r_x(n, n)           ···  r_x(n, n−M+1)
          ⋮                         ⋮
          r_x(n−M+1, n)       ···  r_x(n−M+1, n−M+1) ]

- Note that R_x(n) is nonnegative definite and Hermitian since r_x(n−i, n−j) = r_x*(n−j, n−i)
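The correlation length of a short-memory process can be computed directly from the geometric series. A sketch assuming an exponentially decaying normalized autocorrelation ρ_x(l) = a^l for l ≥ 0, with an illustrative decay rate a = 0.9 (my choice, not from the slides).

```python
import numpy as np

# Exponentially decaying normalized autocorrelation: rho_x(l) = a^l, l >= 0.
# The geometric series gives L_c = 1 / (1 - a) = 10 for a = 0.9.
a = 0.9
l = np.arange(0, 2000)        # truncation far beyond the decay scale
rho = a ** l

L_c = rho.sum()               # one-sided area under rho_x(l), ~ 1/(1 - a)

# For a zero-memory process, rho_x(l) = delta(l), so the sum has a single
# nonzero term and L_c = 1, matching the remark above.
L_c_zero_memory = 1.0
```

This also illustrates why the long-memory power-law tails are problematic: replacing a^l with l^(−α), 0 < α < 1, makes the same sum diverge.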
Correlation Matrices (Stationary Case)
- If x(n) is a stationary process, the correlation matrix becomes

    R_x = [ r_x(0)      r_x(1)      r_x(2)      ···  r_x(M−1)
            r_x*(1)     r_x(0)      r_x(1)      ···  r_x(M−2)
            r_x*(2)     r_x*(1)     r_x(0)      ···  r_x(M−3)
            ⋮           ⋮           ⋮                 ⋮
            r_x*(M−1)   r_x*(M−2)   r_x*(M−3)   ···  r_x(0) ]

- In this case R_x is Hermitian (R_x = R_x^H), Toeplitz (the elements along each diagonal are equal), and nonnegative definite

Conditioning of the Correlation Matrix
- Condition number of a positive definite matrix R_x:

    χ(R_x) ≜ λ_max / λ_min

  where λ_max and λ_min are the largest and smallest eigenvalues of the autocorrelation matrix, respectively
- If x(n) is a WSS random process, then the eigenvalues of the autocorrelation matrix are bounded by the dynamic range of the PSD:

    min_ω R_x(e^{jω}) ≤ λ_i ≤ max_ω R_x(e^{jω})

- See the text for a proof
- Interpretation: a large spread in eigenvalues implies the PSD is more variable (less flat), so the process is less like white noise (more predictable)
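The eigenvalue bounds can be checked numerically for a small Toeplitz correlation matrix. A sketch using an illustrative MA(1)-type autocorrelation r_x(0) = 1.25, r_x(±1) = 0.5 (so the PSD is 1.25 + cos ω, with minimum 0.25 and maximum 2.25); the values and matrix size are my choices.

```python
import numpy as np

# Build the M x M Hermitian Toeplitz correlation matrix from r_x(l).
M = 6
r = np.zeros(M)
r[0], r[1] = 1.25, 0.5
R = np.array([[r[abs(i - j)] for j in range(M)] for i in range(M)])

eigvals = np.linalg.eigvalsh(R)   # eigenvalues of a Hermitian matrix

# PSD dynamic range: R_x(e^{jw}) = 1.25 + cos(w) ranges over [0.25, 2.25].
psd_min, psd_max = 1.25 - 1.0, 1.25 + 1.0

# Condition number: ratio of extreme eigenvalues; > 1 because the PSD is not flat.
cond = eigvals.max() / eigvals.min()
```

All eigenvalues fall inside the PSD's dynamic range, and the non-flat PSD produces a condition number above 1, consistent with the interpretation above.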
More informationLesson 1. Optimal signalbehandling LTH. September Statistical Digital Signal Processing and Modeling, Hayes, M:
Lesson 1 Optimal Signal Processing Optimal signalbehandling LTH September 2013 Statistical Digital Signal Processing and Modeling, Hayes, M: John Wiley & Sons, 1996. ISBN 0471594318 Nedelko Grbic Mtrl
More information14 - Gaussian Stochastic Processes
14-1 Gaussian Stochastic Processes S. Lall, Stanford 211.2.24.1 14 - Gaussian Stochastic Processes Linear systems driven by IID noise Evolution of mean and covariance Example: mass-spring system Steady-state
More informationAdvanced Digital Signal Processing -Introduction
Advanced Digital Signal Processing -Introduction LECTURE-2 1 AP9211- ADVANCED DIGITAL SIGNAL PROCESSING UNIT I DISCRETE RANDOM SIGNAL PROCESSING Discrete Random Processes- Ensemble Averages, Stationary
More informationLecture 19 IIR Filters
Lecture 19 IIR Filters Fundamentals of Digital Signal Processing Spring, 2012 Wei-Ta Chu 2012/5/10 1 General IIR Difference Equation IIR system: infinite-impulse response system The most general class
More informationMassachusetts Institute of Technology
Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.011: Introduction to Communication, Control and Signal Processing QUIZ, April 1, 010 QUESTION BOOKLET Your
More informationChap 2. Discrete-Time Signals and Systems
Digital Signal Processing Chap 2. Discrete-Time Signals and Systems Chang-Su Kim Discrete-Time Signals CT Signal DT Signal Representation 0 4 1 1 1 2 3 Functional representation 1, n 1,3 x[ n] 4, n 2 0,
More informationCCNY. BME I5100: Biomedical Signal Processing. Stochastic Processes. Lucas C. Parra Biomedical Engineering Department City College of New York
BME I5100: Biomedical Signal Processing Stochastic Processes Lucas C. Parra Biomedical Engineering Department CCNY 1 Schedule Week 1: Introduction Linear, stationary, normal - the stuff biology is not
More informationECE 438 Exam 2 Solutions, 11/08/2006.
NAME: ECE 438 Exam Solutions, /08/006. This is a closed-book exam, but you are allowed one standard (8.5-by-) sheet of notes. No calculators are allowed. Total number of points: 50. This exam counts for
More informationEE538 Final Exam Fall 2007 Mon, Dec 10, 8-10 am RHPH 127 Dec. 10, Cover Sheet
EE538 Final Exam Fall 2007 Mon, Dec 10, 8-10 am RHPH 127 Dec. 10, 2007 Cover Sheet Test Duration: 120 minutes. Open Book but Closed Notes. Calculators allowed!! This test contains five problems. Each of
More informationName of the Student: Problems on Discrete & Continuous R.Vs
Engineering Mathematics 05 SUBJECT NAME : Probability & Random Process SUBJECT CODE : MA6 MATERIAL NAME : University Questions MATERIAL CODE : JM08AM004 REGULATION : R008 UPDATED ON : Nov-Dec 04 (Scan
More informationAnalog vs. discrete signals
Analog vs. discrete signals Continuous-time signals are also known as analog signals because their amplitude is analogous (i.e., proportional) to the physical quantity they represent. Discrete-time signals
More informationChapter 6: Random Processes 1
Chapter 6: Random Processes 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More informationChapter 7: The z-transform
Chapter 7: The -Transform ECE352 1 The -Transform - definition Continuous-time systems: e st H(s) y(t) = e st H(s) e st is an eigenfunction of the LTI system h(t), and H(s) is the corresponding eigenvalue.
More informationRandom Process. Random Process. Random Process. Introduction to Random Processes
Random Process A random variable is a function X(e) that maps the set of experiment outcomes to the set of numbers. A random process is a rule that maps every outcome e of an experiment to a function X(t,
More informationDigital Signal Processing Lecture 10 - Discrete Fourier Transform
Digital Signal Processing - Discrete Fourier Transform Electrical Engineering and Computer Science University of Tennessee, Knoxville November 12, 2015 Overview 1 2 3 4 Review - 1 Introduction Discrete-time
More informationMassachusetts Institute of Technology Department of Electrical Engineering and Computer Science : Discrete-Time Signal Processing
Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.34: Discrete-Time Signal Processing OpenCourseWare 006 ecture 8 Periodogram Reading: Sections 0.6 and 0.7
More informationSolutions. Number of Problems: 10
Final Exam February 2nd, 2013 Signals & Systems (151-0575-01) Prof. R. D Andrea Solutions Exam Duration: 150 minutes Number of Problems: 10 Permitted aids: One double-sided A4 sheet. Questions can be answered
More informationEach problem is worth 25 points, and you may solve the problems in any order.
EE 120: Signals & Systems Department of Electrical Engineering and Computer Sciences University of California, Berkeley Midterm Exam #2 April 11, 2016, 2:10-4:00pm Instructions: There are four questions
More informationELC 4351: Digital Signal Processing
ELC 4351: Digital Signal Processing Liang Dong Electrical and Computer Engineering Baylor University liang dong@baylor.edu October 18, 2016 Liang Dong (Baylor University) Frequency-domain Analysis of LTI
More informationLecture - 30 Stationary Processes
Probability and Random Variables Prof. M. Chakraborty Department of Electronics and Electrical Communication Engineering Indian Institute of Technology, Kharagpur Lecture - 30 Stationary Processes So,
More informationECE-S Introduction to Digital Signal Processing Lecture 3C Properties of Autocorrelation and Correlation
ECE-S352-701 Introduction to Digital Signal Processing Lecture 3C Properties of Autocorrelation and Correlation Assuming we have two sequences x(n) and y(n) from which we form a linear combination: c(n)
More informationSignal Processing Signal and System Classifications. Chapter 13
Chapter 3 Signal Processing 3.. Signal and System Classifications In general, electrical signals can represent either current or voltage, and may be classified into two main categories: energy signals
More informationE X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl.
E X A M Course code: Course name: Number of pages incl. front page: 6 MA430-G Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours Resources allowed: Notes: Pocket calculator,
More informationECE Homework Set 3
ECE 450 1 Homework Set 3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3
More informationEE482: Digital Signal Processing Applications
Professor Brendan Morris, SEB 3216, brendan.morris@unlv.edu EE482: Digital Signal Processing Applications Spring 2014 TTh 14:30-15:45 CBC C222 Lecture 11 Adaptive Filtering 14/03/04 http://www.ee.unlv.edu/~b1morris/ee482/
More informationThe Cooper Union Department of Electrical Engineering ECE111 Signal Processing & Systems Analysis Final May 4, 2012
The Cooper Union Department of Electrical Engineering ECE111 Signal Processing & Systems Analysis Final May 4, 2012 Time: 3 hours. Close book, closed notes. No calculators. Part I: ANSWER ALL PARTS. WRITE
More information3F1 Random Processes Examples Paper (for all 6 lectures)
3F Random Processes Examples Paper (for all 6 lectures). Three factories make the same electrical component. Factory A supplies half of the total number of components to the central depot, while factories
More informationA6523 Modeling, Inference, and Mining Jim Cordes, Cornell University
A6523 Modeling, Inference, and Mining Jim Cordes, Cornell University Lecture 19 Modeling Topics plan: Modeling (linear/non- linear least squares) Bayesian inference Bayesian approaches to spectral esbmabon;
More informationLinear Stochastic Models. Special Types of Random Processes: AR, MA, and ARMA. Digital Signal Processing
Linear Stochastic Models Special Types of Random Processes: AR, MA, and ARMA Digital Signal Processing Department of Electrical and Electronic Engineering, Imperial College d.mandic@imperial.ac.uk c Danilo
More informationLECTURES 2-3 : Stochastic Processes, Autocorrelation function. Stationarity.
LECTURES 2-3 : Stochastic Processes, Autocorrelation function. Stationarity. Important points of Lecture 1: A time series {X t } is a series of observations taken sequentially over time: x t is an observation
More informationLaboratory Project 1: Introduction to Random Processes
Laboratory Project 1: Introduction to Random Processes Random Processes With Applications (MVE 135) Mats Viberg Department of Signals and Systems Chalmers University of Technology 412 96 Gteborg, Sweden
More informationDigital Signal Processing Lecture 4
Remote Sensing Laboratory Dept. of Information Engineering and Computer Science University of Trento Via Sommarive, 14, I-38123 Povo, Trento, Italy Digital Signal Processing Lecture 4 Begüm Demir E-mail:
More informationIV. Covariance Analysis
IV. Covariance Analysis Autocovariance Remember that when a stochastic process has time values that are interdependent, then we can characterize that interdependency by computing the autocovariance function.
More informationName of the Student: Problems on Discrete & Continuous R.Vs
Engineering Mathematics 08 SUBJECT NAME : Probability & Random Processes SUBJECT CODE : MA645 MATERIAL NAME : University Questions REGULATION : R03 UPDATED ON : November 07 (Upto N/D 07 Q.P) (Scan the
More informationSolutions. Number of Problems: 10
Final Exam February 4th, 01 Signals & Systems (151-0575-01) Prof. R. D Andrea Solutions Exam Duration: 150 minutes Number of Problems: 10 Permitted aids: One double-sided A4 sheet. Questions can be answered
More informationDiscrete Time Fourier Transform (DTFT)
Discrete Time Fourier Transform (DTFT) 1 Discrete Time Fourier Transform (DTFT) The DTFT is the Fourier transform of choice for analyzing infinite-length signals and systems Useful for conceptual, pencil-and-paper
More information1. Determine if each of the following are valid autocorrelation matrices of WSS processes. (Correlation Matrix),R c =
ENEE630 ADSP Part II w/ solution. Determine if each of the following are valid autocorrelation matrices of WSS processes. (Correlation Matrix) R a = 4 4 4,R b = 0 0,R c = j 0 j 0 j 0 j 0 j,r d = 0 0 0
More informationEcon 424 Time Series Concepts
Econ 424 Time Series Concepts Eric Zivot January 20 2015 Time Series Processes Stochastic (Random) Process { 1 2 +1 } = { } = sequence of random variables indexed by time Observed time series of length
More informationENGR352 Problem Set 02
engr352/engr352p02 September 13, 2018) ENGR352 Problem Set 02 Transfer function of an estimator 1. Using Eq. (1.1.4-27) from the text, find the correct value of r ss (the result given in the text is incorrect).
More informationChapter 2 Wiener Filtering
Chapter 2 Wiener Filtering Abstract Before moving to the actual adaptive filtering problem, we need to solve the optimum linear filtering problem (particularly, in the mean-square-error sense). We start
More informationDETECTION theory deals primarily with techniques for
ADVANCED SIGNAL PROCESSING SE Optimum Detection of Deterministic and Random Signals Stefan Tertinek Graz University of Technology turtle@sbox.tugraz.at Abstract This paper introduces various methods for
More informationReview of Probability
Review of robabilit robabilit Theor: Man techniques in speech processing require the manipulation of probabilities and statistics. The two principal application areas we will encounter are: Statistical
More informationFundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes
Fundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes Klaus Witrisal witrisal@tugraz.at Signal Processing and Speech Communication Laboratory www.spsc.tugraz.at Graz University of
More informationDetection & Estimation Lecture 1
Detection & Estimation Lecture 1 Intro, MVUE, CRLB Xiliang Luo General Course Information Textbooks & References Fundamentals of Statistical Signal Processing: Estimation Theory/Detection Theory, Steven
More informationIntroduction to Stochastic processes
Università di Pavia Introduction to Stochastic processes Eduardo Rossi Stochastic Process Stochastic Process: A stochastic process is an ordered sequence of random variables defined on a probability space
More informationStability Condition in Terms of the Pole Locations
Stability Condition in Terms of the Pole Locations A causal LTI digital filter is BIBO stable if and only if its impulse response h[n] is absolutely summable, i.e., 1 = S h [ n] < n= We now develop a stability
More informationEC402: Serial Correlation. Danny Quah Economics Department, LSE Lent 2015
EC402: Serial Correlation Danny Quah Economics Department, LSE Lent 2015 OUTLINE 1. Stationarity 1.1 Covariance stationarity 1.2 Explicit Models. Special cases: ARMA processes 2. Some complex numbers.
More information1. Stochastic Processes and Stationarity
Massachusetts Institute of Technology Department of Economics Time Series 14.384 Guido Kuersteiner Lecture Note 1 - Introduction This course provides the basic tools needed to analyze data that is observed
More informationECE 450 Homework #3. 1. Given the joint density function f XY (x,y) = 0.5 1<x<2, 2<y< <x<4, 2<y<3 0 else
ECE 450 Homework #3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3 4 5
More information5 Kalman filters. 5.1 Scalar Kalman filter. Unit delay Signal model. System model
5 Kalman filters 5.1 Scalar Kalman filter 5.1.1 Signal model System model {Y (n)} is an unobservable sequence which is described by the following state or system equation: Y (n) = h(n)y (n 1) + Z(n), n
More informationAdaptive Systems Homework Assignment 1
Signal Processing and Speech Communication Lab. Graz University of Technology Adaptive Systems Homework Assignment 1 Name(s) Matr.No(s). The analytical part of your homework (your calculation sheets) as
More informationSystem Identification
System Identification Arun K. Tangirala Department of Chemical Engineering IIT Madras July 27, 2013 Module 3 Lecture 1 Arun K. Tangirala System Identification July 27, 2013 1 Objectives of this Module
More informationDigital Image Processing
Digital Image Processing 2D SYSTEMS & PRELIMINARIES Hamid R. Rabiee Fall 2015 Outline 2 Two Dimensional Fourier & Z-transform Toeplitz & Circulant Matrices Orthogonal & Unitary Matrices Block Matrices
More informationELEG 305: Digital Signal Processing
ELEG 305: Digital Signal Processing Lecture 1: Course Overview; Discrete-Time Signals & Systems Kenneth E. Barner Department of Electrical and Computer Engineering University of Delaware Fall 2008 K. E.
More informationProbability Models in Electrical and Computer Engineering Mathematical models as tools in analysis and design Deterministic models Probability models
Probability Models in Electrical and Computer Engineering Mathematical models as tools in analysis and design Deterministic models Probability models Statistical regularity Properties of relative frequency
More information