A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2013


Lecture 16: More on spectral analysis

Reading: Chapter 13, Bayesian Revolution in Spectral Analysis (already assigned)

Upcoming Topics

The remaining lectures will include these topics:

- Model fitting and statistical inference (frequentist and Bayesian)
  - Model definition
  - Linear and non-linear least squares, maximum likelihood
  - Parameter errors (credible intervals, Fisher matrix)
  - Parameter space exploration (grid, SA, GA, GHS)
- Markov processes and stochastic resonance
- MCMC (Hastings-Metropolis)
- Cholesky decomposition
- Principal component analysis (PCA)
- Localization/matched filtering: 1D, 2D problems (time, frequency/wavelength, image)
- Phase retrieval and Hilbert transforms
- Radon transform
- Extreme value statistics

SA = simulated annealing, GA = genetic algorithm, GHS = Guided Hamiltonian Sampler

Chirped Signals: A Bayesian Approach to Spectral Analysis

Chirped signals are oscillating signals with time-variable frequencies, usually with a linear variation of frequency with time:

$$ f(t) = A\cos(\omega t + \alpha t^2 + \theta). $$

Examples:

- plasma wave diagnostic signals
- signals propagated through dispersive media (seismic waves, plasmas)
- gravitational waves from inspiraling binary stars
- Doppler-shifted signals over fractions of an orbit (e.g., the acceleration of a pulsar in its orbit)

Jaynes' approach to spectral analysis: cf. Jaynes, Bayesian Spectrum and Chirp Analysis, in Maximum Entropy and Bayesian Spectral Analysis and Estimation Problems.

Result: the optimal processing is a nonlinear operation on the data, without recourse to smoothing. However, the DFT-based spectrum (the periodogram) plays a key role in the estimation.
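To make the data model concrete, here is a minimal numpy sketch that generates a chirped sinusoid in white Gaussian noise. All parameter values are assumed for illustration; later sketches reuse these arrays.

```python
# Sketch of the data model y(t) = A cos(ω t + α t² + θ) + e(t)
# used throughout the following slides (parameter values assumed).
import numpy as np

rng = np.random.default_rng(0)

T = 250                        # data run over t = -T, ..., T (N = 2T + 1 samples)
t = np.arange(-T, T + 1)
A, omega, alpha, theta = 1.0, 0.3, 1e-4, 0.7   # hypothetical signal parameters
sigma = 1.0                                     # white-noise rms

f_t = A * np.cos(omega * t + alpha * t**2 + theta)   # deterministic chirp f(t)
y = f_t + sigma * rng.standard_normal(t.size)        # data y(t) = f(t) + e(t)
```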

Start with Bayes' theorem, as in Ch. 1 of Gregory:

$$ p(H|D,I) = \frac{p(H|I)\, p(D|H,I)}{p(D|I)}, $$

where $p(H|D,I)$ is the posterior probability, $p(H|I)$ the prior probability, and $p(D|H,I)$ brings in the new data. In this context, probabilities represent a simple mapping of degrees of belief onto real numbers.

Recall:

- $p(D|H,I)$ vs. $D$ for fixed $H$ = sampling distribution
- $p(D|H,I)$ vs. $H$ for fixed $D$ = likelihood function

Read $H$ as a statement that a parameter vector lies in a region of parameter space.

Measured quantity: $y(t) = f(t) + e(t)$.

Data model (4 parameters: $A$, $\omega$, $\alpha$, $\theta$):

$$ f(t) = A\cos(\omega t + \alpha t^2 + \theta), $$

with $e(t)$ = white Gaussian noise, $\langle e \rangle = 0$, $\langle e^2 \rangle = \sigma^2$.

Data set: $D = \{y(t),\ |t| \le T\}$, with $N = 2T + 1$ data points.

Data probability: The probability of obtaining a data set of $N$ samples is

$$ P(D|H,I) = \prod_{t=-T}^{T} (2\pi\sigma^2)^{-1/2}\, e^{-\frac{1}{2\sigma^2}[y(t) - f(t)]^2}, \qquad (1) $$

which we can rewrite as a likelihood function once we acquire a data set and evaluate the probability for a specific $H$. Writing out the parameters explicitly, the likelihood function is

$$ \mathcal{L}(A, \omega, \alpha, \theta) \propto \exp\left\{ -\frac{1}{2\sigma^2} \sum_{t=-T}^{T} \left[ y(t) - A\cos(\omega t + \alpha t^2 + \theta) \right]^2 \right\}. $$

For simplicity, assume that $\omega T \gg 1$, so that many cycles of oscillation are summed over. Then

$$ \sum_t \cos^2(\omega t + \alpha t^2 + \theta) = \sum_t \tfrac{1}{2}\left[ 1 + \cos 2(\omega t + \alpha t^2 + \theta) \right] \approx \frac{2T + 1}{2} = \frac{N}{2}. $$

Expanding the argument of the exponential in the likelihood function, we have

$$ \left[ y(t) - A\cos(\cdot) \right]^2 = y^2(t) + A^2\cos^2(\cdot) - 2A\,y(t)\cos(\cdot). $$

We care only about terms that are functions of the parameters, so we drop the $y^2(t)$ term to get

$$ -\frac{1}{2\sigma^2} \sum_{t=-T}^{T} \left[ y(t) - A\cos(\cdot) \right]^2 \;\to\; -\frac{1}{2\sigma^2} \sum_t \left[ A^2\cos^2(\cdot) - 2A\,y(t)\cos(\cdot) \right] \approx \frac{A}{\sigma^2} \sum_t y(t)\cos(\cdot) - \frac{NA^2}{4\sigma^2}. $$

The likelihood function becomes

$$ \mathcal{L}(A, \omega, \alpha, \theta) \propto \exp\left[ \frac{A}{\sigma^2} \sum_t y(t)\cos(\omega t + \alpha t^2 + \theta) - \frac{NA^2}{4\sigma^2} \right]. $$

Integrating out the phase: In calculating a power spectrum [in this case, a chirped power spectrum, or "chirpogram"], we do not care about the phase of any sinusoid in the data. In Bayesian estimation, such a parameter is called a nuisance parameter. Since we do not know anything about $\theta$, we integrate over its prior distribution.

The prior is uniform over $[0, 2\pi]$ (the uniform pdf has maximum uncertainty):

$$ f_\theta(\theta) = \begin{cases} \dfrac{1}{2\pi} & 0 \le \theta \le 2\pi \\ 0 & \text{otherwise.} \end{cases} $$

The marginalized likelihood function becomes

$$ \mathcal{L}(A, \omega, \alpha) \propto \frac{1}{2\pi} \int_0^{2\pi} d\theta\, \mathcal{L}(A, \omega, \alpha, \theta) = e^{-\frac{NA^2}{4\sigma^2}}\, \frac{1}{2\pi} \int_0^{2\pi} d\theta\, \exp\left[ \frac{A}{\sigma^2} \sum_t y(t)\cos(\omega t + \alpha t^2 + \theta) \right]. $$

Using the identity

$$ \cos(\omega t + \alpha t^2 + \theta) = \cos(\omega t + \alpha t^2)\cos\theta - \sin(\omega t + \alpha t^2)\sin\theta, $$

we have

$$ \sum_t y(t)\cos(\omega t + \alpha t^2 + \theta) = \cos\theta \underbrace{\sum_t y(t)\cos(\omega t + \alpha t^2)}_{P} - \sin\theta \underbrace{\sum_t y(t)\sin(\omega t + \alpha t^2)}_{Q}. $$

Since

$$ P\cos\theta - Q\sin\theta = \sqrt{P^2 + Q^2}\, \cos\!\left[ \theta + \tan^{-1}(Q/P) \right], $$

this result may be used to evaluate the integral over $\theta$ in the marginalized likelihood function:

$$ \frac{1}{2\pi} \int_0^{2\pi} d\theta\, \exp\left\{ \frac{A}{\sigma^2}\sqrt{P^2 + Q^2}\, \cos\!\left[ \theta + \tan^{-1}(Q/P) \right] \right\}, $$

where the phase shift $\tan^{-1}(Q/P)$ is irrelevant because the integral covers a full period. To evaluate the integral we use the identity

$$ I_0(x) \equiv \frac{1}{2\pi} \int_0^{2\pi} d\theta\, e^{x\cos\theta} = \text{modified Bessel function}. $$

This yields

$$ I_0\!\left( \frac{A}{\sigma^2} \sqrt{P^2 + Q^2} \right). $$

We now simplify $P^2 + Q^2$:

$$ \begin{aligned} P^2 + Q^2 &= \left[ \sum_t y(t)\cos(\omega t + \alpha t^2) \right]^2 + \left[ \sum_t y(t)\sin(\omega t + \alpha t^2) \right]^2 \\ &= \sum_t \sum_{t'} y(t)\,y(t') \left[ \cos(\omega t + \alpha t^2)\cos(\omega t' + \alpha t'^2) + \sin(\omega t + \alpha t^2)\sin(\omega t' + \alpha t'^2) \right] \\ &= \sum_t \sum_{t'} y(t)\,y(t')\, \cos\!\left[ \omega(t - t') + \alpha(t^2 - t'^2) \right]. \end{aligned} $$
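A quick numerical check (not from the notes) of the $I_0$ identity used above to do the $\theta$ integral, via scipy:

```python
# Verify I0(x) = (1/2π) ∫_0^{2π} e^{x cos θ} dθ numerically.
import numpy as np
from scipy.special import i0
from scipy.integrate import quad

x = 3.7                                        # arbitrary test value
integral, _ = quad(lambda th: np.exp(x * np.cos(th)), 0.0, 2.0 * np.pi)
print(integral / (2.0 * np.pi), i0(x))         # the two numbers agree
```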

Define

$$ C(\omega, \alpha) \equiv N^{-1}(P^2 + Q^2) = N^{-1} \sum_t \sum_{t'} y(t)\,y(t')\, \cos\!\left[ \omega(t - t') + \alpha(t^2 - t'^2) \right]. $$

Then the integral over $\theta$ gives $I_0\!\left( A\sqrt{N C(\omega, \alpha)}/\sigma^2 \right)$, and the marginalized likelihood is

$$ \mathcal{L}(A, \omega, \alpha) = e^{-\frac{NA^2}{4\sigma^2}}\, I_0\!\left( \frac{A\sqrt{N C(\omega, \alpha)}}{\sigma^2} \right). $$
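Here is a sketch of the sufficient statistic $C(\omega, \alpha)$ (the "chirpogram"), using the equivalent single-sum form $C = N^{-1}\left| \sum_t y(t)\, e^{-i(\omega t + \alpha t^2)} \right|^2 = N^{-1}(P^2 + Q^2)$ to avoid the double sum over $(t, t')$. The function name and grids are illustrative; it reuses `y` and `t` from the earlier sketch.

```python
# Sufficient statistic C(ω, α) evaluated on a grid of trial (ω, α) values.
import numpy as np

def chirpogram(y, t, omegas, alphas):
    """C(omega, alpha) = N^{-1} |sum_t y(t) exp[-i(omega t + alpha t^2)]|^2,
    returned as an array of shape (len(alphas), len(omegas))."""
    N = y.size
    C = np.empty((len(alphas), len(omegas)))
    for i, a in enumerate(alphas):
        z = y * np.exp(-1j * a * t**2)                 # de-chirp at trial rate alpha
        for j, w in enumerate(omegas):
            C[i, j] = np.abs(np.sum(z * np.exp(-1j * w * t)))**2 / N
    return C
```

Plotted over the grid, $C(\omega, \alpha)$ peaks near the true signal values $(\omega_{\rm signal}, \alpha_{\rm signal})$, as note 4 below describes.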

Notes:

1. The data appear only in $C(\omega, \alpha)$.
2. $C$ is a sufficient statistic, meaning that it contains all information from the data that is relevant to inference using the likelihood function.
3. How do we read $\mathcal{L}(A, \omega, \alpha)$? As the probability distribution of the parameters $A$, $\omega$, $\alpha$ in terms of the data-dependent quantity $C(\omega, \alpha)$. (Note that $\mathcal{L}$ is not normalized as a PDF.) As such, $\mathcal{L}$ is a quite different quantity from the Fourier-based power spectrum.
4. What is $C(\omega, \alpha) \equiv N^{-1} \sum_t \sum_{t'} y(t)\,y(t')\, \cos[\omega(t - t') + \alpha(t^2 - t'^2)]$? For a given data set, $\omega$ and $\alpha$ are variables. If we plot $C(\omega, \alpha)$, we expect to get a large value when $\omega = \omega_{\rm signal}$ and $\alpha = \alpha_{\rm signal}$.
5. For a non-chirped but oscillatory signal ($\alpha = 0$), the quantity $C(\omega, 0)$ is nothing other than the periodogram (the squared magnitude of the Fourier transform of the data, divided by $N$). We then see that, for this case, the likelihood function is a nonlinear function of the Fourier estimate for the power spectrum.

A Limiting Form: For argument $x \gg 1$, the Bessel function is

$$ I_0(x) \approx \frac{e^x}{\sqrt{2\pi x}}. $$

In this case the marginalized likelihood is

$$ \mathcal{L}(A, \omega, \alpha) = e^{-\frac{NA^2}{4\sigma^2}}\, I_0\!\left( \frac{A\sqrt{N C(\omega, \alpha)}}{\sigma^2} \right) \approx e^{-\frac{NA^2}{4\sigma^2}}\, \frac{\exp\!\left[ A\sqrt{N C(\omega, \alpha)}/\sigma^2 \right]}{\left[ 2\pi A\sqrt{N C(\omega, \alpha)}/\sigma^2 \right]^{1/2}}. $$

Since $C(\omega, \alpha)$ is large when $\omega$ and $\alpha$ match those of any true signal, we see that it is exponentiated, as compared to appearing linearly in the periodogram.


Interpretation of the Bayesian and Fourier Approaches

We found the marginalized likelihood for the frequency and chirp rate to be

$$ \mathcal{L}(A, \omega, \alpha) = e^{-\frac{NA^2}{4\sigma^2}}\, I_0\!\left( \frac{A\sqrt{N C(\omega, \alpha)}}{\sigma^2} \right), $$

which, using the limiting form $I_0(x) \approx e^x/\sqrt{2\pi x}$ for $x \gg 1$, again shows that the data-dependent statistic $C(\omega, \alpha)$ is exponentiated rather than appearing linearly as in the periodogram.

Now let's consider the case with no chirp rate, $\alpha = 0$. Examples in the literature show that the width of the Bayesian PDF is much narrower than the periodogram $C(\omega, 0)$. Does this mean that the uncertainty principle has been avoided? The answer is no!

Uncertainty principle in the periodogram: For a data set of length $T$, the frequency resolution implied by the spectral window function is

$$ \delta\omega = 2\pi\,\delta f \approx \frac{2\pi}{T}. $$

Width of the Bayesian PDF: When the argument of the Bessel function is large, the exponentiation causes the PDF to be much narrower than the spectral window of the periodogram.

Interpretation: The periodogram is the distribution of power (or variance) with frequency for the particular realization of data used to form the periodogram. The spectral window also depicts the distribution of variance for a pure sinusoid in the data (with infinite signal-to-noise ratio).

The Bayesian posterior is the PDF for the frequency of a sinusoid. It therefore represents a very different quantity from the periodogram, and the two are not directly comparable.

1. The Bayesian method addresses the question: what is the PDF for the frequency of the sinusoid that is in the data?
2. The periodogram is the distribution of variance in frequency.
3. If we use the periodogram to estimate the sinusoid's frequency, we get a result that is more comparable:
   (a) First note that the width of the posterior PDF involves the signal-to-noise ratio (in the square root of the periodogram), $NA/\sigma$, while the width of the periodogram's spectral window is independent of the SNR.
   (b) General result: if a spectral line has width $\Delta\omega$, its centroid can be determined to an accuracy $\delta\omega \approx \Delta\omega/\mathrm{SNR}$. This result follows from matched filtering, which we will discuss later on.
   (c) Quantitatively, the periodogram yields the same information about the location of the spectral line as does the posterior PDF.
4. Problem: derive an estimate for the width of the posterior PDF that can be compared with the estimate for the periodogram.
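As a numerical illustration of points 1-3, the following sketch continues the earlier ones, but with no chirp ($\alpha = 0$), as in the figures below. It compares the periodogram $C(\omega, 0)$ with the Bayesian frequency PDF $p(\omega) \propto I_0\!\left( A\sqrt{N C(\omega)}/\sigma^2 \right)$, holding $A$ and $\sigma$ at their true values for simplicity (the notes marginalize only $\theta$; the $\omega$-independent prefactor cancels in the normalization).

```python
# Compare the periodogram with the Bayesian frequency posterior (alpha = 0).
import numpy as np
from scipy.special import i0e

y0 = A * np.cos(omega * t + theta) + sigma * rng.standard_normal(t.size)
N = y0.size

omegas = np.linspace(omega - 0.05, omega + 0.05, 2000)
C = chirpogram(y0, t, omegas, [0.0])[0]          # periodogram, alpha = 0

x = A * np.sqrt(N * C) / sigma**2                # Bessel-function argument
log_post = np.log(i0e(x)) + x                    # log I0(x) without overflow
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, omegas)                   # normalized frequency PDF

# The posterior peak is far narrower than the periodogram peak (width ~ 2π/T),
# consistent with the exponentiation argument above.
```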

Figure 1: Left: time series of a sinusoid + white noise with $A/\sigma = 1$, sampled $N = 500$ times over an interval of length $T = 500$. Right: plot of the periodogram (red) and the Bayesian PDF of the time series.

Figure 2: Left: time series of a sinusoid + white noise with $A/\sigma = 1/4$, sampled $N = 500$ times over an interval of length $T = 500$. Right: plot of the periodogram (red) and the Bayesian PDF of the time series.

Maximum Likelihood Spectral Estimation (MLSE)

MLSE is a misnomer; a better name is the High Resolution Method (HRM), because the method is derived by explicitly maximizing the sensitivity to a given frequency while minimizing the effects (i.e., leakage, a.k.a. bias) from other frequencies.

MLSE was developed in the 1960s by Capon to analyze data from arrays of sensors, maximizing the response to one particular direction while minimizing the response to others; e.g., LASA = Large Aperture Seismic Array (to discriminate earthquakes from underground nuclear tests). There is a close relationship to beam forming in acoustic arrays and beam forming in radio interferometric arrays.

In the original development of the method discussed by Capon¹, the spectral estimator is very closely related to a filter that gives the ML estimate of a signal when it is corrupted by Gaussian noise:

[Block diagram: input $S + N$ passes through a filter $A$ to give the estimate $\hat{S}$.]

¹ See Nonlinear Methods of Spectral Analysis, Haykin, ed.

This system involves:

(a) a filter that gives the ML estimate of the signal when it is corrupted by Gaussian noise, which is also...
(b) the filter that generally gives the minimum-variance, unbiased estimate of the signal for arbitrary noise, and...
(c) has coefficients that yield an unbiased, high-resolution spectral estimate for any signal.

The way the filter coefficients are derived (i.e., the constraints applied to the minimization problem) implies that the spectral estimate minimizes leakage. The HRM is sometimes described as a positivity-constrained reconstruction method that minimizes leakage.

Thus, the intent of the MLSE technique is much different from that of the MESE technique:

- MLSE minimizes variance and bias (recall how spectral bias was related to resolution).
- MESE, in effect (via its relationship to prediction filters), tries to maximize resolution.

We will derive the ML spectral estimate following the derivation of Lacoss.

Method: Construct a linear filter that

1. yields an unbiased estimate of a sinusoidal signal, and
2. minimizes the variance of the output with respect to the corrupting noise.

Pass a signal $y_n$ through a linear filter with coefficients $a_k$:

$$ x_n = \sum_{k=1}^{N} a_k\, y_{n-k+1} \qquad \text{(causal)}, $$

where the input is a deterministic sinusoid added to zero-mean noise having an arbitrary spectrum:

$$ y_n = A e^{i\Omega n} + n_n. $$

We will determine the coefficients $a_k$ by invoking the above two criteria.

Goal: We want the filter to pass $A e^{i\Omega n}$ undistorted but to reject the noise as much as possible. Thus, we require:

1. No bias (in the mean):

$$ \langle x_n \rangle = \sum_{k=1}^{N} a_k\, \langle y_{n-k+1} \rangle = \sum_{k=1}^{N} a_k \left[ A e^{i\Omega(n-k+1)} + \langle n_{n-k+1} \rangle \right] = \sum_{k=1}^{N} a_k\, A e^{i\Omega(n-k+1)} \overset{!}{=} A e^{i\Omega n}, $$

which requires

$$ \sum_{k=1}^{N} a_k\, e^{i\Omega(1-k)} = 1 \qquad \text{(constraint equation)}. $$

This can be written in matrix form, using $\dagger$ to denote transpose conjugate:

$$ \varepsilon^\dagger a = 1, \qquad \varepsilon = \begin{pmatrix} 1 \\ e^{i\Omega} \\ e^{i2\Omega} \\ \vdots \\ e^{i(N-1)\Omega} \end{pmatrix}, \qquad a = \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_N \end{pmatrix}. $$

2. Minimum variance of the filter output:

$$ \begin{aligned} \sigma^2 &\equiv \left\langle \left| x_n - \langle x_n \rangle \right|^2 \right\rangle = \left\langle \left| \sum_k a_k\, y_{n-k+1} - A e^{i\Omega n} \right|^2 \right\rangle \\ &= \left\langle \left| \sum_k a_k \left( A e^{i\Omega(n-k+1)} + n_{n-k+1} \right) - A e^{i\Omega n} \right|^2 \right\rangle \\ &= \left\langle \left| \sum_k a_k\, n_{n-k+1} \right|^2 \right\rangle \qquad \text{(the signal terms cancel by criterion 1)} \\ &= \sum_k \sum_{k'} a_k^*\, \left\langle n^*_{n-k+1}\, n_{n-k'+1} \right\rangle\, a_{k'} = a^\dagger C a, \end{aligned} $$

where $C$ is the covariance matrix of the noise $n$.

3. Minimize $\sigma^2$ with respect to $a$, subject to the constraint $\varepsilon^\dagger a = 1$. By minimizing $\sigma^2$ subject to the constraint, we get the smallest error and no bias. Therefore we minimize the Lagrangian

$$ L = \sigma^2 + \lambda(\varepsilon^\dagger a - 1) = a^\dagger C a + \lambda(\varepsilon^\dagger a - 1) $$

with respect to $a$. We can take $\partial L/\partial\,\mathrm{Re}(a_j)$ and $\partial L/\partial\,\mathrm{Im}(a_j)$ separately to derive equations for $a$, then recombine these equations to get

$$ a^\dagger C + \lambda\, \varepsilon^\dagger = 0. $$

This is the same as we get by formally taking

$$ \nabla_a L = \nabla_a \left[ a^\dagger C a + \lambda\, \varepsilon^\dagger a \right] = a^\dagger C + \lambda\, \varepsilon^\dagger = 0 \qquad \text{at } a = a_0. $$

The solution for $a_0$: from $a_0^\dagger C = -\lambda\, \varepsilon^\dagger$, take the transpose conjugate to get $C^\dagger a_0 = -\lambda^*\, \varepsilon$, so

$$ a_0 = -\lambda^*\, (C^\dagger)^{-1}\, \varepsilon. $$

Now substitute back into the constraint equation $\varepsilon^\dagger a_0 = 1$ (the no-bias relation) to get

$$ \varepsilon^\dagger a_0 = -\lambda^*\, \varepsilon^\dagger (C^\dagger)^{-1} \varepsilon = 1 \qquad \Longrightarrow \qquad -\lambda^* = \frac{1}{\varepsilon^\dagger (C^\dagger)^{-1} \varepsilon}. $$

Note the denominator is real (a quadratic form; with Hermitian $C$, $\varepsilon^\dagger (C^\dagger)^{-1} \varepsilon = \varepsilon^\dagger C^{-1} \varepsilon$). Therefore

$$ a_0 = \frac{C^{-1} \varepsilon}{\varepsilon^\dagger C^{-1} \varepsilon}. $$

4. Minimum variance: Substitute $a_0$ back into the expression for $\sigma^2$ to find the minimum variance:

$$ \sigma^2_{\min} \equiv a_0^\dagger C a_0 = \frac{\varepsilon^\dagger C^{-1}}{\varepsilon^\dagger C^{-1} \varepsilon}\; C\; \frac{C^{-1} \varepsilon}{\varepsilon^\dagger C^{-1} \varepsilon} = \frac{\varepsilon^\dagger C^{-1} \varepsilon}{\left( \varepsilon^\dagger C^{-1} \varepsilon \right)^2} = \frac{1}{\varepsilon^\dagger C^{-1} \varepsilon}. $$

This is the power in the noise components with the same frequency as the signal, $\Omega$. (Note we have used the Hermitian relation $C^\dagger = C$.)

Interpretation:

1. $\sigma^2_{\min}$ = the portion of the noise that leaks through the filter, which is attempting to estimate a sinusoid corrupted by the noise.
2. Note that the filter coefficients and $\sigma^2_{\min}$ are functions of $\Omega$ and of the noise covariance matrix, but they do not depend on the amplitude of the sinusoid.
3. The trick: now take away the signal but keep the noise, and allow $\Omega$ to vary across a range of frequencies we are interested in. Then $\sigma^2_{\min}(\Omega)$ is a spectral estimate for the noise spectrum (which was left arbitrary):
4. the maximum likelihood spectral estimator

$$ \hat{S}_{\rm ML}(f) = \frac{1}{\varepsilon^\dagger C^{-1} \varepsilon}, \qquad \Omega = 2\pi f \tau. $$

Further comments:

1. As used, the covariance matrix $C$ is an ensemble-average quantity. Applications to actual data require use of some estimate of the covariance matrix.
2. The derivation is for equally spaced data.
3. The spectral estimate should work well on processes with steep power-law spectra because the estimator is derived explicitly to minimize bias.
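A minimal sketch of the estimator, assuming a known (or separately estimated) Hermitian covariance matrix; the function name and grids are illustrative.

```python
# MLM/Capon estimate S_ML(f) = 1/(ε† C⁻¹ ε), with Ω = 2π f τ,
# for an N x N Hermitian noise covariance matrix Cmat.
import numpy as np

def mlm_spectrum(Cmat, freqs, tau=1.0):
    N = Cmat.shape[0]
    k = np.arange(N)
    Cinv = np.linalg.inv(Cmat)
    S = np.empty(len(freqs))
    for j, f in enumerate(freqs):
        eps = np.exp(1j * 2 * np.pi * f * tau * k)   # steering vector ε(Ω)
        S[j] = 1.0 / np.real(eps.conj() @ Cinv @ eps)
    return S
```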

Data-adaptive aspect of the MLSE spectral estimator:

Recall that the Fourier-transform-based estimator has a fixed window. The MLSE has a data-adaptive spectral window, as we will show. The filter coefficients are a function of the frequency of the sinusoid, $\Omega$:

$$ a_0(\Omega) = \frac{C^{-1}\, \varepsilon(\Omega)}{\varepsilon^\dagger(\Omega)\, C^{-1}\, \varepsilon(\Omega)}. $$

As $\Omega$ is varied, the coefficients $a_0$ vary, but subject to the normalization constraint $\varepsilon^\dagger a_0 = 1$.

For a given $\Omega$, which labels the frequency component we are attempting to estimate, what is the response to other frequencies, $\omega$? Define the window function

$$ W(\omega, \Omega) = a_0^\dagger(\Omega)\, \varepsilon(\omega) $$

as the response, to frequency $\omega$, of a filter designed to pass the frequency $\Omega$. The window function satisfies the normalization $W(\Omega, \Omega) \equiv 1$. The equivalent quantity for a Fourier-transform estimator would be

$$ W(\omega, \Omega) = \frac{\sin\!\left[ (\omega - \Omega)T/2 \right]}{(\omega - \Omega)T/2}. $$


Simulating the HRM

Generate a process with a specified noise + signal spectrum, or just noise with an arbitrary spectrum, by passing white noise through a linear filter:

white noise $\to$ $h(t)$ $\to$ $x(t)$.

From one or more realizations of $x(t)$, estimate the autocovariance and put it in the form of a covariance matrix $C$. For each frequency of interest $\Omega$, calculate the MLM/HRM filter coefficients

$$ a_0 = \frac{C^{-1} \varepsilon}{\varepsilon^\dagger C^{-1} \varepsilon}. $$

Calculate the power-spectrum estimate as

$$ \hat{S}(\Omega) = \frac{1}{\varepsilon^\dagger C^{-1} \varepsilon}. $$

The window function can be calculated as

$$ W(\omega, \Omega) = a_0^\dagger(\Omega)\, \varepsilon(\omega). $$
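A sketch following this recipe, with an assumed one-pole (AR(1)) filter standing in for $h(t)$ to give a steep red spectrum; it reuses `mlm_spectrum` from the previous sketch, and all values are illustrative.

```python
# Simulate the HRM: filtered white noise -> estimated ACV -> Toeplitz C ->
# MLM/HRM spectrum and data-adaptive window function.
import numpy as np
from scipy.linalg import toeplitz
from scipy.signal import lfilter

rng = np.random.default_rng(1)
n = 4096
x = lfilter([1.0], [1.0, -0.8], rng.standard_normal(n))   # AR(1) red-noise process

M = 32                                           # covariance matrix order
acv = np.array([np.mean(x[:n - m] * x[m:]) for m in range(M)])
Cmat = toeplitz(acv)                             # estimated covariance matrix

freqs = np.linspace(0.0, 0.5, 256)               # cycles/sample (tau = 1)
S_mlm = mlm_spectrum(Cmat, freqs)

def window(Cmat, Omega, omegas):
    """W(omega, Omega) = a0†(Omega) ε(omega); Omega, omegas in radians/sample."""
    k = np.arange(Cmat.shape[0])
    Cinv = np.linalg.inv(Cmat)
    eps0 = np.exp(1j * Omega * k)
    a0 = Cinv @ eps0 / np.real(eps0.conj() @ Cinv @ eps0)
    return np.array([a0.conj() @ np.exp(1j * w * k) for w in omegas])
```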

Comparison of Spectral Estimators

- Bartlett: $\hat{S} \propto N^{-2}\, \varepsilon^\dagger C \varepsilon$; 100% error; resolution $\Delta f = 1/(N\tau) = 1/T$; large sidelobes.
- MLM: $\hat{S} = \left( \varepsilon^\dagger C^{-1} \varepsilon \right)^{-1}$; same or better resolution than Bartlett; lower sidelobes.
- MEM: $\hat{S} \propto \left| \varepsilon^t C^{-1} \delta \right|^{-2}$; better resolution (up to 2 times that of Bartlett).

Note that all the estimators are real, because the quadratic form $\varepsilon^\dagger C \varepsilon$ is real for $C$ Toeplitz (Hermitian) or, in the MEM case, the estimator is manifestly real.

33 Appendices

Derivation of the Maximum Entropy Spectrum

This section follows Edward and Fitelson (IEEE Trans. on Information Theory, 19, 1973) and McDonough (in Nonlinear Methods of Spectral Analysis, ed. Haykin).

Using the expression for the entropy rate in terms of the power spectrum (for a Gaussian process),

$$ h = \frac{1}{4 f_N} \int_{-f_N}^{f_N} df\, \ln S(f), $$

we will derive a spectral estimate given that we know a finite number of values of the ACV. That is, suppose we know

$$ C(n) \equiv \left\langle (X_k - \bar{x})(X_{k+n} - \bar{x}) \right\rangle, \qquad n = -M, -M+1, \ldots, 0, \ldots, M, $$

where $\tau$ = sample interval. For now, assume that we actually know $C(n)$, rather than some estimates $\hat{C}(n)$. Letting $S(f)$ carry the integration limits, we therefore have the constraint equations

$$ C(n) = \int df\, e^{2\pi i f n \tau}\, S(f), $$

which we incorporate into the maximization problem by using Lagrange multipliers $\lambda_n$. Therefore, we maximize

$$ L = h - \sum_{n=-M}^{M} \lambda_n\, C(n) \qquad \text{(minus sign for convenience)}, $$

which can be written as

$$ L = \int df \left[ \frac{1}{4 f_N} \ln S(f) - \sum_{n=-M}^{M} \lambda_n\, e^{2\pi i f n \tau}\, S(f) \right]. $$

Now we vary $S(f)$ to find $\delta L$:

$$ \delta L = \int df\, \delta S(f) \left[ \frac{1}{4 f_N} \frac{1}{\hat{S}(f)} - \sum_n \lambda_n\, e^{2\pi i f n \tau} \right] = 0. $$

This holds for any $\delta S(f)$ when $S(f)$ equals the function that extremizes $L$. Thus,

$$ \hat{S}(f) = \frac{1}{4 f_N} \frac{1}{\sum_n \lambda_n\, e^{2\pi i f n \tau}}. $$

Now substitute back into the constraint equations to get equations for the $\lambda_n$:

$$ C(n) = \int df\, \hat{S}(f)\, e^{2\pi i f n \tau} = \frac{1}{4 f_N} \int \frac{e^{2\pi i f n \tau}}{\sum_{n'} \lambda_{n'}\, e^{2\pi i f n' \tau}}\, df, \qquad n = -M, \ldots, M. $$

This is a system of nonlinear equations for the $\lambda_n$.

Following Edward and Fitelson, note that the spectral estimate can be put into the form

$$ \hat{S}(f) = \frac{1}{4 f_N} \frac{1}{|A(f)|^2}, \qquad \text{where } A(f) \equiv \sum_{l=0}^{M} \alpha_l\, e^{2\pi i f l \tau}, $$

which follows from the positive semi-definiteness of $\hat{S}(f)$ and is easy to see by analogy with the Wiener-Khinchin theorem: ACF $\leftrightarrow$ $S(f) = |{\rm FT}|^2$. The coefficients $\alpha_l$ are related to the Lagrange multipliers ($\lambda$ is like a correlation function, $\alpha_l$ like a time series):

$$ |A(f)|^2 = \left| \sum_{l=0}^{M} \alpha_l\, e^{2\pi i f \tau l} \right|^2 = \sum_{l=0}^{M} \sum_{l'=0}^{M} \alpha_l^*\, \alpha_{l'}\, e^{2\pi i f \tau (l' - l)} \overset{!}{=} \sum_{n=-M}^{M} \lambda_n\, e^{2\pi i f n \tau}. $$

Thus, both sides are equal if

$$ \lambda_q = \sum_{l=0}^{M-q} \alpha_l^*\, \alpha_{l+q}, \qquad 0 \le q \le M \quad (\text{with } \lambda_{-q} = \lambda_q^*). $$

Now we can find a solution to the constraint equations. Start with

$$ \hat{S}(f) = \frac{1}{4 f_N} \frac{1}{|A(f)|^2}. $$

Multiply $\hat{S}(f)$ by $A^*(f)\, e^{2\pi i f n \tau}$ and integrate:

$$ \int_{-f_N}^{f_N} df\, \hat{S}(f)\, A^*(f)\, e^{2\pi i f n \tau} = \frac{1}{4 f_N} \int_{-f_N}^{f_N} df\, \frac{e^{2\pi i f n \tau}}{A(f)}. $$

The left-hand side becomes

$$ {\rm LHS} = \int_{-f_N}^{f_N} df\, \hat{S}(f) \sum_{l=0}^{M} \alpha_l^*\, e^{-2\pi i f l \tau}\, e^{2\pi i f n \tau} = \sum_{l=0}^{M} \alpha_l^* \underbrace{\int_{-f_N}^{f_N} df\, \hat{S}(f)\, e^{2\pi i f \tau (n - l)}}_{C(n-l)} = \sum_{l=0}^{M} \alpha_l^*\, C(n - l). $$

The right-hand side is

$$ {\rm RHS} = \frac{1}{4 f_N} \int_{-f_N}^{f_N} df\, \frac{e^{2\pi i f n \tau}}{A(f)}. $$

So we have

$$ \sum_{l=0}^{M} \alpha_l^*\, C(n - l) = \frac{1}{4 f_N} \int_{-f_N}^{f_N} df\, \frac{e^{2\pi i f n \tau}}{A(f)}. $$

To further reduce the RHS we perform a contour integral in the complex plane for $f$. Let $f = f_r + i f_i$ and constrain $[A(f)]^{-1}$ to be analytic² in the upper half plane. Choose a closed rectangular contour $\zeta$ with segments (1) along the real axis from $-f_N$ to $+f_N$, (2) and (4) vertical segments at $f_r = \pm f_N$ with $f_i \in [0, f_{\rm im}]$, and (3) the return segment at $f_i = f_{\rm im}$. By Cauchy's integral theorem³, the integral around the closed contour vanishes:

$$ \oint_\zeta df\, \frac{e^{2\pi i f n \tau}}{A(f)} = 0 = \int_{(1)} df\, [\cdot] + \int_{(2)} df\, [\cdot] + \int_{(3)} df\, [\cdot] + \int_{(4)} df\, [\cdot]. $$

² Analytic functions (Sokolnikoff and Redheffer, p. 540): A function $f(z)$ that has a derivative $f'(z)$ at a given point $z = z_0$ and at every point in the neighborhood of $z_0$ is analytic at the point $z = z_0$. The points where $f(z)$ is not analytic are singular points. A necessary and sufficient condition for $f(z) = u(x, y) + i v(x, y)$ to be analytic at $z_0 = x_0 + i y_0$ is that $u$ and $v$ and their partial derivatives be continuous and that the Cauchy-Riemann equations

$$ u_x = v_y, \qquad v_x = -u_y $$

be satisfied throughout a neighborhood of $(x_0, y_0)$.

³ Cauchy's integral theorem: If $f(z)$ is continuous in a closed, simply connected region $R + C$ and analytic within the simple closed curve $C$, then $\oint_C dz\, f(z) = 0$.

Consider first the (2) and (4) integrals, for which $f_r = \pm f_N$ and $f_i \in [0, f_{\rm im}]$ (so $df = i\, df_i$):

$$ K_{24} \equiv \int_{(2)} df\, [\cdot] + \int_{(4)} df\, [\cdot] = i\, e^{2\pi i f_N n \tau} \int_0^{f_{\rm im}} \frac{df_i\, e^{-2\pi f_i n \tau}}{A(f_N + i f_i)} + i\, e^{-2\pi i f_N n \tau} \int_{f_{\rm im}}^{0} \frac{df_i\, e^{-2\pi f_i n \tau}}{A(-f_N + i f_i)}. $$

To proceed, we specify that $f_N$ = Nyquist frequency $= \dfrac{1}{2\tau}$, or $2\tau f_N = 1$. Then

$$ e^{\pm 2\pi i f_N n \tau} = e^{\pm \pi i n} = \cos(\pm \pi n) = \cos n\pi. $$

Also, $A(f)$ is periodic, so that

$$ A(+f_N) = A(-f_N), \qquad A(+f_N + i f_i) = A(-f_N + i f_i). $$

This can be seen from

$$ A(f) = \sum_{l=0}^{M} \alpha_l\, e^{2\pi i f \tau l} \overset{2\tau f_N = 1}{=} \sum_{l=0}^{M} \alpha_l\, e^{\pi i l (f/f_N)}, $$

which implies that

$$ A(\pm f_N + i f_i) = \sum_{l=0}^{M} \alpha_l\, e^{-\pi l (f_i/f_N)}\, e^{\pm \pi i l} = \sum_{l=0}^{M} \alpha_l\, e^{-\pi l (f_i/f_N)}\, (-1)^l. $$

Therefore

$$ K_{24} = i \cos n\pi \left[ \int_0^{f_{\rm im}} \frac{df_i\, e^{-2\pi f_i n \tau}}{A(f_N + i f_i)} - \int_0^{f_{\rm im}} \frac{df_i\, e^{-2\pi f_i n \tau}}{A(f_N + i f_i)} \right] = 0 \qquad \text{(equal and opposite terms)}. $$

These results, combined with the Cauchy integral theorem, imply

$$ \int_{(1)} df\, [\cdot] = -\int_{(3)} df\, [\cdot], $$

or

$$ \underbrace{\int_{-f_N}^{f_N} df\, \frac{e^{2\pi i f n \tau}}{A(f)}}_{\text{desired integral}} = \int_{-f_N}^{f_N} df_r\, \frac{e^{2\pi i n \tau (f_r + i f_{\rm im})}}{A(f_r + i f_{\rm im})} = \int_{-f_N}^{f_N} df_r\, \frac{e^{2\pi i f_r n \tau}\, e^{-2\pi n \tau f_{\rm im}}}{\sum_{l=0}^{M} \alpha_l\, e^{2\pi i \tau f_r l}\, e^{-2\pi \tau f_{\rm im} l}}. $$

Taking the limit as $f_{\rm im} \to \infty$, we get contributions only for $n = 0$ and $l = 0$:

$$ \lim_{f_{\rm im} \to \infty} \int_{-f_N}^{f_N} df_r\, \frac{e^{2\pi i f_r n \tau}\, e^{-2\pi n \tau f_{\rm im}}}{\sum_l \alpha_l\, e^{2\pi i \tau f_r l}\, e^{-2\pi \tau f_{\rm im} l}} = \frac{\delta_{n,0}}{\alpha_0} \int_{-f_N}^{f_N} df_r. $$

Thus,

$$ \int_{-f_N}^{f_N} df\, \frac{e^{2\pi i f n \tau}}{A(f)} = \frac{2 f_N\, \delta_{n,0}}{\alpha_0}. $$

The constraint equations are satisfied if

$$ \sum_{l=0}^{M} \alpha_l\, C(n - l) = \frac{1}{4 f_N} \int_{-f_N}^{f_N} df\, \frac{e^{2\pi i f n \tau}}{A(f)} = \frac{\delta_{n,0}}{2 \alpha_0}. $$

Now put this in matrix form by defining

$$ \alpha = \begin{pmatrix} \alpha_0 \\ \alpha_1 \\ \vdots \\ \alpha_M \end{pmatrix}, \qquad \delta = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \qquad \varepsilon = \begin{pmatrix} 1 \\ e^{2\pi i f \tau} \\ e^{2\pi i f 2\tau} \\ \vdots \\ e^{2\pi i f M \tau} \end{pmatrix}. $$

The constraint equations become

$$ \begin{pmatrix} C_0 & C_{-1} & \cdots & C_{-M} \\ C_1 & C_0 & \cdots & C_{-(M-1)} \\ \vdots & & \ddots & \vdots \\ C_M & C_{M-1} & \cdots & C_0 \end{pmatrix} \begin{pmatrix} \alpha_0 \\ \alpha_1 \\ \vdots \\ \alpha_M \end{pmatrix} = \frac{1}{2\alpha_0} \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, $$

or (since we've assumed the process is real, in which case $\alpha$ is also real)

$$ C\, \alpha = \frac{1}{2 \alpha_0}\, \delta, $$

which has the solution

$$ \alpha = \frac{1}{2 \alpha_0}\, C^{-1} \delta \qquad \text{(real data)}. $$

Now we can solve for the spectral estimate:

$$ \hat{S}(f) = \frac{1}{4 f_N} \frac{1}{|A(f)|^2} = \frac{1}{4 f_N} \frac{1}{\left| \sum_{l=0}^{M} \alpha_l\, e^{2\pi i f l \tau} \right|^2} = \frac{1}{4 f_N} \frac{1}{|\varepsilon^t \alpha|^2} = \frac{1}{4 f_N} \frac{4 \alpha_0^2}{|\varepsilon^t C^{-1} \delta|^2} \qquad \text{(real data)}, $$

so

$$ \hat{S}(f) = \frac{\alpha_0^2}{f_N\, |\varepsilon^t C^{-1} \delta|^2}. $$

Note that the solution $\alpha = \dfrac{1}{2\alpha_0}\, C^{-1} \delta$ implies

$$ \alpha_0 = \frac{1}{2 \alpha_0}\, (C^{-1})_{00}, \qquad \text{or} \qquad \alpha_0^2 = \tfrac{1}{2}\, (C^{-1})_{00}. $$
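Assembling the relations above into a sketch: given ACV values $C(0), \ldots, C(M)$, solve $C\alpha = \delta/2\alpha_0$ with $\alpha_0^2 = (C^{-1})_{00}/2$, then evaluate $\hat{S}(f) = 1/(4 f_N |A(f)|^2)$. The function name and test values are illustrative.

```python
# MEM spectrum from known (or estimated) autocovariance values acv[0..M].
import numpy as np
from scipy.linalg import toeplitz, solve

def mem_spectrum(acv, freqs, tau=1.0):
    M = len(acv) - 1
    Cmat = toeplitz(acv)                         # (M+1) x (M+1) Toeplitz ACV matrix
    delta = np.zeros(M + 1); delta[0] = 1.0
    u = solve(Cmat, delta)                       # u = C^{-1} δ = 2 α0 α
    alpha0_sq = 0.5 * u[0]                       # α0² = (C^{-1})_{00} / 2
    alpha = u * np.sqrt(alpha0_sq) / u[0]        # α = u / (2 α0)
    f_N = 0.5 / tau                              # Nyquist frequency
    l = np.arange(M + 1)
    A = np.array([np.sum(alpha * np.exp(2j * np.pi * f * tau * l)) for f in freqs])
    return 1.0 / (4.0 * f_N * np.abs(A)**2)

# Example: for an assumed AR(1) ACV C(n) = 0.8^|n|, the MEM estimate should
# reproduce the AR(1) spectral shape for any order M >= 1.
S = mem_spectrum(0.8 ** np.arange(8), np.linspace(0.0, 0.5, 256))
```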

Data Vectors

Vector of random variables and its mean:

$$ X = \begin{pmatrix} X_1 \\ X_2 \\ \vdots \\ X_N \end{pmatrix}, \qquad \langle X \rangle = \begin{pmatrix} \langle X_1 \rangle \\ \langle X_2 \rangle \\ \vdots \\ \langle X_N \rangle \end{pmatrix}. $$

Dot product:

$$ X^t X = \sum_{j=1}^{N} X_j^2 \quad \text{(real case)}, \qquad X^\dagger X = \sum_{j=1}^{N} |X_j|^2 \quad \text{(complex case)}. $$

Covariance matrix (zero-mean case; complex, Hermitian):

$$ C = \langle X X^\dagger \rangle = \begin{pmatrix} \langle |X_1|^2 \rangle & \langle X_1 X_2^* \rangle & \cdots & \langle X_1 X_N^* \rangle \\ \langle X_2 X_1^* \rangle & \langle |X_2|^2 \rangle & \cdots & \langle X_2 X_N^* \rangle \\ \vdots & & \ddots & \vdots \\ \langle X_N X_1^* \rangle & \langle X_N X_2^* \rangle & \cdots & \langle |X_N|^2 \rangle \end{pmatrix}. $$

Consider vectors $A$, $B$ and a matrix $C$ with shapes $N \times 1$, $N \times 1$, and $N \times N$, respectively. We have:

(a) $\nabla_A\, (A^\dagger B) = B$
(b) $\nabla_A\, |A|^2 = 2A$
(c) $\nabla_A\, (A^t C A) = (C^t + C)\, A$ for real $A$
(d) $\nabla_A\, (A^\dagger C A) = C^t A^* + C A$ for complex $A$
(e) $(CA)^t = A^t C^t$
(f) If $A$ is a zero-mean stochastic process (e.g., a vector of $N$ measurements of a noise-like signal), its covariance matrix can be written as $C = \langle A A^\dagger \rangle$.

Here the notation is: $*$ = conjugate; $t$ = transpose; $\dagger$ = transpose conjugate.
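A quick finite-difference spot check (not from the notes) of identity (c):

```python
# Verify ∇_A (Aᵗ C A) = (Cᵗ + C) A for real A by central differences.
import numpy as np

rng = np.random.default_rng(2)
N = 5
Cmat = rng.standard_normal((N, N))
A = rng.standard_normal(N)

grad_analytic = (Cmat.T + Cmat) @ A
h = 1e-6
grad_fd = np.array([
    ((A + h * np.eye(N)[j]) @ Cmat @ (A + h * np.eye(N)[j])
     - (A - h * np.eye(N)[j]) @ Cmat @ (A - h * np.eye(N)[j])) / (2 * h)
    for j in range(N)
])
print(np.allclose(grad_analytic, grad_fd, atol=1e-5))   # True
```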
