Estimating Periodic Signals
1 Department of Mathematics & Statistics, Indian Institute of Technology Kanpur. Most of this talk is taken from the book Statistical Signal Processing by D. Kundu and S. Nandi. August 26, 2012
2 Outline 1 Introduction 2 3 Least Squares and Approximate Least Squares Estimators
4 Some Definitions: What is a Signal? A signal is a function that conveys information about the behavior or attributes of some phenomenon (Wikipedia)
5 Some Definitions: Different Examples: 1 Daily gold price. 2 Monthly expenditure of a family. 3 ECG signal of a human being. 4 Satellite images. 5 Textures.
6 Some Definitions: What is Signal Processing? Signal processing may broadly be considered to involve the recovery of information from physical observations. The received signal is usually disturbed by external or internal noise. Due to the random nature of the signal, statistical techniques play an important role in analyzing signals.
7 Examples: ECG Signal
8 Examples: Rumford Data
9 Example: Vowel Sound
10 Example: Variable Star Brightness Signal
11 Example: Airlines Passenger Data
12 Some More Definitions: What is a Periodic Signal? A signal (function) which repeats after a fixed period of time: f(t) = f(t'), where t' = t mod T. Example: y(t) = A cos(ωt) + B sin(ωt).
13 Fourier Transform: A smooth, mean-zero periodic function can be written as y(t) = Σ_{k=1}^∞ [A_k cos(kωt) + B_k sin(kωt)], which is known as the Fourier expansion of y(t). Most of the time y(t) is corrupted with noise, hence we use y(t) = Σ_{k=1}^∞ [A_k cos(kωt) + B_k sin(kωt)] + X(t).
14 Sinusoidal Signal: Since it is impossible to estimate an infinite number of parameters, the following model has been used: y(t) = Σ_{k=1}^p [A_k cos(ω_k t) + B_k sin(ω_k t)] + X(t), where p < ∞. Often the problem boils down to estimating the A_k's, B_k's, ω_k's and p based on a sample of size n, namely y(1), ..., y(n).
16 Periodogram Estimators: The most widely used and popular estimation procedure is the periodogram estimator. The periodogram at a frequency ω is defined as I(ω) = (1/n) |Σ_{t=1}^n y(t) e^{iωt}|², or equivalently I(ω) = (1/n) (Σ_{t=1}^n y(t) cos(ωt))² + (1/n) (Σ_{t=1}^n y(t) sin(ωt))².
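The definition above translates directly into a few lines of NumPy. This is a minimal sketch of our own (the function name and the choice of evaluating at the Fourier frequencies 2πj/n are ours, not from the talk):

```python
import numpy as np

def periodogram(y, omegas):
    """I(w) = (1/n) |sum_{t=1}^n y(t) exp(-i*w*t)|^2, evaluated at each w in omegas."""
    n = len(y)
    t = np.arange(1, n + 1)
    # phase matrix: row j holds exp(-i * omegas[j] * t) for t = 1..n
    E = np.exp(-1j * np.outer(omegas, t))
    return np.abs(E @ y) ** 2 / n

# typical usage: evaluate over the Fourier frequencies 2*pi*j/n, j = 1..n/2-1,
# and read off the frequency of a noiseless single-component signal
n = 200
t = np.arange(1, n + 1)
y = 3.0 * (np.cos(0.2 * np.pi * t) + np.sin(0.2 * np.pi * t))
grid = 2 * np.pi * np.arange(1, n // 2) / n
I = periodogram(y, grid)
omega_hat = grid[np.argmax(I)]      # sits at (or very near) 0.2*pi
```

The modulus-squared form and the cosine/sine form give identical values, so either line of the slide's definition can be used.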
17 Periodogram Estimator: Consider the following sinusoidal signal (Example 1): y(t) = 3.0(cos(0.2πt) + sin(0.2πt)) + 3.0(cos(0.5πt) + sin(0.5πt)) + X(t). Here the X(t)'s are i.i.d. N(0, 0.5).
18 Examples: Sinusoidal Signal
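A rough simulation of Example 1 (our own sketch; we assume N(0, 0.5) denotes variance 0.5, hence standard deviation √0.5). The two largest periodogram ordinates recover both frequencies:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
t = np.arange(1, n + 1)
y = (3.0 * (np.cos(0.2 * np.pi * t) + np.sin(0.2 * np.pi * t))
     + 3.0 * (np.cos(0.5 * np.pi * t) + np.sin(0.5 * np.pi * t))
     + rng.normal(0.0, np.sqrt(0.5), n))

# periodogram over the Fourier frequencies in (0, pi)
grid = 2 * np.pi * np.arange(1, n // 2) / n
I = np.abs(np.exp(-1j * np.outer(grid, t)) @ y) ** 2 / n

# the two dominant ordinates should sit at roughly 0.2*pi and 0.5*pi
top_two = np.sort(grid[np.argsort(I)[-2:]])
```

With both amplitudes large relative to the noise, the signal ordinates dwarf the noise ordinates, which is why simply picking the two largest bins works here.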
19 Periodogram Estimator: Consider the following sinusoidal signal (Example 2): y(t) = 3.0(cos(0.2πt) + sin(0.2πt)) + 0.25(cos(0.5πt) + sin(0.5πt)) + X(t). Here the X(t)'s are i.i.d. N(0, 2.0).
20 Examples: Sinusoidal Signal
22 Least Squares Estimators: The model can be seen as a non-linear regression model y(t) = f_t(θ, p) + X(t), where f_t(θ, p) = Σ_{k=1}^p [A_k cos(ω_k t) + B_k sin(ω_k t)].
23 Least Squares Estimators: Assuming p is known, the most natural estimators are the least squares estimators, obtained by minimizing Σ_{t=1}^n (y(t) − Σ_{k=1}^p [A_k cos(ω_k t) + B_k sin(ω_k t)])² with respect to the A_k's, B_k's and ω_k's.
24 Theoretical and Numerical Issues: 1 The model does not satisfy the standard sufficient conditions of Jennrich or Wu. 2 Hence consistency of the least squares estimators does not follow from the standard results. 3 The least squares estimators do not have the usual √n rate of convergence. 4 Numerically it is a challenging problem.
25 Separable Regression Technique: The model can be written as Y = A(θ)β + e, where A(θ) is the n × 2p matrix whose t-th row is (cos(ω_1 t), sin(ω_1 t), ..., cos(ω_p t), sin(ω_p t)), βᵀ = (A_1, B_1, ..., A_p, B_p), and eᵀ = (X(1), ..., X(n)).
26 Separable Regression Technique: The least squares estimators can be obtained by minimizing Q(θ, β) = (Y − A(θ)β)ᵀ(Y − A(θ)β) with respect to the unknown parameters. Note that if θ is known, then β̂(θ) = (A(θ)ᵀA(θ))⁻¹ A(θ)ᵀ Y.
27 Separable Regression Technique: The least squares estimator of θ can be obtained by minimizing Q(θ, β̂(θ)) = (Y − A(θ)β̂(θ))ᵀ(Y − A(θ)β̂(θ)) with respect to θ. Equivalently, minimize Q(θ) = Yᵀ(I − P_A)Y, where P_A = A(AᵀA)⁻¹Aᵀ is the projection matrix onto the column space of A(θ).
28 Approximate Least Squares Estimators: Note that minimizing Q(θ) = Yᵀ(I − P_A)Y is equivalent to maximizing R(θ) = Yᵀ P_A Y. Approximating (1/n)(AᵀA) by I gives R(θ) ≈ (1/n) Yᵀ A Aᵀ Y, which for a single frequency ω equals (1/n)(Σ_{t=1}^n y(t) cos(ωt))² + (1/n)(Σ_{t=1}^n y(t) sin(ωt))², i.e. the periodogram.
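A minimal sketch of the separable (profile) least squares idea for p = 1, using a plain grid search over ω (names and the grid-search strategy are ours; in practice one would refine around the periodogram peak):

```python
import numpy as np

def profile_lse(y, omega_grid):
    """For each omega, beta(omega) = (A'A)^{-1} A'Y solves the linear part;
    keep the omega maximizing R(omega) = Y' P_A Y."""
    n = len(y)
    t = np.arange(1, n + 1)
    best_R, best_w, best_beta = -np.inf, None, None
    for w in omega_grid:
        A = np.column_stack([np.cos(w * t), np.sin(w * t)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        R = y @ (A @ beta)                  # Y' P_A Y
        if R > best_R:
            best_R, best_w, best_beta = R, w, beta
    return best_w, best_beta

# noiseless check: a frequency lying on the grid is recovered essentially exactly
t = np.arange(1, 101)
y = 2.0 * np.cos(0.3 * np.pi * t) + 1.5 * np.sin(0.3 * np.pi * t)
grid = np.linspace(0.25 * np.pi, 0.35 * np.pi, 201)
w_hat, (A_hat, B_hat) = profile_lse(y, grid)
```

The key point of the technique shows up in the code: only ω is searched nonlinearly, while (A, B) come for free from a linear fit at each candidate ω.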
30 Complex Exponential: The sum of sinusoids model has a very close resemblance to the corresponding complex model y(t) = Σ_{k=1}^p A_k e^{iω_k t} + X(t). Here the y(t)'s, A_k's and X(t)'s are complex valued, and 0 < ω_k < 2π. The problem remains the same: estimate the unknown parameters based on the y(t)'s.
31 Prony's Equation: Prony in 1795 observed the following interesting fact: if μ(t) = Σ_{k=1}^p A_k e^{β_k t}, t = 1, ..., n, where the A_k's and β_k's are real and the β_k's are distinct, then there exist g_0, ..., g_p such that

  [ μ(1)     μ(2)     ...  μ(p+1) ] [ g_0 ]   [ 0 ]
  [ μ(2)     μ(3)     ...  μ(p+2) ] [ g_1 ] = [ 0 ]
  [  ...      ...     ...    ...  ] [ ... ]   [ . ]
  [ μ(n−p)  μ(n−p+1)  ...  μ(n)   ] [ g_p ]   [ 0 ]
32 Prony's Equation: The constants g_0, ..., g_p do not depend on the A_k's; they depend only on the β_k's. The β_k's can be obtained from the g_k's as follows: consider the polynomial equation g_0 + g_1 x + ... + g_p x^p = 0; then e^{β_1}, ..., e^{β_p} are the roots of this polynomial. Once the β_k's are obtained, the A_k's follow by a simple linear regression.
33 Prony's Equations: Similar results hold in the complex exponential case, i.e. for μ(t) = Σ_{k=1}^p A_k e^{β_k t}, t = 1, ..., n, where the A_k's and β_k's are complex; and similarly for μ(t) = Σ_{k=1}^p [A_k cos(ω_k t) + B_k sin(ω_k t)], t = 1, ..., n, where the A_k's and B_k's are real and 0 < ω_k < 2π.
34 Prony's Equation: It is immediate that if there is no error, then the A_k's and β_k's can be recovered from the μ(t)'s without any problem. Now suppose y(t) = Σ_{k=1}^p A_k e^{β_k t} + e(t), t = 1, ..., n, where the A_k's and β_k's are real, the β_k's are distinct, and the e(t)'s are small mean-zero errors. Then it is expected that

  [ y(1)    ...  y(p+1) ] [ g_0 ]   [ 0 ]
  [ y(2)    ...  y(p+2) ] [ g_1 ] ≈ [ 0 ]
  [  ...    ...    ...  ] [ ... ]   [ . ]
  [ y(n−p)  ...  y(n)   ] [ g_p ]   [ 0 ]
35 Prony's Equation: Therefore, writing A for the (n−p) × (p+1) matrix with i-th row (y(i), ..., y(i+p)) and g = (g_0, g_1, ..., g_p)ᵀ, we want to solve Ag ≈ 0, or equivalently AᵀAg ≈ 0: g is the eigenvector corresponding to the (near-)zero eigenvalue of AᵀA.
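The recipe above can be sketched in a few lines of NumPy (our own sketch, shown for the noiseless case): the eigenvector of AᵀA belonging to the smallest eigenvalue gives g, and the roots of g_0 + g_1 x + ... + g_p x^p recover the e^{β_k}:

```python
import numpy as np

def prony_roots(y, p):
    """Return estimates of exp(beta_1), ..., exp(beta_p) from y(1), ..., y(n)."""
    n = len(y)
    # the (n-p) x (p+1) matrix with i-th row (y(i), ..., y(i+p))
    A = np.column_stack([y[i:n - p + i] for i in range(p + 1)])
    # g = eigenvector for the smallest eigenvalue of A'A (near zero without noise)
    eigvals, eigvecs = np.linalg.eigh(A.T @ A)
    g = eigvecs[:, 0]
    # np.roots wants the highest-degree coefficient first, hence the reversal
    return np.roots(g[::-1])

# noiseless example with p = 2 real exponentials
t = np.arange(1, 31)
y = 2.0 * np.exp(-0.1 * t) + 1.0 * np.exp(-0.5 * t)
roots = np.sort(prony_roots(y, 2).real)     # should be exp(-0.5), exp(-0.1)
```

With noise present this plain version degrades quickly, which is consistent with the inconsistency of Prony's estimators noted later in the talk.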
37 Numerical Issues: 1 It is a highly non-linear problem; the least squares surface has several local minima. 2 Most of the time the standard Newton-Raphson algorithm may not converge. 3 Even when it converges, it often converges to a local minimum rather than the global minimum. 4 If p is large, it becomes a high-dimensional optimization problem, and extremely accurate initial guesses are required for any iterative procedure to work well.
38 Sequential Estimation Procedures: The procedure is based on the fact that the components are orthogonal, and it works as follows. First minimize Σ_{t=1}^n (y(t) − A cos(ωt) − B sin(ωt))² with respect to A, B and ω. Then take out the effect of the fitted component, i.e. consider ỹ(t) = y(t) − Â cos(ω̂t) − B̂ sin(ω̂t). Repeat the procedure p times.
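The sequential procedure can be sketched as below (our own code; a coarse grid search over the Fourier frequencies stands in for the one-dimensional minimization at each stage):

```python
import numpy as np

def fit_one(y, grid):
    """Fit a single sinusoid: for each omega solve for (A, B), keep the best omega."""
    t = np.arange(1, len(y) + 1)
    best_rss, best_w, best_ab = np.inf, None, None
    for w in grid:
        X = np.column_stack([np.cos(w * t), np.sin(w * t)])
        ab, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ ab) ** 2)
        if rss < best_rss:
            best_rss, best_w, best_ab = rss, w, ab
    return best_w, best_ab

def sequential(y, p, grid):
    t = np.arange(1, len(y) + 1)
    estimates = []
    for _ in range(p):
        w, (a, b) = fit_one(y, grid)
        estimates.append((w, a, b))
        y = y - a * np.cos(w * t) - b * np.sin(w * t)   # take out its effect
    return estimates

# two well-separated components at Fourier frequencies, no noise
n = 200
t = np.arange(1, n + 1)
y = 3.0 * np.cos(0.2 * np.pi * t) + 1.0 * np.sin(0.5 * np.pi * t)
grid = 2 * np.pi * np.arange(1, n // 2) / n
freqs = sorted(w for w, a, b in sequential(y, 2, grid))
```

Each pass picks off the currently strongest component; the near-orthogonality of distinct sinusoids is what allows the residual from one pass to be treated as a fresh one-component problem.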
39 Advantages: It reduces the computational burden significantly. For example, if p = 25, instead of solving a 25-dimensional optimization problem we need to solve 25 one-dimensional optimization problems. It has no particular difficulty with initial guesses or convergence, and it produces the same accuracy as the least squares estimators.
40 Super Efficient Estimators: When p = 1, the Newton-Raphson algorithm has the form ω^(j+1) = ω^(j) − Q′(ω^(j))/Q″(ω^(j)). After a few pages of calculation, the modified step ω^(j+1) = ω^(j) − (1/4) Q′(ω^(j))/Q″(ω^(j)) has been suggested. It not only converges; it produces estimators that are better than the least squares estimators.
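Purely as a numerical illustration of the quarter-step iteration (not necessarily the exact algorithm analyzed in the talk), take Q(ω) to be the profile residual sum of squares for one sinusoid and approximate its derivatives by central differences:

```python
import numpy as np

def Q(w, y, t):
    """Profile residual sum of squares for a single sinusoid at frequency w."""
    X = np.column_stack([np.cos(w * t), np.sin(w * t)])
    ab, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ ab) ** 2)

def quarter_step_newton(y, w0, steps=100, h=1e-5):
    """w_{j+1} = w_j - (1/4) Q'(w_j)/Q''(w_j), derivatives by central differences."""
    t = np.arange(1, len(y) + 1)
    w = w0
    for _ in range(steps):
        qp, q0, qm = Q(w + h, y, t), Q(w, y, t), Q(w - h, y, t)
        d1 = (qp - qm) / (2 * h)            # Q'(w)
        d2 = (qp - 2 * q0 + qm) / h ** 2    # Q''(w)
        w = w - 0.25 * d1 / d2
    return w

# noiseless check, starting close to the true frequency 0.3*pi
t = np.arange(1, 101)
y = 2.0 * np.cos(0.3 * np.pi * t) + 1.5 * np.sin(0.3 * np.pi * t)
w_hat = quarter_step_newton(y, 0.3 * np.pi + 0.005)
```

Near the minimum the damped step contracts the error by roughly a factor 3/4 per iteration, so convergence is linear but robust against the overshooting that plagues the full Newton step on this very sharply curved criterion.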
42 Main Asymptotic Results: 1 Least squares estimators are consistent under mild assumptions on the errors. 2 The least squares frequency estimators have the convergence rate n^{3/2}. 3 Sequential estimators have the same convergence rate as the least squares estimators. 4 The asymptotic variances of the super efficient estimators are smaller than those of the least squares estimators. 5 Prony's estimators are not consistent. 6 Periodogram estimators are consistent, but they have the convergence rate n^{1/2}.
44 Estimating the Number of Components: 1 Consider the number of peaks of the periodogram function; 2 this can be very misleading. 3 In the least squares procedure, consider the residual sums of squares; 4 this can be very misleading too. 5 Information theoretic criteria. 6 Cross validation techniques. 7 Likelihood ratio approach.
45 Information Theoretic Criteria: AIC(k) = n ln R_k + 2(3k); BIC(k) = n ln R_k + (1/2)(ln n)(3k); EDC(k) = n ln R_k + C_n k. Here C_n satisfies certain conditions, namely C_n/n → 0 and C_n/ln ln n → ∞. Choose the model for which AIC(k), BIC(k) or EDC(k) is minimum.
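Given residual sums of squares R_1, ..., R_K from fits with k = 1, ..., K components, the criteria are a one-liner each. A sketch of our own (the R values below are made-up numbers for illustration, with the RSS dropping sharply up to k = 2 and then flattening):

```python
import numpy as np

def select_order(R, n):
    """Return (argmin AIC, argmin BIC) over k = 1..len(R), penalizing 3k parameters."""
    R = np.asarray(R, dtype=float)
    k = np.arange(1, len(R) + 1)
    aic = n * np.log(R) + 2 * (3 * k)
    bic = n * np.log(R) + 0.5 * np.log(n) * (3 * k)
    return int(k[np.argmin(aic)]), int(k[np.argmin(bic)])

# both criteria should pick k = 2 here
k_aic, k_bic = select_order([10.0, 2.0, 1.9, 1.85], n=100)
```

The 3k term counts the three parameters (A_k, B_k, ω_k) per sinusoidal component.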
46 Information Theoretic Criterion: Which C_n to choose? A resampling technique can be used to compute the probability of correct selection (PCS) for each C_n, and one chooses the C_n for which the PCS is maximum.
48 Compartment Model: Consider the following real valued model: y(t) = Σ_{k=1}^p A_k e^{β_k t} + e(t), t = 1, ..., n. Here the A_k's and β_k's are real numbers. The number of components p may be known or unknown. The problem is to estimate the A_k's and β_k's based on the y(t)'s.
49 Fundamental Frequency Model: Consider the following model: y(t) = Σ_{k=1}^p [A_k cos(kλt) + B_k sin(kλt)] + e(t). Here λ is the fundamental frequency, and the model has p harmonics. The problem remains the same.
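One natural estimator for this model (our own sketch, not necessarily the method of the talk) maximizes the periodogram summed over the p harmonics of a candidate fundamental λ:

```python
import numpy as np

def fundamental_freq(y, p, grid):
    """Return the lambda in grid maximizing I(lambda) + I(2*lambda) + ... + I(p*lambda)."""
    n = len(y)
    t = np.arange(1, n + 1)
    def I(w):                               # periodogram ordinate at w
        return np.abs(np.exp(-1j * w * t) @ y) ** 2 / n
    scores = [sum(I(k * lam) for k in range(1, p + 1)) for lam in grid]
    return grid[int(np.argmax(scores))]

# noiseless example: fundamental 0.2 rad with p = 2 harmonics
t = np.arange(1, 201)
y = (1.0 * np.cos(0.2 * t) + 1.0 * np.sin(0.2 * t)
     + 0.8 * np.cos(0.4 * t) + 0.5 * np.sin(0.4 * t))
lam_hat = fundamental_freq(y, 2, np.linspace(0.1, 0.3, 401))
```

Pooling the harmonics in a single score is what distinguishes this from estimating p free frequencies: the energy at 2λ, 3λ, ... reinforces the peak at the true fundamental and suppresses subharmonic candidates.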
50 Chirp Signal Model: Consider the following model: y(t) = Σ_{k=1}^p [A_k cos(λ_k t + β_k t²) + B_k sin(λ_k t + β_k t²)] + e(t). The problem is to estimate the frequencies and frequency rates.
51 Partially Sum of Sinusoidal Model: Consider the following model: y(t) = a + bt + Σ_{k=1}^p [A_k cos(ω_k t) + B_k sin(ω_k t)] + e(t). The problem is to estimate the unknown parameters.
52 Thank You
105A Practice Final Solutions March 13, 01 William Kelly Problem 1: Lagrangians and Conserved Quantities Consider the following action for a particle of mass m moving in one dimension S = dtl = mc dt 1
More informationMath 216 Second Midterm 19 March, 2018
Math 26 Second Midterm 9 March, 28 This sample exam is provided to serve as one component of your studying for this exam in this course. Please note that it is not guaranteed to cover the material that
More informationSolutions to the Homework Replaces Section 3.7, 3.8
Solutions to the Homework Replaces Section 3.7, 3.8 1. Our text (p. 198) states that µ ω 0 = ( 1 γ2 4km ) 1/2 1 1 2 γ 2 4km How was this approximation made? (Hint: Linearize 1 x) SOLUTION: We linearize
More informationOn Moving Average Parameter Estimation
On Moving Average Parameter Estimation Niclas Sandgren and Petre Stoica Contact information: niclas.sandgren@it.uu.se, tel: +46 8 473392 Abstract Estimation of the autoregressive moving average (ARMA)
More informationLECTURE 12 Sections Introduction to the Fourier series of periodic signals
Signals and Systems I Wednesday, February 11, 29 LECURE 12 Sections 3.1-3.3 Introduction to the Fourier series of periodic signals Chapter 3: Fourier Series of periodic signals 3. Introduction 3.1 Historical
More informationFrom Data To Functions Howdowegofrom. Basis Expansions From multiple linear regression: The Monomial Basis. The Monomial Basis
From Data To Functions Howdowegofrom Basis Expansions From multiple linear regression: data to functions? Or if there is curvature: y i = β 0 + x 1i β 1 + x 2i β 2 + + ɛ i y i = β 0 + x i β 1 + xi 2 β
More informationω (rad/s)
1. (a) From the figure we see that the signal has energy content in frequencies up to about 15rad/s. According to the sampling theorem, we must therefore sample with at least twice that frequency: 3rad/s
More informationReading Group on Deep Learning Session 1
Reading Group on Deep Learning Session 1 Stephane Lathuiliere & Pablo Mesejo 2 June 2016 1/31 Contents Introduction to Artificial Neural Networks to understand, and to be able to efficiently use, the popular
More informationTime Series Examples Sheet
Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,
More informationSTAT5044: Regression and Anova
STAT5044: Regression and Anova Inyoung Kim 1 / 15 Outline 1 Fitting GLMs 2 / 15 Fitting GLMS We study how to find the maxlimum likelihood estimator ˆβ of GLM parameters The likelihood equaions are usually
More information2.3 Oscillation. The harmonic oscillator equation is the differential equation. d 2 y dt 2 r y (r > 0). Its solutions have the form
2. Oscillation So far, we have used differential equations to describe functions that grow or decay over time. The next most common behavior for a function is to oscillate, meaning that it increases and
More informationThe Klein-Gordon Equation Meets the Cauchy Horizon
Enrico Fermi Institute and Department of Physics University of Chicago University of Mississippi May 10, 2005 Relativistic Wave Equations At the present time, our best theory for describing nature is Quantum
More informationLesson 1. Optimal signalbehandling LTH. September Statistical Digital Signal Processing and Modeling, Hayes, M:
Lesson 1 Optimal Signal Processing Optimal signalbehandling LTH September 2013 Statistical Digital Signal Processing and Modeling, Hayes, M: John Wiley & Sons, 1996. ISBN 0471594318 Nedelko Grbic Mtrl
More informationHOMEWORK 4: MATH 265: SOLUTIONS. y p = cos(ω 0t) 9 ω 2 0
HOMEWORK 4: MATH 265: SOLUTIONS. Find the solution to the initial value problems y + 9y = cos(ωt) with y(0) = 0, y (0) = 0 (account for all ω > 0). Draw a plot of the solution when ω = and when ω = 3.
More informationFinal Examination Linear Partial Differential Equations. Matthew J. Hancock. Feb. 3, 2006
Final Examination 8.303 Linear Partial ifferential Equations Matthew J. Hancock Feb. 3, 006 Total points: 00 Rules [requires student signature!]. I will use only pencils, pens, erasers, and straight edges
More information