Recursive Identification


Chapter 8

Given a current estimated model and a new observation, how should we update this model in order to take this new piece of information into account?

In many cases it is beneficial to have a model of the system available online while the system is in operation. The model should then be based on the observations up till the current time $t$. A naive way to go ahead is to use all observations up to $t$ to compute an estimate $\hat\theta_t$ of the system parameters. In recursive identification methods, the parameter estimates are instead computed recursively over time: suppose we have an estimate $\hat\theta_{t-1}$ at iteration $t-1$; then recursive identification aims to compute a new estimate $\hat\theta_t$ by a simple modification of $\hat\theta_{t-1}$ when a new observation becomes available at iteration $t$. The counterpart of online methods are the so-called offline or batch methods, in which all the observations are used simultaneously to estimate the model. Recursive methods have the following general features:

- They are a central part of adaptive systems, where the next action is based on the latest estimated parameters. Typical examples are found in adaptive control or adaptive filtering applications.
- Memory and computational requirements at any timestep have to be modest. Specifically, one often requires that both be independent of the length of the history at any timestep.
- They are often applied in real-time settings, where the true underlying parameters are changing over time (i.e. tracking applications).
- They are often used for fault detection systems. Here one wants to detect when the observed signals or the underlying system differ significantly from what one would associate with a normal operation modus.

In general, the techniques enjoy the same statistical properties as their counterparts in the batch setting. For example, RLS gives consistent estimates under the conditions discussed in Section 5.3. Hence, the discussion of the recursive estimators is often concerned with computational issues.

8.1 Recursive Least Squares

Let us start this section with perhaps the simplest application possible, nevertheless introducing the main ideas.

Example 50 (RLS for Estimating a Constant). Given the following system

    y_t = \theta_0 + e_t, \quad \forall t = 1, 2, \dots    (8.1)

In Chapter 2 we derived that the least squares estimate of $\theta_0$ using the first $t$ observations is given as the arithmetic (sample) mean, i.e.

    \hat\theta_t = \frac{1}{t} \sum_{i=1}^{t} y_i.    (8.2)

Now it is not too difficult to rewrite this in a recursive form:

    \hat\theta_t = \frac{1}{t}\Big(\sum_{i=1}^{t-1} y_i + y_t\Big) = \frac{1}{t}\big((t-1)\hat\theta_{t-1} + y_t\big) = \hat\theta_{t-1} + \frac{1}{t}\big(y_t - \hat\theta_{t-1}\big).    (8.3)

This result is quite appealing: the new estimate $\hat\theta_t$ equals the previous estimate $\hat\theta_{t-1}$ plus a small correction term. The correction term is proportional to the deviation between the prediction $\hat\theta_{t-1}$ and the observation $y_t$. Moreover, the correction term is weighted by the factor $1/t$, which implies that the magnitude of the correction will decrease in time; conversely, the estimate $\hat\theta_t$ will become more reliable. In case a proper stochastic framework is assumed (see Chapter 5, Section 3), the variance of $\hat\theta_t$ becomes

    P_t = \mathrm{Var}(\hat\theta_t) = \frac{\sigma^2}{t},    (8.4)

where $\sigma^2$ denotes the variance of the noise terms $\{e_t\}$, which can in turn be computed recursively as

    \frac{1}{P_t} = \frac{1}{P_{t-1}} + \frac{1}{\sigma^2}.    (8.5)

In order to generalize this result, we need the following well-known matrix property.

Lemma 9 (Matrix Inversion Lemma). Let $Z \in \mathbb{R}^{d \times d}$ be a positive definite matrix with unique inverse $Z^{-1}$, and let $z \in \mathbb{R}^d$ be any vector. Then

    (Z + zz^T)^{-1} = Z^{-1} - \frac{Z^{-1} z z^T Z^{-1}}{1 + z^T Z^{-1} z}.    (8.6)

In words: the inverse of a matrix with a rank-one update can be written in closed form using the inverse of the original matrix and a small correction.

Proof: The proof is instrumental. Since $z^T Z^{-1} z$ is a scalar, one verifies directly that

    (Z + zz^T)\Big(Z^{-1} - \frac{Z^{-1} z z^T Z^{-1}}{1 + z^T Z^{-1} z}\Big)
      = I_d + zz^T Z^{-1} - \frac{zz^T Z^{-1} + z\,(z^T Z^{-1} z)\,z^T Z^{-1}}{1 + z^T Z^{-1} z}
      = I_d + zz^T Z^{-1} - \frac{(1 + z^T Z^{-1} z)\, zz^T Z^{-1}}{1 + z^T Z^{-1} z}
      = I_d,    (8.7)

as desired.

The previous example serves as a blueprint for the Recursive Least Squares (RLS) algorithm, which we will now develop in full. Consider a model for the observations $\{(x_t, y_t)\}_t \subset \mathbb{R}^d \times \mathbb{R}$ given as

    y_t = \theta_0^T x_t + e_t, \quad \forall t = 1, 2, \dots,    (8.8)

where $\theta_0 \in \mathbb{R}^d$ and the terms $\{e_t\}$ are the corresponding residuals. Chapter 2 then teaches us that the LS solution based on the observations $\{x_i : i = 1, \dots, t\}$ is given as the solution to the normal equations

    \Big(\sum_{i=1}^{t} x_i x_i^T\Big) \hat\theta_t = \sum_{i=1}^{t} y_i x_i.    (8.9)

Assume for now that the solution $\hat\theta_t$ is unique, i.e. the matrix $R_t = \sum_{i=1}^{t} x_i x_i^T$ can be inverted. Since trivially one has

    R_{t-1} = R_t - x_t x_t^T,    (8.10)

it follows that

    \hat\theta_t = R_t^{-1}\Big(\sum_{i=1}^{t-1} y_i x_i + y_t x_t\Big)
                 = R_t^{-1}\big(R_{t-1}\hat\theta_{t-1} + y_t x_t\big)
                 = \hat\theta_{t-1} + R_t^{-1}\big(y_t x_t - (x_t x_t^T)\hat\theta_{t-1}\big)
                 = \hat\theta_{t-1} + R_t^{-1} x_t \big(y_t - \hat\theta_{t-1}^T x_t\big),    (8.11)

and in summary

    \epsilon_t = y_t - x_t^T \hat\theta_{t-1}
    K_t = R_t^{-1} x_t                                  (8.12)
    \hat\theta_t = \hat\theta_{t-1} + K_t \epsilon_t.
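Both building blocks so far, the scalar recursion (8.3) of Example 50 and the matrix inversion lemma (8.6), are easy to check numerically. The following is a minimal NumPy sketch; the simulated data, seeds and dimensions are invented for illustration only.

```python
import numpy as np

# Example 50: the recursive mean (8.3) reproduces the batch sample mean (8.2).
rng = np.random.default_rng(0)
ys = 2.0 + rng.normal(size=1000)          # y_t = theta_0 + e_t with theta_0 = 2
theta, est = 0.0, []
for t, y in enumerate(ys, start=1):
    theta = theta + (y - theta) / t       # correction weighted by 1/t
    est.append(theta)
assert np.allclose(est, np.cumsum(ys) / np.arange(1, len(ys) + 1))

# Lemma 9: the rank-one update formula (8.6) agrees with a direct inverse.
d = 4
A = rng.normal(size=(d, d))
Z = A @ A.T + d * np.eye(d)               # positive definite matrix
z = rng.normal(size=(d, 1))
Zinv = np.linalg.inv(Z)
lemma = Zinv - (Zinv @ z @ z.T @ Zinv) / (1.0 + float(z.T @ Zinv @ z))
assert np.allclose(lemma, np.linalg.inv(Z + z @ z.T))
```

Note that the lemma replaces a fresh $O(d^3)$ inversion by a rank-one correction costing only $O(d^2)$ operations, which is exactly what makes the RLS recursion cheap per timestep.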

Here the term $\epsilon_t$ is interpreted as the prediction error: it is the difference between the observed sample $y_t$ and the predicted value $x_t^T \hat\theta_{t-1}$. If $\epsilon_t$ is small, the estimate $\hat\theta_{t-1}$ is good and should not be modified much. The gain $K_t$ is interpreted as a weighting, characterizing how much each element of the parameter vector $\hat\theta_{t-1}$ should be modified. The RLS algorithm is completed by circumventing the matrix inversion of $R_t$ at each timestep. Hereto, we can use the matrix inversion lemma:

    R_t^{-1} = R_{t-1}^{-1} - \frac{R_{t-1}^{-1} x_t x_t^T R_{t-1}^{-1}}{1 + x_t^T R_{t-1}^{-1} x_t}.    (8.13)

Note that as such we substitute the matrix inversion by a simple scalar division. Moreover,

    K_t = R_t^{-1} x_t = R_{t-1}^{-1} x_t - \frac{R_{t-1}^{-1} x_t (x_t^T R_{t-1}^{-1} x_t)}{1 + x_t^T R_{t-1}^{-1} x_t} = \frac{R_{t-1}^{-1} x_t}{1 + x_t^T R_{t-1}^{-1} x_t}.    (8.14)

Given initial values $P_0$ and $\hat\theta_0$, the final RLS algorithm can as such be written as

    \epsilon_t = y_t - x_t^T \hat\theta_{t-1}
    P_t = P_{t-1} - \frac{P_{t-1} x_t x_t^T P_{t-1}}{1 + x_t^T P_{t-1} x_t}
    K_t = P_t x_t = \frac{P_{t-1} x_t}{1 + x_t^T P_{t-1} x_t}                  (8.15)
    \hat\theta_t = \hat\theta_{t-1} + K_t \epsilon_t,

where we use $P_t = R_t^{-1}$ for any $t$. We will come back to the important issue of how to choose the initial values $P_0$ and $\hat\theta_0$ in the subsection on choosing initial values below.

Real-time Identification

This subsection presents some ideas which are useful when the RLS algorithm is applied for tracking time-varying parameters. This is for example the case when the true parameter vector varies over time, and is as such denoted as $\{\theta_{0,t}\}_t$. This setting is referred to as the real-time identification setting. There are two common approaches to modify the RLS algorithm to handle such a case: (i) use of a forgetting factor (this subsection); (ii) use of a Kalman filter as a parameter estimator (next subsection). The forgetting factor approach starts from a slightly modified loss function

    V_t(\theta) = \sum_{s=1}^{t} \lambda^{t-s} \big(y_s - \theta^T x_s\big)^2.    (8.16)

The squared loss function lying at the basis of RLS is recovered when $\lambda = 1$. If $\lambda$ is set to some value slightly smaller than $1$ (say $\lambda = 0.99$ or $\lambda = 0.95$), one has that for increasing $t$ past observations are discounted. The smaller $\lambda$ gets, the quicker information obtained from previous data will be forgotten; hence the name. It is now straightforward to re-derive the RLS based

on (8.16), and the modified RLS becomes:

    \epsilon_t = y_t - x_t^T \hat\theta_{t-1}
    P_t = \frac{1}{\lambda}\Big(P_{t-1} - \frac{P_{t-1} x_t x_t^T P_{t-1}}{\lambda + x_t^T P_{t-1} x_t}\Big)
    K_t = P_t x_t = \frac{P_{t-1} x_t}{\lambda + x_t^T P_{t-1} x_t}              (8.17)
    \hat\theta_t = \hat\theta_{t-1} + K_t \epsilon_t.

Example 51 (Estimator Windup). Often, some periods of the identification experiment exhibit poor excitation. This causes problems for the identification algorithms. Consider the situation where $x_t = 0$ in the RLS algorithm; then

    \hat\theta_t = \hat\theta_{t-1}
    P_t = P_{t-1}/\lambda.    (8.18)

Notice that $\hat\theta_t$ remains constant during this period, while $P_t$ increases exponentially with time when $\lambda < 1$. When the system is excited again ($x_t \neq 0$), the estimation gain $K_t$ will be very large, and there will be an abrupt change in the estimate, despite the fact that the system has not changed. This effect is referred to as estimator windup.

Since the study of Kalman filters will come back in some detail in later chapters, we treat the Kalman filter interpretation merely as an example here.

Example 52 (RLS as a Kalman Filter). A stochastic state-space system takes the form

    X_{t+1} = F_t X_t + V_t
    Y_t = H_t X_t + W_t        \forall t = 1, 2, 3, \dots,    (8.19)

where $\{X_t \in \mathbb{R}^n\}$ denote the stochastic states, $\{Y_t \in \mathbb{R}^m\}$ denote the observed outcomes, $\{V_t \in \mathbb{R}^n\}$ denote the process noise, $\{W_t \in \mathbb{R}^m\}$ denote the observation noise, and $\{F_t \in \mathbb{R}^{n \times n}\}$ and $\{H_t \in \mathbb{R}^{m \times n}\}$ are called the system matrices. Now it is easily seen that the problem of time-invariant RLS estimation can be written as

    \theta_{t+1} = \theta_t
    Y_t = x_t^T \theta_t + E_t        \forall t = 1, 2, 3, \dots,    (8.20)

where $\theta_1 = \theta_2 = \dots = \theta$ is the unknown state one wants to estimate based on the observations $\{Y_t\}$. Hence one can phrase the problem as a filtering problem, where the Kalman filter provides the optimal solution under appropriate assumptions, eventually reducing to (8.15). The benefit of this is that one can extend the model straightforwardly by including unknown process noise terms $\{V_t\}$, modeling the drifting true values as a random walk, thereby effectively approaching the real-time identification problem. Suppose $\{V_1, \dots, V_t, \dots\}$ are sampled independently from a Gaussian with mean zero and covariance $V \in \mathbb{R}^{n \times n}$; then the Kalman filter becomes

    \epsilon_t = y_t - x_t^T \hat\theta_{t-1}
    P_t = P_{t-1} - \frac{P_{t-1} x_t x_t^T P_{t-1}}{1 + x_t^T P_{t-1} x_t} + V    (8.21)
    K_t = P_t x_t
    \hat\theta_t = \hat\theta_{t-1} + K_t \epsilon_t.

Observe that both in case (8.17) and in case (8.21) the basic RLS algorithm is modified such that $P_t$ will no longer tend to zero. In this way $K_t$ is also prevented from decreasing to zero, and the parameter estimate can therefore keep changing continuously.

Choosing Initial Values

The choice of the initial values is paramount in real-life applications of RLS schemes. Close inspection of the meaning of $P_t$ helps us here. In the Kalman filter interpretation of RLS, $P_t$ plays the role of the covariance matrix of the estimate $\hat\theta_t$, as such suggesting that in case one is not at all certain of a certain choice of $\hat\theta_0$, one should take a large $P_0$; if one is fairly confident in a certain choice of $\hat\theta_0$, $P_0$ should be taken small. If $P_0$ is small, so will be $\{K_t\}_{t>0}$, and the estimates $\{\hat\theta_t\}_t$ will not change too much from $\hat\theta_0$. If $P_0$ is large, $\hat\theta_t$ will quickly jump away from $\hat\theta_0$. Without a priori knowledge, it is common practice to take the following initial values:

    \hat\theta_0 = 0_d
    P_0 = \rho I_d,    (8.22)

with $I_d = \mathrm{diag}(1, \dots, 1) \in \mathbb{R}^{d \times d}$ the identity matrix, and $\rho > 0$ a large number. The effect of the choice of the initial values (i.e. the transient behavior) can be derived algebraically. Consider the basic RLS algorithm (8.15), and set

    R_t = R_0 + \sum_{s=1}^{t} x_s x_s^T.    (8.23)

Now define

    z_t = R_t \hat\theta_t.    (8.24)

Then

    z_t = R_t \hat\theta_{t-1} + x_t \big(y_t - \hat\theta_{t-1}^T x_t\big) = z_{t-1} + x_t y_t,    (8.25)

and hence

    \hat\theta_t = \Big(R_0 + \sum_{s=1}^{t} x_s x_s^T\Big)^{-1} \Big(R_0 \hat\theta_0 + \sum_{s=1}^{t} x_s y_s\Big).    (8.26)
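The updates (8.15) and their forgetting-factor variant (8.17) fit in a few lines of code. Below is a sketch (assuming NumPy; the simulated system, noise level, seed and tolerances are invented for illustration), which also reproduces the windup effect of Example 51: with a zero regressor and $\lambda < 1$, the estimate freezes while $P_t$ grows as $P_{t-1}/\lambda$.

```python
import numpy as np

def rls_step(theta, P, x, y, lam=1.0):
    """One RLS update; lam = 1 gives (8.15), lam < 1 the forgetting version (8.17)."""
    x = x.reshape(-1, 1)
    eps = y - float(x.T @ theta)                  # prediction error
    denom = lam + float(x.T @ P @ x)
    K = P @ x / denom                             # gain, cf. (8.14)
    P = (P - (P @ x @ x.T @ P) / denom) / lam     # covariance update, cf. (8.13)
    return theta + K * eps, P

# convergence on a simulated linear system y_t = theta_0^T x_t + e_t
rng = np.random.default_rng(2)
n, d = 500, 3
theta_true = np.array([[1.0], [-2.0], [0.5]])
xs = rng.normal(size=(n, d))
ys = (xs @ theta_true).ravel() + 0.1 * rng.normal(size=n)
theta, P = np.zeros((d, 1)), 1e4 * np.eye(d)      # large P_0: little trust in theta_0
for x, y in zip(xs, ys):
    theta, P = rls_step(theta, P, x, y)
assert np.allclose(theta, theta_true, atol=0.05)

# estimator windup (Example 51): x_t = 0 freezes theta while P grows as P/lam
theta_w, P_w = np.array([[1.0]]), np.array([[1.0]])
for _ in range(50):
    theta_w, P_w = rls_step(theta_w, P_w, np.zeros(1), 0.0, lam=0.95)
assert float(theta_w) == 1.0 and float(P_w) > 10.0
```

Each update costs $O(d^2)$ and stores only $\hat\theta_t$ and $P_t$, independent of the history length, which is precisely the memory requirement stated in the introduction.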

So, if $R_0$ is small (i.e. $P_0$ is large), then $\hat\theta_t$ is close to the offline estimate

    \hat\theta_t^* = \arg\min_\theta \sum_{s=1}^{t} \big(y_s - \theta^T x_s\big)^2,    (8.27)

as seen by comparing (8.26) with the normal equations associated with (8.27),

    \hat\theta_t^* = \Big(\sum_{s=1}^{t} x_s x_s^T\Big)^{-1} \sum_{s=1}^{t} x_s y_s.    (8.28)

The methods discussed in the above subsections are appropriate for systems that are known to change slowly over time. In such cases $\lambda$ is chosen close to $1$, or $V$ is chosen as a small positive semi-definite matrix. If the system more likely exhibits occasional abrupt changes of the parameters, techniques based on fault detection might be more suitable.

An ODE Analysis

Simulation no doubt gives useful insight. However, it is also clear that it does not permit generally valid conclusions to be drawn, and therefore it is only a complement to theory. The scope of a theoretical derivation would in particular be to study whether the parameter estimates $\hat\theta_t$ converge as $t$ tends to infinity. If so, to what limit? And, if possible, also to establish the limiting distribution of $\hat\theta_t$. A successful approach considers the sequence $\{\hat\theta_t\}_{t=0,1,2,\dots}$ as approximating a continuous vector-valued function $\theta : \mathbb{R}^+ \to \mathbb{R}^d$. This continuous function evaluated at a time instant $\tau > 0$ is denoted as $\theta(\tau)$, and the whole sequence is described by an ordinary differential equation (ODE). Such an approach typically adopts a stochastic setting where $\{Y_t\}$ is a stochastic process and $\{X_t\}$ is a vector-valued stochastic process, both with bounded first and second moments. Recall that the minimal MMSE parameter $\theta^* \in \mathbb{R}^d$ is then given as the solution $\theta$ to

    \mathbb{E}\big[X_t X_t^T\big]\, \theta = \mathbb{E}\big[X_t Y_t\big].    (8.29)

Define again $R = \mathbb{E}[X_t X_t^T]$, and suppose it is invertible. Define the functional $r$ (recall that $\theta$ here is a function) as

    r(\theta) = \mathbb{E}\big[X_t \big(Y_t - X_t^T \theta\big)\big].    (8.30)

Now consider the following ODE:

    \frac{d\theta(\tau)}{d\tau} = R^{-1} r(\theta(\tau)).    (8.31)

If this ODE is solved numerically by an Euler method on discretization steps $\tau_1, \tau_2, \dots$, one gets

    \theta_{k+1} = \theta_k + (\tau_{k+1} - \tau_k)\, R^{-1}\, \mathbb{E}\big[X_t \big(Y_t - X_t^T \theta_k\big)\big].    (8.32)

Note the similarity between (8.32) and the algorithm (8.15), suggesting that the solutions of the deterministic ODE will be close in some sense.
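The transient formula (8.26) and its limit (8.28) can be verified directly against the recursion. The sketch below (data and seed invented for illustration) checks that RLS started from $(\hat\theta_0, P_0 = R_0^{-1})$ matches the closed form exactly, and that a very small $R_0$ indeed drives the recursion to the offline LS estimate.

```python
import numpy as np

def rls_step(theta, P, x, y):
    """Basic RLS update (8.15)."""
    x = x.reshape(-1, 1)
    denom = 1.0 + float(x.T @ P @ x)
    K = P @ x / denom
    P = P - (P @ x @ x.T @ P) / denom
    return theta + K * (y - float(x.T @ theta)), P

rng = np.random.default_rng(4)
n, d = 30, 2
xs = rng.normal(size=(n, d))
ys = xs @ np.array([1.0, -1.0]) + 0.1 * rng.normal(size=n)

theta0 = np.array([[0.5], [0.5]])
R0 = 0.1 * np.eye(d)                          # R_0 = P_0^{-1}
theta, P = theta0.copy(), np.linalg.inv(R0)
for x, y in zip(xs, ys):
    theta, P = rls_step(theta, P, x, y)

# closed form (8.26): (R_0 + sum x x^T)^{-1} (R_0 theta_0 + sum x y)
closed = np.linalg.solve(R0 + xs.T @ xs, R0 @ theta0 + (xs.T @ ys).reshape(-1, 1))
assert np.allclose(theta, closed)

# small R_0 (large P_0) brings the recursion close to the offline estimate (8.28)
offline = np.linalg.solve(xs.T @ xs, (xs.T @ ys).reshape(-1, 1))
theta, P = theta0.copy(), 1e8 * np.eye(d)     # i.e. R_0 of order 1e-8 I_d
for x, y in zip(xs, ys):
    theta, P = rls_step(theta, P, x, y)
assert np.allclose(theta, offline, atol=1e-4)
```

Read differently, (8.26) shows that $R_0$ acts as a ridge-style regularizer pulling the estimate towards $\hat\theta_0$, which vanishes as data accumulates.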
Specifically, consider the following recursion described by the algorithm

    \hat\theta_t = \hat\theta_{t-1} + \frac{1}{t}\, R^{-1} X_t \big(Y_t - X_t^T \hat\theta_{t-1}\big),    (8.33)

with $\hat\theta_0$ given. Then the paths described by this discrete recursion will be similar to the solutions $\{\theta(\tau_k)\}_k$ using the timescale $\tau_k - \tau_{k-1} = 1/k$. The above statements can be made quite precise, as was done in [?]. The study of this ODE gives us new insight into the RLS algorithm, including:

1. The trajectories which solve the ODE are the expected paths of the algorithm.
2. Assume that there is a positive function $V(\theta, R)$ such that along the solutions of the ODE we have $\frac{d}{d\tau} V(\theta(\tau), R) \le 0$. Then, as $\tau \to \infty$, $\theta(\tau)$ tends either to $D_c = \{\theta : \frac{d}{d\tau} V(\theta, R) = 0\}$ or to the boundary of the set of feasible solutions. In other words, for $\tau \to \infty$ the trajectories $\theta(\tau)$ go to the stable stationary points of the ODE. Equivalently, the estimates $\hat\theta_t$ converge locally to a solution in $D_c$.

8.2 Other Algorithms

Since the problem of recursive identification, adaptive filtering or online estimation is so ubiquitous, it comes as no surprise that many different approaches exist. This section reviews some common variations.

Recursive Instrumental Variables

Recall that instrumental variable techniques come into the picture when the noise is known to be strongly colored, so that a plain LSE is not consistent. An instrumental variable estimator uses random instruments $\{Z_t\}$ which are known to be independent of the noise of the system. Then we look for the parameters which match this property using the sample correlations instead. Formally, consider the statistical system

    Y_t = x_t^T \theta_0 + D_t,    (8.35)

where $\{D_t\}$ is colored noise, $x_t \in \mathbb{R}^d$, and $\theta_0$ is a deterministic but unknown vector. Suppose we have $d$-dimensional instruments $Z_t$ such that

    \mathbb{E}[Z_t D_t] = 0_d.    (8.36)

That is, the instruments are orthogonal to the noise. Then the (batch) IV estimator $\theta_n$ is given as the solution in $\theta \in \mathbb{R}^d$ of

    \sum_{t=1}^{n} Z_t \big(Y_t - x_t^T \theta\big) = 0_d,    (8.37)

which looks similar to the normal equations. If $\sum_{t=1}^{n} Z_t x_t^T$ were invertible, then the solution is unique and can be written as

    \theta_n = \Big(\sum_{t=1}^{n} Z_t x_t^T\Big)^{-1} \sum_{t=1}^{n} Z_t Y_t.    (8.38)

Now we can use the techniques developed for RLS to construct a recursive method to estimate $\theta$ as the data comes in. It is a simple exercise to derive the algorithm, which is given as

    \epsilon_t = y_t - x_t^T \hat\theta_{t-1}
    P_t = P_{t-1} - \frac{P_{t-1} Z_t x_t^T P_{t-1}}{1 + x_t^T P_{t-1} Z_t}    (8.39)
    K_t = P_t Z_t
    \hat\theta_t = \hat\theta_{t-1} + K_t \epsilon_t.
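A minimal sketch of the recursive IV updates (8.39) follows, on invented scalar data where the regressor is correlated with the noise (so plain LS is biased) while the instrument is not. All names, seeds and tolerances are for illustration only.

```python
import numpy as np

def riv_step(theta, P, z, x, y):
    """One recursive IV update (8.39): instruments z_t replace x_t in gain and P."""
    z, x = z.reshape(-1, 1), x.reshape(-1, 1)
    eps = y - float(x.T @ theta)
    denom = 1.0 + float(x.T @ P @ z)
    K = P @ z / denom
    P = P - (P @ z @ x.T @ P) / denom
    return theta + K * eps, P

# regressor correlated with the (colored) noise: LS is biased, IV is not
rng = np.random.default_rng(5)
n = 5000
d_noise = rng.normal(size=n)              # noise entering both y and the regressor
w = rng.normal(size=n)                    # instrument: independent of the noise
x = w + 0.5 * d_noise
y = 2.0 * x + d_noise                     # theta_0 = 2

theta, P = np.zeros((1, 1)), 1e4 * np.eye(1)
for zt, xt, yt in zip(w, x, y):
    theta, P = riv_step(theta, P, np.array([zt]), np.array([xt]), yt)

theta_ls = np.sum(x * y) / np.sum(x * x)
assert abs(float(theta) - 2.0) < 0.05     # IV estimate is consistent
assert theta_ls - 2.0 > 0.2               # plain LS is biased upwards here
```

Note that, unlike for RLS, the matrix $\sum_t Z_t x_t^T$ is not guaranteed to be positive definite along the way, so the transient of this recursion can be less well-behaved than that of (8.15).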

The discussion on the behavior of RLS w.r.t. initial values and the forgetting factor remains valid here.

Recursive Prediction Error Method

Recall that a PEM bases inference on maximizing the performance of the best predictor corresponding to a model. This technique too is straightforward to phrase in a recursive form. Consider

    \hat\theta_t = \arg\min_\theta V_t(\theta) = \sum_{k=1}^{t} \lambda^{t-k} \epsilon_k^2(\theta),    (8.40)

where $0 < \lambda \le 1$ is typically chosen as $0.99$, $0.95$ or $0.9$. As before, $\epsilon_t(\theta)$ denotes the prediction error corresponding to model parameters $\theta$, that is, $\epsilon_t(\theta) = y_t - \hat y_t(\theta)$, where $\hat y_t(\theta)$ is the optimal predictor at the $t$-th instance. Now, unlike for the previous algorithms, no closed-form solution of (8.40) exists in general, and one resorts to numerical optimization tools. But there is an opportunity here: it is not too difficult to integrate, say, a Gauss-Newton step in the optimizer with the online protocol. To see how this goes, consider the second-order Taylor decomposition of the loss function around a fairly good estimate $\hat\theta_{t-1}$ obtained at the previous instance:

    V_t(\theta) = V_t(\hat\theta_{t-1}) + V_t'(\hat\theta_{t-1})^T \big(\theta - \hat\theta_{t-1}\big) + \frac{1}{2} \big(\theta - \hat\theta_{t-1}\big)^T V_t''(\hat\theta_{t-1}) \big(\theta - \hat\theta_{t-1}\big).    (8.41)

Now the challenge is to compute the gradient $V_t'$ and the Hessian $V_t''$ recursively. Details can be found in the book (Söderström and Stoica, 1989), but they are necessarily tied to the adopted model and are often approximative in nature.

Recursive Pseudo-linear Least Squares

The following example expresses an ARMAX model as a pseudo-linear model.

Example 53 (ARMAX). Given an ARMAX system

    A(q^{-1})\, y_t = B(q^{-1})\, u_t + C(q^{-1})\, e_t,    (8.42)

of orders $n_a$, $n_b$, $n_c$. This system can almost be written as a linear-in-the-parameters model as follows:

    y_t = \varphi_t^T \theta_0 + e_t,    (8.43)

where

    \varphi_t = \big(-y_{t-1}, \dots, -y_{t-n_a},\; u_{t-1}, \dots, u_{t-n_b},\; \hat e_{t-1}, \dots, \hat e_{t-n_c}\big)^T    (8.44)
    \theta_0 = \big(a_1, \dots, a_{n_a},\; b_1, \dots, b_{n_b},\; c_1, \dots, c_{n_c}\big)^T,

and $\hat e_t$ is the prediction error computed based on the model parameters $\hat\theta_t$. The rationale is that in case $\hat\theta_t$ approaches $\theta_0$, $\hat e_t$ is a good proxy for the prediction errors $e_t$ based on the true parameters. The Recursive Pseudo-linear Least Squares algorithm then implements an RLS strategy based on this pseudo-linearized model. Indeed, one can prove that the resulting estimates do converge if the system obeys some regularity conditions.
Specifically, if the system is almost unstable, the recursive estimates are often unstable (and diverging) as well. In practice, the resulting algorithm needs monitoring of the estimates in order to detect such divergent behavior.
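The pseudo-linear idea of Example 53 can be sketched for the simplest case $n_a = n_b = n_c = 1$: run RLS on the regressor (8.44), with the past estimated prediction errors $\hat e_{t-1}$ standing in for the unobserved $e_{t-1}$. The simulated system, orders, seed and tolerances below are invented for illustration.

```python
import numpy as np

def rpls_armax(ys, us):
    """Recursive pseudo-linear LS for a first-order ARMAX model
    y_t = -a y_{t-1} + b u_{t-1} + e_t + c e_{t-1}: RLS on regressor (8.44)
    built from past *estimated* prediction errors."""
    theta, P = np.zeros((3, 1)), 1e3 * np.eye(3)       # theta = (a, b, c)
    y_prev = u_prev = ehat_prev = 0.0
    for y, u in zip(ys, us):
        phi = np.array([[-y_prev], [u_prev], [ehat_prev]])
        eps = y - float(phi.T @ theta)
        denom = 1.0 + float(phi.T @ P @ phi)
        K = P @ phi / denom
        P = P - (P @ phi @ phi.T @ P) / denom
        theta = theta + K * eps
        ehat_prev = y - float(phi.T @ theta)           # residual with updated theta
        y_prev, u_prev = y, u
    return theta.ravel()

# simulate an ARMAX system with a = 0.5, b = 1.0, c = 0.3
rng = np.random.default_rng(6)
n = 20000
u = rng.normal(size=n)
e = 0.1 * rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = -0.5 * y[t - 1] + u[t - 1] + e[t] + 0.3 * e[t - 1]
a, b, c = rpls_armax(y, u)
assert abs(a - 0.5) < 0.1 and abs(b - 1.0) < 0.1
```

In line with the monitoring advice above, a practical implementation would additionally watch the estimates for divergence, since convergence is only guaranteed under regularity conditions on the system.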

Stochastic Approximation

The class of stochastic approximation techniques takes a quite different perspective on the recursive identification problem. Here the parameter estimate $\hat\theta_{t-1}$ obtained previously is modified such that it is better suited for explaining the new sample $(x_t, y_t)$. Formally, a new estimate $\hat\theta_t$ is obtained from the given $\hat\theta_{t-1}$ and the sample $(x_t, y_t)$ by solving, for a given $\gamma_t > 0$, the optimization problem

    \hat\theta_t = \arg\min_\theta J_t(\theta) = \big(\theta - \hat\theta_{t-1}\big)^T \big(\theta - \hat\theta_{t-1}\big) + \gamma_t \big(x_t^T \theta - y_t\big)^2.    (8.45)

The result is then given directly as

    \hat\theta_t = \hat\theta_{t-1} - \gamma_t \big(x_t^T \hat\theta_{t-1} - y_t\big)\, x_t,    (8.46)

obtained by equating the derivative of $J_t(\theta)$ to zero (approximately, evaluating the gradient of the second term at the previous estimate $\hat\theta_{t-1}$). The algorithm is then completed by the specification of the initial estimate $\hat\theta_0$. This recursion gives what is called the Least Mean Squares (LMS) algorithm, which is the building block of many implementations of adaptive filtering. The naming convention "stochastic approximation" is motivated as follows: the correction at instance $t$ is based on the gradient at a single point $(x_t, y_t)$, which is a very noisy estimate of the overall gradient. A variation of this algorithm is given by the recursion

    \hat\theta_t = \hat\theta_{t-1} + \frac{\gamma}{\|x_t\|_2^2 + \delta}\, x_t \big(y_t - x_t^T \hat\theta_{t-1}\big),    (8.47)

with $\gamma > 0$ small, $\delta > 0$ a small constant preventing division by zero, and where $\hat\theta_0$ is given. This recursion is the basis of the Normalized LMS algorithm. The rationale is that here each sample modifies the present estimate in proportion to how close $x_t$ is to the working point $0_d$.

8.3 Model Selection

As in the batch setting, it is paramount to be able to qualify and quantify how well our recursive algorithms succeed in their task. But the conceptual and practical ways to do so turn out to be entirely different. As it stands, there is no comprehensive theoretical framework for this question, but some insight is gained in the following example.

Example 54 (Predicting Random Noise). As seen, a lot of fancy mathematics can be brought in to form complex recursive schemes, but at the end of the day the methods implemented merely need to make good predictions. It helps to reason about this objective by considering the prediction of random white noise: by construction, it is impossible to do better than $\hat y_t = 0$ (why?).
A method trying to fit a complex model to such data will necessarily do worse than this simple predictor, and the example is often used as a validity check of a new method.

Besides the traditional considerations of the bias and variance of a model, and the statistical uncertainty associated with estimating parameters, other issues include the following:

- Initialization of the parameters. If the initial guess of the parameters is not adequate, the recursive algorithm might take many samples before correcting this (transient effect).
- Forgetting factor. The choice of a forgetting factor makes a trade-off between flexibility and accuracy.

- Window. If a window is used for estimation, then one must decide how many samples are used for estimating at a certain instance.
- Stability of the estimate. If the algorithm at hand is not well-tuned to the task at hand, it may display diverging estimates. This is clearly undesirable, and some algorithms come with guarantees that no such unstable behavior can occur.
- Gain. A typical parameter which needs to be tuned concerns the size of the update made at a new sample. If the gain is too low, the resulting algorithm will converge only slowly. If the gain is too large, one risks unstable behavior.

In order to check whether a recursive identification method is well-tuned for a certain application, it is instrumental to monitor closely the online behavior of the method, and to make appropriate graphical illustrations of its behavior.
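To close, the Normalized LMS recursion (8.47) from Section 8.2 illustrates both a gain choice and the monitoring advice above: tracking the stream of prediction errors online is the simplest diagnostic of whether an adaptive algorithm is well-tuned. The following sketch (data, seed, step size and thresholds invented for illustration) records the errors while adapting and checks that their power settles near the noise floor.

```python
import numpy as np

def nlms(xs, ys, gamma=0.5, delta=1e-6):
    """Normalized LMS (8.47); returns the estimate and per-sample prediction errors."""
    theta = np.zeros(xs.shape[1])
    errors = []
    for x, y in zip(xs, ys):
        eps = y - x @ theta                          # prediction error
        theta = theta + gamma * eps * x / (x @ x + delta)
        errors.append(eps)
    return theta, np.array(errors)

rng = np.random.default_rng(7)
n, d = 5000, 4
theta_true = np.array([1.0, -0.5, 0.25, 2.0])
xs = rng.normal(size=(n, d))
ys = xs @ theta_true + 0.05 * rng.normal(size=n)
theta, errors = nlms(xs, ys)
assert np.allclose(theta, theta_true, atol=0.1)

# online monitoring: after the transient, the error power sits near the noise floor
assert np.mean(errors[-500:] ** 2) < 0.01
assert np.mean(errors[:20] ** 2) > np.mean(errors[-500:] ** 2)
```

In a real application one would plot such error traces (and the parameter trajectories) over time; a growing error power or drifting estimates is the typical first symptom of a too-large gain or an unstable configuration.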


More information

Hamilton- J acobi Equation: Weak S olution We continue the study of the Hamilton-Jacobi equation:

Hamilton- J acobi Equation: Weak S olution We continue the study of the Hamilton-Jacobi equation: M ah 5 7 Fall 9 L ecure O c. 4, 9 ) Hamilon- J acobi Equaion: Weak S oluion We coninue he sudy of he Hamilon-Jacobi equaion: We have shown ha u + H D u) = R n, ) ; u = g R n { = }. ). In general we canno

More information

Georey E. Hinton. University oftoronto. Technical Report CRG-TR February 22, Abstract

Georey E. Hinton. University oftoronto.   Technical Report CRG-TR February 22, Abstract Parameer Esimaion for Linear Dynamical Sysems Zoubin Ghahramani Georey E. Hinon Deparmen of Compuer Science Universiy oftorono 6 King's College Road Torono, Canada M5S A4 Email: zoubin@cs.orono.edu Technical

More information

ACE 562 Fall Lecture 5: The Simple Linear Regression Model: Sampling Properties of the Least Squares Estimators. by Professor Scott H.

ACE 562 Fall Lecture 5: The Simple Linear Regression Model: Sampling Properties of the Least Squares Estimators. by Professor Scott H. ACE 56 Fall 005 Lecure 5: he Simple Linear Regression Model: Sampling Properies of he Leas Squares Esimaors by Professor Sco H. Irwin Required Reading: Griffihs, Hill and Judge. "Inference in he Simple

More information

MATH 4330/5330, Fourier Analysis Section 6, Proof of Fourier s Theorem for Pointwise Convergence

MATH 4330/5330, Fourier Analysis Section 6, Proof of Fourier s Theorem for Pointwise Convergence MATH 433/533, Fourier Analysis Secion 6, Proof of Fourier s Theorem for Poinwise Convergence Firs, some commens abou inegraing periodic funcions. If g is a periodic funcion, g(x + ) g(x) for all real x,

More information

Modal identification of structures from roving input data by means of maximum likelihood estimation of the state space model

Modal identification of structures from roving input data by means of maximum likelihood estimation of the state space model Modal idenificaion of srucures from roving inpu daa by means of maximum likelihood esimaion of he sae space model J. Cara, J. Juan, E. Alarcón Absrac The usual way o perform a forced vibraion es is o fix

More information

SOLUTIONS TO ECE 3084

SOLUTIONS TO ECE 3084 SOLUTIONS TO ECE 384 PROBLEM 2.. For each sysem below, specify wheher or no i is: (i) memoryless; (ii) causal; (iii) inverible; (iv) linear; (v) ime invarian; Explain your reasoning. If he propery is no

More information

Solutions Problem Set 3 Macro II (14.452)

Solutions Problem Set 3 Macro II (14.452) Soluions Problem Se 3 Macro II (14.452) Francisco A. Gallego 04/27/2005 1 Q heory of invesmen in coninuous ime and no uncerainy Consider he in nie horizon model of a rm facing adjusmen coss o invesmen.

More information

Ordinary dierential equations

Ordinary dierential equations Chaper 5 Ordinary dierenial equaions Conens 5.1 Iniial value problem........................... 31 5. Forward Euler's mehod......................... 3 5.3 Runge-Kua mehods.......................... 36

More information

Matrix Versions of Some Refinements of the Arithmetic-Geometric Mean Inequality

Matrix Versions of Some Refinements of the Arithmetic-Geometric Mean Inequality Marix Versions of Some Refinemens of he Arihmeic-Geomeric Mean Inequaliy Bao Qi Feng and Andrew Tonge Absrac. We esablish marix versions of refinemens due o Alzer ], Carwrigh and Field 4], and Mercer 5]

More information

Generalized Least Squares

Generalized Least Squares Generalized Leas Squares Augus 006 1 Modified Model Original assumpions: 1 Specificaion: y = Xβ + ε (1) Eε =0 3 EX 0 ε =0 4 Eεε 0 = σ I In his secion, we consider relaxing assumpion (4) Insead, assume

More information

A Specification Test for Linear Dynamic Stochastic General Equilibrium Models

A Specification Test for Linear Dynamic Stochastic General Equilibrium Models Journal of Saisical and Economeric Mehods, vol.1, no.2, 2012, 65-70 ISSN: 2241-0384 (prin), 2241-0376 (online) Scienpress Ld, 2012 A Specificaion Tes for Linear Dynamic Sochasic General Equilibrium Models

More information

Lecture 20: Riccati Equations and Least Squares Feedback Control

Lecture 20: Riccati Equations and Least Squares Feedback Control 34-5 LINEAR SYSTEMS Lecure : Riccai Equaions and Leas Squares Feedback Conrol 5.6.4 Sae Feedback via Riccai Equaions A recursive approach in generaing he marix-valued funcion W ( ) equaion for i for he

More information

Inventory Analysis and Management. Multi-Period Stochastic Models: Optimality of (s, S) Policy for K-Convex Objective Functions

Inventory Analysis and Management. Multi-Period Stochastic Models: Optimality of (s, S) Policy for K-Convex Objective Functions Muli-Period Sochasic Models: Opimali of (s, S) Polic for -Convex Objecive Funcions Consider a seing similar o he N-sage newsvendor problem excep ha now here is a fixed re-ordering cos (> 0) for each (re-)order.

More information

Predator - Prey Model Trajectories and the nonlinear conservation law

Predator - Prey Model Trajectories and the nonlinear conservation law Predaor - Prey Model Trajecories and he nonlinear conservaion law James K. Peerson Deparmen of Biological Sciences and Deparmen of Mahemaical Sciences Clemson Universiy Ocober 28, 213 Ouline Drawing Trajecories

More information

ACE 562 Fall Lecture 8: The Simple Linear Regression Model: R 2, Reporting the Results and Prediction. by Professor Scott H.

ACE 562 Fall Lecture 8: The Simple Linear Regression Model: R 2, Reporting the Results and Prediction. by Professor Scott H. ACE 56 Fall 5 Lecure 8: The Simple Linear Regression Model: R, Reporing he Resuls and Predicion by Professor Sco H. Irwin Required Readings: Griffihs, Hill and Judge. "Explaining Variaion in he Dependen

More information

12: AUTOREGRESSIVE AND MOVING AVERAGE PROCESSES IN DISCRETE TIME. Σ j =

12: AUTOREGRESSIVE AND MOVING AVERAGE PROCESSES IN DISCRETE TIME. Σ j = 1: AUTOREGRESSIVE AND MOVING AVERAGE PROCESSES IN DISCRETE TIME Moving Averages Recall ha a whie noise process is a series { } = having variance σ. The whie noise process has specral densiy f (λ) = of

More information

III. Module 3. Empirical and Theoretical Techniques

III. Module 3. Empirical and Theoretical Techniques III. Module 3. Empirical and Theoreical Techniques Applied Saisical Techniques 3. Auocorrelaion Correcions Persisence affecs sandard errors. The radiional response is o rea he auocorrelaion as a echnical

More information

Linear Response Theory: The connection between QFT and experiments

Linear Response Theory: The connection between QFT and experiments Phys540.nb 39 3 Linear Response Theory: The connecion beween QFT and experimens 3.1. Basic conceps and ideas Q: How do we measure he conduciviy of a meal? A: we firs inroduce a weak elecric field E, and

More information

Introduction D P. r = constant discount rate, g = Gordon Model (1962): constant dividend growth rate.

Introduction D P. r = constant discount rate, g = Gordon Model (1962): constant dividend growth rate. Inroducion Gordon Model (1962): D P = r g r = consan discoun rae, g = consan dividend growh rae. If raional expecaions of fuure discoun raes and dividend growh vary over ime, so should he D/P raio. Since

More information

Lecture 10 Estimating Nonlinear Regression Models

Lecture 10 Estimating Nonlinear Regression Models Lecure 0 Esimaing Nonlinear Regression Models References: Greene, Economeric Analysis, Chaper 0 Consider he following regression model: y = f(x, β) + ε =,, x is kx for each, β is an rxconsan vecor, ε is

More information

Article from. Predictive Analytics and Futurism. July 2016 Issue 13

Article from. Predictive Analytics and Futurism. July 2016 Issue 13 Aricle from Predicive Analyics and Fuurism July 6 Issue An Inroducion o Incremenal Learning By Qiang Wu and Dave Snell Machine learning provides useful ools for predicive analyics The ypical machine learning

More information

This document was generated at 1:04 PM, 09/10/13 Copyright 2013 Richard T. Woodward. 4. End points and transversality conditions AGEC

This document was generated at 1:04 PM, 09/10/13 Copyright 2013 Richard T. Woodward. 4. End points and transversality conditions AGEC his documen was generaed a 1:4 PM, 9/1/13 Copyrigh 213 Richard. Woodward 4. End poins and ransversaliy condiions AGEC 637-213 F z d Recall from Lecure 3 ha a ypical opimal conrol problem is o maimize (,,

More information

Application of a Stochastic-Fuzzy Approach to Modeling Optimal Discrete Time Dynamical Systems by Using Large Scale Data Processing

Application of a Stochastic-Fuzzy Approach to Modeling Optimal Discrete Time Dynamical Systems by Using Large Scale Data Processing Applicaion of a Sochasic-Fuzzy Approach o Modeling Opimal Discree Time Dynamical Sysems by Using Large Scale Daa Processing AA WALASZE-BABISZEWSA Deparmen of Compuer Engineering Opole Universiy of Technology

More information

Essential Microeconomics : OPTIMAL CONTROL 1. Consider the following class of optimization problems

Essential Microeconomics : OPTIMAL CONTROL 1. Consider the following class of optimization problems Essenial Microeconomics -- 6.5: OPIMAL CONROL Consider he following class of opimizaion problems Max{ U( k, x) + U+ ( k+ ) k+ k F( k, x)}. { x, k+ } = In he language of conrol heory, he vecor k is he vecor

More information

Lecture 1 Overview. course mechanics. outline & topics. what is a linear dynamical system? why study linear systems? some examples

Lecture 1 Overview. course mechanics. outline & topics. what is a linear dynamical system? why study linear systems? some examples EE263 Auumn 27-8 Sephen Boyd Lecure 1 Overview course mechanics ouline & opics wha is a linear dynamical sysem? why sudy linear sysems? some examples 1 1 Course mechanics all class info, lecures, homeworks,

More information

Tracking. Announcements

Tracking. Announcements Tracking Tuesday, Nov 24 Krisen Grauman UT Ausin Announcemens Pse 5 ou onigh, due 12/4 Shorer assignmen Auo exension il 12/8 I will no hold office hours omorrow 5 6 pm due o Thanksgiving 1 Las ime: Moion

More information

Most Probable Phase Portraits of Stochastic Differential Equations and Its Numerical Simulation

Most Probable Phase Portraits of Stochastic Differential Equations and Its Numerical Simulation Mos Probable Phase Porrais of Sochasic Differenial Equaions and Is Numerical Simulaion Bing Yang, Zhu Zeng and Ling Wang 3 School of Mahemaics and Saisics, Huazhong Universiy of Science and Technology,

More information

Some Basic Information about M-S-D Systems

Some Basic Information about M-S-D Systems Some Basic Informaion abou M-S-D Sysems 1 Inroducion We wan o give some summary of he facs concerning unforced (homogeneous) and forced (non-homogeneous) models for linear oscillaors governed by second-order,

More information

CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK

CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK 175 CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK 10.1 INTRODUCTION Amongs he research work performed, he bes resuls of experimenal work are validaed wih Arificial Neural Nework. From he

More information

Testing the Random Walk Model. i.i.d. ( ) r

Testing the Random Walk Model. i.i.d. ( ) r he random walk heory saes: esing he Random Walk Model µ ε () np = + np + Momen Condiions where where ε ~ i.i.d he idea here is o es direcly he resricions imposed by momen condiions. lnp lnp µ ( lnp lnp

More information

18 Biological models with discrete time

18 Biological models with discrete time 8 Biological models wih discree ime The mos imporan applicaions, however, may be pedagogical. The elegan body of mahemaical heory peraining o linear sysems (Fourier analysis, orhogonal funcions, and so

More information

Lecture 2-1 Kinematics in One Dimension Displacement, Velocity and Acceleration Everything in the world is moving. Nothing stays still.

Lecture 2-1 Kinematics in One Dimension Displacement, Velocity and Acceleration Everything in the world is moving. Nothing stays still. Lecure - Kinemaics in One Dimension Displacemen, Velociy and Acceleraion Everyhing in he world is moving. Nohing says sill. Moion occurs a all scales of he universe, saring from he moion of elecrons in

More information

CONTROL SYSTEMS, ROBOTICS AND AUTOMATION Vol. XI Control of Stochastic Systems - P.R. Kumar

CONTROL SYSTEMS, ROBOTICS AND AUTOMATION Vol. XI Control of Stochastic Systems - P.R. Kumar CONROL OF SOCHASIC SYSEMS P.R. Kumar Deparmen of Elecrical and Compuer Engineering, and Coordinaed Science Laboraory, Universiy of Illinois, Urbana-Champaign, USA. Keywords: Markov chains, ransiion probabiliies,

More information

Linear Gaussian State Space Models

Linear Gaussian State Space Models Linear Gaussian Sae Space Models Srucural Time Series Models Level and Trend Models Basic Srucural Model (BSM Dynamic Linear Models Sae Space Model Represenaion Level, Trend, and Seasonal Models Time Varying

More information

20. Applications of the Genetic-Drift Model

20. Applications of the Genetic-Drift Model 0. Applicaions of he Geneic-Drif Model 1) Deermining he probabiliy of forming any paricular combinaion of genoypes in he nex generaion: Example: If he parenal allele frequencies are p 0 = 0.35 and q 0

More information

Ordinary Differential Equations

Ordinary Differential Equations Ordinary Differenial Equaions 5. Examples of linear differenial equaions and heir applicaions We consider some examples of sysems of linear differenial equaions wih consan coefficiens y = a y +... + a

More information

Notes for Lecture 17-18

Notes for Lecture 17-18 U.C. Berkeley CS278: Compuaional Complexiy Handou N7-8 Professor Luca Trevisan April 3-8, 2008 Noes for Lecure 7-8 In hese wo lecures we prove he firs half of he PCP Theorem, he Amplificaion Lemma, up

More information

hen found from Bayes rule. Specically, he prior disribuion is given by p( ) = N( ; ^ ; r ) (.3) where r is he prior variance (we add on he random drif

hen found from Bayes rule. Specically, he prior disribuion is given by p( ) = N( ; ^ ; r ) (.3) where r is he prior variance (we add on he random drif Chaper Kalman Filers. Inroducion We describe Bayesian Learning for sequenial esimaion of parameers (eg. means, AR coeciens). The updae procedures are known as Kalman Filers. We show how Dynamic Linear

More information

Math 2142 Exam 1 Review Problems. x 2 + f (0) 3! for the 3rd Taylor polynomial at x = 0. To calculate the various quantities:

Math 2142 Exam 1 Review Problems. x 2 + f (0) 3! for the 3rd Taylor polynomial at x = 0. To calculate the various quantities: Mah 4 Eam Review Problems Problem. Calculae he 3rd Taylor polynomial for arcsin a =. Soluion. Le f() = arcsin. For his problem, we use he formula f() + f () + f ()! + f () 3! for he 3rd Taylor polynomial

More information

Advanced Organic Chemistry

Advanced Organic Chemistry Lalic, G. Chem 53A Chemisry 53A Advanced Organic Chemisry Lecure noes 1 Kineics: A racical Approach Simple Kineics Scenarios Fiing Experimenal Daa Using Kineics o Deermine he Mechanism Doughery, D. A.,

More information

Multi-scale 2D acoustic full waveform inversion with high frequency impulsive source

Multi-scale 2D acoustic full waveform inversion with high frequency impulsive source Muli-scale D acousic full waveform inversion wih high frequency impulsive source Vladimir N Zubov*, Universiy of Calgary, Calgary AB vzubov@ucalgaryca and Michael P Lamoureux, Universiy of Calgary, Calgary

More information

3.1.3 INTRODUCTION TO DYNAMIC OPTIMIZATION: DISCRETE TIME PROBLEMS. A. The Hamiltonian and First-Order Conditions in a Finite Time Horizon

3.1.3 INTRODUCTION TO DYNAMIC OPTIMIZATION: DISCRETE TIME PROBLEMS. A. The Hamiltonian and First-Order Conditions in a Finite Time Horizon 3..3 INRODUCION O DYNAMIC OPIMIZAION: DISCREE IME PROBLEMS A. he Hamilonian and Firs-Order Condiions in a Finie ime Horizon Define a new funcion, he Hamilonian funcion, H. H he change in he oal value of

More information

MATH 5720: Gradient Methods Hung Phan, UMass Lowell October 4, 2018

MATH 5720: Gradient Methods Hung Phan, UMass Lowell October 4, 2018 MATH 5720: Gradien Mehods Hung Phan, UMass Lowell Ocober 4, 208 Descen Direcion Mehods Consider he problem min { f(x) x R n}. The general descen direcions mehod is x k+ = x k + k d k where x k is he curren

More information

Bias in Conditional and Unconditional Fixed Effects Logit Estimation: a Correction * Tom Coupé

Bias in Conditional and Unconditional Fixed Effects Logit Estimation: a Correction * Tom Coupé Bias in Condiional and Uncondiional Fixed Effecs Logi Esimaion: a Correcion * Tom Coupé Economics Educaion and Research Consorium, Naional Universiy of Kyiv Mohyla Academy Address: Vul Voloska 10, 04070

More information

Lecture 3: Exponential Smoothing

Lecture 3: Exponential Smoothing NATCOR: Forecasing & Predicive Analyics Lecure 3: Exponenial Smoohing John Boylan Lancaser Cenre for Forecasing Deparmen of Managemen Science Mehods and Models Forecasing Mehod A (numerical) procedure

More information

In this chapter the model of free motion under gravity is extended to objects projected at an angle. When you have completed it, you should

In this chapter the model of free motion under gravity is extended to objects projected at an angle. When you have completed it, you should Cambridge Universiy Press 978--36-60033-7 Cambridge Inernaional AS and A Level Mahemaics: Mechanics Coursebook Excerp More Informaion Chaper The moion of projeciles In his chaper he model of free moion

More information

Robust estimation based on the first- and third-moment restrictions of the power transformation model

Robust estimation based on the first- and third-moment restrictions of the power transformation model h Inernaional Congress on Modelling and Simulaion, Adelaide, Ausralia, 6 December 3 www.mssanz.org.au/modsim3 Robus esimaion based on he firs- and hird-momen resricions of he power ransformaion Nawaa,

More information

A Primal-Dual Type Algorithm with the O(1/t) Convergence Rate for Large Scale Constrained Convex Programs

A Primal-Dual Type Algorithm with the O(1/t) Convergence Rate for Large Scale Constrained Convex Programs PROC. IEEE CONFERENCE ON DECISION AND CONTROL, 06 A Primal-Dual Type Algorihm wih he O(/) Convergence Rae for Large Scale Consrained Convex Programs Hao Yu and Michael J. Neely Absrac This paper considers

More information

EKF SLAM vs. FastSLAM A Comparison

EKF SLAM vs. FastSLAM A Comparison vs. A Comparison Michael Calonder, Compuer Vision Lab Swiss Federal Insiue of Technology, Lausanne EPFL) michael.calonder@epfl.ch The wo algorihms are described wih a planar robo applicaion in mind. Generalizaion

More information

Chapter 5. Heterocedastic Models. Introduction to time series (2008) 1

Chapter 5. Heterocedastic Models. Introduction to time series (2008) 1 Chaper 5 Heerocedasic Models Inroducion o ime series (2008) 1 Chaper 5. Conens. 5.1. The ARCH model. 5.2. The GARCH model. 5.3. The exponenial GARCH model. 5.4. The CHARMA model. 5.5. Random coefficien

More information

Online Appendix to Solution Methods for Models with Rare Disasters

Online Appendix to Solution Methods for Models with Rare Disasters Online Appendix o Soluion Mehods for Models wih Rare Disasers Jesús Fernández-Villaverde and Oren Levinal In his Online Appendix, we presen he Euler condiions of he model, we develop he pricing Calvo block,

More information

Stable block Toeplitz matrix for the processing of multichannel seismic data

Stable block Toeplitz matrix for the processing of multichannel seismic data Indian Journal of Marine Sciences Vol. 33(3), Sepember 2004, pp. 215-219 Sable block Toepliz marix for he processing of mulichannel seismic daa Kiri Srivasava* & V P Dimri Naional Geophysical Research

More information

Nature Neuroscience: doi: /nn Supplementary Figure 1. Spike-count autocorrelations in time.

Nature Neuroscience: doi: /nn Supplementary Figure 1. Spike-count autocorrelations in time. Supplemenary Figure 1 Spike-coun auocorrelaions in ime. Normalized auocorrelaion marices are shown for each area in a daase. The marix shows he mean correlaion of he spike coun in each ime bin wih he spike

More information

Chapter 6. Systems of First Order Linear Differential Equations

Chapter 6. Systems of First Order Linear Differential Equations Chaper 6 Sysems of Firs Order Linear Differenial Equaions We will only discuss firs order sysems However higher order sysems may be made ino firs order sysems by a rick shown below We will have a sligh

More information

Affine term structure models

Affine term structure models Affine erm srucure models A. Inro o Gaussian affine erm srucure models B. Esimaion by minimum chi square (Hamilon and Wu) C. Esimaion by OLS (Adrian, Moench, and Crump) D. Dynamic Nelson-Siegel model (Chrisensen,

More information

Christos Papadimitriou & Luca Trevisan November 22, 2016

Christos Papadimitriou & Luca Trevisan November 22, 2016 U.C. Bereley CS170: Algorihms Handou LN-11-22 Chrisos Papadimiriou & Luca Trevisan November 22, 2016 Sreaming algorihms In his lecure and he nex one we sudy memory-efficien algorihms ha process a sream

More information

Probabilistic Robotics SLAM

Probabilistic Robotics SLAM Probabilisic Roboics SLAM The SLAM Problem SLAM is he process by which a robo builds a map of he environmen and, a he same ime, uses his map o compue is locaion Localizaion: inferring locaion given a map

More information

0.1 MAXIMUM LIKELIHOOD ESTIMATION EXPLAINED

0.1 MAXIMUM LIKELIHOOD ESTIMATION EXPLAINED 0.1 MAXIMUM LIKELIHOOD ESTIMATIO EXPLAIED Maximum likelihood esimaion is a bes-fi saisical mehod for he esimaion of he values of he parameers of a sysem, based on a se of observaions of a random variable

More information

Section 3.5 Nonhomogeneous Equations; Method of Undetermined Coefficients

Section 3.5 Nonhomogeneous Equations; Method of Undetermined Coefficients Secion 3.5 Nonhomogeneous Equaions; Mehod of Undeermined Coefficiens Key Terms/Ideas: Linear Differenial operaor Nonlinear operaor Second order homogeneous DE Second order nonhomogeneous DE Soluion o homogeneous

More information