ELE 538B: Large-Scale Optimization for Data Science. Quasi-Newton methods. Yuxin Chen Princeton University, Spring 2018
Recap: Newton's method

x^{t+1} = x^t − (∇²f(x^t))^{-1} ∇f(x^t)

- quadratic local convergence: attains ε accuracy within O(log log(1/ε)) iterations; e.g., the example in R² (with backtracking parameters α = 0.1, β = 0.7) converges in only 5 steps
- but it requires storing and inverting the Hessian ∇²f(x) ∈ R^{n×n}: a single iteration may last forever; prohibitive storage requirement

[figure: f(x^(k)) − f^opt vs. iteration count k]
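For concreteness, the Newton update above can be sketched in a few lines of Python (an illustration added here, not part of the original slides; the name `newton_step` is made up):

```python
import numpy as np

def newton_step(x, grad, hess):
    """One Newton step: x+ = x - (Hessian)^{-1} * gradient.
    Solving the linear system costs O(n^3) in general."""
    return x - np.linalg.solve(hess(x), grad(x))

# On a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# a single Newton step lands exactly on the minimizer A^{-1} b:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b
hess = lambda x: A
x1 = newton_step(np.zeros(2), grad, hess)
print(np.allclose(A @ x1, b))  # True
```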
Quasi-Newton methods

key idea: approximate the Hessian matrix using only gradient information

x^{t+1} = x^t − η_t H_t ∇f(x^t),   where H_t is a surrogate of (∇²f(x^t))^{-1}

challenges: how to find a good approximation H_t ≻ 0 of (∇²f(x^t))^{-1}
- using only gradient information
- using limited memory
- achieving super-linear convergence
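The two familiar extremes of this template can be sketched numerically (a toy illustration with made-up names: H_t = I recovers gradient descent, while H_t = (∇²f)^{-1} recovers Newton's method):

```python
import numpy as np

def qn_step(x, g, H, eta=1.0):
    """Generic quasi-Newton step x+ = x - eta * H @ g (eta fixed for illustration)."""
    return x - eta * H @ g

A = np.diag([1.0, 10.0])           # f(x) = 0.5 x^T A x, so grad f(x) = A x
x = np.array([1.0, 1.0])
gd = qn_step(x, A @ x, np.eye(2))          # H_t = I: plain gradient descent
nt = qn_step(x, A @ x, np.linalg.inv(A))   # H_t = (Hessian)^{-1}: Newton's method
print(gd)  # [ 0. -9.]
print(nt)  # [0. 0.]
```

A quasi-Newton method sits in between: it builds H_t from gradients alone, aiming for Newton-like steps at gradient-like cost.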
Criterion for choosing H_t

Consider the following approximate quadratic model of f(·):

f̂(x) := f(x^{t+1}) + ⟨∇f(x^{t+1}), x − x^{t+1}⟩ + (1/2) (x − x^{t+1})^T H_{t+1}^{-1} (x − x^{t+1})

which satisfies

∇f̂(x) = ∇f(x^{t+1}) + H_{t+1}^{-1} (x − x^{t+1})

One reasonable criterion: gradient matching for the latest two iterates:

∇f̂(x^t) = ∇f(x^t)            (1.a)
∇f̂(x^{t+1}) = ∇f(x^{t+1})    (1.b)
(1.b) holds trivially. To satisfy (1.a), one requires

∇f(x^{t+1}) + H_{t+1}^{-1} (x^t − x^{t+1}) = ∇f(x^t)

⟺  H_{t+1}^{-1} (x^{t+1} − x^t) = ∇f(x^{t+1}) − ∇f(x^t)   (secant equation)

i.e., the secant equation requires that H_{t+1}^{-1} map the displacement x^{t+1} − x^t into the change of gradients ∇f(x^{t+1}) − ∇f(x^t)
Secant equation

H_{t+1} ( ∇f(x^{t+1}) − ∇f(x^t) ) = x^{t+1} − x^t,   i.e.   H_{t+1} y_t = s_t    (1.2)

with y_t := ∇f(x^{t+1}) − ∇f(x^t) and s_t := x^{t+1} − x^t

- only possible when s_t^T y_t > 0, since s_t^T y_t = y_t^T H_{t+1} y_t > 0
- admits infinitely many solutions, since the O(n²) degrees of freedom in choosing H_{t+1} far exceed the n constraints in (1.2)
- which H_{t+1} shall we choose?
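The curvature condition s_t^T y_t > 0 is easy to check numerically; for a strongly convex quadratic it always holds, since there y_t = A s_t and hence s_t^T y_t = s_t^T A s_t > 0. A minimal sketch (the setup and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)           # positive definite Hessian
grad = lambda x: A @ x            # gradient of f(x) = 0.5 x^T A x

x0 = rng.standard_normal(5)
x1 = x0 - 0.1 * grad(x0)          # one gradient step
s, y = x1 - x0, grad(x1) - grad(x0)
# here y = A s exactly, so s^T y = s^T A s > 0 whenever s != 0:
print(s @ y > 0)  # True
```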
Broyden-Fletcher-Goldfarb-Shanno (BFGS) method
Closeness to H_t

In addition to the secant equation, choose H_{t+1} sufficiently close to H_t:

minimize_H   ‖H − H_t‖
subject to   H = H^T,  H y_t = s_t

for some norm ‖·‖

- exploits past information regarding H_t
- choosing different norms results in different quasi-Newton methods
Choice of norm in BFGS

Choosing ‖M‖ := ‖W^{1/2} M W^{1/2}‖_F for any weight matrix W obeying W s_t = y_t, we get

minimize_H   ‖W^{1/2} (H − H_t) W^{1/2}‖_F
subject to   H = H^T,  H y_t = s_t

This admits the closed-form expression

H_{t+1} = (I − ρ_t s_t y_t^T) H_t (I − ρ_t y_t s_t^T) + ρ_t s_t s_t^T    (1.3)

with ρ_t = 1 / (y_t^T s_t)   (BFGS update rule; H_{t+1} ≻ 0 if H_t ≻ 0)
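The update rule can be implemented and checked directly: since (I − ρ_t y_t s_t^T) y_t = 0, it follows that H_{t+1} y_t = ρ_t s_t (s_t^T y_t) = s_t, i.e. the secant equation holds by construction, and positive definiteness is preserved. A sketch (the helper name `bfgs_update` is mine):

```python
import numpy as np

def bfgs_update(H, s, y):
    """BFGS update: H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,
    with rho = 1 / (y^T s)."""
    rho = 1.0 / (y @ s)
    V = np.eye(len(s)) - rho * np.outer(y, s)
    return V.T @ H @ V + rho * np.outer(s, s)

rng = np.random.default_rng(1)
s = rng.standard_normal(4)
y = s + 0.1 * rng.standard_normal(4)   # keeps y^T s > 0 in this example
H_new = bfgs_update(np.eye(4), s, y)

print(np.allclose(H_new @ y, s))              # True: secant equation
print(np.all(np.linalg.eigvalsh(H_new) > 0))  # True: H+ stays positive definite
```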
An alternative interpretation

H_{t+1} is also the solution to

minimize_H   ⟨H_t^{-1}, H⟩ − log det(H_t^{-1} H) − n
subject to   H y_t = s_t

where the objective is the KL divergence between N(0, H) and N(0, H_t); that is, BFGS minimizes a KL divergence subject to the secant equation constraint
BFGS method

Algorithm 1.1 BFGS
1: for t = 0, 1, ... do
2:   x^{t+1} = x^t − η_t H_t ∇f(x^t)   (line search to determine η_t)
3:   H_{t+1} = (I − ρ_t s_t y_t^T) H_t (I − ρ_t y_t s_t^T) + ρ_t s_t s_t^T,
     where s_t = x^{t+1} − x^t, y_t = ∇f(x^{t+1}) − ∇f(x^t), and ρ_t = 1/(y_t^T s_t)

- each iteration costs O(n²) (in addition to computing gradients)
- no need to solve linear systems or invert matrices
- no magic formula for initialization; possible choices: approximate inverse Hessian at x^0, or identity matrix
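Putting the pieces together, here is a minimal BFGS loop, a sketch under simple assumptions (identity initialization, Armijo backtracking with illustrative parameters, and skipping the update when the curvature condition fails), not the course's reference implementation:

```python
import numpy as np

def bfgs(f, grad, x0, iters=100):
    """Minimal BFGS: identity init, Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)
    for _ in range(iters):
        g = grad(x)
        d = -H @ g                                   # quasi-Newton direction
        eta = 1.0
        while f(x + eta * d) > f(x) + 1e-4 * eta * (g @ d):
            eta *= 0.5                               # backtracking
        x_new = x + eta * d
        s, y = x_new - x, grad(x_new) - g
        if s @ y > 1e-12:                            # curvature condition
            rho = 1.0 / (y @ s)
            V = np.eye(n) - rho * np.outer(y, s)
            H = V.T @ H @ V + rho * np.outer(s, s)
        x = x_new
    return x

# sanity check on a strongly convex quadratic: minimizer is A^{-1} b
A = np.diag([1.0, 5.0, 25.0])
b = np.ones(3)
x_hat = bfgs(lambda x: 0.5 * x @ A @ x - b @ x, lambda x: A @ x - b, np.zeros(3))
print(np.linalg.norm(A @ x_hat - b))   # residual is tiny
```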
Rank-2 update on H_t^{-1}

From the Sherman-Morrison-Woodbury formula

(A + U V^T)^{-1} = A^{-1} − A^{-1} U (I + V^T A^{-1} U)^{-1} V^T A^{-1},

the BFGS rule is equivalent to

H_{t+1}^{-1} = H_t^{-1} − (H_t^{-1} s_t s_t^T H_t^{-1}) / (s_t^T H_t^{-1} s_t) + ρ_t y_t y_t^T

i.e., a rank-2 update on H_t^{-1}
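The equivalence between the update on H_t and the rank-2 update on H_t^{-1} can be checked numerically (a quick sketch; B denotes the Hessian approximation H^{-1}):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
H = np.eye(n)                              # current inverse-Hessian approximation
s = rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)       # keeps y^T s > 0 in this example
rho = 1.0 / (y @ s)

# BFGS update on H ...
V = np.eye(n) - rho * np.outer(y, s)
H_new = V.T @ H @ V + rho * np.outer(s, s)

# ... matches the rank-2 update on B = H^{-1}
B = np.linalg.inv(H)
Bs = B @ s
B_new = B - np.outer(Bs, Bs) / (s @ Bs) + rho * np.outer(y, y)

print(np.allclose(np.linalg.inv(H_new), B_new))  # True
```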
Local superlinear convergence

Theorem 1.1 (informal)  Suppose f is strongly convex and has Lipschitz-continuous Hessian. Under mild conditions, BFGS achieves

lim_{t→∞} ‖x^{t+1} − x*‖₂ / ‖x^t − x*‖₂ = 0

- iteration complexity: larger than Newton's method but smaller than gradient methods
- asymptotic result: holds only as t → ∞
Key observation

The BFGS update rule achieves

lim_{t→∞} ‖(H_t^{-1} − ∇²f(x*)) (x^{t+1} − x^t)‖₂ / ‖x^{t+1} − x^t‖₂ = 0

Implications:
- even though H_t^{-1} may not converge to ∇²f(x*), it becomes an increasingly accurate approximation of ∇²f(x*) along the search direction x^{t+1} − x^t
- asymptotically, x^{t+1} − x^t ≈ −(∇²f(x^t))^{-1} ∇f(x^t), the Newton search direction
Numerical example (from the EE236C lecture notes)

minimize_{x∈R^n}   c^T x − Σ_{i=1}^m log(b_i − a_i^T x),   with n = 100, m = 500

[figure: f(x^k) − f^opt vs. iteration count k, comparing Newton's method and BFGS]

- cost per Newton iteration: O(n³), plus computing ∇²f(x)
Limited-memory quasi-Newton methods

Hessian matrices are usually dense. For large-scale problems, even storing (inverse) Hessian matrices is prohibitive.

Instead of storing full Hessian approximations, one may want to maintain a more parsimonious approximation of the Hessians, using only a few vectors.
Limited-memory BFGS (L-BFGS)

Recall the BFGS update rule

H_{t+1} = V_t^T H_t V_t + ρ_t s_t s_t^T,   with V_t = I − ρ_t y_t s_t^T

key idea: maintain a modified version of H_t implicitly, by storing the m (e.g., m = 20) most recent vector pairs (s_i, y_i)
Limited-memory BFGS (L-BFGS)

L-BFGS maintains

H_t^L = (V_{t−1}^T ⋯ V_{t−m}^T) H_{t,0}^L (V_{t−m} ⋯ V_{t−1})
      + ρ_{t−m} (V_{t−1}^T ⋯ V_{t−m+1}^T) s_{t−m} s_{t−m}^T (V_{t−m+1} ⋯ V_{t−1})
      + ρ_{t−m+1} (V_{t−1}^T ⋯ V_{t−m+2}^T) s_{t−m+1} s_{t−m+1}^T (V_{t−m+2} ⋯ V_{t−1})
      + ⋯ + ρ_{t−1} s_{t−1} s_{t−1}^T

- can be computed recursively
- the initialization H_{t,0}^L may vary from iteration to iteration
- only needs to store {(s_i, y_i)}_{t−m ≤ i < t}
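In implementations, the product H_t^L ∇f(x^t) is usually formed by the standard two-loop recursion, which touches only the stored pairs. A sketch (the scaling H_{t,0}^L = γI with γ = s^T y / y^T y is one common choice, not the only one):

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Compute H_t^L @ g implicitly from the m stored (s_i, y_i) pairs,
    via the two-loop recursion, with H_{t,0}^L = gamma * I."""
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest to oldest
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q = q - a * y
    gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
    r = gamma * q                                          # apply H_{t,0}^L
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest to newest
        b = (y @ r) / (y @ s)
        r = r + (a - b) * s
    return r

# agrees with applying the m BFGS updates to H_{t,0}^L explicitly:
rng = np.random.default_rng(3)
s_list = [rng.standard_normal(5) for _ in range(3)]
y_list = [s + 0.2 * rng.standard_normal(5) for s in s_list]
gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
H = gamma * np.eye(5)
for s, y in zip(s_list, y_list):
    rho = 1.0 / (y @ s)
    V = np.eye(5) - rho * np.outer(y, s)
    H = V.T @ H @ V + rho * np.outer(s, s)
g = rng.standard_normal(5)
print(np.allclose(lbfgs_direction(g, s_list, y_list), H @ g))  # True
```

Each call costs O(mn) flops and O(mn) storage, versus O(n²) for full BFGS.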
Reference

[1] Numerical optimization, J. Nocedal and S. Wright.
[2] Optimization methods for large-scale systems, EE236C lecture notes, L. Vandenberghe, UCLA.
[3] Optimization methods for large-scale machine learning, L. Bottou et al., arXiv, 2016.
[4] Convex optimization, EE364B lecture notes, S. Boyd, Stanford.
More informationOutline. lse-logo. Outline. Outline. 1 Wald Test. 2 The Likelihood Ratio Test. 3 Lagrange Multiplier Tests
Ouline Ouline Hypohesis Tes wihin he Maximum Likelihood Framework There are hree main frequenis approaches o inference wihin he Maximum Likelihood framework: he Wald es, he Likelihood Raio es and he Lagrange
More informationBackward stochastic dynamics on a filtered probability space
Backward sochasic dynamics on a filered probabiliy space Gechun Liang Oxford-Man Insiue, Universiy of Oxford based on join work wih Terry Lyons and Zhongmin Qian Page 1 of 15 gliang@oxford-man.ox.ac.uk
More informationMath 334 Fall 2011 Homework 11 Solutions
Dec. 2, 2 Mah 334 Fall 2 Homework Soluions Basic Problem. Transform he following iniial value problem ino an iniial value problem for a sysem: u + p()u + q() u g(), u() u, u () v. () Soluion. Le v u. Then
More informationIMPLICIT AND INVERSE FUNCTION THEOREMS PAUL SCHRIMPF 1 OCTOBER 25, 2013
IMPLICI AND INVERSE FUNCION HEOREMS PAUL SCHRIMPF 1 OCOBER 25, 213 UNIVERSIY OF BRIISH COLUMBIA ECONOMICS 526 We have exensively sudied how o solve sysems of linear equaions. We know how o check wheher
More informationClass Meeting # 10: Introduction to the Wave Equation
MATH 8.5 COURSE NOTES - CLASS MEETING # 0 8.5 Inroducion o PDEs, Fall 0 Professor: Jared Speck Class Meeing # 0: Inroducion o he Wave Equaion. Wha is he wave equaion? The sandard wave equaion for a funcion
More informationVariational Iteration Method for Solving System of Fractional Order Ordinary Differential Equations
IOSR Journal of Mahemaics (IOSR-JM) e-issn: 2278-5728, p-issn: 2319-765X. Volume 1, Issue 6 Ver. II (Nov - Dec. 214), PP 48-54 Variaional Ieraion Mehod for Solving Sysem of Fracional Order Ordinary Differenial
More informationStationary Distribution. Design and Analysis of Algorithms Andrei Bulatov
Saionary Disribuion Design and Analysis of Algorihms Andrei Bulaov Algorihms Markov Chains 34-2 Classificaion of Saes k By P we denoe he (i,j)-enry of i, j Sae is accessible from sae if 0 for some k 0
More informationSystem of Linear Differential Equations
Sysem of Linear Differenial Equaions In "Ordinary Differenial Equaions" we've learned how o solve a differenial equaion for a variable, such as: y'k5$e K2$x =0 solve DE yx = K 5 2 ek2 x C_C1 2$y''C7$y
More informationLet us start with a two dimensional case. We consider a vector ( x,
Roaion marices We consider now roaion marices in wo and hree dimensions. We sar wih wo dimensions since wo dimensions are easier han hree o undersand, and one dimension is a lile oo simple. However, our
More informationSome Basic Information about M-S-D Systems
Some Basic Informaion abou M-S-D Sysems 1 Inroducion We wan o give some summary of he facs concerning unforced (homogeneous) and forced (non-homogeneous) models for linear oscillaors governed by second-order,
More information