Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation
Instructor: Moritz Hardt
Graduate Instructor: Max Simchowitz
October 15, 2018

3 Strong convexity

This lecture introduces the notion of strong convexity and combines it with smoothness to develop the concept of condition number. While smoothness gave us an upper bound on the second-order term in Taylor's approximation, strong convexity will give us a lower bound. Taken together, these two assumptions are quite powerful, as they lead to a much faster convergence rate of the form $\exp(-\Omega(t))$. In words, gradient descent on smooth and strongly convex functions decreases the error multiplicatively by some factor strictly less than 1 in each iteration. The technical part follows the corresponding chapter in Bubeck's text [Bub15].

3.1 Reminders

Recall that we had at least two definitions apiece for convexity and smoothness: a general definition for all functions and a more compact definition for twice-differentiable functions.

A function $f$ is convex if, for each input, there exists a globally valid linear lower bound on the function: $f(y) \geq f(x) + g(x)^\top (y - x)$. For differentiable functions, the role of $g$ is played by the gradient.

A function $f$ is $\beta$-smooth if, for each input, there exists a globally valid quadratic upper bound on the function, with finite quadratic parameter $\beta$:
$$f(y) \leq f(x) + g(x)^\top (y - x) + \frac{\beta}{2}\|x - y\|^2.$$
More poetically, a smooth, convex function is trapped
between a parabola and a line. Since $\beta$ is covariant with affine transformations, e.g. changes of units of measurement, we will frequently refer to a $\beta$-smooth function as simply smooth.

For twice-differentiable functions, these properties admit simple conditions in terms of the Hessian, i.e. the matrix of second partial derivatives. A twice-differentiable function $f$ is convex if $\nabla^2 f(x) \succeq 0$, and it is $\beta$-smooth if $\nabla^2 f(x) \preceq \beta I$.

We furthermore defined the notion of $L$-Lipschitzness. A function $f$ is $L$-Lipschitz if the amount that it stretches its inputs is bounded by $L$:
$$\|f(x) - f(y)\| \leq L\|x - y\|.$$
Note that for differentiable functions, $\beta$-smoothness is equivalent to $\beta$-Lipschitzness of the gradient.

3.2 Strong convexity

With these three concepts, we were able to prove two error decay rates for gradient descent and its projective, stochastic, and subgradient flavors. However, these rates were substantially slower than what is observed in certain settings in practice. Noting the asymmetry between our linear lower bound (from convexity) and our quadratic upper bound (from smoothness), we introduce a new, more restricted function class by upgrading our lower bound to second order.

Definition 3.1 (Strong convexity). A function $f \colon \Omega \to \mathbb{R}$ is $\alpha$-strongly convex if, for all $x, y \in \Omega$, the following inequality holds for some $\alpha > 0$:
$$f(y) \geq f(x) + g(x)^\top (y - x) + \frac{\alpha}{2}\|x - y\|^2$$

As with smoothness, we will often shorten "$\alpha$-strongly convex" to "strongly convex." A strongly convex, smooth function is one that can be squeezed between two parabolas. If $\beta$-smoothness is a good thing, then $\alpha$-strong convexity guarantees we don't have too much of a good thing. A twice-differentiable function is $\alpha$-strongly convex if $\nabla^2 f(x) \succeq \alpha I$. Once again, note that the parameter $\alpha$ changes under affine transformations.

Conveniently enough, for $\alpha$-strongly convex, $\beta$-smooth functions, we can define a basis-independent quantity called the condition number.

Definition 3.2 (Condition number). An $\alpha$-strongly convex, $\beta$-smooth function $f$ has condition number $\kappa = \frac{\beta}{\alpha}$.

For a positive-definite quadratic function $f$, this definition of the condition number corresponds with the perhaps more familiar definition of the condition number of the matrix defining the quadratic.
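These definitions are easy to probe numerically. The sketch below is an illustration, not part of the notes: the matrix, sample count, and tolerances are made up. It takes the quadratic $f(x) = \frac{1}{2}x^\top A x$, reads $\alpha$ and $\beta$ off the extreme eigenvalues of the Hessian $A$, checks the parabola/line sandwich at random point pairs, and confirms that $\kappa = \beta/\alpha$ agrees with the matrix condition number of $A$, as remarked after Definition 3.2.

```python
import numpy as np

# Hypothetical positive-definite quadratic f(x) = 0.5 * x^T A x.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

# The Hessian of f is A everywhere, so alpha and beta are its extreme
# eigenvalues, and kappa = beta/alpha equals the (2-norm) condition
# number of the matrix A.
eigs = np.linalg.eigvalsh(A)          # sorted ascending
alpha, beta = eigs[0], eigs[-1]
kappa = beta / alpha
assert np.isclose(kappa, np.linalg.cond(A))

# Check the sandwich at random point pairs:
#   f(y) >= f(x) + grad(x)^T (y-x) + (alpha/2)||y-x||^2   (strong convexity)
#   f(y) <= f(x) + grad(x)^T (y-x) + (beta/2) ||y-x||^2   (smoothness)
rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.standard_normal(2), rng.standard_normal(2)
    lin = f(x) + grad(x) @ (y - x)
    sq = np.dot(y - x, y - x)
    assert lin + 0.5 * alpha * sq <= f(y) + 1e-9
    assert f(y) <= lin + 0.5 * beta * sq + 1e-9
print("alpha, beta, kappa:", alpha, beta, kappa)
```

For a quadratic, both inequalities are tight in the eigendirections of $A$, which is why $\alpha$ and $\beta$ cannot be improved.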
A look back and ahead. The following table summarizes the results from the previous lecture and the results to be obtained in this lecture. In both, the value $\epsilon$ is the difference between $f$ at some value $\bar{x}$ computed from the outputs of gradient descent and $f$ calculated at an optimizer $x^*$.

              Convex                           Strongly convex
  Lipschitz   $\epsilon \leq O(1/\sqrt{t})$    $\epsilon \leq O(1/t)$
  Smooth      $\epsilon \leq O(1/t)$           $\epsilon \leq e^{-\Omega(t)}$

Table 1: Bounds on error $\epsilon$ as a function of the number of steps $t$ taken, for gradient descent applied to various classes of functions.

Since a rate that is exponential in terms of the magnitude of the error is linear in terms of the bit precision, this rate of convergence is termed linear. We now move to prove these rates.

3.3 Convergence rate for strongly convex functions

For no good reason, we begin with a convergence bound for strongly convex Lipschitz functions, in which we obtain an $O(1/t)$ rate of convergence.

Theorem 3.3. Assume $f \colon \Omega \to \mathbb{R}$ is $\alpha$-strongly convex and $L$-Lipschitz. Let $x^*$ be an optimizer of $f$, and let $x_s$ be the updated point at step $s$ using projected gradient descent with the adaptive step size $\eta_s = \frac{2}{\alpha(s+1)}$. Then, letting the max number of iterations be $t$,
$$f\left(\sum_{s=1}^{t} \frac{2s}{t(t+1)}\, x_s\right) - f(x^*) \leq \frac{2L^2}{\alpha(t+1)}$$

The theorem implies that the convergence rate of projected gradient descent for $\alpha$-strongly convex, $L$-Lipschitz functions is similar to that of $\beta$-smooth functions, with a bound on error $\epsilon \leq O(1/t)$.

In order to prove Theorem 3.3, we need the following proposition.

Proposition 3.4 (Jensen's inequality). Assume $f \colon \Omega \to \mathbb{R}$ is a convex function, and let $x_1, x_2, \ldots, x_n$ and $\sum_{i=1}^n \gamma_i x_i / \sum_{i=1}^n \gamma_i$ lie in $\Omega$, with weights $\gamma_i > 0$. Then
$$f\left(\frac{\sum_{i=1}^n \gamma_i x_i}{\sum_{i=1}^n \gamma_i}\right) \leq \frac{\sum_{i=1}^n \gamma_i f(x_i)}{\sum_{i=1}^n \gamma_i}$$

For a graphical proof, follow this link.

Proof of Theorem 3.3. Recall the two-step update rule of projected gradient descent:
$$y_{s+1} = x_s - \eta_s \nabla f(x_s)$$
$$x_{s+1} = \Pi_\Omega(y_{s+1})$$
First, the proof explores an upper bound on the difference between the function values $f(x_s)$ and $f(x^*)$:
$$
\begin{aligned}
f(x_s) - f(x^*)
&\leq \nabla f(x_s)^\top (x_s - x^*) - \frac{\alpha}{2}\|x_s - x^*\|^2 \\
&= \frac{1}{\eta_s}(x_s - y_{s+1})^\top (x_s - x^*) - \frac{\alpha}{2}\|x_s - x^*\|^2 && \text{by update rule} \\
&= \frac{1}{2\eta_s}\left(\|x_s - x^*\|^2 + \|x_s - y_{s+1}\|^2 - \|y_{s+1} - x^*\|^2\right) - \frac{\alpha}{2}\|x_s - x^*\|^2 && \text{by "fundamental theorem of optimization"} \\
&= \frac{1}{2\eta_s}\left(\|x_s - x^*\|^2 - \|y_{s+1} - x^*\|^2\right) + \frac{\eta_s}{2}\|\nabla f(x_s)\|^2 - \frac{\alpha}{2}\|x_s - x^*\|^2 && \text{by update rule} \\
&\leq \frac{1}{2\eta_s}\left(\|x_s - x^*\|^2 - \|x_{s+1} - x^*\|^2\right) + \frac{\eta_s}{2}\|\nabla f(x_s)\|^2 - \frac{\alpha}{2}\|x_s - x^*\|^2 && \text{by the projection lemma} \\
&\leq \left(\frac{1}{2\eta_s} - \frac{\alpha}{2}\right)\|x_s - x^*\|^2 - \frac{1}{2\eta_s}\|x_{s+1} - x^*\|^2 + \frac{\eta_s L^2}{2} && \text{by Lipschitzness}
\end{aligned}
$$

Multiplying both sides by $s$ and substituting the step size $\eta_s = \frac{2}{\alpha(s+1)}$ (using $\frac{s}{s+1} \leq 1$ for the Lipschitz term), we get
$$s\left(f(x_s) - f(x^*)\right) \leq \frac{L^2}{\alpha} + \frac{\alpha}{4}\left(s(s-1)\|x_s - x^*\|^2 - s(s+1)\|x_{s+1} - x^*\|^2\right)$$

Finally, we can find the upper bound on the function value in Theorem 3.3, obtained using $t$ steps of projected gradient descent:
$$
\begin{aligned}
f\left(\sum_{s=1}^{t} \frac{2s}{t(t+1)}\, x_s\right)
&\leq \sum_{s=1}^{t} \frac{2s}{t(t+1)}\, f(x_s) && \text{by Proposition 3.4} \\
&\leq f(x^*) + \frac{2}{t(t+1)}\left(\frac{tL^2}{\alpha} - \frac{\alpha}{4}\, t(t+1)\|x_{t+1} - x^*\|^2\right) && \text{by telescoping sum} \\
&\leq f(x^*) + \frac{2L^2}{\alpha(t+1)}
\end{aligned}
$$

This shows that solving an optimization problem with a strongly convex, Lipschitz objective function with projected gradient descent has a convergence rate of the order $\frac{1}{t+1}$, which is faster than the $O(1/\sqrt{t})$ rate obtained with Lipschitzness alone.
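To see Theorem 3.3 in action, here is a small numerical sketch (not from the notes; the problem instance is made up for illustration): projected gradient descent with the adaptive step size $\eta_s = 2/(\alpha(s+1))$ on a strongly convex quadratic over the unit ball, evaluating $f$ at the weighted average with weights $2s/(t(t+1))$ and comparing against the bound $2L^2/(\alpha(t+1))$. Since the unconstrained minimizer lies outside $\Omega$, the constrained optimum sits on the boundary, and $f(x^*)$ is estimated by a fine grid over the unit circle.

```python
import numpy as np

# Illustrative instance: f(x) = 0.5 (x-c)^T A (x-c) over the unit ball.
# alpha = lambda_min(A); on the ball, ||grad f(x)|| <= ||A|| (1 + ||c||) = L.
A = np.diag([2.0, 5.0])
c = np.array([2.0, 1.0])
f = lambda x: 0.5 * (x - c) @ A @ (x - c)
grad = lambda x: A @ (x - c)
alpha = np.linalg.eigvalsh(A)[0]
L = np.linalg.norm(A, 2) * (1 + np.linalg.norm(c))

def proj(y):  # Euclidean projection onto the unit ball Omega
    n = np.linalg.norm(y)
    return y / n if n > 1 else y

t = 5000
x = np.zeros(2)
avg = np.zeros(2)
for s in range(1, t + 1):
    eta = 2.0 / (alpha * (s + 1))        # adaptive step size from Theorem 3.3
    x = proj(x - eta * grad(x))
    avg += 2.0 * s / (t * (t + 1)) * x   # weights 2s / (t(t+1))

# c lies outside Omega, so x* is on the boundary; estimate f(x*) on a grid.
theta = np.linspace(0, 2 * np.pi, 200001)
d = np.stack([np.cos(theta), np.sin(theta)], axis=1) - c
f_star = 0.5 * np.einsum('ij,jk,ik->i', d, A, d).min()

gap = f(avg) - f_star
bound = 2 * L**2 / (alpha * (t + 1))
assert gap <= bound
print(f"suboptimality {gap:.2e} <= theoretical bound {bound:.2e}")
```

The observed gap is typically far below the bound; the theorem's $L$ is a worst-case Lipschitz constant over all of $\Omega$, while the iterates quickly concentrate near $x^*$.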
3.4 Convergence rate for smooth and strongly convex functions

Theorem 3.5. Assume $f \colon \mathbb{R}^n \to \mathbb{R}$ is $\alpha$-strongly convex and $\beta$-smooth. Let $x^*$ be an optimizer of $f$, and let $x_t$ be the updated point at step $t$ using gradient descent with a constant step size $\frac{1}{\beta}$, i.e. using the update rule $x_{t+1} = x_t - \frac{1}{\beta}\nabla f(x_t)$. Then
$$\|x_{t+1} - x^*\|^2 \leq \exp\left(-t\frac{\alpha}{\beta}\right)\|x_1 - x^*\|^2$$

In order to prove Theorem 3.5, we require the use of the following lemma.

Lemma 3.6. Assume $f$ as in Theorem 3.5. Then for all $x, y \in \mathbb{R}^n$ and an update of the form $x^+ = x - \frac{1}{\beta}\nabla f(x)$,
$$f(x^+) - f(y) \leq \nabla f(x)^\top (x - y) - \frac{1}{2\beta}\|\nabla f(x)\|^2 - \frac{\alpha}{2}\|x - y\|^2$$

Proof of Lemma 3.6.
$$
\begin{aligned}
f(x^+) - f(y) &= f(x^+) - f(x) + f(x) - f(y) \\
&\leq \nabla f(x)^\top (x^+ - x) + \frac{\beta}{2}\|x^+ - x\|^2 && \text{smoothness} \\
&\quad + \nabla f(x)^\top (x - y) - \frac{\alpha}{2}\|x - y\|^2 && \text{strong convexity} \\
&= \nabla f(x)^\top (x^+ - y) + \frac{1}{2\beta}\|\nabla f(x)\|^2 - \frac{\alpha}{2}\|x - y\|^2 && \text{definition of } x^+ \\
&= \nabla f(x)^\top (x - y) - \frac{1}{2\beta}\|\nabla f(x)\|^2 - \frac{\alpha}{2}\|x - y\|^2 && \text{definition of } x^+
\end{aligned}
$$

Now with Lemma 3.6 we are able to prove Theorem 3.5.

Proof of Theorem 3.5.
$$
\begin{aligned}
\|x_{t+1} - x^*\|^2 &= \left\|x_t - \frac{1}{\beta}\nabla f(x_t) - x^*\right\|^2 \\
&= \|x_t - x^*\|^2 - \frac{2}{\beta}\nabla f(x_t)^\top (x_t - x^*) + \frac{1}{\beta^2}\|\nabla f(x_t)\|^2 \\
&\leq \left(1 - \frac{\alpha}{\beta}\right)\|x_t - x^*\|^2 && \text{Lemma 3.6 with } y = x^*,\ x = x_t \\
&\leq \left(1 - \frac{\alpha}{\beta}\right)^t \|x_1 - x^*\|^2 \\
&\leq \exp\left(-t\frac{\alpha}{\beta}\right)\|x_1 - x^*\|^2
\end{aligned}
$$
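The geometric contraction in the proof of Theorem 3.5 can be observed directly. The sketch below (a made-up quadratic instance, not part of the notes) runs gradient descent with constant step size $1/\beta$ and checks both the per-step contraction factor $(1 - \alpha/\beta)$ in squared distance and the overall $\exp(-t\alpha/\beta)$ bound.

```python
import numpy as np

# Made-up strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x,
# so grad f(x) = A x - b and the optimizer is x* = A^{-1} b.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b
x_star = np.linalg.solve(A, b)

eigs = np.linalg.eigvalsh(A)          # alpha, beta from Hessian eigenvalues
alpha, beta = eigs[0], eigs[-1]

x = np.array([10.0, -7.0])
d0 = np.dot(x - x_star, x - x_star)   # initial squared distance
T = 50
for _ in range(T):
    x_next = x - grad(x) / beta       # constant step size 1/beta
    # per-step contraction established in the proof of Theorem 3.5
    assert (np.dot(x_next - x_star, x_next - x_star)
            <= (1 - alpha / beta) * np.dot(x - x_star, x - x_star) + 1e-12)
    x = x_next

# overall bound: ||x_{T+1} - x*||^2 <= exp(-T alpha/beta) ||x_1 - x*||^2
assert np.dot(x - x_star, x - x_star) <= np.exp(-T * alpha / beta) * d0
print("final squared distance:", np.dot(x - x_star, x - x_star))
```

For a quadratic the contraction is in fact $(1 - \alpha/\beta)^2$ per step in squared distance, so the iterates comfortably beat the stated bound.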
We can also prove the same result for the constrained case using projected gradient descent.

Theorem 3.7. Assume $f \colon \Omega \to \mathbb{R}$ is $\alpha$-strongly convex and $\beta$-smooth. Let $x^*$ be an optimizer of $f$, and let $x_t$ be the updated point at step $t$ using projected gradient descent with a constant step size $\frac{1}{\beta}$, i.e. using the update rule $x_{t+1} = \Pi_\Omega\left(x_t - \frac{1}{\beta}\nabla f(x_t)\right)$, where $\Pi_\Omega$ is the projection operator. Then
$$\|x_{t+1} - x^*\|^2 \leq \exp\left(-t\frac{\alpha}{\beta}\right)\|x_1 - x^*\|^2$$

As in Theorem 3.5, we will require the use of the following lemma in order to prove Theorem 3.7.

Lemma 3.8. Assume $f$ as in Theorem 3.5. For all $x, y \in \Omega$, define $x^+ \in \Omega$ as $x^+ = \Pi_\Omega\left(x - \frac{1}{\beta}\nabla f(x)\right)$ and the function $g_\Omega \colon \Omega \to \mathbb{R}^n$ as $g_\Omega(x) = \beta(x - x^+)$. Then
$$f(x^+) - f(y) \leq g_\Omega(x)^\top (x - y) - \frac{1}{2\beta}\|g_\Omega(x)\|^2 - \frac{\alpha}{2}\|x - y\|^2$$

Proof of Lemma 3.8. The following is given by the projection lemma, for all $x$, $x^+$, $y$ defined as in Theorem 3.7:
$$\nabla f(x)^\top (x^+ - y) \leq g_\Omega(x)^\top (x^+ - y)$$
Therefore, following the form of the proof of Lemma 3.6,
$$
\begin{aligned}
f(x^+) - f(y) &= f(x^+) - f(x) + f(x) - f(y) \\
&\leq \nabla f(x)^\top (x^+ - y) + \frac{1}{2\beta}\|g_\Omega(x)\|^2 - \frac{\alpha}{2}\|x - y\|^2 \\
&\leq g_\Omega(x)^\top (x^+ - y) + \frac{1}{2\beta}\|g_\Omega(x)\|^2 - \frac{\alpha}{2}\|x - y\|^2 \\
&= g_\Omega(x)^\top (x - y) - \frac{1}{2\beta}\|g_\Omega(x)\|^2 - \frac{\alpha}{2}\|x - y\|^2
\end{aligned}
$$

The proof of Theorem 3.7 is exactly as in Theorem 3.5 after substituting the appropriate projected gradient descent update in place of the standard gradient descent update, with Lemma 3.8 used in place of Lemma 3.6.

References

[Bub15] Sébastien Bubeck. Convex optimization: Algorithms and complexity. Foundations and Trends in Machine Learning, 8(3-4):231–357, 2015.
.615 Final Spring 7 Overview The purpose of he final exam is o calculae he MHD β limi in a high-bea oroidal okamak agains he dangerous n = 1 exernal ballooning-kink mode. Effecively, his corresponds o
More information556: MATHEMATICAL STATISTICS I
556: MATHEMATICAL STATISTICS I INEQUALITIES 5.1 Concenraion and Tail Probabiliy Inequaliies Lemma (CHEBYCHEV S LEMMA) c > 0, If X is a random variable, hen for non-negaive funcion h, and P X [h(x) c] E
More informationQUANTITATIVE DECAY FOR NONLINEAR WAVE EQUATIONS
QUANTITATIVE DECAY FOR NONLINEAR WAVE EQUATIONS SPUR FINAL PAPER, SUMMER 08 CALVIN HSU MENTOR: RUOXUAN YANG PROJECT SUGGESTED BY: ANDREW LAWRIE Augus, 08 Absrac. In his paper, we discuss he decay rae for
More informationLongest Common Prefixes
Longes Common Prefixes The sandard ordering for srings is he lexicographical order. I is induced by an order over he alphabe. We will use he same symbols (,
More informationChapter 8 The Complete Response of RL and RC Circuits
Chaper 8 The Complee Response of RL and RC Circuis Seoul Naional Universiy Deparmen of Elecrical and Compuer Engineering Wha is Firs Order Circuis? Circuis ha conain only one inducor or only one capacior
More informationChristos Papadimitriou & Luca Trevisan November 22, 2016
U.C. Bereley CS170: Algorihms Handou LN-11-22 Chrisos Papadimiriou & Luca Trevisan November 22, 2016 Sreaming algorihms In his lecure and he nex one we sudy memory-efficien algorihms ha process a sream
More informationLaplace transfom: t-translation rule , Haynes Miller and Jeremy Orloff
Laplace ransfom: -ranslaion rule 8.03, Haynes Miller and Jeremy Orloff Inroducory example Consider he sysem ẋ + 3x = f(, where f is he inpu and x he response. We know is uni impulse response is 0 for
More informationOn the Convergence Time of Dual Subgradient Methods for Strongly Convex Programs
IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 63(4), PP. 05, APRIL, 08 On he Convergence Time of Dual Subgradien Mehods for Srongly Convex Programs Hao Yu and Michael J. Neely Universiy of Souhern California
More information(1) (2) Differentiation of (1) and then substitution of (3) leads to. Therefore, we will simply consider the second-order linear system given by (4)
Phase Plane Analysis of Linear Sysems Adaped from Applied Nonlinear Conrol by Sloine and Li The general form of a linear second-order sysem is a c b d From and b bc d a Differeniaion of and hen subsiuion
More informationHW6: MRI Imaging Pulse Sequences (7 Problems for 100 pts)
HW6: MRI Imaging Pulse Sequences (7 Problems for 100 ps) GOAL The overall goal of HW6 is o beer undersand pulse sequences for MRI image reconsrucion. OBJECTIVES 1) Design a spin echo pulse sequence o image
More informationOptimal rate of convergence of an ODE associated to the Fast Gradient Descent schemes for b>0
Opimal rae of convergence of an ODE associaed o he Fas Gradien Descen schemes for b>0 Jf Aujol, Ch Dossal To cie his version: Jf Aujol, Ch Dossal. Opimal rae of convergence of an ODE associaed o he Fas
More information