Quasi-Newton updates with weighted secant equations

by S. Gratton, V. Malmedy and Ph. L. Toint

Report NAXYS, October 2013

ENSEEIHT-IRIT, 2, rue Camichel, 31000 Toulouse (France)
LMS SAMTECH, A Siemens Business, 15-16, Lower Park Row, BS1 5BN Bristol (UK)
University of Namur, 61, rue de Bruxelles, B-5000 Namur (Belgium)

Quasi-Newton updates with weighted secant equations

S. Gratton, V. Malmedy and Ph. L. Toint

16 October 2013

Abstract
We provide a formula for variational quasi-Newton updates with multiple weighted secant equations. The derivation of the formula leads to a Sylvester equation in the correction matrix. Examples are given.

1 Introduction

Quasi-Newton methods have long been at the core of nonlinear optimization, both in the constrained and unconstrained case. Such methods are characterized by their use of an approximation of the Hessian of the underlying nonlinear functions, which is typically recurred by low-rank modifications of an initial estimate. The corresponding updates are derived variationally, to enforce a minimal correction in a suitable norm while guaranteeing the preservation of available secant equations, which provide information about local curvature. By far the most common case is when a single such secant equation is considered for updating the Hessian approximation, and several famous formulae have been derived in this context, such as the DFP (Davidon, 1959, Fletcher and Powell, 1963), BFGS (Broyden, 1970, Fletcher, 1970, Goldfarb, 1970, Shanno, 1970) or PSB (Powell, 1970) updates. More unusual is the proposal by Schnabel (1983) (see also Byrd, Nocedal and Schnabel, 1994) to consider the simultaneous enforcement of several secant equations. The purpose of this small note, whose content follows up from Malmedy (2010), is to pursue this line of thought, but in a slightly relaxed context where the deviation from the multiple secant equations is penalized rather than strictly forced to zero. Our development is primarily motivated by an attempt to explain the influence of the order in which secant pairs are considered in limited-memory BFGS, as described in Malmedy (2010), where a weighted combination of secant updates is used to conduct this investigation. This motivation was recently reinforced by a presentation of a stochastic context in which secant equations are enforced on average (Nocedal, 2013).

This short paper is organized as follows. The new penalized multiple secant update is derived in Section 2, while a few examples are considered in Section 3. Conclusions are drawn in Section 4.

2 Penalized multisecant quasi-Newton updates

We consider deriving a new quasi-Newton approximation B_+ for the Hessian of a nonlinear function from IR^n into IR from an existing one B, while at the same time attempting to satisfy m secant equations. More specifically, we assume that m secant pairs of the form (s_i, y_i) (i = 1, ..., m) are at our disposal. These pairs are supposed to contain useful curvature information and are typically derived by computing differences y_i in the gradient of the underlying nonlinear function corresponding to steps s_i in the variable space IR^n. The matrix E holds the correction B_+ - B. The vectors s_i and y_i form the columns of the n x m matrices S and Y, respectively.

ENSEEIHT-IRIT, 2, rue Camichel, 31000 Toulouse, France. serge.gratton@enseeiht.fr
LMS SAMTECH, A Siemens Business, 15-16, Lower Park Row, BS1 5BN Bristol, UK. vincent.malmedy@gmail.com
Namur Center for Complex Systems (naXys) and Department of Mathematics, University of Namur, 61, rue de Bruxelles, B-5000 Namur, Belgium. philippe.toint@unamur.be
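As a concrete illustration of this setup (not part of the original note), the following minimal sketch, which assumes only numpy and an arbitrary quadratic model, builds m secant pairs from gradient differences; for a quadratic, the pairs carry exact curvature information.

import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 3

# Hessian of a toy quadratic model: for such a model the gradient difference
# over a step s is exactly (Hessian) @ s, so the secant pairs are exact.
Hess = rng.standard_normal((n, n))
Hess = Hess + Hess.T

S = rng.standard_normal((n, m))   # columns s_i: steps in IR^n
Y = Hess @ S                      # columns y_i: corresponding gradient differences
B = np.eye(n)                     # current Hessian approximation
R = Y - B @ S                     # residuals r_i = y_i - B s_i of the m secant equations at B

print(np.allclose(Y, Hess @ S))   # True: each pair satisfies y_i = Hess s_i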

As is well known (see Dennis and Schnabel, 1983, for instance), most secant updates may be derived by a variational approach. To define multisecant versions of the classical formulae, it would therefore make sense to solve the constrained variational problem

  min_E (1/2) ||W^{-1} E W^{-T}||_F^2                                          (2.1a)
  s.t.  (B + E) s_i = y_i  for i = 1, ..., m,  and  E = E^T,                   (2.1b)

using some particular nonsingular weighting matrix W, whose choice determines the formula that is obtained. For instance, choosing W = I yields the multisecant PSB update, while any matrix W such that W W^T S = Y yields the multisecant DFP update. The multisecant BFGS may then be obtained by duality from its DFP counterpart. This is the approach followed by Schnabel (1983) (see also Byrd et al., 1994) for a multisecant analog of the BFGS formula. We propose here to consider a slightly different framework: instead of solving (2.1), we aim to solve the penalized problem

  min_E (1/2) ||W^{-1} E W^{-T}||_F^2 + (1/2) sum_{i=1}^m ω_i ||(B + E) s_i - y_i||_{Ŵ^{-1}}^2    (2.2a)
  s.t.  E = E^T,                                                                                   (2.2b)

with Ŵ = W W^T, in effect relaxing the secant equations by penalizing their violation with independent nonnegative penalty parameters ω_i (i = 1, ..., m).

2.1 Solving the penalized variational problem

Zeroing the derivative of the Lagrangian function of problem (2.2) with respect to E in the matrix direction K, we obtain that

  tr( W^{-1} E W^{-T} W^{-1} K W^{-T} ) + tr( K^T (M - M^T) ) + sum_{i=1}^m ω_i s_i^T K^T Ŵ^{-1} ( (B + E) s_i - y_i ) = 0,    (2.3)

where M is the Lagrange multiplier corresponding to the symmetry constraint. Using the vectorization operator vec(.), this equation may be rewritten as

  < vec( Ŵ^{-1} E Ŵ^{-1} + M - M^T ), vec(K) > + sum_{i=1}^m ω_i < s_i ⊗ (Ŵ^{-1} E s_i) - s_i ⊗ (Ŵ^{-1} r_i), vec(K) > = 0,    (2.4)

where r_i = y_i - B s_i is the residual of the i-th secant equation at the current Hessian approximation. As equation (2.4) must hold for every matrix K, we thus have that

  vec( Ŵ^{-1} E Ŵ^{-1} + M - M^T ) + sum_{i=1}^m ω_i ( s_i ⊗ (Ŵ^{-1} E s_i) - s_i ⊗ (Ŵ^{-1} r_i) ) = 0,

which is equivalent to

  Ŵ^{-1} E Ŵ^{-1} + M - M^T + sum_{i=1}^m ω_i ( Ŵ^{-1} E s_i - Ŵ^{-1} r_i ) s_i^T = 0.

By summing this equation with its transpose and then pre- and post-multiplying by Ŵ, we eliminate the Lagrange multiplier M and obtain that

  E ( I + sum_{i=1}^m ω_i s_i s_i^T Ŵ ) + ( I + sum_{i=1}^m ω_i Ŵ s_i s_i^T ) E = sum_{i=1}^m ω_i ( Ŵ s_i r_i^T + r_i s_i^T Ŵ ).
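As a numerical cross-check (an illustration only, not part of the original derivation), the sketch below forms the two coefficient matrices and the right-hand side of the last displayed equation for arbitrary data and an arbitrary symmetric positive definite Ŵ, and solves it with scipy's generic Sylvester solver; the computed correction is symmetric, as established in the next section.

import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(1)
n, m = 6, 3
S = rng.standard_normal((n, m))
Y = rng.standard_normal((n, m))
B = np.eye(n)
R = Y - B @ S                                   # residuals r_i = y_i - B s_i
omega = np.array([1.0, 2.0, 0.5])               # penalty parameters ω_i
M = rng.standard_normal((n, n))
W_hat = M @ M.T + n * np.eye(n)                 # an arbitrary symmetric positive definite Ŵ

# Left coefficient I + Σ ω_i Ŵ s_i s_i^T, right coefficient I + Σ ω_i s_i s_i^T Ŵ,
# and right-hand side Σ ω_i (Ŵ s_i r_i^T + r_i s_i^T Ŵ).
A_left = np.eye(n) + W_hat @ S @ np.diag(omega) @ S.T
A_right = np.eye(n) + S @ np.diag(omega) @ S.T @ W_hat
C = W_hat @ S @ np.diag(omega) @ R.T + R @ np.diag(omega) @ S.T @ W_hat

E = solve_sylvester(A_left, A_right, C)         # solves A_left E + E A_right = C
print(np.allclose(E, E.T))                      # True: the unique solution is symmetric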

If we define the n x m matrices R = Y - B S, R̄ = R Ω^{1/2}, S̄ = S Ω^{1/2} and Z = Ŵ S̄, where Ω = diag(ω_i), we thus have to solve the Lyapunov equation

  A E + E A^T = C,                                                         (2.5)

where A = I + Z S̄^T and C = R̄ Z^T + Z R̄^T. We know that equation (2.5) has a unique solution whenever A does not have opposite eigenvalues (see Lancaster and Tismenetsky, 1985). But A = I + Z S̄^T = I + Ŵ S̄ S̄^T, and from the relation sp(A) ∪ {1} = sp(I_m + S̄^T Ŵ S̄) ∪ {1}, we may deduce that A has only positive eigenvalues. Thus (2.5) has a unique solution. Transposing the left- and right-hand sides of (2.5) and using the symmetry of C, we furthermore see that E^T also satisfies (2.5). Therefore, since the solution of this equation is unique, E is symmetric. We now show that E can be written in the form

  E = (R̄, Z) [ X_1  X_2 ; X_2^T  X_3 ] (R̄, Z)^T                            (2.6)

where X_1 and X_3 are symmetric. Substituting this expression into (2.5) and using

  C = R̄ Z^T + Z R̄^T = (R̄, Z) [ 0  I_m ; I_m  0 ] (R̄, Z)^T

yields

  (R̄, Z) ( [ 2 X_1  2 X_2 ; 2 X_2^T  2 X_3 ]
            + [ 0  0 ; S̄^T R̄ X_1 + S̄^T Z X_2^T   S̄^T R̄ X_2 + S̄^T Z X_3 ]
            + [ 0   X_1 R̄^T S̄ + X_2 S̄^T Z ; 0   X_2^T R̄^T S̄ + X_3 S̄^T Z ]
            - [ 0  I_m ; I_m  0 ] ) (R̄, Z)^T = 0.

A solution of (2.5), and thus the unique one, can therefore be obtained by setting

  X_1 = 0,    X_2 = ( 2 I_m + S̄^T Z )^{-1}                                  (2.7)

and choosing X_3 to be the solution of the Lyapunov equation

  ( I_m + S̄^T Z ) X_3 + X_3 ( I_m + S̄^T Z ) = - S̄^T R̄ X_2 - X_2^T R̄^T S̄.    (2.8)

But the matrix

  I_m + S̄^T Z = I_m + S̄^T Ŵ S̄                                               (2.9)

is symmetric and positive definite. Hence equation (2.8) also has a unique solution, which can directly be computed using the eigen-decomposition of the m x m symmetric matrix I_m + S̄^T Z. Therefore, using (2.6), (2.7) and the fact that (2.9) implies the symmetry of X_2, we obtain that the solution of (2.3) can be written as

  E = R̄ ( 2 I_m + S̄^T Z )^{-1} Z^T + Z ( 2 I_m + S̄^T Z )^{-1} R̄^T + Z X_3 Z^T,    (2.10)

where X_3 solves (2.8). Note that the core of the computation needed to obtain E (the determination of X_2 and X_3) only involves forming m x m matrices from the n x m factors and then working in this smaller space, resulting in an O(m^2 n + m^3) computational complexity.
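The small-space computation described above can be sketched as follows (illustrative code only, assuming numpy and a given symmetric positive definite Ŵ; the data are arbitrary): once Z = Ŵ S̄ is formed, X_2 and X_3 are obtained from m x m quantities only, the Lyapunov equation (2.8) being solved through the eigen-decomposition of I_m + S̄^T Z, and the result is checked against (2.5).

import numpy as np

def penalized_correction(B, S, Y, omega, W_hat):
    """Correction E of (2.10); once Z = Ŵ S̄ is available, the determination of
    X_2 and X_3 only involves m x m matrices (O(m^2 n + m^3) work)."""
    n, m = S.shape
    sqrt_om = np.sqrt(omega)
    Sbar = S * sqrt_om                          # S̄ = S Ω^{1/2}
    Rbar = (Y - B @ S) * sqrt_om                # R̄ = (Y - B S) Ω^{1/2}
    Z = W_hat @ Sbar                            # Z = Ŵ S̄
    G = Sbar.T @ Z                              # m x m symmetric matrix S̄^T Z
    X2 = np.linalg.inv(2.0 * np.eye(m) + G)     # X_2 from (2.7)
    # Solve (I_m + G) X_3 + X_3 (I_m + G) = -S̄^T R̄ X_2 - X_2^T R̄^T S̄ via the
    # eigen-decomposition I_m + G = Q diag(lam) Q^T, as suggested for (2.8).
    lam, Q = np.linalg.eigh(np.eye(m) + G)
    rhs = -(Sbar.T @ Rbar) @ X2 - X2.T @ (Rbar.T @ Sbar)
    X3 = Q @ ((Q.T @ rhs @ Q) / (lam[:, None] + lam[None, :])) @ Q.T
    return Rbar @ X2 @ Z.T + Z @ X2 @ Rbar.T + Z @ X3 @ Z.T   # formula (2.10)

rng = np.random.default_rng(2)
n, m = 8, 3
S, Y = rng.standard_normal((n, m)), rng.standard_normal((n, m))
B = np.eye(n)
M = rng.standard_normal((n, n))
W_hat = M @ M.T + n * np.eye(n)
omega = np.array([1.0, 0.5, 2.0])

E = penalized_correction(B, S, Y, omega, W_hat)

# Check against the Lyapunov equation (2.5): A E + E A^T = C.
Sbar = S * np.sqrt(omega)
Rbar = (Y - B @ S) * np.sqrt(omega)
Z = W_hat @ Sbar
A = np.eye(n) + Z @ Sbar.T
C = Rbar @ Z.T + Z @ Rbar.T
print(np.allclose(A @ E + E @ A.T, C), np.allclose(E, E.T))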

An important particular case is the one where we have only one secant equation, i.e. m = 1. In this case, the Lyapunov equation (2.8) is a scalar equation and, defining r = y - B s, r̄ = ω^{1/2} r, s̄ = ω^{1/2} s and z = Ŵ s̄, we obtain that

  E = ( r̄ z^T + z r̄^T ) / ( 2 + z^T s̄ ) - ( s̄^T r̄ ) / [ ( 2 + z^T s̄ )( 1 + z^T s̄ ) ] z z^T
    = ( r s^T Ŵ + Ŵ s r^T ) / ( 2 ω^{-1} + s^T Ŵ s ) - ( s^T r ) / [ ( 2 ω^{-1} + s^T Ŵ s )( ω^{-1} + s^T Ŵ s ) ] Ŵ s s^T Ŵ.    (2.11)

As the scalar ω goes to infinity, we get the update

  E = ( r s^T Ŵ + Ŵ s r^T ) / ( s^T Ŵ s ) - ( s^T r ) / ( s^T Ŵ s )^2 Ŵ s s^T Ŵ.

3 Examples of penalized updates

3.1 The penalized PSB update

If we choose Ŵ = I, we obtain from (2.10) and (2.8) that

  B_+ = B + R ( 2 Ω^{-1} + S^T S )^{-1} S^T + S ( 2 Ω^{-1} + S^T S )^{-1} R^T + S Ω^{1/2} X_3 Ω^{1/2} S^T,    (3.2)

where X_3 solves

  ( I_m + Ω^{1/2} S^T S Ω^{1/2} ) X_3 + X_3 ( I_m + Ω^{1/2} S^T S Ω^{1/2} ) = - Ω^{1/2} S^T R Ω^{1/2} X_2 - X_2 Ω^{1/2} R^T S Ω^{1/2}    (3.3)

with X_2 = ( 2 I + Ω^{1/2} S^T S Ω^{1/2} )^{-1}. If we consider the single-secant case m = 1, we derive from (2.11) that

  B_+ = B + ( r s^T + s r^T ) / ( 2 ω^{-1} + ||s||^2 ) - <s, r> / [ ( 2 ω^{-1} + ||s||^2 )( ω^{-1} + ||s||^2 ) ] s s^T.    (3.4)

3.2 The penalized DFP update

If we now assume that the matrices B and S^T Y are symmetric, and that S^T Y is positive definite, we deduce that

  B_+ = B + R ( 2 Ω^{-1} + S^T Y )^{-1} Y^T + Y ( 2 Ω^{-1} + S^T Y )^{-1} R^T + Y Ω^{1/2} X_3 Ω^{1/2} Y^T,    (3.5)

where X_2^{-1} = 2 I + Ω^{1/2} S^T Y Ω^{1/2} and X_3 solves

  ( X_2^{-1} - I ) X_3 + X_3 ( X_2^{-1} - I ) = - Ω^{1/2} S^T R Ω^{1/2} X_2 - X_2 Ω^{1/2} R^T S Ω^{1/2}.    (3.6)

Note also that a direct computation shows that

  [ ( X_2^{-1} - I ) + I ] X_2 Ω^{1/2} S^T R Ω^{1/2} X_2 + X_2 Ω^{1/2} S^T R Ω^{1/2} X_2 [ ( X_2^{-1} - I ) + I ] = Ω^{1/2} S^T R Ω^{1/2} X_2 + X_2 Ω^{1/2} S^T R Ω^{1/2}.    (3.7)

To simplify and better understand the relation between the penalized and a standard DFP formula in block form, we set

  X_4 = X_3 + X_2 Ω^{1/2} S^T R Ω^{1/2} X_2,    (3.8)

sum up equations (3.7) and (3.6), and obtain that

  ( I + Ω^{1/2} S^T Y Ω^{1/2} ) X_4 + X_4 ( I + Ω^{1/2} S^T Y Ω^{1/2} ) = - 2 X_2 Ω^{1/2} S^T R Ω^{1/2} X_2,    (3.9)

where we used the fact that S^T R = S^T Y - S^T B S is symmetric under the above assumptions.
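To illustrate the behaviour of the penalized PSB formula (3.4) (a sketch only, not part of the original note; it assumes numpy and arbitrary data), one can verify numerically that the secant residual decreases as ω grows and that the update approaches the classical PSB update in the limit.

import numpy as np

rng = np.random.default_rng(3)
n = 5
s, y = rng.standard_normal(n), rng.standard_normal(n)
B = np.eye(n)
r = y - B @ s

def penalized_psb(B, s, y, omega):
    # Single-secant penalized PSB update, formula (3.4) (Ŵ = I).
    r = y - B @ s
    d1 = 2.0 / omega + s @ s
    d2 = 1.0 / omega + s @ s
    return B + (np.outer(r, s) + np.outer(s, r)) / d1 - (s @ r) / (d1 * d2) * np.outer(s, s)

# Classical PSB update: the ω -> ∞ limit of (3.4).
psb = B + (np.outer(r, s) + np.outer(s, r)) / (s @ s) - (s @ r) / (s @ s) ** 2 * np.outer(s, s)

for omega in (1.0, 1e2, 1e6):
    Bp = penalized_psb(B, s, y, omega)
    print(omega, np.linalg.norm(Bp @ s - y), np.linalg.norm(Bp - psb))
# Both the secant residual ||B_+ s - y|| and the distance to the PSB update shrink as ω increases.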

We also deduce from (3.8) that

  Ω^{1/2} X_3 Ω^{1/2} = - ( 2 Ω^{-1} + S^T Y )^{-1} S^T R ( 2 Ω^{-1} + S^T Y )^{-1} + Ω^{1/2} X_4 Ω^{1/2},

and inserting this expression into the penalized DFP update (3.5), we obtain that

  B_+ = [ I - Y ( 2 Ω^{-1} + S^T Y )^{-1} S^T ] B [ I - S ( 2 Ω^{-1} + S^T Y )^{-1} Y^T ]
        + Y [ 2 ( 2 Ω^{-1} + S^T Y )^{-1} - ( 2 Ω^{-1} + S^T Y )^{-1} S^T Y ( 2 Ω^{-1} + S^T Y )^{-1} + X_5 ] Y^T,    (3.20)

where, in view of (3.9), X_5 = Ω^{1/2} X_4 Ω^{1/2} solves

  ( I + Ω S^T Y ) X_5 + X_5 ( I + S^T Y Ω ) = - 2 ( 2 Ω^{-1} + S^T Y )^{-1} S^T R ( 2 Ω^{-1} + S^T Y )^{-1}.    (3.21)

But, using S^T Y = Y^T S,

  2 ( 2 Ω^{-1} + S^T Y )^{-1} - ( 2 Ω^{-1} + S^T Y )^{-1} S^T Y ( 2 Ω^{-1} + S^T Y )^{-1}
    = ( 2 Ω^{-1} + S^T Y )^{-1} [ 2 ( 2 Ω^{-1} + S^T Y ) - S^T Y ] ( 2 Ω^{-1} + S^T Y )^{-1}
    = ( 2 Ω^{-1} + S^T Y )^{-1} ( 4 Ω^{-1} + S^T Y ) ( 2 Ω^{-1} + S^T Y )^{-1},

and hence (3.20) finally becomes

  B_+ = [ I - Y ( 2 Ω^{-1} + S^T Y )^{-1} S^T ] B [ I - S ( 2 Ω^{-1} + S^T Y )^{-1} Y^T ]
        + Y [ ( 2 Ω^{-1} + S^T Y )^{-1} ( 4 Ω^{-1} + S^T Y ) ( 2 Ω^{-1} + S^T Y )^{-1} + X_5 ] Y^T,    (3.22)

where X_5 solves (3.21). If we set Ω = ω I and assume that S^T Y is nonsingular, we obtain from (3.21) that X_5 = O(ω^{-1}), and we see from (3.22) that the penalized update converges to the DFP update in block form when ω goes to infinity. If m = 1, (2.11) gives that

  B_+ = B + ( r y^T + y r^T ) / ( 2 ω^{-1} + <s, y> ) - <s, r> / [ ( 2 ω^{-1} + <s, y> )( ω^{-1} + <s, y> ) ] y y^T.    (3.23)

3.3 The penalized BFGS update

Let us assume again that S^T Y is symmetric and positive definite, and set P = S - H Y, with H being an approximation of the inverse Hessian. Exchanging the roles of Y and S in the penalized DFP formula, we derive the penalized BFGS update defined by

  H_+ = [ I - S ( 2 Ω^{-1} + Y^T S )^{-1} Y^T ] H [ I - S ( 2 Ω^{-1} + Y^T S )^{-1} Y^T ]^T
        + S [ ( 2 Ω^{-1} + Y^T S )^{-1} ( 4 Ω^{-1} + Y^T S ) ( 2 Ω^{-1} + Y^T S )^{-1} + X_6 ] S^T,    (3.24)

where X_6 solves

  ( I + Ω Y^T S ) X_6 + X_6 ( I + Y^T S Ω ) = - 2 ( 2 Ω^{-1} + Y^T S )^{-1} Y^T P ( 2 Ω^{-1} + Y^T S )^{-1}.    (3.25)

If we set Ω = ω I and consider large values of ω, we see that the penalized update again converges to the BFGS formula. We also conclude that, for ω sufficiently large, the penalized update will be positive definite. If S^T Y is not symmetric and positive definite, an approximate update could be computed by replacing the matrix ( 2 Ω^{-1} + Y^T S )^{-1} ( 4 Ω^{-1} + Y^T S ) ( 2 Ω^{-1} + Y^T S )^{-1} + X_6 between S and S^T in expression (3.24) by any symmetric and positive definite approximation.
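The limiting behaviour claimed above is easy to observe numerically in the single-secant case; the following sketch (illustrative only, assuming numpy) applies the penalized DFP formula (3.23) to a pair with <s, y> > 0 and compares it with the classical DFP update for increasing ω, also monitoring positive definiteness.

import numpy as np

rng = np.random.default_rng(4)
n = 5
A = rng.standard_normal((n, n))
Hess = A @ A.T + n * np.eye(n)      # a symmetric positive definite model Hessian
s = rng.standard_normal(n)
y = Hess @ s                        # secant pair with <s, y> > 0
B = np.eye(n)
r = y - B @ s
sty = s @ y

def penalized_dfp(B, s, y, omega):
    # Single-secant penalized DFP update, formula (3.23).
    r = y - B @ s
    d1 = 2.0 / omega + s @ y
    d2 = 1.0 / omega + s @ y
    return B + (np.outer(r, y) + np.outer(y, r)) / d1 - (s @ r) / (d1 * d2) * np.outer(y, y)

# Classical DFP update: the ω -> ∞ limit of (3.23).
dfp = B + (np.outer(r, y) + np.outer(y, r)) / sty - (s @ r) / sty ** 2 * np.outer(y, y)

for omega in (1.0, 1e2, 1e6):
    Bp = penalized_dfp(B, s, y, omega)
    print(omega, np.linalg.norm(Bp - dfp), np.linalg.eigvalsh(Bp).min() > 0)
# The distance to the DFP update vanishes as ω grows, and B_+ stays positive definite for ω large enough.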

In the case of a single secant pair (m = 1), we obtain that

  H_+ = H + ( p s^T + s p^T ) / ( 2 ω^{-1} + <s, y> ) - <p, y> / [ ( 2 ω^{-1} + <s, y> )( ω^{-1} + <s, y> ) ] s s^T
      = [ I - ρ s y^T / ( 1 + 2 ρ ω^{-1} ) ] H [ I - ρ y s^T / ( 1 + 2 ρ ω^{-1} ) ] + θ ρ s s^T,    (3.27)

with ρ = 1 / <s, y>, p = s - H y and

  θ = [ ( 1 + 2 ρ ω^{-1} )^2 + ρ^2 ω^{-1} <y, H y> ] / [ ( 1 + 2 ρ ω^{-1} )^2 ( 1 + ρ ω^{-1} ) ].

4 Conclusions

We have used variational techniques to derive a new algorithm for updating a quasi-Newton Hessian approximation using multiple secant equations. The cost of the associated updates has been discussed and remains reasonable in the sense that it does not involve terms in the cube of the problem dimension. It is hoped that the new algorithm may prove useful beyond its initial application in Malmedy (2010).

References

C. G. Broyden. The convergence of a class of double-rank minimization algorithms. Journal of the Institute of Mathematics and its Applications, 6, 76-90, 1970.

R. H. Byrd, J. Nocedal, and R. B. Schnabel. Representations of quasi-Newton matrices and their use in limited memory methods. Mathematical Programming, 63, 129-156, 1994.

W. C. Davidon. Variable metric method for minimization. Report ANL-5990(Rev.), Argonne National Laboratory, Research and Development, 1959. Republished in the SIAM Journal on Optimization, 1, 1-17, 1991.

J. E. Dennis and R. B. Schnabel. Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Prentice-Hall, Englewood Cliffs, NJ, USA, 1983. Reprinted as Classics in Applied Mathematics 16, SIAM, Philadelphia, USA, 1996.

R. Fletcher. A new approach to variable metric algorithms. Computer Journal, 13, 317-322, 1970.

R. Fletcher and M. J. D. Powell. A rapidly convergent descent method for minimization. Computer Journal, 6, 163-168, 1963.

D. Goldfarb. A family of variable metric methods derived by variational means. Mathematics of Computation, 24, 23-26, 1970.

P. Lancaster and M. Tismenetsky. The Theory of Matrices. Academic Press, London, second edition, 1985.

V. Malmedy. Hessian approximation in multilevel nonlinear optimization. PhD thesis, Department of Mathematics, University of Namur, Namur, Belgium, 2010.

J. Nocedal. Private communication, Toulouse, 2013.

M. J. D. Powell. A new algorithm for unconstrained optimization. In J. B. Rosen, O. L. Mangasarian and K. Ritter, eds, Nonlinear Programming, pp. 31-65, Academic Press, London, 1970.

R. B. Schnabel. Quasi-Newton methods using multiple secant equations. Technical Report CU-CS, Department of Computer Science, University of Colorado at Boulder, Boulder, USA, 1983.

D. F. Shanno. Conditioning of quasi-Newton methods for function minimization. Mathematics of Computation, 24, 647-656, 1970.
