A Novel of Step Size Selection Procedures for Steepest Descent Method

Applied Mathematical Sciences, Vol. 6, 2012, no. 5

A Novel of Step Size Selection Procedures for Steepest Descent Method

1 Goh Khang Wen, 2 Mustafa Mamat, 2 Ismail bin Mohd, 3 Yosza Dasril

1 Department of Physical and Mathematical Science, Faculty of Science, Universiti Tunku Abdul Rahman, Malaysia.
2 Department of Mathematics, Universiti Malaysia Terengganu, Kuala Terengganu, Malaysia.
3 Department of Industrial Electronics, Faculty of Electronics and Computer Engineering, Universiti Teknikal Malaysia Melaka (UTeM), Malaysia.

Abstract: It is well known that the classical steepest descent method, which uses the exact line search procedure to determine the step size, converges very slowly to the solution. Several effective inexact line search procedures have been proposed to overcome this weakness. Besides that, Barzilai and Borwein (BB) have proposed two surprising non-line-search procedures for determining the step size of the steepest descent method, which have been proved to be R-superlinearly convergent for convex quadratics in two-dimensional space. However, in order to give the greatest possible reduction of the objective function along the search direction, we have introduced a Newton-like exact line search procedure. In this paper we first elaborate several well known step size selection procedures in more detail. Then, a numerical performance comparison of all the selected step size selection procedures is carried out. The results show that the Newton-like exact line search performs better than the other well known procedures.

1. Introduction

The classical steepest descent method, which was designed by Cauchy (1847), can be considered among the most important procedures for the minimization of a real-valued function defined on R^n. The steepest descent step size appears in

x_{k+1} = x_k + λ_k d_k,   (1.1)

in which the step size λ_k is obtained using the exact line search [4]

λ_k = arg min_{λ>0} f(x_k + λ d_k),   (1.2)

λ_k = min{ λ > 0 : f'(x_k + λ d_k) = 0 },   (1.3)

respectively, where d_k = -∇f(x_k). However, for solving some complicated optimization problems it is difficult to compute the step size λ_k using (1.2) and (1.3) in practical computation, and sometimes it is even impossible [5,13,14]. Therefore, several inexact line search conditions have been introduced for determining the step size of the steepest descent method, such as the Armijo condition [1], the Goldstein condition [7], or the Wolfe condition [12]. It is easy to show that the steepest descent method with those conditions is always convergent, and theoretically the method will only terminate after a stationary point is found. According to previous studies of inexact line search conditions implemented in the steepest descent method [2,9,13], we found that the Armijo line search and the backtracking line search [10] are easier to apply and more effective than the others.

A line search procedure is a useful and efficient technique for solving unconstrained optimization problems, especially small and middle scale problems, such as determining a step size for the steepest descent method. However, some researchers have noted that if a line search is carried out at every iteration of the approximation, it leads to a significant amount of computational cost. Therefore, in order to reduce the number of evaluations of the objective function and its gradient, several researchers have tried to avoid using line searches in their algorithms. In 1988, Barzilai and Borwein (BB) derived two-point step sizes for the steepest descent method by approximating the secant equation [3]. They proposed two new formulae to compute the step size λ_k for the steepest descent method, which have been proved to be R-superlinearly convergent for convex quadratics in two-dimensional space.
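Condition (1.3) turns the exact line search into a one-dimensional root-finding problem in λ along the ray x_k + λ d_k, which generally has to be solved numerically at every iteration. The following C++ sketch is our own illustration of that cost; the objective function, the finite-difference derivative and the bracketing-plus-bisection root finder are illustrative choices, not the paper's procedure.

```cpp
#include <cmath>
#include <cstdio>

// Illustrative objective (our choice): f(x1, x2) = x1^4 + x2^2.
double f(double x1, double x2) { return std::pow(x1, 4) + x2 * x2; }

// phi'(lambda) = d/dlambda f(x + lambda d), approximated by a central difference.
double dphi(double x1, double x2, double d1, double d2, double lam, double h = 1e-7) {
    return (f(x1 + (lam + h) * d1, x2 + (lam + h) * d2)
          - f(x1 + (lam - h) * d1, x2 + (lam - h) * d2)) / (2.0 * h);
}

int main() {
    double x1 = 2.0, x2 = 1.0;
    // Steepest descent direction d = -grad f(x) = (-4*x1^3, -2*x2).
    double d1 = -4.0 * std::pow(x1, 3), d2 = -2.0 * x2;

    // Exact line search in the sense of (1.3): find lambda > 0 with phi'(lambda) = 0.
    // Bracket a sign change by doubling, then bisect; each test costs further
    // function evaluations, which is the computational cost referred to above.
    double lo = 0.0, hi = 1e-4;
    while (dphi(x1, x2, d1, d2, hi) < 0.0) hi *= 2.0;
    for (int i = 0; i < 60; ++i) {
        double mid = 0.5 * (lo + hi);
        if (dphi(x1, x2, d1, d2, mid) < 0.0) lo = mid; else hi = mid;
    }
    double lambda = 0.5 * (lo + hi);
    std::printf("lambda = %.8f, new point = (%.6f, %.6f)\n",
                lambda, x1 + lambda * d1, x2 + lambda * d2);
    return 0;
}
```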

In 1989, the JM method [8] was proposed, which improves the BB methods by taking the average value of the two BB step sizes. Even though several inexact line search and non-line-search procedures have been proposed, in previous studies on line search we have found that only the exact line search gives the greatest possible reduction of the objective function along the search direction. All exact line search procedures, which must satisfy the condition (1.2), are always computed by using (1.3). However, previous researchers have shown the weakness of (1.3) in computing the step size λ_k [5,13,14]. Therefore, based on the Newton-Raphson method, we have introduced an alternative Newton-like exact line search [5,6], and we have shown in our preliminary study of the method that it is practically computable.

This paper is organized as follows. In Section 2, we elaborate the procedures mentioned above for determining the step size λ_k of the steepest descent method. In Section 3, we describe the Newton-like exact line search procedure and present the algorithm of the steepest descent method. Section 4 contains numerical results of the testing examples for the methods discussed in this paper. The conclusion which ends this paper is given in Section 5.

2. Step Size

How should a suitable step size for the steepest descent method be selected? In the early studies of this method, a variety of research was done on determining a suitable step size for the algorithm in solving optimization problems. Several well known procedures, mentioned in the previous section, have been selected and will be discussed in this section.

2.1 Cauchy's step size (exact line search)

The classical and oldest steepest descent step size λ_k, which was designed by Cauchy, is computed as

λ_k = (g_k^T g_k) / (g_k^T A g_k),   (2.1)

where g_k = ∇f(x_k) and A = ∇²f(x_k). Equation (2.1) can be derived as follows.

Consider the quadratic model problem given by

min f(x) = (1/2) x^T H x - b^T x,   x ∈ R^n.   (2.1.1)

Since g_k = ∇f(x_k), we have

g_k = H x_k - b.   (2.1.2)

The step size is computed by using the exact line search in (1.3), to give

∇f(x_k - λ g_k) = H(x_k - λ g_k) - b = 0.   (2.1.3)

By introducing (2.1.2) into (2.1.3), we obtain

g_k - λ H g_k = 0.   (2.1.4)

Since the step size λ ∈ R, premultiplying (2.1.4) by g_k^T gives

g_k^T g_k = λ g_k^T H g_k,   (2.1.5)

from which equation (2.1) is obtained. Even though the step size formula (2.1) is derived from the quadratic model problem, since there is no other suggested formula for complicated optimization problems and it is very hard to compute the step size λ_k using (1.3), we still use (2.1) as Cauchy's step size formula when solving the complicated optimization problems listed in Section 4.
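As an illustration of (2.1), the following C++ sketch (our own, not the paper's Visual C++ program) performs a single steepest descent step on the quadratic model (2.1.1) with the Cauchy step size; the matrix H, the vector b and the initial point are illustrative values.

```cpp
#include <array>
#include <cstdio>

// Quadratic model f(x) = 0.5*x^T H x - b^T x in R^2 (illustrative data).
const double H[2][2] = {{4.0, 1.0}, {1.0, 3.0}};
const double b[2]    = {1.0, 2.0};

// Gradient g(x) = H x - b, see (2.1.2).
std::array<double, 2> grad(const std::array<double, 2>& x) {
    return { H[0][0]*x[0] + H[0][1]*x[1] - b[0],
             H[1][0]*x[0] + H[1][1]*x[1] - b[1] };
}

int main() {
    std::array<double, 2> x = {5.0, 5.0};          // initial point
    std::array<double, 2> g = grad(x);

    // Cauchy step size (2.1): lambda = (g^T g) / (g^T H g).
    double gg  = g[0]*g[0] + g[1]*g[1];
    double Hg0 = H[0][0]*g[0] + H[0][1]*g[1];
    double Hg1 = H[1][0]*g[0] + H[1][1]*g[1];
    double gHg = g[0]*Hg0 + g[1]*Hg1;
    double lambda = gg / gHg;

    // One steepest descent step (1.1) with d = -g.
    x[0] -= lambda * g[0];
    x[1] -= lambda * g[1];
    std::printf("lambda = %.6f, x = (%.6f, %.6f)\n", lambda, x[0], x[1]);
    return 0;
}
```

For a quadratic objective this step size minimizes f exactly along the steepest descent direction, which is why it coincides with the exact line search (1.2) in that case.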

2.2 Armijo Line Search (inexact line search)

Zhen (2006) notes that, among the several well known inexact line search procedures introduced by previous researchers, the Armijo line search [1] is one of the most useful and the easiest to implement computationally. The Armijo line search rule is described as follows. Given s > 0, β ∈ (0,1) and σ ∈ (0,1), take

λ_k = max{ s, sβ, sβ^2, ... }

such that

f(x_k + λ_k d_k) ≤ f(x_k) + σ λ_k g_k^T d_k,   (2.2)

where d_k = -g_k = -∇f(x_k).

2.3 Backtracking Line Search (inexact line search)

Neculai [9] and Burachik [2] state that the backtracking line search is simpler and more effective than other inexact line search methods [10]. The backtracking line search is obtained from a modification of the Armijo line search, and its procedure is as follows.

Procedure Backtracking:
! This procedure computes the step size λ_k using the backtracking line search.
1. Set t = 1.
2. while f(x_k + t d_k) > f(x_k) + σ t g_k^T d_k, do
   2.1 t = t / 2.
3. λ_k = t.
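The following C++ sketch is our own rendering of the backtracking procedure above; the vector-based interface and the default σ = 1e-4 are assumptions made for illustration, not the parameter values used in the paper's experiments.

```cpp
#include <vector>
#include <functional>

// Backtracking line search along direction d from point x.
// f     : objective function
// fx    : f(x), already computed
// gtd   : g^T d, the directional derivative at x (negative for a descent direction)
// sigma : sufficient-decrease parameter in (0,1); 1e-4 is an illustrative default
double backtracking(const std::function<double(const std::vector<double>&)>& f,
                    const std::vector<double>& x, const std::vector<double>& d,
                    double fx, double gtd, double sigma = 1e-4) {
    double t = 1.0;                                   // 1. set t = 1
    std::vector<double> xt(x.size());
    for (;;) {
        for (size_t i = 0; i < x.size(); ++i) xt[i] = x[i] + t * d[i];
        if (f(xt) <= fx + sigma * t * gtd) break;     // 2. Armijo-type test (2.2)
        t *= 0.5;                                     // 2.1 halve the trial step
    }
    return t;                                         // 3. lambda_k = t
}
```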

2.4 Barzilai and Borwein (BB) method

Barzilai and Borwein [3] derived two-point step sizes for the steepest descent routine by approximating the secant equation underlying quasi-Newton methods. They determined two step sizes λ_k by minimizing ||Δx_{k-1} - λ Δg_{k-1}|| and its symmetric counterpart ||λ^{-1} Δx_{k-1} - Δg_{k-1}|| with respect to λ, where Δx_{k-1} = x_k - x_{k-1} and Δg_{k-1} = g_k - g_{k-1}, from which we obtain

λ_k = (Δx_{k-1}^T Δx_{k-1}) / (Δx_{k-1}^T Δg_{k-1})   (2.3)

and

λ_k = (Δx_{k-1}^T Δg_{k-1}) / (Δg_{k-1}^T Δg_{k-1}).   (2.4)

2.5 Mohd and Jaafar (JM) method

Mohd and Jaafar [8] improved the BB methods for determining the step size λ_k of the steepest descent method by taking the average value of both BB step sizes, so that λ_k is given by

λ_k = (1/2) [ (Δx_{k-1}^T Δx_{k-1}) / (Δx_{k-1}^T Δg_{k-1}) + (Δx_{k-1}^T Δg_{k-1}) / (Δg_{k-1}^T Δg_{k-1}) ].   (2.5)
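The BB and JM step sizes require only the differences Δx_{k-1} and Δg_{k-1} from the previous iteration, with no line search at all. The following C++ sketch (our own, not the paper's code) computes the two BB step sizes (2.3), (2.4) and the JM average (2.5) from those differences.

```cpp
#include <vector>

// Step sizes from (2.3), (2.4) and (2.5), computed from the differences
// dx = x_k - x_{k-1} and dg = g_k - g_{k-1} of the previous iteration.
struct StepSizes { double bb1, bb2, jm; };

StepSizes bb_jm_steps(const std::vector<double>& dx, const std::vector<double>& dg) {
    double dxdx = 0.0, dxdg = 0.0, dgdg = 0.0;
    for (size_t i = 0; i < dx.size(); ++i) {
        dxdx += dx[i] * dx[i];
        dxdg += dx[i] * dg[i];
        dgdg += dg[i] * dg[i];
    }
    StepSizes s;
    s.bb1 = dxdx / dxdg;             // (2.3): dx^T dx / dx^T dg
    s.bb2 = dxdg / dgdg;             // (2.4): dx^T dg / dg^T dg
    s.jm  = 0.5 * (s.bb1 + s.bb2);   // (2.5): average of the two BB step sizes
    return s;
}
```

In a steepest descent loop, dx and dg are simply the stored differences between the current and previous iterate and gradient, so the only extra work per iteration is a few inner products.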

3. Newton-like exact line search

All exact line search procedures, which must satisfy the condition (1.2), are always computed using (1.3). However, previous researchers have shown the weakness of (1.3) in computing the step size λ_k [5,13,14]. Therefore, following the Newton-Raphson method, in this section an alternative Newton-like exact line search is suggested, which we have shown to be practically computable in our preliminary study of the method. Thus, instead of using (1.3) to satisfy the condition (1.2), we use Newton-Raphson's idea to obtain the step size λ_k through the iteration

λ_{i+1} = λ_i - φ'(λ_i) / φ''(λ_i),   (3.1)

where φ'(λ) and φ''(λ) are the first and second derivatives of φ(λ) = f(x_k + λ d_k) with respect to λ, respectively. The procedure of this alternative exact line search can be described as follows.

Procedure Newton-like exact line search:
! This procedure computes the step size λ_k.
1. i = 0
2. λ_{i+1} = λ_i - φ'(λ_i) / φ''(λ_i)
3. while |λ_{i+1} - λ_i| ≥ ε, do
   3.1 i = i + 1
   3.2 λ_{i+1} = λ_i - φ'(λ_i) / φ''(λ_i)
4. λ_k = λ_{i+1}; return.

Algorithm of Steepest Descent Method

Procedure ASDM(m, n ∈ Z+, x_0, ε):
! This procedure implements the algorithm of the steepest descent method, where m is the
! maximum number of iterations, n is the number of variables, x_0 is the initial point and ε is a tolerance.
1. k = 0
2. d_k = -∇f(x_k)
3. converge = false
4. while ||d_k|| ≥ ε and k ≤ m and not converge do
   4.1 Use any step size procedure to compute the step size λ_k.
   4.2 x_{k+1} = x_k + λ_k d_k
   4.3 if ||x_{k+1} - x_k|| ≤ ε, then
          x* := x_{k+1}   ! the minimizer
          converge = true
       else
          k = k + 1
          d_k = -∇f(x_k)
5. return.
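To make the Newton-like procedure and the ASDM loop concrete, here is a self-contained C++ sketch of our own (not the authors' Visual C++ program). It approximates φ'(λ) and φ''(λ) by central finite differences instead of analytic derivatives, which is a simplification of the procedure above; the Booth-type test function, the starting point and the tolerances are illustrative choices.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

using Vec = std::vector<double>;

// Illustrative objective (our choice): the Booth function
// f(x1, x2) = (x1 + 2*x2 - 7)^2 + (2*x1 + x2 - 5)^2.
double f(const Vec& x) {
    double a = x[0] + 2.0 * x[1] - 7.0;
    double b = 2.0 * x[0] + x[1] - 5.0;
    return a * a + b * b;
}

// Numerical gradient of f (central differences).
Vec grad(const Vec& x, double h = 1e-6) {
    Vec g(x.size());
    for (size_t i = 0; i < x.size(); ++i) {
        Vec xp = x, xm = x;
        xp[i] += h; xm[i] -= h;
        g[i] = (f(xp) - f(xm)) / (2.0 * h);
    }
    return g;
}

// phi(lambda) = f(x + lambda * d) along the search direction d.
double phi(const Vec& x, const Vec& d, double lambda) {
    Vec xt(x.size());
    for (size_t i = 0; i < x.size(); ++i) xt[i] = x[i] + lambda * d[i];
    return f(xt);
}

// Newton-like exact line search (3.1), with phi' and phi'' replaced by
// central finite differences (our simplification of the paper's procedure).
double newton_like(const Vec& x, const Vec& d,
                   double lambda0 = 1.0, double eps = 1e-8, double h = 1e-5) {
    double lam = lambda0;
    for (int i = 0; i < 100; ++i) {                   // safeguard on inner iterations
        double p  = phi(x, d, lam);
        double p1 = (phi(x, d, lam + h) - phi(x, d, lam - h)) / (2.0 * h);       // phi'
        double p2 = (phi(x, d, lam + h) - 2.0 * p + phi(x, d, lam - h)) / (h * h); // phi''
        double next = lam - p1 / p2;                  // Newton-Raphson step (3.1)
        if (std::fabs(next - lam) < eps) return next;
        lam = next;
    }
    return lam;
}

int main() {
    // ASDM-style loop with the Newton-like step size (tolerances are illustrative).
    Vec x = {0.4, 0.6};                               // initial point
    const double eps = 1e-6;
    for (int k = 0; k < 100000; ++k) {
        Vec g = grad(x);
        double norm = std::sqrt(g[0] * g[0] + g[1] * g[1]);
        if (norm < eps) break;                        // ||d_k|| small: stop
        Vec d = {-g[0], -g[1]};                       // steepest descent direction
        double lambda = newton_like(x, d);
        x[0] += lambda * d[0];
        x[1] += lambda * d[1];
    }
    std::printf("x* = (%.6f, %.6f), f(x*) = %.6e\n", x[0], x[1], f(x));
    return 0;
}
```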

4. Numerical Example

In this section, we report comparison results between the selected procedures of Section 2 and the Newton-like exact line search, applied in the steepest descent method.

4.1 Testing Problems

1. f(x1, x2) = x1^2 + 3x2^2, x0 = (5, 5).
2. f(x1, x2) = 4x + x x x, x0 = ( , ).
3. Six-hump camel back function: f(x1, x2) = 4x1^2 - 2.1x1^4 + (1/3)x1^6 + x1 x2 - 4x2^2 + 4x2^4, x0 = ( .6, ).
4. Three-hump camel back function: f(x1, x2) = 2x1^2 - 1.05x1^4 + (1/6)x1^6 + x1 x2 + x2^2, x0 = ( .6, ).
5. Booth function: f(x1, x2) = (x1 + 2x2 - 7)^2 + (2x1 + x2 - 5)^2, x0 = (0.4, .6).
6. f(x1, x2) = + 0.x + , x0 = ( , ).
7. f(x1, x2) = (x x x + 6(x x + x, x0 = ( , ).
8. Goldstein-Price function: f(x1, x2) = g(x) h(x), where
   g(x) = 1 + (x1 + x2 + 1)^2 (19 - 14x1 + 3x1^2 - 14x2 + 6x1 x2 + 3x2^2),
   h(x) = 30 + (2x1 - 3x2)^2 (18 - 32x1 + 12x1^2 + 48x2 - 36x1 x2 + 27x2^2),
   x0 = ( , ).
9. f(x1, x2) = x 0.5x + 0.5( cos(x + x, x0 = ( , ).
10-12. The two-dimensional function f(x1, x2) = [x1 + c sin(4πx2) - x2]^2 + [x2 - 0.5 sin(2πx1)]^2 for three values of the parameter c, with initial points x0 = (6, ), x0 = (0, 0) and x0 = ( , ), respectively.
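Each test problem is supplied to the comparison as an objective function together with its gradient. As an illustration (our own sketch, not taken from the authors' program), problem 3, the six-hump camel back function in its standard expanded form, could be coded with an analytic gradient as follows.

```cpp
#include <cmath>
#include <vector>

// Six-hump camel back function (test problem 3) in its standard expanded form:
// f(x1, x2) = 4x1^2 - 2.1x1^4 + x1^6/3 + x1*x2 - 4x2^2 + 4x2^4.
double six_hump(const std::vector<double>& x) {
    double x1 = x[0], x2 = x[1];
    return 4.0*x1*x1 - 2.1*std::pow(x1, 4) + std::pow(x1, 6)/3.0
           + x1*x2 - 4.0*x2*x2 + 4.0*std::pow(x2, 4);
}

// Analytic gradient of the six-hump camel back function.
std::vector<double> six_hump_grad(const std::vector<double>& x) {
    double x1 = x[0], x2 = x[1];
    return { 8.0*x1 - 8.4*std::pow(x1, 3) + 2.0*std::pow(x1, 5) + x2,
             x1 - 8.0*x2 + 16.0*std::pow(x2, 3) };
}
```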

4.2 Numerical Results

Each step size procedure discussed in the previous section has been implemented in the steepest descent method, and the algorithms have been programmed in the Visual C++ language. The testing problems and the initial points used are shown in Section 4.1. For each problem, the limiting number of iterations is set to 100,000 and the tolerance is ε = 10^-6. For the Armijo line search procedure we use the parameter values β = 0.75 and σ = 0.38 suggested by Zhen [14], and for the backtracking line search procedure we use the value of σ proposed by Neculai [9]. Besides that, Vrahatis et al. [11] have noted that a small step size has to be chosen in order to avoid oscillation and to guarantee the convergence of steepest descent. Therefore, we use a small initial step size λ for our Newton-like exact line search procedure.

Table 1: Iteration Number in Solving Problems (columns: problem, Cauchy, Armijo, Backtracking, BB1, BB2, JM, Newton; the entry F denotes failure).

The numbers of iterations used to solve each problem by all the selected step size procedures are shown in Table 1, where BB1, BB2, JM and Newton stand for (2.3), (2.4), (2.5) and the Newton-like exact line search procedure, respectively. The bolded numbers in the table represent the least iteration number among all the procedures for each problem. The symbol F stands for failure, which means that the algorithm still cannot find the solution when the iteration number reaches 100,000, or that the solution determined by the algorithm does not satisfy the minimization property

f(x_{k+1}) = f(x_k + λ_k d_k) ≤ f(x_k) ≤ ... ≤ f(x_0).   (4.1)

5. Conclusion

Stephen and Ariela [10] note that although Newton's method needs fewer iterations than the steepest descent method to obtain the solution, the labour cost of Newton's method is higher than that of steepest descent, since Newton's method requires more memory space and is expensive in calculating the inverse of the Hessian matrix. Since our Newton-like exact line search only requires the first and second derivatives of f(x_k + λ d_k) with respect to λ, it does not have the weakness mentioned above, while it retains the fast convergence advantage of Newton's method. According to the comparison results shown in Section 4.2, we can say that the step size obtained by using the Newton-like exact line search is more effective than the other step size procedures; even though there are three problems for which it does not obtain the solution with the least number of iterations, those iteration numbers are not much larger than the least ones, as seen in Table 1. From the numerical results, we also found that the well known BB methods fail easily in solving complicated optimization problems.

Acknowledgement. The authors gratefully acknowledge financial support from the Government of Malaysia and Universiti Malaysia Terengganu through the Fundamental Research Grant Scheme (Vot 5900).

References

[1]. Armijo L., (1966). Minimization of functions having Lipschitz continuous first partial derivatives, Pacific J. Math., 16, 1-3.

[2]. Burachik, R., Drummond, L. M. G., Iusem, A. N. and Svaiter, B. F., (1995). Full convergence of the steepest descent method with inexact line searches, Optimization.

[3]. Barzilai J. and Borwein J.M., (1988). Two-point step size gradient methods, IMA J. Numer. Anal., 8.

[4]. Curry H.B., (1944). The method of steepest descent for nonlinear minimization problems, Quart. Appl. Math., 2.

[5]. Goh Khang Wen and Ismail Bin Mohd, (2006). Analyze steepest descent using Maple, Computer Science and Mathematics Symposium 2006 (CSMS2006), Kolej Universiti Sains dan Teknologi Malaysia.

[6]. Goh, K. W. and Ismail, B. M., (2007). An Alternative Newton-like Exact Line Search for Steepest Descent Method, Proceedings of the 3rd International Conference on Research and Education in Mathematics 2007, Universiti Putra Malaysia, Applied Mathematics and Mathematics Education, pages 1-7.

[7]. Goldstein A. A., (1965). On steepest descent, SIAM Journal on Control, 3.

[8]. Ismail Bin Mohd and Azmi Bin Jaafar, (1990). A modification of two-point step size gradient method for unconstrained optimization, Sains Malaysiana, 19(4), 97-101.

[9]. Neculai Andrei, (2004). A new descent gradient method for unconstrained optimization, Research report, Research Institute for Informatics, Romania.

[10]. Stephen G. Nash and Ariela Sofer, (1996). Linear and Nonlinear Programming, George Mason University, The McGraw-Hill Companies, Inc.

[11]. Vrahatis M.N., Androulakis G.S., Lambrinos J.N., Magoulas G.D., (2000). A class of gradient unconstrained minimization algorithms with adaptive stepsize, Journal of Computational and Applied Mathematics, 114.

[12]. Wolfe P., (1969). Convergence conditions for ascent methods, SIAM Rev., 11.

[13]. Zhen-Jun Shi and Jie Shen, (2006a). On step-size estimation of line search methods, Applied Mathematics and Computation, 173.

[14]. Zhen-Jun Shi and Jie Shen, (2006b). Convergence of descent method with new line search, Journal of Applied Mathematics and Computing, 20(1-2).

Received: January, 2012
