Steepest descent method implementation on unconstrained optimization problem using C++ program



H Napitupulu, Sukono, I Bin Mohd, Y Hidayat, S Supian

Department of Mathematics, Faculty of Mathematics and Natural Sciences, Universitas Padjadjaran, Indonesia
Institute for Mathematical Research (INSPEM), Universiti Putra Malaysia, Selangor, Malaysia
Department of Statistics, Faculty of Mathematics and Natural Sciences, Universitas Padjadjaran, Indonesia

* napitupuluherlina@gmail.com

Abstract. Steepest descent is known as the simplest gradient method. Recently, much research has been devoted to finding appropriate step sizes that reduce the objective function value progressively. In this paper, the properties of the steepest descent method are reviewed from the literature, together with the advantages and disadvantages of each step size procedure. The development of the steepest descent method driven by its step size procedures is discussed. In order to test the performance of each step size, we run a steepest descent procedure as a C++ program. We apply it to unconstrained optimization test problems in two variables, then compare the numerical results of each step size procedure. Based on the numerical experiments, we summarize the general computational features and weaknesses of each procedure on each class of problem.

1. Introduction
Consider the following nonlinear unconstrained minimization problem: find x* ∈ R^n such that

    f(x*) = min_{x ∈ R^n} f(x),                                        (1)

where f : R^n → R. The steepest descent method is the simplest gradient method for solving unconstrained optimization; it was devised by Cauchy in 1847. This gradient method searches along the negative gradient of the function and can ensure a reduction of the objective function as long as the current iterate is not a stationary point ([1]). Much research has been done on finding better and more appropriate step size procedures, since the step size determines how significantly the objective function is decreased. In this paper we review the properties of the steepest descent method and its development, and we compare the numerical results of steepest descent step sizes from the literature on solving global optimization problems. We run a steepest descent C++ program to test the performance of each tested step size procedure, then compare the numerical results obtained from the program executions. Based on the numerical experiments, the features as well as the weaknesses of each step size procedure on each class of problem are summarized.
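Since the paper's source code is not reproduced in the text, the following minimal sketch fixes one possible C++ representation of the two-variable instance of problem (1) that the later sections can refer to. All identifiers here (Vec2, Problem, the member names) are illustrative assumptions, not taken from the authors' program.

```cpp
#include <array>
#include <functional>

// One possible representation of problem (1) restricted to n = 2.
using Vec2 = std::array<double, 2>;

struct Problem {
    std::function<double(const Vec2&)> f;    // objective f : R^2 -> R
    std::function<Vec2(const Vec2&)>   grad; // gradient g(x) of f at x
    Vec2 x0;                                 // starting point x_0
};
```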

2. Steepest Descent Method
The steepest descent method is the simplest gradient method for minimization, being based on the linear approximation of the Taylor series,

    f(x + δ) ≈ f̂(x + δ) = f(x) + ∇f(x)^T δ.                           (2)

The term ∇f(x)^T δ is the directional derivative of f at x. The step δ is a descent direction if the directional derivative is negative, which guarantees that the function can be reduced along this direction. Furthermore, δ needs to be chosen to make the directional derivative ∇f(x)^T δ as negative as possible, which gives the maximum reduction of f, without consuming too much time in making the choice. Assume that a function f(x) is continuous in the neighborhood of a point x, that d_k = −g_k is the steepest descent direction at the point x_k, and that the change δ_k in x_k is given by δ_k = α_k d_k. The small positive constant α_k is called the step size, and its selection affects how much the value of f(x) is reduced. By solving the one-dimensional global optimization problem

    min_{α > 0} φ(α) = f(x_k + α d_k),                                 (3)

the maximum reduction in f(x) can be obtained. The steepest descent iteration is performed by the formula

    x_{k+1} = x_k + α_k d_k.                                           (4)

Starting from an initial point x_0, the direction d_k = −g_k and the step size α_k that minimizes f(x_k + α d_k) are determined, so that the next point x_{k+1} = x_k + α_k d_k is obtained. The procedure in (4) is repeated until convergence is achieved or the value ||α_k d_k|| is sufficiently small. The algorithm of steepest descent is presented as follows; a minimal C++ rendering is sketched after the algorithm.

ALGORITHM STEEPEST DESCENT
given: a starting point x_0 in the domain of f
set: k := 0
repeat
  1. Compute the steepest descent direction d_k = −g_k.
  2. Update the step size α_k = argmin_{α > 0} φ(α) = f(x_k + α d_k).
  3. Update x_{k+1} := x_k + α_k d_k.
until convergence or a stopping criterion is satisfied
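The sketch below renders the loop above in C++, reusing the Problem type from the Introduction. It is an illustration rather than the authors' program; the step-size rule is passed in as a callback so that the procedures reviewed in Section 3 can be swapped in and compared, and the iteration cap is an assumption of the sketch.

```cpp
#include <cmath>
// Requires Vec2 and Problem from the sketch in the Introduction.

// Step-size rule: given x_k, g_k and d_k, return alpha_k.
using StepRule = std::function<double(const Vec2&, const Vec2&, const Vec2&)>;

static double norm2(const Vec2& v) { return std::sqrt(v[0]*v[0] + v[1]*v[1]); }

// ALGORITHM STEEPEST DESCENT: x_{k+1} = x_k + alpha_k d_k with d_k = -g_k,
// stopped by the criteria of Section 4.
Vec2 steepestDescent(const Problem& p, const StepRule& stepSize,
                     int maxIter = 100000, double tol = 1e-6) {
    Vec2 x = p.x0;
    for (int k = 0; k < maxIter; ++k) {
        Vec2 g = p.grad(x);
        Vec2 d = {-g[0], -g[1]};             // 1. steepest descent direction
        if (norm2(d) <= tol) break;          //    stop when ||d_k|| <= tol
        double alpha = stepSize(x, g, d);    // 2. alpha_k from the chosen rule
        if (alpha * norm2(d) <= tol) break;  //    stop when ||alpha_k d_k|| <= tol
        x[0] += alpha * d[0];                // 3. take the step
        x[1] += alpha * d[1];
    }
    return x;
}
```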

3. Steepest Descent Step Size
The step size plays an important role in the steepest descent method for minimizing the objective function. The step size α_k at each iteration must be chosen effectively to give a significant reduction of the function value, while the time spent choosing it must also be kept small. The earliest step size procedure for steepest descent is the step size obtained by exact line search,

    α_k = argmin_{α > 0} f(x_k − α g_k).                               (5)

However, using the steepest descent direction with the exact line search step size leads to zigzag behavior, which makes the convergence very slow. Akaike ([2]) analyzed this search procedure, showing that the convergence is linear and badly affected by ill-conditioning: the two asymptotic directions lie in a two-dimensional subspace spanned by two eigenvectors of the Hessian matrix. Greenstadt ([3]) studied the efficiency of the steepest descent method and gave a bound on the ratio between the reductions obtained by the Cauchy step and by Newton's method, and the rate of convergence of steepest descent was proved by Forsythe ([4]). Besides exact line search, the step size α_k can be computed by inexact line search methods, for instance the Armijo condition ([5]), the Goldstein condition ([6]), the Wolfe condition ([7]), Powell ([8]), Dennis and Schnabel ([9]), Fletcher ([22]), Potra and Shi ([11]), Lemaréchal ([12]), Moré and Thuente ([13]), and others. Steepest descent with the mentioned conditions is always convergent theoretically: it will not terminate unless a stationary point is found. However, step sizes computed by these inexact line search procedures also exhibit the bad zigzag behavior, just as with exact line search. Considering the weaknesses of the existing step sizes, Barzilai and Borwein ([10]) proposed a step size along the negative gradient direction obtained from a two-point approximation to the secant equation of quasi-Newton methods, in the so-called BB method. If the dimension is two, Barzilai and Borwein established an R-superlinear convergence result for the method, and their analysis indicated that the convergence rate is faster. For the general n-dimensional strictly convex quadratic function, Raydan ([14]) proved that the two-point step size gradient method is globally convergent, and the convergence rate is R-linear (Dai and Liao ([15])). For the non-quadratic case, Raydan ([14]) incorporated a globalization scheme for the two-point step size gradient method using the traditional nonmonotone line search technique of Grippo et al ([16]). The resulting algorithm is competitive with, and sometimes preferable to, several famous conjugate gradient algorithms for large scale unconstrained optimization. Due to its simplicity and numerical efficiency, the two-point step size gradient method has received many studies, and it has been successfully applied to obtain local minimizers of large scale real problems ([17]). The algorithm of Raydan was further generalized by Birgin et al ([18]) for the minimization of differentiable functions on closed convex sets, yielding an efficient projected gradient method. Efficient projected algorithms based on BB-like methods have also been designed (Serafini et al ([19]) and Dai and Fletcher ([20])) for the special quadratic programs arising from training support vector machines, which have a single linear constraint in addition to box constraints. The BB method has also received much attention for finding sparse approximate solutions to large underdetermined linear systems of equations in signal/image processing and statistics, for example in Wright et al ([21]). However, Fletcher ([22]) showed that for some non-quadratic problems this method may be very slow. There are many existing step size modifications; some are easy to apply, while others require complicated algorithms. The following are some step size procedures which are simple to apply to the steepest descent method.

Table 1. List of some recent simple step sizes.

1. (C) α_k = (g_k^T g_k) / (g_k^T H g_k). The step size method of Cauchy (1847), computed by exact line search (C step size).
2. (A) Given s > 0 and β, σ ∈ (0, 1), take α_k = max{s, sβ, sβ², ...} such that

       f(x_k + α_k d_k) ≤ f(x_k) + σ α_k g_k^T d_k.

   This step size procedure uses an inexact line search, the so-called Armijo line search (A step size).

3. (B) Given β, σ ∈ (0, 1) and a trial step α̂ > 0, repeatedly replace α̂ := βα̂ until

       f(x_k + α̂ d_k) ≤ f(x_k) + σ α̂ g_k^T d_k,

   and take α_k as the accepted α̂. This is the so-called backtracking line search (B step size); this inexact line search uses just the sufficient decrease condition to terminate the line search procedure.

4. (BB1, BB2) α_k = (s_{k−1}^T s_{k−1}) / (s_{k−1}^T y_{k−1}) (BB1) and α_k = (s_{k−1}^T y_{k−1}) / (y_{k−1}^T y_{k−1}) (BB2), where s_{k−1} = x_k − x_{k−1} and y_{k−1} = g_k − g_{k−1}. These two-point step sizes require no line search and are named after their inventors: Barzilai and Borwein's formulas. They reduce the objective function and gradient values and improve the convergence from linear to R-superlinear.

5. (EL) The step size α_{k+1} is estimated from α_k, g_k, d_k and the function values f(x_k) and f(x_k + α_k d_k), as given in [23]. This is the so-called elimination line search (EL) step size, which estimates the step size without computing the Hessian.

(A C++ sketch of the A and BB1 rules follows the table.)
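To make Table 1 concrete, the sketch below continues the C++ illustration with two of the rules: the Armijo search (A) and the first Barzilai-Borwein formula (BB1). The defaults β = 0.75 and σ = 0.8 mirror the values reported in Section 4; the default s = 1, the underflow guard, and the fallback value in bb1Step are assumptions of this sketch, not the authors' choices.

```cpp
// Requires Vec2, Problem and StepRule from the sketches above.
static double dot(const Vec2& a, const Vec2& b) { return a[0]*b[0] + a[1]*b[1]; }

// (A) Armijo line search: try s, s*beta, s*beta^2, ... and accept the first
// alpha with f(x + alpha d) <= f(x) + sigma * alpha * g^T d.
double armijoStep(const Problem& p, const Vec2& x, const Vec2& g, const Vec2& d,
                  double s = 1.0, double beta = 0.75, double sigma = 0.8) {
    double alpha = s;
    const double fx = p.f(x), gTd = dot(g, d);   // gTd < 0 when d = -g != 0
    while (p.f({x[0] + alpha*d[0], x[1] + alpha*d[1]}) > fx + sigma*alpha*gTd
           && alpha > 1e-16)                     // guard against underflow
        alpha *= beta;
    return alpha;
}

// (BB1) Barzilai-Borwein step: alpha_k = s^T s / s^T y with s = x_k - x_{k-1}
// and y = g_k - g_{k-1}; no line search and no Hessian evaluation.
double bb1Step(const Vec2& xPrev, const Vec2& gPrev,
               const Vec2& x, const Vec2& g) {
    const Vec2 s = {x[0] - xPrev[0], x[1] - xPrev[1]};
    const Vec2 y = {g[0] - gPrev[0], g[1] - gPrev[1]};
    const double sy = dot(s, y);
    return sy > 0.0 ? dot(s, s) / sy : 1e-4;     // fallback value is illustrative
}
```

Because BB1 needs the previous iterate and gradient, it is wired to the driver through a small amount of state, for example a lambda capturing xPrev and gPrev; the other rules of Table 1 slot in the same way.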

4. Numerical Experiment
In this section, the step size procedures of the steepest descent method shown in Table 1 are tested computationally. The program is written in Visual C++ 6.0 and applied to the following global optimization test functions.

1. Beale function
   min f(x₁, x₂) = (1.5 − x₁(1 − x₂))² + (2.25 − x₁(1 − x₂²))² + (2.625 − x₁(1 − x₂³))²
   s.t. −4.5 ≤ x₁, x₂ ≤ 4.5

2. Bohachevsky I function
   min f(x₁, x₂) = x₁² + 2x₂² − 0.3 cos(3πx₁) − 0.4 cos(4πx₂) + 0.7
   s.t. −50 ≤ x₁, x₂ ≤ 50

3. Bohachevsky II function
   min f(x₁, x₂) = x₁² + 2x₂² − 0.3 cos(3πx₁) cos(4πx₂) + 0.3
   s.t. −50 ≤ x₁, x₂ ≤ 50

4. Booth function
   min f(x₁, x₂) = (x₁ + 2x₂ − 7)² + (2x₁ + x₂ − 5)²
   s.t. −10 ≤ x₁, x₂ ≤ 10

5. Matyas function
   min f(x₁, x₂) = 0.26(x₁² + x₂²) − 0.48x₁x₂
   s.t. −10 ≤ x₁, x₂ ≤ 10

6. Rosenbrock simplified function
   min f(x₁, x₂) = 0.5(x₂ − x₁²)² + (1 − x₁)²

7. Six-hump back camel function
   min f(x₁, x₂) = 4x₁² − 2.1x₁⁴ + x₁⁶/3 + x₁x₂ − 4x₂² + 4x₂⁴
   s.t. −3 ≤ x₁ ≤ 3, −2 ≤ x₂ ≤ 2

8. Rastrigin function
   min f(x₁, x₂) = x₁² + x₂² − cos(18x₁) − cos(18x₂)
   s.t. −1 ≤ x₁, x₂ ≤ 1

9. Three-hump back camel function
   min f(x₁, x₂) = 2x₁² − 1.05x₁⁴ + x₁⁶/6 + x₁x₂ + x₂²

10. Treccani function
    min f(x₁, x₂) = x₁⁴ + 4x₁³ + 4x₁² + x₂²
    s.t. −3 ≤ x₁, x₂ ≤ 3

11. Two-dimensional function, c = 0.05
    min f(x₁, x₂) = (x₁² + 0.05 sin²(πx₁)) + (x₂² + 0.5 sin²(πx₂))

12. Two-dimensional function, c = 0.2
    min f(x₁, x₂) = (x₁² + 0.2 sin²(πx₁)) + (x₂² + 0.5 sin²(πx₂))

13. Two-dimensional function, c = 0.5
    min f(x₁, x₂) = (x₁² + 0.5 sin²(πx₁)) + (x₂² + 0.5 sin²(πx₂))

14. Goldstein and Price function
    min f(x₁, x₂) = [1 + (x₁ + x₂ + 1)²(19 − 14x₁ + 3x₁² − 14x₂ + 6x₁x₂ + 3x₂²)] × [30 + (2x₁ − 3x₂)²(18 − 32x₁ + 12x₁² + 48x₂ − 36x₁x₂ + 27x₂²)]
    s.t. −2 ≤ x₁, x₂ ≤ 2

By referring to [23], the Armijo procedure (A) uses β := 0.75 and σ := 0.8; the remaining parameters (s for A, σ for the backtracking procedure B, and the initial step sizes for BB1, BB2 and EL) are likewise taken from [23]. The comparison of the numerical results is based on: execution time, total number of iterations, total percentage of function, gradient and Hessian evaluations, and the lowest objective function value obtained. The stopping criteria are ||α_k d_k|| ≤ 10⁻⁶ or ||d_k|| ≤ 10⁻⁶.
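As an end-to-end illustration (again, a sketch rather than the authors' program), the Booth function of test problem 4 can be wired to the driver from Section 2 with the Armijo rule. The starting point below is an assumption made for the sketch.

```cpp
#include <cstdio>
// Requires Vec2, Problem, StepRule, steepestDescent and armijoStep from above.

// Test problem 4: Booth function, global minimizer (1, 3) with f = 0.
double boothF(const Vec2& x) {
    const double a = x[0] + 2*x[1] - 7, b = 2*x[0] + x[1] - 5;
    return a*a + b*b;
}
Vec2 boothGrad(const Vec2& x) {
    const double a = x[0] + 2*x[1] - 7, b = 2*x[0] + x[1] - 5;
    return {2*a + 4*b, 4*a + 2*b};   // chain rule on the two squared terms
}

int main() {
    Problem booth{boothF, boothGrad, {0.0, 0.0}};  // starting point is illustrative
    StepRule armijo = [&](const Vec2& x, const Vec2& g, const Vec2& d) {
        return armijoStep(booth, x, g, d);
    };
    const Vec2 xs = steepestDescent(booth, armijo);
    std::printf("minimizer ~ (%.6f, %.6f), f = %.3e\n", xs[0], xs[1], boothF(xs));
    return 0;
}
```

Replacing armijo with a stateful lambda around bb1Step (keeping the previous iterate and gradient between calls) gives the corresponding BB1 run on the same problem.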

The following abbreviations are used in the numerical results of Table 2.

In column one:
C = Cauchy step size
A = Armijo step size
B = Backtracking step size
BB1 = Barzilai & Borwein step size 1
BB2 = Barzilai & Borwein step size 2
EL = Elimination line search step size

In row one:
Iter = percentage of the total number of iterations until termination
Time = percentage of the total running time until termination, in seconds
Ft = percentage of the total number of function evaluations until termination
Gt = percentage of the total number of gradient evaluations until termination
Ht = percentage of the total number of Hessian evaluations until termination

Table 2. Numerical results. For each of the 14 test problems and each step size procedure (C, A, B, BB1, BB2, EL), the table reports the minimizer obtained together with its Time, Iter, Ft, Gt and Ht percentages. The C procedure failed on problem 13, and the BB1, BB2 and EL procedures failed on problem 14; every other run terminated at a global or a local minimizer. (The individual numerical entries are not reproduced here.)

From the numerical results of Table 2 we summarize the number of successes (ns), the number of runs that reached another local minimizer (nl), and the number of failures (nf), presented in Table 3.

Table 3. Summary of numerical results (entries marked — are not recoverable).

Step size   ns   nl   nf
C            7    6    1
A            —    —    0
B            9    5    0
BB1          —    —    1
BB2          9    4    1
EL           9    4    1

From the numerical results obtained, the A method and the BB1 method are the most successful of the tested methods on the given two-dimensional global optimization cases. The A method never failed on the test problems, although a few times it converged to another local minimizer. The C method is the least successful in obtaining the minimizer, with one failure. The B, BB2 and EL methods reach the same level of success in obtaining the minimizer; B did so without any failure, while BB2 and EL each failed once. From this numerical experiment we can conclude that A is better than the others, based on the number of minimizers obtained for the given test examples with the given initial conditions.

5. Conclusion
The advantages and disadvantages of the steepest descent method are as follows. The method is sensitive to the initial point; on the other hand, it has the descent property and is a logical starting procedure for all gradient-based methods. Since ||x_{k+1} − x_k|| = α_k ||g_k|| → 0 as a stationary point with g(x*) = 0 is approached, x_k approaches the minimizer rather slowly, in fact in a zigzag way. It is not economical to carry out the line searches thoroughly; all that is necessary is to obtain a reduction in function value at successive iterates. According to the numerical experiments on the given test functions, we can conclude that for the two-dimensional unconstrained global optimization problems considered, A and BB1 are the better methods among those tested, and the C method is good enough, for the given test examples with the given initial conditions.

Acknowledgments
The authors wish to thank the Rector, the Director of DRPMI, and the Dean of FMIPA, Universitas Padjadjaran, for support through the Academic Leadership Grant (ALG) program and the Unpad Lecturer Competence Research grant program (Riset Kompetensi Dosen Unpad/RKDU).

References
[1] Yuan Y 2008 Step-sizes for the gradient method Third International Congress of Chinese Mathematicians Part 2 (AMS/IP Stud. Adv. Math. 42) (Providence, RI: Amer. Math. Soc.)
[2] Akaike H 1959 On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method Ann. Inst. Statist. Math. Tokyo 11 pp 1-17
[3] Greenstadt J 1967 On the relative efficiencies of gradient methods Math. Comp. 21 pp 360-367
[4] Forsythe G E 1968 On the asymptotic directions of the s-dimensional optimum gradient method Numerische Mathematik 11 pp 57-76
[5] Armijo L 1966 Minimization of functions having Lipschitz continuous first partial derivatives Pacific J. of Mathematics 16 pp 1-3
[6] Goldstein A A 1965 On steepest descent SIAM Journal on Control 3 pp 147-151
[7] Wolfe P 1969 Convergence conditions for ascent methods SIAM Review 11 pp 226-235
[8] Powell M J D 1976 Some global convergence properties of a variable-metric algorithm for minimization without exact line searches SIAM-AMS Proceedings (Philadelphia) 9 pp 53-72
[9] Dennis J E and Schnabel R B 1983 Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Englewood Cliffs, NJ: Prentice-Hall)
[10] Barzilai J and Borwein J 1988 Two-point step size gradient methods IMA J. of Numer. Anal. 8 pp 141-148
[11] Potra F A and Shi Y 1995 Efficient line search algorithm for unconstrained optimization J. Optim. Theory Appl. 85 no 3
[12] Lemaréchal C 1981 A view of line search Optimization and Optimal Control ed A Auslender, W Oettli and J Stoer (Springer-Verlag)
[13] Moré J J and Thuente D J On line search algorithms with guaranteed sufficient decrease Mathematics and Computer Science Division Preprint MCS-P (Argonne, IL: Argonne National Laboratory)
[14] Raydan M 1993 On the Barzilai and Borwein choice of steplength for the gradient method IMA J. Numer. Anal. 13 pp 321-326
[15] Dai Y H and Liao L Z 2002 R-linear convergence of the Barzilai and Borwein gradient method IMA J. Numer. Anal. 22 pp 1-10
[16] Grippo L, Lampariello F and Lucidi S 1986 A nonmonotone line search technique for Newton's method SIAM J. Numer. Anal. 23 pp 707-716
[17] Luengo F and Raydan M 2003 Gradient method with dynamical retards for large scale optimization problems Electron. Trans. Numer. Anal. 16 (Kent State University)
[18] Birgin E G, Floudas C A and Martínez J M 2010 Global minimization using an augmented Lagrangian method with variable lower-level constraints Math. Programming (Heidelberg) 125 Iss 1 pp 139-162
[19] Serafini T, Zanghirati G and Zanni L 2005 Gradient projection methods for quadratic programs and application in training support vector machines Optim. Meth. Software 20 pp 347-372
[20] Dai Y H and Fletcher R 2006 New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds Math. Program. Ser. A 106 pp 403-421
[21] Wright S J, Nowak R D and Figueiredo M A T 2009 Sparse reconstruction by separable approximation IEEE Transactions on Signal Processing 57 pp 2479-2493
[22] Fletcher R 2001 On the Barzilai-Borwein method Numerical Analysis Report NA/207 (University of Dundee)
[23] Wen G K, Mamat M, Mohd B I and Dasril Y 2012 A novel step size selection procedures for steepest descent method Appl. Math. Sci. 6 no 51 pp 2507-2518
