An Alternative Three-Term Conjugate Gradient Algorithm for Systems of Nonlinear Equations


International Journal of Mathematical Modelling & Computations, Vol. 07, No. 02, Spring 2017.

L. Muhammad (1) and M. Y. Waziri (2)
(1) Department of Mathematics, Faculty of Science, Northwest University, Kano, Nigeria.
(2) Department of Mathematical Sciences, Faculty of Science, Bayero University Kano, Kano, Nigeria.

Abstract. This paper presents an alternative three-term conjugate gradient algorithm for solving large-scale systems of nonlinear equations. The proposed method is a modification of the memoryless BFGS quasi-Newton update for which the direction is a descent direction, combined with a projection-based technique and the Powell restart criterion. Moreover, we prove the global convergence of the proposed method with a derivative-free line search under suitable assumptions. The numerical results show that the proposed method is promising.

Received: 27 October 2016, Revised: 14 August 2017, Accepted: 04 November.

Keywords: BFGS update, Systems of nonlinear equations, Conjugate gradient, Derivative-free line search.

Index to information contained in this paper:
1. Introduction
2. Algorithm
3. Convergence Result
4. Numerical Results
5. Conclusion

1. Introduction

Consider the system of nonlinear equations

    F(x) = 0,    (1)

where F : R^n → R^n is a nonlinear mapping. Often, the mapping F is assumed to satisfy the following assumptions:

A1. There exists x* ∈ R^n such that F(x*) = 0;

Corresponding author: lawalmuhd.nuw.maths@gmail.com
© 2017 IAUCTB

A2. F is a monotone and continuously differentiable mapping.

The general conjugate gradient method for solving (1) generates a sequence of iterates {x_k} from a given initial point x_0 as follows:

    x_{k+1} = x_k + α_k d_k,    (2)

where α_k > 0 is obtained by a line search procedure, and the direction d_k is defined by

    d_{k+1} = -F(x_{k+1}) + β_k d_k,  k ≥ 0,  with  d_0 = -F(x_0).    (3)

The scalar β_k is termed the conjugate gradient parameter.

Several methods have been developed for solving (1). Most of them fall within the framework of Newton and quasi-Newton strategies [5, 13, 17], and they are particularly popular because of their rapid convergence from a sufficiently good initial guess. Nevertheless, they are usually not suitable for large-scale nonlinear systems of equations, because they require the computation and storage of the Jacobian matrix and its inverse, as well as the solution of an n-dimensional linear system at each iteration, and convergence may even be lost when the Jacobian is singular. Conjugate gradient methods were developed in order to eliminate some of these shortcomings of Newton's and quasi-Newton methods for unconstrained optimization, and an extension of their application to nonlinear systems of equations has been considered by several researchers; see [6, 13, 14, 18, 21].

In this paper we are particularly interested in a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear equations, obtained by modifying the memoryless BFGS approximation of the inverse Jacobian, restarted as a θ-multiple of the identity matrix at every iteration. The method possesses a low memory requirement, global convergence properties, and a simple implementation procedure. The main contribution of this paper is to construct a fast and efficient three-term conjugate gradient method for solving (1); the proposed method is based on the recent three-term conjugate gradient method [2] for unconstrained optimization. In other words, our algorithm can be viewed as an application of a three-term conjugate gradient method to general systems of nonlinear equations. We present extensive numerical results and a performance comparison with the DF-SDCG algorithm for large-scale nonlinear equations [9], which illustrate that the proposed algorithm is efficient and promising.

The rest of the paper is organized as follows. In Section 2 we present the details of the proposed method. Convergence results are presented in Section 3. Some numerical results are reported in Section 4. Finally, conclusions are drawn in Section 5. Throughout this paper, ‖·‖ denotes the Euclidean norm of a vector. For simplicity, we abbreviate F(x_k) as F_k.
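The iteration (2)-(3) can be summarized in a few lines of code. The following is a minimal sketch (not the authors' code; their experiments were run in MATLAB), assuming user-supplied callables for the residual map F, the conjugate gradient parameter β_k, and the line search:

```python
import numpy as np

def generic_cg(F, x0, beta, step_size, tol=1e-4, max_iter=1000):
    """Minimal sketch of the generic conjugate gradient iteration (2)-(3).

    F         : callable returning the residual F(x) of (1)
    beta      : callable beta(F_new, F_old, d) giving the CG parameter beta_k
    step_size : callable step_size(x, d) giving alpha_k > 0 (a line search)
    All three are placeholders; the paper's specific choices are given later.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx                                   # d_0 = -F(x_0)
    for k in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        alpha = step_size(x, d)               # alpha_k from the line search
        x = x + alpha * d                     # x_{k+1} = x_k + alpha_k d_k, eq. (2)
        F_new = F(x)
        d = -F_new + beta(F_new, Fx, d) * d   # eq. (3)
        Fx = F_new
    return x, Fx
```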

2. Algorithm

This section presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear equations.

It is well known that if the Jacobian matrix J_k of F is positive definite, then the most efficient search direction at x_k is the Newton direction

    d_{k+1} = -(J_{k+1})^{-1} F_{k+1}.    (4)

Thus, for the Newton direction d_{k+1} it holds that

    s_k^T J_{k+1} d_{k+1} = -s_k^T F_{k+1},    (5)

where s_k = x_{k+1} - x_k and y_k = F_{k+1} - F_k. From the secant condition

    J_{k+1} s_k = y_k,    (6)

(5) can be approximated by

    y_k^T d_{k+1} = -s_k^T F_{k+1}.    (7)

Therefore, any search direction d_{k+1} that satisfies (7) can be regarded as an approximate Newton direction.

Recall that the BFGS update of the Jacobian inverse approximation is given by

    B_{k+1} = B_k - (s_k y_k^T B_k + B_k y_k s_k^T)/(y_k^T s_k) + (1 + (y_k^T B_k y_k)/(y_k^T s_k)) (s_k s_k^T)/(y_k^T s_k).    (8)

By setting B_k = θ_k I, (8) can be transformed into

    Q_{k+1} = θ_k I - (θ_k s_k y_k^T - θ_k y_k s_k^T)/(y_k^T s_k) + (1 + (θ_k y_k^T y_k)/(y_k^T s_k)) (s_k s_k^T)/(y_k^T s_k),    (9)

where

    θ_k = (s_k^T s_k)/(s_k^T y_k).    (10)

For this choice of θ_k, see Raydan [16] for details. Note that the matrix Q_{k+1} in (9) is a modification of (8) in the sense that it is restarted with a multiple of the identity matrix at every step (B_k = θ_k I), and the sign in front of y_k s_k^T in the second term of (9) is modified to obtain the descent property. Multiplying both sides of (9) by -F(x_{k+1}), we obtain

    -Q_{k+1} F_{k+1} = -θ_k F_{k+1} + (θ_k s_k y_k^T - θ_k y_k s_k^T)/(y_k^T s_k) F_{k+1} - (1 + (θ_k y_k^T y_k)/(y_k^T s_k)) (s_k^T F_{k+1})/(y_k^T s_k) s_k.    (11)

Observe that the direction d_{k+1} obtained from (11) can be written as

    d_{k+1} = -Q_{k+1} F_{k+1}.    (12)
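To make the role of (9)-(12) concrete, the following short sketch (an illustration, not the authors' code) forms the product d_{k+1} = -Q_{k+1} F_{k+1} without ever storing the n-by-n matrix Q_{k+1}, using only the vectors s_k, y_k, and F_{k+1}:

```python
import numpy as np

def scaled_memoryless_direction(F_new, s, y):
    """Sketch of d_{k+1} = -Q_{k+1} F_{k+1} from (9)-(12), computed matrix-free.

    F_new : F(x_{k+1});  s : s_k = x_{k+1} - x_k;  y : y_k = F(x_{k+1}) - F(x_k).
    y^T s is assumed positive (e.g. under monotonicity of F).
    """
    ys = float(y @ s)                    # y_k^T s_k
    theta = float(s @ s) / ys            # theta_k from (10)
    sF = float(s @ F_new)                # s_k^T F_{k+1}
    yF = float(y @ F_new)                # y_k^T F_{k+1}
    d = -theta * F_new                   # -theta_k F_{k+1}
    d += theta * (yF * s - sF * y) / ys  # sign-modified rank-two term of (9)
    d -= (1.0 + theta * float(y @ y) / ys) * sF / ys * s   # last term of (9)
    return d
```

Because only inner products and vector updates are involved, the cost per iteration is O(n) in both time and memory, which is what makes this construction attractive for large-scale problems.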

Hence our new direction is

    d_{k+1} = -θ_k F_{k+1} - δ_k s_k - η_k y_k,    (13)

where

    δ_k = (1 + (θ_k y_k^T y_k)/(y_k^T s_k)) (s_k^T F_{k+1})/(y_k^T s_k) - (θ_k y_k^T F_{k+1})/(y_k^T s_k),    (14)

    η_k = (θ_k s_k^T F_{k+1})/(y_k^T s_k).    (15)

Solodov and Svaiter [17] introduced a projection-based algorithm which requires a hyperplane separating the current iterate x_k from the zeros of the system of equations; this can easily be achieved by setting

    x_{k+1} = x_k - (F(z_k)^T (x_k - z_k))/(‖F(z_k)‖²) F(z_k).    (16)

Therefore, incorporating this projection step together with the proposed search direction and the derivative-free line search of Li and Li [14], we find α_k = max{s, ρs, ρ²s, ...} such that

    -F(x_k + α_k d_k)^T d_k ≥ σ α_k ‖F(x_k + α_k d_k)‖ ‖d_k‖²,    (17)

where σ, s > 0 and ρ ∈ (0, 1). We describe our proposed algorithm as follows.

Algorithm 2.1 (ATTCG)
Step 1: Given x_0 and σ ∈ (0, 1), compute d_0 = -F_0 and set k = 0.
Step 2: If ‖F_k‖ < ϵ, then stop; otherwise continue with Step 3.
Step 3: Determine the step size α_k using condition (17).
Step 4: Set the trial point z_k = x_k + α_k d_k.
Step 5: If ‖F(z_k)‖ < ϵ, then stop; otherwise compute x_{k+1} by (16).
Step 6: Compute the search direction by (13).
Step 7: Restart criterion: if |F_{k+1}^T F_k| > 0.2 ‖F_{k+1}‖², then set d_{k+1} = -F_{k+1}.
Step 8: Set k = k + 1 and go to Step 2.
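A compact sketch of Algorithm 2.1 is given below (an illustration only, not the authors' MATLAB implementation; the values of s, ρ, σ, and ϵ are placeholders, since the paper does not fix them at this point):

```python
import numpy as np

def attcg(F, x0, s=1.0, rho=0.5, sigma=1e-4, eps=1e-4, max_iter=1000):
    """Illustrative sketch of Algorithm 2.1 (ATTCG); parameter values are placeholders."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx                                              # Step 1: d_0 = -F_0
    for k in range(max_iter):
        if np.linalg.norm(Fx) < eps:                     # Step 2
            break
        alpha = s                                        # Step 3: line search (17)
        for _ in range(60):                              # backtracking, cf. Lemma 4
            z = x + alpha * d                            # Step 4: trial point z_k
            Fz = F(z)
            if -Fz @ d >= sigma * alpha * np.linalg.norm(Fz) * (d @ d):
                break
            alpha *= rho
        if np.linalg.norm(Fz) < eps:                     # Step 5: z_k solves (1)
            return z, Fz
        x_new = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz      # projection step (16)
        F_new = F(x_new)
        s_k, y_k = x_new - x, F_new - Fx
        ys = y_k @ s_k                                   # assumed positive (monotone F)
        theta = (s_k @ s_k) / ys                         # (10)
        delta = (1 + theta * (y_k @ y_k) / ys) * (s_k @ F_new) / ys \
                - theta * (y_k @ F_new) / ys             # (14)
        eta = theta * (s_k @ F_new) / ys                 # (15)
        d = -theta * F_new - delta * s_k - eta * y_k     # Step 6: direction (13)
        if abs(F_new @ Fx) > 0.2 * (F_new @ F_new):      # Step 7: restart criterion
            d = -F_new
        x, Fx = x_new, F_new                             # Step 8
    return x, Fx
```

A call such as attcg(lambda x: np.exp(x) - 1.0, np.ones(1000)) illustrates the intended usage on the first test problem of Section 4.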

3. Convergence Result

In this section, we establish the global convergence of the ATTCG algorithm.

Assumption A
(i) The level set S = {x ∈ R^n : ‖F(x)‖ ≤ ‖F(x_0)‖} is bounded, i.e., there exists a positive constant B > 0 such that ‖x‖ ≤ B for all x ∈ S.
(ii) The function F is Lipschitz continuous on R^n, i.e., there exists a positive constant L > 0 such that, for all x, y ∈ R^n,

    ‖F(x) - F(y)‖ ≤ L ‖x - y‖.    (18)

Assumption A(ii) implies that there exists a constant λ > 0 such that

    ‖F_k‖ ≤ λ.    (19)

The following lemma is from Theorem 2.1 of Solodov and Svaiter [17]; we only state it here, since it can be proved in the same way as in [17].

Lemma 1. Suppose that Assumption A holds and {x_k} is generated by the ATTCG algorithm. Let x* be a solution of problem (1), so that F(x*) = 0. Then

    ‖x_{k+1} - x*‖² ≤ ‖x_k - x*‖² - ‖x_{k+1} - x_k‖²    (20)

holds and the sequence {x_k} is bounded. Furthermore, either the sequence {x_k} is finite and the last iterate is a solution, or the sequence is infinite and

    Σ_{k ≥ 0} ‖x_{k+1} - x_k‖² < ∞.    (21)

Lemma 2. Suppose that the line search satisfies condition (17) and F is uniformly convex. Then d_{k+1} given by (13)-(15) is a descent direction and ‖F_{k+1}‖ ≤ ‖d_{k+1}‖.

Proof. Since F is uniformly convex, it follows that y_k^T s_k > 0. Now, by direct computation, we have

    F_{k+1}^T d_{k+1} = -θ_k ‖F_{k+1}‖² - (1 + (θ_k ‖y_k‖²)/(y_k^T s_k)) ((s_k^T F_{k+1})²)/(y_k^T s_k) ≤ 0.    (22)

Consequently,

    ‖F_{k+1}‖² ≤ -F_{k+1}^T d_{k+1} ≤ |F_{k+1}^T d_{k+1}| ≤ ‖F_{k+1}‖ ‖d_{k+1}‖,    (23)

which gives ‖F_{k+1}‖ ≤ ‖d_{k+1}‖.

Lemma 3. Suppose that Assumption A holds, and consider the ATTCG algorithm with the direction (13), where d_k is a descent direction and α_k is computed by condition (17). Suppose that F is a uniformly convex function on S, i.e., there exists a constant µ > 0 such that

    (F(x) - F(y))^T (x - y) ≥ µ ‖x - y‖²

for all x, y ∈ S. Then

    ‖d_{k+1}‖ ≤ κ + (κ/µ)(2 + L + L²/µ).    (24)

Proof. From the Lipschitz continuity we know that ‖y_k‖ ≤ L ‖s_k‖. On the other hand, uniform convexity yields

    y_k^T s_k ≥ µ ‖s_k‖².    (25)

Thus, using the Cauchy-Schwarz inequality, Assumption A and the above inequalities, we have

    |δ_k| ≤ |s_k^T F_{k+1}|/(y_k^T s_k) + (‖y_k‖² θ_k |s_k^T F_{k+1}|)/((y_k^T s_k)²) + (θ_k |y_k^T F_{k+1}|)/(y_k^T s_k)    (26)

         ≤ κ/(µ ‖s_k‖) + (L² κ)/(µ² ‖s_k‖) + (L κ)/(µ ‖s_k‖)    (27)

         = (κ/µ)(1 + L + L²/µ) (1/‖s_k‖).    (28)

Similarly,

    |η_k| = (θ_k |s_k^T F_{k+1}|)/(y_k^T s_k) ≤ (θ_k ‖s_k‖ ‖F_{k+1}‖)/(µ ‖s_k‖²) ≤ κ/(µ ‖s_k‖).    (29)

Finally,

    ‖d_{k+1}‖ ≤ θ_k ‖F_{k+1}‖ + |δ_k| ‖s_k‖ + |η_k| ‖y_k‖ ≤ κ + (κ/µ)(2 + L + L²/µ).    (30)

Hence, d_{k+1} is bounded.

The next lemma shows that the line search in Step 3 of the ATTCG algorithm is well posed, so that the presented algorithm is well defined.

Lemma 4. Let Assumption A hold. Then the ATTCG algorithm produces an iterate x_{k+1} = x_k + α_k d_k in a finite number of backtracking steps.

Proof. Suppose that F_k = 0 does not hold for any k, i.e., the algorithm does not stop. Then there exists a constant ϵ_0 > 0 such that

    ‖F_k‖ ≥ ϵ_0,  for all k ≥ 0.    (31)

The aim is to show that the line search strategy (17) always terminates in a finite number of steps with a positive step length α_k. We argue by contradiction. Suppose that for some iteration index k the condition (17) is never satisfied. Then, letting α_k^m = ρ^m s, we conclude that

    -F(x_k + α_k^m d_k)^T d_k < σ α_k^m ‖F(x_k + α_k^m d_k)‖ ‖d_k‖²,  for all m ≥ 0.

Combining the above inequality with the Lipschitz condition (18) and with (23), we have

    ‖F(x_k)‖² ≤ -d_k^T F_k = [F(x_k + α_k^m d_k) - F(x_k)]^T d_k - F(x_k + α_k^m d_k)^T d_k < [L + σ ‖F(x_k + α_k^m d_k)‖] α_k^m ‖d_k‖²,  for all m ≥ 0.

Moreover, by (18), (19), (24) and α_k^m ≤ s,

    ‖F(x_k + α_k^m d_k)‖ ≤ ‖F(x_k + α_k^m d_k) - F_k‖ + ‖F_k‖ ≤ L α_k^m ‖d_k‖ + λ ≤ Ls(κ + (κ/µ)(2 + L + L²/µ)) + λ.

Thus, using (24) and (31), we obtain

    α_k^m > ‖F_k‖² / ([L + σ ‖F(x_k + α_k^m d_k)‖] ‖d_k‖²) ≥ ϵ_0² / ([L + σ(Ls(κ + (κ/µ)(2 + L + L²/µ)) + λ)] (κ + (κ/µ)(2 + L + L²/µ))²) > 0,  for all m ≥ 0.

This contradicts the definition α_k^m = ρ^m s, since ρ^m s → 0 as m → ∞. Consequently, the line search procedure (17) attains a positive step length α_k in a finite number of backtracking steps. Hence the proof is complete.

Theorem 3.1. Let Assumption A hold and let {α_k, d_k, x_{k+1}, F_{k+1}} be generated by the ATTCG algorithm. Then

    lim inf_{k→∞} ‖F_k‖ = 0.    (32)

Proof. We prove this theorem by contradiction, namely by assuming that relation (31) holds. By (23) we have

    ‖d_k‖ ≥ ‖F_k‖ ≥ ϵ_0,  for all k ≥ 0.    (33)

By Lemma 1 and the line search (17), α_k ‖d_k‖ → 0 as k → ∞. It then follows from (33) that

    lim_{k→∞} α_k = 0.    (34)

By the line search technique (17), the step length α'_k = α_k/ρ was rejected, that is,

    -F(x_k + α'_k d_k)^T d_k < σ α'_k ‖F(x_k + α'_k d_k)‖ ‖d_k‖².    (35)

By the boundedness of {x_k} (Lemma 1) we can deduce that there exist an accumulation point x̄ and a subsequence {x_{k_j}} of {x_k} such that lim_{j→∞} x_{k_j} = x̄. The sequence {d_k} is also bounded (Lemma 3), so there exist d̄ ∈ R^n and a subsequence {d_{k_j}} (without loss of generality, {x_{k_j}} and {d_{k_j}} have the same indices) such that lim_{j→∞} d_{k_j} = d̄. Thus, taking the limit along this subsequence on both sides of (35) and using (34) gives F(x̄)^T d̄ ≥ 0. On the other hand, taking the limit along the same subsequence on both sides of (17) gives F(x̄)^T d̄ ≤ 0, which, together with the descent property (22) and (31), yields a contradiction. So the proof is complete.

4. Numerical Results

In this section, we test the ATTCG algorithm and compare its performance with that of the DF-SDCG algorithm [9]. The test functions are listed below. Examples 1 and 5 are from [19], Examples 6 and 8 are from [22], and Examples 9 and 10 are from [21], whereas the rest are constructed by us.

Example 1 (strictly convex function): F_i(x) = e^{x_i} - 1, i = 1, 2, ..., n.

Example 2: F_i(x) = x_i - 3x_i(sin x_i) + 2, i = 2, 3, ..., n.

Example 3: F_1(x) = cos x_1 - 9 + 3x_1 + 8e^{x_2}, F_i(x) = cos x_i - 9 + 3x_i + 8e^{x_{i-1}}, i = 2, 3, ..., n.

Example 4: F_i(x) = (0.5 - x_i)² + (n + 1 - i)x_i - 1, i = 1, 2, ..., n-1, F_n(x) = (n/10)(1 - e^{x_n²}).

Example 5: F_i(x) = 4x_i + (x_{i+1} - 2x_i) - x_{i+1}²/3, i = 1, 2, ..., n-1, F_n(x) = 4x_n + (x_{n-1} - 2x_n) - x_{n-1}²/3.

Example 6: F_i(x) = x_i² - 4, i = 1, 2, ..., n.

Example 7: F_1(x) = sin(x_1 - x_2) - 4e^{2 - x_2} + 2x_1, F_i(x) = sin(2 - x_i) - 4e^{x_i - 2} + 2x_i + cos(2 - x_i) e^{2 - x_i}, i = 2, 3, ..., n.

Example 8: F_i(x) = x_i x_{i-1} + e^{x_{i-1}} - 1, i = 2, 3, ..., n.

Example 9: F_i(x) = x_i - (1/n²) Σ_{j=1}^{n} x_j² + (1/n) Σ_{j=1}^{n} x_j, i = 1, 2, ..., n.

Example 10: F_i(x) = 5x_i² - 2x_i - 3, i = 1, 2, ..., n.
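For illustration, Examples 1 and 6 can be coded as vectorized residual maps as follows (a sketch, not the authors' MATLAB code):

```python
import numpy as np

def example_1(x):
    """Example 1, the strictly convex function: F_i(x) = exp(x_i) - 1."""
    return np.exp(x) - 1.0

def example_6(x):
    """Example 6: F_i(x) = x_i**2 - 4."""
    return x**2 - 4.0

# residual norms at a sample starting point of dimension n = 1000
x0 = 0.5 * np.ones(1000)
print(np.linalg.norm(example_1(x0)), np.linalg.norm(example_6(x0)))
```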

Table 1. Numerical results based on dimension of problem (n), number of iterations, ‖F_k‖ and CPU time (in seconds) of the ATTCG and DF-SDCG algorithms on problems 1-6.

Table 2. Numerical results based on dimension of problem (n), number of iterations, ‖F_k‖ and CPU time (in seconds) of the ATTCG and TPRP algorithms on problems 7-10.

Figure 1. Performance profile of the ATTCG and DF-SDCG methods with respect to the number of iterations for Examples 1-10.

Figure 2. Performance profile of the ATTCG and DF-SDCG methods with respect to CPU time (in seconds) for Examples 1-10.

In the experiments, we compare the performance of the method introduced in this work with that of the DF-SDCG conjugate gradient method for large-scale nonlinear systems of equations in order to check its effectiveness. Numerical computations were performed in MATLAB R2013a on a PC with an Intel CELERON(R) processor, 4.00 GB of RAM and a 1.80 GHz CPU. We used 10 test problems with dimensions 100, 1000, 5000 and larger to test the performance of the proposed method in terms of the number of iterations (NI) and the CPU time (in seconds). The method is terminated whenever

    ‖F(x_k)‖ < ϵ.    (36)

The tables list the numerical results, where Iter and Time stand for the total number of iterations and the CPU time in seconds, respectively, and ‖F_k‖ is the norm of the residual at the stopping point. We also used "-" to represent a failure during the iteration process due to one of the following:
1. the number of iterations and/or the CPU time in seconds reaches 1000;
2. failure of code execution due to insufficient memory;
3. ‖F_k‖ is not a number (NaN).
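Figures 1 and 2 are performance profiles in the sense of Dolan and Moré [11]. The following sketch shows how such profiles can be computed from a cost table (this is not the authors' code; the data layout is a placeholder):

```python
import numpy as np

def performance_profile(costs, taus):
    """Dolan-More performance profile.

    costs : (num_problems, num_solvers) array of iteration counts or CPU times,
            with np.inf marking a failure.
    taus  : 1-D array of performance-ratio thresholds (tau >= 1).
    Returns rho of shape (len(taus), num_solvers), where rho[j, s] is the
    fraction of problems solver s solves within a factor taus[j] of the best.
    """
    costs = np.asarray(costs, dtype=float)
    best = costs.min(axis=1, keepdims=True)          # best cost per problem
    ratios = costs / best                            # performance ratios r_{p,s}
    return np.array([(ratios <= t).mean(axis=0) for t in taus])

# tiny usage example with made-up iteration counts for two solvers
costs = np.array([[12, 20], [30, 25], [np.inf, 40]])  # rows: problems, cols: solvers
print(performance_profile(costs, taus=np.array([1.0, 1.5, 2.0, 4.0])))
```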

Tables 1 and 2 list the numerical results. They indicate that the proposed ATTCG algorithm, compared with the DF-SDCG algorithm [9] for large-scale nonlinear equations, requires fewer iterations and less CPU time. Figures 1 and 2 show the performance profiles of Dolan and Moré [11], which confirm this claim, i.e., less CPU time and fewer iterations on each test problem. Furthermore, on average, our ‖F_k‖ is smaller, which indicates that the solution obtained is a closer approximation to the exact solution than that of the DF-SDCG algorithm [9].

5. Conclusion

In this paper, a new three-term conjugate gradient algorithm, obtained as a modification of the memoryless BFGS quasi-Newton update with a descent direction, has been presented. The convergence of this algorithm was proved using a derivative-free line search. Intensive numerical experiments on some benchmark nonlinear systems of equations with different characteristics showed that the suggested algorithm is faster and more efficient than the DF-SDCG algorithm for large-scale nonlinear equations [9].

References

[1] N. Andrei, A modified Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization, Journal of Computational and Applied Mathematics, 60 (2011).
[2] N. Andrei, A simple three-term conjugate gradient algorithm for unconstrained optimization, Journal of Computational and Applied Mathematics, 241 (2013).
[3] N. Andrei, On three-term conjugate gradient algorithms for unconstrained optimization, Applied Mathematics and Computation, 219 (2013).
[4] A. Bouaricha and R. B. Schnabel, Tensor methods for large sparse systems of nonlinear equations, Mathematical Programming, 82 (1998).
[5] S. Buhmiler, N. Krejić and Z. Lužanin, Practical quasi-Newton algorithms for singular nonlinear systems, Numerical Algorithms, 55 (2010).
[6] C. G. Broyden, A class of methods for solving nonlinear simultaneous equations, Mathematics of Computation, 19 (1965).
[7] C. S. Byeong et al., A comparison of the Newton-Krylov method with high order Newton-like methods to solve nonlinear systems, Applied Mathematics and Computation, 217 (2010).
[8] W. Cheng, A PRP type method for systems of monotone equations, Journal of Computational and Applied Mathematics, 50 (2009).
[9] W. Cheng et al., A family of derivative-free conjugate gradient methods for large-scale nonlinear systems of equations, Journal of Computational and Applied Mathematics, 222 (2009).
[10] Y. H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM Journal on Optimization, 10 (1999).
[11] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, 91 (2) (2002).
[12] C. T. Kelley, Iterative Methods for Linear and Nonlinear Equations, SIAM, Philadelphia, PA, (1995).
[13] W. Leong, M. A. Hassan and M. Y. Waziri, A matrix-free quasi-Newton method for solving nonlinear systems, Computers and Mathematics with Applications, 62 (2011).
[14] Q. Li and D.-H. Li, A class of derivative-free methods for large-scale nonlinear monotone equations, IMA Journal of Numerical Analysis, 31 (2011).
[15] Y. Narushima and H. Yabe, Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization, Journal of Computational and Applied Mathematics, 236 (2012).
[16] M. Raydan, The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem, SIAM Journal on Optimization, 7 (1997).
[17] M. V. Solodov and B. F. Svaiter, A globally convergent inexact Newton method for systems of monotone equations, in: M. Fukushima and L. Qi (Eds.), Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, Kluwer Academic Publishers, (1998).
[18] G. Yuan and M. Zhang, A three-term Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations, Journal of Computational and Applied Mathematics, 286 (2015).
[19] W. La Cruz, J. M. Martínez and M. Raydan, Spectral residual method without gradient information for solving large-scale nonlinear systems of equations: Theory and experiments, Optimization, 76 (79).
[20] L. Zhang, W. Zhou and D. H. Li, A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence, IMA Journal of Numerical Analysis, 26 (2006).
[21] M. Y. Waziri and J. Sabi'u, A derivative-free conjugate gradient method and its global convergence for solving symmetric nonlinear equations, International Journal of Mathematics and Mathematical Sciences, (2015).
[22] M. Y. Waziri, H. A. Aisha and M. Mamat, A structured Broyden-like method for solving systems of nonlinear equations, Applied Mathematical Sciences, 8 (141) (2014).
