Two new spectral conjugate gradient algorithms based on Hestenes-Stiefel
Research Article

Guofang Wang, Rui Shan, Wei Huang, Wen Liu and Jinyi Zhao

Journal of Algorithms & Computational Technology 2017, Vol. (4). © The Author(s) 2017. Reprints and permissions: sagepub.co.uk/journalspermissions.nav. DOI: 10.1177/. journals.sagepub.com/home/act

Abstract

The spectral conjugate gradient algorithm, a variant of the conjugate gradient method, is one of the effective methods for solving unconstrained optimization problems. In this paper, based on the Hestenes-Stiefel method, two new spectral conjugate gradient algorithms (Descent Hestenes-Stiefel (DHS) and Wang-Hestenes-Stiefel (WHS)) are proposed. Under the Wolfe line search and mild assumptions on the objective function, the two algorithms possess the sufficient descent property without any other conditions and are globally convergent. Numerical results show that the new algorithms outperform the Hestenes-Stiefel conjugate gradient method.

Keywords

Spectral conjugate gradient algorithm, unconstrained optimization, Wolfe line search, sufficient descent property, globally convergent

Date received: 2 May 2017; revised: 23 May 2017; accepted: 20 June 2017

Introduction

Consider the unconstrained optimization problem (UP)

$$\min_{x \in \mathbb{R}^n} f(x), \qquad (1)$$

where the function $f : \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable. The most commonly used method for solving this kind of problem is the conjugate gradient (CG) method, which is especially suitable for solving large-dimension or non-linear problems. Its convergence rate lies between those of the Newton method and the steepest descent method; the CG method avoids the Newton method's drawback of having to compute the Hessian matrix, and it also possesses quadratic termination. Its main iterative format is

$$x_{k+1} = x_k + \alpha_k d_k, \qquad (2)$$

$$d_k = \begin{cases} -g_k, & k = 1, \\ -g_k + \beta_k d_{k-1}, & k \geq 2, \end{cases} \qquad (3)$$

where $\alpha_k$ is the step factor, which can be determined by some method (a line search, etc.), $d_k$ is the descent search direction, $g_k = \nabla f(x_k)$, and $\beta_k$ is a scalar.
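As a concrete illustration of iteration (2)-(3), the generic CG loop can be sketched as follows. This Python/NumPy sketch is our own illustration, not the paper's program: the choice of $\beta_k$ is left abstract, and an exact line search for a quadratic objective stands in for the step-factor computation.

```python
import numpy as np

def cg_quadratic(A, b, x0, beta_rule, tol=1e-10, max_iter=100):
    """Generic CG iteration (2)-(3) on f(x) = 0.5 x'Ax - b'x (so g = Ax - b).
    beta_rule(g_new, g_old, d_old) supplies the scalar beta_k; the step
    factor is the exact minimizer along d_k: alpha = -g'd / d'Ad."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    d = -g                                            # d_1 = -g_1
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = -float(g @ d) / float(d @ (A @ d))    # exact line search
        x = x + alpha * d                             # iteration (2)
        g_new = A @ x - b
        d = -g_new + beta_rule(g_new, g, d) * d       # iteration (3)
        g = g_new
    return x

# Fletcher-Reeves rule as one possible choice of beta_k
fr = lambda g_new, g_old, d_old: float(g_new @ g_new) / float(g_old @ g_old)
```

On a strictly convex quadratic with exact line search, this loop reproduces linear CG and terminates in at most $n$ steps, which is the quadratic termination property mentioned above.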
Different CG methods are generated according to the different formulae for the scalar parameter $\beta_k$, and different spectral CG methods are generated according to different search directions $d_k$. The expressions used in some well-known CG algorithms are listed below:

$$\beta_k^{HS} = \frac{g_k^T (g_k - g_{k-1})}{d_{k-1}^T (g_k - g_{k-1})} \quad \text{HS (Hestenes-Stiefel) algorithm},^2$$

$$\beta_k^{FR} = \frac{g_k^T g_k}{g_{k-1}^T g_{k-1}} \quad \text{FR (Fletcher-Reeves) algorithm},^3$$

$$\beta_k^{PRP} = \frac{g_k^T (g_k - g_{k-1})}{g_{k-1}^T g_{k-1}} \quad \text{PRP (Polak-Ribière-Polyak) algorithm},^{4,5}$$

$$\beta_k^{DY} = \frac{g_k^T g_k}{d_{k-1}^T (g_k - g_{k-1})} \quad \text{DY (Dai-Yuan) algorithm}.^6$$

College of Science, Yanshan University, Qinhuangdao, China

Corresponding author: Rui Shan, College of Science, Yanshan University, Qinhuangdao, China. Email: @qq.com

Creative Commons CC BY-NC: This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (creativecommons.org/licenses/by-nc/4.0/) which permits non-commercial use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages.
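Written out in code, the four formulae read as follows. This Python/NumPy sketch is our own illustration (the function names are not from the paper); each rule takes the consecutive gradients $g_{k-1}$, $g_k$ and, where needed, the previous direction $d_{k-1}$.

```python
import numpy as np

def beta_hs(g_new, g_old, d_old):
    """Hestenes-Stiefel: g_k'y / d_{k-1}'y with y = g_k - g_{k-1}."""
    y = g_new - g_old
    return float(g_new @ y) / float(d_old @ y)

def beta_fr(g_new, g_old):
    """Fletcher-Reeves: ||g_k||^2 / ||g_{k-1}||^2."""
    return float(g_new @ g_new) / float(g_old @ g_old)

def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak: g_k'y / ||g_{k-1}||^2."""
    y = g_new - g_old
    return float(g_new @ y) / float(g_old @ g_old)

def beta_dy(g_new, g_old, d_old):
    """Dai-Yuan: ||g_k||^2 / d_{k-1}'y."""
    y = g_new - g_old
    return float(g_new @ g_new) / float(d_old @ y)
```

Note that all four share the same numerator or denominator building blocks $g_k^T(g_k - g_{k-1})$, $\|g_k\|^2$, $\|g_{k-1}\|^2$ and $d_{k-1}^T(g_k - g_{k-1})$; the hybrids discussed below mix these blocks.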
Among these four algorithms, the FR and DY algorithms have good global convergence, while the Hestenes-Stiefel (HS) and PRP algorithms have excellent numerical performance. Under exact line search, the HS algorithm converges in finitely many steps for strictly convex quadratic functions; but for general, non-strictly-convex objective functions, finite-step convergence cannot be guaranteed even under exact line search, and global convergence cannot be guaranteed either.^7 Combining the advantages of the HS and DY algorithms, reference 8 proposed a new conjugate method, the NLS-DY algorithm,^8 whose scalar parameter $\beta_k^{NLS}$ modifies the HS quotient with an angle-dependent perturbation term and a safeguarded (max-type) denominator, and whose search direction takes the spectral form

$$d_k = \begin{cases} -g_k, & k = 1, \\ -\left(1 + \beta_k \dfrac{g_k^T d_{k-1}}{\|g_k\|^2}\right) g_k + \beta_k d_{k-1}, & k \geq 2. \end{cases}$$

The motivation of this paper is to combine the advantages of HS and NLS-DY^8 in order to provide novel algorithms with better convergence.

The new algorithms

Consider the unconstrained optimization problem (1). Combining with the literature,^{3,8} the formulae of DHS and WHS are constructed as follows:

$$\beta_k^{WHS} = \frac{g_k^T (g_k - g_{k-1})}{(g_{k-1}^T d_{k-1})^2 - \mu\, d_{k-1}^T g_{k-1}}, \qquad (4)$$

$$d_k = \begin{cases} -g_k, & k = 1, \\ -\left(1 + \beta_k \dfrac{g_k^T d_{k-1}}{\|g_k\|^2}\right) g_k + \beta_k d_{k-1}, & k \geq 2. \end{cases} \qquad (5)$$

Compared with the HS algorithm, the DHS algorithm's innovation lies in $d_k$. In the HS algorithm, the iteration format is (2), the search direction is (3), and the step is obtained by a line search. In the DHS algorithm, the iteration format is (2), the search direction is (5), and the step is obtained by the Wolfe line search. Under the same scalar parameter $\beta_k^{HS}$, using a different search direction and search method, the DHS algorithm achieves better numerical results while preserving convergence. The WHS algorithm also uses the search direction (5) and the Wolfe line search; the scalar parameter $\beta_k^{NLS}$ of Shi et al.^8 is modified to get the scalar parameter $\beta_k^{WHS}$. The parameter $\mu$ in WHS is a constant with $\mu \geq 0$; different values of $\mu$ give different iterative behaviour.

DHS algorithm implementation process:

1. Given an initial value $x_1 \in \mathbb{R}^n$ and $\varepsilon > 0$, set $d_1 = -g_1$, $k = 1$.
2. Perform the Wolfe line search

$$f(x_k + \alpha_k d_k) \leq f(x_k) + \delta \alpha_k g_k^T d_k, \qquad g(x_k + \alpha_k d_k)^T d_k \geq \sigma g_k^T d_k, \qquad (6)$$

where $\delta$ and $\sigma$ are real numbers with $0 < \delta < \sigma < 1$.
From (6) we get $\alpha_k$, and according to (2) we obtain $x_{k+1}$; then calculate $f_{k+1} = f(x_{k+1})$ and $g_{k+1} = g(x_{k+1})$.

3. If $\|g_{k+1}\| \leq \varepsilon$, the minimum point is $x_{k+1}$; if $\|g_{k+1}\| > \varepsilon$, go to the next step.
4. Calculate $\beta_k^{HS}$ and the direction $d_{k+1}$ from (5).
5. Set $k = k + 1$ and turn to step 2.

WHS algorithm implementation process:

1. Given an initial value $x_1 \in \mathbb{R}^n$ and $\varepsilon > 0$, set $d_1 = -g_1$, $k = 1$.
2. Perform the Wolfe line search (6) to get $\alpha_k$, and according to (2) obtain $x_{k+1}$; then calculate $f_{k+1} = f(x_{k+1})$ and $g_{k+1} = g(x_{k+1})$.
3. If $\|g_{k+1}\| \leq \varepsilon$, the minimum point is $x_{k+1}$; if $\|g_{k+1}\| > \varepsilon$, go to the next step.
4. Calculate $\beta_k^{WHS}$ from (4) and the direction $d_{k+1}$ from (5).
5. Set $k = k + 1$ and turn to step 2.

Global convergence

Assumptions:^9

1. The level set $\Omega = \{x \in \mathbb{R}^n : f(x) \leq f(x_0)\}$ is bounded, where $x_0$ is the initial point.
2. The function $f$ is continuously differentiable in a neighborhood $N$ of $\Omega$, and its gradient satisfies the Lipschitz continuity condition; that is, there exists a positive constant $L$ such that

$$\|g(x_1) - g(x_2)\| \leq L \|x_1 - x_2\|, \qquad \forall x_1, x_2 \in N.$$

Theorem 2.1. If $g_k \neq 0$, the directions generated by the DHS and WHS algorithms are descent directions; that is, $g_k^T d_k < 0$ for all $k$.

Proof. When $k = 1$, $d_1 = -g_1$, so $g_1^T d_1 = -\|g_1\|^2 < 0$. When $k \geq 2$, $d_k = -\left(1 + \beta_k \frac{g_k^T d_{k-1}}{\|g_k\|^2}\right) g_k + \beta_k d_{k-1}$, so

$$g_k^T d_k = -\|g_k\|^2 - \beta_k g_k^T d_{k-1} + \beta_k g_k^T d_{k-1} = -\|g_k\|^2 < 0.$$
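The implementation processes above, together with the descent identity just established, translate directly into code. The following Python/NumPy sketch is our own minimal illustration, not the authors' MATLAB program: a simple bisection-type line search stands in for the Wolfe conditions (6), and the defaults $\delta = 0.01$, $\sigma = 0.05$ follow the settings reported in the numerical experiments section.

```python
import numpy as np

def wolfe_search(f, grad, x, d, delta=0.01, sigma=0.05, alpha0=1.0, max_iter=50):
    """Bisection-type search for a step satisfying the Wolfe conditions (6):
    f(x+ad) <= f(x) + delta*a*g'd  and  g(x+ad)'d >= sigma*g'd."""
    gd = float(grad(x) @ d)
    fx = f(x)
    lo, hi, alpha = 0.0, np.inf, alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + delta * alpha * gd:
            hi = alpha                         # sufficient decrease fails: shrink
        elif float(grad(x + alpha * d) @ d) < sigma * gd:
            lo = alpha                         # curvature condition fails: grow
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def dhs(f, grad, x0, eps=1e-6, max_iter=10000):
    """Sketch of the DHS algorithm: beta_HS, spectral direction (5), Wolfe search (6)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    if np.linalg.norm(g) <= eps:
        return x
    d = -g                                     # step 1: d_1 = -g_1
    for _ in range(max_iter):
        alpha = wolfe_search(f, grad, x, d)    # step 2
        x = x + alpha * d
        g_new = grad(x)
        if np.linalg.norm(g_new) <= eps:       # step 3: stopping test
            break
        y = g_new - g
        beta = float(g_new @ y) / float(d @ y)                       # beta_HS
        theta = 1.0 + beta * float(g_new @ d) / float(g_new @ g_new)
        d = -theta * g_new + beta * d          # step 4: direction (5)
        g = g_new                              # step 5: k <- k + 1
    return x
```

By Theorem 2.1, each computed direction satisfies $g_k^T d_k = -\|g_k\|^2$, so the sufficient decrease condition in (6) can always make progress; the WHS variant differs only in replacing `beta` with the quotient (4).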
Thus the directions of the DHS and WHS algorithms satisfy the sufficient descent property $g_k^T d_k = -\|g_k\|^2$.

Lemma 2.1. Suppose that $f$ satisfies the assumptions above, $x_{k+1}$ is generated by (2), $d_k$ by (5), and $\alpha_k$ satisfies the Wolfe line search (6).^{10} Then the Zoutendijk condition holds:

$$\sum_{k=1}^{\infty} \frac{(g_k^T d_k)^2}{\|d_k\|^2} < +\infty. \qquad (7)$$

Theorem 2.2. Suppose that assumptions (1) and (2) hold, and consider the CG method of the form (2) and (5) with $\beta_k = \beta_k^{HS}$. Then

$$\liminf_{k \to \infty} \|g_k\| = 0. \qquad (8)$$

Proof. (By contradiction.) Assume the conclusion does not hold; then there exists a constant $\varepsilon > 0$ such that $\|g_k\| > \varepsilon$ for all $k$. According to (5), $g_k^T d_k = -\|g_k\|^2$. Let $l_k = 1 + \beta_k \frac{g_k^T d_{k-1}}{\|g_k\|^2}$, so that $d_k = -l_k g_k + \beta_k d_{k-1}$. For $k \geq 2$, squaring (5) gives

$$\|d_k\|^2 = \beta_k^2 \|d_{k-1}\|^2 - 2 l_k \beta_k g_k^T d_{k-1} + l_k^2 \|g_k\|^2.$$

Dividing by $(g_k^T d_k)^2 = \|g_k\|^4$, using $-2 l_k \beta_k g_k^T d_{k-1} + l_k^2 \|g_k\|^2 = \left(1 - (l_k - 1)^2\right) \|g_k\|^2$ and $\beta_k = \beta_k^{HS}$, we have

$$\frac{\|d_k\|^2}{(g_k^T d_k)^2} = (\beta_k^{HS})^2 \frac{\|d_{k-1}\|^2}{\|g_k\|^4} - \frac{(l_k - 1)^2}{\|g_k\|^2} + \frac{1}{\|g_k\|^2} = \frac{[g_k^T (g_k - g_{k-1})]^2}{[d_{k-1}^T (g_k - g_{k-1})]^2} \cdot \frac{\|d_{k-1}\|^2}{\|g_k\|^4} - \frac{(l_k - 1)^2}{\|g_k\|^2} + \frac{1}{\|g_k\|^2} \leq \frac{\|g_k\|^4}{(g_{k-1}^T d_{k-1})^2} \cdot \frac{\|d_{k-1}\|^2}{\|g_k\|^4} + \frac{1}{\|g_k\|^2} = \frac{\|d_{k-1}\|^2}{(g_{k-1}^T d_{k-1})^2} + \frac{1}{\|g_k\|^2}.$$

By recursion,

$$\frac{\|d_k\|^2}{(g_k^T d_k)^2} \leq \sum_{i=1}^{k} \frac{1}{\|g_i\|^2} \leq \frac{k}{\varepsilon^2},$$

so that

$$\sum_{k=1}^{\infty} \frac{(g_k^T d_k)^2}{\|d_k\|^2} \geq \sum_{k=1}^{\infty} \frac{\varepsilon^2}{k} = +\infty,$$

which contradicts condition (7) of Lemma 2.1. Hence (7) forces (8) to hold, and the DHS algorithm is globally convergent.

Theorem 2.3. Suppose that assumptions (1) and (2) hold, and consider the CG method of the form (2) and (5) with $\beta_k = \beta_k^{WHS}$. Then

$$\liminf_{k \to \infty} \|g_k\| = 0. \qquad (9)$$

Proof. (By contradiction.) Assume the conclusion does not hold; then there exists a constant $\varepsilon > 0$ such that $\|g_k\| > \varepsilon$ for all $k$.
According to (5), $g_k^T d_k = -\|g_k\|^2$. Let $l_k = 1 + \beta_k \frac{g_k^T d_{k-1}}{\|g_k\|^2}$. For $k \geq 2$, squaring (5) gives

$$\|d_k\|^2 = \beta_k^2 \|d_{k-1}\|^2 - 2 l_k \beta_k g_k^T d_{k-1} + l_k^2 \|g_k\|^2,$$

and with $\beta_k = \beta_k^{WHS}$ we have

$$\frac{\|d_k\|^2}{(g_k^T d_k)^2} = (\beta_k^{WHS})^2 \frac{\|d_{k-1}\|^2}{\|g_k\|^4} - \frac{(l_k - 1)^2}{\|g_k\|^2} + \frac{1}{\|g_k\|^2} = \frac{[g_k^T (g_k - g_{k-1})]^2 \|d_{k-1}\|^2}{[(g_{k-1}^T d_{k-1})^2 - \mu\, d_{k-1}^T g_{k-1}]^2 \|g_k\|^4} - \frac{(l_k - 1)^2}{\|g_k\|^2} + \frac{1}{\|g_k\|^2} \leq \frac{\|g_k\|^4}{(g_{k-1}^T d_{k-1})^2} \cdot \frac{\|d_{k-1}\|^2}{\|g_k\|^4} + \frac{1}{\|g_k\|^2} = \frac{\|d_{k-1}\|^2}{(g_{k-1}^T d_{k-1})^2} + \frac{1}{\|g_k\|^2}.$$

By the same recursion as in the proof of Theorem 2.2,

$$\frac{\|d_k\|^2}{(g_k^T d_k)^2} \leq \sum_{i=1}^{k} \frac{1}{\|g_i\|^2} \leq \frac{k}{\varepsilon^2}, \qquad \sum_{k=1}^{\infty} \frac{(g_k^T d_k)^2}{\|d_k\|^2} \geq \sum_{k=1}^{\infty} \frac{\varepsilon^2}{k} = +\infty,$$

which contradicts condition (7) of Lemma 2.1. Hence (9) holds, and the WHS algorithm is globally convergent.

Numerical experiments

In this section, we use some test functions of Moré et al.^{11} under the Wolfe line search to compare the numerical performance of the two new spectral CG algorithms (DHS and WHS) with the traditional HS algorithm. The program^{12} is written in MATLAB 2010b and run on a computer with an Intel(R)
Table 1. Iterative comparison of the two algorithms DHS and HS. For each algorithm the columns report Dim, NI, NF, NG, t and f* on the test functions Brown, Rosenbrock, Beale, Jennrich (three instances), Helical, Box, Powell, Wood, Dennis, Osborne (two instances) and Biggs. [Numerical entries not reproduced.] HS: Hestenes-Stiefel; NI: number of iterations; NF: number of times the function is evaluated; NG: number of gradient evaluations.

Table 2. Iterative comparison of the two algorithms WHS and HS. For each algorithm the columns report Dim, NI, NF, NG, t and f* on the test functions Brown, Rosenbrock, Beale, Helical, Jennrich (two instances), Box (two instances), Powell, Wood, Kowalik, Dennis and Osborne. [Numerical entries not reproduced.] HS: Hestenes-Stiefel; NI: number of iterations; NF: number of times the function is evaluated; NG: number of gradient evaluations.

Core(TM) i5-5200U CPU and 4.00 GB SDRAM. During the test, the parameters are set as follows:

$$\delta = 0.01, \quad \sigma = 0.05, \quad \varepsilon = 10^{-6}, \quad \text{NI} \leq 10000.$$

The test results are shown in Tables 1 to 3, where Dim is the dimension of the function, NI is the number of iterations, NF is the number of times the function is evaluated, NG is the number of gradient evaluations, t is the program run time, and f* is the optimal function value. The sign ** means that the run stopped because the line search procedure failed to find a step length, which indicates that the algorithm has poor convergence. The data in Table 1 show that, for most of the test functions, the NI, NF and NG obtained by the DHS algorithm are less than those of HS, so the new iterative method is effective, and t of DHS is obviously lower than that of HS. The reduction of the number of iterations and the
Table 3. Iterative comparison of the two algorithms DHS and WHS. For each algorithm the columns report Dim, NI, NF, NG, t and f* on the test functions Rosenbrock, Freudenstein, Jennrich, Beale, Helical, Box, Powell, Wood, Kowalik and Osborne. [Numerical entries not reproduced.] NI: number of iterations; NF: number of times the function is evaluated; NG: number of gradient evaluations.

running time reflects the strong convergence of the algorithm, and the decrease of the error indicates that the algorithm obtains better numerical results. DHS is more useful for solving unconstrained problems. In Table 2, a different value of $\mu$ is selected for each function (at present, the choice of $\mu$ is over a uniform discrete grid with $0 \leq \mu \leq 1$). The data in Table 2 show that, for most of the test functions, the NI, NF and NG calculated by the WHS algorithm are less than those of HS, and t of WHS is obviously lower than that of HS; WHS is better than HS. In Table 3, after selecting the appropriate $\mu$, for some functions (for example Rosenbrock, Jennrich3, Helical and Box3) the WHS algorithm performs slightly better than DHS, while for the others (for example Freudenstein, Beale, Powell, Wood, Kowalik and Osborne1) the DHS algorithm performs slightly better than WHS; the DHS and WHS methods are approximately equal. To sum up, DHS and WHS both perform better than HS. The comparison between DHS and WHS needs concrete analysis, but they are approximately equal. In addition, we also give the performance of WHS with uniformly discretized $\mu$ in Table 4 of the Appendix. Note: for each function in Tables 2 and 3, the value of $\mu$ selected is the one giving the best iteration results.

Conclusion

In this paper, based on the classical HS method, we present two improved CG methods, that is, the DHS and WHS methods.
In Section 3, we obtain the following theoretical results: the DHS method has the sufficient descent property and is globally convergent if the Wolfe line search (6) is used; the WHS method likewise has the sufficient descent property and is globally convergent if the Wolfe line search (6) is used. On the other hand, the numerical results reported in Section 4 show that the average performance of the DHS and WHS methods proposed in this paper is generally better than that of the HS method, and that the average performance of the DHS and WHS methods is approximately equal.

Acknowledgements

The authors are very grateful to the anonymous referees for their valuable comments and useful suggestions, which improved the quality of this paper.

Declaration of conflicting interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: National Nature Science Foundation of China (No , , ) and Basic Research Project of Yanshan University (6LGY02).

References

1. Fu YD, Chen XY and Tang YH. Optimization theory and method (Chinese). Beijing: National Defense Industry Press.
2. Hestenes MR and Stiefel E. Method of conjugate gradient for solving linear equations. J Res Natl Bureau Standards 1952; 49.
3. Fletcher R and Reeves C. Function minimization by conjugate gradients. Comput J 1964; 7.
4. Polak E and Ribière G. Note sur la convergence de méthodes de directions conjuguées. Revue Française Informatique Recherche Opérationnelle 1969; 16.
5. Polyak B. The conjugate gradient method in extreme problems. USSR Comput Math Math Phys 1969; 9.
6. Dai YH and Yuan Y. A nonlinear conjugate gradient with a strong global convergence property. SIAM J Optim 1999; 10.
7. Gilbert JC and Nocedal J. Global convergence properties of conjugate gradient methods for optimization. SIAM J Optim 1992; 2.
8. Shi ST, Shan R and Liu W. The conjugate gradient algorithm with perturbation factor and its convergence. J Henan Univ Sci Technol 2013; 34.
9. Wang KR and Gao PT. Two mixed conjugate gradient methods based on DY. J Shandong Univ (Nat Sci) 2016; 51.
10. Zoutendijk G. Nonlinear programming computational methods. Integer Nonlin Program 1970; 43.
11. Moré JJ, Garbow BS and Hillstrom KE. Testing unconstrained optimization software. ACM Trans Math Softw 1981; 7.
12. Liu XG and Hu YQ. Application of optimization method and MATLAB implementation (Chinese). Beijing: Science Press, 2014.

Appendix

Table 4. Numerical results of the WHS method with uniformly discretized $\mu$. For each test function (Kowalik, Freudenstein, Jennrich, Beale, Helical, Box, Powell, Wood, Rosenbrock) the columns report f*, NI, t, NG and NF for each value of $\mu$. [Numerical entries not reproduced.] NI: number of iterations; NG: number of gradient evaluations; NF: number of times the function is evaluated.
More informationAn Alternative Three-Term Conjugate Gradient Algorithm for Systems of Nonlinear Equations
International Journal of Mathematical Modelling & Computations Vol. 07, No. 02, Spring 2017, 145-157 An Alternative Three-Term Conjugate Gradient Algorithm for Systems of Nonlinear Equations L. Muhammad
More informationPLEASE SCROLL DOWN FOR ARTICLE
This article was downloaded by: [University of New South Wales] On: 18 February 2009 Access details: Access Details: [subscription number 906810409] Publisher Taylor & Francis Informa Ltd Reistered in
More informationExtended Spectral Nonlinear Conjugate Gradient methods for solving unconstrained problems
International Journal of All Researh Euation an Sientifi Methos IJARESM ISSN: 55-6 Volume Issue 5 May-0 Extene Spetral Nonlinear Conjuate Graient methos for solvin unonstraine problems Dr Basim A Hassan
More informationEmotional Optimized Design of Electro-hydraulic Actuators
Sensors & Transducers, Vol. 77, Issue 8, Auust, pp. 9-9 Sensors & Transducers by IFSA Publishin, S. L. http://www.sensorsportal.com Emotional Optimized Desin of Electro-hydraulic Actuators Shi Boqian,
More informationA COMBINED CLASS OF SELF-SCALING AND MODIFIED QUASI-NEWTON METHODS
A COMBINED CLASS OF SELF-SCALING AND MODIFIED QUASI-NEWTON METHODS MEHIDDIN AL-BAALI AND HUMAID KHALFAN Abstract. Techniques for obtaining safely positive definite Hessian approximations with selfscaling
More informationOn memory gradient method with trust region for unconstrained optimization*
Numerical Algorithms (2006) 41: 173 196 DOI: 10.1007/s11075-005-9008-0 * Springer 2006 On memory gradient method with trust region for unconstrained optimization* Zhen-Jun Shi a,b and Jie Shen b a College
More informationANALYTIC CENTER CUTTING PLANE METHODS FOR VARIATIONAL INEQUALITIES OVER CONVEX BODIES
ANALYI ENER UING PLANE MEHODS OR VARIAIONAL INEQUALIIES OVER ONVE BODIES Renin Zen School of Mathematical Sciences honqin Normal Universit honqin hina ABSRA An analtic center cuttin plane method is an
More information154 ADVANCES IN NONLINEAR PROGRAMMING Abstract: We propose an algorithm for nonlinear optimization that employs both trust region techniques and line
7 COMBINING TRUST REGION AND LINE SEARCH TECHNIQUES Jorge Nocedal Department of Electrical and Computer Engineering, Northwestern University, Evanston, IL 60208-3118, USA. Ya-xiang Yuan State Key Laboratory
More informationNumerical Optimization of Partial Differential Equations
Numerical Optimization of Partial Differential Equations Part I: basic optimization concepts in R n Bartosz Protas Department of Mathematics & Statistics McMaster University, Hamilton, Ontario, Canada
More informationModification of the Wolfe line search rules to satisfy the descent condition in the Polak-Ribière-Polyak conjugate gradient method
Laboratoire d Arithmétique, Calcul formel et d Optimisation UMR CNRS 6090 Modification of the Wolfe line search rules to satisfy the descent condition in the Polak-Ribière-Polyak conjugate gradient method
More informationOptimization of Mechanical Design Problems Using Improved Differential Evolution Algorithm
International Journal of Recent Trends in Enineerin Vol. No. 5 May 009 Optimization of Mechanical Desin Problems Usin Improved Differential Evolution Alorithm Millie Pant Radha Thanaraj and V. P. Sinh
More informationA new conjugate gradient method with the new Armijo search based on a modified secant equations
ISSN: 35-38 Enineerin an echnoloy Vol 5 Iue 7 July 8 A new conjuate raient metho with the new Armijo earch bae on a moifie ecant equation Weijuan Shi Guohua Chen Zhibin Zhu Department of Mathematic & Applie
More informationAn interior point type QP-free algorithm with superlinear convergence for inequality constrained optimization
Applied Mathematical Modelling 31 (2007) 1201 1212 www.elsevier.com/locate/apm An interior point type QP-free algorithm with superlinear convergence for inequality constrained optimization Zhibin Zhu *
More informationResearch Article Modified T-F Function Method for Finding Global Minimizer on Unconstrained Optimization
Mathematical Problems in Engineering Volume 2010, Article ID 602831, 11 pages doi:10.1155/2010/602831 Research Article Modified T-F Function Method for Finding Global Minimizer on Unconstrained Optimization
More informationA Diagnostic Treatment of Unconstrained Optimization Problems via a Modified Armijo line search technique.
IOSR Journal of Mathematics IOSR-JM e-issn: 2278-5728, p-issn: 2319-765X. Volume 10, Issue 6 Ver. VI Nov - Dec. 2014, PP 47-53 www.iosrjournals.or A Dianostic reatment of Unconstraine Optimization Problems
More informationTHE solution of the absolute value equation (AVE) of
The nonlinear HSS-like iterative method for absolute value equations Mu-Zheng Zhu Member, IAENG, and Ya-E Qi arxiv:1403.7013v4 [math.na] 2 Jan 2018 Abstract Salkuyeh proposed the Picard-HSS iteration method
More informationFALL 2018 MATH 4211/6211 Optimization Homework 4
FALL 2018 MATH 4211/6211 Optimization Homework 4 This homework assignment is open to textbook, reference books, slides, and online resources, excluding any direct solution to the problem (such as solution
More informationResearch Article A Nonmonotone Weighting Self-Adaptive Trust Region Algorithm for Unconstrained Nonconvex Optimization
Discrete Dynamics in Nature and Society Volume 2015, Article ID 825839, 8 pages http://dx.doi.org/10.1155/2015/825839 Research Article A Nonmonotone Weighting Self-Adaptive Trust Region Algorithm for Unconstrained
More informationIntroduction to Nonlinear Optimization Paul J. Atzberger
Introduction to Nonlinear Optimization Paul J. Atzberger Comments should be sent to: atzberg@math.ucsb.edu Introduction We shall discuss in these notes a brief introduction to nonlinear optimization concepts,
More informationConjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization Yasushi Narushima and Hiroshi Yabe September 28, 2011 Abstract Conjugate gradient
More informationComputational Linear Algebra
Computational Linear Algebra PD Dr. rer. nat. habil. Ralf Peter Mundani Computation in Engineering / BGU Scientific Computing in Computer Science / INF Winter Term 2017/18 Part 3: Iterative Methods PD
More informationAn improved generalized Newton method for absolute value equations
DOI 10.1186/s40064-016-2720-5 RESEARCH Open Access An improved generalized Newton method for absolute value equations Jingmei Feng 1,2* and Sanyang Liu 1 *Correspondence: fengjingmeilq@hotmail.com 1 School
More informationStudy on the Cutter Suction Dredgers Productivity Model and Its Optimal Control
Modelin, Simulation and Optimization Technoloies and Applications (MSOTA 016) Study on the Cutter Suction reders Productivity Model and Its Optimal Control Minhon Yao, Yanlin Wan, Jin Shan and Jianyon
More informationNew Accelerated Conjugate Gradient Algorithms for Unconstrained Optimization
ew Accelerated Conjugate Gradient Algorithms for Unconstrained Optimization eculai Andrei Research Institute for Informatics, Center for Advanced Modeling and Optimization, 8-0, Averescu Avenue, Bucharest,
More informationWEAK AND STRONG CONVERGENCE OF AN ITERATIVE METHOD FOR NONEXPANSIVE MAPPINGS IN HILBERT SPACES
Applicable Analysis and Discrete Mathematics available online at http://pemath.et.b.ac.yu Appl. Anal. Discrete Math. 2 (2008), 197 204. doi:10.2298/aadm0802197m WEAK AND STRONG CONVERGENCE OF AN ITERATIVE
More informationA new Newton like method for solving nonlinear equations
DOI 10.1186/s40064-016-2909-7 RESEARCH Open Access A new Newton like method for solving nonlinear equations B. Saheya 1,2, Guo qing Chen 1, Yun kang Sui 3* and Cai ying Wu 1 *Correspondence: yksui@sina.com
More informationBulletin of the. Iranian Mathematical Society
ISSN: 1017-060X (Print) ISSN: 1735-8515 (Online) Bulletin of the Iranian Mathematical Society Vol. 41 (2015), No. 5, pp. 1259 1269. Title: A uniform approximation method to solve absolute value equation
More informationNonlinear Programming
Nonlinear Programming Kees Roos e-mail: C.Roos@ewi.tudelft.nl URL: http://www.isa.ewi.tudelft.nl/ roos LNMB Course De Uithof, Utrecht February 6 - May 8, A.D. 2006 Optimization Group 1 Outline for week
More informationProximal-Based Pre-correction Decomposition Methods for Structured Convex Minimization Problems
J. Oper. Res. Soc. China (2014) 2:223 235 DOI 10.1007/s40305-014-0042-2 Proximal-Based Pre-correction Decomposition Methods for Structured Convex Minimization Problems Yuan-Yuan Huang San-Yang Liu Received:
More informationOptimization II: Unconstrained Multivariable
Optimization II: Unconstrained Multivariable CS 205A: Mathematical Methods for Robotics, Vision, and Graphics Justin Solomon CS 205A: Mathematical Methods Optimization II: Unconstrained Multivariable 1
More informationUnconstrained optimization
Chapter 4 Unconstrained optimization An unconstrained optimization problem takes the form min x Rnf(x) (4.1) for a target functional (also called objective function) f : R n R. In this chapter and throughout
More informationModification of the Wolfe line search rules to satisfy the descent condition in the Polak-Ribière-Polyak conjugate gradient method
Laboratoire d Arithmétique, Calcul formel et d Optimisation UMR CNRS 6090 Modification of the Wolfe line search rules to satisfy the descent condition in the Polak-Ribière-Polyak conjugate gradient method
More informationResearch Article A Novel Differential Evolution Invasive Weed Optimization Algorithm for Solving Nonlinear Equations Systems
Journal of Applied Mathematics Volume 2013, Article ID 757391, 18 pages http://dx.doi.org/10.1155/2013/757391 Research Article A Novel Differential Evolution Invasive Weed Optimization for Solving Nonlinear
More informationSuppose that the approximate solutions of Eq. (1) satisfy the condition (3). Then (1) if η = 0 in the algorithm Trust Region, then lim inf.
Maria Cameron 1. Trust Region Methods At every iteration the trust region methods generate a model m k (p), choose a trust region, and solve the constraint optimization problem of finding the minimum of
More informationIntroduction. New Nonsmooth Trust Region Method for Unconstraint Locally Lipschitz Optimization Problems
New Nonsmooth Trust Region Method for Unconstraint Locally Lipschitz Optimization Problems Z. Akbari 1, R. Yousefpour 2, M. R. Peyghami 3 1 Department of Mathematics, K.N. Toosi University of Technology,
More informationDeterministic convergence of conjugate gradient method for feedforward neural networks
Deterministic convergence of conjugate gradient method for feedforard neural netorks Jian Wang a,b,c, Wei Wu a, Jacek M. Zurada b, a School of Mathematical Sciences, Dalian University of Technology, Dalian,
More informationImproved Newton s method with exact line searches to solve quadratic matrix equation
Journal of Computational and Applied Mathematics 222 (2008) 645 654 wwwelseviercom/locate/cam Improved Newton s method with exact line searches to solve quadratic matrix equation Jian-hui Long, Xi-yan
More information10log(1/MSE) log(1/MSE)
IROVED MATRI PENCIL METHODS Biao Lu, Don Wei 2, Brian L Evans, and Alan C Bovik Dept of Electrical and Computer Enineerin The University of Texas at Austin, Austin, T 7872-84 fblu,bevans,bovik@eceutexasedu
More informationA new computational method for threaded connection stiffness
Research Article A new computational method for threaded connection stiffness Advances in Mechanical Engineering 2016, Vol. 8(12) 1 9 Ó The Author(s) 2016 DOI: 10.1177/1687814016682653 aime.sagepub.com
More information