Research Article A New Conjugate Gradient Algorithm with Sufficient Descent Property for Unconstrained Optimization


Mathematical Problems in Engineering, Volume 2015, 8 pages

XiaoPing Wu,¹ LiYing Liu,² FengJie Xie,¹ and YongFei Li¹

¹ School of Economic Management, Xi'an University of Posts and Telecommunications, Xi'an, Shaanxi, China
² School of Mathematics Science, Liaocheng University, Liaocheng, Shandong, China

Correspondence should be addressed to XiaoPing Wu; wuxiaoping978@26.com

Received 6 May 2015; Revised 24 September 2015; Accepted 29 September 2015

Academic Editor: Masoud Hajarian

Copyright 2015 XiaoPing Wu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A new nonlinear conjugate gradient formula, which satisfies the sufficient descent condition, is proposed for solving unconstrained optimization problems. The global convergence of the algorithm is established under the weak Wolfe line search. Numerical experiments show that the new WWPNPRP+ algorithm is competitive with the SWPPRP+ algorithm, the SWPHS+ algorithm, and the WWPDYHS+ algorithm.

1. Introduction

In this paper, we consider the unconstrained optimization problem

    min {f(x) : x ∈ R^n},    (1)

where f(x): R^n → R is a twice continuously differentiable function whose gradient is denoted by g(x): R^n → R^n. The iterative formula is

    x_{k+1} = x_k + t_k d_k,    (2)

    d_k = −g_k                       if k = 1,
    d_k = −g_k + β_k d_{k−1}         if k ≥ 2,    (3)

where t_k is a step size computed by carrying out a line search, β_k is a scalar, and g_k denotes g(x_k). There are at least six famous formulas for β_k, which are given below:

    β_k^{HS}  = g_k^T (g_k − g_{k−1}) / ((g_k − g_{k−1})^T d_{k−1}),
    β_k^{FR}  = g_k^T g_k / (g_{k−1}^T g_{k−1}),
    β_k^{PRP} = g_k^T (g_k − g_{k−1}) / (g_{k−1}^T g_{k−1}),
    β_k^{CD}  = −g_k^T g_k / (d_{k−1}^T g_{k−1}),
    β_k^{LS}  = −g_k^T (g_k − g_{k−1}) / (d_{k−1}^T g_{k−1}),
    β_k^{DY}  = g_k^T g_k / ((g_k − g_{k−1})^T d_{k−1}).    (4)

To establish the global convergence results of the above conjugate gradient (CG) methods, it is usually required that the step size t_k satisfy some line search conditions, such as the weak Wolfe-Powell (WWP) line search

    f(x_k + t_k d_k) ≤ f(x_k) + δ t_k g_k^T d_k,    (5)

    g(x_k + t_k d_k)^T d_k ≥ σ g_k^T d_k,    (6)

where δ ∈ (0, 1/2) and σ ∈ (δ, 1), and the strong Wolfe-Powell (SWP) line search, which consists of (5) and

    |g(x_k + t_k d_k)^T d_k| ≤ −σ g_k^T d_k,    (7)

where δ ∈ (0, 1/2) and σ ∈ (δ, 1). In what follows, Wolfe-Powell is abbreviated as Wolfe. Considerable attention has been paid to the global convergence behavior of the above methods. Zoutendijk [1] proved that the FR method with exact line search is globally convergent. Al-Baali [2] extended this result to the strong Wolfe line search conditions. In [3], Dai and Yuan proposed the DY method, which produces a descent search direction at every iteration and converges globally provided that the line search satisfies the weak Wolfe conditions. In [4], Wei et al. discussed the global convergence of the PRP conjugate gradient method (CGM) with inexact line search for nonconvex unconstrained optimization. Recently, based on [5-7], Jiang et al. [8] proposed a hybrid CGM with

    β_k^{JHJ} = (‖g_k‖² − max{0, (‖g_k‖/‖d_{k−1}‖) g_k^T d_{k−1}, (‖g_k‖/‖g_{k−1}‖) g_k^T g_{k−1}}) / (d_{k−1}^T (g_k − g_{k−1})).    (8)

Under the Wolfe line search, the method possesses global convergence and efficient numerical performance. In studies of conjugate gradient methods, the sufficient descent condition

    g_k^T d_k ≤ −c ‖g_k‖²,  c > 0,    (9)

is often used to analyze the global convergence of nonlinear conjugate gradient methods with inexact line search techniques. For instance, Touati-Ahmed and Storey [9], Al-Baali [2], Gilbert and Nocedal [10], and Hu and Storey [11] hinted that the sufficient descent condition may be crucial for conjugate gradient methods. Unfortunately, this condition is hard to satisfy: it has been shown that the PRP method with the strong Wolfe-Powell line search does not ensure this condition at each iteration. Grippo and Lucidi [12] therefore sought line searches that ensure the sufficient descent condition and presented a new line search with this property; the convergence of the PRP method with this line search was established. Yu et al. [13] analyzed the global convergence of a modified PRP CGM with the sufficient descent property.
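Since the analyses above hinge on the weak Wolfe-Powell conditions (5)-(6), a minimal line search sketch may help fix ideas. The following Python routine (a bisection scheme; the function names and the quadratic example are our own illustration, not from the paper) returns a step satisfying both conditions for a descent direction:

```python
import numpy as np

def weak_wolfe_step(f, grad, x, d, delta=0.01, sigma=0.1, max_iter=60):
    """Bisection search for a step t satisfying the weak Wolfe-Powell
    conditions (5)-(6):
        f(x + t d) <= f(x) + delta * t * g'd   and
        grad(x + t d)'d >= sigma * g'd.
    Assumes d is a descent direction at x."""
    gd = grad(x) @ d
    assert gd < 0, "d must be a descent direction"
    fx = f(x)
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > fx + delta * t * gd:      # (5) fails: step too long
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < sigma * gd:      # (6) fails: step too short
            lo = t
            t = 2.0 * t if np.isinf(hi) else 0.5 * (lo + hi)
        else:                                       # both conditions hold
            return t
    return t

# Illustration on f(x) = ||x||^2 with a steepest descent direction:
f = lambda x: x @ x
g = lambda x: 2.0 * x
x0 = np.array([1.0, 1.0])
t = weak_wolfe_step(f, g, x0, -g(x0))
```

The bracketing logic mirrors the usual argument that a weak Wolfe step exists when f is bounded below along d: failure of (5) shrinks the upper bracket, failure of (6) grows the lower one.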
Gilbert and Nocedal [10] gave another way to discuss the global convergence of the PRP method with the weak Wolfe line search. By using a complicated line search, they were able to establish the global convergence of the PRP and HS methods by restricting the parameter β_k in (3) to be nonnegative; that is,

    β_k^+ = max{0, β_k^{PRP}},    (10)

which yields a globally convergent CG method that is also computationally efficient [14]. In spite of the numerical efficiency of the PRP method, as an important defect, the method lacks the following descent property:

    g_k^T d_k < 0,  ∀k ≥ 0,    (11)

even for uniformly convex objective functions [15]. This motivated researchers to pay much attention to finding extensions of the PRP method with the descent property. In this context, Yu et al. [16] proposed a modified form of β_k^{PRP} as follows:

    β_k^{DPRP} = β_k^{PRP} − C (‖y_{k−1}‖² / ‖g_{k−1}‖⁴) g_k^T d_{k−1},    (12)

where y_{k−1} = g_k − g_{k−1} and C > 1/4 is a constant, leading to a CG method with the sufficient descent property. Dai and Kou [17] proposed a family of conjugate gradient methods and an improved Wolfe line search; meanwhile, to accelerate the algorithm, an adaptive restart along negative gradients was introduced. Jiang and Jian [18] proposed two modified CGMs with disturbance factors based on a variant of the PRP method; the two proposed methods not only generate a sufficient descent direction at each iteration but also converge globally for nonconvex minimization if the strong Wolfe line search is used. A new hybrid conjugate gradient method was presented for unconstrained optimization in [19]; the proposed method generates descent directions at every iteration, this property is independent of the step-length line search, and, under the Wolfe line search, the method possesses global convergence.

The main purpose of this paper is to design an efficient algorithm which possesses global convergence, sufficient descent, and good numerical performance. In the next section, we present a new CG formula and give its properties. In Section 3, the new algorithm and its global convergence result are established. To test and compare the numerical performance of the proposed method, in the last part of this work, a large number of medium-scale numerical experiments are reported in tables and performance profiles.

2. The Formula and Its Property

Because the sufficient descent condition (9) is a very important property for analyzing the global convergence of CG methods, we hope to find β_k such that d_k satisfies (9). In the following, we propose a sequence {β_k} and prove that it has this property. Firstly, we give a definition of a descent sequence (or a sufficient descent sequence): a sequence {β_k} is called a descent sequence (or a sufficient descent sequence) for the CG methods if there exists a constant τ ∈ (0, 1) (or τ ∈ [0, 1)) such that, for all k ≥ 2,

    β_k g_k^T d_{k−1} ≤ τ ‖g_k‖².    (13)

By using (3), we have, for all k ≥ 2,

    g_k^T d_k = −‖g_k‖² + β_k g_k^T d_{k−1}.    (14)
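Identity (14) is simply the inner product of g_k with the update (3); a short numerical check (with the PRP choice of β_k, picked here purely for illustration) confirms it:

```python
import numpy as np

rng = np.random.default_rng(0)
g_prev = rng.standard_normal(5)                    # g_{k-1}
g_k = rng.standard_normal(5)                       # g_k
d_prev = -g_prev                                   # previous direction d_{k-1}

beta = g_k @ (g_k - g_prev) / (g_prev @ g_prev)    # e.g. the PRP formula
d_k = -g_k + beta * d_prev                         # update (3)

# Identity (14): g_k' d_k = -||g_k||^2 + beta * g_k' d_{k-1}
lhs = g_k @ d_k
rhs = -(g_k @ g_k) + beta * (g_k @ d_prev)
assert np.isclose(lhs, rhs)
```

The identity explains the definition above: bounding the cross term β_k g_k^T d_{k−1} by τ‖g_k‖² in (13) immediately forces g_k^T d_k ≤ −(1 − τ)‖g_k‖².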

From the above discussion, we require that

    −‖g_k‖² + β_k g_k^T d_{k−1} ≤ 0,    (15)

which is guaranteed by (13). In [20], the authors proposed a variation of the FR formula:

    β_k^{VFR}(μ) = μ₁ ‖g_k‖² / (μ₂ |g_k^T d_{k−1}| + μ₃ ‖g_{k−1}‖²),    (16)

where μ₁ ∈ (0, +∞), μ₂ ∈ [μ₁ + ε, +∞), μ₃ ∈ (0, +∞), and ε is any given positive constant. It is easy to prove that {β_k^{VFR}} is a descent sequence (with τ = μ₁/μ₂) for CG methods if g_k^T d_{k−1} ≠ 0. Formula (16) possesses the sufficient descent property, and it has been proved that there exist nonlinear conjugate gradient formulas possessing the sufficient descent property without any line search, among them

    β_k^{WYL} = g_k^T (g_k − (‖g_k‖/‖g_{k−1}‖) g_{k−1}) / (g_{k−1}^T g_{k−1}).    (17)

By restricting the parameter σ < 1/4 under the SWP line search condition, the WYL method possesses the sufficient descent condition [21]. In [22], the authors designed the following variation of the PRP formula, which possesses the sufficient descent property without any line search:

    β_k^N(μ) = (‖g_k‖² − g_k^T d_{k−1}) / (μ |g_k^T d_{k−1}| + ‖g_{k−1}‖²)   if ‖g_k‖² ≥ g_k^T d_{k−1},
    β_k^N(μ) = 0                                                             otherwise,    (18)

in which μ > 1. Motivated by the ideas in [20, 22], which need no line search and ensure sufficient descent, and taking into account the good convergence properties of [10] and the good numerical performance in [14], we propose a new class of formulas for β_k as follows:

    β_k^{NPRP} = (λ μ₁ ‖g_k‖² + (1 − λ) μ₁ (‖g_k‖² − |g_k^T g_{k−1}|)) / (μ₂ |g_k^T d_{k−1}| + μ₁ ‖g_{k−1}‖²),    (19)

where the definitions of μ₁, μ₂ are the same as those in formula (16), and λ ∈ (0, 1). In order to ensure the nonnegativity of the parameter β_k, we define

    β_k^{NPRP+} = max{0, β_k^{NPRP}}.    (20)

Thus, if a negative value of β_k^{NPRP} occurs, this strategy restarts the iteration along the steepest descent direction. The following two propositions show that {β_k^{NPRP+}} is a descent sequence, so that d_k makes the sufficient descent condition (9) hold.

Proposition 1. Suppose that β_k is defined by (19)-(20); then

    β_k^{NPRP+} ≤ σ̄ ‖g_k‖² / |g_k^T d_{k−1}|,    (21)

where 0 < σ̄ = μ₁/μ₂ < 1.

Proof. It is clear that inequality (21) holds when β_k^{NPRP+} = 0. Now we consider the case where β_k^{NPRP+} = β_k^{NPRP}. We have

    β_k^{NPRP+} = (λ μ₁ ‖g_k‖² + (1 − λ) μ₁ (‖g_k‖² − |g_k^T g_{k−1}|)) / (μ₂ |g_k^T d_{k−1}| + μ₁ ‖g_{k−1}‖²)
               ≤ (λ μ₁ ‖g_k‖² + (1 − λ) μ₁ ‖g_k‖²) / (μ₂ |g_k^T d_{k−1}|)
               = μ₁ ‖g_k‖² / (μ₂ |g_k^T d_{k−1}|)
               = σ̄ ‖g_k‖² / |g_k^T d_{k−1}|.    (22)

Hence β_k^{NPRP+} satisfies (21). Furthermore, {β_k^{NPRP+}} is a descent sequence without any line search.

Proposition 2. Suppose that β_k is defined by (19)-(20); then d_k satisfies the sufficient descent condition (9) for all k, where c₁ = λ(1 − μ₁/μ₂).

Proof. For any k > 1, suppose that g_{k−1}^T d_{k−1} < 0. If β_k^{NPRP+} = 0, then d_k = −g_k, so we have

    g_k^T d_k = −‖g_k‖² ≤ −c₁ ‖g_k‖²,    (23)

where c₁ = λ(1 − μ₁/μ₂). Otherwise, from the definition of β_k^{NPRP+}, we can obtain

    g_k^T d_k = g_k^T [−g_k + β_k^{NPRP} d_{k−1}]
             ≤ −‖g_k‖² + (λ μ₁ ‖g_k‖² / (μ₂ |g_k^T d_{k−1}|)) |g_k^T d_{k−1}| + ((1 − λ) μ₁ ‖g_k‖² / (μ₂ |g_k^T d_{k−1}|)) |g_k^T d_{k−1}|
             ≤ −‖g_k‖² + λ (μ₁/μ₂) ‖g_k‖² + (1 − λ) ‖g_k‖²
             = −λ ‖g_k‖² + λ (μ₁/μ₂) ‖g_k‖²
             = −λ (1 − μ₁/μ₂) ‖g_k‖²
             = −c₁ ‖g_k‖².    (24)

Since g_1^T d_1 = −‖g_1‖² < 0, we deduce by induction that d_k makes the sufficient descent condition (9) hold for all k. From the proof of Proposition 2, we can see that the requirement μ₁/μ₂ < 1 is necessary; otherwise, the sufficient descent condition cannot be guaranteed.
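To make the formula concrete, here is a hedged Python reading of (19)-(20) with the parameters used later in Section 4 (μ₁ = 1, μ₂ = 2, λ = 0.3); the helper name beta_nprp_plus is our own. The final assertion checks the sufficient descent bound of Proposition 2 with c₁ = λ(1 − μ₁/μ₂) = 0.15 on random data:

```python
import numpy as np

def beta_nprp_plus(g_k, g_prev, d_prev, mu1=1.0, mu2=2.0, lam=0.3):
    """beta_k^{NPRP+} of (19)-(20): a sketch of our reading of the formula,
    with mu2 >= mu1 + eps and lam in (0, 1) as in the paper."""
    num = lam * mu1 * (g_k @ g_k) \
        + (1.0 - lam) * mu1 * (g_k @ g_k - abs(g_k @ g_prev))
    den = mu2 * abs(g_k @ d_prev) + mu1 * (g_prev @ g_prev)
    return max(0.0, num / den)          # restart along -g if negative

rng = np.random.default_rng(1)
g_prev = rng.standard_normal(8)         # g_{k-1}
g_k = rng.standard_normal(8)            # g_k
d_prev = -g_prev                        # ensures g_{k-1}' d_{k-1} < 0

beta = beta_nprp_plus(g_k, g_prev, d_prev)
d_k = -g_k + beta * d_prev              # update (3)

# Proposition 2: g_k' d_k <= -c1 ||g_k||^2 with c1 = lam * (1 - mu1/mu2)
c1 = 0.3 * (1.0 - 1.0 / 2.0)
assert g_k @ d_k <= -c1 * (g_k @ g_k) + 1e-12
```

Note that the bound holds for any draw of the random vectors, exactly as the proposition asserts: the numerator of (19) is at most μ₁‖g_k‖² and the denominator is at least μ₂|g_k^T d_{k−1}|, so the cross term in (14) cannot overwhelm −‖g_k‖².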

3. Global Convergence

In this section, we propose an algorithm related to β_k^{NPRP+} and then study its global convergence property. Firstly, we make the following two assumptions, which have been widely used in the literature to analyze the global convergence of CG methods with inexact line searches.

Assumption A. The level set

    Ω = {x ∈ R^n : f(x) ≤ f(x_1)}    (25)

is bounded.

Assumption B. The gradient g(x) is Lipschitz continuous; that is, there exists a constant L > 0 such that, for any x, y ∈ Ω,

    ‖g(x) − g(y)‖ ≤ L ‖x − y‖.    (26)

Now we give the algorithm.

Algorithm 3.

Step 0. Given x_1 ∈ R^n, set d_1 = −g_1 and k = 1. If g_1 = 0, then stop; otherwise go to Step 1.

Step 1. Find t_k > 0 satisfying the weak Wolfe conditions (5) and (6).

Step 2. Let x_{k+1} = x_k + t_k d_k and g_{k+1} = g(x_{k+1}). If ‖g_{k+1}‖ = 0, then stop; otherwise go to Step 3.

Step 3. Compute β_{k+1}^{NPRP+} by formulas (19) and (20); then generate d_{k+1} by (3).

Step 4. Set k := k + 1 and go to Step 1.

Since {f(x_k)} is a decreasing sequence, it is clear that the sequence {x_k} is contained in Ω, and there exists a constant f* such that

    lim_{k→∞} f(x_k) = f*.    (27)

By using Assumptions A and B, we can deduce that there exists M > 0 such that

    ‖g(x)‖ ≤ M,  ∀x ∈ Ω.    (28)

The following important result was obtained by Zoutendijk [1] and Wolfe [23, 24].

Lemma 4. Suppose that f(x) is bounded below and that g(x) satisfies the Lipschitz condition. Consider any iteration method of the form (2), where d_k satisfies d_k^T g_k < 0 and t_k is obtained by the weak Wolfe line search. Then

    Σ_{k=1}^∞ (g_k^T d_k)² / ‖d_k‖² < +∞.    (29)

The following lemma was obtained by Dai and Yuan [25].

Lemma 5. Assume that a positive series {a_i} satisfies, for all k,

    Σ_{i=1}^k a_i ≥ l k + c,    (30)

where l > 0 and c are constants. Then

    Σ_{k=1}^∞ a_k² / (Σ_{i=1}^k a_i) = +∞.    (31)

Theorem 6. Suppose that Assumptions A and B hold and that {x_k} is a sequence generated by Algorithm 3. Then

    lim inf_{k→∞} ‖g_k‖ = 0.    (32)

Proof. Equation (3) indicates that, for all k ≥ 2,

    d_k + g_k = β_k d_{k−1}.    (33)

Squaring both sides of (33), we obtain

    ‖d_k‖² = β_k² ‖d_{k−1}‖² − 2 g_k^T d_k − ‖g_k‖².    (34)

Suppose that β_k^{NPRP+} = β_k^{NPRP} in (20) (the case β_k^{NPRP+} = 0 gives d_k = −g_k and is immediate). Since the numerator of (19) is at most μ₁ ‖g_k‖² and its denominator is at least μ₁ ‖g_{k−1}‖², we have β_k^{NPRP+} ≤ ‖g_k‖² / ‖g_{k−1}‖², and hence

    ‖d_k‖² ≤ ‖g_k‖² − 2 g_k^T d_k + (‖g_k‖⁴ / ‖g_{k−1}‖⁴) ‖d_{k−1}‖².    (35)

Dividing both sides of (35) by ‖g_k‖⁴, we get

    h_k ≤ h_{k−1} + (1 + 2 γ_k) / ‖g_k‖²,    (36)

where h_k = ‖d_k‖² / ‖g_k‖⁴ and γ_k = −g_k^T d_k / ‖g_k‖². Note that h_1 = 1/‖g_1‖² and γ_1 = 1. It follows from (36) that

    h_k ≤ Σ_{i=1}^k 1/‖g_i‖² + 2 Σ_{i=1}^k γ_i / ‖g_i‖².    (37)

Suppose that conclusion (32) does not hold. Then there exists a positive scalar ε such that, for all k,

    ‖g_k‖ ≥ ε.    (38)

Thus, it follows from (28) and (38) that

    h_k ≤ (1/ε²) (k + 2 Σ_{i=1}^k γ_i).    (39)

By Proposition 2, γ_i ≥ c₁ > 0 for all i, so

    Σ_{i=1}^k γ_i ≥ c₁ k,    (40)

that is, k ≤ (1/c₁) Σ_{i=1}^k γ_i, which together with (39) gives

    h_k ≤ ((1/c₁ + 2)/ε²) Σ_{i=1}^k γ_i.    (41)

Using Lemma 5 with a_i = γ_i and (41), it follows that

    Σ_{k=1}^∞ (g_k^T d_k)² / ‖d_k‖² = Σ_{k=1}^∞ γ_k² / h_k ≥ (ε²/(1/c₁ + 2)) Σ_{k=1}^∞ γ_k² / (Σ_{i=1}^k γ_i) = +∞,    (42)

which contradicts the Zoutendijk condition (29). This shows that (32) holds. The proof of the theorem is complete.

From the proof of the above theorem, we can conclude that any conjugate gradient method with the formula β_k^{NPRP+} and a step size technique that ensures the Zoutendijk condition (29) is globally convergent. In particular, the formula β_k^{NPRP+} with the weak Wolfe conditions generates a globally convergent method.

4. Numerical Results

All methods above are tested on 56 problems: the first 48 test problems (from arwhead to woods) in Table 1 are taken from the CUTE library of Bongartz et al. [26], and the others are taken from Moré et al. [27]. DYHS denotes the hybrid

    β_k = max{0, min{β_k^{HS}, β_k^{DY}}},    (43)

which is generated following Grippo and Lucidi [12]. All codes were written in Matlab 7.5 and run on an HP computer with 1.87 GB RAM and the Windows XP operating system. The parameters are σ = 0.1, δ = 0.01, μ₁ = 1, μ₂ = 2, and λ = 0.3. The iteration is stopped once the criterion ‖g_k‖ ≤ ε = 10⁻⁶ is satisfied. In Table 1, Name denotes the abbreviation of the test problem, n denotes the dimension of the test problem, Itr/NF/NG denote the numbers of iterations, function evaluations, and gradient evaluations, respectively, and Tcpu denotes the CPU time for computing a test problem (units: seconds). On the other hand, to show the performance differences clearly among the four methods, we adopt the performance profiles given by Dolan and Moré [28] to compare the performance according to Itr, NF, NG, and Tcpu, respectively. Benchmark results are generated by running a solver on a set P of problems and recording information of interest such as NF and Tcpu. Let S be the set of solvers in comparison.
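Before turning to the comparison, it may help to see Algorithm 3 end to end. The following sketch is our own minimal Python driver (the function names, the bisection weak Wolfe search, and the quadratic example are illustrative assumptions, not the authors' Matlab code), using the parameters μ₁ = 1, μ₂ = 2, λ = 0.3, ε = 10⁻⁶ of this section:

```python
import numpy as np

def weak_wolfe(f, grad, x, d, delta=0.01, sigma=0.1):
    """Bisection search for a step satisfying the weak Wolfe-Powell
    conditions (5)-(6); d is assumed to be a descent direction."""
    gd = grad(x) @ d
    fx = f(x)
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(60):
        if f(x + t * d) > fx + delta * t * gd:      # (5) fails
            hi = t; t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < sigma * gd:      # (6) fails
            lo = t; t = 2.0 * t if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            break
    return t

def nprp_plus_cg(f, grad, x, mu1=1.0, mu2=2.0, lam=0.3,
                 eps=1e-6, max_iter=500):
    """Algorithm 3 (sketch): WWPNPRP+ conjugate gradient iteration."""
    g = grad(x)
    d = -g                                          # Step 0: d_1 = -g_1
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:                # stopping test
            break
        t = weak_wolfe(f, grad, x, d)               # Step 1
        x = x + t * d                               # Step 2
        g_prev, d_prev, g = g, d, grad(x)
        num = lam * mu1 * (g @ g) \
            + (1.0 - lam) * mu1 * (g @ g - abs(g @ g_prev))
        den = mu2 * abs(g @ d_prev) + mu1 * (g_prev @ g_prev)
        beta = max(0.0, num / den)                  # Step 3: (19)-(20)
        d = -g + beta * d_prev
    return x

# Example: minimize the convex quadratic f(x) = sum((x - 1)^2).
f = lambda x: np.sum((x - 1.0) ** 2)
g = lambda x: 2.0 * (x - 1.0)
x_star = nprp_plus_cg(f, g, np.zeros(4))
```

On this toy quadratic the iteration reaches the minimizer x* = (1, 1, 1, 1); on the nonconvex CUTE problems of Table 1 behavior depends on the line search parameters, as the theory above only guarantees lim inf ‖g_k‖ = 0.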
[Figure 1: Performance profiles on Tcpu.]
[Figure 2: Performance profile on NF.]

Assume that S consists of n_s solvers and P consists of n_p problems. For each problem p ∈ P and solver s ∈ S, denote by t_{p,s} the computing time (or the number of function evaluations, etc.) required to solve problem p by solver s; the comparison between different solvers is based on the performance ratio defined by

    r_{p,s} = t_{p,s} / min{t_{p,s} : s ∈ S}.    (44)

Assume that a parameter r_M ≥ r_{p,s} for all p, s is chosen large enough, and that r_{p,s} = r_M if and only if solver s does not solve problem p. Define

    ρ_s(τ) = (1/n_p) size{p ∈ P : log₂ r_{p,s} ≤ τ},    (45)

where size A means the number of elements in set A; then ρ_s(τ) is the probability for solver s ∈ S that a performance

[Table 1: Numerical test report. For each test problem (Name/n), the four column groups give Itr/NF/NG/Tcpu for the SWPHS+, SWPPRP+, WWPNPRP+, and WWPDYHS methods, respectively; F/F/F/F denotes a failure. The problems include arwhead, chainwoo, chnrosnb, cosine, cragglvy, dixmaana-dixmaanl, dqdrtic, dqrtic, edensch, engval, errinros, genhumps, genrose, nondia, nondquar, penalty, power, quartc, srosenbr, tridia, bv, lin, pen, and vardim at various dimensions.]

[Figure 3: Performance profile on NG.]
[Figure 4: Performance profile on Itr.]

ratio r_{p,s} is within a factor 2^τ of the best ratio. The function ρ_s is the (cumulative) distribution function for the performance ratio, and ρ_s(0) is the probability that the solver will win over the rest of the solvers. Based on the theory of the performance profile above, four performance figures, that is, Figures 1-4, can be generated according to Table 1. From the four figures, we can see that the NPRP+ method is superior to the other three CGMs on the testing problems.

5. Conclusion

In this paper, we carefully studied the combination of variations of the formulas β^{FR} and β^{PRP}. We have found that the new formula possesses the following features: (1) {β_k^{NPRP+}} is a descent sequence without any line search; (2) the new method possesses the sufficient descent property and converges globally; (3) the strategy automatically restarts the iteration along the steepest descent direction if a negative value of β_k^{NPRP} occurs; (4) the initial numerical results are promising.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors are very grateful to the anonymous referees for their useful suggestions and comments, which improved the quality of this paper. This work is supported by the Natural Science Foundation of Shaanxi Province (202JQ9004), the New Star Team of Xi'an University of Posts and Telecommunications (XY20506), the Science Foundation of Liaocheng University (380303), and a General Project of the National Social Science Foundation (5BGL04).

References

[1] G. Zoutendijk, "Nonlinear programming, computational methods," in Integer and Nonlinear Programming, J. Abadie, Ed., North-Holland Publishing, Amsterdam, The Netherlands, 1970.
[2] M. Al-Baali, "Descent property and global convergence of the Fletcher-Reeves method with inexact line search," IMA Journal of Numerical Analysis, vol. 5, no. 1, pp. 121-124, 1985.
[3] Y. H. Dai and Y. Yuan, "An efficient hybrid conjugate gradient method for unconstrained optimization," Annals of Operations Research, vol. 103, no. 1-4, pp. 33-47, 2001.
[4] Z. X. Wei, G. Y. Li, and L. Q. Qi, "Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems," Mathematics of Computation, vol. 77, no. 264, 2008.
[5] Y. H. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, no. 1, pp. 177-182, 2000.
[6] S. W. Yao, Z. X. Wei, and H. Huang, "A note about WYL's conjugate gradient method and its applications," Applied Mathematics and Computation, vol. 191, no. 2, 2007.
[7] X. Z. Jiang, G. D. Ma, and J. B. Jian, "A new globally convergent conjugate gradient method with Wolfe line search," Chinese Journal of Engineering Mathematics, vol. 28, no. 6, 2011.
[8] X. Z. Jiang, L. Han, and J. B. Jian, "A globally convergent mixed conjugate gradient method with Wolfe line search," Mathematica Numerica Sinica, vol. 34, no. 1, pp. 103-112, 2012.
[9] D. Touati-Ahmed and C. Storey, "Efficient hybrid conjugate gradient techniques," Journal of Optimization Theory and Applications, vol. 64, no. 2, 1990.
[10] J. C. Gilbert and J. Nocedal, "Global convergence properties of conjugate gradient methods for optimization," SIAM Journal on Optimization, vol. 2, no. 1, pp. 21-42, 1992.
[11] Y. F. Hu and C. Storey, "Global convergence result for conjugate gradient methods," Journal of Optimization Theory and Applications, vol. 71, no. 2, 1991.

[12] L. Grippo and S. Lucidi, "A globally convergent version of the Polak-Ribière conjugate gradient method," Mathematical Programming, Series B, vol. 78, no. 3, 1997.
[13] G. H. Yu, L. T. Guan, and G. Y. Li, "Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property," Journal of Industrial and Management Optimization, vol. 4, no. 3, 2008.
[14] N. Andrei, "Numerical comparison of conjugate gradient algorithms for unconstrained optimization," Studies in Informatics and Control, vol. 16, no. 4, 2007.
[15] Y. H. Dai, Analyses of conjugate gradient methods [Ph.D. thesis], Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, 1997.
[16] G. Yu, L. Guan, and G. Li, "Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property," Journal of Industrial and Management Optimization, vol. 4, no. 3, 2008.
[17] Y.-H. Dai and C.-X. Kou, "A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search," SIAM Journal on Optimization, vol. 23, no. 1, 2013.
[18] X.-Z. Jiang and J.-B. Jian, "Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization," Nonlinear Dynamics, vol. 77, no. 1-2, 2014.
[19] J. B. Jian, L. Han, and X. Z. Jiang, "A hybrid conjugate gradient method with descent property for unconstrained optimization," Applied Mathematical Modelling, vol. 39, 2015.
[20] Z. Wei, G. Li, and L. Qi, "New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems," Applied Mathematics and Computation, vol. 179, no. 2, 2006.
[21] H. Huang, Z. Wei, and S. Yao, "The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search," Applied Mathematics and Computation, vol. 189, no. 2, 2007.
[22] G. Yu, Y. Zhao, and Z. Wei, "A descent nonlinear conjugate gradient method for large-scale unconstrained optimization," Applied Mathematics and Computation, vol. 187, no. 2, 2007.
[23] P. Wolfe, "Convergence conditions for ascent methods," SIAM Review, vol. 11, no. 2, 1969.
[24] P. Wolfe, "Convergence conditions for ascent methods. II: Some corrections," SIAM Review, vol. 13, no. 2, pp. 185-188, 1971.
[25] Y. Dai and Y. Yuan, Nonlinear Conjugate Gradient Methods, Science Press of Shanghai, Shanghai, China.
[26] I. Bongartz, A. R. Conn, N. Gould, and P. L. Toint, "CUTE: constrained and unconstrained testing environment," ACM Transactions on Mathematical Software, vol. 21, no. 1, pp. 123-160, 1995.
[27] J. J. Moré, B. S. Garbow, and K. E. Hillstrom, "Testing unconstrained optimization software," ACM Transactions on Mathematical Software, vol. 7, no. 1, pp. 17-41, 1981.
[28] E. D. Dolan and J. J. Moré, "Benchmarking optimization software with performance profiles," Mathematical Programming, Series B, vol. 91, no. 2, pp. 201-213, 2002.


More information

Nonlinear conjugate gradient methods, Unconstrained optimization, Nonlinear

Nonlinear conjugate gradient methods, Unconstrained optimization, Nonlinear A SURVEY OF NONLINEAR CONJUGATE GRADIENT METHODS WILLIAM W. HAGER AND HONGCHAO ZHANG Abstract. This paper reviews the development of different versions of nonlinear conjugate gradient methods, with special

More information

Bulletin of the. Iranian Mathematical Society

Bulletin of the. Iranian Mathematical Society ISSN: 1017-060X (Print) ISSN: 1735-8515 (Online) Bulletin of the Iranian Mathematical Society Vol. 43 (2017), No. 7, pp. 2437 2448. Title: Extensions of the Hestenes Stiefel and Pola Ribière Polya conjugate

More information

Research Article Identifying a Global Optimizer with Filled Function for Nonlinear Integer Programming

Research Article Identifying a Global Optimizer with Filled Function for Nonlinear Integer Programming Discrete Dynamics in Nature and Society Volume 20, Article ID 7697, pages doi:0.55/20/7697 Research Article Identifying a Global Optimizer with Filled Function for Nonlinear Integer Programming Wei-Xiang

More information

and P RP k = gt k (g k? g k? ) kg k? k ; (.5) where kk is the Euclidean norm. This paper deals with another conjugate gradient method, the method of s

and P RP k = gt k (g k? g k? ) kg k? k ; (.5) where kk is the Euclidean norm. This paper deals with another conjugate gradient method, the method of s Global Convergence of the Method of Shortest Residuals Yu-hong Dai and Ya-xiang Yuan State Key Laboratory of Scientic and Engineering Computing, Institute of Computational Mathematics and Scientic/Engineering

More information

New class of limited-memory variationally-derived variable metric methods 1

New class of limited-memory variationally-derived variable metric methods 1 New class of limited-memory variationally-derived variable metric methods 1 Jan Vlček, Ladislav Lukšan Institute of Computer Science, Academy of Sciences of the Czech Republic, L. Lukšan is also from Technical

More information

Research Article A Descent Dai-Liao Conjugate Gradient Method Based on a Modified Secant Equation and Its Global Convergence

Research Article A Descent Dai-Liao Conjugate Gradient Method Based on a Modified Secant Equation and Its Global Convergence International Scholarly Research Networ ISRN Computational Mathematics Volume 2012, Article ID 435495, 8 pages doi:10.5402/2012/435495 Research Article A Descent Dai-Liao Conjugate Gradient Method Based

More information

Step lengths in BFGS method for monotone gradients

Step lengths in BFGS method for monotone gradients Noname manuscript No. (will be inserted by the editor) Step lengths in BFGS method for monotone gradients Yunda Dong Received: date / Accepted: date Abstract In this paper, we consider how to directly

More information

A Nonlinear Conjugate Gradient Algorithm with An Optimal Property and An Improved Wolfe Line Search

A Nonlinear Conjugate Gradient Algorithm with An Optimal Property and An Improved Wolfe Line Search A Nonlinear Conjugate Gradient Algorithm with An Optimal Property and An Improved Wolfe Line Search Yu-Hong Dai and Cai-Xia Kou State Key Laboratory of Scientific and Engineering Computing, Institute of

More information

MSS: MATLAB SOFTWARE FOR L-BFGS TRUST-REGION SUBPROBLEMS FOR LARGE-SCALE OPTIMIZATION

MSS: MATLAB SOFTWARE FOR L-BFGS TRUST-REGION SUBPROBLEMS FOR LARGE-SCALE OPTIMIZATION MSS: MATLAB SOFTWARE FOR L-BFGS TRUST-REGION SUBPROBLEMS FOR LARGE-SCALE OPTIMIZATION JENNIFER B. ERWAY AND ROUMMEL F. MARCIA Abstract. A MATLAB implementation of the Moré-Sorensen sequential (MSS) method

More information

Research Article Finding Global Minima with a Filled Function Approach for Non-Smooth Global Optimization

Research Article Finding Global Minima with a Filled Function Approach for Non-Smooth Global Optimization Hindawi Publishing Corporation Discrete Dynamics in Nature and Society Volume 00, Article ID 843609, 0 pages doi:0.55/00/843609 Research Article Finding Global Minima with a Filled Function Approach for

More information

A derivative-free nonmonotone line search and its application to the spectral residual method

A derivative-free nonmonotone line search and its application to the spectral residual method IMA Journal of Numerical Analysis (2009) 29, 814 825 doi:10.1093/imanum/drn019 Advance Access publication on November 14, 2008 A derivative-free nonmonotone line search and its application to the spectral

More information

Research Article Existence and Duality of Generalized ε-vector Equilibrium Problems

Research Article Existence and Duality of Generalized ε-vector Equilibrium Problems Applied Mathematics Volume 2012, Article ID 674512, 13 pages doi:10.1155/2012/674512 Research Article Existence and Duality of Generalized ε-vector Equilibrium Problems Hong-Yong Fu, Bin Dan, and Xiang-Yu

More information

Globally convergent three-term conjugate gradient projection methods for solving nonlinear monotone equations

Globally convergent three-term conjugate gradient projection methods for solving nonlinear monotone equations Arab. J. Math. (2018) 7:289 301 https://doi.org/10.1007/s40065-018-0206-8 Arabian Journal of Mathematics Mompati S. Koorapetse P. Kaelo Globally convergent three-term conjugate gradient projection methods

More information

First Published on: 11 October 2006 To link to this article: DOI: / URL:

First Published on: 11 October 2006 To link to this article: DOI: / URL: his article was downloaded by:[universitetsbiblioteet i Bergen] [Universitetsbiblioteet i Bergen] On: 12 March 2007 Access Details: [subscription number 768372013] Publisher: aylor & Francis Informa Ltd

More information

Two new spectral conjugate gradient algorithms based on Hestenes Stiefel

Two new spectral conjugate gradient algorithms based on Hestenes Stiefel Research Article Two new spectral conjuate radient alorithms based on Hestenes Stiefel Journal of Alorithms & Computational Technoloy 207, Vol. (4) 345 352! The Author(s) 207 Reprints and permissions:

More information

Research Article Modified T-F Function Method for Finding Global Minimizer on Unconstrained Optimization

Research Article Modified T-F Function Method for Finding Global Minimizer on Unconstrained Optimization Mathematical Problems in Engineering Volume 2010, Article ID 602831, 11 pages doi:10.1155/2010/602831 Research Article Modified T-F Function Method for Finding Global Minimizer on Unconstrained Optimization

More information

Research Article Strong Convergence of a Projected Gradient Method

Research Article Strong Convergence of a Projected Gradient Method Applied Mathematics Volume 2012, Article ID 410137, 10 pages doi:10.1155/2012/410137 Research Article Strong Convergence of a Projected Gradient Method Shunhou Fan and Yonghong Yao Department of Mathematics,

More information

Open Problems in Nonlinear Conjugate Gradient Algorithms for Unconstrained Optimization

Open Problems in Nonlinear Conjugate Gradient Algorithms for Unconstrained Optimization BULLETIN of the Malaysian Mathematical Sciences Society http://math.usm.my/bulletin Bull. Malays. Math. Sci. Soc. (2) 34(2) (2011), 319 330 Open Problems in Nonlinear Conjugate Gradient Algorithms for

More information

Research Article Existence of Periodic Positive Solutions for Abstract Difference Equations

Research Article Existence of Periodic Positive Solutions for Abstract Difference Equations Discrete Dynamics in Nature and Society Volume 2011, Article ID 870164, 7 pages doi:10.1155/2011/870164 Research Article Existence of Periodic Positive Solutions for Abstract Difference Equations Shugui

More information

Spectral gradient projection method for solving nonlinear monotone equations

Spectral gradient projection method for solving nonlinear monotone equations Journal of Computational and Applied Mathematics 196 (2006) 478 484 www.elsevier.com/locate/cam Spectral gradient projection method for solving nonlinear monotone equations Li Zhang, Weijun Zhou Department

More information

Adaptive two-point stepsize gradient algorithm

Adaptive two-point stepsize gradient algorithm Numerical Algorithms 27: 377 385, 2001. 2001 Kluwer Academic Publishers. Printed in the Netherlands. Adaptive two-point stepsize gradient algorithm Yu-Hong Dai and Hongchao Zhang State Key Laboratory of

More information

Research Article A Novel Filled Function Method for Nonlinear Equations

Research Article A Novel Filled Function Method for Nonlinear Equations Applied Mathematics, Article ID 24759, 7 pages http://dx.doi.org/0.55/204/24759 Research Article A Novel Filled Function Method for Nonlinear Equations Liuyang Yuan and Qiuhua Tang 2 College of Sciences,

More information

A family of derivative-free conjugate gradient methods for large-scale nonlinear systems of equations

A family of derivative-free conjugate gradient methods for large-scale nonlinear systems of equations Journal of Computational Applied Mathematics 224 (2009) 11 19 Contents lists available at ScienceDirect Journal of Computational Applied Mathematics journal homepage: www.elsevier.com/locate/cam A family

More information

Global Convergence of Perry-Shanno Memoryless Quasi-Newton-type Method. 1 Introduction

Global Convergence of Perry-Shanno Memoryless Quasi-Newton-type Method. 1 Introduction ISSN 1749-3889 (print), 1749-3897 (online) International Journal of Nonlinear Science Vol.11(2011) No.2,pp.153-158 Global Convergence of Perry-Shanno Memoryless Quasi-Newton-type Method Yigui Ou, Jun Zhang

More information

Research Article A New Roper-Suffridge Extension Operator on a Reinhardt Domain

Research Article A New Roper-Suffridge Extension Operator on a Reinhardt Domain Abstract and Applied Analysis Volume 2011, Article ID 865496, 14 pages doi:10.1155/2011/865496 Research Article A New Roper-Suffridge Extension Operator on a Reinhardt Domain Jianfei Wang and Cailing Gao

More information

Nonmonotonic back-tracking trust region interior point algorithm for linear constrained optimization

Nonmonotonic back-tracking trust region interior point algorithm for linear constrained optimization Journal of Computational and Applied Mathematics 155 (2003) 285 305 www.elsevier.com/locate/cam Nonmonotonic bac-tracing trust region interior point algorithm for linear constrained optimization Detong

More information

The Wolfe Epsilon Steepest Descent Algorithm

The Wolfe Epsilon Steepest Descent Algorithm Applied Mathematical Sciences, Vol. 8, 204, no. 55, 273-274 HIKARI Ltd, www.m-hiari.com http://dx.doi.org/0.2988/ams.204.4375 The Wolfe Epsilon Steepest Descent Algorithm Haima Degaichia Department of

More information

Research Article The Dirichlet Problem on the Upper Half-Space

Research Article The Dirichlet Problem on the Upper Half-Space Abstract and Applied Analysis Volume 2012, Article ID 203096, 5 pages doi:10.1155/2012/203096 Research Article The Dirichlet Problem on the Upper Half-Space Jinjin Huang 1 and Lei Qiao 2 1 Department of

More information

Chapter 4. Unconstrained optimization

Chapter 4. Unconstrained optimization Chapter 4. Unconstrained optimization Version: 28-10-2012 Material: (for details see) Chapter 11 in [FKS] (pp.251-276) A reference e.g. L.11.2 refers to the corresponding Lemma in the book [FKS] PDF-file

More information

A New Nonlinear Conjugate Gradient Coefficient. for Unconstrained Optimization

A New Nonlinear Conjugate Gradient Coefficient. for Unconstrained Optimization Applie Mathematical Sciences, Vol. 9, 05, no. 37, 83-8 HIKARI Lt, www.m-hiari.com http://x.oi.or/0.988/ams.05.4994 A New Nonlinear Conjuate Graient Coefficient for Unconstraine Optimization * Mohame Hamoa,

More information

Research Article The Solution Set Characterization and Error Bound for the Extended Mixed Linear Complementarity Problem

Research Article The Solution Set Characterization and Error Bound for the Extended Mixed Linear Complementarity Problem Journal of Applied Mathematics Volume 2012, Article ID 219478, 15 pages doi:10.1155/2012/219478 Research Article The Solution Set Characterization and Error Bound for the Extended Mixed Linear Complementarity

More information

Research Article Convex Polyhedron Method to Stability of Continuous Systems with Two Additive Time-Varying Delay Components

Research Article Convex Polyhedron Method to Stability of Continuous Systems with Two Additive Time-Varying Delay Components Applied Mathematics Volume 202, Article ID 689820, 3 pages doi:0.55/202/689820 Research Article Convex Polyhedron Method to Stability of Continuous Systems with Two Additive Time-Varying Delay Components

More information

on descent spectral cg algorithm for training recurrent neural networks

on descent spectral cg algorithm for training recurrent neural networks Technical R e p o r t on descent spectral cg algorithm for training recurrent neural networs I.E. Livieris, D.G. Sotiropoulos 2 P. Pintelas,3 No. TR9-4 University of Patras Department of Mathematics GR-265

More information

Research Article An Auxiliary Function Method for Global Minimization in Integer Programming

Research Article An Auxiliary Function Method for Global Minimization in Integer Programming Mathematical Problems in Engineering Volume 2011, Article ID 402437, 13 pages doi:10.1155/2011/402437 Research Article An Auxiliary Function Method for Global Minimization in Integer Programming Hongwei

More information

Research Article New Oscillation Criteria for Second-Order Neutral Delay Differential Equations with Positive and Negative Coefficients

Research Article New Oscillation Criteria for Second-Order Neutral Delay Differential Equations with Positive and Negative Coefficients Abstract and Applied Analysis Volume 2010, Article ID 564068, 11 pages doi:10.1155/2010/564068 Research Article New Oscillation Criteria for Second-Order Neutral Delay Differential Equations with Positive

More information

Gradient method based on epsilon algorithm for large-scale nonlinearoptimization

Gradient method based on epsilon algorithm for large-scale nonlinearoptimization ISSN 1746-7233, England, UK World Journal of Modelling and Simulation Vol. 4 (2008) No. 1, pp. 64-68 Gradient method based on epsilon algorithm for large-scale nonlinearoptimization Jianliang Li, Lian

More information

Institute of Computer Science. A conjugate directions approach to improve the limited-memory BFGS method

Institute of Computer Science. A conjugate directions approach to improve the limited-memory BFGS method Institute of Computer Science Academy of Sciences of the Czech Republic A conjugate directions approach to improve the limited-memory BFGS method J. Vlček a, L. Lukšan a, b a Institute of Computer Science,

More information

230 L. HEI if ρ k is satisfactory enough, and to reduce it by a constant fraction (say, ahalf): k+1 = fi 2 k (0 <fi 2 < 1); (1.7) in the case ρ k is n

230 L. HEI if ρ k is satisfactory enough, and to reduce it by a constant fraction (say, ahalf): k+1 = fi 2 k (0 <fi 2 < 1); (1.7) in the case ρ k is n Journal of Computational Mathematics, Vol.21, No.2, 2003, 229 236. A SELF-ADAPTIVE TRUST REGION ALGORITHM Λ1) Long Hei y (Institute of Computational Mathematics and Scientific/Engineering Computing, Academy

More information

Research Article Almost Sure Central Limit Theorem of Sample Quantiles

Research Article Almost Sure Central Limit Theorem of Sample Quantiles Advances in Decision Sciences Volume 202, Article ID 67942, 7 pages doi:0.55/202/67942 Research Article Almost Sure Central Limit Theorem of Sample Quantiles Yu Miao, Shoufang Xu, 2 and Ang Peng 3 College

More information

Research Article A Note about the General Meromorphic Solutions of the Fisher Equation

Research Article A Note about the General Meromorphic Solutions of the Fisher Equation Mathematical Problems in Engineering, Article ID 793834, 4 pages http://dx.doi.org/0.55/204/793834 Research Article A Note about the General Meromorphic Solutions of the Fisher Equation Jian-ming Qi, Qiu-hui

More information

A new class of Conjugate Gradient Methods with extended Nonmonotone Line Search

A new class of Conjugate Gradient Methods with extended Nonmonotone Line Search Appl. Math. Inf. Sci. 6, No. 1S, 147S-154S (2012) 147 Applied Mathematics & Information Sciences An International Journal A new class of Conjugate Gradient Methods with extended Nonmonotone Line Search

More information

Research Article A Two-Grid Method for Finite Element Solutions of Nonlinear Parabolic Equations

Research Article A Two-Grid Method for Finite Element Solutions of Nonlinear Parabolic Equations Abstract and Applied Analysis Volume 212, Article ID 391918, 11 pages doi:1.1155/212/391918 Research Article A Two-Grid Method for Finite Element Solutions of Nonlinear Parabolic Equations Chuanjun Chen

More information

Research Article A Two-Step Matrix-Free Secant Method for Solving Large-Scale Systems of Nonlinear Equations

Research Article A Two-Step Matrix-Free Secant Method for Solving Large-Scale Systems of Nonlinear Equations Applied Mathematics Volume 2012, Article ID 348654, 9 pages doi:10.1155/2012/348654 Research Article A Two-Step Matrix-Free Secant Method for Solving Large-Scale Systems of Nonlinear Equations M. Y. Waziri,

More information

Research Article A Characterization of E-Benson Proper Efficiency via Nonlinear Scalarization in Vector Optimization

Research Article A Characterization of E-Benson Proper Efficiency via Nonlinear Scalarization in Vector Optimization Applied Mathematics, Article ID 649756, 5 pages http://dx.doi.org/10.1155/2014/649756 Research Article A Characterization of E-Benson Proper Efficiency via Nonlinear Scalarization in Vector Optimization

More information

Research Article Solving the Matrix Nearness Problem in the Maximum Norm by Applying a Projection and Contraction Method

Research Article Solving the Matrix Nearness Problem in the Maximum Norm by Applying a Projection and Contraction Method Advances in Operations Research Volume 01, Article ID 357954, 15 pages doi:10.1155/01/357954 Research Article Solving the Matrix Nearness Problem in the Maximum Norm by Applying a Projection and Contraction

More information

January 29, Non-linear conjugate gradient method(s): Fletcher Reeves Polak Ribière January 29, 2014 Hestenes Stiefel 1 / 13

January 29, Non-linear conjugate gradient method(s): Fletcher Reeves Polak Ribière January 29, 2014 Hestenes Stiefel 1 / 13 Non-linear conjugate gradient method(s): Fletcher Reeves Polak Ribière Hestenes Stiefel January 29, 2014 Non-linear conjugate gradient method(s): Fletcher Reeves Polak Ribière January 29, 2014 Hestenes

More information

NUMERICAL COMPARISON OF LINE SEARCH CRITERIA IN NONLINEAR CONJUGATE GRADIENT ALGORITHMS

NUMERICAL COMPARISON OF LINE SEARCH CRITERIA IN NONLINEAR CONJUGATE GRADIENT ALGORITHMS NUMERICAL COMPARISON OF LINE SEARCH CRITERIA IN NONLINEAR CONJUGATE GRADIENT ALGORITHMS Adeleke O. J. Department of Computer and Information Science/Mathematics Covenat University, Ota. Nigeria. Aderemi

More information

GENERALIZED second-order cone complementarity

GENERALIZED second-order cone complementarity Stochastic Generalized Complementarity Problems in Second-Order Cone: Box-Constrained Minimization Reformulation and Solving Methods Mei-Ju Luo and Yan Zhang Abstract In this paper, we reformulate the

More information

Preconditioned conjugate gradient algorithms with column scaling

Preconditioned conjugate gradient algorithms with column scaling Proceedings of the 47th IEEE Conference on Decision and Control Cancun, Mexico, Dec. 9-11, 28 Preconditioned conjugate gradient algorithms with column scaling R. Pytla Institute of Automatic Control and

More information

Research Article Modified Halfspace-Relaxation Projection Methods for Solving the Split Feasibility Problem

Research Article Modified Halfspace-Relaxation Projection Methods for Solving the Split Feasibility Problem Advances in Operations Research Volume 01, Article ID 483479, 17 pages doi:10.1155/01/483479 Research Article Modified Halfspace-Relaxation Projection Methods for Solving the Split Feasibility Problem

More information

An Alternative Three-Term Conjugate Gradient Algorithm for Systems of Nonlinear Equations

An Alternative Three-Term Conjugate Gradient Algorithm for Systems of Nonlinear Equations International Journal of Mathematical Modelling & Computations Vol. 07, No. 02, Spring 2017, 145-157 An Alternative Three-Term Conjugate Gradient Algorithm for Systems of Nonlinear Equations L. Muhammad

More information

On Descent Spectral CG algorithms for Training Recurrent Neural Networks

On Descent Spectral CG algorithms for Training Recurrent Neural Networks 29 3th Panhellenic Conference on Informatics On Descent Spectral CG algorithms for Training Recurrent Neural Networs I.E. Livieris, D.G. Sotiropoulos and P. Pintelas Abstract In this paper, we evaluate

More information

New Accelerated Conjugate Gradient Algorithms for Unconstrained Optimization

New Accelerated Conjugate Gradient Algorithms for Unconstrained Optimization ew Accelerated Conjugate Gradient Algorithms for Unconstrained Optimization eculai Andrei Research Institute for Informatics, Center for Advanced Modeling and Optimization, 8-0, Averescu Avenue, Bucharest,

More information

Research Article Strong Convergence of Parallel Iterative Algorithm with Mean Errors for Two Finite Families of Ćirić Quasi-Contractive Operators

Research Article Strong Convergence of Parallel Iterative Algorithm with Mean Errors for Two Finite Families of Ćirić Quasi-Contractive Operators Abstract and Applied Analysis Volume 01, Article ID 66547, 10 pages doi:10.1155/01/66547 Research Article Strong Convergence of Parallel Iterative Algorithm with Mean Errors for Two Finite Families of

More information

Research Article Algorithms for a System of General Variational Inequalities in Banach Spaces

Research Article Algorithms for a System of General Variational Inequalities in Banach Spaces Journal of Applied Mathematics Volume 2012, Article ID 580158, 18 pages doi:10.1155/2012/580158 Research Article Algorithms for a System of General Variational Inequalities in Banach Spaces Jin-Hua Zhu,

More information

Research Article Sharp Bounds by the Generalized Logarithmic Mean for the Geometric Weighted Mean of the Geometric and Harmonic Means

Research Article Sharp Bounds by the Generalized Logarithmic Mean for the Geometric Weighted Mean of the Geometric and Harmonic Means Applied Mathematics Volume 2012, Article ID 480689, 8 pages doi:10.1155/2012/480689 Research Article Sharp Bounds by the Generalized Logarithmic Mean for the Geometric Weighted Mean of the Geometric and

More information

Research Article A New Fractional Integral Inequality with Singularity and Its Application

Research Article A New Fractional Integral Inequality with Singularity and Its Application Abstract and Applied Analysis Volume 212, Article ID 93798, 12 pages doi:1.1155/212/93798 Research Article A New Fractional Integral Inequality with Singularity and Its Application Qiong-Xiang Kong 1 and

More information

Global convergence of a regularized factorized quasi-newton method for nonlinear least squares problems

Global convergence of a regularized factorized quasi-newton method for nonlinear least squares problems Volume 29, N. 2, pp. 195 214, 2010 Copyright 2010 SBMAC ISSN 0101-8205 www.scielo.br/cam Global convergence of a regularized factorized quasi-newton method for nonlinear least squares problems WEIJUN ZHOU

More information

A new ane scaling interior point algorithm for nonlinear optimization subject to linear equality and inequality constraints

A new ane scaling interior point algorithm for nonlinear optimization subject to linear equality and inequality constraints Journal of Computational and Applied Mathematics 161 (003) 1 5 www.elsevier.com/locate/cam A new ane scaling interior point algorithm for nonlinear optimization subject to linear equality and inequality

More information

Research Article Some Generalizations of Fixed Point Results for Multivalued Contraction Mappings

Research Article Some Generalizations of Fixed Point Results for Multivalued Contraction Mappings International Scholarly Research Network ISRN Mathematical Analysis Volume 2011, Article ID 924396, 13 pages doi:10.5402/2011/924396 Research Article Some Generalizations of Fixed Point Results for Multivalued

More information

A COMBINED CLASS OF SELF-SCALING AND MODIFIED QUASI-NEWTON METHODS

A COMBINED CLASS OF SELF-SCALING AND MODIFIED QUASI-NEWTON METHODS A COMBINED CLASS OF SELF-SCALING AND MODIFIED QUASI-NEWTON METHODS MEHIDDIN AL-BAALI AND HUMAID KHALFAN Abstract. Techniques for obtaining safely positive definite Hessian approximations with selfscaling

More information

Research Article Extended Precise Large Deviations of Random Sums in the Presence of END Structure and Consistent Variation

Research Article Extended Precise Large Deviations of Random Sums in the Presence of END Structure and Consistent Variation Applied Mathematics Volume 2012, Article ID 436531, 12 pages doi:10.1155/2012/436531 Research Article Extended Precise Large Deviations of Random Sums in the Presence of END Structure and Consistent Variation

More information

Research Article Cyclic Iterative Method for Strictly Pseudononspreading in Hilbert Space

Research Article Cyclic Iterative Method for Strictly Pseudononspreading in Hilbert Space Journal of Applied Mathematics Volume 2012, Article ID 435676, 15 pages doi:10.1155/2012/435676 Research Article Cyclic Iterative Method for Strictly Pseudononspreading in Hilbert Space Bin-Chao Deng,

More information

Research Article A Note on Kantorovich Inequality for Hermite Matrices

Research Article A Note on Kantorovich Inequality for Hermite Matrices Hindawi Publishing Corporation Journal of Inequalities and Applications Volume 0, Article ID 5767, 6 pages doi:0.55/0/5767 Research Article A Note on Kantorovich Inequality for Hermite Matrices Zhibing

More information

Research Article A Note on Optimality Conditions for DC Programs Involving Composite Functions

Research Article A Note on Optimality Conditions for DC Programs Involving Composite Functions Abstract and Applied Analysis, Article ID 203467, 6 pages http://dx.doi.org/10.1155/2014/203467 Research Article A Note on Optimality Conditions for DC Programs Involving Composite Functions Xiang-Kai

More information

Research Article Convergence Theorems for Infinite Family of Multivalued Quasi-Nonexpansive Mappings in Uniformly Convex Banach Spaces

Research Article Convergence Theorems for Infinite Family of Multivalued Quasi-Nonexpansive Mappings in Uniformly Convex Banach Spaces Abstract and Applied Analysis Volume 2012, Article ID 435790, 6 pages doi:10.1155/2012/435790 Research Article Convergence Theorems for Infinite Family of Multivalued Quasi-Nonexpansive Mappings in Uniformly

More information

Research Article Existence for Elliptic Equation Involving Decaying Cylindrical Potentials with Subcritical and Critical Exponent

Research Article Existence for Elliptic Equation Involving Decaying Cylindrical Potentials with Subcritical and Critical Exponent International Differential Equations Volume 2015, Article ID 494907, 4 pages http://dx.doi.org/10.1155/2015/494907 Research Article Existence for Elliptic Equation Involving Decaying Cylindrical Potentials

More information

Research Article A Nice Separation of Some Seiffert-Type Means by Power Means

Research Article A Nice Separation of Some Seiffert-Type Means by Power Means International Mathematics and Mathematical Sciences Volume 2012, Article ID 40692, 6 pages doi:10.1155/2012/40692 Research Article A Nice Separation of Some Seiffert-Type Means by Power Means Iulia Costin
