On the Weak Convergence of the Extragradient Method for Solving Pseudo-Monotone Variational Inequalities


J Optim Theory Appl (2018) 176:399–409
https://doi.org/10.1007/s10957-017-1214-0

Phan Tu Vuong (vuong.phan@tuwien.ac.at)
Institute of Statistics and Mathematical Methods in Economics, Vienna University of Technology, Wiedner Hauptstraße 8, 1040 Vienna, Austria

Received: 5 July 2017 / Accepted: 26 December 2017 / Published online: 8 January 2018
© The Author(s) 2018. This article is an open access publication

Abstract In infinite-dimensional Hilbert spaces, we prove that the iterative sequence generated by the extragradient method for solving pseudo-monotone variational inequalities converges weakly to a solution. A class of pseudo-monotone variational inequalities is considered to illustrate the convergence behavior. The result obtained in this note extends some recent results in the literature; in particular, it gives a positive answer to a question raised in Khanh (Acta Math. Vietnam. 41:251–263, 2016).

Keywords Variational inequality · Extragradient method · Pseudo-monotonicity · Weak convergence

Mathematics Subject Classification 47J20 · 49J40 · 49M30

1 Introduction

Variational inequalities serve as a powerful mathematical model which unifies important concepts in applied mathematics, such as systems of nonlinear equations, necessary optimality conditions for optimization problems, complementarity problems, obstacle problems, and network equilibrium problems [1]. Consequently, this model has numerous applications in engineering, mathematical programming, network economics, transportation research, game theory, and regional sciences [2].

Several techniques for solving a variational inequality (VI) in finite-dimensional spaces have been suggested, such as the projection method, the extragradient

method, the Tikhonov regularization method, and the proximal point method; see, e.g., [1]. Typically, to guarantee convergence to a solution of the VI, some kind of monotonicity of the assigned mapping is required. In the case of gradient maps, generalized monotonicity characterizes generalized convexity of the underlying function [3]. The well-known gradient projection method can be successfully applied for solving strongly monotone VIs and inverse strongly monotone VIs [1,4]. In practice, these assumptions are rather strong. The Tikhonov regularization and proximal point methods can serve as efficient solution methods for monotone VIs. For pseudo-monotone VIs, however, it may happen that every regularized problem generated by the Tikhonov regularization (resp. every problem generated by the proximal point method) fails to be pseudo-monotone [5]. This implies that the regularization procedures performed in the Tikhonov regularization and proximal point methods may completely destroy the pseudo-monotone structure of the original problem and can make the auxiliary problems more difficult to solve than the original one.

To overcome this drawback, Korpelevich introduced the extragradient method [6]. In the original paper, this method was applied to monotone VIs in finite-dimensional spaces. It is a known fact [1, Theorem 12.2.11] that the extragradient method can be successfully applied for solving pseudo-monotone VIs. Because of its importance, extragradient-type methods have been widely studied and generalized [1]. Recently, the extragradient method has been considered for solving VIs in infinite-dimensional Hilbert spaces [7–9]. Provided that the VI has solutions and the assigned mapping is monotone and Lipschitz continuous, it is proved that the iterative sequence generated by the extragradient method converges weakly to a solution.
However, as stated in [9, Section 6, Q2], it is not clear whether the weak convergence is still available when monotonicity is replaced by pseudo-monotonicity. The aim of this paper is to give a positive answer to this question. As a consequence, the scope of the related optimization problems can be enlarged from convex optimization problems to pseudoconvex optimization problems. This underlines the advantage of the extragradient method in comparison with the other solution methods.

The paper is organized as follows: We first recall some basic definitions and results in Sect. 2. The weak convergence of the extragradient method for solving pseudo-monotone, Lipschitz continuous VIs is discussed in Sect. 3. An example is presented in Sect. 4 to illustrate the behavior of the extragradient method. We conclude the note with some final remarks in Sect. 5.

2 Preliminaries

Let H be a real Hilbert space with inner product ⟨·,·⟩ and induced norm ‖·‖, and let K be a nonempty, closed and convex subset of H. For each u ∈ H, there exists a unique point in K (see [2, p. 8]), denoted by P_K(u), such that

‖u − P_K(u)‖ ≤ ‖u − v‖ ∀v ∈ K.

It is well known [2,10] that the projection operator can be characterized by

⟨u − P_K(u), v − P_K(u)⟩ ≤ 0 ∀v ∈ K. (1)

Let F : H → H be a mapping. The variational inequality VI(K, F) defined by K and F consists in finding a point u* ∈ K such that

⟨F(u*), u − u*⟩ ≥ 0 ∀u ∈ K. (2)

The solution set of VI(K, F) is abbreviated to Sol(K, F).

Remark 2.1 u* ∈ Sol(K, F) if and only if u* = P_K(u* − λF(u*)) for all λ > 0.

We recall some concepts which are useful in the sequel.

Definition 2.1 The mapping F : H → H is said to be

(a) pseudo-monotone if
⟨F(u), v − u⟩ ≥ 0 ⟹ ⟨F(v), v − u⟩ ≥ 0 ∀u, v ∈ H;

(b) monotone if
⟨F(u) − F(v), u − v⟩ ≥ 0 ∀u, v ∈ H;

(c) Lipschitz continuous if there exists L > 0 such that
‖F(u) − F(v)‖ ≤ L‖u − v‖ ∀u, v ∈ H;

(d) sequentially weakly continuous if, for each sequence {u_n}, we have: {u_n} converges weakly to u implies that {F(u_n)} converges weakly to F(u).

Remark 2.2 It is clear that monotonicity implies pseudo-monotonicity. However, the converse does not hold. For example, the mapping F : ]0, +∞[ → ]0, +∞[ defined by F(u) = a/(a + u) with a > 0 is pseudo-monotone but not monotone.

We recall a result which is called the Minty lemma [11, Lemma 2.1].

Proposition 2.1 Consider the problem VI(K, F) with K being a nonempty, closed, convex subset of a real Hilbert space H and F : K → H being pseudo-monotone and continuous. Then, u* is a solution of VI(K, F) if and only if

⟨F(u), u − u*⟩ ≥ 0 ∀u ∈ K.

3 Weak Convergence of the Extragradient Method

In this section, we consider the problem VI(K, F) with K being nonempty, closed, convex and F being pseudo-monotone on H and Lipschitz continuous with modulus L > 0 on K. We also assume that the solution set Sol(K, F) is nonempty.
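The gap between monotonicity and pseudo-monotonicity in Definition 2.1 can be probed numerically. The following sketch (Python with NumPy; the grid and the choice a = 1 are illustrative and not taken from the paper) tests both properties for the map F(u) = a/(a + u) of Remark 2.2 on a grid of points in ]0, +∞[:

```python
import numpy as np

def F(u, a=1.0):
    # The scalar map from Remark 2.2: pseudo-monotone but not monotone on ]0, +inf[.
    return a / (a + u)

grid = np.linspace(0.1, 5.0, 50)

monotone = True
pseudo_monotone = True
for u in grid:
    for v in grid:
        # Monotonicity: <F(u) - F(v), u - v> >= 0 for all u, v.
        if (F(u) - F(v)) * (u - v) < -1e-12:
            monotone = False
        # Pseudo-monotonicity: <F(u), v - u> >= 0 implies <F(v), v - u> >= 0.
        if F(u) * (v - u) >= 0 and F(v) * (v - u) < -1e-12:
            pseudo_monotone = False

print(monotone)         # False: F is decreasing, so monotonicity fails
print(pseudo_monotone)  # True: F > 0, so the implication always holds
```

Since F is positive, ⟨F(u), v − u⟩ ≥ 0 forces v ≥ u, and then ⟨F(v), v − u⟩ ≥ 0 automatically; since F is decreasing, the monotonicity test fails for any u ≠ v.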

402 J Optim Theory Appl 208) 76:399 409 Extragradient Algorithm Data: u 0 K and {λ k } [a, b], where 0 < a b < /L. Step 0: Set k = 0. Step : If u k = P K u k λ k Fu k )) then stop. Step 2: Otherwise, set Replace k by k + ; go to Step. ū k = P K u k λ k Fu k )), u k+ = P K u k λ k Fū k )). Remark 3. If at some iteration we have Fu k ) = 0, then u k = P K u k λ k Fu k )) and the Extragradient Algorithm terminates at step k with a solution u k.fromnow on, we assume that Fu k ) = 0 for all k and the Extragradient Algorithm generates an infinite sequence. We recall an important property of the iterative sequence {u k } generated by the Extragradient Algorithm; see, e.g., [6,9]. Proposition 3. Assume that F is pseudo-monotone and L-Lipschitz continuous on K and SolK, F) is nonempty. Let u be a solution of VIK, F). Then, for every k N, we have u k+ u 2 u k u 2 λ 2 k L2) u k ū k 2. 3) We are now in the position to establish the main result of this note. The following theorem states that the sequence {u k } converges weakly to a solution of VIK, F). This result extends the Extragradient Algorithm for solving monotone VIs [7,9] to pseudo-monotone VIs. Theorem 3. Assume that F is pseudo-monotone on H, sequentially weakly continuous and L-Lipschitz continuous on K. Assume also that SolK, F) is nonempty. Then, the sequence {u k } generated by the Extragradient Algorithm converges weakly to a solution of VIK, F). Proof Since 0 < a λ k b < /L, it holds that 0 < b 2 L 2 λ 2 k L2 a 2 L 2 <. Therefore, from Proposition 3., the sequence {u k } is bounded and lim k uk ū k =0. Since F is Lipschitz continuous on K we have Fu k ) Fū k ) L u k ū k.

Hence

lim_{k→∞} ‖F(u^k) − F(ū^k)‖ = 0.

As {u^k} is a bounded sequence in a Hilbert space, there exists a subsequence {u^{k_i}} of {u^k} converging weakly to an element û ∈ K. Since lim_{k→∞} ‖u^k − ū^k‖ = 0, {ū^{k_i}} also converges weakly to û. We will prove that û ∈ Sol(K, F). Indeed, since ū^k = P_K(u^k − λ_k F(u^k)), by the projection characterization (1), it holds that

⟨u^{k_i} − λ_{k_i} F(u^{k_i}) − ū^{k_i}, v − ū^{k_i}⟩ ≤ 0 ∀v ∈ K,

or equivalently,

(1/λ_{k_i}) ⟨u^{k_i} − ū^{k_i}, v − ū^{k_i}⟩ ≤ ⟨F(u^{k_i}), v − ū^{k_i}⟩ ∀v ∈ K.

This implies that

(1/λ_{k_i}) ⟨u^{k_i} − ū^{k_i}, v − ū^{k_i}⟩ + ⟨F(u^{k_i}), ū^{k_i} − u^{k_i}⟩ ≤ ⟨F(u^{k_i}), v − u^{k_i}⟩ ∀v ∈ K. (4)

Fixing v ∈ K and letting i → +∞ in the last inequality, remembering that lim_{k→∞} ‖u^k − ū^k‖ = 0 and λ_k ∈ [a, b] ⊂ ]0, 1/L[ for all k, we have

lim inf_{i→∞} ⟨F(u^{k_i}), v − u^{k_i}⟩ ≥ 0. (5)

Now we choose a sequence {ε_i} of positive numbers, decreasing and tending to 0. For each ε_i, we denote by n_i the smallest positive integer such that

⟨F(u^{k_j}), v − u^{k_j}⟩ + ε_i ≥ 0 ∀j ≥ n_i, (6)

where the existence of n_i follows from (5). Since {ε_i} is decreasing, it is easy to see that the sequence {n_i} is increasing. Furthermore, for each i, F(u^{k_{n_i}}) ≠ 0 and, setting

v^{k_{n_i}} = F(u^{k_{n_i}}) / ‖F(u^{k_{n_i}})‖²,

we have ⟨F(u^{k_{n_i}}), v^{k_{n_i}}⟩ = 1 for each i. Now we can deduce from (6) that, for each i,

⟨F(u^{k_{n_i}}), v + ε_i v^{k_{n_i}} − u^{k_{n_i}}⟩ ≥ 0,

and, since F is pseudo-monotone, that

⟨F(v + ε_i v^{k_{n_i}}), v + ε_i v^{k_{n_i}} − u^{k_{n_i}}⟩ ≥ 0. (7)

On the other hand, {u^{k_i}} converges weakly to û as i → ∞. Since F is sequentially weakly continuous on K, {F(u^{k_i})} converges weakly to F(û). We can suppose that F(û) ≠ 0 (otherwise, û is a solution). Since the norm mapping is sequentially weakly lower semicontinuous, we have

‖F(û)‖ ≤ lim inf_{i→∞} ‖F(u^{k_i})‖.

Since {u^{k_{n_i}}} ⊂ {u^{k_i}} and ε_i → 0 as i → ∞, we obtain

0 ≤ lim sup_{i→∞} ‖ε_i v^{k_{n_i}}‖ = lim sup_{i→∞} (ε_i / ‖F(u^{k_{n_i}})‖) ≤ 0 / ‖F(û)‖ = 0.

Hence, taking the limit as i → ∞ in (7), we obtain

⟨F(v), v − û⟩ ≥ 0.

It follows from Proposition 2.1 that û ∈ Sol(K, F).

Finally, we prove that the sequence {u^k} converges weakly to û. To this end, it is sufficient to show that {u^k} cannot have two distinct weak sequential cluster points in Sol(K, F). Let {u^{k_j}} be another subsequence of {u^k} converging weakly to ū. We have to prove that û = ū. As has been proven above, ū ∈ Sol(K, F). By Proposition 3.1, the sequences {‖u^k − û‖} and {‖u^k − ū‖} are monotonically decreasing and therefore convergent. Since, for all k ∈ ℕ,

2⟨u^k, ū − û⟩ = ‖u^k − û‖² − ‖u^k − ū‖² + ‖ū‖² − ‖û‖²,

we deduce that the sequence {⟨u^k, ū − û⟩} also converges. Setting

l = lim_{k→∞} ⟨u^k, ū − û⟩,

and passing to the limit along {u^{k_i}} and {u^{k_j}} yields, respectively,

l = ⟨û, ū − û⟩ = ⟨ū, ū − û⟩.

This implies that ‖û − ū‖² = 0 and therefore û = ū. □

Remark 3.2 The author of [13] studied the extragradient method for solving strongly pseudo-monotone variational inequalities with the following choice of stepsizes:

J Optim Theory Appl 208) 76:399 409 405 λ k =, k=0 lim λ k = 0. k It was proved that the iterative sequence generated by the extragradient method converges strongly to a solution. By considering an example [3, Example 4.2], the author stated that the condition lim k λ k = 0 cannot be omitted. We have shown that if this condition is violated then the strong convergence reduces to the weak convergence. It is also worth stressing that, the basic extragradient method can serve as an adequate solution method for solving pseudo-monotone VIs, which was not guaranteed by the method studied in [3]. Remark 3.3 If we replace the Lipschitz continuity of F on K by its Lipschitz continuity on the whole space H, then the conclusion in Theorem 3. still holds for the subgradient extragradient method [7]. Indeed, a careful reviewing shows that Lemma 5.2 in [7]is also guaranteed for pseudo-monotone mappings instead of monotone ones see also [2]). The conclusion can be obtained by using a similar technique as in Theorem 3.. Remark 3.4 When the function F is monotone, it is not necessary to impose the sequential weak continuity on F. Indeed, in that case, it follows from 4) and the monotonicity of F that u k i ū k i,v ū k i + Fu k i ), ū k i u k i Fu k i ), v u k i λ ki Fv), v u k i v K. Letting i + in the last inequality, remembering that lim k u k ū k =0and λ k [a, b] ]0, /L[ for all k,wehave Fv), v û 0 v K. 4 An Illustrative Example In this section, we present an example to illustrate the main results obtained in Sect. 3. Another example can be found in [9, Example 5.2], where the mapping F is monotone and Lipchitz continuous. The following example is considered in [3], where the mapping F is pseudo-monotone but not monotone. Let H = l 2, the real Hilbert space, whose elements are the square-summable sequences of real numbers, i.e., H ={u = u, u 2,...,u i,...): i= u i 2 < + }. 
The inner product and the norm on H are given by

⟨u, v⟩ = ∑_{i=1}^{∞} u_i v_i  and  ‖u‖ = √⟨u, u⟩

for any u = (u_1, u_2, ..., u_i, ...), v = (v_1, v_2, ..., v_i, ...) ∈ H.

Let α, β ∈ ℝ be such that β > α > β/2 > 0. Put

K_α = {u ∈ H : ‖u‖ ≤ α},  F_β(u) = (β − ‖u‖)u,

where α and β are parameters. It is easy to verify that F_β is sequentially weakly continuous on H and that Sol(K_α, F_β) = {0}.

Note that F_β is Lipschitz continuous and pseudo-monotone on K_α. Indeed, for any u, v ∈ K_α,

‖F_β(u) − F_β(v)‖ = ‖(β − ‖u‖)u − (β − ‖v‖)v‖
= ‖β(u − v) − ‖u‖(u − v) − (‖u‖ − ‖v‖)v‖
≤ β‖u − v‖ + ‖u‖‖u − v‖ + |‖u‖ − ‖v‖|‖v‖
≤ β‖u − v‖ + α‖u − v‖ + ‖u − v‖α
= (β + 2α)‖u − v‖.

Hence, F_β is Lipschitz continuous on K_α with the Lipschitz constant L := β + 2α.

Let u, v ∈ K_α be such that ⟨F_β(u), v − u⟩ ≥ 0. Then (β − ‖u‖)⟨u, v − u⟩ ≥ 0. Since ‖u‖ ≤ α < β, we have ⟨u, v − u⟩ ≥ 0. Hence,

⟨F_β(v), v − u⟩ = (β − ‖v‖)⟨v, v − u⟩
≥ (β − ‖v‖)(⟨v, v − u⟩ − ⟨u, v − u⟩)
= (β − ‖v‖)‖v − u‖²
≥ (β − α)‖u − v‖² ≥ 0.

We have thus shown that F_β is pseudo-monotone on K_α. It is worth stressing that F_β is not monotone on K_α. To see this, it suffices to choose

u = (β/2, 0, ..., 0, ...),  v = (α, 0, ..., 0, ...) ∈ K_α

and note that

⟨F_β(u) − F_β(v), u − v⟩ = (β/2 − α)³ < 0.

We now apply the Extragradient Algorithm to the variational inequality VI(K_α, F_β). We choose λ_k = λ ∈ ]0, 1/L[ = ]0, 1/(β + 2α)[ and take any u^0 ∈ K_α as the starting point. The projection onto K_α is explicitly calculated as

P_{K_α}(u) = u, if ‖u‖ ≤ α;  P_{K_α}(u) = αu/‖u‖, otherwise. (8)

Since, for all k,

0 < λ < 1/(β + 2α) < 1/(β − ‖u^k‖),

then we have

‖u^k − λF_β(u^k)‖ = (1 − λ(β − ‖u^k‖))‖u^k‖ ≤ ‖u^k‖ ≤ α.

Therefore,

ū^k = P_{K_α}(u^k − λ_k F(u^k)) = (1 − λ(β − ‖u^k‖))u^k.

Similarly, we can deduce that ‖u^k − λ_k F_β(ū^k)‖ ≤ α. Indeed, we have

u^k − λ_k F_β(ū^k) = (1 − λ(β − ‖ū^k‖)(1 − λ(β − ‖u^k‖)))u^k.

Since

1 − λ(β − ‖ū^k‖)(1 − λ(β − ‖u^k‖)) = 1 − λ(β − ‖ū^k‖) + λ²(β − ‖ū^k‖)(β − ‖u^k‖) ≥ 1 − λ(β − ‖ū^k‖) > 0, (9)

we can write

‖u^k − λ_k F_β(ū^k)‖ = [1 − λ(β − ‖ū^k‖)(1 − λ(β − ‖u^k‖))]‖u^k‖ ≤ ‖u^k‖ ≤ α.

This and (9) imply that

u^{k+1} = P_{K_α}(u^k − λ_k F_β(ū^k)) = u^k − λ(β − ‖ū^k‖)ū^k = [1 − λ(β − ‖ū^k‖)(1 − λ(β − ‖u^k‖))]u^k. (10)

We have

1 − λ(β − ‖ū^k‖)(1 − λ(β − ‖u^k‖)) = 1 − λ(β − ‖ū^k‖)((1 − λβ) + λ‖u^k‖)
≤ 1 − λ(β − ‖ū^k‖)(1 − λβ)
= 1 − λ(β − (1 − λβ)‖u^k‖ − λ‖u^k‖²)(1 − λβ). (11)

Considering the function f(x) := β − (1 − λβ)x − λx² with x ∈ [0, α], it is easy to see that f is decreasing on [0, α]. Therefore, the minimal value of f is β − (1 − λβ)α − λα², attained at x = α. Combining this with (11) and (10) yields

‖u^{k+1}‖ ≤ (1 − λ(β − (1 − λβ)α − λα²)(1 − λβ))‖u^k‖
= (1 − (λβ − λα + λ²αβ − λ²α²)(1 − λβ))‖u^k‖
= [1 − (β − α)λ(1 + αλ)(1 − λβ)]‖u^k‖. (12)

We claim that

q := (β − α)λ(1 + αλ)(1 − λβ) ∈ ]0, 1[.

Indeed, since α < β and 0 < λ < 1/(β + 2α), we have q > 0. To verify that q < 1, it is sufficient to show that (β − α)λ(1 + αλ) < 1. Since β/2 < α < β and 0 < λ < 1/(β + 2α), we have

(β − α)λ(1 + αλ) < ((β − α)/(β + 2α))(1 + α/(β + 2α)) < ((β/2)/(β + β))(1 + β/(β + β)) = 3/8.

This implies that q ∈ ]0, 1[, and we can deduce from (12) that

‖u^k‖ ≤ (1 − q)^k ‖u^0‖

for all k ∈ ℕ. This means that the sequence {u^k} converges strongly to 0, the unique solution of VI(K_α, F_β).

5 Conclusions

We have considered the extragradient method for solving infinite-dimensional variational inequalities with a pseudo-monotone and Lipschitz continuous mapping. We have shown that the iterative sequence generated by the extragradient method converges weakly to a solution of the considered variational inequality, provided that such a solution exists. The strong convergence of the iterative sequence is still an open question and could be an interesting topic for future research.

Acknowledgements Open access funding provided by Austrian Science Fund (FWF). The author wishes to express his gratitude to the Editor-in-Chief, the two anonymous referees and Prof. Jean Jacques Strodiot for their detailed comments and useful suggestions, which allowed the presentation of this paper to be improved significantly. This research is supported by the Austrian Science Fund (FWF) under Grant No. P26640-N25.
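The example of Sect. 4 can also be reproduced numerically. The sketch below (Python with NumPy; the dimension n = 5, the parameter values β = 2, α = 1.2 satisfying β > α > β/2 > 0, the stepsize λ = 0.9/L, and the random starting point are illustrative choices, not taken from the paper) runs the Extragradient Algorithm of Sect. 3 on a finite-dimensional copy of VI(K_α, F_β) and observes the strong convergence ‖u^k‖ → 0:

```python
import numpy as np

beta, alpha = 2.0, 1.2          # requires beta > alpha > beta/2 > 0
L = beta + 2 * alpha            # Lipschitz constant of F_beta on K_alpha
lam = 0.9 / L                   # fixed stepsize lambda in ]0, 1/L[

def F(u):
    # F_beta(u) = (beta - ||u||) u : pseudo-monotone, not monotone, on K_alpha
    return (beta - np.linalg.norm(u)) * u

def proj(u):
    # Projection onto the ball K_alpha = {u : ||u|| <= alpha}, formula (8)
    nu = np.linalg.norm(u)
    return u if nu <= alpha else alpha * u / nu

rng = np.random.default_rng(0)
u = proj(rng.standard_normal(5))   # arbitrary starting point u^0 in K_alpha

norms = [np.linalg.norm(u)]
for _ in range(200):
    u_bar = proj(u - lam * F(u))   # extragradient step: u_bar^k
    u = proj(u - lam * F(u_bar))   # extragradient step: u^{k+1}
    norms.append(np.linalg.norm(u))

# ||u^k|| decreases monotonically toward 0, the unique solution.
print(norms[0], norms[-1])
```

With these parameters the contraction factor q = (β − α)λ(1 + αλ)(1 − λβ) from Sect. 4 is roughly 0.12, so the iterates decay at least geometrically, consistent with the bound ‖u^k‖ ≤ (1 − q)^k ‖u^0‖.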

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

References

1. Facchinei, F., Pang, J.-S.: Finite-Dimensional Variational Inequalities and Complementarity Problems, Vols. I and II. Springer, New York (2003)
2. Kinderlehrer, D., Stampacchia, G.: An Introduction to Variational Inequalities and Their Applications. Academic Press, New York (1980)
3. Karamardian, S., Schaible, S.: Seven kinds of monotone maps. J. Optim. Theory Appl. 66, 37–46 (1990)
4. Zhu, D.L., Marcotte, P.: Co-coercivity and its role in the convergence of iterative schemes for solving variational inequalities. SIAM J. Optim. 6, 714–726 (1996)
5. Tam, N.N., Yao, J.-C., Yen, N.D.: Solution methods for pseudomonotone variational inequalities. J. Optim. Theory Appl. 138, 253–273 (2008)
6. Korpelevich, G.M.: The extragradient method for finding saddle points and other problems. Ekonom. i Mat. Metody 12, 747–756 (1976)
7. Censor, Y., Gibali, A., Reich, S.: The subgradient extragradient method for solving variational inequalities in Hilbert space. J. Optim. Theory Appl. 148, 318–335 (2011)
8. Censor, Y., Gibali, A., Reich, S.: Strong convergence of subgradient extragradient methods for the variational inequality problem in Hilbert space. Optim. Methods Softw. 26, 827–845 (2011)
9. Khanh, P.D.: A modified extragradient method for infinite-dimensional variational inequalities. Acta Math. Vietnam. 41, 251–263 (2016)
10. Goebel, K., Reich, S.: Uniform Convexity, Hyperbolic Geometry, and Nonexpansive Mappings. Marcel Dekker, New York (1984)
11. Cottle, R.W., Yao, J.C.: Pseudo-monotone complementarity problems in Hilbert space. J. Optim. Theory Appl. 75, 281–295 (1992)
12. Censor, Y., Gibali, A., Reich, S.: Extensions of Korpelevich's extragradient method for the variational inequality problem in Euclidean space. Optimization 61, 1119–1132 (2012)
13. Khanh, P.D.: A new extragradient method for strongly pseudomonotone variational inequalities. Numer. Funct. Anal. Optim. 37, 1131–1143 (2016)