A New Modified Gradient-Projection Algorithm for Solution of Constrained Convex Minimization Problem in Hilbert Spaces


Cyril Dennis Enyi and Mukiawa Edwin Soh

Abstract: In this paper, we present a new iterative scheme for finding a minimizer of a constrained convex minimization problem. We also prove that the sequence generated by the new scheme converges strongly to a solution of the constrained convex minimization problem in real Hilbert spaces.

Index Terms: Averaged mappings, constrained convex minimization problem, fixed point, gradient-projection algorithm, nonexpansive mappings.

Manuscript received January 01, 2014; revised March 28, 2014. This work was supported in part by King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia. Cyril Dennis Enyi is with King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia (corresponding author: +966550782390; e-mail: cenyi@kfupm.edu.sa). Mukiawa Edwin Soh is with King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia (e-mail: sohedwin2013@gmail.com).

I. INTRODUCTION

Let $H$ be a real Hilbert space with inner product $\langle\cdot,\cdot\rangle$ and norm $\|\cdot\|$.

Definition 1.1. Let $C$ be a nonempty, closed and convex subset of a real Hilbert space $H$. A map $T\colon C\to H$ is said to be nonexpansive if $\|Tx-Tz\|\le\|x-z\|$ for all $x,z\in C$. We denote the fixed point set of $T$ by $\mathrm{Fix}(T)$.

Definition 1.2. For any $x\in H$, let $P_C\colon H\to C$ be the map satisfying $\|x-P_Cx\|\le\|x-z\|$ for all $z\in C$; $P_C$ is called the metric projection of $H$ onto $C$. It is well known that $P_C$ is nonexpansive and satisfies
$$\langle x-y,\;P_Cx-P_Cy\rangle\ge\|P_Cx-P_Cy\|^2\quad\text{for all } x,y\in H. \tag{1}$$
Clearly, (1) is equivalent to
$$\|P_Cx-P_Cy\|^2\le\|x-y\|^2-\|(x-P_Cx)-(y-P_Cy)\|^2\quad\text{for all } x,y\in H.$$
Furthermore, $P_Cx$ is characterized by the property that $P_Cx\in C$ and
$$\langle x-P_Cx,\;z-P_Cx\rangle\le 0\quad\text{for all } z\in C. \tag{2}$$

Definition 1.3. A mapping $A$ of $C$ into $H$ is called monotone if
$$\langle Ax-Az,\;x-z\rangle\ge 0\quad\text{for all } x,z\in C.$$
$A$ is called $\alpha$-strongly monotone if there exists $\alpha>0$ such that
$$\langle Ax-Az,\;x-z\rangle\ge\alpha\|x-z\|^2\quad\text{for all } x,z\in C.$$
Also, $A$ is $\nu$-inverse strongly monotone ($\nu$-ism) if there exists $\nu>0$ such that
$$\langle Ax-Az,\;x-z\rangle\ge\nu\|Ax-Az\|^2\quad\text{for all } x,z\in C.$$

Definition 1.4. A mapping $T\colon H\to H$ is said to be firmly nonexpansive if $2T-I$ is nonexpansive, or equivalently,
$$\langle x-z,\;Tx-Tz\rangle\ge\|Tx-Tz\|^2\quad\text{for all } x,z\in H.$$
Alternatively, $T$ is firmly nonexpansive if $T$ can be expressed as $T=\tfrac12(I+S)$ for some nonexpansive $S\colon H\to H$.
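As a quick numerical illustration of the two definitions above (a sketch, not part of the original argument: $C$ is taken to be the box $[-1,1]^5$, whose metric projection is coordinatewise clipping, and the set and dimension are chosen only for the demo), the following snippet samples random pairs and checks the nonexpansiveness and firm-nonexpansiveness inequalities for $P_C$:

```python
import math
import random

random.seed(0)

# Metric projection onto the box C = [-1, 1]^5: coordinatewise clipping.
def proj_box(x):
    return [min(max(t, -1.0), 1.0) for t in x]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

# Sample random pairs and check the inequalities of Definitions 1.1 and 1.4:
#   ||P x - P z|| <= ||x - z||              (nonexpansiveness)
#   <x - z, P x - P z> >= ||P x - P z||^2   (firm nonexpansiveness)
for _ in range(1000):
    x = [random.gauss(0, 1) for _ in range(5)]
    z = [random.gauss(0, 1) for _ in range(5)]
    Px, Pz = proj_box(x), proj_box(z)
    d = [a - b for a, b in zip(Px, Pz)]     # P x - P z
    e = [a - b for a, b in zip(x, z)]       # x - z
    assert norm(d) <= norm(e) + 1e-12
    assert dot(e, d) >= dot(d, d) - 1e-12
print("both inequalities hold on all sampled pairs")
```

Convexity of $C$ is essential here: for a nonconvex set such as the unit sphere, nearby points on opposite sides of the center project to nearly antipodal points, and nonexpansiveness fails.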
As an example, the projection mappings $P_C$ are firmly nonexpansive.

Definition 1.5. A mapping $T\colon H\to H$ is said to be an averaged mapping if it can be written as an average of the identity mapping and a nonexpansive mapping; i.e.,
$$T=(1-\alpha)I+\alpha S, \tag{3}$$
where $\alpha\in(0,1)$ and $S\colon H\to H$ is nonexpansive. More precisely, when (3) holds, we say that $T$ is $\alpha$-averaged. Therefore, firmly nonexpansive mappings (e.g., projections) are $\tfrac12$-averaged mappings.

The proposition below gives some properties of averaged mappings.

Proposition 1.1 (Byrne [3], Combettes [5]). For given operators $S,T,V\colon H\to H$:
(a) If $T=(1-\alpha)S+\alpha V$ for some $\alpha\in(0,1)$, and if $S$ is averaged and $V$ is nonexpansive, then $T$ is averaged.
(b) $T$ is firmly nonexpansive if and only if the complement $I-T$ is firmly nonexpansive.
(c) If $T=(1-\alpha)S+\alpha V$ for some $\alpha\in(0,1)$, and if $S$ is firmly nonexpansive and $V$ is nonexpansive, then $T$ is averaged.

(d) The composition of finitely many averaged mappings is averaged. That is, if each of the mappings $\{T_i\}_{i=1}^N$ is averaged, then so is the composition $T_1\cdots T_N$. In particular, if $T_1$ is $\alpha_1$-averaged and $T_2$ is $\alpha_2$-averaged, where $\alpha_1,\alpha_2\in(0,1)$, then the composition $T_1T_2$ is $\alpha$-averaged, where $\alpha=\alpha_1+\alpha_2-\alpha_1\alpha_2$.

The following are immediately noticeable: (i) if $T$ is nonexpansive, then $I-T$ is monotone; (ii) the projection mapping $P_C$ is a $1$-ism.

Inverse strongly monotone (also known as co-coercive) operators have been widely used in solving practical problems in diverse fields, e.g., traffic assignment problems; see [1, 6] and the references therein. The proposition that follows gives some important relationships between averaged mappings and inverse strongly monotone operators.

Proposition 1.2 (Byrne [3], Combettes [5]).
(a) $T$ is nonexpansive if and only if the complement $I-T$ is $\tfrac12$-ism.
(b) If $T$ is $\nu$-ism, then for $\gamma>0$, $\gamma T$ is $\tfrac{\nu}{\gamma}$-ism.
(c) $T$ is averaged if and only if the complement $I-T$ is $\nu$-ism for some $\nu>\tfrac12$. Indeed, for $\alpha\in(0,1)$, $T$ is $\alpha$-averaged if and only if $I-T$ is $\tfrac{1}{2\alpha}$-ism.

Consider the following constrained convex minimization problem:
$$\text{minimize}\ \{f(x)\colon x\in C\}, \tag{4}$$
where $f\colon C\to\mathbb{R}$ is a real-valued convex function. The minimization problem (4) is said to be consistent if it has a solution. In what follows, we shall denote the solution set of problem (4) by $S$. It is well known that if $f$ is (Fréchet) differentiable, then the gradient-projection method (GPM) generates a sequence $\{x_n\}$ by the recursive formula
$$x_{n+1}=P_C\bigl(x_n-\gamma\nabla f(x_n)\bigr),\quad n\ge 0, \tag{5}$$
or, in a more general form,
$$x_{n+1}=P_C\bigl(x_n-\gamma_n\nabla f(x_n)\bigr),\quad n\ge 0, \tag{6}$$
where in both (5) and (6) the initial guess $x_0$ is arbitrary and the parameters $\gamma$ and $\gamma_n$ are positive real numbers. It is known that if $\nabla f$ is $\alpha$-strongly monotone and $L$-Lipschitzian with constants $\alpha,L>0$, then for $0<\gamma<\tfrac{2\alpha}{L^2}$ the operator
$$T=P_C(I-\gamma\nabla f) \tag{7}$$
is a contraction; thus the sequence $\{x_n\}$ defined by (5) converges in norm to the unique minimizer of (4). More generally, if the sequence $\{\gamma_n\}$ satisfies
$$0<\liminf_{n\to\infty}\gamma_n\le\limsup_{n\to\infty}\gamma_n<\frac{2\alpha}{L^2}, \tag{8}$$
then the sequence $\{x_n\}$ generated by (6) converges in norm to the unique minimizer of (4).
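To make the recursions (5) and (6) concrete, here is a minimal runnable sketch (an illustration on assumed toy data, not the algorithm of this paper): we take $f(x)=\tfrac12\|x-b\|^2$ over the box $C=[-1,1]^3$, so that $\nabla f(x)=x-b$ is $1$-strongly monotone and $1$-Lipschitz ($\alpha=L=1$) and any fixed step $\gamma\in(0,2\alpha/L^2)=(0,2)$ yields norm convergence to the unique minimizer $P_C(b)$:

```python
# Gradient-projection method (5): x_{n+1} = P_C(x_n - gamma * grad_f(x_n)),
# illustrated on f(x) = 0.5 * ||x - b||^2 over the box C = [-1, 1]^3.
# Here grad f(x) = x - b is 1-strongly monotone and 1-Lipschitz (alpha = L = 1),
# so any fixed step gamma in (0, 2*alpha/L^2) = (0, 2) converges in norm
# to the unique minimizer P_C(b).

def proj_box(x):
    return [min(max(t, -1.0), 1.0) for t in x]

def gpm(b, gamma=1.0, iters=200):
    x = [0.0] * len(b)                                # arbitrary initial guess
    for _ in range(iters):
        grad = [xi - bi for xi, bi in zip(x, b)]      # grad f(x) = x - b
        x = proj_box([xi - gamma * g for xi, g in zip(x, grad)])
    return x

b = [2.0, -0.3, 5.0]
print(gpm(b))  # -> [1.0, -0.3, 1.0], i.e. P_C(b)
```

With $\gamma=1$ the iteration lands on $P_C(b)$ in one step for this quadratic; smaller steps converge geometrically to the same point.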
However, in a situation where $\nabla f$ fails to be strongly monotone, the operator $T$ defined by (7) may fail to be a contraction; consequently, the sequence $\{x_n\}$ generated by (6) may fail to converge in norm (see [12], Sect. 4). If $\nabla f$ is Lipschitzian, then the schemes (5) and (6) can still converge weakly under certain assumptions.

The GPM for finding approximate solutions of the constrained convex minimization problem has been studied by several authors; see, for example, [9] and the references therein. The convergence of the sequence generated by this method depends largely on the behavior of the gradient of the objective function: if the gradient fails to be strongly monotone, the sequence generated by the GPM may fail to converge strongly. Recently, an alternative operator-oriented approach to algorithm (6), namely an averaged mapping approach, was introduced; see, for example, Xu [12]. Xu [12] also presented two modifications of the gradient-projection algorithm which are shown to be strongly convergent. Very recently, based on Yamada's hybrid steepest descent method, Tian and Huang [10] proposed an implicit and an explicit iterative scheme, and proved that the sequences generated by both schemes converge strongly to a solution of the constrained convex minimization problem, which also solves a certain variational inequality problem. Also, motivated by the work of Xu [12], Shehu et al. [8] proposed an iterative scheme of gradient-projection type and proved that the sequence it generates converges strongly to a solution of the constrained convex minimization problem, which also solves a certain variational inequality problem (see [8] for details).
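A common thread in these strong-convergence modifications is to blend a fixed anchor point $u$ into the projected-gradient step with a vanishing weight. As a rough, hypothetical sketch (the precise schemes of Xu [12], Tian and Huang [10], and Shehu et al. [8] differ in their details; the data below are invented for illustration), a Halpern-type anchored variant looks like this:

```python
# A Halpern-type anchored gradient-projection step (illustrative only):
#     x_{n+1} = beta_n * u + (1 - beta_n) * P_C(x_n - lam * grad_f(x_n)),
# with beta_n -> 0 and sum(beta_n) = +infinity (here beta_n = 1/(n+1)).
# Again f(x) = 0.5 * ||x - b||^2 over C = [-1, 1]^3, so grad f(x) = x - b, L = 1.

def proj_box(x):
    return [min(max(t, -1.0), 1.0) for t in x]

def anchored_gpm(b, u, lam=0.5, iters=5000):
    x = list(u)
    for n in range(iters):
        beta = 1.0 / (n + 1)
        g = proj_box([xi - lam * (xi - bi) for xi, bi in zip(x, b)])
        x = [beta * ui + (1 - beta) * gi for ui, gi in zip(u, g)]
    return x

b = [3.0, 0.2, -4.0]                    # minimizer over C is [1.0, 0.2, -1.0]
x = anchored_gpm(b, u=[0.0, 0.0, 0.0])
print([round(t, 2) for t in x])         # -> [1.0, 0.2, -1.0]
```

With $\beta_n\equiv 0$ the recursion reduces to (6), for which only weak convergence is guaranteed in general; the vanishing anchor term is what selects the solution nearest to $u$. In this finite-dimensional toy problem weak and strong convergence coincide, so the sketch only illustrates the shape of the iteration, not the infinite-dimensional phenomenon it addresses.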

Motivated by the works of Xu [12], Tian and Huang [10], and Shehu et al. [8], we propose a new iterative scheme for finding an approximate solution of a constrained convex minimization problem, and we prove that the sequence generated by our scheme converges strongly to a solution of the constrained minimization problem.

II. PRELIMINARIES

In the sequel, we shall make use of the following lemmas.

Lemma 2.1. Let $H$ be a real Hilbert space. Then the following inequality holds:
$$\|x+y\|^2\le\|x\|^2+2\langle y,\,x+y\rangle\quad\text{for all } x,y\in H.$$

Lemma 2.2. Let $H$ be a real Hilbert space. Then the following identity holds:
$$\|\lambda x+(1-\lambda)y\|^2=\lambda\|x\|^2+(1-\lambda)\|y\|^2-\lambda(1-\lambda)\|x-y\|^2$$
for all $x,y\in H$ and $\lambda\in[0,1]$.

Lemma 2.3 (Browder [2]). Let $H$ be a real Hilbert space, $C$ a closed convex subset of $H$, and $T\colon C\to C$ a nonexpansive mapping with a fixed point. Assume that a sequence $\{x_n\}$ in $C$ is such that $x_n\rightharpoonup x$ weakly and $x_n-Tx_n\to y$. Then $x-Tx=y$.

Lemma 2.4 (Xu [11]). Let $\{a_n\}$ be a sequence of nonnegative real numbers satisfying
$$a_{n+1}\le(1-\alpha_n)a_n+\alpha_n\sigma_n+\gamma_n,\quad n\ge 0,$$
where (i) $\{\alpha_n\}\subset[0,1]$ and $\sum_{n=0}^{\infty}\alpha_n=\infty$; (ii) $\limsup_{n\to\infty}\sigma_n\le 0$; (iii) $\gamma_n\ge 0$ and $\sum_{n=0}^{\infty}\gamma_n<\infty$. Then $a_n\to 0$ as $n\to\infty$.

We adopt the following notation:
$x_n\to x$ means that $x_n$ converges strongly to $x$;
$x_n\rightharpoonup x$ means that $x_n$ converges weakly to $x$;
$\omega_w(x_n)$ is the weak $\omega$-limit set of the sequence $\{x_n\}$.

III. MAIN RESULTS

In this section, we present a modification of the gradient-projection method and prove its strong convergence. Our result complements the result of Xu [12]. Using the technique in [7, 12], we obtain the following theorem.

Theorem 3.1. Let $C$ be a nonempty, closed and convex subset of a real Hilbert space $H$. Suppose that the minimization problem (4) is consistent, and let $S$ denote its solution set. Assume that the gradient $\nabla f$ is $L$-Lipschitzian with constant $L>0$. For any given $u\in C$, let the sequence $\{x_n\}$ be generated iteratively by $x_0\in C$ and
$$x_{n+1}=\beta_n u+(1-\beta_n)P_C\bigl(x_n-\lambda_n\nabla f(x_n)\bigr),\quad n\ge 0, \tag{10}$$
where $\{\beta_n\}\subset[0,1]$ and $\{\lambda_n\}\subset(0,2/L)$ satisfy the following conditions:
(C1) $\lim_{n\to\infty}\beta_n=0$;
(C2) $\sum_{n=0}^{\infty}\beta_n=\infty$;
(C3) $0<a\le\lambda_n\le b<2/L$ for some constants $a,b$.
Then the sequence $\{x_n\}$ converges strongly to a minimizer $x^*$ of (4), which is closest to $u$ from the solution set $S$; in other words, $x^*=P_S u$.

Proof. Inspired by the method of proof of [8], we first recall the following well-known facts.
(a) $x^*$ solves the minimization problem (4) if and only if $x^*$ solves the fixed point equation
$$x^*=P_C\bigl(x^*-\lambda\nabla f(x^*)\bigr),$$
where $\lambda$ is any fixed positive number. For the sake of simplicity, we assume (due to condition (C3)) that $a\le\lambda_n\le b$, where $a$ and $b$ are constants with $0<a\le b<2/L$.
(b) The gradient $\nabla f$ is $\tfrac1L$-ism.
(c) $P_C(I-\lambda\nabla f)$ is averaged for $0<\lambda<2/L$. Hence, for each $n$, $P_C(I-\lambda_n\nabla f)$ is averaged. Therefore we can write
$$P_C(I-\lambda_n\nabla f)=(1-\theta_n)I+\theta_n T_n, \tag{11}$$
where $T_n$ is nonexpansive and $\theta_n=\frac{2+\lambda_n L}{4}\in[a',b']\subset(0,1)$, where $a'=\frac{2+aL}{4}$ and $b'=\frac{2+bL}{4}$.
Then we can write (10) as
$$x_{n+1}=\beta_n u+(1-\beta_n)\bigl[(1-\theta_n)x_n+\theta_n T_n x_n\bigr].$$
Let $p\in S$. Observing that $p=(1-\theta_n)p+\theta_n T_n p$, we have
$$\|x_{n+1}-p\|\le\beta_n\|u-p\|+(1-\beta_n)\|x_n-p\|\le\max\{\|u-p\|,\|x_n-p\|\}. \tag{12}$$

Therefore, by induction, we have
$$\|x_n-p\|\le\max\{\|u-p\|,\|x_0-p\|\},\quad n\ge 0.$$
Hence the sequence $\{x_n\}$ is bounded, and so is $\{T_nx_n\}$. Now we have the following:
$$x_{n+1}-p=\beta_n(u-p)+(1-\beta_n)\bigl[(1-\theta_n)(x_n-p)+\theta_n(T_nx_n-p)\bigr]. \tag{13}$$
We can observe that, by Lemma 2.2 and the nonexpansiveness of $T_n$,
$$\|(1-\theta_n)(x_n-p)+\theta_n(T_nx_n-p)\|^2=(1-\theta_n)\|x_n-p\|^2+\theta_n\|T_nx_n-p\|^2-\theta_n(1-\theta_n)\|x_n-T_nx_n\|^2,$$
which implies that
$$\|(1-\theta_n)(x_n-p)+\theta_n(T_nx_n-p)\|^2\le\|x_n-p\|^2-\theta_n(1-\theta_n)\|x_n-T_nx_n\|^2. \tag{14}$$
Using Lemma 2.1, (13) and (14), we obtain
$$\|x_{n+1}-p\|^2\le(1-\beta_n)^2\|(1-\theta_n)(x_n-p)+\theta_n(T_nx_n-p)\|^2+2\beta_n\langle u-p,\,x_{n+1}-p\rangle.$$
Hence,
$$\|x_{n+1}-p\|^2\le(1-\beta_n)\bigl[\|x_n-p\|^2-\theta_n(1-\theta_n)\|x_n-T_nx_n\|^2\bigr]+2\beta_n\langle u-p,\,x_{n+1}-p\rangle. \tag{15}$$
Since $\{x_n\}$ is bounded, there exists a constant $M>0$ such that $2\langle u-p,\,x_{n+1}-p\rangle\le M$ for all $n$. Therefore (15) gives
$$(1-\beta_n)\theta_n(1-\theta_n)\|x_n-T_nx_n\|^2\le\|x_n-p\|^2-\|x_{n+1}-p\|^2+\beta_n M. \tag{16}$$
The rest of the proof will be done by considering two cases.

Case 1: Assume that the sequence $\{\|x_n-p\|\}$ is monotonically decreasing. Then $\{\|x_n-p\|\}$ converges, and clearly
$$\|x_n-p\|^2-\|x_{n+1}-p\|^2\to 0\quad\text{as } n\to\infty.$$
Now, since $\beta_n\to 0$, by (16) we obtain
$$(1-\beta_n)\theta_n(1-\theta_n)\|x_n-T_nx_n\|^2\to 0.$$
Using the condition $\theta_n\in[a',b']\subset(0,1)$, we have
$$\lim_{n\to\infty}\|x_n-T_nx_n\|=0. \tag{17}$$
Now we show that $\omega_w(x_n)\subset S$. Let $\hat x\in\omega_w(x_n)$ and let $\{x_{n_j}\}$ be a subsequence of $\{x_n\}$ such that $x_{n_j}\rightharpoonup\hat x$. We may assume that $\lambda_{n_j}\to\lambda$; then $\lambda\in[a,b]$. We set $T:=P_C(I-\lambda\nabla f)$; hence $T$ is nonexpansive. From (11) and (17), we have
$$\|x_{n_j}-Tx_{n_j}\|\le\theta_{n_j}\|x_{n_j}-T_{n_j}x_{n_j}\|+|\lambda_{n_j}-\lambda|\,\|\nabla f(x_{n_j})\|\to 0.$$
Therefore, by Lemma 2.3, we have $\hat x=T\hat x$, and so $\hat x\in S$ by fact (a).

We next prove that the sequence $\{x_n\}$ converges strongly to $x^*:=P_Su$, the solution of (4) which is closest to $u$ from the solution set $S$. Firstly, we show that
$$\limsup_{n\to\infty}\langle u-x^*,\,x_n-x^*\rangle\le 0.$$
Observe that there exists a subsequence $\{x_{n_k}\}$ of $\{x_n\}$ satisfying
$$\limsup_{n\to\infty}\langle u-x^*,\,x_n-x^*\rangle=\lim_{k\to\infty}\langle u-x^*,\,x_{n_k}-x^*\rangle.$$
Since $\{x_{n_k}\}$ is bounded, there exists a weakly convergent subsequence of $\{x_{n_k}\}$; without loss of generality, we assume that $x_{n_k}\rightharpoonup\hat x\in S$. Then, using (2), we obtain
$$\limsup_{n\to\infty}\langle u-x^*,\,x_n-x^*\rangle=\langle u-x^*,\,\hat x-x^*\rangle\le 0. \tag{18}$$
Now, we have

$$\|x_{n+1}-x^*\|^2\le(1-\beta_n)\|x_n-x^*\|^2+2\beta_n\langle u-x^*,\,x_{n+1}-x^*\rangle. \tag{19}$$
It is clear that $\sum_{n=0}^{\infty}\beta_n=\infty$ and, by (18), $\limsup_{n\to\infty}2\langle u-x^*,\,x_{n+1}-x^*\rangle\le 0$. Therefore, by Lemma 2.4 applied to (19), we obtain that $x_n\to x^*$.

Case 2: Assume that $\{\|x_n-x^*\|\}$ is not a monotonically decreasing sequence. Set $\Gamma_n=\|x_n-x^*\|^2$ and let $\tau\colon\mathbb{N}\to\mathbb{N}$ be the mapping defined for all $n\ge n_0$ (for some $n_0$ large enough) by
$$\tau(n)=\max\{k\in\mathbb{N}\colon k\le n,\ \Gamma_k<\Gamma_{k+1}\}.$$
Clearly, $\{\tau(n)\}$ is a non-decreasing sequence such that $\tau(n)\to\infty$ as $n\to\infty$ and $\Gamma_{\tau(n)}<\Gamma_{\tau(n)+1}$ for $n\ge n_0$. From (16) we can see that $\|x_{\tau(n)}-T_{\tau(n)}x_{\tau(n)}\|\to 0$ as $n\to\infty$. Furthermore, we have that $\|x_{\tau(n)+1}-x_{\tau(n)}\|\to 0$ as $n\to\infty$. Using a similar argument as in Case 1, we can show that the weak cluster points of $\{x_{\tau(n)}\}$ belong to $S$ and that
$$\limsup_{n\to\infty}\langle u-x^*,\,x_{\tau(n)+1}-x^*\rangle\le 0.$$
We know that, for all $n\ge n_0$,
$$\Gamma_{\tau(n)+1}\le(1-\beta_{\tau(n)})\Gamma_{\tau(n)}+2\beta_{\tau(n)}\langle u-x^*,\,x_{\tau(n)+1}-x^*\rangle,$$
which, together with $\Gamma_{\tau(n)}<\Gamma_{\tau(n)+1}$, implies that
$$\beta_{\tau(n)}\Gamma_{\tau(n)}\le 2\beta_{\tau(n)}\langle u-x^*,\,x_{\tau(n)+1}-x^*\rangle,$$
and hence, since $\beta_{\tau(n)}>0$,
$$\Gamma_{\tau(n)}\le 2\langle u-x^*,\,x_{\tau(n)+1}-x^*\rangle.$$
Therefore, we conclude that $\limsup_{n\to\infty}\Gamma_{\tau(n)}\le 0$; hence $\Gamma_{\tau(n)}\to 0$, and consequently $\Gamma_{\tau(n)+1}\to 0$. Furthermore, for $n\ge n_0$, one easily observes that $\Gamma_n\le\Gamma_{\tau(n)+1}$ if $n\ne\tau(n)$, because $\Gamma_k\ge\Gamma_{k+1}$ for $\tau(n)<k\le n$. As a consequence, we obtain
$$\Gamma_n\le\max\{\Gamma_{\tau(n)},\Gamma_{\tau(n)+1}\}=\Gamma_{\tau(n)+1}\quad\text{for all } n\ge n_0.$$
Thus $\Gamma_n\to 0$; that is, $\{x_n\}$ converges strongly to $x^*$. This completes the proof.

Corollary 3.2. Let $C$ be a nonempty, closed and convex subset of a real Hilbert space $H$. Suppose that the minimization problem (4) is consistent, and let $S$ denote its solution set. Assume that the gradient $\nabla f$ is $L$-Lipschitzian with constant $L>0$. For any given $u\in C$, let the sequence $\{x_n\}$ be generated iteratively by $x_0\in C$ and
$$x_{n+1}=\beta_n u+(1-\beta_n)P_C\bigl(x_n-\lambda\nabla f(x_n)\bigr),\quad n\ge 0, \tag{20}$$
where $\lambda\in(0,2/L)$ and $\{\beta_n\}\subset[0,1]$ satisfies the following conditions:
(C1) $\lim_{n\to\infty}\beta_n=0$;
(C2) $\sum_{n=0}^{\infty}\beta_n=\infty$.
Then the sequence $\{x_n\}$ converges strongly to a minimizer of (4).

IV. APPLICATION

In this section, we give an application of Theorem 3.1. We apply Theorem 3.1 to the split feasibility problem (SFP), which was introduced by Censor and Elfving [4]. The SFP has received considerable attention from many authors because of its applications in image reconstruction, signal processing and intensity-modulated radiation therapy (see [3, 8, 10] and the references therein). The SFP can be formulated mathematically as the problem of finding a point $x^*$ with the property that
$$x^*\in C\quad\text{and}\quad Ax^*\in Q, \tag{21}$$
where $C$ and $Q$ are nonempty, closed and convex subsets of Hilbert spaces $H_1$ and $H_2$, respectively, and $A\colon H_1\to H_2$ is a bounded linear operator. Clearly, $x^*$ is a solution of the SFP (21) if and only if $x^*\in C$ and $Ax^*-P_QAx^*=0$. The proximity function $f$ is defined by
$$f(x)=\tfrac12\|Ax-P_QAx\|^2, \tag{22}$$
and we consider the constrained convex minimization problem
$$\text{minimize}\ \{f(x)\colon x\in C\}. \tag{23}$$
Then $x^*$ solves the SFP (21) if and only if $x^*$ solves the minimization problem (23). In [3], the CQ algorithm was introduced to solve the SFP:
$$x_{n+1}=P_C\bigl(x_n-\gamma A^*(I-P_Q)Ax_n\bigr), \tag{24}$$
where $0<\gamma<2/\|A\|^2$. It was proved that the sequence generated by (24) converges weakly to a solution of the SFP.

Now we introduce the following algorithm to obtain a strongly convergent iterative sequence for solving the SFP. Let $u\in C$ be given, and let the sequence $\{x_n\}$ be generated by
$$x_{n+1}=\beta_n u+(1-\beta_n)P_C\bigl(x_n-\lambda_nA^*(I-P_Q)Ax_n\bigr), \tag{25}$$
where $\{\beta_n\}\subset[0,1]$ and $\{\lambda_n\}\subset(0,2/\|A\|^2)$ satisfy the following conditions:
(C1) $\lim_{n\to\infty}\beta_n=0$;
(C2) $\sum_{n=0}^{\infty}\beta_n=\infty$;
(C3) $0<a\le\lambda_n\le b<2/\|A\|^2$ for some constants $a,b$.
Applying Theorem 3.1, we obtain the following convergence result for solving the SFP (21).

Theorem 4.2. Assume that the split feasibility problem (21) is consistent. Let the sequence $\{x_n\}$ be generated by (25), where $\{\beta_n\}\subset[0,1]$ and $\{\lambda_n\}\subset(0,2/\|A\|^2)$ satisfy conditions (C1)-(C3). Then the sequence $\{x_n\}$ converges strongly to a solution of the split feasibility problem (21).

Proof. From the definition of the proximity function $f$, we have
$$\nabla f(x)=A^*(I-P_Q)Ax, \tag{26}$$
and $\nabla f$ is Lipschitz continuous; i.e.,
$$\|\nabla f(x)-\nabla f(z)\|\le L\|x-z\|, \tag{27}$$
where $L=\|A\|^2$. Consequently, the iterative scheme (25) is equivalent to the scheme (10) of Theorem 3.1 with $\nabla f(x)=A^*(I-P_Q)Ax$ and $\{\lambda_n\}\subset(0,2/L)$. The conclusion follows from Theorem 3.1.

REFERENCES
[1] D. P. Bertsekas and E. M. Gafni, Projection methods for variational inequalities with applications to the traffic assignment problem, Math. Progr. Stud. 1982;17:139-159.
[2] F. E. Browder, Nonexpansive nonlinear operators in a Banach space, Proc. Nat. Acad. Sci. USA 1965;54:1041-1044.
[3] C. Byrne, A unified treatment of some iterative algorithms in signal processing and image reconstruction, Inverse Probl. 2004;20:103-120.
[4] Y. Censor and T. Elfving, A multiprojection algorithm using Bregman projections in a product space, Numer. Algorithms 1994;8:221-239.
[5] P. L. Combettes, Solving monotone inclusions via compositions of nonexpansive averaged operators, Optimization 2004;53:475-504.
[6] D. Han and H. K. Lo, Solving non-additive traffic assignment problems: a descent method for co-coercive variational inequalities, Eur. J. Oper. Res. 2004;159:529-544.
[7] P. E. Maingé, Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization, Set-Valued Anal. 2008;16:899-912.
[8] Y. Shehu, O. S. Iyiola and C. D. Enyi, Iterative approximation of solutions for constrained convex minimization problem, Arab. J. Math. 2013;2:393-402.
[9] M. Su and H. K. Xu, Remarks on the gradient-projection algorithm, J. Nonlinear Anal. Optim. 2010;1(1):35-43.
[10] M. Tian and L. Huang, Iterative methods for constrained convex minimization problem in Hilbert spaces, Fixed Point Theory Appl. 2013; doi:10.1186/1687-1812-2013-105.
[11] H. K. Xu, Iterative algorithms for nonlinear operators, J. London Math. Soc. 2002;66(2):240-256.
[12] H. K. Xu, Averaged mappings and the gradient-projection algorithm, J. Optim. Theory Appl. 2011;150(2):360-378.