A General Iterative Method for Constrained Convex Minimization Problems in Hilbert Spaces
MING TIAN, College of Science, Civil Aviation University of China, Tianjin 300300, CHINA
MINMIN LI, College of Science, Civil Aviation University of China, Tianjin 300300, CHINA

Abstract: It is well known that the gradient-projection algorithm plays an important role in solving constrained convex minimization problems. In this paper, based on Xu's method [Xu, H. K.: Averaged mappings and the gradient-projection algorithm, J. Optim. Theory Appl. 150 (2011)], we use the idea of regularization to establish implicit and explicit iterative methods for finding the approximate minimizer of a constrained convex minimization problem, and we prove that the sequences generated by our methods converge strongly to a solution of the constrained convex minimization problem. Such a solution is also a solution of a variational inequality defined over the solution set of the constrained convex minimization problem. As an application, we apply our main result to solve the split feasibility problem in Hilbert spaces.

Key Words: Variational inequality; Regularization algorithm; Constrained convex minimization; Fixed point; Averaged mapping; Nonexpansive mappings.

1 Introduction

Strong convergence means convergence in norm. In a finite-dimensional space, strong convergence and weak convergence are equivalent. In an infinite-dimensional space, strong convergence implies weak convergence, but weak convergence does not imply strong convergence. We therefore hope to obtain an iterative algorithm which converges in norm to a solution of a constrained convex minimization problem. We first give an example of a sequence that is weakly but not strongly convergent. Let ℓ² = {x = {x_k}_{k=1}^∞ : Σ_{k=1}^∞ x_k² < ∞} and let {e_i} ⊂ ℓ², where e_i = (0, …, 0, 1, 0, …) has its i-th coordinate equal to 1. For every v ∈ ℓ² we obtain ⟨e_i, v⟩ = v_i → 0 = ⟨0, v⟩, which implies that e_i ⇀ 0 weakly. However, ‖e_i‖ = 1 (i = 1, 2, 3, …), so {e_i} is not strongly convergent to 0.
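The weak-but-not-strong convergence of {e_i} can be observed numerically on finite truncations of ℓ². The following is an illustrative sketch only; the truncation dimension N and the test vector v are our own choices, not from the paper:

```python
import numpy as np

# Finite truncations of the basis sequence {e_i} in l^2: the pairings <e_i, v>
# shrink toward 0 (weak convergence to 0) while ||e_i|| stays equal to 1
# (so there is no strong convergence to 0).
N = 2000                       # truncation dimension (hypothetical choice)
v = 1.0 / np.arange(1, N + 1)  # a fixed square-summable test vector

def e(i, n=N):
    """i-th standard basis vector (0-indexed)."""
    x = np.zeros(n)
    x[i] = 1.0
    return x

indices = (0, 9, 99, 999)
inner_products = [abs(np.dot(e(i), v)) for i in indices]
norms = [np.linalg.norm(e(i)) for i in indices]

print(inner_products)  # decreases toward 0
print(norms)           # stays at 1
```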
Consider the following constrained convex minimization problem:

    min_{x∈C} f(x),    (1)

where C is a nonempty closed convex subset of a real Hilbert space H and f : C → ℝ is a real-valued convex and continuously Fréchet differentiable function. Assume that the minimization problem (1) is consistent and let S denote its solution set. It is well known that the gradient-projection algorithm is very useful in dealing with constrained convex minimization problems and has been studied extensively (see [1-8] and the references therein). It has recently been applied to solve split feasibility problems (see [9-14]), which find applications in image reconstruction and intensity-modulated radiation therapy (see [12, 13, 15, 16, 17]). The gradient-projection algorithm (GPA) generates a sequence {x_n} using the following recursive formula:

    x_{n+1} := Proj_C(x_n − γ∇f(x_n)),  n ≥ 0,    (2)

or, more generally,

    x_{n+1} := Proj_C(x_n − γ_n∇f(x_n)),  n ≥ 0,    (3)

where, in both (2) and (3), the initial guess x_0 is taken from C arbitrarily, and the parameters γ or γ_n are positive real numbers satisfying appropriate conditions. The convergence of the algorithms (2) and (3) depends on the behavior of the gradient ∇f; see Levitin and Polyak [1]. As a matter of fact, it is known [1] that if ∇f is Lipschitz continuous and strongly monotone, then the sequences {x_n} generated by (2) and (3) converge strongly to a minimizer of f over C. If the gradient of f fails to be strongly monotone, then {x_n} can only be expected to converge weakly when H is infinite-dimensional. Regularization, in particular the traditional Tikhonov regularization, is usually used to solve ill-posed optimization problems.

E-ISSN: 2224-2880. Volume 13, 2014
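The gradient-projection iteration x_{n+1} = Proj_C(x_n − γ∇f(x_n)) can be sketched concretely. This is a minimal sketch on toy data of our own choosing, not the paper's code: f(x) = ½‖x − b‖² (so ∇f(x) = x − b is 1-Lipschitz, L = 1) over the box C = [0, 1]², whose minimizer over C is the projection of b onto C.

```python
import numpy as np

# Gradient-projection algorithm on a hypothetical quadratic over a box.
b = np.array([2.0, -0.5])                   # hypothetical data
proj_C = lambda z: np.clip(z, 0.0, 1.0)     # projection onto C = [0,1]^2
grad_f = lambda z: z - b                    # gradient of 0.5*||x - b||^2
gamma = 1.0                                 # step size in (0, 2/L), L = 1

x = np.zeros(2)                             # x_0 taken from C arbitrarily
for _ in range(100):
    # x_{n+1} = Proj_C(x_n - gamma * grad f(x_n))
    x = proj_C(x - gamma * grad_f(x))

print(x)  # converges to Proj_C(b) = [1.0, 0.0]
```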
Next consider the regularized minimization problem:

    min_{x∈C} f_α(x) := f(x) + (α/2)‖x‖²,    (4)

where α > 0 is the regularization parameter, f is convex, and ∇f is inverse strongly monotone. Since the gradient ∇f_α = ∇f + αI is α-strongly monotone and (L + α)-Lipschitzian, (4) has a unique solution, which is denoted x_α ∈ C. Assume that the minimization problem (1) is consistent. If we appropriately select the regularization parameters α_n and the parameter γ in the gradient-projection algorithm, we get a single iterative algorithm that generates a sequence {x_n} in the following manner:

    x_{n+1} = Proj_C(I − γ∇f_{α_n})x_n.    (5)

It is proved that if the sequence {α_n} and the parameter γ satisfy appropriate conditions, the sequence {x_n} generated by (5) converges weakly to a minimizer of (1) [18]. One question then arises naturally: is it possible to get strong convergence when we make some changes to (5)? In 2001, Yamada [19] introduced the following hybrid iterative algorithm for solving a variational inequality:

    x_{n+1} = Tx_n − µα_n F(Tx_n),  n ≥ 0,    (6)

where F is a κ-Lipschitzian and η-strongly monotone operator with κ > 0, η > 0. He proved that if {α_n} satisfies appropriate conditions, the sequence {x_n} generated by (6) converges strongly to the unique solution x̃ of the variational inequality

    ⟨Fx̃, x − x̃⟩ ≥ 0,  ∀x ∈ Fix(T).    (7)

Recently, Xu [20] provided a modification of the GPA so that strong convergence is guaranteed. He considered the following hybrid gradient-projection algorithm:

    x_{n+1} = θ_n h(x_n) + (1 − θ_n)Proj_C(x_n − λ_n∇f(x_n)).    (8)

Assuming that the minimization problem (1) is consistent, he proved that if the sequences {θ_n} and {λ_n} satisfy appropriate conditions, the sequence {x_n} generated by (8) converges in norm to a minimizer x* of (1) which solves the variational inequality

    x* ∈ S,  ⟨(I − h)x*, x − x*⟩ ≥ 0,  ∀x ∈ S.
    (9)

In this article, motivated and inspired by the research work of [20], we combine the iterative method (6) with the iterative method (5) and consider the following hybrid algorithm based on the regularization approach:

    y_n = (I − µθ_n F)Proj_C(I − γ∇f_{α_n})x_n,
    x_{n+1} = Proj_C y_n,  n ≥ 0.    (10)

Assuming that the minimization problem (1) is consistent, we prove that if the parameter sequences {θ_n} and {α_n} satisfy appropriate conditions, then the sequence {x_n} generated by (10) converges in norm to a minimizer x* of (1) which solves the variational inequality (VI)

    x* ∈ S,  ⟨Fx*, x − x*⟩ ≥ 0,  ∀x ∈ S,

where S is the solution set of the minimization problem (1). Finally, in Sect. 4 we apply this algorithm to the split feasibility problem, and in Sect. 5 we give a numerical result.

2 Preliminaries

Throughout this paper, we assume that H is a Hilbert space whose inner product and norm are denoted by ⟨·, ·⟩ and ‖·‖, respectively, and that C is a nonempty closed convex subset of H. We write x_n ⇀ x to indicate that the sequence {x_n} converges weakly to x, and x_n → x to indicate that {x_n} converges strongly to x. ω_w(x_n) := {x : ∃ {x_{n_j}} with x_{n_j} ⇀ x} is the weak ω-limit set of the sequence {x_n}.

Definition 1 A mapping T : H → H is said to be
(a) nonexpansive, if and only if ‖Tx − Ty‖ ≤ ‖x − y‖ for all x, y ∈ H;
(b) firmly nonexpansive, if and only if 2T − I is nonexpansive, or equivalently ⟨x − y, Tx − Ty⟩ ≥ ‖Tx − Ty‖² for all x, y ∈ H; alternatively, T is firmly nonexpansive if and only if T can be expressed as T = (1/2)(I + W), where W : H → H is nonexpansive. Projections are firmly nonexpansive;
(c) an averaged mapping, if and only if it can be written as the average of the identity I and a nonexpansive mapping; that is, T = (1 − ε)I + εW,
where ε is a number in (0, 1) and W : H → H is nonexpansive. More precisely, when this expression holds, we say that T is ε-averaged. Thus firmly nonexpansive mappings (in particular, projections) are (1/2)-averaged maps.

Proposition 2 ([15, 22]) Let T : H → H be given. We have:
(i) if T is ν-ism, then for γ > 0, γT is (ν/γ)-ism;
(ii) T is averaged if and only if the complement I − T is ν-ism for some ν > 1/2; indeed, for ε ∈ (0, 1), T is ε-averaged if and only if I − T is (1/(2ε))-ism;
(iii) the composite of finitely many averaged mappings is averaged; that is, if each of the mappings {T_i}_{i=1}^N is averaged, then so is the composite T_1 ∘ ⋯ ∘ T_N (see [22]). In particular, an averaged mapping is a nonexpansive mapping.

Definition 3 (See [23] for a comprehensive theory of monotone operators.)
(i) A is monotone if and only if ⟨x − y, Ax − Ay⟩ ≥ 0 for all x, y ∈ H.
(ii) Given a number ν > 0, A : H → H is said to be ν-inverse strongly monotone (ν-ism), if and only if ⟨x − y, Ax − Ay⟩ ≥ ν‖Ax − Ay‖² for all x, y ∈ H.
(iii) Given a number ζ > 0, A is said to be ζ-strongly monotone, if and only if ⟨x − y, Ax − Ay⟩ ≥ ζ‖x − y‖² for all x, y ∈ H.
It is easily seen that if T is nonexpansive, then I − T is monotone. It is also easily seen that a projection is 1-ism. Inverse strongly monotone operators have been widely applied to solve practical problems in various fields, for instance, traffic assignment problems (see [24, 25]).

Definition 4 Let the operators S, T, V : H → H be given.
(i) If T = (1 − α)S + αV for some α ∈ (0, 1), S is averaged, and V is nonexpansive, then T is averaged.
(ii) T is firmly nonexpansive if and only if the complement I − T is firmly nonexpansive.
(iii) If T = (1 − α)S + αV for some α ∈ (0, 1), S is firmly nonexpansive, and V is nonexpansive, then T is averaged.
(iv) T is nonexpansive if and only if the complement I − T is (1/2)-ism.
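The projection facts above can be sanity-checked numerically. A small sketch (random points and a box projection, a toy setup of our own) verifying the firm nonexpansiveness inequality ⟨x − y, Px − Py⟩ ≥ ‖Px − Py‖², i.e. that a projection is 1-ism:

```python
import numpy as np

# Numerical check (toy setup, not from the paper): the projection onto the
# box C = [0,1]^3 satisfies <x - y, Px - Py> >= ||Px - Py||^2 for all x, y.
rng = np.random.default_rng(0)
proj_C = lambda z: np.clip(z, 0.0, 1.0)

violations = 0
for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    px, py = proj_C(x), proj_C(y)
    lhs = np.dot(x - y, px - py)    # <x - y, Px - Py>
    rhs = np.dot(px - py, px - py)  # ||Px - Py||^2
    if lhs < rhs - 1e-12:
        violations += 1

print(violations)  # 0: the 1-ism inequality holds on every sampled pair
```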
Lemma 5 ([26]) Assume that {a_n} is a sequence of non-negative real numbers such that

    a_{n+1} ≤ (1 − γ_n)a_n + γ_n δ_n + β_n,  n ≥ 0,

where {γ_n} and {β_n} are sequences in (0, 1) and {δ_n} is a sequence in ℝ such that
(i) Σ_{n=0}^∞ γ_n = ∞;
(ii) either lim sup_{n→∞} δ_n ≤ 0 or Σ_{n=0}^∞ |γ_n δ_n| < ∞;
(iii) Σ_{n=0}^∞ β_n < ∞.
Then lim_{n→∞} a_n = 0.

Lemma 6 ([27]) Let C be a closed and convex subset of a Hilbert space H and let T : C → C be a nonexpansive mapping with Fix T ≠ ∅. If {x_n} is a sequence in C weakly converging to x and if {(I − T)x_n} converges strongly to y, then (I − T)x = y.

Lemma 7 Let C be a closed convex subset of a real Hilbert space H, and let x ∈ H and y ∈ C be given. Then y = P_C x if and only if there holds the inequality

    ⟨x − y, y − z⟩ ≥ 0,  ∀z ∈ C.

3 Main results

Assume that the minimization problem (1) is consistent and let S denote its solution set. Assume that the gradient ∇f is (1/L)-inverse strongly monotone ((1/L)-ism) (see [23]) with a constant L > 0. Throughout the rest of this paper, we always let H be a real Hilbert space and let C be a nonempty closed convex subset of H. Since S is a closed convex subset, the nearest point projection from H onto S is well defined. Let F : C → H be a κ-Lipschitzian and η-strongly monotone operator with κ, η > 0. Let 0 < µ < 2η/κ² and τ = µ(η − µκ²/2). Assume that α_t is continuous with respect to t and that lim_{t→0} α_t/t = 0; in particular, there is a constant B > 0 such that α_t/t < B. For t ∈ (0, 1), we consider a mapping X_t on C defined by

    X_t(x) = Proj_C(I − tµF)V_{α_t}(x),  x ∈ C,    (11)

where V_{α_t} := Proj_C(I − γ∇f_{α_t}). It is obvious that V_{α_t} is a nonexpansive mapping. It is also easy to see that X_t is a contraction.
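Since X_t is a contraction, its unique fixed point x_t can be found by Banach iteration. A minimal sketch under hypothetical data of our own (quadratic f over a box, F = I so κ = η = 1, µ = 1, and hence, with τ = µ(η − µκ²/2) as in the text, τ = 1/2):

```python
import numpy as np

# Implicit scheme x_t = Proj_C (I - t*mu*F) V_{alpha_t}(x_t) on toy data:
# C = [0,1]^2, f(x) = 0.5*||x - b||^2 (grad f is 1-Lipschitz), F = I.
b = np.array([2.0, -0.5])
proj_C = lambda z: np.clip(z, 0.0, 1.0)
grad_f = lambda z: z - b
gamma, mu = 1.0, 1.0
t, alpha_t = 0.1, 0.01            # alpha_t = t^2, so alpha_t = o(t)

def X_t(x):
    v = proj_C(x - gamma * (grad_f(x) + alpha_t * x))  # V_{alpha_t}(x)
    return proj_C(v - t * mu * v)                      # (I - t*mu*F) with F = I

# X_t is a (1 - t*tau)-contraction, so fixed-point iteration converges to x_t.
x = np.zeros(2)
for _ in range(200):
    x = X_t(x)

print(x)  # the fixed point x_t; here it settles at [0.9, 0.0]
```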
Indeed, for t ∈ (0, 1) we have

    ‖(I − tµF)V_{α_t}(x) − (I − tµF)V_{α_t}(y)‖²
    = ‖V_{α_t}(x) − V_{α_t}(y)‖² − 2tµ⟨V_{α_t}(x) − V_{α_t}(y), FV_{α_t}(x) − FV_{α_t}(y)⟩ + t²µ²‖FV_{α_t}(x) − FV_{α_t}(y)‖²
    ≤ ‖V_{α_t}(x) − V_{α_t}(y)‖² − 2tµη‖V_{α_t}(x) − V_{α_t}(y)‖² + t²µ²κ²‖V_{α_t}(x) − V_{α_t}(y)‖²
    = (1 − tµ(2η − tµκ²))‖V_{α_t}(x) − V_{α_t}(y)‖²
    ≤ (1 − tµ(2η − µκ²))‖V_{α_t}(x) − V_{α_t}(y)‖²
    = (1 − 2tτ)‖V_{α_t}(x) − V_{α_t}(y)‖²
    ≤ (1 − tτ)²‖x − y‖².

Hence

    ‖X_t(x) − X_t(y)‖ = ‖Proj_C(I − tµF)V_{α_t}(x) − Proj_C(I − tµF)V_{α_t}(y)‖
    ≤ ‖(I − tµF)V_{α_t}(x) − (I − tµF)V_{α_t}(y)‖ ≤ (1 − tτ)‖x − y‖.

Hence X_t has a unique fixed point, denoted x_t, which uniquely solves the fixed point equation

    x_t = Proj_C(I − tµF)V_{α_t}(x_t).    (12)

The next proposition summarizes the properties of {x_t}.

Proposition 8 Let x_t be defined by (12). Then
(i) {x_t} is bounded for t ∈ (0, 1/τ);
(ii) lim_{t→0} ‖x_t − Proj_C(I − γ∇f_{α_t})(x_t)‖ = 0;
(iii) x_t defines a continuous curve from (0, 1/τ) into H.

Proof: (i) For p ∈ S, we have

    ‖x_t − p‖ = ‖Proj_C(I − tµF)V_{α_t}(x_t) − p‖
    ≤ ‖(I − tµF)V_{α_t}(x_t) − p‖
    = ‖(I − tµF)Proj_C(I − γ∇f_{α_t})(x_t) − (I − tµF)Proj_C(I − γ∇f)p − tµF Proj_C(I − γ∇f)p‖
    ≤ (1 − tτ)‖Proj_C(I − γ∇f_{α_t})(x_t) − Proj_C(I − γ∇f)p‖ + tµ‖Fp‖
    ≤ (1 − tτ)(‖x_t − p‖ + γα_t‖p‖) + tµ‖Fp‖
    ≤ (1 − tτ)‖x_t − p‖ + t(µ‖Fp‖ + γ(α_t/t)‖p‖)
    ≤ (1 − tτ)‖x_t − p‖ + t(µ‖Fp‖ + γB‖p‖).

It follows that

    ‖x_t − p‖ ≤ (µ‖Fp‖ + γB‖p‖)/τ.

Hence {x_t} is bounded.
(ii) By the definition of {x_t},

    ‖x_t − Proj_C(I − γ∇f_{α_t})(x_t)‖ = ‖Proj_C(I − tµF)V_{α_t}x_t − Proj_C V_{α_t}x_t‖
    ≤ ‖(I − tµF)V_{α_t}x_t − V_{α_t}x_t‖ = tµ‖FV_{α_t}x_t‖ → 0,

since {x_t} is bounded and hence so is {FV_{α_t}x_t}.
(iii) For t, t_0 ∈ (0, 1/τ), we have

    ‖x_t − x_{t_0}‖ = ‖Proj_C(I − tµF)V_{α_t}(x_t) − Proj_C(I − t_0µF)V_{α_{t_0}}(x_{t_0})‖
    ≤ ‖(I − tµF)V_{α_t}(x_t) − (I − t_0µF)V_{α_{t_0}}(x_{t_0})‖
    ≤ ‖(I − t_0µF)V_{α_{t_0}}(x_t) − (I − t_0µF)V_{α_{t_0}}(x_{t_0})‖ + ‖(I − tµF)V_{α_t}(x_t) − (I − t_0µF)V_{α_{t_0}}(x_t)‖
    ≤ (1 − t_0τ)‖x_t − x_{t_0}‖ + ‖(I − tµF)V_{α_t}(x_t) − (I − tµF)V_{α_{t_0}}(x_t)‖ + ‖(I − tµF)V_{α_{t_0}}(x_t) − (I − t_0µF)V_{α_{t_0}}(x_t)‖
    ≤ (1 − t_0τ)‖x_t − x_{t_0}‖ + ‖(I − γ∇f_{α_t})x_t − (I − γ∇f_{α_{t_0}})x_t‖ + µ|t − t_0| ‖F Proj_C(I − γ∇f_{α_{t_0}})(x_t)‖
    = (1 − t_0τ)‖x_t − x_{t_0}‖ + γ|α_t − α_{t_0}| ‖x_t‖ + µ|t − t_0| ‖F Proj_C(I − γ∇f_{α_{t_0}})(x_t)‖.

Therefore,

    ‖x_t − x_{t_0}‖ ≤ (µ‖F Proj_C(I − γ∇f_{α_{t_0}})x_t‖/(t_0τ))|t − t_0| + (γ‖x_t‖/(t_0τ))|α_t − α_{t_0}|.

Therefore x_t → x_{t_0} as t → t_0. This means that x_t is continuous.

Our main result below shows that {x_t} converges in norm to a minimizer of (1) which solves a variational inequality.

Theorem 9 Assume that the minimization problem (1) is consistent and let S denote its solution set. Assume that the gradient ∇f is (1/L)-ism. Let F : C → H be η-strongly monotone and κ-Lipschitzian. Fix a constant µ satisfying 0 < µ < 2η/κ² and a constant γ satisfying 0 < γ < 2/L. Assume also that t ∈ (0, 1) satisfies the condition α_t = o(t). Let {x_t} be defined by (12). Then x_t converges in norm, as t → 0, to a minimizer x* of (1) which solves the variational inequality

    ⟨Fx*, x − x*⟩ ≥ 0,  ∀x ∈ S.    (13)

Equivalently, we have Proj_S(I − F)x* = x*.

Proof: It is easy to see the uniqueness of a solution of the variational inequality (13). Let x* ∈ S denote the unique solution of (13). Let us prove that x_t → x* (t → 0). Set

    y_t := (I − tµF)V_{α_t}(x_t)  and  V := Proj_C(I − γ∇f).

Then x_t = Proj_C y_t. For a given x̃ ∈ S, we write

    x_t − x̃ = Proj_C y_t − y_t + y_t − x̃ = Proj_C y_t − y_t + (I − tµF)V_{α_t}(x_t) − (I − tµF)x̃ − tµF(x̃).

Since Proj_C is the metric projection from H onto C, we have ⟨y_t − x_t, x̃ − x_t⟩ ≤ 0. It follows that

    ‖x_t − x̃‖² = ⟨Proj_C y_t − y_t, Proj_C y_t − x̃⟩ + ⟨(I − tµF)V_{α_t}(x_t) − (I − tµF)x̃, x_t − x̃⟩ − tµ⟨F(x̃), x_t − x̃⟩
    ≤ ⟨(I − tµF)V_{α_t}(x_t) − (I − tµF)Vx̃, x_t − x̃⟩ − tµ⟨F(x̃), x_t − x̃⟩
    ≤ (1 − tτ)‖V_{α_t}(x_t) − Vx̃‖ ‖x_t − x̃‖ − tµ⟨F(x̃), x_t − x̃⟩
    ≤ (1 − tτ)(‖V_{α_t}(x_t) − V_{α_t}x̃‖ + ‖V_{α_t}x̃ − Vx̃‖)‖x_t − x̃‖ − tµ⟨F(x̃), x_t − x̃⟩
    ≤ (1 − tτ)(‖x_t − x̃‖ + γα_t‖x̃‖)‖x_t − x̃‖ − tµ⟨F(x̃), x_t − x̃⟩,

so we derive that

    ‖x_t − x̃‖² ≤ (µ/τ)⟨F(x̃), x̃ − x_t⟩ + (α_t/t)(γ/τ)‖x̃‖ ‖x_t − x̃‖.    (14)

Since {x_t} is bounded as t → 0, we see that if {t_n} is a sequence in (0, 1) such that t_n → 0 and x_{t_n} ⇀ x̄, then by (14), x_{t_n} → x̄. We may further assume that α_{t_n} → 0. Notice that Proj_C(I − γ∇f) is nonexpansive.
It turns out that

    ‖x_{t_n} − Proj_C(I − γ∇f)x_{t_n}‖
    ≤ ‖x_{t_n} − Proj_C(I − γ∇f_{α_{t_n}})x_{t_n}‖ + ‖Proj_C(I − γ∇f_{α_{t_n}})x_{t_n} − Proj_C(I − γ∇f)x_{t_n}‖
    ≤ ‖x_{t_n} − Proj_C(I − γ∇f_{α_{t_n}})x_{t_n}‖ + γα_{t_n}‖x_{t_n}‖.

From the boundedness of {x_t} and lim_{t→0} ‖Proj_C(I − γ∇f_{α_t})x_t − x_t‖ = 0, we conclude that

    lim_{n→∞} ‖x_{t_n} − Proj_C(I − γ∇f)x_{t_n}‖ = 0.

Since x_{t_n} ⇀ x̄, by Lemma 6 we obtain x̄ = Proj_C(I − γ∇f)x̄. This shows that x̄ ∈ S. We next prove that x̄ is a solution of the variational inequality (13). Since

    x_t = Proj_C y_t − y_t + (I − tµF)V_{α_t}(x_t),
we can derive that

    F(x_t) = (1/(tµ))(Proj_C y_t − y_t) + (1/(tµ))((I − tµF)V_{α_t}(x_t) − (I − tµF)(x_t)).

Note that, for x ∈ S, ⟨Proj_C y_t − y_t, Proj_C y_t − x⟩ ≤ 0. Therefore, for x ∈ S,

    ⟨F(x_t), x_t − x⟩ = (1/(tµ))⟨Proj_C y_t − y_t, x_t − x⟩ + (1/(tµ))⟨(I − tµF)V_{α_t}(x_t) − (I − tµF)(x_t), x_t − x⟩
    ≤ (1/(tµ))⟨(I − tµF)V_{α_t}(x_t) − (I − tµF)(x_t), x_t − x⟩
    = −(1/(tµ))⟨(I − tµF)(x_t) − (I − tµF)V_{α_t}(x_t), x_t − x⟩
    = −(1/(tµ))⟨(I − V_{α_t})(x_t), x_t − x⟩ + ⟨F(x_t) − FV_{α_t}(x_t), x_t − x⟩
    = −(1/(tµ))⟨(I − V_{α_t})(x_t) − (I − V_{α_t})x, x_t − x⟩ − (1/(tµ))⟨(I − V_{α_t})x, x_t − x⟩ + ⟨F(x_t) − FV_{α_t}(x_t), x_t − x⟩
    ≤ −(1/(tµ))⟨(I − V_{α_t})x, x_t − x⟩ + ⟨F(x_t) − FV_{α_t}(x_t), x_t − x⟩
    ≤ (1/(tµ))‖V_{α_t}x − Vx‖ ‖x_t − x‖ + ⟨F(x_t) − FV_{α_t}(x_t), x_t − x⟩
    ≤ (1/(tµ))γα_t‖x‖ ‖x_t − x‖ + ⟨F(x_t) − FV_{α_t}(x_t), x_t − x⟩,

where we used the fact that x = Vx for x ∈ S, and that I − Proj_C(I − γ∇f_{α_t}) is monotone (since Proj_C(I − γ∇f_{α_t}) is nonexpansive), i.e.

    ⟨(I − Proj_C(I − γ∇f_{α_t}))(x_t) − (I − Proj_C(I − γ∇f_{α_t}))(x), x_t − x⟩ ≥ 0.

Taking the limit through t = t_n → 0 (recall that α_t/t → 0 and ‖x_t − V_{α_t}x_t‖ → 0) ensures that x̄ is a solution to (13); that is to say,

    ⟨F(x̄), x − x̄⟩ ≥ 0,  ∀x ∈ S.

Hence x̄ = x* by uniqueness. Therefore x_t → x* as t → 0. The variational inequality (13) can be written as

    ⟨(I − F)x* − x*, x − x*⟩ ≤ 0,  ∀x ∈ S.

So, by Lemma 7, it is equivalent to the fixed point equation P_S(I − F)x* = x*.

Finally, we consider the following hybrid algorithm based on the regularization approach:

    y_n = (I − µθ_n F)Proj_C(I − γ∇f_{α_n})x_n,
    x_{n+1} = Proj_C y_n,  n ≥ 0,    (15)

where the initial guess x_0 is selected in C arbitrarily. Assume that the parameter γ satisfies the condition 0 < γ < 2/L and, in addition, that the following conditions are satisfied for {α_n} and {θ_n} ⊂ (0, 1):
(i) θ_n → 0;
(ii) lim_{n→∞} α_n/θ_n = 0;
(iii) Σ_{n=0}^∞ θ_n = ∞;
(iv) Σ_{n=0}^∞ |θ_{n+1} − θ_n| < ∞;
(v) Σ_{n=0}^∞ |α_{n+1} − α_n| < ∞.

Theorem 10 Assume that the minimization problem (1) is consistent and the gradient ∇f is (1/L)-ism. Let F : C → H be η-strongly monotone and κ-Lipschitzian. Fix a constant µ satisfying 0 < µ < 2η/κ² and a constant γ satisfying 0 < γ < 2/L. Let {x_n} be generated by the algorithm (15) with the sequences {θ_n} and {α_n} satisfying the above conditions.
Then the sequence {x_n} converges in norm to the point x* obtained in Theorem 9.

Proof: (1) The sequence {x_n} is bounded. Set V_{α_n} := Proj_C(I − γ∇f_{α_n}). First we see that

    ‖(I − µθ_n F)V_{α_n}x_n − (I − µθ_n F)V_{α_n}x_{n−1}‖²
    = ‖V_{α_n}x_n − V_{α_n}x_{n−1}‖² − 2µθ_n⟨V_{α_n}x_n − V_{α_n}x_{n−1}, FV_{α_n}x_n − FV_{α_n}x_{n−1}⟩ + µ²θ_n²‖FV_{α_n}x_n − FV_{α_n}x_{n−1}‖²
    ≤ ‖V_{α_n}x_n − V_{α_n}x_{n−1}‖² − 2µθ_nη‖V_{α_n}x_n − V_{α_n}x_{n−1}‖² + µ²θ_n²κ²‖V_{α_n}x_n − V_{α_n}x_{n−1}‖²
    = (1 − µθ_n(2η − µθ_nκ²))‖V_{α_n}x_n − V_{α_n}x_{n−1}‖²
    ≤ (1 − µθ_n(2η − µκ²))‖V_{α_n}x_n − V_{α_n}x_{n−1}‖²
    ≤ (1 − θ_nτ)²‖x_n − x_{n−1}‖².

Indeed, we have, for x̃ ∈ S,

    ‖x_{n+1} − x̃‖ = ‖Proj_C y_n − Proj_C x̃‖ ≤ ‖y_n − x̃‖
    = ‖−θ_nµF(x̃) + (I − µθ_n F)Proj_C(I − γ∇f_{α_n})x_n − (I − µθ_n F)Proj_C(I − γ∇f)x̃‖
    ≤ µθ_n‖F(x̃)‖ + (1 − θ_nτ)‖Proj_C(I − γ∇f_{α_n})x_n − Proj_C(I − γ∇f)x̃‖
    ≤ µθ_n‖F(x̃)‖ + (1 − θ_nτ)(‖Proj_C(I − γ∇f_{α_n})x_n − Proj_C(I − γ∇f_{α_n})x̃‖ + ‖Proj_C(I − γ∇f_{α_n})x̃ − Proj_C(I − γ∇f)x̃‖)
    ≤ θ_nµ‖F(x̃)‖ + (1 − θ_nτ)‖x_n − x̃‖ + γα_n‖x̃‖
    = (1 − θ_nτ)‖x_n − x̃‖ + θ_n(µ‖F(x̃)‖ + γ(α_n/θ_n)‖x̃‖)
    ≤ (1 − θ_nτ)‖x_n − x̃‖ + θ_n(µ‖F(x̃)‖ + γB‖x̃‖)
    ≤ max{‖x_n − x̃‖, (1/τ)(µ‖F(x̃)‖ + γB‖x̃‖)}.

By induction,

    ‖x_n − x̃‖ ≤ max{‖x_0 − x̃‖, (1/τ)(µ‖F(x̃)‖ + γB‖x̃‖)}.

In particular, {x_n} is bounded.

(2) We prove that ‖x_{n+1} − x_n‖ → 0 as n → ∞. Let M be a constant such that

    M > max{ sup_{n≥0} µ‖FV_{α_n}x_n‖, sup_{n≥0} γ‖x_n‖ }.

We compute

    ‖x_{n+1} − x_n‖ = ‖Proj_C y_n − Proj_C y_{n−1}‖ ≤ ‖y_n − y_{n−1}‖
    ≤ ‖(I − µθ_n F)V_{α_n}x_n − (I − µθ_n F)V_{α_n}x_{n−1}‖ + ‖(I − µθ_n F)V_{α_n}x_{n−1} − (I − µθ_{n−1} F)V_{α_{n−1}}x_{n−1}‖
    ≤ (1 − θ_nτ)‖x_n − x_{n−1}‖ + ‖V_{α_n}x_{n−1} − V_{α_{n−1}}x_{n−1}‖ + µ|θ_n − θ_{n−1}| ‖FV_{α_{n−1}}x_{n−1}‖
    ≤ (1 − θ_nτ)‖x_n − x_{n−1}‖ + M|θ_n − θ_{n−1}| + ‖V_{α_n}x_{n−1} − V_{α_{n−1}}x_{n−1}‖,

and

    ‖V_{α_n}x_{n−1} − V_{α_{n−1}}x_{n−1}‖ = ‖Proj_C(I − γ∇f_{α_n})x_{n−1} − Proj_C(I − γ∇f_{α_{n−1}})x_{n−1}‖
    ≤ ‖(I − γ∇f_{α_n})x_{n−1} − (I − γ∇f_{α_{n−1}})x_{n−1}‖
    = γ‖∇f_{α_n}(x_{n−1}) − ∇f_{α_{n−1}}(x_{n−1})‖ = γ|α_n − α_{n−1}| ‖x_{n−1}‖.    (16)

Combining the two estimates above, we obtain

    ‖x_{n+1} − x_n‖ ≤ (1 − τθ_n)‖x_n − x_{n−1}‖ + M(|θ_n − θ_{n−1}| + |α_n − α_{n−1}|).    (17)

Apply Lemma 5 to (17) to conclude that ‖x_{n+1} − x_n‖ → 0 as n → ∞.

(3) We prove that ω_w(x_n) ⊂ S. Let x̂ ∈ ω_w(x_n) and assume that x_{n_k} ⇀ x̂ for some subsequence {x_{n_k}} of {x_n}. We may further assume that α_{n_k} → 0. Set V := Proj_C(I − γ∇f). Notice that V is nonexpansive and Fix V = S.
It turns out that

    ‖x_{n_k} − Vx_{n_k}‖ ≤ ‖x_{n_k} − x_{n_k+1}‖ + ‖x_{n_k+1} − V_{α_{n_k}}x_{n_k}‖ + ‖V_{α_{n_k}}x_{n_k} − Vx_{n_k}‖
    = ‖x_{n_k} − x_{n_k+1}‖ + ‖Proj_C y_{n_k} − Proj_C(I − γ∇f_{α_{n_k}})x_{n_k}‖ + ‖Proj_C(I − γ∇f_{α_{n_k}})x_{n_k} − Proj_C(I − γ∇f)x_{n_k}‖
    ≤ ‖x_{n_k} − x_{n_k+1}‖ + θ_{n_k}µ‖FV_{α_{n_k}}x_{n_k}‖ + γα_{n_k}‖x_{n_k}‖
    ≤ ‖x_{n_k} − x_{n_k+1}‖ + M(θ_{n_k} + α_{n_k}) → 0 as k → ∞.

So Lemma 6 guarantees that ω_w(x_n) ⊂ Fix V = S.

(4) We prove that x_n → x* as n → ∞, where x* is the unique solution of the VI (13). First we observe that there is some x̂ ∈ ω_w(x_n) ⊂ S such that

    lim sup_{n→∞} ⟨−Fx*, x_n − x*⟩ = ⟨−Fx*, x̂ − x*⟩ ≤ 0.    (18)

We now compute, using the firm nonexpansiveness of Proj_C,

    ‖x_{n+1} − x*‖² = ‖Proj_C y_n − Proj_C x*‖² ≤ ⟨y_n − x*, x_{n+1} − x*⟩
    = ⟨(I − µθ_n F)V_{α_n}(x_n) − (I − µθ_n F)Vx*, x_{n+1} − x*⟩ − θ_nµ⟨Fx*, x_{n+1} − x*⟩
    ≤ (1 − θ_nτ)‖V_{α_n}(x_n) − Vx*‖ ‖x_{n+1} − x*‖ + θ_nµ⟨−Fx*, x_{n+1} − x*⟩
    ≤ (1 − θ_nτ)(‖x_n − x*‖ + γα_n‖x*‖)‖x_{n+1} − x*‖ + θ_nµ⟨−Fx*, x_{n+1} − x*⟩
    ≤ ((1 − θ_nτ)/2)(‖x_n − x*‖ + γα_n‖x*‖)² + (1/2)‖x_{n+1} − x*‖² + θ_nµ⟨−Fx*, x_{n+1} − x*⟩,

hence

    ‖x_{n+1} − x*‖² ≤ (1 − θ_nτ)(‖x_n − x*‖ + γα_n‖x*‖)² + 2θ_nµ⟨−Fx*, x_{n+1} − x*⟩
    ≤ (1 − θ_nτ)‖x_n − x*‖² + 2γα_n‖x*‖ ‖x_n − x*‖ + γ²α_n²‖x*‖² + 2θ_nµ⟨−Fx*, x_{n+1} − x*⟩
    = (1 − θ̄_n)‖x_n − x*‖² + θ̄_nχ_n,    (19)

where θ̄_n := θ_nτ and

    χ_n := (1/τ)[2γ(α_n/θ_n)‖x*‖ ‖x_n − x*‖ + γ²(α_n²/θ_n)‖x*‖² + 2µ⟨−Fx*, x_{n+1} − x*⟩].

Since {x_n} is bounded, α_n/θ_n → 0, ‖x_{n+1} − x_n‖ → 0, and (18) holds, we have Σ_{n=0}^∞ θ̄_n = ∞ and lim sup_{n→∞} χ_n ≤ 0. Applying Lemma 5 to the inequality (19), together with (18), we get ‖x_n − x*‖ → 0 as n → ∞.

4 Application

In this section, we give an application of Theorem 10 to the split feasibility problem (SFP, for short), which was introduced by Censor and Elfving [12]. Since its inception in 1994, the SFP has received much attention due to its applications in signal processing and image reconstruction, with particular progress in intensity-modulated radiation therapy. The SFP can mathematically be formulated as the problem of finding a point x* with the property

    x* ∈ C  and  Ax* ∈ Q,    (20)

where C and Q are nonempty closed convex subsets of Hilbert spaces H₁ and H₂, respectively, and A : H₁ → H₂ is a bounded linear operator. It is clear that x* is a solution to the split feasibility problem (20) if and only if x* ∈ C and Ax* − Proj_Q Ax* = 0.
We define the proximity function f by

    f(x) = (1/2)‖Ax − Proj_Q Ax‖²,

and consider the constrained convex minimization problem

    min_{x∈C} f(x) = min_{x∈C} (1/2)‖Ax − Proj_Q Ax‖².    (21)

Then x* solves the split feasibility problem (20) if and only if x* solves the minimization problem (21) with the minimum value equal to 0. Byrne [15] introduced the so-called CQ algorithm to solve the SFP:

    x_{n+1} = Proj_C(I − γA*(I − Proj_Q)A)x_n,  n ≥ 0,    (22)

where 0 < γ < 2/‖A‖². He obtained that the sequence {x_n} generated by (22) converges weakly to a solution of the SFP. In order to obtain an iterative sequence that converges strongly to a solution of the SFP, we propose the following algorithm:

    y_n = (I − µθ_n F)Proj_C(I − γ(A*(I − Proj_Q)A + α_n I))x_n,
    x_{n+1} = Proj_C y_n,  n ≥ 0,    (23)
where the initial guess is x_0 ∈ C and F : C → H₁ is η-strongly monotone and κ-Lipschitzian with constants κ > 0, η > 0 such that 0 < µ < 2η/κ². We can show that the sequence {x_n} generated by (23) converges strongly to a solution of the SFP (20) if the sequence {θ_n} ⊂ (0, 1) and the sequence {α_n} of parameters satisfy appropriate conditions. Applying Theorem 10, we obtain the following result.

Theorem 11 Assume that the split feasibility problem (20) is consistent. Let the sequence {x_n} be generated by (23), where the sequence {θ_n} ⊂ (0, 1) and the sequence {α_n} satisfy the conditions (i)-(v). Then the sequence {x_n} converges strongly to a solution of the split feasibility problem (20).

Proof: By the definition of the proximity function f, we have

    ∇f(x) = A*(I − Proj_Q)Ax.

Hence we obtain

    ‖∇f(x) − ∇f(y)‖² = ‖A*(I − Proj_Q)Ax − A*(I − Proj_Q)Ay‖²
    = ⟨A*(I − Proj_Q)Ax − A*(I − Proj_Q)Ay, A*(I − Proj_Q)Ax − A*(I − Proj_Q)Ay⟩
    = ⟨(I − Proj_Q)Ax − (I − Proj_Q)Ay, AA*(I − Proj_Q)Ax − AA*(I − Proj_Q)Ay⟩
    ≤ ‖AA*‖ ‖(I − Proj_Q)Ax − (I − Proj_Q)Ay‖²,

and

    ‖(I − Proj_Q)Ax − (I − Proj_Q)Ay‖² = ⟨Ax − Ay − (Proj_Q Ax − Proj_Q Ay), (I − Proj_Q)Ax − (I − Proj_Q)Ay⟩
    = ⟨x − y, A*(I − Proj_Q)Ax − A*(I − Proj_Q)Ay⟩ − ⟨Proj_Q Ax − Proj_Q Ay, (I − Proj_Q)Ax − (I − Proj_Q)Ay⟩,

where

    ⟨Proj_Q Ax − Proj_Q Ay, (I − Proj_Q)Ax − (I − Proj_Q)Ay⟩
    = ⟨Ax − Ay, Proj_Q Ax − Proj_Q Ay⟩ − ‖Proj_Q Ax − Proj_Q Ay‖²
    ≥ ‖Proj_Q Ax − Proj_Q Ay‖² − ‖Proj_Q Ax − Proj_Q Ay‖² = 0,

since Proj_Q is firmly nonexpansive. Therefore,

    ‖∇f(x) − ∇f(y)‖² ≤ ‖AA*‖ ⟨x − y, A*(I − Proj_Q)Ax − A*(I − Proj_Q)Ay⟩,

that is,

    ⟨x − y, ∇f(x) − ∇f(y)⟩ ≥ (1/‖A‖²) ‖∇f(x) − ∇f(y)‖².

Hence ∇f is (1/‖A‖²)-ism. Set f_{α_n}(x) = f(x) + (α_n/2)‖x‖². Consequently,

    ∇f_{α_n}(x) = ∇f(x) + α_n x = A*(I − Proj_Q)Ax + α_n x.

Then the iterative scheme (23) is equivalent to

    y_n = (I − µθ_n F)Proj_C(I − γ∇f_{α_n})x_n,
    x_{n+1} = Proj_C y_n,  n ≥ 0,    (24)

where the initial guess is x_0 ∈ C and the parameters {θ_n} ⊂ (0, 1) and {α_n} satisfy the above conditions (i)-(v).
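When C = H₁ (so Proj_C = I) and Q = {b} (so Proj_Q y = b and ∇f(x) = Aᵀ(Ax − b)), the scheme above reduces to a damped, regularized Landweber-type iteration for the linear system Ax = b. A minimal sketch under hypothetical data; the 2 × 2 matrix A, the vector b, and all parameter choices here are our own, purely for illustration:

```python
import numpy as np

# SFP with C = R^2 and Q = {b}: the solution set is {x : Ax = b}.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([4.0, 7.0])
x_star = np.linalg.solve(A, b)     # reference solution of Ax = b

norm_A_sq = np.linalg.norm(A, 2) ** 2
gamma = 1.0 / norm_A_sq            # step size in (0, 2/||A||^2)
mu = 1.0                           # F = I, so kappa = eta = 1 and mu in (0, 2)

x = np.zeros(2)
for n in range(5000):
    theta_n = 1.0 / (n + 1)
    alpha_n = 1.0 / (n + 1) ** 2
    grad = A.T @ (A @ x - b) + alpha_n * x   # grad f_{alpha_n}(x)
    v = x - gamma * grad                     # Proj_C is the identity on C = R^2
    x = v - mu * theta_n * v                 # (I - mu*theta_n*F) v, then Proj_C = I

print(x)  # tends to the solution of Ax = b as n grows
```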
Due to Theorem 10, we have the conclusion immediately.

5 Numerical Result

In this section, we consider the following simple example to illustrate the effectiveness, realization, and convergence of the algorithm in Theorem 11.

Example In the setting of Section 4, we assume that H₁ = H₂ = ℝ³. Take F = I, with Lipschitz constant κ = 1 and strong monotonicity constant η = 1. Take the parameters θ_n = 1/(n + 1) and α_n = 1/(n + 1)² for every n ≥ 0. Fix µ = γ = 1. In the split feasibility problem (SFP), we take a 3 × 3 matrix A
and a vector b, with initial guess x_0 = 0. The SFP can be formulated as the problem of finding a point x* with the property x* ∈ C and Ax* ∈ Q, where C = H₁ and Q = {b} ⊂ H₂. This implies that x* is the solution of the system of linear equations Ax = b, namely x* = (1, 5, 3)^T. Then, by Theorem 11, the sequence {x_n} is generated by

    x_{n+1} = (I − (1/(n + 1))I) P_C(x_n − A^T A x_n + A^T b − (1/(n + 1)²) x_n).

As n → ∞, we have x_n → x* = (1, 5, 3)^T.

Table 1 reports the error Error(n) against the iteration number n for x_0 = (0, 1, 2)^T, and Table 2 lists the iterate points x_n, which approach (1, 5, 3)^T as n grows. From the computer programming point of view, the algorithms in this paper are easy to implement.

6 Conclusion

Methods for solving constrained convex minimization problems have been studied extensively in Hilbert spaces. To the best of our knowledge, however, this paper is probably the first in the literature to use the idea of regularization to establish this kind of hybrid method for finding a minimizer of a constrained convex minimization problem and to prove the corresponding strong convergence theorems. Finally, we apply this algorithm to the split feasibility problem.

Acknowledgements: The authors thank the referees for their helpful comments, which notably improved the presentation of this manuscript. This work was supported by the Fundamental Research Funds for the Central Universities (No. 303K004).

References:

[1] E. S. Levitin and B. T. Polyak, Constrained minimization methods, Zh. Vychisl. Mat. Mat. Fiz., 6, 1966.
[2] P. Kumam, A new hybrid iterative method for solution of equilibrium problems and fixed point problems for an inverse strongly monotone operator and a nonexpansive mapping, J. Appl. Math. Comput., 29, 2009.
[3] Y. Yao and H.-K. Xu, Iterative methods for finding minimum-norm fixed points of nonexpansive mappings with applications, Optimization, 60, 2011.
[4] S. He and W.
Sun, New hybrid steepest descent algorithms for variational inequalities over the common fixed points set of infinite nonexpansive mappings, WSEAS Transactions on Mathematics, 11(12), 2012.
[5] M. Su and H.-K. Xu, Remarks on the gradient-projection algorithm, J. Nonlinear Anal. Optim., 1, 2010.
[6] R.-X. Ni, Strong convergence of a hybrid projection algorithm for approximation of a common element of three sets in Banach spaces, WSEAS Transactions on Mathematics, 2013.
[7] P. Kumam, A hybrid approximation method for equilibrium and fixed point problems for a monotone mapping and a nonexpansive mapping, Nonlinear Anal.: Hybrid Syst., 2(4), 2008.
[8] P.-H. Calamai and J.-J. Moré, Projected gradient methods for linearly constrained problems, Math. Program., 39, 1987.
[9] F. Wang and H.-K. Xu, Approximating curve and strong convergence of the CQ algorithm for the split feasibility problem, J. Inequal. Appl., 2010.
[10] J.-S. Jung, Strong convergence of composite iterative methods for equilibrium problems and fixed point problems, Appl. Math. Comput., 213, 2009.
[11] J.-S. Jung, Strong convergence of iterative methods for k-strictly pseudo-contractive mappings in Hilbert spaces, Appl. Math. Comput., 215, 2010.
[12] Y. Censor and T. Elfving, A multiprojection algorithm using Bregman projections in a product space, Numer. Algorithms, 8, 1994, pp. 221-239.
[13] H.-K. Xu, A variable Krasnosel'skii-Mann algorithm and the multiple-set split feasibility problem, Inverse Probl., 22, 2006.
[14] H.-K. Xu, Iterative methods for the split feasibility problem in infinite-dimensional Hilbert spaces, Inverse Probl., 26, 2010.
[15] C. Byrne, A unified treatment of some iterative algorithms in signal processing and image reconstruction, Inverse Probl., 20, 2004.
[16] S.-N. He and J. Guo, Algorithms for finding the minimum norm solution of hierarchical fixed point problems, WSEAS Transactions on Mathematics, 12, 2013.
[17] G. Lopez, V. Martin and H.-K. Xu, Perturbation techniques for nonexpansive mappings with applications, Nonlinear Anal.: Real World Appl., 10, 2009.
[18] H.-K. Xu, Iterative methods for the split feasibility problem in infinite-dimensional Hilbert spaces, Inverse Problems, 26, 2010.
[19] I. Yamada, The hybrid steepest descent method for the variational inequality problem over the intersection of fixed point sets of nonexpansive mappings, in: D. Butnariu, Y. Censor, S. Reich (Eds.), Inherently Parallel Algorithms in Feasibility and Optimization and Their Applications, Elsevier, New York, 2001.
[20] H.-K. Xu, Averaged mappings and the gradient-projection algorithm, J. Optim. Theory Appl., 150, 2011.
[21] C. Martinez-Yanes and H.-K. Xu, Strong convergence of the CQ method for fixed-point iteration processes, Nonlinear Anal., 64, 2006.
[22] P.-L. Combettes, Solving monotone inclusions via compositions of nonexpansive averaged operators, Optimization, 53, 2004.
[23] H. Brezis, Opérateurs Maximaux Monotones et Semi-Groupes de Contractions dans les Espaces de Hilbert, North-Holland, Amsterdam, 1973.
[24] D.-P. Bertsekas and E.-M.
Gafni, Projection methods for variational inequalities with application to the traffic assignment problem, Math. Program. Stud., 17, 1982.
[25] D. Han and H.-K. Lo, Solving non-additive traffic assignment problems: a descent method for co-coercive variational inequalities, Eur. J. Oper. Res., 159, 2004.
[26] H.-K. Xu, Iterative algorithms for nonlinear operators, J. Lond. Math. Soc., 66, 2002.
[27] K. Goebel and W.-A. Kirk, Topics in Metric Fixed Point Theory, Cambridge Stud. Adv. Math., vol. 28, Cambridge Univ. Press, 1990.
Int. Journal of Math. Analysis, Vol. 8, 2014, no. 15, 727-745 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ijma.2014.212287 Synchronal Algorithm For a Countable Family of Strict Pseudocontractions
More informationAlgorithms for finding minimum norm solution of equilibrium and fixed point problems for nonexpansive semigroups in Hilbert spaces
Available online at www.tjnsa.com J. Nonlinear Sci. Appl. 9 (6), 37 378 Research Article Algorithms for finding minimum norm solution of equilibrium and fixed point problems for nonexpansive semigroups
More informationIterative common solutions of fixed point and variational inequality problems
Available online at www.tjnsa.com J. Nonlinear Sci. Appl. 9 (2016), 1882 1890 Research Article Iterative common solutions of fixed point and variational inequality problems Yunpeng Zhang a, Qing Yuan b,
More informationSome unified algorithms for finding minimum norm fixed point of nonexpansive semigroups in Hilbert spaces
An. Şt. Univ. Ovidius Constanţa Vol. 19(1), 211, 331 346 Some unified algorithms for finding minimum norm fixed point of nonexpansive semigroups in Hilbert spaces Yonghong Yao, Yeong-Cheng Liou Abstract
More informationGENERAL NONCONVEX SPLIT VARIATIONAL INEQUALITY PROBLEMS. Jong Kyu Kim, Salahuddin, and Won Hee Lim
Korean J. Math. 25 (2017), No. 4, pp. 469 481 https://doi.org/10.11568/kjm.2017.25.4.469 GENERAL NONCONVEX SPLIT VARIATIONAL INEQUALITY PROBLEMS Jong Kyu Kim, Salahuddin, and Won Hee Lim Abstract. In this
More informationThe Split Common Fixed Point Problem for Asymptotically Quasi-Nonexpansive Mappings in the Intermediate Sense
International Mathematical Forum, Vol. 8, 2013, no. 25, 1233-1241 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/imf.2013.3599 The Split Common Fixed Point Problem for Asymptotically Quasi-Nonexpansive
More informationExtensions of the CQ Algorithm for the Split Feasibility and Split Equality Problems
Extensions of the CQ Algorithm for the Split Feasibility Split Equality Problems Charles L. Byrne Abdellatif Moudafi September 2, 2013 Abstract The convex feasibility problem (CFP) is to find a member
More informationOn the split equality common fixed point problem for quasi-nonexpansive multi-valued mappings in Banach spaces
Available online at www.tjnsa.com J. Nonlinear Sci. Appl. 9 (06), 5536 5543 Research Article On the split equality common fixed point problem for quasi-nonexpansive multi-valued mappings in Banach spaces
More informationSTRONG CONVERGENCE THEOREMS BY A HYBRID STEEPEST DESCENT METHOD FOR COUNTABLE NONEXPANSIVE MAPPINGS IN HILBERT SPACES
Scientiae Mathematicae Japonicae Online, e-2008, 557 570 557 STRONG CONVERGENCE THEOREMS BY A HYBRID STEEPEST DESCENT METHOD FOR COUNTABLE NONEXPANSIVE MAPPINGS IN HILBERT SPACES SHIGERU IEMOTO AND WATARU
More informationThe Generalized Viscosity Implicit Rules of Asymptotically Nonexpansive Mappings in Hilbert Spaces
Applied Mathematical Sciences, Vol. 11, 2017, no. 12, 549-560 HIKARI Ltd, www.m-hikari.com https://doi.org/10.12988/ams.2017.718 The Generalized Viscosity Implicit Rules of Asymptotically Nonexpansive
More informationMulti-step hybrid steepest-descent methods for split feasibility problems with hierarchical variational inequality problem constraints
Available online at www.tjnsa.com J. Nonlinear Sci. Appl. 9 (2016), 4148 4166 Research Article Multi-step hybrid steepest-descent methods for split feasibility problems with hierarchical variational inequality
More informationThai Journal of Mathematics Volume 14 (2016) Number 1 : ISSN
Thai Journal of Mathematics Volume 14 (2016) Number 1 : 53 67 http://thaijmath.in.cmu.ac.th ISSN 1686-0209 A New General Iterative Methods for Solving the Equilibrium Problems, Variational Inequality Problems
More informationOn an iterative algorithm for variational inequalities in. Banach space
MATHEMATICAL COMMUNICATIONS 95 Math. Commun. 16(2011), 95 104. On an iterative algorithm for variational inequalities in Banach spaces Yonghong Yao 1, Muhammad Aslam Noor 2,, Khalida Inayat Noor 3 and
More informationWEAK AND STRONG CONVERGENCE OF AN ITERATIVE METHOD FOR NONEXPANSIVE MAPPINGS IN HILBERT SPACES
Applicable Analysis and Discrete Mathematics available online at http://pemath.et.b.ac.yu Appl. Anal. Discrete Math. 2 (2008), 197 204. doi:10.2298/aadm0802197m WEAK AND STRONG CONVERGENCE OF AN ITERATIVE
More informationCONVERGENCE THEOREMS FOR STRICTLY ASYMPTOTICALLY PSEUDOCONTRACTIVE MAPPINGS IN HILBERT SPACES. Gurucharan Singh Saluja
Opuscula Mathematica Vol 30 No 4 2010 http://dxdoiorg/107494/opmath2010304485 CONVERGENCE THEOREMS FOR STRICTLY ASYMPTOTICALLY PSEUDOCONTRACTIVE MAPPINGS IN HILBERT SPACES Gurucharan Singh Saluja Abstract
More informationStrong Convergence Theorem by a Hybrid Extragradient-like Approximation Method for Variational Inequalities and Fixed Point Problems
Strong Convergence Theorem by a Hybrid Extragradient-like Approximation Method for Variational Inequalities and Fixed Point Problems Lu-Chuan Ceng 1, Nicolas Hadjisavvas 2 and Ngai-Ching Wong 3 Abstract.
More informationStrong Convergence of Two Iterative Algorithms for a Countable Family of Nonexpansive Mappings in Hilbert Spaces
International Mathematical Forum, 5, 2010, no. 44, 2165-2172 Strong Convergence of Two Iterative Algorithms for a Countable Family of Nonexpansive Mappings in Hilbert Spaces Jintana Joomwong Division of
More informationSTRONG CONVERGENCE OF AN ITERATIVE METHOD FOR VARIATIONAL INEQUALITY PROBLEMS AND FIXED POINT PROBLEMS
ARCHIVUM MATHEMATICUM (BRNO) Tomus 45 (2009), 147 158 STRONG CONVERGENCE OF AN ITERATIVE METHOD FOR VARIATIONAL INEQUALITY PROBLEMS AND FIXED POINT PROBLEMS Xiaolong Qin 1, Shin Min Kang 1, Yongfu Su 2,
More informationOn nonexpansive and accretive operators in Banach spaces
Available online at www.isr-publications.com/jnsa J. Nonlinear Sci. Appl., 10 (2017), 3437 3446 Research Article Journal Homepage: www.tjnsa.com - www.isr-publications.com/jnsa On nonexpansive and accretive
More informationResearch Article An Iterative Algorithm for the Split Equality and Multiple-Sets Split Equality Problem
Abstract and Applied Analysis, Article ID 60813, 5 pages http://dx.doi.org/10.1155/014/60813 Research Article An Iterative Algorithm for the Split Equality and Multiple-Sets Split Equality Problem Luoyi
More informationViscosity Iterative Approximating the Common Fixed Points of Non-expansive Semigroups in Banach Spaces
Viscosity Iterative Approximating the Common Fixed Points of Non-expansive Semigroups in Banach Spaces YUAN-HENG WANG Zhejiang Normal University Department of Mathematics Yingbing Road 688, 321004 Jinhua
More informationResearch Article Algorithms for a System of General Variational Inequalities in Banach Spaces
Journal of Applied Mathematics Volume 2012, Article ID 580158, 18 pages doi:10.1155/2012/580158 Research Article Algorithms for a System of General Variational Inequalities in Banach Spaces Jin-Hua Zhu,
More informationINERTIAL ACCELERATED ALGORITHMS FOR SOLVING SPLIT FEASIBILITY PROBLEMS. Yazheng Dang. Jie Sun. Honglei Xu
Manuscript submitted to AIMS Journals Volume X, Number 0X, XX 200X doi:10.3934/xx.xx.xx.xx pp. X XX INERTIAL ACCELERATED ALGORITHMS FOR SOLVING SPLIT FEASIBILITY PROBLEMS Yazheng Dang School of Management
More informationThe Journal of Nonlinear Science and Applications
J. Nonlinear Sci. Appl. 2 (2009), no. 2, 78 91 The Journal of Nonlinear Science and Applications http://www.tjnsa.com STRONG CONVERGENCE THEOREMS FOR EQUILIBRIUM PROBLEMS AND FIXED POINT PROBLEMS OF STRICT
More informationShih-sen Chang, Yeol Je Cho, and Haiyun Zhou
J. Korean Math. Soc. 38 (2001), No. 6, pp. 1245 1260 DEMI-CLOSED PRINCIPLE AND WEAK CONVERGENCE PROBLEMS FOR ASYMPTOTICALLY NONEXPANSIVE MAPPINGS Shih-sen Chang, Yeol Je Cho, and Haiyun Zhou Abstract.
More informationHybrid Steepest Descent Method for Variational Inequality Problem over Fixed Point Sets of Certain Quasi-Nonexpansive Mappings
Hybrid Steepest Descent Method for Variational Inequality Problem over Fixed Point Sets of Certain Quasi-Nonexpansive Mappings Isao Yamada Tokyo Institute of Technology VIC2004 @ Wellington, Feb. 13, 2004
More informationSTRONG CONVERGENCE OF AN IMPLICIT ITERATION PROCESS FOR ASYMPTOTICALLY NONEXPANSIVE IN THE INTERMEDIATE SENSE MAPPINGS IN BANACH SPACES
Kragujevac Journal of Mathematics Volume 36 Number 2 (2012), Pages 237 249. STRONG CONVERGENCE OF AN IMPLICIT ITERATION PROCESS FOR ASYMPTOTICALLY NONEXPANSIVE IN THE INTERMEDIATE SENSE MAPPINGS IN BANACH
More informationResearch Article Self-Adaptive and Relaxed Self-Adaptive Projection Methods for Solving the Multiple-Set Split Feasibility Problem
Abstract and Applied Analysis Volume 01, Article ID 958040, 11 pages doi:10.1155/01/958040 Research Article Self-Adaptive and Relaxed Self-Adaptive Proection Methods for Solving the Multiple-Set Split
More informationResearch Article Cyclic Iterative Method for Strictly Pseudononspreading in Hilbert Space
Journal of Applied Mathematics Volume 2012, Article ID 435676, 15 pages doi:10.1155/2012/435676 Research Article Cyclic Iterative Method for Strictly Pseudononspreading in Hilbert Space Bin-Chao Deng,
More informationCONVERGENCE OF HYBRID FIXED POINT FOR A PAIR OF NONLINEAR MAPPINGS IN BANACH SPACES
International Journal of Analysis and Applications ISSN 2291-8639 Volume 8, Number 1 2015), 69-78 http://www.etamaths.com CONVERGENCE OF HYBRID FIXED POINT FOR A PAIR OF NONLINEAR MAPPINGS IN BANACH SPACES
More informationON A HYBRID PROXIMAL POINT ALGORITHM IN BANACH SPACES
U.P.B. Sci. Bull., Series A, Vol. 80, Iss. 3, 2018 ISSN 1223-7027 ON A HYBRID PROXIMAL POINT ALGORITHM IN BANACH SPACES Vahid Dadashi 1 In this paper, we introduce a hybrid projection algorithm for a countable
More informationWEAK CONVERGENCE THEOREMS FOR EQUILIBRIUM PROBLEMS WITH NONLINEAR OPERATORS IN HILBERT SPACES
Fixed Point Theory, 12(2011), No. 2, 309-320 http://www.math.ubbcluj.ro/ nodeacj/sfptcj.html WEAK CONVERGENCE THEOREMS FOR EQUILIBRIUM PROBLEMS WITH NONLINEAR OPERATORS IN HILBERT SPACES S. DHOMPONGSA,
More informationThe viscosity technique for the implicit midpoint rule of nonexpansive mappings in
Xu et al. Fixed Point Theory and Applications 05) 05:4 DOI 0.86/s3663-05-08-9 R E S E A R C H Open Access The viscosity technique for the implicit midpoint rule of nonexpansive mappings in Hilbert spaces
More informationExistence and convergence theorems for the split quasi variational inequality problems on proximally smooth sets
Available online at www.tjnsa.com J. Nonlinear Sci. Appl. 9 (206), 2364 2375 Research Article Existence and convergence theorems for the split quasi variational inequality problems on proximally smooth
More informationThe Strong Convergence Theorems for the Split Equality Fixed Point Problems in Hilbert Spaces
Applied Mathematical Sciences, Vol. 2, 208, no. 6, 255-270 HIKARI Ltd, www.m-hikari.com https://doi.org/0.2988/ams.208.86 The Strong Convergence Theorems for the Split Equality Fixed Point Problems in
More informationMonotone variational inequalities, generalized equilibrium problems and fixed point methods
Wang Fixed Point Theory and Applications 2014, 2014:236 R E S E A R C H Open Access Monotone variational inequalities, generalized equilibrium problems and fixed point methods Shenghua Wang * * Correspondence:
More informationAn Algorithm for Solving Triple Hierarchical Pseudomonotone Variational Inequalities
, March 15-17, 2017, Hong Kong An Algorithm for Solving Triple Hierarchical Pseudomonotone Variational Inequalities Yung-Yih Lur, Lu-Chuan Ceng and Ching-Feng Wen Abstract In this paper, we introduce and
More informationWEAK CONVERGENCE OF RESOLVENTS OF MAXIMAL MONOTONE OPERATORS AND MOSCO CONVERGENCE
Fixed Point Theory, Volume 6, No. 1, 2005, 59-69 http://www.math.ubbcluj.ro/ nodeacj/sfptcj.htm WEAK CONVERGENCE OF RESOLVENTS OF MAXIMAL MONOTONE OPERATORS AND MOSCO CONVERGENCE YASUNORI KIMURA Department
More informationConvergence Theorems of Approximate Proximal Point Algorithm for Zeroes of Maximal Monotone Operators in Hilbert Spaces 1
Int. Journal of Math. Analysis, Vol. 1, 2007, no. 4, 175-186 Convergence Theorems of Approximate Proximal Point Algorithm for Zeroes of Maximal Monotone Operators in Hilbert Spaces 1 Haiyun Zhou Institute
More informationStrong Convergence Theorem of Split Equality Fixed Point for Nonexpansive Mappings in Banach Spaces
Applied Mathematical Sciences, Vol. 12, 2018, no. 16, 739-758 HIKARI Ltd www.m-hikari.com https://doi.org/10.12988/ams.2018.8580 Strong Convergence Theorem of Split Euality Fixed Point for Nonexpansive
More informationStrong convergence theorem for amenable semigroups of nonexpansive mappings and variational inequalities
RESEARCH Open Access Strong convergence theorem for amenable semigroups of nonexpansive mappings and variational inequalities Hossein Piri * and Ali Haji Badali * Correspondence: h. piri@bonabetu.ac.ir
More informationVariational inequalities for fixed point problems of quasi-nonexpansive operators 1. Rafał Zalas 2
University of Zielona Góra Faculty of Mathematics, Computer Science and Econometrics Summary of the Ph.D. thesis Variational inequalities for fixed point problems of quasi-nonexpansive operators 1 by Rafał
More informationConvergence of Fixed-Point Iterations
Convergence of Fixed-Point Iterations Instructor: Wotao Yin (UCLA Math) July 2016 1 / 30 Why study fixed-point iterations? Abstract many existing algorithms in optimization, numerical linear algebra, and
More informationCONVERGENCE OF THE STEEPEST DESCENT METHOD FOR ACCRETIVE OPERATORS
PROCEEDINGS OF THE AMERICAN MATHEMATICAL SOCIETY Volume 127, Number 12, Pages 3677 3683 S 0002-9939(99)04975-8 Article electronically published on May 11, 1999 CONVERGENCE OF THE STEEPEST DESCENT METHOD
More informationPROXIMAL POINT ALGORITHMS INVOLVING FIXED POINT OF NONSPREADING-TYPE MULTIVALUED MAPPINGS IN HILBERT SPACES
PROXIMAL POINT ALGORITHMS INVOLVING FIXED POINT OF NONSPREADING-TYPE MULTIVALUED MAPPINGS IN HILBERT SPACES Shih-sen Chang 1, Ding Ping Wu 2, Lin Wang 3,, Gang Wang 3 1 Center for General Educatin, China
More informationA double projection method for solving variational inequalities without monotonicity
A double projection method for solving variational inequalities without monotonicity Minglu Ye Yiran He Accepted by Computational Optimization and Applications, DOI: 10.1007/s10589-014-9659-7,Apr 05, 2014
More informationCommon fixed points of two generalized asymptotically quasi-nonexpansive mappings
An. Ştiinţ. Univ. Al. I. Cuza Iaşi. Mat. (N.S.) Tomul LXIII, 2017, f. 2 Common fixed points of two generalized asymptotically quasi-nonexpansive mappings Safeer Hussain Khan Isa Yildirim Received: 5.VIII.2013
More informationHAIYUN ZHOU, RAVI P. AGARWAL, YEOL JE CHO, AND YONG SOO KIM
Georgian Mathematical Journal Volume 9 (2002), Number 3, 591 600 NONEXPANSIVE MAPPINGS AND ITERATIVE METHODS IN UNIFORMLY CONVEX BANACH SPACES HAIYUN ZHOU, RAVI P. AGARWAL, YEOL JE CHO, AND YONG SOO KIM
More informationA generalized forward-backward method for solving split equality quasi inclusion problems in Banach spaces
Available online at www.isr-publications.com/jnsa J. Nonlinear Sci. Appl., 10 (2017), 4890 4900 Research Article Journal Homepage: www.tjnsa.com - www.isr-publications.com/jnsa A generalized forward-backward
More informationA GENERALIZATION OF THE REGULARIZATION PROXIMAL POINT METHOD
A GENERALIZATION OF THE REGULARIZATION PROXIMAL POINT METHOD OGANEDITSE A. BOIKANYO AND GHEORGHE MOROŞANU Abstract. This paper deals with the generalized regularization proximal point method which was
More informationConvergence to Common Fixed Point for Two Asymptotically Quasi-nonexpansive Mappings in the Intermediate Sense in Banach Spaces
Mathematica Moravica Vol. 19-1 2015, 33 48 Convergence to Common Fixed Point for Two Asymptotically Quasi-nonexpansive Mappings in the Intermediate Sense in Banach Spaces Gurucharan Singh Saluja Abstract.
More informationTwo-Step Iteration Scheme for Nonexpansive Mappings in Banach Space
Mathematica Moravica Vol. 19-1 (2015), 95 105 Two-Step Iteration Scheme for Nonexpansive Mappings in Banach Space M.R. Yadav Abstract. In this paper, we introduce a new two-step iteration process to approximate
More informationAcceleration of the Halpern algorithm to search for a fixed point of a nonexpansive mapping
Sakurai and Iiduka Fixed Point Theory and Applications 2014, 2014:202 R E S E A R C H Open Access Acceleration of the Halpern algorithm to search for a fixed point of a nonexpansive mapping Kaito Sakurai
More informationConvergence theorems for a finite family. of nonspreading and nonexpansive. multivalued mappings and equilibrium. problems with application
Theoretical Mathematics & Applications, vol.3, no.3, 2013, 49-61 ISSN: 1792-9687 (print), 1792-9709 (online) Scienpress Ltd, 2013 Convergence theorems for a finite family of nonspreading and nonexpansive
More informationAcceleration Method for Convex Optimization over the Fixed Point Set of a Nonexpansive Mapping
Noname manuscript No. will be inserted by the editor) Acceleration Method for Convex Optimization over the Fixed Point Set of a Nonexpansive Mapping Hideaki Iiduka Received: date / Accepted: date Abstract
More informationIterative Method for a Finite Family of Nonexpansive Mappings in Hilbert Spaces with Applications
Applied Mathematical Sciences, Vol. 7, 2013, no. 3, 103-126 Iterative Method for a Finite Family of Nonexpansive Mappings in Hilbert Spaces with Applications S. Imnang Department of Mathematics, Faculty
More informationSplitting Techniques in the Face of Huge Problem Sizes: Block-Coordinate and Block-Iterative Approaches
Splitting Techniques in the Face of Huge Problem Sizes: Block-Coordinate and Block-Iterative Approaches Patrick L. Combettes joint work with J.-C. Pesquet) Laboratoire Jacques-Louis Lions Faculté de Mathématiques
More informationSplit hierarchical variational inequality problems and fixed point problems for nonexpansive mappings
Ansari et al. Journal of Inequalities and Applications (015) 015:74 DOI 10.1186/s13660-015-0793- R E S E A R C H Open Access Split hierarchical variational inequality problems and fixed point problems
More informationTHROUGHOUT this paper, we let C be a nonempty
Strong Convergence Theorems of Multivalued Nonexpansive Mappings and Maximal Monotone Operators in Banach Spaces Kriengsak Wattanawitoon, Uamporn Witthayarat and Poom Kumam Abstract In this paper, we prove
More informationConvergence Theorems for Bregman Strongly Nonexpansive Mappings in Reflexive Banach Spaces
Filomat 28:7 (2014), 1525 1536 DOI 10.2298/FIL1407525Z Published by Faculty of Sciences and Mathematics, University of Niš, Serbia Available at: http://www.pmf.ni.ac.rs/filomat Convergence Theorems for
More informationSplit equality problem with equilibrium problem, variational inequality problem, and fixed point problem of nonexpansive semigroups
Available online at www.isr-publications.com/jnsa J. Nonlinear Sci. Appl., 10 (2017), 3217 3230 Research Article Journal Homepage: www.tjnsa.com - www.isr-publications.com/jnsa Split equality problem with
More informationExistence of Solutions to Split Variational Inequality Problems and Split Minimization Problems in Banach Spaces
Existence of Solutions to Split Variational Inequality Problems and Split Minimization Problems in Banach Spaces Jinlu Li Department of Mathematical Sciences Shawnee State University Portsmouth, Ohio 45662
More informationExistence and Approximation of Fixed Points of. Bregman Nonexpansive Operators. Banach Spaces
Existence and Approximation of Fixed Points of in Reflexive Banach Spaces Department of Mathematics The Technion Israel Institute of Technology Haifa 22.07.2010 Joint work with Prof. Simeon Reich General
More informationResearch Article Hybrid Algorithm of Fixed Point for Weak Relatively Nonexpansive Multivalued Mappings and Applications
Abstract and Applied Analysis Volume 2012, Article ID 479438, 13 pages doi:10.1155/2012/479438 Research Article Hybrid Algorithm of Fixed Point for Weak Relatively Nonexpansive Multivalued Mappings and
More informationWeak and strong convergence theorems of modified SP-iterations for generalized asymptotically quasi-nonexpansive mappings
Mathematica Moravica Vol. 20:1 (2016), 125 144 Weak and strong convergence theorems of modified SP-iterations for generalized asymptotically quasi-nonexpansive mappings G.S. Saluja Abstract. The aim of
More informationSteepest descent approximations in Banach space 1
General Mathematics Vol. 16, No. 3 (2008), 133 143 Steepest descent approximations in Banach space 1 Arif Rafiq, Ana Maria Acu, Mugur Acu Abstract Let E be a real Banach space and let A : E E be a Lipschitzian
More informationA Viscosity Method for Solving a General System of Finite Variational Inequalities for Finite Accretive Operators
A Viscosity Method for Solving a General System of Finite Variational Inequalities for Finite Accretive Operators Phayap Katchang, Somyot Plubtieng and Poom Kumam Member, IAENG Abstract In this paper,
More informationarxiv: v1 [math.oc] 20 Sep 2016
General Viscosity Implicit Midpoint Rule For Nonexpansive Mapping Shuja Haider Rizvi Department of Mathematics, Babu Banarasi Das University, Lucknow 68, India arxiv:169.616v1 [math.oc] Sep 16 Abstract:
More informationResearch Article Strong Convergence Theorems for Zeros of Bounded Maximal Monotone Nonlinear Operators
Abstract and Applied Analysis Volume 2012, Article ID 681348, 19 pages doi:10.1155/2012/681348 Research Article Strong Convergence Theorems for Zeros of Bounded Maximal Monotone Nonlinear Operators C.
More informationSTRONG CONVERGENCE RESULTS FOR NEARLY WEAK UNIFORMLY L-LIPSCHITZIAN MAPPINGS
BULLETIN OF THE INTERNATIONAL MATHEMATICAL VIRTUAL INSTITUTE ISSN (p) 2303-4874, ISSN (o) 2303-4955 www.imvibl.org / JOURNALS / BULLETIN Vol. 6(2016), 199-208 Former BULLETIN OF THE SOCIETY OF MATHEMATICIANS
More informationITERATIVE SCHEMES FOR APPROXIMATING SOLUTIONS OF ACCRETIVE OPERATORS IN BANACH SPACES SHOJI KAMIMURA AND WATARU TAKAHASHI. Received December 14, 1999
Scientiae Mathematicae Vol. 3, No. 1(2000), 107 115 107 ITERATIVE SCHEMES FOR APPROXIMATING SOLUTIONS OF ACCRETIVE OPERATORS IN BANACH SPACES SHOJI KAMIMURA AND WATARU TAKAHASHI Received December 14, 1999
More informationThe fuzzy over-relaxed proximal point iterative scheme for generalized variational inclusion with fuzzy mappings
Available at http://pvamu.edu/aam Appl. Appl. Math. ISSN: 1932-9466 Vol. 10, Issue 2 (December 2015), pp. 805 816 Applications and Applied Mathematics: An International Journal (AAM) The fuzzy over-relaxed
More informationStrong Convergence Theorems for Nonself I-Asymptotically Quasi-Nonexpansive Mappings 1
Applied Mathematical Sciences, Vol. 2, 2008, no. 19, 919-928 Strong Convergence Theorems for Nonself I-Asymptotically Quasi-Nonexpansive Mappings 1 Si-Sheng Yao Department of Mathematics, Kunming Teachers
More informationConvergence Rates in Regularization for Nonlinear Ill-Posed Equations Involving m-accretive Mappings in Banach Spaces
Applied Mathematical Sciences, Vol. 6, 212, no. 63, 319-3117 Convergence Rates in Regularization for Nonlinear Ill-Posed Equations Involving m-accretive Mappings in Banach Spaces Nguyen Buong Vietnamese
More informationAn iterative method for fixed point problems and variational inequality problems
Mathematical Communications 12(2007), 121-132 121 An iterative method for fixed point problems and variational inequality problems Muhammad Aslam Noor, Yonghong Yao, Rudong Chen and Yeong-Cheng Liou Abstract.
More informationOn The Convergence Of Modified Noor Iteration For Nearly Lipschitzian Maps In Real Banach Spaces
CJMS. 2(2)(2013), 95-104 Caspian Journal of Mathematical Sciences (CJMS) University of Mazandaran, Iran http://cjms.journals.umz.ac.ir ISSN: 1735-0611 On The Convergence Of Modified Noor Iteration For
More informationAPPROXIMATE SOLUTIONS TO VARIATIONAL INEQUALITY OVER THE FIXED POINT SET OF A STRONGLY NONEXPANSIVE MAPPING
APPROXIMATE SOLUTIONS TO VARIATIONAL INEQUALITY OVER THE FIXED POINT SET OF A STRONGLY NONEXPANSIVE MAPPING SHIGERU IEMOTO, KAZUHIRO HISHINUMA, AND HIDEAKI IIDUKA Abstract. Variational inequality problems
More informationResearch Article Generalized Mann Iterations for Approximating Fixed Points of a Family of Hemicontractions
Hindawi Publishing Corporation Fixed Point Theory and Applications Volume 008, Article ID 84607, 9 pages doi:10.1155/008/84607 Research Article Generalized Mann Iterations for Approximating Fixed Points
More informationSplit equality monotone variational inclusions and fixed point problem of set-valued operator
Acta Univ. Sapientiae, Mathematica, 9, 1 (2017) 94 121 DOI: 10.1515/ausm-2017-0007 Split equality monotone variational inclusions and fixed point problem of set-valued operator Mohammad Eslamian Department
More informationViscosity approximation method for m-accretive mapping and variational inequality in Banach space
An. Şt. Univ. Ovidius Constanţa Vol. 17(1), 2009, 91 104 Viscosity approximation method for m-accretive mapping and variational inequality in Banach space Zhenhua He 1, Deifei Zhang 1, Feng Gu 2 Abstract
More informationBREGMAN DISTANCES, TOTALLY
BREGMAN DISTANCES, TOTALLY CONVEX FUNCTIONS AND A METHOD FOR SOLVING OPERATOR EQUATIONS IN BANACH SPACES DAN BUTNARIU AND ELENA RESMERITA January 18, 2005 Abstract The aim of this paper is twofold. First,
More informationA convergence result for an Outer Approximation Scheme
A convergence result for an Outer Approximation Scheme R. S. Burachik Engenharia de Sistemas e Computação, COPPE-UFRJ, CP 68511, Rio de Janeiro, RJ, CEP 21941-972, Brazil regi@cos.ufrj.br J. O. Lopes Departamento
More informationCONVERGENCE THEOREMS FOR TOTAL ASYMPTOTICALLY NONEXPANSIVE MAPPINGS
An. Şt. Univ. Ovidius Constanţa Vol. 18(1), 2010, 163 180 CONVERGENCE THEOREMS FOR TOTAL ASYMPTOTICALLY NONEXPANSIVE MAPPINGS Yan Hao Abstract In this paper, a demiclosed principle for total asymptotically
More informationCONVERGENCE OF APPROXIMATING FIXED POINTS FOR MULTIVALUED NONSELF-MAPPINGS IN BANACH SPACES. Jong Soo Jung. 1. Introduction
Korean J. Math. 16 (2008), No. 2, pp. 215 231 CONVERGENCE OF APPROXIMATING FIXED POINTS FOR MULTIVALUED NONSELF-MAPPINGS IN BANACH SPACES Jong Soo Jung Abstract. Let E be a uniformly convex Banach space
More informationHybrid extragradient viscosity method for general system of variational inequalities
Ceng et al. Journal of Inequalities and Applications (05) 05:50 DOI 0.86/s3660-05-0646-z R E S E A R C H Open Access Hybrid extragradient viscosity method for general system of variational inequalities
More informationRegularization Inertial Proximal Point Algorithm for Convex Feasibility Problems in Banach Spaces
Int. Journal of Math. Analysis, Vol. 3, 2009, no. 12, 549-561 Regularization Inertial Proximal Point Algorithm for Convex Feasibility Problems in Banach Spaces Nguyen Buong Vietnamse Academy of Science
More informationA Solution Method for Semidefinite Variational Inequality with Coupled Constraints
Communications in Mathematics and Applications Volume 4 (2013), Number 1, pp. 39 48 RGN Publications http://www.rgnpublications.com A Solution Method for Semidefinite Variational Inequality with Coupled
More informationPARALLEL SUBGRADIENT METHOD FOR NONSMOOTH CONVEX OPTIMIZATION WITH A SIMPLE CONSTRAINT
Linear and Nonlinear Analysis Volume 1, Number 1, 2015, 1 PARALLEL SUBGRADIENT METHOD FOR NONSMOOTH CONVEX OPTIMIZATION WITH A SIMPLE CONSTRAINT KAZUHIRO HISHINUMA AND HIDEAKI IIDUKA Abstract. In this
More informationResearch Article On an Iterative Method for Finding a Zero to the Sum of Two Maximal Monotone Operators
Applied Mathematics, Article ID 414031, 5 pages http://dx.doi.org/10.1155/2014/414031 Research Article On an Iterative Method for Finding a Zero to the Sum of Two Maximal Monotone Operators Hongwei Jiao
More information