Alternating minimization algorithms with friction (Algorithmes de minimisation alternée avec frottement): H. Attouch

Alternating minimization algorithms with friction: applications to P.D.E., inverse problems and dynamical games.

H. Attouch
Institut de Mathématiques et de Modélisation de Montpellier, UMR CNRS 5149
programme ANR décisionprox (UMII-Paris 6)

Seminar of the Mathematics Laboratory of the Université de Saint-Etienne (LaMUSE), May 10, 2007, Université de Saint-Etienne.

Contents:
1. Classical alternating minimization algorithms
   1.1 Von Neumann theorem.
   1.2 Acker-Prestel theorem.
   1.3 Application to the inverse Cauchy problem.
   1.4 Passty theorem.
2. Link with decision sciences and dynamical games.
   2.1 Dynamical decision with costs to change.
   2.2 Proximal algorithms in decision sciences.
   2.3 Alternating proximal algorithms and Nash potential games.
3. Convergence results for proximal dynamics
   3.1 The convex case.
   3.2 The analytic and subanalytic cases.
   3.3 Subdifferential calculus for nonsmooth subanalytic functions.
4. Alternating proximal minimization: the convex case.
   4.1 Strong coupling. Attouch-Redont-Soubeyran (SIAM J. Opt.)
   4.2 Weak coupling.
   4.3 Splitting algorithms for P.D.E.
5. Alternating proximal minimization: the subanalytic case. Attouch-Bolte (Math. Programming 2007)

1 Alternating minimization algorithms. Some classical results.

1.1 Von Neumann theorem (1933), Annals of Math. (1950)

H a Hilbert space, C, D closed affine subspaces of H, C ∩ D ≠ ∅.
x_0 ∈ H, x_1 = proj_C x_0, x_2 = proj_D x_1, ...
Then x_k → proj_{C∩D} x_0 strongly in H as k → +∞.

[Figure: alternating projections x_0, x_1, x_2, ... between C and D; lim x_{2k} = lim x_{2k+1} = proj_{C∩D} x_0.]

Extension: Halperin (1962), N subspaces C_1, ..., C_N:
x_0 ∈ H, x_1 = proj_{C_1} x_0, x_2 = proj_{C_2} x_1, ..., x_N = proj_{C_N} x_{N-1}, x_{N+1} = proj_{C_1} x_N, ...

Solving linear systems numerically (Kaczmarz, 1937).
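
As a quick numerical illustration of the von Neumann scheme (not part of the slides), here is a minimal Python sketch for two planes in R^3, the projectors being computed from spanning vectors; the specific subspaces are illustrative choices:

```python
import numpy as np

def proj_span(V):
    """Orthogonal projector onto the span of the columns of V."""
    Q, _ = np.linalg.qr(V)
    return Q @ Q.T

# Two subspaces of R^3 given by spanning vectors (illustrative choice).
P_C = proj_span(np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]))   # plane z = 0
P_D = proj_span(np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]]))   # another plane

x = np.array([1.0, 2.0, 3.0])            # x_0
for k in range(200):
    x = P_C @ x                           # x_{2k+1} = proj_C x_{2k}
    x = P_D @ x                           # x_{2k+2} = proj_D x_{2k+1}

# For subspaces the limit is proj_{C ∩ D} x_0 (here the x-axis, so (1, 0, 0)).
print(x)
```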

1.2 From linear to convex analysis

Theorem (Acker and Prestel, Ann. Toulouse, 1980)
H a Hilbert space, f, g : H → R ∪ {+∞} convex, lower semicontinuous, proper.
(x_k, y_k) → (x_{k+1}, y_k) → (x_{k+1}, y_{k+1}):
    x_{k+1} = argmin{ f(ξ) + (1/2λ) ||ξ − y_k||² : ξ ∈ H }, then
    y_{k+1} = argmin{ g(η) + (1/2λ) ||x_{k+1} − η||² : η ∈ H }.
Then (x_k, y_k) ⇀ (x_∞, y_∞) weakly in H × H as k → +∞, with
    (x_∞, y_∞) ∈ argmin{ f(ξ) + g(η) + (1/2λ) ||ξ − η||² : ξ ∈ H, η ∈ H }.

Algorithm = alternating minimization of the bivariate function
    L(ξ, η) = f(ξ) + g(η) + (1/2λ) ||ξ − η||².

Application to inverse problems (Cheney and Goldstein 59; Gubin, Polyak and Raik 67):
f = δ_C, g = δ_D, with C, D closed convex sets in H:
    x_k ⇀ x_∞ ∈ C, y_k ⇀ y_∞ ∈ D weakly in H, with ||x_∞ − y_∞|| = min{ ||x − y|| : x ∈ C, y ∈ D }.

[Figure: alternating projections between two convex sets C and D, converging to a pair of points at minimal distance.]
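
A rough sketch of the Acker-Prestel alternating scheme, assuming f and g are simple quadratics so that both proximal steps have closed forms (the functions, the parameter λ and the helper names are illustrative, not from the slides):

```python
import numpy as np

lam = 1.0
a, b = np.array([0.0, 0.0]), np.array([4.0, 2.0])

# f(xi) = 0.5*||xi - a||^2 and g(eta) = 0.5*||eta - b||^2 (illustrative choices).
def prox_f(y, lam):
    # argmin_xi f(xi) + (1/(2*lam))*||xi - y||^2, closed form for this f
    return (lam * a + y) / (1.0 + lam)

def prox_g(x, lam):
    return (lam * b + x) / (1.0 + lam)

x, y = np.zeros(2), np.zeros(2)
for k in range(500):
    x = prox_f(y, lam)        # x_{k+1} = argmin f(xi) + (1/2λ)||xi - y_k||^2
    y = prox_g(x, lam)        # y_{k+1} = argmin g(eta) + (1/2λ)||x_{k+1} - eta||^2

# (x, y) approximates a minimizer of f(xi) + g(eta) + (1/2λ)||xi - eta||^2,
# here ((2a + b)/3, (a + 2b)/3).
print(x, y)
```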

1.3 Application to the inverse Cauchy problem

Cimetière, Delvare, Jaoua, Pons, Inverse Problems (2001).

[Figure: domain Ω with boundary split into Γ_i and Γ_d.]

    Δu = 0 on Ω,
    u = u_0 and ∂u/∂n = u_1 on Γ_d,
    u unknown on Γ_i.

Lacking data on Γ_i, superabundant measured data on Γ_d.
Ill-posed inverse problem: the data may be incompatible (measurements, noise).
Hadamard example of instability: Ω = (0, 1) × (0, 1); the harmonic functions u_n(x, y) = (1/n) sin(nx) ch(ny) have Cauchy data on Γ_d tending to (u_0, u_1) = (0, 0), while u_n itself blows up as n → +∞.

Alternating projection method in H = H¹(Ω), with D, N closed affine subspaces of H = H¹(Ω):
    D = { v ∈ H¹(Ω) : Δv = 0 on Ω, v = u_0 on Γ_d }
    N = { v ∈ H¹(Ω) : Δv = 0 on Ω, ∂v/∂n = u_1 on Γ_d }

1.4 Passty theorem

Theorem (Passty, J. Math. Anal. Appl., 1979)
H a Hilbert space, f, g : H → R ∪ {+∞} convex, lower semicontinuous, proper, with ∂(f + g) = ∂f + ∂g. (x_0, y_0) ∈ H × H given.
(x_k, y_k) → (x_{k+1}, y_k) → (x_{k+1}, y_{k+1}):
    x_{k+1} = argmin{ f(ξ) + (1/2λ_k) ||ξ − y_k||² : ξ ∈ H }, then
    y_{k+1} = argmin{ g(η) + (1/2λ_k) ||x_{k+1} − η||² : η ∈ H },
with λ_k ≥ 0, Σ λ_k² < +∞ and Σ λ_k = +∞.
Define s_k = Σ_{p=1}^{k} λ_p, X_k = (1/s_k) Σ_{p=1}^{k} λ_p x_p, Y_k = (1/s_k) Σ_{p=1}^{k} λ_p y_p.
Then X_k ⇀ z and Y_k ⇀ z weakly in H as k → +∞, with z ∈ argmin{ f(ξ) + g(ξ) : ξ ∈ H }.

Remarks on Passty's theorem:
1. Valid for A, B maximal monotone operators with A + B maximal monotone. Extension of the Brezis-Lions theorem (Israel J. Math., 1978).
2. Link with semigroups of contractions (Trotter-Lie-Kato formula).

Questions:
1. Still valid without ∂(f + g) = ∂f + ∂g? Relaxation phenomena: Lehdili and Lemaire (1999), weak sum (Lapidus), variational sum (Attouch-Baillon-Théra, JCA, 1994), extended sum (Revalski-Théra).
2. Weak ergodic convergence ⇒ weak convergence? (Bruck, JFA, 1975).
3. Is Σ λ_k = +∞ enough? (Güler, SIAM J. Control Opt., 1991).
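
The same two quadratics can be used to sketch Passty's scheme, now with vanishing step sizes λ_k = 1/(k+1) (so that Σ λ_k² < +∞ and Σ λ_k = +∞) and the ergodic averages X_k, Y_k; again an illustrative setup, not from the slides:

```python
import numpy as np

a, b = np.array([0.0, 0.0]), np.array([4.0, 2.0])

def prox_f(y, lam):  # prox of f(xi) = 0.5*||xi - a||^2
    return (lam * a + y) / (1.0 + lam)

def prox_g(x, lam):  # prox of g(eta) = 0.5*||eta - b||^2
    return (lam * b + x) / (1.0 + lam)

x, y = np.zeros(2), np.zeros(2)
s, X, Y = 0.0, np.zeros(2), np.zeros(2)
for k in range(5000):
    lam = 1.0 / (k + 1)              # sum lam_k = +inf, sum lam_k^2 < +inf
    x = prox_f(y, lam)
    y = prox_g(x, lam)
    s += lam                          # s_k = running sum of step sizes
    X += lam * x                      # weighted sums of the iterates
    Y += lam * y

X, Y = X / s, Y / s                   # ergodic averages X_k, Y_k
# Both averages tend (slowly, as is typical of ergodic convergence) to a
# minimizer of f + g, here (a + b)/2 = (2, 1).
print(X, Y)
```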

2 Link with decision sciences and dynamical games

2.1 Dynamical decision with costs to change

Goal-oriented human cognition (motivated agent): Vroom (1964), Locke and Latham (1990).
Agent = problem solver. Solving a problem = reducing (gradually) unsatisfied needs: x_0 → ... → x_k → x_{k+1} → ... → goal.
Satisficing, bounded rationality (Carnegie school): H. Simon, Nobel prize in economics (1978), artificial intelligence.
Landscape theory: the environment (landscape) is largely unknown; exploration aspects: Levinthal and Warglien (1999); learning aspects: Sobel (2000), Berthoz (2003).

Worthwhile-to-move principle: satisficing with not too much sacrificing. Attouch and Soubeyran (2005, J. Math. Psy.)
    X: state, performance, strategy space,
    g : X → R gain function,
    c : X × X → R_+ cost to change,
    g(x_{k+1}) − g(x_k) ≥ θ_k c(x_{k+1}, x_k)   (marginal gain ≥ cost to change).
Local feature of the dynamics: c(x_k, x_{k+1}) ≤ (1/θ_k) (sup_X g − g(x_0)).

2.2 Proximal algorithms in decision sciences

Worthwhile-to-move principle + optimization ⟹ proximal dynamics.
The vision the agent has of his environment and of his gain function depends on his current position (local aspects). Difficulty to change, inertia.
Anchoring effect: at stage k, his gain function is given by ξ ↦ g(ξ) − c(ξ, x_k).

Proximal dynamics: x_k → x_{k+1},
    x_{k+1} ∈ argmax{ g(ξ) − c(ξ, x_k) : ξ ∈ X }.
Take ξ = x_k, sum with respect to k, and use the finite resource assumption sup_X g < +∞:
    g(x_{k+1}) − g(x_k) ≥ c(x_{k+1}, x_k),
    Σ_k c(x_{k+1}, x_k) ≤ sup_X g − g(x_0) < +∞,
    c(x_{k+1}, x_k) → 0 as k → +∞.

Classical proximal dynamics: H Hilbert, f = −g, c(ξ, x) = ||ξ − x||², λ_k > 0:
    x_{k+1} ∈ argmin{ f(ξ) + (1/2λ_k) ||ξ − x_k||² : ξ ∈ H }.

General proximal dynamics:
    x_{k+1} ∈ ε_k-argmax{ g(ξ) − θ_k c(ξ, x_k) : ξ ∈ E(x_k, r_k) },
where ε_k: psychological features (motivation, degree of resolution); θ_k: cognitive features (speed, learning, reactivity); E(x_k, r_k): exploration set.

2.3 Alternating algorithms and Nash potential games

Two agents (players):
    f : X → R payoff of player 1, individual behaviour,
    g : Y → R payoff of player 2, individual behaviour,
    c : X × Y → R joint payoff of the two players, coupling term, with either attractive features (team games) or repulsive features (congestion games).

Static normal form of the potential game (Monderer-Shapley, 1996):
    F(ξ, η) = f(ξ) + β c(ξ, η),
    G(ξ, η) = g(ξ) + µ c(ξ, η).

Dynamical game: inertial non-autonomous Nash equilibration process.
    h : X × X → R_+ cost to move in X for player 1,
    k : Y × Y → R_+ cost to move in Y for player 2,
    non-autonomous aspects: α_k ≥ 0, ν_k ≥ 0.
    x_{k+1} ∈ argmin{ f(ξ) + β c(ξ, y_k) + α_k h(x_k, ξ) : ξ ∈ X }, then
    y_{k+1} ∈ argmin{ g(η) + µ c(x_{k+1}, η) + ν_k k(y_k, η) : η ∈ Y }.

Nash equilibrium: (x, y) is a Nash equilibrium of the normal form game if
    x ∈ argmin{ f(ξ) + β c(ξ, y) : ξ ∈ X },
    y ∈ argmin{ g(η) + µ c(x, η) : η ∈ Y }.

3 Convergence of proximal dynamics

3.1 The convex case

Theorem (Martinet 70, Rockafellar 76, Güler 91)
H a Hilbert space, f : H → R ∪ {+∞} convex, lower semicontinuous, inf_H f > −∞,
λ_k ≥ 0 with Σ λ_k = +∞,
    x_{k+1} = argmin{ f(ξ) + (1/2λ_k) ||ξ − x_k||² : ξ ∈ H }.
Then,
a) (x_k)_{k∈N} is a minimizing sequence for f.
b) If argmin f ≠ ∅, then x_k ⇀ x_∞ weakly in H as k → +∞, with x_∞ ∈ argmin f.

Remarks:
1. Proximal algorithm = implicit discretization of the generalized steepest descent differential inclusion (x(t_k) = x_k):
    0 ∈ ẋ(t) + ∂f(x(t))   discretized as   0 ∈ (1/λ_k)(x(t_{k+1}) − x(t_k)) + ∂f(x(t_{k+1})),
    with λ_k = t_{k+1} − t_k, and Σ λ_k = +∞ ⟺ t_k → +∞.
Hence, asymptotic behaviour of proximal algorithms ≈ asymptotic behaviour of the steepest descent (Bruck theorem, Opial lemma).
2. Weak convergence of (x_k)_{k∈N}, and not strong convergence: see Hundal's counterexample (Nonlinear Analysis, 2004).
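
Remark 1 can be checked on a one-dimensional example: for f(x) = x²/2 the proximal step has the closed form x/(1 + λ), which is exactly the implicit Euler step for ẋ = −f'(x). A minimal sketch (the choices of f, λ and x_0 are illustrative):

```python
import numpy as np

# f(x) = 0.5*x^2, so x_{k+1} = argmin f(xi) + (1/(2*lam))*(xi - x_k)^2 solves
# (x_{k+1} - x_k)/lam + f'(x_{k+1}) = 0, i.e. implicit Euler for xdot = -f'(x).
def prox_step(x, lam):
    return x / (1.0 + lam)

x_prox, x_exact, t = 5.0, 5.0, 0.0
lam = 0.5
for k in range(20):
    x_prox = prox_step(x_prox, lam)   # proximal / implicit Euler iterate
    t += lam                          # t_{k+1} = t_k + lam_k
    x_exact = 5.0 * np.exp(-t)        # exact steepest-descent flow x(t) = x_0 e^{-t}

print(x_prox, x_exact)                # both decay to the minimizer x = 0
```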

3.2 The analytic and subanalytic case (Attouch and Bolte, Math. Programming, 2007)

f : R^n → R ∪ {+∞} lower semicontinuous, inf_{R^n} f > −∞. The restriction of f to its domain is a continuous function. f satisfies the Łojasiewicz property:
(L) For any limiting-critical point x̂, that is 0 ∈ ∂f(x̂), there exist C, ε > 0 and θ ∈ [0, 1) such that
    |f(x) − f(x̂)|^θ ≤ C ||x*||   for all x ∈ B(x̂, ε) and x* ∈ ∂f(x).

Theorem (convergence). Let (x_k)_{k∈N} be a bounded sequence generated by the proximal algorithm. Then
    Σ_{k=0}^{+∞} ||x_{k+1} − x_k|| < +∞,
and the whole sequence (x_k) converges to some critical point of f.

The rate of convergence is intimately related to the Łojasiewicz exponent, which measures some flatness (curvature) property of f.

Theorem (rate of convergence). Let x_k → x_∞ be a convergent sequence generated by the proximal algorithm, and let θ denote a Łojasiewicz exponent of x_∞ ∈ crit f. Then,
(i) if θ = 0, the sequence (x_k) converges in a finite number of steps;
(ii) if θ ∈ (0, 1/2], there exist c > 0 and Q ∈ [0, 1) such that ||x_k − x_∞|| ≤ c Q^k;
(iii) if θ ∈ (1/2, 1), there exists c > 0 such that ||x_k − x_∞|| ≤ c k^{−(1−θ)/(2θ−1)}.
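
For intuition (this example is added, not from the slides), the Łojasiewicz exponent and the corresponding rate can be computed explicitly for one-dimensional power functions:

```latex
% Quick check: the Lojasiewicz exponent of f(x) = |x|^p, p > 1, at its critical
% point \hat x = 0. Since f'(x) = p\,|x|^{p-1}\mathrm{sign}(x),
\[
  |f(x) - f(\hat x)|^{\theta} = |x|^{p\theta}
  \;\le\; \tfrac{1}{p}\,|f'(x)| = |x|^{p-1}
  \qquad\text{holds with } \theta = \tfrac{p-1}{p},\ C = \tfrac{1}{p}.
\]
% Thus p = 2 gives \theta = 1/2: case (ii), linear (geometric) convergence of the
% proximal iterates. For p > 2, \theta = (p-1)/p \in (1/2, 1): case (iii), with the
% sublinear rate k^{-(1-\theta)/(2\theta-1)} = k^{-1/(p-2)}.
```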

3.3 Subdifferential calculus for nonsmooth subanalytic functions

Definition. Let f : R^n → R ∪ {+∞} be a proper lower semicontinuous function.
For each x ∈ dom f, the Fréchet subdifferential of f at x, written ∂̂f(x), is the set of vectors x* ∈ R^n which satisfy
    liminf_{y → x, y ≠ x} (1/||x − y||) [ f(y) − f(x) − ⟨x*, y − x⟩ ] ≥ 0.
If x ∉ dom f, then ∂̂f(x) = ∅.
The limiting subdifferential of f at x ∈ R^n, written ∂f(x), is defined as follows:
    ∂f(x) := { x* ∈ R^n : ∃ x_n → x, f(x_n) → f(x), x_n* ∈ ∂̂f(x_n), x_n* → x* }.

Remarks: The above definition implies that ∂̂f(x) ⊂ ∂f(x), where the first set is convex while the second one is closed. Clearly, a necessary condition for x ∈ R^n to be a minimizer of f is 0 ∈ ∂f(x). Unless f is convex, this is not a sufficient condition. A point x ∈ R^n that satisfies 0 ∈ ∂f(x) is called limiting-critical, or simply critical. The set of critical points of f is denoted by crit f.
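
A standard one-dimensional example, added here for illustration, makes the gap between the two subdifferentials and the role of limiting-critical points concrete:

```latex
% Take f(x) = -|x| on R. Then
\[
  \hat\partial f(0) = \emptyset,
  \qquad
  \partial f(0) = \{-1, +1\}
  \quad\bigl(\text{limits of } \hat\partial f(x_n) = \{\mp 1\} \text{ along } x_n \to 0^{\pm}\bigr),
\]
% so 0 \notin \partial f(0): the origin is not limiting-critical, consistent with f
% having a maximum (not a minimum) there. By contrast, for the smooth function
% f(x) = -x^2 one has \partial f(0) = \{f'(0)\} = \{0\}, so 0 is critical although it
% is a maximizer: 0 \in \partial f(x) is necessary but not sufficient for a minimum
% when f is nonconvex.
```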

On the class of nonsmooth real-valued functions involving analytic features

1. Real-analytic functions have the Łojasiewicz property, see Łojasiewicz (Éditions du CNRS, Paris, 1963).
2. If f is subanalytic and continuous on its domain, with dom f closed in R^n (in particular if f is continuous and subanalytic), then it has the Łojasiewicz property, see Bolte-Daniilidis-Lewis (JMAA, 2005), Kurdyka-Parusinski (CRAS, 1994).
3. An interesting class of functions satisfying the Łojasiewicz property is given by semialgebraic functions. These are functions whose graphs can be expressed as
    ⋃_{i=1}^{p} ⋂_{j=1}^{q} { x ∈ R^n : P_ij(x) = 0, Q_ij(x) < 0 },
where for all 1 ≤ i ≤ p, 1 ≤ j ≤ q the P_ij, Q_ij : R^n → R are polynomial functions. Due to the Tarski-Seidenberg principle, which asserts that the linear projection of a set of the above type remains of this type, semialgebraic objects enjoy remarkable stability properties.
4. Convex functions satisfying the following growth condition near the infimal set do satisfy the Łojasiewicz property: for x̂ ∈ argmin f there exist C ≥ 0, r ≥ 1, ε > 0 such that
    f(x) ≥ f(x̂) + C d(x, argmin f)^r   for all x ∈ B(x̂, ε).
5. Infinite-dimensional versions of the Łojasiewicz property have been developed in view of the asymptotic analysis of dissipative evolution equations, see Simon (Ann. Math., 1983) and Haraux.
6. Kurdyka has recently established a Łojasiewicz-like inequality for functions definable in an arbitrary o-minimal structure (Ann. Inst. Fourier, 1998).

4 Alternating proximal minimization: the convex case

4.1 Strong coupling

Setting: X = Y = H Hilbert space.
f : H → R ∪ {+∞} convex, lower semicontinuous, proper.
g : H → R ∪ {+∞} convex, lower semicontinuous, proper.
α_k ≥ 0, ν_k ≥ 0, with Σ_{k=0}^{+∞} |α_{k+1} − α_k| < +∞ and Σ_{k=0}^{+∞} |ν_{k+1} − ν_k| < +∞.

Algorithm:
    x_{k+1} = argmin{ f(ξ) + (β/2) ||ξ − y_k||² + (α_k/2) ||ξ − x_k||² : ξ ∈ H }, then
    y_{k+1} = argmin{ g(η) + (µ/2) ||η − x_{k+1}||² + (ν_k/2) ||η − y_k||² : η ∈ H }.

Alternating minimization with anchoring (inertia) of
    L_{β,µ}(ξ, η) = µ f(ξ) + β g(η) + (βµ/2) ||ξ − η||².

Equilibria: solutions of the minimization problem
    min{ µ f(ξ) + β g(η) + (βµ/2) ||ξ − η||² : ξ ∈ H, η ∈ H }.

Theorem (Attouch-Redont-Soubeyran, 2007). Assume S = argmin L_{β,µ} ≠ ∅. Then,
a) (x_k, y_k)_{k∈N} is a minimizing sequence for L_{β,µ};
b) x_k ⇀ x_∞ and y_k ⇀ y_∞ weakly in H, with (x_∞, y_∞) ∈ S = argmin L_{β,µ};
c) x_{k+1} − x_k → 0 and y_{k+1} − y_k → 0 strongly in H, and x_k − y_k → x_∞ − y_∞ strongly in H as k → +∞.
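
A rough numerical sketch of the strongly coupled scheme with β = µ = 1, constant anchoring parameters and two illustrative quadratics, so that both argmin steps are explicit (all of these choices are assumptions, not taken from the slides):

```python
import numpy as np

beta, mu = 1.0, 1.0
a, b = np.array([0.0, 0.0]), np.array([4.0, 2.0])
alpha_k, nu_k = 0.1, 0.1          # constant anchoring (costs-to-move) parameters

# f(xi) = 0.5*||xi - a||^2 and g(eta) = 0.5*||eta - b||^2 (illustrative choices).
x, y = np.zeros(2), np.zeros(2)
for k in range(1000):
    # x_{k+1} = argmin f(xi) + (beta/2)||xi - y_k||^2 + (alpha_k/2)||xi - x_k||^2
    x = (a + beta * y + alpha_k * x) / (1.0 + beta + alpha_k)
    # y_{k+1} = argmin g(eta) + (mu/2)||eta - x_{k+1}||^2 + (nu_k/2)||eta - y_k||^2
    y = (b + mu * x + nu_k * y) / (1.0 + mu + nu_k)

# (x, y) approximates a minimizer of mu*f(xi) + beta*g(eta) + (beta*mu/2)||xi - eta||^2.
print(x, y)
```

With these quadratics the fixed point of the two updates is ((2a + b)/3, (a + 2b)/3), which is exactly the minimizer of L_{1,1}, so the iterates can be checked against the optimality conditions directly.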

Alternating projection with anchoring versus classical version.

[Figure: trajectories of the anchored alternating projection method (u_k, v_k) compared with the classical alternating projection trajectories (x_k, y_k) between two closed convex sets C and D.]

4.2 Weak coupling

Setting: X, Y, Z Hilbert spaces. The coupling between X and Y is realized via Z (trace, interface, ...).
f : X → R ∪ {+∞} convex, lower semicontinuous, proper.
g : Y → R ∪ {+∞} convex, lower semicontinuous, proper.
A : X → Z and B : Y → Z linear continuous operators.

Algorithm: alternating minimization with anchoring (inertia) of
    L_λ(ξ, η) = f(ξ) + g(η) + (1/2λ) ||A(ξ) − B(η)||².
    x_{k+1} = argmin{ f(ξ) + (1/2λ) ||A(ξ) − B(y_k)||² + (α/2) ||ξ − x_k||² : ξ ∈ X }, then
    y_{k+1} = argmin{ g(η) + (1/2λ) ||B(η) − A(x_{k+1})||² + (ν/2) ||η − y_k||² : η ∈ Y }.

Equilibria: solutions of the minimization problem
    min{ f(ξ) + g(η) + (1/2λ) ||A(ξ) − B(η)||² : ξ ∈ X, η ∈ Y }.

Theorem (Attouch-Redont-Soubeyran, 2007). Assume S = argmin L_λ ≠ ∅. Then,
a) (x_k, y_k)_{k∈N} is a minimizing sequence for L_λ;
b) x_k ⇀ x_∞ weakly in X and y_k ⇀ y_∞ weakly in Y, with (x_∞, y_∞) ∈ S = argmin L_λ;
c) x_{k+1} − x_k → 0 strongly in X, y_{k+1} − y_k → 0 strongly in Y, and A(x_k) − B(y_k) → A(x_∞) − B(y_∞) strongly in Z as k → +∞.
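
The weakly coupled version can be sketched the same way, now with explicit matrices A and B and a small linear solve per step (again an illustrative setup; the operators, data and parameter values are placeholders, not from the slides):

```python
import numpy as np

n, m, p = 3, 2, 2
rng = np.random.default_rng(0)
A = rng.standard_normal((p, n))        # A : X -> Z
B = rng.standard_normal((p, m))        # B : Y -> Z
a, b = rng.standard_normal(n), rng.standard_normal(m)
lam, alpha, nu = 1.0, 0.1, 0.1

# f(xi) = 0.5*||xi - a||^2 and g(eta) = 0.5*||eta - b||^2 (illustrative choices).
x, y = np.zeros(n), np.zeros(m)
I_n, I_m = np.eye(n), np.eye(m)
for k in range(2000):
    # x_{k+1} = argmin f(xi) + (1/2λ)||A xi - B y_k||^2 + (α/2)||xi - x_k||^2
    x = np.linalg.solve((1 + alpha) * I_n + A.T @ A / lam,
                        a + A.T @ (B @ y) / lam + alpha * x)
    # y_{k+1} = argmin g(eta) + (1/2λ)||B eta - A x_{k+1}||^2 + (ν/2)||eta - y_k||^2
    y = np.linalg.solve((1 + nu) * I_m + B.T @ B / lam,
                        b + B.T @ (A @ x) / lam + nu * y)

# (x, y) approximates a minimizer of f(xi) + g(eta) + (1/2λ)||A xi - B eta||^2.
print(x, y, np.linalg.norm(A @ x - B @ y))
```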

4.3 Decomposition and splitting for P.D.E.

Variational problem:
    min{ f(ξ) + g(η) + (1/2λ) ||A(ξ) − B(η)||² : ξ ∈ X, η ∈ Y }.

Transmission through a thin weakly conducting layer, contact problems.

[Figure: domain split into two subdomains Ω_1 and Ω_2 separated by the interface Σ.]

    X = H¹(Ω_1), A : H¹(Ω_1) → Z = L²(Σ) trace operator,
    Y = H¹(Ω_2), B : H¹(Ω_2) → Z = L²(Σ) trace operator.

    min{ (1/2) ∫_{Ω_1} |∇u_1|² + (1/2) ∫_{Ω_2} |∇u_2|² + (1/2λ) ∫_Σ [u]² − ∫_Ω f u : u_1 ∈ X, u_2 ∈ Y },
with u = u_1 on Ω_1, u = u_2 on Ω_2, and [u] = A(u_1) − B(u_2) the jump of u through Σ.

Alternating minimization ⟺ decomposition of domain.
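
To make the link with domain decomposition explicit, the following sketch writes down the optimality system of one alternating step, under the added assumption of homogeneous Dirichlet data on the outer boundary and with the anchoring term omitted for readability:

```latex
% Sketch of the u_1-subproblem (assumptions: u = 0 on the outer boundary, no
% anchoring term). With u_2 = u_2^k frozen, the x-step minimizes over u_1
%   (1/2)\int_{\Omega_1} |\nabla u_1|^2
%   + (1/(2\lambda)) \int_{\Sigma} (u_1 - u_2^k)^2 - \int_{\Omega_1} f\,u_1 ,
% whose Euler-Lagrange system is
\[
  -\Delta u_1 = f \ \text{in } \Omega_1, \qquad
  \frac{\partial u_1}{\partial n} + \frac{1}{\lambda}\bigl(u_1 - u_2^k\bigr) = 0
  \ \text{on } \Sigma, \qquad
  u_1 = 0 \ \text{on } \partial\Omega_1 \setminus \Sigma .
\]
% This is a Robin-type transmission condition on the interface; the y-step is the
% symmetric problem on \Omega_2. Each iteration therefore only requires solving a
% boundary value problem on a single subdomain: alternating minimization realizes
% a decomposition of domain.
```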

5 Alternating proximal minimization: the analytic and subanalytic cases

f : R^n → R ∪ {+∞}, g : R^m → R ∪ {+∞} lower semicontinuous; A : R^n → R^p, B : R^m → R^p linear (continuous) operators.

Equilibria: critical points of the bivariate function L_λ : R^n × R^m → R ∪ {+∞},
    L_λ(ξ, η) = f(ξ) + g(η) + (1/2λ) ||A(ξ) − B(η)||².

Algorithm: alternating minimization with anchoring (inertia) of L_λ:
    x_{k+1} ∈ argmin{ f(ξ) + (1/2λ) ||Aξ − By_k||² + (θ_k/2) ||ξ − x_k||² : ξ ∈ R^n }, then
    y_{k+1} ∈ argmin{ g(η) + (1/2λ) ||Bη − Ax_{k+1}||² + (µ_k/2) ||η − y_k||² : η ∈ R^m },
where 0 < θ_− ≤ θ_k ≤ θ_+ < +∞ and 0 < µ_− ≤ µ_k ≤ µ_+ < +∞.
Assume that inf_{R^n} f > −∞ and inf_{R^m} g > −∞, and that f and g are respectively continuous on their domains dom f and dom g.

Theorem (Attouch-Bolte, 2007). Assume that the bivariate function L_λ : R^n × R^m → R ∪ {+∞} satisfies the Łojasiewicz property. Let (x_k, y_k)_{k∈N} be a bounded sequence generated by the alternating proximal algorithm with anchoring. Then
    Σ_{k=0}^{+∞} ( ||x_{k+1} − x_k|| + ||y_{k+1} − y_k|| ) < +∞,
and the whole sequence (x_k, y_k)_{k∈N} converges to some critical point of L_λ.
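
A small nonconvex instance covered by this theorem (an illustrative choice, not from the slides): take f as the indicator of the unit sphere, which is lower semicontinuous, continuous on its closed domain and semialgebraic, g a quadratic, and A = B = Id, so that L_λ is semialgebraic and hence satisfies the Łojasiewicz property:

```python
import numpy as np

# f = indicator of the unit sphere S (nonconvex, semialgebraic), g(eta) = 0.5*||eta - b||^2,
# A = B = Id, so L_lambda(xi, eta) = delta_S(xi) + g(eta) + (1/(2*lam))*||xi - eta||^2.
b = np.array([2.0, 1.0, 0.0])
lam, theta_k, mu_k = 0.5, 0.2, 0.2      # illustrative parameter values

x = np.array([0.0, 0.0, 1.0])           # start on the sphere
y = np.zeros(3)
for k in range(100):
    # x-step: argmin over ||xi|| = 1 of (1/(2*lam))||xi - y||^2 + (theta_k/2)||xi - x||^2,
    # i.e. projection of the weighted barycenter onto the sphere
    p = (y / lam + theta_k * x) / (1.0 / lam + theta_k)
    x = p / np.linalg.norm(p) if np.linalg.norm(p) > 0 else x
    # y-step: smooth quadratic, closed form
    y = (b + x / lam + mu_k * y) / (1.0 + 1.0 / lam + mu_k)

# ||x_{k+1} - x_k|| + ||y_{k+1} - y_k|| is summable and (x, y) converges to a critical
# point of L_lambda; here x tends to the unit vector in the direction of b.
print(x, y)
```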
