Convergence of Fixed-Point Iterations
1 Convergence of Fixed-Point Iterations
Instructor: Wotao Yin (UCLA Math), July
2 Why study fixed-point iterations?
- They abstract many existing algorithms in optimization, numerical linear algebra, and differential equations.
- They often require only minimal conditions.
- They simplify complicated convergence proofs.
4 Notation
- space: a Hilbert space $\mathcal{H}$ equipped with an inner product $\langle\cdot,\cdot\rangle$ and norm $\|\cdot\|$; it is fine to think in $\mathbb{R}^2$ (though not always)
- an operator $T : \mathcal{H} \to \mathcal{H}$ (or $T : C \to C$, where $C$ is a closed subset of $\mathcal{H}$)
- our focus: when $\mathrm{Fix}\,T := \{x \in \mathcal{H} : x = T(x)\}$ is nonempty, the convergence of $x^{k+1} \gets T(x^k)$
- simplification: $T(x)$ is often written as $Tx$
5 Examples
unconstrained $C^1$ minimization: minimize $f(x)$
- $x^\ast$ is a stationary point if $\nabla f(x^\ast) = 0$
- gradient-descent operator: for $\gamma > 0$, $T := I - \gamma \nabla f$
- the gradient-descent iteration: $x^{k+1} \gets Tx^k$
- lemma: $x^\ast$ is a stationary point if, and only if, $x^\ast \in \mathrm{Fix}\,T$
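A minimal numerical sketch of this lemma (ours, not from the slides): run the fixed-point iteration for the gradient-descent operator on the illustrative least-squares objective $f(x) = \tfrac12\|Ax-b\|^2$ and check that the limit has both a vanishing fixed-point residual and a vanishing gradient. The data and the step size $\gamma = 1/L$ are arbitrary choices.

```python
# A minimal sketch (not from the slides): gradient descent as the
# fixed-point iteration x^{k+1} = T x^k with T = I - gamma * grad f,
# on the illustrative objective f(x) = 0.5 * ||A x - b||^2.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad_f(x):
    return A.T @ (A @ x - b)

L = np.linalg.norm(A.T @ A, 2)        # Lipschitz constant of grad f
gamma = 1.0 / L                       # an arbitrary valid step size
T = lambda x: x - gamma * grad_f(x)   # gradient-descent operator

x = np.zeros(5)
for _ in range(500):
    x = T(x)

# per the lemma, a (near-)fixed point of T is a (near-)stationary point
print("||x - Tx||    =", np.linalg.norm(x - T(x)))
print("||grad f(x)|| =", np.linalg.norm(grad_f(x)))
```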
6 Examples
constrained $C^1$ minimization: minimize $f(x)$ subject to $x \in C$
- assume: $f$ is proper, closed, and convex; $C$ is nonempty, closed, and convex
- projected-gradient operator: for $\gamma > 0$, $T := \mathrm{proj}_C \circ (I - \gamma \nabla f)$
- $x^{k+1} \gets Tx^k$ is the projected-gradient iteration $x^{k+1} \gets \mathrm{proj}_C\big(x^k - \gamma \nabla f(x^k)\big)$
- $x^\ast$ is optimal if $\langle \nabla f(x^\ast),\, x - x^\ast \rangle \ge 0$ for all $x \in C$
- lemma: $x^\ast$ is optimal if, and only if, $x^\ast \in \mathrm{Fix}\,T$
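The same sketch adapted to the constrained case (again ours, not from the slides), with $C$ taken to be the nonnegative orthant so that $\mathrm{proj}_C$ is a componentwise clip:

```python
# Hedged sketch: projected gradient for min f(x) s.t. x in C, with
# C = nonnegative orthant (proj_C is a componentwise clip) and the same
# illustrative least-squares f as before.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
proj_C = lambda x: np.maximum(x, 0.0)        # projection onto C = R^5_+

gamma = 1.0 / np.linalg.norm(A.T @ A, 2)
T = lambda x: proj_C(x - gamma * grad_f(x))  # projected-gradient operator

x = np.zeros(5)
for _ in range(1000):
    x = T(x)

# on the orthant, <grad f(x*), z - x*> >= 0 for all z in C means:
# grad f(x*)_i >= 0 where x*_i = 0, and grad f(x*)_i = 0 where x*_i > 0
print("x =", x)
print("fixed-point residual:", np.linalg.norm(x - T(x)))
```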
7 Lipschitz operator
- definition: an operator $T$ is $L$-Lipschitz, $L \in [0, \infty)$, if $\|Tx - Ty\| \le L\,\|x - y\|$ for all $x, y \in \mathcal{H}$
- definition: an operator $T$ is $L$-quasi-Lipschitz, $L \in [0, \infty)$, if for any $x^\ast \in \mathrm{Fix}\,T$ (assumed to exist), $\|Tx - x^\ast\| \le L\,\|x - x^\ast\|$ for all $x \in \mathcal{H}$
8 Contractive operator
- definition: $T$ is contractive if it is $L$-Lipschitz with $L \in [0, 1)$
- definition: $T$ is quasi-contractive if it is $L$-quasi-Lipschitz with $L \in [0, 1)$
9 Banach fixed-point theorem
Theorem: If $T$ is contractive, then
- $T$ admits a unique fixed point $x^\ast$ (existence and uniqueness)
- $x^k \to x^\ast$ (convergence)
- $\|x^k - x^\ast\| \le L^k\,\|x^0 - x^\ast\|$ (speed)
The theorem holds in a Banach space. It underlies the Picard iteration and the classical proof of the Picard-Lindelöf theorem for ODEs.
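A toy illustration of all three conclusions (our example): $T(x) = \tfrac12\cos x$ on $\mathbb{R}$ is $\tfrac12$-Lipschitz since $|T'(x)| = \tfrac12|\sin x| \le \tfrac12$, so the iterates converge geometrically to the unique fixed point.

```python
# Toy illustration (ours) of the three conclusions: T(x) = cos(x)/2 is
# (1/2)-Lipschitz on R, so Banach's theorem gives a unique fixed point,
# convergence, and the geometric bound ||x^k - x*|| <= L^k ||x^0 - x*||.
import numpy as np

T = lambda x: 0.5 * np.cos(x)
L = 0.5

xstar = 1.0                     # compute the fixed point to high accuracy
for _ in range(200):
    xstar = T(xstar)

x0 = 3.0
x = x0
for k in range(10):
    print(f"k={k}: |x^k - x*| = {abs(x - xstar):.2e}   "
          f"bound L^k |x^0 - x*| = {L**k * abs(x0 - xstar):.2e}")
    x = T(x)
```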
10 Examples
minimize a Lipschitz-differentiable, strongly convex function: minimize $f(x)$
- definition: a convex $f$ is $L$-Lipschitz-differentiable if $\|\nabla f(x) - \nabla f(y)\|^2 \le L\,\langle x - y,\ \nabla f(x) - \nabla f(y)\rangle$ for all $x, y \in \mathrm{dom}\,f$
- definition: a convex $f$ is $\mu$-strongly convex if $\langle \nabla f(x) - \nabla f(y),\ x - y\rangle \ge \mu\,\|x - y\|^2$ for all $x, y \in \mathrm{dom}\,f$
- lemma: the gradient-descent operator $T := I - \gamma \nabla f$ is $C$-contractive for all $\gamma$ in a certain interval
- exercise: find the interval of $\gamma$ and the formula for $C$ in terms of $\gamma$, $L$, $\mu$
- the lemma also holds for a projected-gradient operator if $C$ is closed and convex and $C \subseteq \mathrm{dom}\,f$
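The exercise is left open here; as a numerical aid (a sketch under a quadratic assumption, not the general solution), for $f(x) = \tfrac12 x^\top A x$ with $\mathrm{eig}(A) \subset [\mu, L]$ the constant is explicit: $T = I - \gamma A$ has exact Lipschitz constant $\max(|1-\gamma\mu|,\ |1-\gamma L|)$, which is below $1$ exactly for $\gamma \in (0, 2/L)$ and is minimized at $\gamma = 2/(\mu+L)$.

```python
# Numerical aid for the exercise (a sketch under a quadratic assumption):
# for f(x) = 0.5 x^T A x with eig(A) in [mu, L], T = I - gamma*A has exact
# Lipschitz constant max(|1 - gamma*mu|, |1 - gamma*L|), which is < 1
# exactly when gamma lies in (0, 2/L), minimized at gamma = 2/(mu + L).
import numpy as np

mu, L = 0.5, 4.0
A = np.diag(np.linspace(mu, L, 6))   # eigenvalues span [mu, L]

for gamma in [0.1, 2 / (mu + L), 0.49, 0.5, 0.6]:
    C = np.linalg.norm(np.eye(6) - gamma * A, 2)  # spectral norm of T
    print(f"gamma={gamma:.3f}  C={C:.3f}  contractive={C < 1}"
          f"  (note 2/L = {2 / L:.3f})")
```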
11 Nonexpansive operator
- definition: an operator is nonexpansive if it is 1-Lipschitz, i.e., $\|Tx - Ty\| \le \|x - y\|$ for all $x, y \in \mathcal{H}$
properties:
- $T$ may not have a fixed point $x^\ast$
- if $x^\ast$ exists, the iteration $x^{k+1} = Tx^k$ is bounded, yet it may still fail to converge
examples: rotation, alternating reflection
(figure: rotation by angle $\theta$, showing $x$, $Tx$, $T^2x$, $T^3x$)
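A sketch of the rotation example (our instance): a planar rotation about the origin is an isometry, hence nonexpansive, with unique fixed point $0$, yet the iterates merely circle at radius $\|x^0\|$ and never converge.

```python
# Sketch of the rotation example: a rotation of R^2 about the origin is an
# isometry (hence nonexpansive) with unique fixed point 0, yet the iterates
# x^{k+1} = T x^k circle forever at radius ||x^0|| and never converge.
import numpy as np

theta = 1.0
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
T = lambda x: Q @ x

x = np.array([1.0, 0.0])
for k in range(1, 6):
    x = T(x)
    print(f"k={k}: ||x^k|| = {np.linalg.norm(x):.6f}  (bounded, not converging)")
```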
12 Between L = 1 and L < 1
- $L < 1$: linear (or geometric) convergence
- $L = 1$: bounded iterates, which may fail to converge
A vast set of algorithms (often with sublinear convergence) cannot be characterized by $L$ alone:
- alternating projection (von Neumann)
- gradient descent without strong convexity
- proximal-point algorithm without strong convexity
- operator-splitting algorithms
13 Averaged operator
- fixed-point residual operator: $R := I - T$; note $Rx = 0 \iff x = Tx$
- averaged operator: for some $\eta > 0$, $\|Tx - Ty\|^2 \le \|x - y\|^2 - \eta\,\|Rx - Ry\|^2$ for all $x, y \in \mathcal{H}$
- quasi-averaged operator: for some $\eta > 0$ and any $x^\ast \in \mathrm{Fix}\,T$, $\|Tx - x^\ast\|^2 \le \|x - x^\ast\|^2 - \eta\,\|Rx\|^2$ for all $x \in \mathcal{H}$
- interpretation: each step improves by an amount tied to the fixed-point violation
- speed: may become slower as $x^k$ gets closer to the minimizer
14 convention: use $\alpha$ instead of $\eta$, via the correspondence $\eta > 0 \iff \alpha \in (0, 1)$ with $\eta := \frac{1-\alpha}{\alpha}$
- $\alpha$-averaged operator: $\|Tx - Ty\|^2 \le \|x - y\|^2 - \frac{1-\alpha}{\alpha}\|Rx - Ry\|^2$ for all $x, y \in \mathcal{H}$
special cases:
- $\alpha = \frac12$: $T$ is called firmly nonexpansive
- $\alpha = 1$ (violating $\alpha \in (0, 1)$): $T$ is called nonexpansive
15 Why called averaged?
Lemma: $T$ is $\alpha$-averaged if, and only if, there exists a nonexpansive map $T'$ such that $T = (1-\alpha)I + \alpha T'$; equivalently, $T' := \big(1 - \frac{1}{\alpha}\big)I + \frac{1}{\alpha}T$ is nonexpansive.
Proof. From $T' = \big(1 - \frac{1}{\alpha}\big)I + \frac{1}{\alpha}T = I - \frac{1}{\alpha}R$, basic algebraic manipulation gives, for any $x$ and $y$,
$$\alpha\big(\|x - y\|^2 - \|T'x - T'y\|^2\big) = \|x - y\|^2 - \|Tx - Ty\|^2 - \tfrac{1-\alpha}{\alpha}\|Rx - Ry\|^2.$$
Therefore, $T'$ is nonexpansive $\iff$ $T$ is $\alpha$-averaged.
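A sketch of the lemma in action (our toy instance): averaging the nonexpansive rotation $N$ from the earlier example with the identity produces a $\tfrac12$-averaged (firmly nonexpansive) operator, and the iteration now converges to the fixed point $0$.

```python
# Sketch (our toy instance): averaging fixes the rotation example. With the
# nonexpansive rotation N and alpha = 1/2, T = (1-alpha)I + alpha*N is
# firmly nonexpansive, and the iteration now converges to the fixed point 0.
import numpy as np

theta, alpha = 1.0, 0.5
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
N = lambda x: Q @ x
T = lambda x: (1 - alpha) * x + alpha * N(x)   # alpha-averaged operator

x = np.array([1.0, 0.0])
for step in range(10, 70, 10):
    for _ in range(10):
        x = T(x)
    print(f"after {step} steps: ||x^k|| = {np.linalg.norm(x):.2e}")
```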
16 Properties
assume: $T$ is $\alpha$-averaged and has a fixed point $x^\ast$; iteration: $x^{k+1} \gets Tx^k$
claims about the iteration, step by step:
(a) $\|x^{k+1} - x^\ast\|^2 \le \|x^k - x^\ast\|^2 - \frac{1-\alpha}{\alpha}\|Rx^k\|^2$ (using $Rx^\ast = 0$)
(b) by telescoping (a), $\|x^{k+1} - x^\ast\|^2 \le \|x^0 - x^\ast\|^2 - \frac{1-\alpha}{\alpha}\sum_{j=0}^{k}\|Rx^j\|^2$
(c) hence $\{\|Rx^k\|^2\}$ is summable and $Rx^k \to 0$
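Before continuing with the remaining claims, a quick numerical check of (a)-(c) on the averaged rotation from the previous sketch (ours): the distance to $x^\ast = 0$ never increases, and $\sum_j \|Rx^j\|^2$ stays within the telescoped bound $\frac{\alpha}{1-\alpha}\|x^0 - x^\ast\|^2$, which this linear example essentially attains.

```python
# Quick numerical check (ours) of claims (a)-(c) on the averaged rotation
# above (alpha = 1/2, fixed point x* = 0): distances are nonincreasing and
# sum_j ||R x^j||^2 obeys the bound (alpha/(1-alpha)) * ||x^0 - x*||^2.
import numpy as np

theta, alpha = 1.0, 0.5
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
T = lambda x: (1 - alpha) * x + alpha * (Q @ x)
R = lambda x: x - T(x)                   # fixed-point residual operator

x = np.array([1.0, 0.0])
res_sum, prev_dist = 0.0, np.linalg.norm(x)
for _ in range(2000):
    res_sum += np.linalg.norm(R(x)) ** 2
    x = T(x)
    dist = np.linalg.norm(x)
    assert dist <= prev_dist + 1e-12     # distances are nonincreasing
    prev_dist = dist

print("sum ||Rx^j||^2               =", res_sum)
print("bound a/(1-a)*||x^0 - x*||^2 =", alpha / (1 - alpha) * 1.0)
```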
17 claims (cont.):
(d) by (a), $\{\|x^k - x^\ast\|^2\}$ is monotonically nonincreasing (strictly decreasing until $x^k \in \mathrm{Fix}\,T$)
(e) by (d), $\lim_k \|x^k - x^\ast\|^2$ exists (but is not necessarily zero)
(f) by (d), $\{x^k\}$ is bounded and thus has a weak cluster point $\bar{x}$ (bounded sets in $\mathcal{H}$ are weakly sequentially compact)
Next: we will show that $\bar{x} \in \mathrm{Fix}\,T$ and then $x^k \rightharpoonup \bar{x}$.
18 claims (cont.):
(h) demiclosedness principle: let $T$ be nonexpansive and $R := I - T$. If $x^j \rightharpoonup \bar{x}$ and $\lim_j \|Rx^j\| = 0$, then $R\bar{x} = 0$.
Proof. The goal is to expand $\|R\bar{x}\|^2$ into terms that converge as $j \to \infty$:
$$\|R\bar{x}\|^2 = \|Rx^j\|^2 + 2\langle Rx^j,\ Tx^j - T\bar{x}\rangle + \|Tx^j - T\bar{x}\|^2 - \|x^j - \bar{x}\|^2 - 2\langle R\bar{x},\ x^j - \bar{x}\rangle$$
$$\le \|Rx^j\|^2 + 2\langle Rx^j,\ Tx^j - T\bar{x}\rangle - 2\langle R\bar{x},\ x^j - \bar{x}\rangle \quad \text{(by nonexpansiveness)}.$$
Each term on the right-hand side tends to $0$ as $j \to \infty$: the first two because $Rx^j \to 0$ strongly and $\{Tx^j - T\bar{x}\}$ is bounded, the last because $x^j \rightharpoonup \bar{x}$. Therefore, $\|R\bar{x}\|^2 = 0$.
(i) by applying (h) to any weakly converging subsequence, each weak cluster point $\bar{x}$ of $\{x^k\}$ is a fixed point.
19 claims (cont.):
(j) by (e) and (i), $\bar{x}$ is the unique weak cluster point.
Proof. Let $\bar{y}$ also be a cluster point; then $\bar{y} \in \mathrm{Fix}\,T$, just like $\bar{x}$. By (e), both $\lim_k \|x^k - \bar{x}\|^2$ and $\lim_k \|x^k - \bar{y}\|^2$ exist. Algebraically,
$$2\langle x^k,\ \bar{x} - \bar{y}\rangle = \|x^k - \bar{y}\|^2 - \|x^k - \bar{x}\|^2 + \|\bar{x}\|^2 - \|\bar{y}\|^2,$$
whose right-hand side converges to a constant, say $c$. Passing to the limits along the two subsequences, one converging weakly to $\bar{x}$ and the other to $\bar{y}$, gives $2\langle \bar{x},\ \bar{x} - \bar{y}\rangle = 2\langle \bar{y},\ \bar{x} - \bar{y}\rangle = c$. Hence, $\|\bar{x} - \bar{y}\|^2 = 0$.
20 Theorem (Krasnosel'skiĭ)
Let $T$ be an averaged operator with a fixed point. Then the iteration $x^{k+1} \gets Tx^k$ converges weakly to a fixed point of $T$.
21 Mann's version
Let $T$ be a nonexpansive operator with a fixed point. Then the iteration
$$x^{k+1} \gets (1 - \lambda_k)x^k + \lambda_k Tx^k$$
(known as the Krasnosel'skiĭ-Mann, or KM, iteration) converges weakly to a fixed point of $T$ as long as $\lambda_k > 0$ and $\sum_k \lambda_k(1 - \lambda_k) = \infty$.
The $\lambda_k$ condition is ensured if $\lambda_k \in [\epsilon,\, 1 - \epsilon]$ (bounded away from 0 and 1).
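A sketch of the KM iteration on the merely nonexpansive rotation $N$ from before (our instance): with relaxation parameters $\lambda_k$ kept inside $[0.1, 0.9]$, so that $\sum_k \lambda_k(1-\lambda_k)$ diverges, the iterates converge to the fixed point $0$, whereas $x^{k+1} = Nx^k$ alone would circle forever.

```python
# Sketch of the KM iteration on the merely nonexpansive rotation N: with
# relaxation lam_k kept in [0.1, 0.9] (so sum lam_k(1-lam_k) diverges),
# the iterates converge to the fixed point 0, unlike x^{k+1} = N x^k alone.
import numpy as np

theta = 1.0
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
N = lambda x: Q @ x

x = np.array([1.0, 0.0])
for k in range(200):
    lam = 0.5 + 0.4 * np.sin(k)          # stays inside [0.1, 0.9]
    x = (1 - lam) * x + lam * N(x)       # KM step

print("||x^200|| =", np.linalg.norm(x))  # near the fixed point 0
```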
22 Remarks
- the theorem can be relaxed to quasi-averagedness
- summable errors can be added to the iteration
- in finite dimensions, the demiclosedness principle is not needed
- this fundamental result is largely ignored, yet often re-proved, in $\mathbb{R}^n$
- Browder-Göhde-Kirk fixed-point theorem: if $T$ has no fixed point and $\lambda_k$ is bounded away from 0 and 1, the sequence $\{x^k\}$ is unbounded
- speed: $\|Rx^k\|^2 = o(1/k)$, but there is no rate for $\|x^k - x^\ast\|$
- many more applications than Banach's fixed-point theorem
23 Special cases
proximal-point algorithm
- problem: minimize $f(x)$
- proximal operator: for $\lambda > 0$, $T := \mathrm{prox}_{\lambda f}$
- since $T$ is firmly nonexpansive, $x^{k+1} \gets \mathrm{prox}_{\lambda f}(x^k)$ converges weakly to a minimizer of $f$, if one exists
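A 1-D sketch (our example, not from the slides): for $f(x) = |x|$, the proximal operator $\mathrm{prox}_{\lambda f}(v) = \arg\min_x |x| + \tfrac{1}{2\lambda}(x - v)^2$ is soft-thresholding, and the proximal-point iterates reach the minimizer $x^\ast = 0$ in finitely many steps.

```python
# 1-D sketch (our example): the proximal-point algorithm on f(x) = |x|.
# Its prox is soft-thresholding, prox_{lam f}(v) = sign(v)*max(|v|-lam, 0),
# and the iterates reach the minimizer x* = 0 in finitely many steps.
import numpy as np

lam = 0.3
prox = lambda v: np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

x = 2.0
for k in range(1, 10):
    x = prox(x)
    print(f"k={k}: x = {x:+.4f}")
```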
24 Special cases
gradient descent
- gradient-descent operator: $T := I - \gamma \nabla f$; iteration: $x^{k+1} \gets Tx^k = x^k - \gamma \nabla f(x^k)$
- Baillon-Haddad theorem: if $f$ is convex and $\nabla f$ is $L$-Lipschitz, then $\|\nabla f(x) - \nabla f(y)\|^2 \le L\,\langle x - y,\ \nabla f(x) - \nabla f(y)\rangle$
- if $f$ has a minimizer $x^\ast$, then (taking $y = x^\ast$ above and scaling) $\frac{2}{L\gamma}\|\gamma \nabla f(x^k)\|^2 \le 2\langle x^k - x^\ast,\ \gamma \nabla f(x^k)\rangle$
25 Directly expand $\|x^{k+1} - x^\ast\|^2$:
$$\|x^{k+1} - x^\ast\|^2 = \|x^k - \gamma \nabla f(x^k) - x^\ast\|^2 = \|x^k - x^\ast\|^2 - 2\langle x^k - x^\ast,\ \gamma \nabla f(x^k)\rangle + \|\gamma \nabla f(x^k)\|^2$$
$$\le \|x^k - x^\ast\|^2 - \Big(\tfrac{2}{L\gamma} - 1\Big)\|\gamma \nabla f(x^k)\|^2.$$
Therefore, $T$ is quasi-averaged if $\gamma \in \big(0, \frac{2}{L}\big)$. In fact, it is easy to show that $T$ is averaged, so the convergence result applies to gradient descent.
26 Composition of operators
- if $T_1, \dots, T_m : \mathcal{H} \to \mathcal{H}$ are nonexpansive, then $T_1 \cdots T_m$ is nonexpansive
- if $T_1, \dots, T_m : \mathcal{H} \to \mathcal{H}$ are averaged, then $T_1 \cdots T_m$ is averaged
- the averagedness constants get worse: if $T_i$ is $\alpha_i$-averaged (allowing $\alpha_i = 1$), then $T = T_1 \cdots T_m$ is $\alpha$-averaged with $\alpha = \frac{m}{m - 1 + 1/\max_i \alpha_i}$
- in addition, if any $T_i$ is contractive, then $T_1 \cdots T_m$ is contractive
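A spot-check of the composition rule (ours): with $m = 2$ and $\alpha_1 = \alpha_2 = \tfrac12$, the formula above gives $\alpha = \tfrac23$ for the composite, and we test the defining inequality at random pairs $(x, y)$ for two firmly nonexpansive (averaged) rotations.

```python
# Spot-check (ours) of the composition rule: two (1/2)-averaged rotations
# compose into an alpha-averaged map with alpha = m/(m-1+1/max_i alpha_i)
# = 2/3 for m = 2; we test the defining inequality at random pairs (x, y).
import numpy as np

def avg_rotation(theta, alpha):
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return lambda x: (1 - alpha) * x + alpha * (Q @ x)

T1, T2 = avg_rotation(0.4, 0.5), avg_rotation(-0.9, 0.5)
T = lambda x: T1(T2(x))
alpha = 2 / (2 - 1 + 1 / 0.5)            # composite constant, = 2/3
eta = (1 - alpha) / alpha

rng = np.random.default_rng(2)
ok = True
for _ in range(10_000):
    x, y = rng.standard_normal(2), rng.standard_normal(2)
    lhs = np.linalg.norm(T(x) - T(y)) ** 2
    rhs = (np.linalg.norm(x - y) ** 2
           - eta * np.linalg.norm((x - T(x)) - (y - T(y))) ** 2)
    ok = ok and (lhs <= rhs + 1e-10)
print("averagedness inequality held at all samples:", ok)
```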
27 Special cases
projected-gradient method
- convex problem: minimize $f(x)$ subject to $x \in C$
- assume sufficient intersection between $\mathrm{dom}\,f$ and $C$
- define: $T := \mathrm{proj}_C \circ (I - \lambda \nabla f)$
- assume $\nabla f$ is $L$-Lipschitz, and let $\lambda \in (0, 2/L)$
- since both $\mathrm{proj}_C$ and $(I - \lambda \nabla f)$ are averaged, $T$ is averaged
- therefore, the sequence $x^{k+1} \gets Tx^k = \mathrm{proj}_C\big(x^k - \lambda \nabla f(x^k)\big)$ converges weakly to a minimizer, if one exists
28 Special cases
prox-gradient method
- convex problem: minimize $f(x) + h(x)$ over $x$
- assume sufficient intersection between $\mathrm{dom}\,f$ and $\mathrm{dom}\,h$
- define: $T := \mathrm{prox}_{\lambda h} \circ (I - \lambda \nabla f)$
- assume $\nabla f$ is $L$-Lipschitz, and let $\lambda \in (0, 2/L)$
- since both $\mathrm{prox}_{\lambda h}$ and $(I - \lambda \nabla f)$ are averaged, $T$ is averaged
- therefore, the sequence $x^{k+1} \gets Tx^k = \mathrm{prox}_{\lambda h}\big(x^k - \lambda \nabla f(x^k)\big)$ converges weakly to a minimizer, if one exists
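A sketch of the prox-gradient iteration on a tiny lasso instance (sizes and data are arbitrary choices of ours): minimize $\tfrac12\|Ax - b\|^2 + \mu\|x\|_1$, where $\mathrm{prox}_{\lambda h}$ with $h = \mu\|\cdot\|_1$ is soft-thresholding at level $\lambda\mu$.

```python
# Sketch of the prox-gradient iteration on a tiny lasso instance (sizes and
# data are arbitrary): minimize 0.5||Ax - b||^2 + mu*||x||_1, where
# prox_{lam h} with h = mu*||.||_1 is soft-thresholding at level lam*mu.
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
mu = 0.5

grad_f = lambda x: A.T @ (A @ x - b)
lam = 1.0 / np.linalg.norm(A.T @ A, 2)       # lies in (0, 2/L)
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(10)
for _ in range(2000):
    x = soft(x - lam * grad_f(x), lam * mu)  # T = prox_{lam h}(I - lam grad f)

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + mu * np.abs(x).sum()
print("objective =", obj, "  nonzeros =", int(np.count_nonzero(x)))
```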
29 Special cases
Later in this course, we will see more special cases:
- forward-backward iteration
- Douglas-Rachford and Peaceman-Rachford iterations
- ADMM
- Tseng's forward-backward-forward iteration
- Davis-Yin iteration
- primal-dual iteration
30 Summary
- Fixed-point iteration and analysis are powerful tools.
- Contractive $T$: a fixed point exists and is unique; the iteration converges strongly.
- Nonexpansive $T$: the iterates are bounded, if a fixed point exists.
- Averaged $T$: the iteration converges weakly, if a fixed point exists.
- More power: closedness under composition.