Splitting Techniques in the Face of Huge Problem Sizes: Block-Coordinate and Block-Iterative Approaches
Patrick L. Combettes (joint work with J.-C. Pesquet)
Laboratoire Jacques-Louis Lions, Faculté de Mathématiques, Université Pierre et Marie Curie, Paris, France

Edinburgh, May 7, 2015
Framework

A wide range of problems in applied nonlinear analysis can be reduced to finding a point in a closed convex subset $F$ of a Hilbert space $\mathcal{H}$. The solution set $F$ is often constructed from prior information and the observation of data.

- Mathematical model: find $x \in F \subset \bigcap_{n \in \mathbb{N}} \operatorname{Fix} T_n$, where each $T_n \colon \mathcal{H} \to \mathcal{H}$ is quasi-nonexpansive
- Algorithmic model: $x_{n+1} = T_n x_n$
- Asymptotic analysis: under suitable assumptions, $(x_n)_{n \in \mathbb{N}}$ converges (weakly/strongly/linearly) to a point in $F$

Application areas: variational inequalities, game theory, optimization, statistics, partial differential equations, inverse problems, mechanics, signal and image processing, machine learning, computer vision, transport theory, optics, ...
Basic convergence principle

$T_n$ is quasi-nonexpansive with $F \subset \operatorname{Fix} T_n$, i.e.,
$(\forall x \in \mathcal{H})(\forall y \in \operatorname{Fix} T_n)\quad \|T_n x - y\| \leq \|x - y\|.$
Then:

- Fejér monotonicity: $(\forall n \in \mathbb{N})(\forall y \in F)\quad \|x_{n+1} - y\| \leq \|x_n - y\|$
- Suppose that the set $\mathfrak{W}(x_n)_{n \in \mathbb{N}}$ of weak cluster points of $(x_n)_{n \in \mathbb{N}}$ lies in $F$. Then $x_n \rightharpoonup x \in F$.

Elementary example: the alternating projection method for finding a point in the intersection of two closed convex sets $C_1$ and $C_2$: set $T_{2n} = P_{C_2}$ and $T_{2n+1} = P_{C_1}$.
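The alternating projection scheme above can be sketched numerically. The following is an illustrative example of my own choosing (not from the talk): two closed convex sets in $\mathbb{R}^2$, a line and a half-plane, whose projectors are explicit.

```python
import numpy as np

# Assumed toy sets: C1 = {x : x2 = 0} (a line) and C2 = {x : x1 >= 1}
# (a half-plane); their intersection is nonempty.

def P1(x):
    # projector onto C1: zero out the second coordinate
    return np.array([x[0], 0.0])

def P2(x):
    # projector onto C2: clip the first coordinate from below at 1
    return np.array([max(x[0], 1.0), x[1]])

x = np.array([-3.0, 5.0])
for n in range(20):
    x = P2(x)   # T_{2n}   = P_{C_2}
    x = P1(x)   # T_{2n+1} = P_{C_1}
print(x)        # a point in the intersection C1 ∩ C2
```

Each sweep is Fejér monotone with respect to $C_1 \cap C_2$, and for polyhedral sets like these the iterates reach the intersection quickly.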
Special cases

- Fixed point methods: Krasnosel'skiĭ–Mann, string averaging, extrapolated barycentric methods, Martinet's cyclic firmly nonexpansive iteration, etc.
- Projection methods for convex feasibility problems
- Projection methods for split feasibility problems
- Subgradient projection methods for systems of convex inequalities
- Splitting methods for monotone inclusions: forward-backward, Douglas–Rachford, forward-backward-forward, Spingarn, etc.
- Convex optimization methods: projected gradient method, augmented Lagrangian method, ADMM, various proximal splitting methods, etc.
- Iterative methods for variational inequalities in mechanics, traffic theory, finance, etc.
Very large scale problems I: Block-coordinate approach

The basic iteration $x_{n+1} = T_n x_n$ in the Hilbert space $\mathcal{H}$ may be too involved (computations, memory) to be operational.

We assume that $\mathcal{H}$ can be decomposed into $m$ factors, $\mathcal{H} = \mathcal{H}_1 \oplus \cdots \oplus \mathcal{H}_m$, in which each $T_n$ has an explicit decomposition $T_n \colon x \mapsto (T_{1,n} x, \ldots, T_{m,n} x)$.

The strategy is to update only arbitrarily chosen coordinates of $x_{n+1} = (x_{1,n+1}, \ldots, x_{m,n+1})$, up to some tolerance:
$x_{i,n+1} = x_{i,n} + \varepsilon_{i,n} \big( T_{i,n} x_n + a_{i,n} - x_{i,n} \big),$
where $\varepsilon_{i,n} \in \{0,1\}$ is an activation variable and $a_{i,n}$ is an error term.

Our goal is to extend available fixed point methods to this block-coordinate setting while preserving their convergence properties.
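The update rule above can be sketched in finite dimensions as follows. This is a minimal illustration under assumed ingredients (scalar blocks, Bernoulli activation with probabilities `p`, error terms $a_{i,n}$ set to zero, a toy operator with fixed point 0); it is not code from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def block_iteration(x, T_blocks, p, n_iter=300):
    """Sketch of x_{i,n+1} = x_{i,n} + eps_{i,n} (T_{i,n} x_n - x_{i,n})
    with Bernoulli(p[i]) activation variables and no errors."""
    x = np.array(x, dtype=float)
    m = len(x)
    for n in range(n_iter):
        Tx = np.array([T_blocks[i](x) for i in range(m)])  # full operator at x_n
        eps = rng.random(m) < p                            # activations eps_{i,n}
        x = np.where(eps, Tx, x)                           # update active blocks only
    return x

# Assumed toy operator with Fix T = {0}: T_i(x) = x_i / 2 on each scalar block.
T = [lambda x, i=i: 0.5 * x[i] for i in range(3)]
x_final = block_iteration([4.0, -8.0, 2.0], T, p=np.array([0.5, 0.5, 0.5]))
print(np.linalg.norm(x_final))  # ≈ 0
```

Note that only the active blocks are overwritten; the operator is still evaluated at the full current iterate $x_n$, which is the structure the convergence theory exploits.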
A roadblock

The nice properties of an operator $T \colon \mathcal{H} \to \mathcal{H}$ are destroyed by coordinate sampling. For instance, consider the nonexpansive (1-Lipschitz) operator $T \colon (x_1, x_2) \mapsto (x_2, x_1)$. Then

- $Q_1 \colon (x_1, x_2) \mapsto (x_1, x_1)$
- $Q_2 \colon (x_1, x_2) \mapsto (x_2, x_2)$
- $Q_1 Q_2$ and $Q_2 Q_1$

are no longer nonexpansive, and Fejér monotonicity is destroyed.

... and even for jointly convex functions with a unique minimizer, alternating minimization can fail, etc.

Remedy: introduce stochasticity and renorming.
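The loss of nonexpansiveness is easy to check numerically on the slide's example; the pair of points below is my own choice.

```python
import numpy as np

# T(x1, x2) = (x2, x1) is an isometry, hence nonexpansive, but the
# coordinate-sampled operator Q1: (x1, x2) -> (x1, x1) from the slide is not.
Q1 = lambda v: np.array([v[0], v[0]])

x = np.array([1.0, 0.0])
y = np.array([0.0, 0.0])

d_before = np.linalg.norm(x - y)          # 1
d_after = np.linalg.norm(Q1(x) - Q1(y))   # sqrt(2) > 1
print(d_after > d_before)                 # True: nonexpansiveness is lost
```

A single expanding pair suffices: since $\|Q_1 x - Q_1 y\| = \sqrt{2}\,|x_1 - y_1|$ can exceed $\|x - y\|$, Fejér monotonicity arguments no longer apply to the sampled operator.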
Notation

- $\mathcal{H}$: separable real Hilbert space; $\rightharpoonup$: weak convergence
- $\mathfrak{W}(x_n)_{n \in \mathbb{N}}$: set of weak cluster points of $(x_n)_{n \in \mathbb{N}} \in \mathcal{H}^{\mathbb{N}}$
- $\Gamma_0(\mathcal{H})$: proper lower semicontinuous convex functions from $\mathcal{H}$ to $]-\infty, +\infty]$
- $(\Omega, \mathcal{F}, \mathsf{P})$: underlying probability space
- Given a sequence $(x_n)_{n \in \mathbb{N}}$ of $\mathcal{H}$-valued random variables, $\mathscr{X} = (\mathcal{X}_n)_{n \in \mathbb{N}}$, where $(\forall n \in \mathbb{N})\ \mathcal{X}_n = \sigma(x_0, \ldots, x_n)$
- $\ell_+(\mathscr{X})$: set of sequences $(\xi_n)_{n \in \mathbb{N}}$ of $[0, +\infty[$-valued random variables such that, for every $n \in \mathbb{N}$, $\xi_n$ is $\mathcal{X}_n$-measurable
- $\ell_+^p(\mathscr{X}) = \big\{ (\xi_n)_{n \in \mathbb{N}} \in \ell_+(\mathscr{X}) : \sum_{n \in \mathbb{N}} \xi_n^p < +\infty\ \mathsf{P}\text{-a.s.} \big\}$, $p \in\, ]0, +\infty[$
- $\ell_+^\infty(\mathscr{X}) = \big\{ (\xi_n)_{n \in \mathbb{N}} \in \ell_+(\mathscr{X}) : \sup_{n \in \mathbb{N}} \xi_n < +\infty\ \mathsf{P}\text{-a.s.} \big\}$
Stochastic quasi-Fejér sequences

Let $\varnothing \neq F \subset \mathcal{H}$ and let $\phi \colon [0, +\infty[ \to [0, +\infty[$ satisfy $\phi(t) \to +\infty$ as $t \to +\infty$.

Deterministic definition: a sequence $(x_n)_{n \in \mathbb{N}}$ in $\mathcal{H}$ is Fejér monotone with respect to $F$ if, for every $z \in F$,
$(\forall n \in \mathbb{N})\quad \phi(\|x_{n+1} - z\|) \leq \phi(\|x_n - z\|)$
Stochastic definition 1: a sequence $(x_n)_{n \in \mathbb{N}}$ of $\mathcal{H}$-valued random variables is stochastically Fejér monotone with respect to $F$ if, for every $z \in F$,
$(\forall n \in \mathbb{N})\quad \mathsf{E}\big(\phi(\|x_{n+1} - z\|) \mid \mathcal{X}_n\big) \leq \phi(\|x_n - z\|)$
Stochastic definition 2: a sequence $(x_n)_{n \in \mathbb{N}}$ of $\mathcal{H}$-valued random variables is stochastically quasi-Fejér monotone with respect to $F$ if, for every $z \in F$, there exist $(\chi_n(z))_{n \in \mathbb{N}} \in \ell_+^1(\mathscr{X})$, $(\vartheta_n(z))_{n \in \mathbb{N}} \in \ell_+(\mathscr{X})$, and $(\eta_n(z))_{n \in \mathbb{N}} \in \ell_+^1(\mathscr{X})$ such that
$(\forall n \in \mathbb{N})\quad \mathsf{E}\big(\phi(\|x_{n+1} - z\|) \mid \mathcal{X}_n\big) + \vartheta_n(z) \leq (1 + \chi_n(z))\,\phi(\|x_n - z\|) + \eta_n(z)$

Theorem. Suppose $(x_n)_{n \in \mathbb{N}}$ is stochastically quasi-Fejér monotone. Then:

- $(\forall z \in F)\ \sum_{n \in \mathbb{N}} \vartheta_n(z) < +\infty$ $\mathsf{P}$-a.s.
- $[\mathfrak{W}(x_n)_{n \in \mathbb{N}} \subset F\ \mathsf{P}\text{-a.s.}]$ $\Rightarrow$ $(x_n)_{n \in \mathbb{N}}$ converges weakly $\mathsf{P}$-a.s. to an $F$-valued random variable.
An abstract stochastic iterative scheme

Theorem. Let $\varnothing \neq F \subset \mathcal{H}$. Suppose that $(x_n)_{n \in \mathbb{N}}$, $(t_n)_{n \in \mathbb{N}}$, and $(e_n)_{n \in \mathbb{N}}$ are sequences of $\mathcal{H}$-valued random variables such that:

- $(\forall n \in \mathbb{N})\ x_{n+1} = x_n + \lambda_n (t_n + e_n - x_n)$, with $\lambda_n \in\, ]0, 1]$
- $\sum_{n \in \mathbb{N}} \lambda_n \sqrt{\mathsf{E}(\|e_n\|^2 \mid \mathcal{X}_n)} < +\infty$ $\mathsf{P}$-a.s.
- for every $z \in F$, there exist $(\theta_n(z))_{n \in \mathbb{N}} \in \ell_+(\mathscr{X})$, $(\mu_n(z))_{n \in \mathbb{N}} \in \ell_+(\mathscr{X})$, and $(\nu_n(z))_{n \in \mathbb{N}} \in \ell_+(\mathscr{X})$ such that $(\lambda_n \mu_n(z))_{n \in \mathbb{N}} \in \ell_+^1(\mathscr{X})$, $(\lambda_n \nu_n(z))_{n \in \mathbb{N}} \in \ell_+^{1/2}(\mathscr{X})$, and
  $(\forall n \in \mathbb{N})\quad \mathsf{E}(\|t_n - z\|^2 \mid \mathcal{X}_n) + \theta_n(z) \leq (1 + \mu_n(z))\,\|x_n - z\|^2 + \nu_n(z)$

Then $(\forall z \in F)\ \sum_{n \in \mathbb{N}} \lambda_n \theta_n(z) < +\infty$ $\mathsf{P}$-a.s., and $[\mathfrak{W}(x_n)_{n \in \mathbb{N}} \subset F\ \mathsf{P}\text{-a.s.}]$ $\Rightarrow$ $(x_n)_{n \in \mathbb{N}}$ converges weakly $\mathsf{P}$-a.s. to an $F$-valued random variable.
Single-layer algorithm

- $\mathcal{H} = \mathcal{H}_1 \oplus \cdots \oplus \mathcal{H}_m$, with $(\mathcal{H}_i)_{1 \leq i \leq m}$ separable real Hilbert spaces
- $T_n \colon \mathcal{H} \to \mathcal{H} \colon x \mapsto (T_{i,n} x)_{1 \leq i \leq m}$ quasi-nonexpansive
- $F = \bigcap_{n \in \mathbb{N}} \operatorname{Fix} T_n \neq \varnothing$
- $x_0$ and the errors $(a_n)_{n \in \mathbb{N}}$ are $\mathcal{H}$-valued random variables
- $(\varepsilon_n)_{n \in \mathbb{N}}$: identically distributed $D$-valued random variables, with $D = \{0,1\}^m \smallsetminus \{0\}$

Algorithm:

for $n = 0, 1, \ldots$
  for $i = 1, \ldots, m$
    $x_{i,n+1} = x_{i,n} + \varepsilon_{i,n} \lambda_n \big( T_{i,n}(x_{1,n}, \ldots, x_{m,n}) + a_{i,n} - x_{i,n} \big)$
Theorem. Set $(\forall n \in \mathbb{N})\ \mathcal{X}_n = \sigma(x_0, \ldots, x_n)$ and $\mathcal{E}_n = \sigma(\varepsilon_n)$. Assume that:

- $\inf_{n \in \mathbb{N}} \lambda_n > 0$ and $\sup_{n \in \mathbb{N}} \lambda_n < 1$
- $\sum_{n \in \mathbb{N}} \mathsf{E}(\|a_n\|^2 \mid \mathcal{X}_n) < +\infty$
- $\mathfrak{W}(x_n)_{n \in \mathbb{N}} \subset F$ $\mathsf{P}$-a.s.
- for every $n \in \mathbb{N}$, $\mathcal{E}_n$ and $\mathcal{X}_n$ are independent
- for every $i \in \{1, \ldots, m\}$, $\mathsf{p}_i = \mathsf{P}[\varepsilon_{i,0} = 1] > 0$

Then $(x_n)_{n \in \mathbb{N}}$ converges weakly $\mathsf{P}$-a.s. to an $F$-valued random variable.

Proof. We establish stochastic quasi-Fejér monotonicity with respect to the norm given by $\|x\|^2 = \sum_{i=1}^m \|x_i\|^2 / \mathsf{p}_i$.
Example: Krasnosel'skiĭ–Mann iteration

- $T \colon \mathcal{H} \to \mathcal{H} \colon x \mapsto (T_i x)_{1 \leq i \leq m}$: nonexpansive operator
- $F = \operatorname{Fix} T \neq \varnothing$
- $x_0$ and the errors $(a_n)_{n \in \mathbb{N}}$ are $\mathcal{H}$-valued random variables
- $(\varepsilon_n)_{n \in \mathbb{N}}$: identically distributed $D$-valued random variables, with $D = \{0,1\}^m \smallsetminus \{0\}$

Algorithm:

for $n = 0, 1, \ldots$
  for $i = 1, \ldots, m$
    $x_{i,n+1} = x_{i,n} + \varepsilon_{i,n} \lambda_n \big( T_i(x_{1,n}, \ldots, x_{m,n}) + a_{i,n} - x_{i,n} \big)$
Theorem. Set $(\forall n \in \mathbb{N})\ \mathcal{X}_n = \sigma(x_0, \ldots, x_n)$ and $\mathcal{E}_n = \sigma(\varepsilon_n)$. Assume that:

- $\inf_{n \in \mathbb{N}} \lambda_n > 0$ and $\sup_{n \in \mathbb{N}} \lambda_n < 1$
- $\sum_{n \in \mathbb{N}} \mathsf{E}(\|a_n\|^2 \mid \mathcal{X}_n) < +\infty$
- for every $n \in \mathbb{N}$, $\mathcal{E}_n$ and $\mathcal{X}_n$ are independent
- for every $i \in \{1, \ldots, m\}$, $\mathsf{P}[\varepsilon_{i,0} = 1] > 0$

Then $(x_n)_{n \in \mathbb{N}}$ converges weakly $\mathsf{P}$-a.s. to an $F$-valued random variable.

Proof. Apply the single-layer theorem with $T_n = T$.
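A minimal numerical sketch of this random block Krasnosel'skiĭ–Mann scheme, under assumptions of my own (T a plane rotation, which is nonexpansive with $\operatorname{Fix} T = \{0\}$; $m = 2$ scalar blocks; constant relaxation $\lambda_n = 0.5$; no errors):

```python
import numpy as np

rng = np.random.default_rng(1)
th = 0.9
# nonexpansive T: rotation by angle th (an isometry with Fix T = {0})
T = lambda x: np.array([np.cos(th) * x[0] - np.sin(th) * x[1],
                        np.sin(th) * x[0] + np.cos(th) * x[1]])

x = np.array([5.0, -3.0])
lam = 0.5                        # relaxation, bounded away from 0 and 1
for n in range(2000):
    Tx = T(x)                    # T evaluated at the full current iterate
    eps = rng.random(2) < 0.5    # random activation of the two blocks
    for i in range(2):
        if eps[i]:
            x[i] = x[i] + lam * (Tx[i] - x[i])
print(np.linalg.norm(x))         # ≈ 0
```

The plain iteration $x_{n+1} = Tx_n$ would orbit forever; the averaged, randomly sampled updates drive the iterate to the fixed point, in line with the theorem.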
Example: Block-coordinate primal-dual splitting of coupled composite monotone inclusions

Let $F$ be the set of solutions to the problem:
find $x_1 \in \mathcal{H}_1, \ldots, x_m \in \mathcal{H}_m$ such that
$(\forall i \in \{1, \ldots, m\})\quad 0 \in A_i x_i + \sum_{k=1}^p L_{ki}^*\, B_k \Big( \sum_{j=1}^m L_{kj} x_j \Big),$
where each $A_i \colon \mathcal{H}_i \to 2^{\mathcal{H}_i}$ and $B_k \colon \mathcal{G}_k \to 2^{\mathcal{G}_k}$ is maximally monotone, and each $L_{ki} \colon \mathcal{H}_i \to \mathcal{G}_k$ is linear and bounded.

Let $F^*$ be the set of solutions to the dual problem:
find $v_1 \in \mathcal{G}_1, \ldots, v_p \in \mathcal{G}_p$ such that
$(\forall k \in \{1, \ldots, p\})\quad 0 \in -\sum_{i=1}^m L_{ki}\, A_i^{-1} \Big( -\sum_{l=1}^p L_{li}^* v_l \Big) + B_k^{-1} v_k$
Let $Q_j$ $(1 \leq j \leq m+p)$ be the $j$th component of the projector $P_V$ onto the subspace
$V = \Big\{ (x, y) \in \mathcal{H} \oplus \mathcal{G} \;:\; (\forall k \in \{1, \ldots, p\})\ y_k = \sum_{i=1}^m L_{ki} x_i \Big\}$

Algorithm: let $\gamma \in\, ]0, +\infty[$ and, for every $n$, $\mu_n \in\, ]0, 2[$:

for $n = 0, 1, \ldots$
  for $i = 1, \ldots, m$
    $z_{i,n+1} = z_{i,n} + \varepsilon_{i,n} \big( Q_i(x_{1,n}, \ldots, x_{m,n}, y_{1,n}, \ldots, y_{p,n}) + c_{i,n} - z_{i,n} \big)$
    $x_{i,n+1} = x_{i,n} + \varepsilon_{i,n} \mu_n \big( J_{\gamma A_i}(2 z_{i,n+1} - x_{i,n}) + a_{i,n} - z_{i,n+1} \big)$
  for $k = 1, \ldots, p$
    $w_{k,n+1} = w_{k,n} + \varepsilon_{m+k,n} \big( Q_{m+k}(x_{1,n}, \ldots, x_{m,n}, y_{1,n}, \ldots, y_{p,n}) + d_{k,n} - w_{k,n} \big)$
    $y_{k,n+1} = y_{k,n} + \varepsilon_{m+k,n} \mu_n \big( J_{\gamma B_k}(2 w_{k,n+1} - y_{k,n}) + b_{k,n} - w_{k,n+1} \big)$
Under the same conditions as before:

- $(z_n)_{n \in \mathbb{N}}$ converges weakly $\mathsf{P}$-a.s. to an $F$-valued random variable
- $(\gamma^{-1}(w_n - y_n))_{n \in \mathbb{N}}$ converges weakly $\mathsf{P}$-a.s. to an $F^*$-valued random variable

Proof: This relies on the single-layer theorem and a nonstandard implementation of the Douglas–Rachford algorithm in the product space $\mathcal{H} \oplus \mathcal{G} = \mathcal{H}_1 \oplus \cdots \oplus \mathcal{H}_m \oplus \mathcal{G}_1 \oplus \cdots \oplus \mathcal{G}_p$: solve
$(0, 0) \in \boldsymbol{A} x \times \boldsymbol{B} y + N_V(x, y),$ where

- $\boldsymbol{A} \colon \mathcal{H} \to 2^{\mathcal{H}} \colon x \mapsto \bigtimes_{i=1}^m A_i x_i$
- $\boldsymbol{B} \colon \mathcal{G} \to 2^{\mathcal{G}} \colon y \mapsto \bigtimes_{k=1}^p B_k y_k$
- $V = \big\{ (x, y) \in \mathcal{H} \oplus \mathcal{G} : (\forall k \in \{1, \ldots, p\})\ y_k = \sum_{i=1}^m L_{ki} x_i \big\}$
Example: Nonsmooth, block-coordinate, primal-dual multivariate minimization

Let $F$ be the set of solutions to the problem
$\underset{x_1 \in \mathcal{H}_1, \ldots, x_m \in \mathcal{H}_m}{\text{minimize}}\ \sum_{i=1}^m f_i(x_i) + \sum_{k=1}^p g_k \Big( \sum_{i=1}^m L_{ki} x_i \Big),$
where $f_i \in \Gamma_0(\mathcal{H}_i)$, $g_k \in \Gamma_0(\mathcal{G}_k)$, and each $L_{ki} \colon \mathcal{H}_i \to \mathcal{G}_k$ is linear and bounded.

Let $F^*$ be the set of solutions to the dual problem
$\underset{v_1 \in \mathcal{G}_1, \ldots, v_p \in \mathcal{G}_p}{\text{minimize}}\ \sum_{i=1}^m f_i^* \Big( -\sum_{k=1}^p L_{ki}^* v_k \Big) + \sum_{k=1}^p g_k^*(v_k)$
Algorithm: let $\gamma \in\, ]0, +\infty[$ and, for every $n$, $\mu_n \in\, ]0, 2[$:

for $n = 0, 1, \ldots$
  for $i = 1, \ldots, m$
    $z_{i,n+1} = z_{i,n} + \varepsilon_{i,n} \big( Q_i(x_{1,n}, \ldots, x_{m,n}, y_{1,n}, \ldots, y_{p,n}) + c_{i,n} - z_{i,n} \big)$
    $x_{i,n+1} = x_{i,n} + \varepsilon_{i,n} \mu_n \big( \mathrm{prox}_{\gamma f_i}(2 z_{i,n+1} - x_{i,n}) + a_{i,n} - z_{i,n+1} \big)$
  for $k = 1, \ldots, p$
    $w_{k,n+1} = w_{k,n} + \varepsilon_{m+k,n} \big( Q_{m+k}(x_{1,n}, \ldots, x_{m,n}, y_{1,n}, \ldots, y_{p,n}) + d_{k,n} - w_{k,n} \big)$
    $y_{k,n+1} = y_{k,n} + \varepsilon_{m+k,n} \mu_n \big( \mathrm{prox}_{\gamma g_k}(2 w_{k,n+1} - y_{k,n}) + b_{k,n} - w_{k,n+1} \big)$

Under the same conditions as before:

- $(z_n)_{n \in \mathbb{N}}$ converges weakly $\mathsf{P}$-a.s. to an $F$-valued random variable
- $(\gamma^{-1}(w_n - y_n))_{n \in \mathbb{N}}$ converges weakly $\mathsf{P}$-a.s. to an $F^*$-valued random variable
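The algorithm only requires the proximity operators $\mathrm{prox}_{\gamma f_i}$ and $\mathrm{prox}_{\gamma g_k}$, which are often available in closed form. As an assumed illustration (my example, not from the talk), for $f = \|\cdot\|_1$ the prox is componentwise soft thresholding:

```python
import numpy as np

def prox_l1(x, gamma):
    """prox_{gamma ||.||_1}(x) = argmin_y ||y||_1 + ||y - x||^2 / (2 gamma),
    i.e. componentwise soft thresholding by gamma."""
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

x = np.array([3.0, -0.2, 1.5])
print(prox_l1(x, 1.0))  # entries shrunk toward 0 by 1: (2, 0, 0.5)
```

Entries with magnitude below $\gamma$ are set exactly to zero, which is what makes such splitting methods attractive for sparse recovery.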
Double-layer random block-coordinate algorithms

- $T_n \colon \mathcal{H} \to \mathcal{H} \colon x \mapsto (T_{i,n} x)_{1 \leq i \leq m}$ is $\alpha_n$-averaged, i.e., $\mathrm{Id} + \alpha_n^{-1}(T_n - \mathrm{Id})$ is nonexpansive
- $R_n \colon \mathcal{H} \to \mathcal{H}$ is $\beta_n$-averaged
- $F = \bigcap_{n \in \mathbb{N}} \operatorname{Fix}(T_n R_n) \neq \varnothing$
- $x_0$, $(a_n)_{n \in \mathbb{N}}$, and $(b_n)_{n \in \mathbb{N}}$ are $\mathcal{H}$-valued random variables
- $(\varepsilon_n)_{n \in \mathbb{N}}$: identically distributed $D$-valued random variables, with $D = \{0,1\}^m \smallsetminus \{0\}$

Algorithm:

for $n = 0, 1, \ldots$
  $y_n = R_n x_n + b_n$
  for $i = 1, \ldots, m$
    $x_{i,n+1} = x_{i,n} + \varepsilon_{i,n} \big( T_{i,n} y_n + a_{i,n} - x_{i,n} \big)$
Theorem. Set $(\forall n \in \mathbb{N})\ \mathcal{X}_n = \sigma(x_0, \ldots, x_n)$ and $\mathcal{E}_n = \sigma(\varepsilon_n)$. Assume that:

- $\sum_{n \in \mathbb{N}} \mathsf{E}(\|a_n\|^2 \mid \mathcal{X}_n) < +\infty$ and $\sum_{n \in \mathbb{N}} \mathsf{E}(\|b_n\|^2 \mid \mathcal{X}_n) < +\infty$
- $\mathfrak{W}(x_n)_{n \in \mathbb{N}} \subset F$ $\mathsf{P}$-a.s.
- for every $n \in \mathbb{N}$, $\mathcal{E}_n$ and $\mathcal{X}_n$ are independent
- for every $i \in \{1, \ldots, m\}$, $\mathsf{p}_i = \mathsf{P}[\varepsilon_{i,0} = 1] > 0$

Then $(x_n)_{n \in \mathbb{N}}$ converges weakly $\mathsf{P}$-a.s. to an $F$-valued random variable.

Proof. We establish stochastic quasi-Fejér monotonicity with respect to the norm given by $\|x\|^2 = \sum_{i=1}^m \|x_i\|^2 / \mathsf{p}_i$.
Block-coordinate forward-backward splitting

The forward-backward splitting algorithm is important because:

- It models many problems of interest and tolerates errors: PLC and Wajs, Signal recovery by proximal forward-backward splitting, Multiscale Model. Simul., vol. 4.
- Applied to the dual problem of a strongly monotone/convex composite problem, it provides a primal-dual algorithm: PLC, Dung, and Vũ, Dualization of signal recovery problems, Set-Valued Var. Anal., vol. 18.
- Applied in a renormed product space, it covers/extends various methods (e.g., Chambolle–Pock): Vũ, A splitting algorithm for dual monotone inclusions involving cocoercive operators, Adv. Comput. Math., vol. 38.
- It can be implemented with variable metrics: PLC and Vũ, Variable metric forward-backward splitting with applications to monotone inclusions in duality, Optimization, vol. 63.
- In minimization problems it provides Fejér monotonicity, convergent sequences, and monotone minimizing sequences.
Let $F$ be the set of solutions to the problem:
find $x_1 \in \mathcal{H}_1, \ldots, x_m \in \mathcal{H}_m$ such that
$(\forall i \in \{1, \ldots, m\})\quad 0 \in A_i x_i + B_i(x_1, \ldots, x_m),$
where each $A_i \colon \mathcal{H}_i \to 2^{\mathcal{H}_i}$ is maximally monotone and, for some $\vartheta > 0$,
$(\forall x \in \mathcal{H})(\forall y \in \mathcal{H})\quad \sum_{i=1}^m \langle x_i - y_i \mid B_i x - B_i y \rangle \geq \vartheta \sum_{i=1}^m \|B_i x - B_i y\|^2$

Algorithm:

for $n = 0, 1, \ldots$
  $\varepsilon \leq \gamma_n \leq (2 - \varepsilon)\vartheta$
  for $i = 1, \ldots, m$
    $x_{i,n+1} = x_{i,n} + \varepsilon_{i,n} \Big( J_{\gamma_n A_i} \big( x_{i,n} - \gamma_n (B_i(x_{1,n}, \ldots, x_{m,n}) + c_{i,n}) \big) + a_{i,n} - x_{i,n} \Big)$
Under the same conditions as before, almost sure weak convergence of $(x_n)_{n \in \mathbb{N}}$ to a point in $F$ is achieved.

Proof: In the double-layer theorem, set

- $\boldsymbol{A} \colon \mathcal{H} \to 2^{\mathcal{H}} \colon x \mapsto \bigtimes_{i=1}^m A_i x_i$ and $\boldsymbol{B} \colon \mathcal{H} \to \mathcal{H} \colon x \mapsto (B_i x)_{1 \leq i \leq m}$
- $T_n = J_{\gamma_n \boldsymbol{A}}$, $R_n = \mathrm{Id} - \gamma_n \boldsymbol{B}$, $F = \operatorname{zer}(\boldsymbol{A} + \boldsymbol{B})$
- $b_n = -\gamma_n c_n$, $\alpha_n = 1/2$, $\beta_n = \gamma_n / (2\vartheta)$
Block-coordinate forward-backward splitting: convex minimization

Let $F$ be the set of solutions to the problem
$\underset{x_1 \in \mathcal{H}_1, \ldots, x_m \in \mathcal{H}_m}{\text{minimize}}\ \sum_{i=1}^m f_i(x_i) + \sum_{k=1}^p g_k \Big( \sum_{i=1}^m L_{ki} x_i \Big),$
where $f_i \in \Gamma_0(\mathcal{H}_i)$, each $g_k \colon \mathcal{G}_k \to \mathbb{R}$ is differentiable and convex with $\nabla g_k$ $\tau_k$-Lipschitz, and each $L_{ki} \colon \mathcal{H}_i \to \mathcal{G}_k$ is linear and bounded. Let
$\vartheta = \Big( \sum_{k=1}^p \tau_k \Big\| \sum_{i=1}^m L_{ki}^* L_{ki} \Big\| \Big)^{-1}$

Algorithm:

for $n = 0, 1, \ldots$
  for $i = 1, \ldots, m$
    $r_{i,n} = \varepsilon_{i,n} \Big( x_{i,n} - \gamma_n \Big( \sum_{k=1}^p L_{ki}^* \nabla g_k \Big( \sum_{j=1}^m L_{kj} x_{j,n} \Big) + c_{i,n} \Big) \Big)$
    $x_{i,n+1} = x_{i,n} + \varepsilon_{i,n} \lambda_n \big( \mathrm{prox}_{\gamma_n f_i} r_{i,n} + a_{i,n} - x_{i,n} \big)$
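A numerical sketch of this block-coordinate forward-backward scheme under assumed data of my own: $m$ scalar blocks, $p = 1$, $g(u) = \tfrac12\|u - b\|^2$ (so $\nabla g$ is 1-Lipschitz, $\tau_1 = 1$), $f_i = \mu|\cdot|$, errors and relaxation $\lambda_n = 1$ omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
m, d = 4, 6
L = rng.standard_normal((d, m))     # the single linear operator (p = 1)
b = rng.standard_normal(d)
mu = 0.1                            # f_i = mu * |.| on each scalar block

# tau_1 = 1 here, so a constant step gamma in ]0, 2/||L||^2[ is admissible
gamma = 1.0 / np.linalg.norm(L, 2) ** 2

def prox_abs(t, g):                 # prox of g*mu*|.|: scalar soft thresholding
    return np.sign(t) * max(abs(t) - g * mu, 0.0)

x = np.zeros(m)
for n in range(5000):
    grad = L.T @ (L @ x - b)        # gradient of g(Lx) at the full iterate x_n
    eps = rng.random(m) < 0.5       # random activation of the scalar blocks
    for i in range(m):
        if eps[i]:
            x[i] = prox_abs(x[i] - gamma * grad[i], gamma)

print(x)  # approximate minimizer of 0.5*||Lx - b||^2 + mu*||x||_1
```

Only the gradient coordinates of the activated blocks are actually needed per iteration; computing the full gradient above is just for brevity of the sketch.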
References

- P. L. Combettes and J.-C. Pesquet, Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping, SIAM J. Optim., 2015 (arXiv, April 2014).
- H. H. Bauschke and P. L. Combettes, Convex Analysis and Monotone Operator Theory in Hilbert Spaces, Springer, New York (2nd ed., Fall 2015).
Convergence rate estimates for the gradient differential inclusion Osman Güler November 23 Abstract Let f : H R { } be a proper, lower semi continuous, convex function in a Hilbert space H. The gradient
More informationResearch Article On an Iterative Method for Finding a Zero to the Sum of Two Maximal Monotone Operators
Applied Mathematics, Article ID 414031, 5 pages http://dx.doi.org/10.1155/2014/414031 Research Article On an Iterative Method for Finding a Zero to the Sum of Two Maximal Monotone Operators Hongwei Jiao
More informationOn the split equality common fixed point problem for quasi-nonexpansive multi-valued mappings in Banach spaces
Available online at www.tjnsa.com J. Nonlinear Sci. Appl. 9 (06), 5536 5543 Research Article On the split equality common fixed point problem for quasi-nonexpansive multi-valued mappings in Banach spaces
More information1. Standing assumptions, problem statement, and motivation. We assume throughout this paper that
SIAM J. Optim., to appear ITERATING BREGMAN RETRACTIONS HEINZ H. BAUSCHKE AND PATRICK L. COMBETTES Abstract. The notion of a Bregman retraction of a closed convex set in Euclidean space is introduced.
More informationProximal methods. S. Villa. October 7, 2014
Proximal methods S. Villa October 7, 2014 1 Review of the basics Often machine learning problems require the solution of minimization problems. For instance, the ERM algorithm requires to solve a problem
More informationConvergence Theorems for Bregman Strongly Nonexpansive Mappings in Reflexive Banach Spaces
Filomat 28:7 (2014), 1525 1536 DOI 10.2298/FIL1407525Z Published by Faculty of Sciences and Mathematics, University of Niš, Serbia Available at: http://www.pmf.ni.ac.rs/filomat Convergence Theorems for
More informationGENERAL NONCONVEX SPLIT VARIATIONAL INEQUALITY PROBLEMS. Jong Kyu Kim, Salahuddin, and Won Hee Lim
Korean J. Math. 25 (2017), No. 4, pp. 469 481 https://doi.org/10.11568/kjm.2017.25.4.469 GENERAL NONCONVEX SPLIT VARIATIONAL INEQUALITY PROBLEMS Jong Kyu Kim, Salahuddin, and Won Hee Lim Abstract. In this
More informationConvergence rate analysis for averaged fixed point iterations in the presence of Hölder regularity
Convergence rate analysis for averaged fixed point iterations in the presence of Hölder regularity Jonathan M. Borwein Guoyin Li Matthew K. Tam October 23, 205 Abstract In this paper, we establish sublinear
More informationA New Modified Gradient-Projection Algorithm for Solution of Constrained Convex Minimization Problem in Hilbert Spaces
A New Modified Gradient-Projection Algorithm for Solution of Constrained Convex Minimization Problem in Hilbert Spaces Cyril Dennis Enyi and Mukiawa Edwin Soh Abstract In this paper, we present a new iterative
More informationA General Framework for a Class of Primal-Dual Algorithms for TV Minimization
A General Framework for a Class of Primal-Dual Algorithms for TV Minimization Ernie Esser UCLA 1 Outline A Model Convex Minimization Problem Main Idea Behind the Primal Dual Hybrid Gradient (PDHG) Method
More informationTHE CYCLIC DOUGLAS RACHFORD METHOD FOR INCONSISTENT FEASIBILITY PROBLEMS
THE CYCLIC DOUGLAS RACHFORD METHOD FOR INCONSISTENT FEASIBILITY PROBLEMS JONATHAN M. BORWEIN AND MATTHEW K. TAM Abstract. We analyse the behaviour of the newly introduced cyclic Douglas Rachford algorithm
More informationSignal Processing and Networks Optimization Part VI: Duality
Signal Processing and Networks Optimization Part VI: Duality Pierre Borgnat 1, Jean-Christophe Pesquet 2, Nelly Pustelnik 1 1 ENS Lyon Laboratoire de Physique CNRS UMR 5672 pierre.borgnat@ens-lyon.fr,
More informationOn a result of Pazy concerning the asymptotic behaviour of nonexpansive mappings
On a result of Pazy concerning the asymptotic behaviour of nonexpansive mappings arxiv:1505.04129v1 [math.oc] 15 May 2015 Heinz H. Bauschke, Graeme R. Douglas, and Walaa M. Moursi May 15, 2015 Abstract
More informationHybrid Steepest Descent Method for Variational Inequality Problem over Fixed Point Sets of Certain Quasi-Nonexpansive Mappings
Hybrid Steepest Descent Method for Variational Inequality Problem over Fixed Point Sets of Certain Quasi-Nonexpansive Mappings Isao Yamada Tokyo Institute of Technology VIC2004 @ Wellington, Feb. 13, 2004
More informationARock: an algorithmic framework for asynchronous parallel coordinate updates
ARock: an algorithmic framework for asynchronous parallel coordinate updates Zhimin Peng, Yangyang Xu, Ming Yan, Wotao Yin ( UCLA Math, U.Waterloo DCO) UCLA CAM Report 15-37 ShanghaiTech SSDS 15 June 25,
More informationVictoria Martín-Márquez
A NEW APPROACH FOR THE CONVEX FEASIBILITY PROBLEM VIA MONOTROPIC PROGRAMMING Victoria Martín-Márquez Dep. of Mathematical Analysis University of Seville Spain XIII Encuentro Red de Análisis Funcional y
More informationA GENERALIZATION OF THE REGULARIZATION PROXIMAL POINT METHOD
A GENERALIZATION OF THE REGULARIZATION PROXIMAL POINT METHOD OGANEDITSE A. BOIKANYO AND GHEORGHE MOROŞANU Abstract. This paper deals with the generalized regularization proximal point method which was
More informationM. Marques Alves Marina Geremia. November 30, 2017
Iteration complexity of an inexact Douglas-Rachford method and of a Douglas-Rachford-Tseng s F-B four-operator splitting method for solving monotone inclusions M. Marques Alves Marina Geremia November
More informationThe Split Hierarchical Monotone Variational Inclusions Problems and Fixed Point Problems for Nonexpansive Semigroup
International Mathematical Forum, Vol. 11, 2016, no. 8, 395-408 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/imf.2016.6220 The Split Hierarchical Monotone Variational Inclusions Problems and
More informationCONSTRUCTION OF BEST BREGMAN APPROXIMATIONS IN REFLEXIVE BANACH SPACES
PROCEEINGS OF THE AMERICAN MATHEMATICAL SOCIETY Volume 131, Number 12, Pages 3757 3766 S 0002-9939(03)07050-3 Article electronically published on April 24, 2003 CONSTRUCTION OF BEST BREGMAN APPROXIMATIONS
More informationConvex Optimization. (EE227A: UC Berkeley) Lecture 15. Suvrit Sra. (Gradient methods III) 12 March, 2013
Convex Optimization (EE227A: UC Berkeley) Lecture 15 (Gradient methods III) 12 March, 2013 Suvrit Sra Optimal gradient methods 2 / 27 Optimal gradient methods We saw following efficiency estimates for
More informationOn the equivalence of the primal-dual hybrid gradient method and Douglas Rachford splitting
Mathematical Programming manuscript No. (will be inserted by the editor) On the equivalence of the primal-dual hybrid gradient method and Douglas Rachford splitting Daniel O Connor Lieven Vandenberghe
More informationOn Slater s condition and finite convergence of the Douglas Rachford algorithm for solving convex feasibility problems in Euclidean spaces
On Slater s condition and finite convergence of the Douglas Rachford algorithm for solving convex feasibility problems in Euclidean spaces Heinz H. Bauschke, Minh N. Dao, Dominikus Noll and Hung M. Phan
More informationProximal Methods for Optimization with Spasity-inducing Norms
Proximal Methods for Optimization with Spasity-inducing Norms Group Learning Presentation Xiaowei Zhou Department of Electronic and Computer Engineering The Hong Kong University of Science and Technology
More informationINERTIAL ACCELERATED ALGORITHMS FOR SOLVING SPLIT FEASIBILITY PROBLEMS. Yazheng Dang. Jie Sun. Honglei Xu
Manuscript submitted to AIMS Journals Volume X, Number 0X, XX 200X doi:10.3934/xx.xx.xx.xx pp. X XX INERTIAL ACCELERATED ALGORITHMS FOR SOLVING SPLIT FEASIBILITY PROBLEMS Yazheng Dang School of Management
More informationAn Overview of Recent and Brand New Primal-Dual Methods for Solving Convex Optimization Problems
PGMO 1/32 An Overview of Recent and Brand New Primal-Dual Methods for Solving Convex Optimization Problems Emilie Chouzenoux Laboratoire d Informatique Gaspard Monge - CNRS Univ. Paris-Est Marne-la-Vallée,
More informationWEAK CONVERGENCE THEOREMS FOR EQUILIBRIUM PROBLEMS WITH NONLINEAR OPERATORS IN HILBERT SPACES
Fixed Point Theory, 12(2011), No. 2, 309-320 http://www.math.ubbcluj.ro/ nodeacj/sfptcj.html WEAK CONVERGENCE THEOREMS FOR EQUILIBRIUM PROBLEMS WITH NONLINEAR OPERATORS IN HILBERT SPACES S. DHOMPONGSA,
More informationA primal dual Splitting Method for Convex. Optimization Involving Lipschitzian, Proximable and Linear Composite Terms,
A primal dual Splitting Method for Convex Optimization Involving Lipschitzian, Proximable and Linear Composite Terms Laurent Condat Final author s version. Cite as: L. Condat, A primal dual Splitting Method
More informationARock: an Algorithmic Framework for Asynchronous Parallel Coordinate Updates
ARock: an Algorithmic Framework for Asynchronous Parallel Coordinate Updates Zhimin Peng Yangyang Xu Ming Yan Wotao Yin May 3, 216 Abstract Finding a fixed point to a nonexpansive operator, i.e., x = T
More informationA splitting minimization method on geodesic spaces
A splitting minimization method on geodesic spaces J.X. Cruz Neto DM, Universidade Federal do Piauí, Teresina, PI 64049-500, BR B.P. Lima DM, Universidade Federal do Piauí, Teresina, PI 64049-500, BR P.A.
More informationMonotone variational inequalities, generalized equilibrium problems and fixed point methods
Wang Fixed Point Theory and Applications 2014, 2014:236 R E S E A R C H Open Access Monotone variational inequalities, generalized equilibrium problems and fixed point methods Shenghua Wang * * Correspondence:
More informationADMM and Fast Gradient Methods for Distributed Optimization
ADMM and Fast Gradient Methods for Distributed Optimization João Xavier Instituto Sistemas e Robótica (ISR), Instituto Superior Técnico (IST) European Control Conference, ECC 13 July 16, 013 Joint work
More informationVariable Metric Forward-Backward Algorithm
Variable Metric Forward-Backward Algorithm 1/37 Variable Metric Forward-Backward Algorithm for minimizing the sum of a differentiable function and a convex function E. Chouzenoux in collaboration with
More informationConvex Optimization Notes
Convex Optimization Notes Jonathan Siegel January 2017 1 Convex Analysis This section is devoted to the study of convex functions f : B R {+ } and convex sets U B, for B a Banach space. The case of B =
More informationYou should be able to...
Lecture Outline Gradient Projection Algorithm Constant Step Length, Varying Step Length, Diminishing Step Length Complexity Issues Gradient Projection With Exploration Projection Solving QPs: active set
More informationSecond order forward-backward dynamical systems for monotone inclusion problems
Second order forward-backward dynamical systems for monotone inclusion problems Radu Ioan Boţ Ernö Robert Csetnek March 6, 25 Abstract. We begin by considering second order dynamical systems of the from
More informationHeinz H. Bauschke and Walaa M. Moursi. December 1, Abstract
The magnitude of the minimal displacement vector for compositions and convex combinations of firmly nonexpansive mappings arxiv:1712.00487v1 [math.oc] 1 Dec 2017 Heinz H. Bauschke and Walaa M. Moursi December
More informationExtrapolation algorithm for affine-convex feasibility problems
Extrapolation algorithm for affine-convex feasibility problems Heinz H. Bauschke, Patrick L. Combettes, and Serge G. Kruk October 5, 2005 version 1.25 Abstract The convex feasibility problem under consideration
More informationAbstract In this article, we consider monotone inclusions in real Hilbert spaces
Noname manuscript No. (will be inserted by the editor) A new splitting method for monotone inclusions of three operators Yunda Dong Xiaohuan Yu Received: date / Accepted: date Abstract In this article,
More informationMaster 2 MathBigData. 3 novembre CMAP - Ecole Polytechnique
Master 2 MathBigData S. Gaïffas 1 3 novembre 2014 1 CMAP - Ecole Polytechnique 1 Supervised learning recap Introduction Loss functions, linearity 2 Penalization Introduction Ridge Sparsity Lasso 3 Some
More informationDistributed Optimization via Alternating Direction Method of Multipliers
Distributed Optimization via Alternating Direction Method of Multipliers Stephen Boyd, Neal Parikh, Eric Chu, Borja Peleato Stanford University ITMANET, Stanford, January 2011 Outline precursors dual decomposition
More informationNeighboring feasible trajectories in infinite dimension
Neighboring feasible trajectories in infinite dimension Marco Mazzola Université Pierre et Marie Curie (Paris 6) H. Frankowska and E. M. Marchini Control of State Constrained Dynamical Systems Padova,
More informationConductivity imaging from one interior measurement
Conductivity imaging from one interior measurement Amir Moradifam (University of Toronto) Fields Institute, July 24, 2012 Amir Moradifam (University of Toronto) A convergent algorithm for CDII 1 / 16 A
More informationWeak and strong convergence theorems of modified SP-iterations for generalized asymptotically quasi-nonexpansive mappings
Mathematica Moravica Vol. 20:1 (2016), 125 144 Weak and strong convergence theorems of modified SP-iterations for generalized asymptotically quasi-nonexpansive mappings G.S. Saluja Abstract. The aim of
More information