The Direct Extension of ADMM for Multi-block Convex Minimization Problems is Not Necessarily Convergent


The Direct Extension of ADMM for Multi-block Convex Minimization Problems is Not Necessarily Convergent

Yinyu Ye
K. T. Li Professor of Engineering
Department of Management Science and Engineering, Stanford University, and
The International Center of Management Science and Engineering, Nanjing University, Nanjing, China
http://www.stanford.edu/~yyye

Joint work with Caihua Chen, Bingsheng He, Xiaoming Yuan

April 25, 2014

Outline

1. Background and Motivation
2. Divergent Examples for the Extended ADMM
3. The Small-Stepsize Variant of ADMM
4. Conclusions

1. Background and Motivation

Alternating Direction Method of Multipliers I

$$\min \{\, \theta_1(x_1) + \theta_2(x_2) \mid A_1 x_1 + A_2 x_2 = b,\; x_1 \in X_1,\; x_2 \in X_2 \,\}$$

$\theta_1(x_1)$ and $\theta_2(x_2)$ are closed proper convex functions; $X_1$ and $X_2$ are convex sets.

The alternating direction method of multipliers (Glowinski & Marrocco 75, Gabay & Mercier 76):

$$x_1^{k+1} = \arg\min \{\, L_A(x_1, x_2^k, \lambda^k) \mid x_1 \in X_1 \,\},$$
$$x_2^{k+1} = \arg\min \{\, L_A(x_1^{k+1}, x_2, \lambda^k) \mid x_2 \in X_2 \,\},$$
$$\lambda^{k+1} = \lambda^k - \beta (A_1 x_1^{k+1} + A_2 x_2^{k+1} - b),$$

where the augmented Lagrangian function $L_A$ is defined as

$$L_A(x_1, x_2, \lambda) = \sum_{i=1}^{2} \theta_i(x_i) - \lambda^T \Big( \sum_{i=1}^{2} A_i x_i - b \Big) + \frac{\beta}{2} \Big\| \sum_{i=1}^{2} A_i x_i - b \Big\|^2.$$
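A minimal numerical sketch of these updates, assuming for illustration the quadratic blocks $\theta_i(x_i) = \frac{1}{2}\|x_i - c_i\|^2$ so that each subproblem has a closed-form solution (the data and objective here are hypothetical, not from the talk):

```python
import numpy as np

def admm_two_block(A1, A2, b, c1, c2, beta=1.0, iters=200):
    """Two-block ADMM for min 0.5||x1-c1||^2 + 0.5||x2-c2||^2
    s.t. A1 x1 + A2 x2 = b (illustrative quadratic objective)."""
    n1, n2, m = len(c1), len(c2), len(b)
    x1, x2, lam = np.zeros(n1), np.zeros(n2), np.zeros(m)
    for _ in range(iters):
        # x1-step: minimize L_A over x1 (normal equations of a quadratic)
        rhs1 = c1 + A1.T @ (lam - beta * (A2 @ x2 - b))
        x1 = np.linalg.solve(np.eye(n1) + beta * A1.T @ A1, rhs1)
        # x2-step: uses the freshly updated x1 (the "alternating" direction)
        rhs2 = c2 + A2.T @ (lam - beta * (A1 @ x1 - b))
        x2 = np.linalg.solve(np.eye(n2) + beta * A2.T @ A2, rhs2)
        # multiplier update
        lam = lam - beta * (A1 @ x1 + A2 @ x2 - b)
    return x1, x2, lam
```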

Alternating Direction Method of Multipliers II

Theoretical results of ADMM:
- Douglas-Rachford splitting method applied to its dual (Gabay 76).
- A special implementation of the proximal point algorithm (Eckstein & Bertsekas 92). Thus the convergence of ADMM can be easily established using classical operator theory.
- O(1/k) convergence rate (He & Yuan 12, Monteiro & Svaiter 13).
- Linear convergence under certain conditions (Lions & Mercier 79, Eckstein 89, ...).

Applications of ADMM: partial differential equations, mechanics, image processing, compressed sensing, statistical learning, computer vision, semidefinite programming, ...

ADMM for Multi-block Convex Minimization Problems

Convex minimization problems with three blocks:

$$\min\ \theta_1(x_1) + \theta_2(x_2) + \theta_3(x_3)$$
$$\text{s.t.}\ A_1 x_1 + A_2 x_2 + A_3 x_3 = b,\quad x_1 \in X_1,\; x_2 \in X_2,\; x_3 \in X_3$$

$\theta_1(x_1)$, $\theta_2(x_2)$ and $\theta_3(x_3)$ are closed proper convex functions; $X_1$, $X_2$ and $X_3$ are convex sets.

The direct extension of ADMM:

$$x_1^{k+1} = \arg\min \{\, L_A(x_1, x_2^k, x_3^k, \lambda^k) \mid x_1 \in X_1 \,\}$$
$$x_2^{k+1} = \arg\min \{\, L_A(x_1^{k+1}, x_2, x_3^k, \lambda^k) \mid x_2 \in X_2 \,\}$$
$$x_3^{k+1} = \arg\min \{\, L_A(x_1^{k+1}, x_2^{k+1}, x_3, \lambda^k) \mid x_3 \in X_3 \,\}$$
$$\lambda^{k+1} = \lambda^k - \beta (A_1 x_1^{k+1} + A_2 x_2^{k+1} + A_3 x_3^{k+1} - b)$$

where

$$L_A(x_1, x_2, x_3, \lambda) = \sum_{i=1}^{3} \theta_i(x_i) - \lambda^T \Big( \sum_{i=1}^{3} A_i x_i - b \Big) + \frac{\beta}{2} \Big\| \sum_{i=1}^{3} A_i x_i - b \Big\|^2.$$
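The two-block sketch above extends directly; again assuming the illustrative quadratic blocks $\theta_i(x_i) = \frac{1}{2}\|x_i - c_i\|^2$, the only change is that each block solve uses the freshest iterates of the other blocks (a Gauss-Seidel sweep) before a single multiplier update:

```python
import numpy as np

def admm_direct_extension(As, cs, b, beta=1.0, iters=200):
    """Direct multi-block extension: sequentially minimize L_A over each
    block, then update the multiplier once per sweep (illustrative
    quadratic objective 0.5 * sum ||x_i - c_i||^2)."""
    xs = [np.zeros(len(c)) for c in cs]
    lam = np.zeros(len(b))
    for _ in range(iters):
        for i, (Ai, ci) in enumerate(zip(As, cs)):
            # residual of the other blocks, using already-updated iterates
            r = sum(As[j] @ xs[j] for j in range(len(As)) if j != i) - b
            rhs = ci + Ai.T @ (lam - beta * r)
            xs[i] = np.linalg.solve(np.eye(len(ci)) + beta * Ai.T @ Ai, rhs)
        lam = lam - beta * (sum(A @ x for A, x in zip(As, xs)) - b)
    return xs, lam
```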

Applications of the Extended ADMM

The extended ADMM finds many applications: robust PCA with noisy and incomplete data, the image alignment problem, the latent variable Gaussian graphical model, the quadratic discriminant analysis model, etc.

It has been popularly used in practice, and it has outperformed other variants of ADMM most of the time.

Therefore, one would expect that the extended ADMM always converges. However, ...

Theoretical Results of the Extended ADMM

It is not easy to analyze the convergence:
- The operator theory for the ADMM cannot be directly extended to the ADMM with three blocks.
- There is a big difference between the ADMM with two blocks and with three blocks.

Existing results for global convergence:
- Strong convexity, plus β in a specific range (Han & Yuan 12).
- Certain conditions on the problem, together with a sufficiently small stepsize γ in the update of the multipliers (Hong & Luo 12), i.e.,
$$\lambda^{k+1} = \lambda^k - \gamma\beta (A_1 x_1^{k+1} + A_2 x_2^{k+1} + A_3 x_3^{k+1} - b).$$
- A correction step (He & Tao & Yuan 12, He & Tao & Yuan-IMA).

But these results did not answer the open question of whether or not the direct extension of ADMM converges under the convexity assumption alone.

2. Divergent Examples for the Extended ADMM

Strategy to Construct the Counter-example

A sufficient condition to guarantee the convergence of the extended ADMM:
$$A_1^T A_2 = 0, \quad \text{or} \quad A_2^T A_3 = 0, \quad \text{or} \quad A_3^T A_1 = 0.$$

Consider the case $A_1^T A_2 = 0$, in which the extended ADMM reduces to the ADMM with two blocks (by regarding $(x_1, x_2)$ as one variable).

Consequently, our strategy to construct a non-convergence example:
- $A_1$, $A_2$ and $A_3$ are similar but not identical.
- No objective function, so that the iteration operator is a linear mapping and convergence is independent of the choice of β. Then we set β = 1 for simplicity.

Thus, we simply consider a system of homogeneous linear equations with three variables:
$$A_1 x_1 + A_2 x_2 + A_3 x_3 = 0.$$

Divergent Example of the Extended ADMM I

Concretely, we take
$$A = (A_1, A_2, A_3) = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 2 \\ 1 & 2 & 2 \end{pmatrix}.$$

Thus the extended ADMM with β = 1 can be specified as a fixed linear iteration
$$\begin{pmatrix} x_1^{k+1} \\ x_2^{k+1} \\ x_3^{k+1} \\ \lambda^{k+1} \end{pmatrix} = T \begin{pmatrix} x_1^k \\ x_2^k \\ x_3^k \\ \lambda^k \end{pmatrix},$$
where the constant matrix $T$ is determined by $A$.

Divergent Example of the Extended ADMM II

Or equivalently, since $x_1^{k+1}$ is fully determined by $(x_2^k, x_3^k, \lambda^k)$, the iteration reduces to
$$\begin{pmatrix} x_2^{k+1} \\ x_3^{k+1} \\ \lambda^{k+1} \end{pmatrix} = M \begin{pmatrix} x_2^k \\ x_3^k \\ \lambda^k \end{pmatrix},$$
where $M \in \mathbb{R}^{5 \times 5}$ is a constant matrix determined by $A$.

Divergent Example of the Extended ADMM III

The matrix $M$ admits the eigen-decomposition $M = V \operatorname{Diag}(d) V^{-1}$, where the entries of $d$ and $V$ are complex; in particular, $d_1$ and $d_2$ form a complex-conjugate pair and one entry of $d$ is 0.

Note that $\rho(M) = |d_1| = |d_2| > 1$.

Divergent Example of the Extended ADMM IV

Take the initial point $(x_2^0, x_3^0, \lambda^0)$ as $V(:,1) + V(:,2) \in \mathbb{R}^5$. Then
$$\begin{pmatrix} x_2^{k+1} \\ x_3^{k+1} \\ \lambda^{k+1} \end{pmatrix} = V \operatorname{Diag}(d^{k+1}) V^{-1} \begin{pmatrix} x_2^0 \\ x_3^0 \\ \lambda^0 \end{pmatrix} = d_1^{k+1} V(:,1) + d_2^{k+1} V(:,2),$$
which is divergent since $|d_1| = |d_2| > 1$.
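A short verification sketch under the stated setup (no objective, β = 1, b = 0, and the matrix A above): since one sweep of the direct extension is linear in (x₂, x₃, λ), M can be recovered column by column by applying the sweep to each canonical basis vector, after which its spectral radius can be checked directly.

```python
import numpy as np

a1, a2, a3 = np.array([1., 1., 1.]), np.array([1., 1., 2.]), np.array([1., 2., 2.])

def block_min(a, r, lam, beta=1.0):
    # argmin_x (beta/2)||a*x + r||^2 - lam^T (a*x) for a one-dimensional block
    return a @ (lam - beta * r) / (beta * (a @ a))

def sweep(z, beta=1.0):
    # one pass of the direct extension on the reduced state z = (x2, x3, lam)
    x2, x3, lam = z[0], z[1], z[2:]
    x1 = block_min(a1, x2 * a2 + x3 * a3, lam, beta)
    x2 = block_min(a2, x1 * a1 + x3 * a3, lam, beta)
    x3 = block_min(a3, x1 * a1 + x2 * a2, lam, beta)
    lam = lam - beta * (x1 * a1 + x2 * a2 + x3 * a3)
    return np.concatenate(([x2, x3], lam))

# the sweep is linear, so its matrix M is recovered column by column
M = np.column_stack([sweep(e) for e in np.eye(5)])
print(max(abs(np.linalg.eigvals(M))))  # spectral radius: slightly above 1
```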

Strong Convexity Helps?

Consider the following example:
$$\min\ 0.05 x_1^2 + 0.05 x_2^2 + 0.05 x_3^2 \quad \text{s.t.}\quad A_1 x_1 + A_2 x_2 + A_3 x_3 = 0, \qquad (1)$$
with $A = (A_1, A_2, A_3)$ as in the previous example.

- The matrix $M$ of the extended ADMM (β = 1) has $\rho(M) > 1$.
- We are able to find a proper initial point such that the extended ADMM diverges.
- Hence, even for strongly convex programming, the extended ADMM is not necessarily convergent for a given β > 0.
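Adapting the previous sketch, each one-dimensional block solve simply gains the 0.1x term contributed by the 0.05x² objective; again this is a verification sketch under the slide's data, not code from the talk:

```python
import numpy as np

a1, a2, a3 = np.array([1., 1., 1.]), np.array([1., 1., 2.]), np.array([1., 2., 2.])

def block_min_sc(a, r, lam, beta=1.0, mu=0.1):
    # argmin_x (mu/2)x^2 + (beta/2)||a*x + r||^2 - lam^T (a*x)
    return a @ (lam - beta * r) / (mu + beta * (a @ a))

def sweep_sc(z, beta=1.0):
    x2, x3, lam = z[0], z[1], z[2:]
    x1 = block_min_sc(a1, x2 * a2 + x3 * a3, lam, beta)
    x2 = block_min_sc(a2, x1 * a1 + x3 * a3, lam, beta)
    x3 = block_min_sc(a3, x1 * a1 + x2 * a2, lam, beta)
    lam = lam - beta * (x1 * a1 + x2 * a2 + x3 * a3)
    return np.concatenate(([x2, x3], lam))

M = np.column_stack([sweep_sc(e) for e in np.eye(5)])
print(max(abs(np.linalg.eigvals(M))))  # still exceeds 1, per the slide
```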

3. The Small-Stepsize Variant of ADMM

The Stepsize of ADMM

In the direct extension of ADMM, the Lagrangian multiplier is updated by
$$\lambda^{k+1} := \lambda^k - \gamma\beta (A_1 x_1^{k+1} + A_2 x_2^{k+1} + \cdots + A_j x_j^{k+1} - b).$$

Convergence is proved:
- j = 1 (augmented Lagrangian method): for $\gamma \in (0, 2)$ (Hestenes 69, Powell 69).
- j = 2 (alternating direction method of multipliers): for $\gamma \in \big(0, \frac{1+\sqrt{5}}{2}\big)$ (Glowinski 84).
- j ≥ 3: for γ sufficiently small, provided additional conditions on the problem (Hong & Luo 12).

Question: Is there a problem-data-independent γ such that the method converges?

A Numerical Study (Ongoing)

Consider the linear system
$$\begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1+\gamma \\ 1 & 1+\gamma & 1+\gamma \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = 0.$$

Table: the spectral radius of the iteration matrix as γ shrinks

  γ:      1e-2   1e-3   1e-4   1e-5   1e-6   1e-7
  ρ(M):   > 1    > 1    > 1    > 1    > 1    > 1

Thus, there appears to be no practical problem-data-independent γ for which the small-stepsize variant works.
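A sketch of this scan, assuming (as the slide suggests) that the same γ both perturbs the problem data and scales the multiplier step; it builds the linear iteration map for each γ and prints its spectral radius:

```python
import numpy as np

def rho(gamma, beta=1.0):
    """Spectral radius of the small-stepsize direct extension (no objective,
    b = 0) on the gamma-perturbed system; gamma also scales the dual step."""
    a = [np.array([1., 1., 1.]),
         np.array([1., 1., 1. + gamma]),
         np.array([1., 1. + gamma, 1. + gamma])]

    def sweep(z):
        x, lam = [0.0, z[0], z[1]], z[2:]
        for i in range(3):
            r = sum(a[j] * x[j] for j in range(3) if j != i)
            x[i] = a[i] @ (lam - beta * r) / (beta * (a[i] @ a[i]))
        lam = lam - gamma * beta * sum(a[i] * x[i] for i in range(3))
        return np.concatenate(([x[1], x[2]], lam))

    M = np.column_stack([sweep(e) for e in np.eye(5)])
    return max(abs(np.linalg.eigvals(M)))

for g in [1e-2, 1e-3, 1e-4, 1e-5, 1e-6, 1e-7]:
    print(g, rho(g))  # each radius stays above 1, per the table
```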

4. Conclusions

Conclusions

- We construct examples showing that the direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent for any given algorithm parameter β.
- Even in the case where the objective function is strongly convex, the direct extension of ADMM loses its convergence for a given β.
- There does not exist a practical problem-data-independent stepsize γ such that the small-stepsize variant of ADMM would work.
- Is there a cyclic non-converging example?
- Our results support the need for a correction step in ADMM-type methods (He & Tao & Yuan 12, He & Tao & Yuan-IMA).
- Question: Is there a simple correction of the ADMM for multi-block convex minimization problems? Or how can we treat all blocks equally?

How to Treat All Blocks Equally?

Answer: an independent uniform random permutation in each iteration!

- Select the block-update order in a uniformly random fashion; this equivalently reduces the ADMM algorithm to the one-block case.
- Or fix the first block, and then select the order of the remaining blocks in a uniformly random fashion; this equivalently reduces the ADMM algorithm to the two-block case.
- It works for the example, and we conjecture it works in general.
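A sketch of the randomly permuted variant on the counterexample data, reusing the one-dimensional block solve from before (illustrative, not the talk's code): the only change to the direct extension is drawing a fresh uniform block order each iteration.

```python
import numpy as np

rng = np.random.default_rng(0)
a = [np.array([1., 1., 1.]), np.array([1., 1., 2.]), np.array([1., 2., 2.])]
x, lam, beta = [1.0, 1.0, 1.0], np.zeros(3), 1.0  # nonzero start

for k in range(500):
    for i in rng.permutation(3):  # fresh uniform order every iteration
        r = sum(a[j] * x[j] for j in range(3) if j != i)
        x[i] = a[i] @ (lam - beta * r) / (beta * (a[i] @ a[i]))
    lam = lam - beta * sum(a[i] * x[i] for i in range(3))

# empirically the iterates settle toward the solution x = 0 on this example
print(x, np.linalg.norm(lam))
```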

Thank You!
