Distributed Control of Connected Vehicles and Fast ADMM for Sparse SDPs

Distributed Control of Connected Vehicles and Fast ADMM for Sparse SDPs
Yang Zheng
Department of Engineering Science, University of Oxford
Homepage: http://users.ox.ac.uk/~ball4503/index.html
Talk at Cranfield University, May 2017

OUTLINE
1. Distributed Control of Connected Vehicles
2. Fast ADMM for Sparse SDPs

Outline: 1 Distributed Control of Connected Vehicles
1) Background
2) Modeling: the four-component framework
3) Analysis: Stability and Robustness
4) Synthesis: Design of DMPC

References:
1. Zheng, Y., Li, S. E., Wang, J., Cao, D., & Li, K. (2016). Stability and scalability of homogeneous vehicular platoon: Study on the influence of information flow topologies. IEEE Transactions on Intelligent Transportation Systems, 17(1), 14-26.
2. Zheng, Y., Li, S. E., Li, K., & Wang, L. Y. (2016). Stability margin improvement of vehicular platoon considering undirected topology and asymmetric control. IEEE Transactions on Control Systems Technology, 24(4), 1253-1265.
3. Zheng, Y., Li, S. E., Li, K., Borrelli, F., & Hedrick, J. K. (2017). Distributed model predictive control for heterogeneous vehicle platoons under unidirectional topologies. IEEE Transactions on Control Systems Technology, 25(3), 899-910.

1. Distributed Control of Connected Vehicles
Background: Vehicle Platoon

Control objectives:
a) to ensure that all vehicles in the group move at the same speed as the leader;
b) to maintain the desired spacing between adjacent vehicles.

Potential benefits: improve traffic efficiency, enhance road safety, and reduce fuel consumption, etc.
The earliest implementation dates back to the PATH program in the late 1980s.

Real-world experiments:
USA - PATH
Europe - SARTRE
Japan - Energy ITS

1. Distributed Control of Connected Vehicles
View platoons from a networked control perspective

[Figure: a typical communication topology with vehicles connected by V2V, and six different communication topologies (a)-(f)]

New challenge: variety of topologies
1. Dynamical Modeling: the four-component framework
2. Performance Analysis: stability and robustness analysis
3. Controller Synthesis: design of Distributed Model Predictive Control

1. Distributed Control of Connected Vehicles
Modeling: Networks of Dynamical Systems from a Control Perspective
Source: http://www.mcs.anl.gov/~fulin/talks/argonne.pdf

Key ingredients:
1. Dynamics + Communication
2. Control Theory + Graph Theory

Research topics:
1. Dynamics: single integrator, double integrator, linear dynamics, nonlinear dynamics
2. Communication: data rate, switching topology, time delay

Applications: nature, transportation, smart grid

Vehicle platoons can be viewed as a special one-dimensional network of dynamical systems.

1. Distributed Control of Connected Vehicles
Modeling of Platoons: the four-component framework

[Figure: platoon diagram with nodes N, ..., i, i-1, ..., 1, 0 (leader LV), distributed controllers C_i, control inputs u_i, leader speed v_0, and spacings d_{r,i}(t), d_{des,i}]

Vehicle Dynamics: linear dynamics, nonlinear dynamics;
Formation Geometry: constant spacing, time headway policy;
Distributed Controller: linear controller, MPC, robust control;
Information Flow Topology: PF, PFL, BD, etc.; algebraic graph theory.

Explicitly highlight the influence of different components!

1. Distributed Control of Connected Vehicles
Modeling: Categorization of existing works [Li and Zheng et al., 2015]

1. Distributed Control of Connected Vehicles
Modeling: Typical case
1. Linear dynamics
2. Constant spacing policy
3. Communication topology
4. Linear controller

1) Linear vehicle dynamics:
$$\dot{x}_i(t) = A x_i(t) + B_1 u_i(t) + B_2 w_i(t), \quad x_i = \begin{bmatrix} p_i \\ v_i \\ a_i \end{bmatrix}, \quad A = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & -1/\tau \end{bmatrix}, \quad B_1 = B_2 = \begin{bmatrix} 0 \\ 0 \\ 1/\tau \end{bmatrix}$$

2) Constant spacing policy (control objectives):
$$\lim_{t \to \infty} \| v_i(t) - v_0(t) \| = 0, \qquad \lim_{t \to \infty} \| p_{i-1}(t) - p_i(t) - d_{i-1,i} \| = 0, \quad i = 1, 2, \dots, N$$

3) Communication topology: pinning matrix $\mathcal{P}$, adjacency matrix $\mathcal{A}$, Laplacian matrix $L$.

4) Linear controller:
$$u_i = -\sum_{j \in \mathbb{I}_i} \left[ k_p (p_i - p_j - d_{i,j}) + k_v (v_i - v_j) + k_a (a_i - a_j) \right]$$

Closed-loop dynamics:
$$\dot{X} = \left[ I_N \otimes A - (L + \mathcal{P}) \otimes B_1 k^T \right] X + (I_N \otimes B_2) W$$
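As a concrete illustration of the closed-loop expression above, the following minimal MATLAB sketch assembles $I_N \otimes A - (L + \mathcal{P}) \otimes B_1 k^T$ for a small predecessor-following platoon; the platoon size, time constant, and gains are illustrative assumptions, not values from the talk.

```matlab
% Minimal sketch: build the closed-loop platoon matrix for an assumed
% predecessor-following (PF) topology. N, tau and the gains are assumptions.
N   = 5;                         % number of followers
tau = 0.5;                       % powertrain time constant
A   = [0 1 0; 0 0 1; 0 0 -1/tau];
B1  = [0; 0; 1/tau];
k   = [1.0; 2.0; 0.5];           % [k_p; k_v; k_a]

Adj = diag(ones(N-1,1), -1);     % vehicle i listens to vehicle i-1
L   = diag(sum(Adj,2)) - Adj;    % graph Laplacian
P   = diag([1, zeros(1,N-1)]);   % pinning matrix: only vehicle 1 hears the leader

Acl = kron(eye(N), A) - kron(L + P, B1*k');   % closed-loop state matrix
fprintf('spectral abscissa = %.4f\n', max(real(eig(Acl))));  % negative => stable
```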

1. Distributed Control of Connected Vehicles
Analysis: Stability and Robustness

$$\dot{X} = \left[ I_N \otimes A - (L + \mathcal{P}) \otimes B_1 k^T \right] X + (I_N \otimes B_2) W$$

Performance indices:
Closed-loop stability;
Stability margin: the distance $d_{\min}$ from the least stable closed-loop eigenvalue to the imaginary axis;
Robustness index: for disturbances $w_i(t)$ with $\| w_i \|_{\mathcal{L}_2}^2 = \int_0^{\infty} \| w_i(t) \|^2 \, dt < \infty$,
$$\mathrm{AF}_{f2l} = \sup_{\| w_1 \|_{\mathcal{L}_2} \neq 0} \frac{\| p_N \|_{\mathcal{L}_2}}{\| w_1 \|_{\mathcal{L}_2}} = \| G_{f2l}(s) \|_{\mathcal{H}_\infty}, \qquad \mathrm{AF}_{a2a} = \sup_{\| W \|_{\mathcal{L}_2} \neq 0} \frac{\| Y \|_{\mathcal{L}_2}}{\| W \|_{\mathcal{L}_2}} = \| G_{a2a}(s) \|_{\mathcal{H}_\infty}$$

1. Distributed Control of Connected Vehicles
Stability Region Analysis [Zheng et al. 2014, ITSC]

Consider a homogeneous platoon with linear controllers given by
$$\dot{X} = \left[ I_N \otimes A - (L + \mathcal{P}) \otimes B k^T \right] X$$
If graph $\mathcal{G}$ satisfies certain conditions (all the eigenvalues $\lambda_i$ of $L + \mathcal{P}$ are positive real numbers), the platoon is asymptotically stable if and only if
$$k_p > 0, \qquad k_v > \frac{k_p \, \tau}{\min_i \lambda_i \, k_a + 1}, \qquad k_a > -\frac{1}{\max_i \lambda_i}.$$

Proof sketch: similarity transformation + Routh-Hurwitz stability criterion, using
$$\sigma\!\left( I_N \otimes A - (L + \mathcal{P}) \otimes B k^T \right) = \bigcup_{i=1}^{N} \sigma\!\left( A - \lambda_i B k^T \right), \qquad \det\!\left( sI - A + \lambda_i B k^T \right) = s^3 + \frac{\lambda_i k_a + 1}{\tau} s^2 + \frac{\lambda_i k_v}{\tau} s + \frac{\lambda_i k_p}{\tau}.$$

1. Distributed Control of Connected Vehicles
Scaling of Stability Margin [Zheng et al. 2016, IEEE ITS]

Consider a homogeneous platoon with linear controllers given by
$$\dot{X} = \left[ I_N \otimes A - (L + \mathcal{P}) \otimes B k^T \right] X$$
(2.1) if the graph $\mathcal{G}$ is the bidirectional (BD) topology, then the stability margin decays to zero as $O(1/N^2)$;
(2.2) if the graph $\mathcal{G}$ is the bidirectional-leader (BDL) topology, then the stability margin is always bounded away from zero.

[Figure: BD topology (each vehicle communicates with its two neighbours, vehicle 1 with the leader) vs. BDL topology (additionally, every vehicle receives the leader's information); stability margin $d_{\min}$ shown in the complex plane]
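A quick numerical check of this scaling can be run with the following sketch, under assumed gains (not the values used in the paper): the BD margin shrinks with N, while the BDL margin stays bounded.

```matlab
% Numerical sketch (assumed gains) of results (2.1)-(2.2).
tau = 0.5;  k = [1; 2; 0.5];                       % assumed [k_p; k_v; k_a]
A = [0 1 0; 0 0 1; 0 0 -1/tau];  B = [0; 0; 1/tau];
for N = [10 20 40 80]
    Adj = diag(ones(N-1,1),1) + diag(ones(N-1,1),-1);  % bidirectional chain
    L   = diag(sum(Adj,2)) - Adj;
    Pbd  = diag([1, zeros(1,N-1)]);   % BD: only vehicle 1 hears the leader
    Pbdl = eye(N);                    % BDL: every vehicle hears the leader
    m_bd  = -max(real(eig(kron(eye(N),A) - kron(L+Pbd,  B*k'))));
    m_bdl = -max(real(eig(kron(eye(N),A) - kron(L+Pbdl, B*k'))));
    fprintf('N = %3d:  BD margin = %.2e,  BDL margin = %.2e\n', N, m_bd, m_bdl);
end
```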

1. Distributed Control of Connected Vehicles
Stability Margin Improvement: Asymmetric control [Zheng et al. 2016, IEEE CST]

Consider a homogeneous platoon under the BD topology with the asymmetric controller architecture given by
$$\dot{X} = \left[ I_N \otimes A - \left( L_{BD} + \mathcal{P}_{BD} \right)_{\varepsilon} \otimes B k^T \right] X$$
(3.2) For any fixed $\varepsilon \in (0, 1)$, the stability margin is bounded away from zero and independent of the platoon size $N$ ($N$ can be any finite integer).

[Figure: asymmetric control architecture with forward gains $k_f^{(i)}$ and backward gains $k_b^{(i)}$ along the chain $N, N-1, \dots, 2, 1, 0$ (leader LV)]

The controller is called asymmetric if
$$k_f^{(i)} = (1 + \varepsilon)\, k, \quad k_b^{(i)} = (1 - \varepsilon)\, k, \quad i = 1, \dots, N-1, \qquad k_f^{(N)} = (1 + \varepsilon)\, k,$$
where $\varepsilon \in (0, 1)$ is called the asymmetric degree. Note that if $\varepsilon = 0$, the architecture reduces to the symmetric case.

1. Distributed Control of Connected Vehicles
Stability Margin Improvement: Asymmetric control [Zheng et al. 2016, IEEE CST]

Tradeoff: convergence speed vs. transient performance
Benefit: bounded stability margin, good for convergence speed.
Cost: overshooting phenomena in the transient process.

[Figure: stability margin vs. the number of vehicles N (10 to 500) for asymmetric degrees ε = 0, 0.2, 0.4, 0.6]
[Figure: spacing errors (m) over time (s) of vehicles 1, 6, 12, 18, 24, 30 for homogeneous platoons under the BD topology with different asymmetric degrees ε: (a) ε = 0 (symmetric); (b) ε = 0.2; (c) ε = 0.4; (d) ε = 0.6]

1. Distributed Control of Connected Vehicles
Stability Margin Improvement: Topological Selection [Zheng et al. 2016, IEEE CST]

Consider a homogeneous platoon with linear controllers given by
$$\dot{X} = \left[ I_N \otimes A - (L + \mathcal{P}) \otimes B k^T \right] X$$
(3.1) if the graph $\mathcal{G}$ is undirected, then to maintain a bounded stability margin, the number of followers that directly obtain the leader's information must grow linearly with the platoon size (i.e., $\Omega(N) = O(N)$).
To maintain a bounded stability margin, the tree depth of graph $\mathcal{G}$ should be a constant number, independent of the platoon size $N$.

[Figure: example topologies with N = 2n and N = c(k+1) vehicles, illustrating how many followers receive the leader's information and the resulting tree depth]

1. Distributed Control of Connected Vehicles
Stability Margin Improvement: Topological Selection [Zheng et al. 2016, IEEE CST]

[Figure: stability margin vs. the number of vehicles N (10 to 500) for four cases: Ω(N)=N/2 with tree depth c=2; Ω(N)=N/4 with tree depth c=4; Ω(N)=N/2 with tree depth c=N/2; Ω(N)=N/4 with tree depth c=3N/4]

Extending the information flow to reduce the tree depth is one major way to guarantee a bounded stability margin.

1. Distributed Control of Connected Vehicles
Design of DMPC for Nonlinear Heterogeneous Platoons

Goal: design a distributed controller for a heterogeneous platoon considering nonlinear dynamics, input constraints, and a variety of communication topologies.

Nonlinear heterogeneous model:
$$\dot{p}_i(t) = v_i(t), \qquad \frac{\eta_{T,i}}{r_{w,i}} T_i(t) = m_i \dot{v}_i(t) + C_{A,i} v_i^2(t) + m_i g f_i, \qquad \tau_i \dot{T}_i(t) + T_i(t) = u_i(t)$$
After discretization:
$$x_i(t+1) = \phi_i\big(x_i(t)\big) + \psi_i\, u_i(t)$$

Workflow: nonlinear heterogeneous model -> design of DMPC -> stability analysis; linear homogeneous platoon -> performance analysis; simulation benchmark.
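To make the discretized form $x_i(t+1) = \phi_i(x_i(t)) + \psi_i u_i(t)$ concrete, here is a forward-Euler sketch in MATLAB with assumed vehicle parameters (all numerical values below are illustrative assumptions, not those of the benchmark):

```matlab
% Sketch: one forward-Euler step of the nonlinear longitudinal model,
% state x = [p; v; T] (position, speed, actual torque). Parameters assumed.
dt = 0.1;  m = 1500;  eta = 0.9;  rw = 0.3;  CA = 0.5;  g = 9.8;  f = 0.01;  taui = 0.4;
phi = @(x) [x(1) + dt*x(2);                                    % position update
            x(2) + dt*( eta/(rw*m)*x(3) - CA/m*x(2)^2 - g*f ); % speed update
            x(3) - dt/taui*x(3)];                              % torque lag (autonomous part)
psi = [0; 0; dt/taui];                                         % input enters the torque lag

x = [0; 20; m*rw/eta*(CA/m*20^2 + g*f)];   % start at 20 m/s with equilibrium torque
u = x(3);                                  % hold the equilibrium torque
xnext = phi(x) + psi*u;                    % x_i(t+1) = phi_i(x_i(t)) + psi_i*u_i(t)
```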

1. Distributed Control of Connected Vehicles
DMPC: Local open-loop optimal control problem

Problem $\mathcal{F}_i$: for $i \in \{1, 2, \dots, N\}$ at time $t$,
$$\min_{u_i^p(\cdot|t)} \; J_i\big(y_i^p(\cdot|t),\, u_i^p(\cdot|t),\, y_i^a(\cdot|t),\, y_{-i}^a(\cdot|t)\big) = \sum_{k=0}^{N_p - 1} l_i\big(y_i^p(k|t),\, u_i^p(k|t),\, y_i^a(k|t),\, y_{-i}^a(k|t)\big) \quad \text{(cost function)}$$
subject to, for $k = 0, \dots, N_p - 1$:
$$x_i^p(k+1|t) = \phi_i\big(x_i^p(k|t)\big) + \psi_i\, u_i^p(k|t), \quad y_i^p(k|t) = \gamma\, x_i^p(k|t), \quad x_i^p(0|t) = x_i(t) \quad \text{(dynamic constraints in the prediction horizon)}$$
$$u_i^p(k|t) \in \mathcal{U}_i \quad \text{(input constraints)}$$
$$y_i^p(N_p|t) = \frac{1}{|\mathbb{I}_i|} \sum_{j \in \mathbb{I}_i} \big( y_j^a(N_p|t) - \tilde{d}_{j,i} \big), \qquad T_i^p(N_p|t) = h_i\big( v_i(N_p|t) \big) \quad \text{(terminal constraints, for stability)}$$
The terminal output constraint is based on the local average of the neighbors' assumed outputs.

1. Distributed Control of Connected Vehicles
DMPC: Local open-loop optimal control problem

Construction of the local cost function (with design parameters $Q_i$, $R_i$, $F_i$, $G_i$):
$$l_i\big(y_i^p(k|t),\, u_i^p(k|t),\, y_i^a(k|t),\, y_{-i}^a(k|t)\big) = \big\| y_i^p(k|t) - y_{des,i}(k|t) \big\|_{Q_i} + \big\| u_i^p(k|t) - h_i\big(v_i^p(k|t)\big) \big\|_{R_i} + \big\| y_i^p(k|t) - y_i^a(k|t) \big\|_{F_i} + \sum_{j \in \mathbb{N}_i} \big\| y_i^p(k|t) - y_j^a(k|t) - \tilde{d}_{i,j} \big\|_{G_i}$$

Tracking the leader: $Q_i$ (set $Q_i = 0$ if vehicle $i$ does not receive the leader's information);
Penalizing the input: $R_i \succ 0$;
Maintaining its own assumed output: $F_i \succeq 0$ (this output is sent to the nodes in the set $\mathbb{O}_i$);
Maintaining the assumed outputs of its neighbors: $G_i \succeq 0$, i.e., node $i$ tries to keep its output as close to the assumed trajectories of its neighbors ($j \in \mathbb{N}_i$) as possible; this is used for stability.

[Figure: node $j$ with out-neighbor set $\mathbb{O}_j = \{j_1, j_2, j_3, j_4\}$ and node $i$ with in-neighbor set $\mathbb{N}_i = \{i_1, i_2, i_3, i_4\}$]

1. Distributed Control of Connected Vehicles

Assumption 1 (Unidirectional topology): The graph $\mathcal{G}$ contains a spanning tree rooted at the leader, and the communications are unidirectional, from preceding vehicles to downstream ones.

Sufficient condition [Zheng et al. 2017, IEEE CST]: If $\mathcal{G}$ satisfies Assumption 1, a platoon under the proposed DMPC is asymptotically stable if
$$F_i \succeq \sum_{j \in \mathbb{O}_i} G_j, \quad i \in \mathbb{N}.$$
The main strategy is to construct a proper Lyapunov function for the platoon (the sum of the local cost functions) and prove its decreasing property.

[Figure: examples (a)-(d) of unidirectional topologies over the chain $N, N-1, N-2, \dots, 2, 1, 0$ (leader LV)]

4. Synthesis: Design of DMPC - Simulation

The desired leader trajectory:
$$v_0(t) = \begin{cases} 20 \text{ m/s}, & t \le 1\,\text{s} \\ 20 + 2(t-1) \text{ m/s}, & 1\,\text{s} < t \le 2\,\text{s} \\ 22 \text{ m/s}, & t > 2\,\text{s} \end{cases}$$

Weights for the four topologies ($I_2$ is the $2 \times 2$ identity):
PF: $F_i = 10 I_2$ for all $i$; $G_1 = 0$, $G_i = 5 I_2$ for $i \in \mathbb{N} \setminus \{1\}$; $Q_1 = 10 I_2$, $Q_i = 0$ for $i \in \mathbb{N} \setminus \{1\}$; $R_i = I_2$ for all $i$.
PLF: $F_i = 10 I_2$; $G_1 = 0$, $G_i = 5 I_2$ for $i \in \mathbb{N} \setminus \{1\}$; $Q_i = 10 I_2$ for all $i$; $R_i = I_2$.
TPF: $F_i = 10 I_2$; $G_1 = 0$, $G_i = 5 I_2$ for $i \in \mathbb{N} \setminus \{1\}$; $Q_1 = Q_2 = 10 I_2$, $Q_i = 0$ for $i \in \mathbb{N} \setminus \{1, 2\}$; $R_i = I_2$.
TPLF: $F_i = 10 I_2$; $G_1 = 0$, $G_i = 5 I_2$ for $i \in \mathbb{N} \setminus \{1\}$; $Q_i = 10 I_2$ for all $i$; $R_i = I_2$.

[Figure: the four unidirectional topologies (a)-(d) over the chain N, N-1, N-2, ..., 2, 1, 0 (leader LV)]
[Figure: spacing errors (m) vs. time (s) of vehicles 1-7 under (a) PF, (b) PLF, (c) TPF, (d) TPLF]

Outline: 2 Fast ADMM for Sparse SDPs
1) SDPs with Chordal Sparsity
2) ADMM for Primal and Dual Sparse SDPs
3) ADMM for the Homogeneous Self-dual Embedding
4) CDCS: Cone Decomposition Conic Solver

References:
1. Zheng, Y., Fantuzzi, G., Papachristodoulou, A., Goulart, P., & Wynn, A. (2016). Fast ADMM for semidefinite programs with chordal sparsity. arXiv preprint arXiv:1609.06068.
2. Zheng, Y., Fantuzzi, G., Papachristodoulou, A., Goulart, P., & Wynn, A. (2016). Fast ADMM for homogeneous self-dual embeddings of sparse SDPs. arXiv preprint arXiv:1611.01828.

1. SDPs with Chordal Sparsity
Standard Primal-Dual Semidefinite Programs (SDPs)

Applications: control theory, power systems, polynomial optimization, combinatorics, operations research, etc.
Control of a networked system (e.g., via Lyapunov theory);
Optimal power flow problem (e.g., by dropping a rank constraint).
Picture sources: http://scholar.princeton.edu/ghall/home
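The primal-dual pair itself appears only as an image on the slide; for reference, the standard form it refers to is
$$\min_{X \in \mathbb{S}^n} \; \langle C, X \rangle \quad \text{s.t.} \quad \langle A_i, X \rangle = b_i, \; i = 1, \dots, m, \quad X \succeq 0,$$
$$\max_{y \in \mathbb{R}^m,\, Z \in \mathbb{S}^n} \; b^T y \quad \text{s.t.} \quad Z + \sum_{i=1}^{m} y_i A_i = C, \quad Z \succeq 0.$$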

1. SDPs with Chordal Sparsity
Standard Primal-Dual Semidefinite Programs (SDPs)

Applications: control theory, fluid mechanics, polynomial optimization, combinatorics, operations research, etc.

Interior-point solvers: SeDuMi, SDPA, SDPT3 (suitable for small and medium-sized problems);
Modelling packages: YALMIP, CVX;
Large-scale cases: it is important to exploit the inherent structure of the instances (De Klerk, 2010):
Low rank;
Algebraic symmetry;
Chordal sparsity:
  Second-order methods: Fukuda et al., 2001; Nakata et al., 2003; Andersen et al., 2010;
  First-order methods: Madani et al., 2015; Sun, Andersen, and Vandenberghe, 2014.

1. SDPs with Chordal Sparsity
Sparsity Pattern of Matrices

Sparse matrices:
$$\mathbb{S}^n(E, 0) = \left\{ X \in \mathbb{S}^n \mid X_{ij} = 0 \text{ if } (i, j) \notin E \right\}, \qquad \mathbb{S}^n_+(E, 0) = \left\{ X \in \mathbb{S}^n(E, 0) \mid X \succeq 0 \right\}$$

Partial matrices:
$\mathbb{S}^n(E, ?)$ = the set of $n \times n$ partial symmetric matrices with elements defined on $E$;
$$\mathbb{S}^n_+(E, ?) = \left\{ X \in \mathbb{S}^n(E, ?) \mid \exists\, M \succeq 0 \text{ with } M_{ij} = X_{ij}, \; (i, j) \in E \right\}$$

$\mathbb{S}^n_+(E, ?)$ and $\mathbb{S}^n_+(E, 0)$ are dual cones of each other.

1. SDPs with Chordal Sparsity
Chordal Graph

A graph $\mathcal{G}$ is chordal if every cycle of length at least four has a chord.
Any non-chordal graph can be chordal-extended (chordal extension).
A chordal graph can be decomposed into its maximal cliques $\mathcal{C} = \{C_1, C_2, \dots, C_p\}$; maximal cliques are complete subgraphs that are not contained in any larger complete subgraph.
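One standard way to compute a chordal extension in practice (not necessarily the procedure used in the talk) is a symbolic Cholesky factorization after a fill-reducing reordering; a minimal MATLAB sketch on an assumed 5-node cycle pattern:

```matlab
% Sketch: chordal extension of a sparsity pattern via symbolic Cholesky.
% The 5x5 pattern below (a 5-cycle) is an assumed example, not the slide's graph.
S = sparse([1 1 0 0 1;
            1 1 1 0 0;
            0 1 1 1 0;
            0 0 1 1 1;
            1 0 0 1 1]);
p = symamd(S);                    % fill-reducing (approximate minimum degree) ordering
R = chol(S(p,p) + 5*speye(5));    % the shift makes the matrix PD; only the pattern matters
E_chordal = spones(R + R');       % filled pattern = a chordal extension of S(p,p)
full(E_chordal)                   % maximal cliques of this pattern give the decomposition
```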

1. SDPs with Chordal Sparsity
Clique Decomposition

Given a chordal graph $\mathcal{G} = (\mathcal{V}, \mathcal{E})$ with a set of maximal cliques $\{C_1, C_2, \dots, C_p\}$:

Grone's Theorem: $X \in \mathbb{S}^n_+(E, ?)$ (i.e., matrix $X$ is PSD completable) if and only if
$$X_{C_k} \succeq 0, \quad k = 1, \dots, p,$$
where $X_{C_k}$ is the principal submatrix of $X$ indexed by the clique $C_k$.

1. SDPs with Chordal Sparsity
Clique Decomposition

Given a chordal graph $\mathcal{G} = (\mathcal{V}, \mathcal{E})$ with a set of maximal cliques $\{C_1, C_2, \dots, C_p\}$:

Agler's Theorem: $X \in \mathbb{S}^n_+(E, 0)$ if and only if there exist $M_k \in \mathbb{S}^n_+(C_k, 0)$ such that
$$X = \sum_{k=1}^{p} M_k.$$

Sparse cone decomposition (chordal case): $\mathbb{S}^n_+(E, ?)$ is decomposed by Grone's theorem and $\mathbb{S}^n_+(E, 0)$ by Agler's theorem; the two decompositions are dual to each other.

Topics in this talk:
ADMM for primal and dual SDPs;
ADMM for the homogeneous self-dual embedding;
CDCS: Cone Decomposition Conic Solver.

1. SDPs with Chordal Sparsity
ADMM algorithm

For a problem of the form $\min_{x, y} f(x) + g(y)$ subject to $Ax + By = c$, the augmented Lagrangian is
$$L_\rho(x, y, z) = f(x) + g(y) + z^T (Ax + By - c) + \frac{\rho}{2} \| Ax + By - c \|^2.$$
Iterations of ADMM:
a) an x-minimization step: $x^{k+1} = \arg\min_x L_\rho(x, y^k, z^k)$;
b) a y-minimization step: $y^{k+1} = \arg\min_y L_\rho(x^{k+1}, y, z^k)$;
c) a dual variable update: $z^{k+1} = z^k + \rho\, (Ax^{k+1} + By^{k+1} - c)$.

Boyd, S., Parikh, N., Chu, E., Peleato, B., & Eckstein, J. (2011). Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1), 1-122.
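To make the three steps concrete, here is a self-contained toy instance (an assumed example, unrelated to the SDPs in the talk): ADMM applied to projecting a point onto the non-negative orthant, with $f(x) = \tfrac{1}{2}\|x - a\|^2$, $g(z)$ the indicator of $z \ge 0$, and the constraint $x = z$, written in scaled form.

```matlab
% Toy ADMM (scaled form) for  min (1/2)||x - a||^2  s.t.  x >= 0.
a = [-1; 2; -3];  rho = 1;               % assumed data and penalty parameter
x = zeros(3,1);  z = x;  u = x;          % u is the scaled dual variable
for iter = 1:50
    x = (a + rho*(z - u)) / (1 + rho);   % x-minimization (quadratic)
    z = max(x + u, 0);                   % z-minimization (projection onto the cone)
    u = u + x - z;                       % dual variable update
end
disp([x, max(a,0)])                      % ADMM iterate vs. exact solution
```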

2. ADMM for Primal and Dual Sparse SDPs
Aggregate sparsity pattern of matrices

The aggregate sparsity pattern is the union of the patterns of $C, A_1, A_2, \dots$ in the primal and dual problems.

[Figure: patterns of the primal and dual solutions, and the corresponding cone replacement]

2. ADMM for Primal and Dual Sparse SDPs
Cone Decomposition of Sparse SDPs

The aggregate sparsity pattern $E$ is the union of the patterns of $C, A_1, \dots, A_m$.

Cone decomposition: the big sparse PSD cone is equivalently replaced by a set of coupled small PSD cones;
Our idea: introduce additional variables to decouple the coupling constraints.
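Concretely, using Grone's theorem and entry-selector matrices $E_{C_k}$ (which pick out the rows and columns indexed by clique $C_k$), the primal decomposition can be written as the following equivalent program; this is a reconstruction of the general form, and the exact notation may differ from the slide:
$$\min_{X, X_1, \dots, X_p} \; \langle C, X \rangle \quad \text{s.t.} \quad \langle A_i, X \rangle = b_i, \; i = 1, \dots, m, \qquad X_k = E_{C_k} X E_{C_k}^T, \quad X_k \in \mathbb{S}^{|C_k|}_+, \; k = 1, \dots, p,$$
so the single large constraint $X \in \mathbb{S}^n_+(E, ?)$ is replaced by $p$ small PSD constraints coupled through the consensus equalities $X_k = E_{C_k} X E_{C_k}^T$; the additional variables $X_k$ are exactly the decoupling variables mentioned above.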

2. ADMM for Primal and Dual Sparse SDPs
ADMM for primal SDPs (consensus form)

Steps: reformulate using indicator functions; form the augmented Lagrangian; regroup the variables into blocks.

2. ADMM for Primal and Dual Sparse SDPs
ADMM for primal SDPs (consensus form)

1) Minimization over block X: a QP with linear constraints (a projection onto a linear subspace);
2) Minimization over block Y: projections onto small PSD cones, which can be computed in parallel;
3) Update of the multipliers.
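The conic step in 2) amounts to an eigenvalue-based projection of each small block onto the PSD cone, which is what makes the parallel implementation possible; a minimal MATLAB sketch on an assumed random block:

```matlab
% Projection of one (small) symmetric block onto the PSD cone.
% In the decomposed SDP this is applied to every clique block independently.
Xb = randn(4);  Xb = (Xb + Xb')/2;   % assumed test block
[V, D] = eig(Xb);
Xproj = V * max(D, 0) * V';          % clip negative eigenvalues
fprintf('min eig after projection: %.2e\n', min(eig(Xproj)));
```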

2. ADMM for Primal and Dual Sparse SDPs
ADMM for dual SDPs (consensus form)

Reformulate using indicator functions and form the augmented Lagrangian; the iteration again consists of a QP with linear constraints, projections computed in parallel, and a multiplier update.
The ADMM steps in the dual form are scaled versions of those in the primal form!

2. ADMM for Primal and Dual Sparse SDPs
The Big Picture

The duality between the primal and dual SDPs is inherited by the decomposed problems, by virtue of the duality between Grone's and Agler's theorems.

3. ADMM for the Homogeneous Self-dual Embedding
KKT conditions

For the (decomposed) primal-dual pair, the KKT conditions consist of primal feasibility, dual feasibility, and a zero duality gap.

3. ADMM for the Homogeneous Self-dual Embedding
The Homogeneous Self-dual Embedding

$\tau$, $\kappa$: two non-negative and complementary variables; the embedding turns the KKT system into a feasibility problem.
Note that the big sparse PSD cone has already been equivalently replaced by a set of coupled small PSD cones.
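The embedding itself is shown only graphically on the slide; in the notation of SCS (O'Donoghue et al., 2016), on which the next slide builds, it is the feasibility problem of finding $(u, v)$ with
$$Q u = v, \qquad u = \begin{bmatrix} x \\ y \\ \tau \end{bmatrix}, \quad v = \begin{bmatrix} r \\ s \\ \kappa \end{bmatrix}, \qquad Q = \begin{bmatrix} 0 & A^T & c \\ -A & 0 & b \\ -c^T & -b^T & 0 \end{bmatrix},$$
with $(u, v)$ restricted to a cone and its dual. If a solution has $\tau > 0$ it yields a primal-dual optimal pair via $(x/\tau, y/\tau)$, while $\kappa > 0$ certifies primal or dual infeasibility.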

3. ADMM for the Homogeneous Self-dual Embedding
ADMM algorithm

ADMM steps (similar to the solver SCS [1]):
1) projection onto a linear subspace;
2) projection onto cones (of smaller dimension).

The matrix $Q$ is highly structured and sparse, so block elimination can be applied to speed up the subspace projection greatly. The per-iteration cost is then the same as applying a splitting method to the primal or dual problem alone.

[1] O'Donoghue, B., Chu, E., Parikh, N., & Boyd, S. (2016). Conic optimization via operator splitting and homogeneous self-dual embedding. Journal of Optimization Theory and Applications, 169(3), 1042-1068.

4. CDCS: Cone Decomposition Conic Solver

CDCS: an open-source MATLAB solver for partially decomposable conic programs.
CDCS supports constraints on the following cones: free variables, the non-negative orthant, second-order cones, and the positive semidefinite cone.
Input-output format is in accordance with SeDuMi; works with the latest YALMIP release.
Syntax: [x,y,z,info] = cdcs(At,b,c,K,opts);
Download from https://github.com/oxfordcontrol/cdcs
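A hedged usage sketch of the syntax above on a tiny random SDP in SeDuMi format; the problem data and the choice to pass an empty options structure are assumptions, so consult the CDCS README for the actual option names.

```matlab
% Assumed example: min <C,X> s.t. trace(X) = 1, X PSD, in SeDuMi format.
n   = 10;
K.s = n;                              % one PSD cone of size n (SeDuMi convention)
C   = sprandsym(n, 0.2);              % random sparse cost matrix
At  = reshape(speye(n), n*n, 1);      % single constraint: vec(I)'*vec(X) = trace(X)
b   = 1;
c   = C(:);
opts = struct();                      % assumed: an empty struct selects default options
[x, y, z, info] = cdcs(At, b, c, K, opts);
```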

4. CDCS: Cone Decomposition Conic Solver
Random SDPs with block-arrow pattern

Parameters: block size d, number of blocks l, arrow-head size h, number of constraints m.

Numerical comparison against SeDuMi, SCS, and SparseCoLO (as a preprocessor) + SeDuMi, with tolerance $\epsilon_{tol} = 10^{-3}$ for CDCS and SCS.

[Table: numerical results]

4. CDCS: Cone Decomposition Conic Solver
Benchmark problems in SDPLIB [2]

Three sets of benchmark problems in SDPLIB (Borchers, 1999):
1) Four small and medium-sized SDPs (theta1, theta2, qap5 and qap9);
2) Four large-scale sparse SDPs (maxG11, maxG32, qpG11 and qpG51);
3) Two infeasible SDPs (infp1 and infd1).

[2] Borchers, B. (1999). SDPLIB 1.2, a library of semidefinite programming test problems. Optimization Methods and Software, 11(1-4), 683-690.

4. CDCS: Cone Decomposition Conic Solver
Result: small and medium-sized instances

4. CDCS: Cone Decomposition Conic Solver
Result: large-scale sparse instances

maxG32: original cone size 2000; after chordal decomposition, the maximal cone size is 60.
qpG11: original cone size 1600; after chordal decomposition, the maximal cone size is 24.

4. CDCS: Cone Decomposition Conic Solver
Result: infeasible instances

4. CDCS: Cone Decomposition Conic Solver
Result: CPU time per iteration

[Figure: CPU time per iteration for small and medium-sized instances vs. large-scale sparse instances]

CDCS works with smaller semidefinite cones for large-scale sparse problems.
Note: our code is currently written in MATLAB, while SCS is implemented in C.

5. Conclusion

Summary:
Introduced a conversion framework for sparse SDPs that is suitable for first-order methods;
Developed efficient ADMM algorithms for the primal and dual standard forms and for the homogeneous self-dual embedding;
CDCS: download from https://github.com/oxfordcontrol/cdcs

Ongoing work:
Develop ADMM algorithms for sparse SDPs arising in SOS programs;
Applications in networked systems and power systems.

Thank you for your attention! Q & A