SDP and eigenvalue bounds for the graph partition problem


SDP and eigenvalue bounds for the graph partition problem
Renata Sotirov and Edwin van Dam, Tilburg University, The Netherlands

Outline: the graph partition problem; matrix lifting; vector lifting; simplify; complicate; ???

The Graph Partition Problem
G = (V, E) ... an undirected graph; V ... vertex set, |V| = n; E ... edge set.
The k-partition problem (GPP): find a partition of V into k subsets S_1, ..., S_k of given sizes m_1, ..., m_k, such that the total weight of the edges joining different subsets S_i is minimized.
m_i = n/k for all i ... the graph equipartition problem; k = 2 ... the bisection problem.

The k-partition problem
A ... the adjacency matrix of G, m := (m_1, ..., m_k)^T.
Let X = (x_ij) ∈ R^{n×k} with x_ij := 1 if vertex i ∈ S_j and x_ij := 0 if vertex i ∉ S_j.
P_k := { X ∈ R^{n×k} : X u_k = u_n, X^T u_n = m, x_ij ∈ {0,1} }, where u_n ... the vector of all ones.
For X ∈ P_k:  w(E_cut) = (1/2) tr(X^T L X) = (1/2) tr(A (J_n − X X^T)), where L := Diag(A u_n) − A is the Laplacian matrix of G.
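
A minimal numerical sketch (my illustration, not from the slides) of the identity w(E_cut) = (1/2) tr(X^T L X) on a made-up example graph and partition:

```python
import numpy as np

# adjacency matrix of a 5-cycle (hypothetical example graph)
n, k = 5, 2
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

L = np.diag(A @ np.ones(n)) - A          # Laplacian L = Diag(A u_n) - A

# a partition matrix X in P_k with part sizes m = (3, 2)
X = np.zeros((n, k))
X[[0, 1, 2], 0] = 1
X[[3, 4], 1] = 1

cut_via_trace = 0.5 * np.trace(X.T @ L @ X)
cut_direct = sum(A[i, j] for i in range(n) for j in range(n)
                 if i < j and X[i].argmax() != X[j].argmax())
print(cut_via_trace, cut_direct)          # both equal 2 for the 5-cycle
```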

The Graph Partition Problem
The trace formulation:
(GPP)  min (1/2) trace(X^T L X)  s.t.  X u_k = u_n,  X^T u_n = m,  x_ij ∈ {0,1}.
The GPP is NP-hard (Garey and Johnson, 1976).
Applications: VLSI design, parallel computing, floor planning, telecommunications, etc.

... matrix lifting SDP for the GPP ...

SDP for GPP
Linearize the objective: trace(L X X^T) → trace(L Y), with
Y ∈ conv{ Ỹ : Ỹ = X X^T for some X ∈ P_k }, which implies kY − J_n ⪰ 0.
(S., 2013)
(GPP_m)  min (1/2) tr(L Y)
         s.t.  diag(Y) = u_n
               tr(J Y) = Σ_{i=1}^k m_i^2
               kY − J_n ⪰ 0,  Y ≥ 0.
For k = 2 the nonnegativity constraints are redundant.
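
A sketch of the matrix-lifting relaxation GPP_m in CVXPY (the function name and the use of CVXPY are my illustration, not from the slides; an SDP-capable solver must be installed):

```python
import cvxpy as cp
import numpy as np

def gpp_m_bound(L, m):
    """Lower bound from GPP_m given the Laplacian L and part sizes m."""
    n = L.shape[0]
    k = len(m)
    J = np.ones((n, n))
    Y = cp.Variable((n, n), symmetric=True)
    constraints = [
        cp.diag(Y) == 1,                             # diag(Y) = u_n
        cp.trace(J @ Y) == sum(mi**2 for mi in m),   # tr(JY) = sum_i m_i^2
        k * Y - J >> 0,                              # kY - J_n is PSD
        Y >= 0,                                      # elementwise nonnegativity
    ]
    prob = cp.Problem(cp.Minimize(0.5 * cp.trace(L @ Y)), constraints)
    prob.solve()
    return prob.value
```

Per the remark above, for k = 2 the constraint `Y >= 0` could be dropped.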

GPP_m and known relaxations
For the equipartition, GPP_m is equivalent to the relaxation from:
S.E. Karisch, F. Rendl. Semidefinite programming and graph equipartition. In: Topics in Semidefinite and Interior-Point Methods, Fields Institute Communications, vol. 18, AMS, Providence, RI, 1998.
For the bisection, it is equivalent to the relaxation from:
S.E. Karisch, F. Rendl, J. Clausen. Solving graph bisection problems with semidefinite programming. INFORMS J. Comput., 12:177-191, 2000.

Strengthening
How to strengthen GPP_m? Impose the following linear inequalities:
triangle constraints:  y_ab + y_ac ≤ 1 + y_bc  for all triples (a, b, c);
independent set constraints:  Σ_{a<b, a,b ∈ W} y_ab ≥ 1  for all W with |W| = k + 1.
There are 3·C(n,3) triangle constraints and C(n,k+1) independent set constraints.
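
A hypothetical helper extending the CVXPY sketch above (illustrative only): it generates the 3·C(n,3) triangle constraints, which can be appended to the constraint list of `gpp_m_bound`.

```python
from itertools import permutations

def triangle_constraints(Y, n):
    """Triangle inequalities y_ab + y_ac <= 1 + y_bc for a CVXPY matrix variable Y."""
    return [Y[a, b] + Y[a, c] <= 1 + Y[b, c]
            for a, b, c in permutations(range(n), 3) if b < c]
```

For n around 100 this already adds roughly half a million constraints, which is the computational issue discussed next.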

On computational issues
In general, for graphs with 100 vertices and k = 3:
the best known vector-lifting relaxation is hopeless;
GPP_m with triangle and independent set inequalities solves in about 3 hours;
GPP_m alone solves in about 14 minutes.
Can we compute GPP_m, with or without the additional constraints, more efficiently? Yes.

Simplification: highly symmetric graphs
Matrix *-algebra: a subspace of R^{n×n} that is closed under matrix multiplication and under taking transposes.
Assumption: the data matrices of the SDP problem, together with I, belong to a matrix *-algebra span{A_1, ..., A_r} with r ≪ n^2.
Then, if the SDP relaxation has an optimal solution, it has an optimal solution in the matrix *-algebra. (Schrijver, Goemans, Rendl, Parrilo, ...)
A basis of the matrix *-algebra (coming from combinatorial or group symmetry) satisfies:
(i) A_i ∈ {0,1}^{n×n} and A_i^T ∈ {A_1, ..., A_r} for i = 1, ..., r;
(ii) Σ_{i=1}^r A_i = J and Σ_{i∈I} A_i = I for some I ⊆ {1, ..., r};
(iii) for all i, j ∈ {1, ..., r} there exist p_ij^h such that A_i A_j = Σ_{h=1}^r p_ij^h A_h.

Simplification: highly symmetric graphs
Set Y = Σ_{i=1}^r z_i A_i with z_i ∈ R (r ≪ n^2). Then:
(GPP_m)  min (1/2) tr(A J_n) − (1/2) Σ_{i=1}^r z_i tr(A A_i)
         s.t.  Σ_{i∈I} z_i diag(A_i) = u_n
               Σ_{i=1}^r z_i tr(J A_i) = Σ_{i=1}^k m_i^2
               k Σ_{i=1}^r z_i A_i − J_n ⪰ 0,  z_i ≥ 0, i = 1, ..., r.
The LMI may be (block-)diagonalized.
Exploit the properties of the A_i to aggregate the triangle and independent set constraints, extending the approach from:
M.X. Goemans, F. Rendl. Semidefinite programs and association schemes. Computing, 63(4):331-340, 1999.

On aggregating constraints
For a given triple (a, b, c), consider the inequality y_ab + y_ac ≤ 1 + y_bc.
If (A_i)_ab = 1, (A_h)_ac = 1, (A_j)_bc = 1, call it an inequality of type (i, j, h).
Summing all inequalities of type (i, j, h) gives the aggregated inequality
p^i_{h j'} tr(A_i Y) + p^h_{i j} tr(A_h Y) ≤ p^i_{h j'} tr(A_i J) + p^j_{i' h} tr(A_j Y),
where the p^h_{ij} are defined by A_i A_j = Σ_{h=1}^r p^h_{ij} A_h, and j' (resp. i') denotes the index such that A_{j'} = A_j^T (resp. A_{i'} = A_i^T).
Use Y = Σ_{j=1}^r z_j A_j.
The number of aggregated constraints is bounded by r^3.
A similar approach applies to the independent set constraints when k = 2.

Simplification: highly symmetric graphs
Example: strongly regular graph (SRG)
n vertices, κ the valency of the graph.
A has exactly two eigenvalues r ≥ 0 and s < 0 associated with eigenvectors orthogonal to u_n.
A belongs to the *-algebra spanned by {I, A, J − A − I}.
Set Y = I + z_1 A + z_2 (J − A − I). Then:
(GPP_m)  min (1/2) κ n (1 − z_1)
         s.t.  κ z_1 + (n − κ − 1) z_2 = (1/n) Σ_{i=1}^k m_i^2 − 1
               1 + r z_1 − (r + 1) z_2 ≥ 0
               1 + s z_1 − (s + 1) z_2 ≥ 0
               z_1, z_2 ≥ 0.
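
A sketch (my illustration, not from the slides) of this symmetry-reduced GPP_m: after the substitution the SDP collapses to a linear program in two variables, solved here with SciPy. The SRG data for the Higman-Sims graph and the partition into 20 parts of size 5 are used as a test case; the value reproduces the 950 that appears in the tables below and in the theorem on the next slide.

```python
import numpy as np
from scipy.optimize import linprog

def gpp_m_srg(n, kappa, r, s, m):
    """GPP_m lower bound on the minimum k-partition of an SRG with spectrum
    {kappa, r, s} and part sizes m (assumed to sum to n)."""
    rhs_eq = sum(mi**2 for mi in m) / n - 1
    res = linprog(
        c=[-0.5 * kappa * n, 0.0],             # minimize (kappa*n/2)*(1 - z1)
        A_ub=[[-r, r + 1], [-s, s + 1]],       # 1 + r*z1 - (r+1)*z2 >= 0, same with s
        b_ub=[1.0, 1.0],
        A_eq=[[kappa, n - kappa - 1]],         # kappa*z1 + (n-kappa-1)*z2 = sum(m_i^2)/n - 1
        b_eq=[rhs_eq],
        bounds=[(0, None), (0, None)],         # z1, z2 >= 0
    )
    return 0.5 * kappa * n + res.fun           # add back the constant kappa*n/2

# Higman-Sims graph: SRG with n=100, kappa=22, r=2, s=-8; 20 parts of size 5
print(gpp_m_srg(100, 22, 2, -8, [5] * 20))     # gives 950, cf. the table below
```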

SRG
Theorem. Let G = (V, E) be an SRG with eigenvalues κ, r, s, and let m_i ∈ N, i = 1, ..., k, with Σ_{j=1}^k m_j = n. Then the SDP bound GPP_m for the minimum k-partition equals
max{ (κ − r)/n · Σ_{i<j} m_i m_j ,  (1/2) ( n(κ + 1) − Σ_i m_i^2 ) }.
Similarly, the SDP bound for the maximum k-partition equals
min{ (κ − s)/n · Σ_{i<j} m_i m_j ,  (1/2) κ n }.
This is an extension of the result for the equipartition:
De Klerk, Pasechnik, S., Dobre. On SDP relaxations of maximum k-section. Math. Program. Ser. B, 136(2):253-278, 2012.

SRG
After aggregating the 3·C(n,3) triangle constraints, only the following remain:
z_1 ≤ 1,   z_2 ≤ 1,   2z_1 − z_2 ≤ 1,   −z_1 + 2z_2 ≤ 1.
Proposition. For an SRG with n > 5 these inequalities are redundant in GPP_m.
However, the independent set constraints do improve GPP_m.

Simplification: not a special graph
A closed-form expression for the GPP bound for any graph.
L = Diag(A u_n) − A ... the Laplacian matrix of G.
The Laplacian algebra corresponding to L is span{F_0, ..., F_d}, where
F_i = U_i U_i^T, i = 0, ..., d, and the columns of U_i are the eigenvectors for the distinct eigenvalue λ_i;
Σ_{i=0}^d F_i = I;   F_i F_j = δ_ij F_i for all i, j;   tr(F_i) = f_i ... the multiplicity of the i-th eigenvalue of L.

Simplification: not a special graph
In GPP_m: relax diag(Y) = u_n to tr(Y) = n, and remove the nonnegativity constraints.
(GPP_eig)  min (1/2) tr(L Y)
           s.t.  tr(Y) = n
                 tr(J Y) = Σ_{i=1}^k m_i^2
                 kY − J_n ⪰ 0.
Set Y = Σ_{i=0}^d z_i F_i with z_i ∈ R (i = 0, ..., d). Then
tr(L Y) = tr( (Σ_{j=0}^d λ_j F_j)(Σ_{i=0}^d z_i F_i) ) = Σ_{i=0}^d λ_i f_i z_i,
where 0 = λ_0 < λ_1 < ... < λ_d are the distinct eigenvalues of L, etc.
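
A sketch of how the reduction finishes (my completion of the "etc.", assuming G is connected so that f_0 = 1, F_0 = J_n/n and tr(J F_i) = 0 for i ≥ 1): substituting Y = Σ_i z_i F_i turns GPP_eig into the small linear program

min  (1/2) Σ_{i≥1} λ_i f_i z_i
s.t. Σ_{i≥0} f_i z_i = n,   n z_0 = Σ_{i=1}^k m_i^2,   z_i ≥ 0 (i ≥ 1)

(the remaining condition k z_0 ≥ n from kY − J_n ⪰ 0 holds automatically, since Σ_i m_i^2 ≥ n^2/k). The minimum is attained by placing all of the remaining weight n − (1/n) Σ_i m_i^2 on the smallest positive eigenvalue λ_1, which gives (λ_1/n) Σ_{i<j} m_i m_j, the bound in the theorem on the next slide.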

Simplification: not a special graph
Theorem. Let G = (V, E) be a graph and m^T = (m_1, ..., m_k) with Σ_{j=1}^k m_j = n. Then the GPP_eig bound for the minimum k-partition of G equals
(λ_1 / n) Σ_{i<j} m_i m_j,
and the GPP_eig bound for the maximum k-partition of G equals
(λ_d / n) Σ_{i<j} m_i m_j.
For the bisection these results coincide with:
M. Juvan, B. Mohar. Optimal linear labelings and eigenvalues of graphs. Discrete Appl. Math., 36:153-168, 1992.
For the minimum 3-partition, with:
J. Falkner, F. Rendl, H. Wolkowicz. A computational study of graph partitioning. Math. Program., 66:211-239, 1994.
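
A small numerical sketch of these closed-form bounds (my illustration; it assumes G is connected, so that the second-smallest Laplacian eigenvalue is λ_1):

```python
import numpy as np

def gpp_eig_bounds(A, m):
    """Eigenvalue bounds (min k-partition, max k-partition) for adjacency A and part sizes m."""
    n = A.shape[0]
    L = np.diag(A @ np.ones(n)) - A                  # Laplacian
    eig = np.sort(np.linalg.eigvalsh(L))             # 0 = lambda_0 <= lambda_1 <= ...
    pairs = (sum(m)**2 - sum(mi**2 for mi in m)) / 2  # sum_{i<j} m_i m_j
    return eig[1] * pairs / n, eig[-1] * pairs / n
```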

... computational results ...

Quality of the presented bounds

G             n     partition     GPP_eig   GPP_m
Doob          64    8             112       160
design        90    9             360       360
grid graph    100   (50,25,25)    4         6
Higman-Sims   100   20            950       950
Table: Lower bounds for the minimum graph partition.

G             n     m          GPP_m   GPP_m (triangle)   GPP_m^ind
J(7,2)        21    (11,10)    37      37                 40
Foster        90    (45,45)    13      18                 14
Biggs-Smith   102   (70,32)    10      15                 10
Table: Lower bounds for the minimum bisection. Each bound was computed in a few seconds.

... vector lifting for the GPP ...

Vector lifting for GPP
Let m = (m_1, ..., m_k)^T with Σ_i m_i = n, and
X ∈ P_k := { X ∈ R^{n×k} : X u_k = u_n, X^T u_n = m, x_ij ∈ {0,1} }.
Define y := vec(X) and Y := y y^T, and relax to Y − y y^T ⪰ 0:
(GPP_v)  min (1/2) tr( ((J_k − I_k) ⊗ A) Y )
         s.t.  tr( ((J_k − I_k) ⊗ I_n) Y ) = 0
               tr( (I_k ⊗ J_n) Y ) + tr(Y) = 2 y^T ((m + u_k) ⊗ u_n) − ( Σ_{i=1}^k m_i^2 + n )
               ( 1  y^T ; y  Y ) ∈ S_+^{nk+1},   Y ≥ 0.
H. Wolkowicz, Q. Zhao. Semidefinite programming relaxations for the graph partitioning problem. Discrete Appl. Math., 96-97:461-479, 1999.
The original Zhao-Wolkowicz relaxation does not include Y ≥ 0.

Vector lifting for GPP
Theorem (S., 2012). When restricted to the equipartition, GPP_v and GPP_m are equivalent.
Theorem (S., 2013). When restricted to the bisection, GPP_v dominates GPP_m.
Numerical experiments show that the gap between GPP_v and GPP_m decreases for k > 5.

... How to strengthen GPP_v? We demonstrate it for the bisection problem.

New bound for the bisection
Assign a pair of vertices of G to different parts of the partition.
Which pair of vertices? Consider the action of aut(A) on a pair of vertices (i, j), i.e., the orbital {(P e_i, P e_j) : P ∈ aut(A)}.
Orbitals represent the different kinds of pairs of vertices.
Assume there are t such orbitals O_h (h = 1, 2, ..., t). We prove the following.

New bound for the bisection
Theorem. Let G be an undirected graph with adjacency matrix A and t orbitals O_h (h = 1, 2, ..., t) of edges and nonedges, and let (r_h1, r_h2) be an arbitrary pair of vertices in O_h (h = 1, 2, ..., t). Then
min_{Z ∈ P_2} tr( Z^T A Z (J_2 − I_2) ) = min_{h=1,2,...,t}  min_{X ∈ P_2(h)} tr( X^T A X (J_2 − I_2) ),
where P_2(h) = { X ∈ P_2 : X_{r_h1,1} = 1, X_{r_h2,2} = 1 } (h = 1, 2, ..., t).
For each h, compute µ_h := {GPP_v with the two additional constraints}.
The new lower bound for the bisection problem is GPP_fix := min_{h=1,...,t} µ_h.

... computational results ...

Comparison of bounds
In general it is difficult to solve GPP_fix, but for graphs with symmetry...

G             n     m^T        GPP_m   GPP_v   GPP_m^ind   GPP_fix
J(6,2)        15    (8,7)      23      23      26          24
Gewirtz       56    (53,3)     23      24      23          26
M_22          77    (74,3)     41      42      41          44
Higman-Sims   100   25-part.   960     960     960         964
Table: Lower bounds for the minimum GPP. Each bound was computed with an interior-point method in less than 30 s.

Example: the bandwidth problem
The bandwidth problem in graphs: label the vertices v_i of G with distinct integers φ(v_i) such that max_{(v_i,v_j) ∈ E} |φ(v_i) − φ(v_j)| is minimal.
[Figure: an example graph on 8 vertices with two different labelings and the corresponding 0/1 adjacency matrices.]
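
A tiny sketch (my illustration, not from the slides) of evaluating the bandwidth of a given labeling, i.e., the largest label difference over the edges of G:

```python
import numpy as np

def bandwidth_of_labeling(A, phi):
    """A: 0/1 adjacency matrix, phi: sequence of distinct integer labels."""
    n = A.shape[0]
    return max(abs(phi[i] - phi[j])
               for i in range(n) for j in range(i + 1, n) if A[i, j])
```

The bandwidth of G is the minimum of this quantity over all labelings φ.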

The bandwidth problem
The bandwidth problem is related to the following GPP. The min-cut problem is:
OPT_MC := min Σ_{i ∈ S_1, j ∈ S_2} a_ij
          s.t. (S_1, S_2, S_3) partitions V, |S_i| = m_i, i = 1, 2, 3,
where A = (a_ij) is the adjacency matrix of G.
Bandwidth lower bound (Povh-Rendl (2007), van Dam-S.): if for some m = (m_1, m_2, m_3) it holds that OPT_MC ≥ ν > 0, then
σ(G) ≥ m_3 − 1/2 + sqrt(2ν + 1/4).
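
A small helper (illustrative; it uses the bandwidth bound in the form reconstructed above) that turns a positive lower bound ν on the min-cut for sizes (m_1, m_2, m_3) into a lower bound on the bandwidth:

```python
import math

def bandwidth_lower_bound(m3, nu):
    """Lower bound on the bandwidth, given OPT_MC >= nu > 0 for part sizes (m1, m2, m3)."""
    return m3 - 0.5 + math.sqrt(2.0 * nu + 0.25)
```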

The bandwidth problem - SDP relaxation
SDP relaxations for the min-cut: solve GPP_v and GPP_fix with the objective (1/2) trace( (D ⊗ A) Y ), where
D = [ 0 1 0 ; 1 0 0 ; 0 0 0 ].

Bandwidth of Hamming graphs
The Hamming graph H(d, q) is the graph Cartesian product of d copies of the complete graph K_q.

q   nodes   old   bw_v   time(s)   bw_fix   time(s)   u.b.
3   27      9     10     0         12       44        13
4   64      22    22     3         25       176       31
5   125     42    43     15        47       536       60
6   216     72    74     76        78       1756      101
Table: Bounds on the bandwidth of H(3, q).
bw_v and bw_fix are obtained by using the bound m_3 − 1/2 + sqrt(2α + 1/4); the upper bounds are obtained by an improved reverse Cuthill-McKee algorithm.

More on bounds
We also compute the best known lower/upper bounds for: H(4, q); the 3-dimensional generalized Hamming graphs H_{q1,q2,q3}; and the Johnson and Kneser graphs.

Thank you!