SOS TENSOR DECOMPOSITION: THEORY AND APPLICATIONS


SOS TENSOR DECOMPOSITION: THEORY AND APPLICATIONS

HAIBIN CHEN, GUOYIN LI, AND LIQUN QI

Abstract. In this paper, we examine structured tensors which have a sum-of-squares (SOS) tensor decomposition, and study the SOS-rank of SOS tensor decomposition. We first show that several classes of even order symmetric structured tensors available in the literature have an SOS tensor decomposition. These include positive Cauchy tensors, weakly diagonally dominated tensors, B_0-tensors, double B-tensors, quasi-double B_0-tensors, MB_0-tensors, H-tensors, absolute tensors of positive semi-definite Z-tensors, and extended Z-tensors. We also examine the SOS-rank of SOS tensor decomposition and the SOS-width of SOS tensor cones. The SOS-rank provides the minimal number of squares in an SOS tensor decomposition and, for a given SOS tensor cone, its SOS-width is the maximum possible SOS-rank over all the tensors in this cone. We first deduce an upper bound on the SOS-rank for general tensors that have an SOS decomposition, and on the SOS-width for the general SOS tensor cone, using known results from polynomial theory. Then, we provide an explicit sharper estimate for the SOS-rank of SOS tensor decompositions with bounded exponent and identify the SOS-width of the tensor cone consisting of all tensors with bounded exponent that have SOS decompositions. Finally, as applications, we show how SOS tensor decomposition can be used to compute the minimum H-eigenvalue of an even order symmetric extended Z-tensor and to test the positive definiteness of an associated multivariate form. Numerical examples ranging from small size to large size are provided to show the efficiency of the proposed numerical methods.

Key words. structured tensor, SOS tensor decomposition, positive semi-definite tensor, SOS-rank, H-eigenvalue.

AMS subject classifications. 90C30, 15A06.

1.
Introduction. Tensor decomposition is an important research area and has found numerous applications in data mining [18, 19, 20], computational neuroscience [5, 11], and statistical learning for latent variable models [1]. An important class of tensor decompositions is the sum-of-squares (SOS) tensor decomposition. It is known that determining whether a given even order symmetric tensor is positive semi-definite is, in general, an NP-hard problem. On the other hand, an interesting feature of SOS tensor decomposition is that checking whether a given even order symmetric tensor has an SOS decomposition can be done by solving a semi-definite programming problem (see, for example, [14]), and hence can be validated efficiently. SOS tensor decomposition has a close connection with SOS polynomials, which are very important in polynomial theory [3, 4, 12, 13, 35, 40] and polynomial optimization [16, 21, 22, 23, 34, 41]. It is known that an even order symmetric tensor having an SOS decomposition is positive semi-definite, but the converse is not true in general. Recently, a few classes of structured tensors, such as B-tensors [37] and diagonally dominated tensors [36], have been shown to be positive semi-definite in the even order symmetric case. This raises a natural and interesting question: Will these structured tensors admit an SOS decomposition? Providing an

Department of Applied Mathematics, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong. Email: chenhaibin508@163.com. This author's work was supported by the Natural Science Foundation of China (11171180).

Department of Applied Mathematics, University of New South Wales, Sydney 2052, Australia. E-mail: g.li@unsw.edu.au (G. Li). This author's work was partially supported by the Australian Research Council.

Department of Applied Mathematics, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong. Email: maqilq@polyu.edu.hk.
This author's work was supported by the Hong Kong Research Grant Council (Grant No. PolyU 502111, 501212, 501913 and 15302114).

answer for this question is important because this will enrich the theory of SOS tensor decomposition, achieve a better understanding of these structured tensors, and lead to efficient numerical methods for solving problems involving these structured tensors. In this paper, we make the following contributions in answering the above theoretical question and in providing applications to important numerical problems involving structured tensors:

(1) We first show that several classes of symmetric structured tensors available in the literature have an SOS decomposition when the order is even. These classes include positive Cauchy tensors, weakly diagonally dominated tensors, B_0-tensors, double B-tensors, quasi-double B_0-tensors, MB_0-tensors, H-tensors, absolute tensors of positive semi-definite Z-tensors, and extended Z-tensors.

(2) Secondly, we examine the SOS-rank for tensors with SOS decomposition and the SOS-width for SOS tensor cones. The SOS-rank of a tensor A is defined to be the minimal number of squares that appear in a sum-of-squares decomposition of the associated homogeneous polynomial of A; for a given SOS tensor cone, its SOS-width is the maximum possible SOS-rank over all the tensors in this cone. We deduce an upper bound for the SOS-rank of a general SOS tensor decomposition and for the SOS-width of the general SOS tensor cone using a known result in polynomial theory [4]. We then provide a sharper explicit upper bound on the SOS-rank for tensors with bounded exponent and identify the exact SOS-width for the cone consisting of all such tensors with SOS decomposition.

(3) Finally, as applications, we show how the derived SOS tensor decomposition can be used to compute the minimum H-eigenvalue of an even order symmetric extended Z-tensor and to test the positive definiteness of an associated multivariate form.
Numerical examples ranging from small size to large size are provided to show the efficiency of the proposed numerical methods.

The rest of this paper is organized as follows. In Section 2, we recall some basic definitions and facts for tensors and polynomials. We also present some properties of the SOS tensor cone and its dual. In Section 3, we establish the SOS decomposition property for various classes of structured tensors. In Section 4, we study the SOS-rank of SOS tensor decomposition and the SOS-width of a given SOS tensor cone. In particular, we examine SOS tensor decompositions with bounded exponent and the SOS-width of the cone constituted by all such tensors with SOS decomposition, and provide sharper explicit estimates for them. In Section 5, as an application of the derived SOS decomposition of structured tensors, we show that the minimum H-eigenvalue of an even order extended Z-tensor can be computed via polynomial optimization techniques. Accordingly, this also leads to an efficient test for positive definiteness of an associated multivariate form. Numerical experiments are reported to illustrate the significance of the results. Final remarks and some questions are listed in Section 6.

Before we move on, we briefly mention the notation that will be used in the sequel. Let R^n be the n-dimensional real Euclidean space, and let N denote the set of all positive integers. Suppose m, n ∈ N are two natural numbers. Denote [n] = {1, 2, ..., n}. Vectors are denoted by bold lowercase letters, e.g. x, y, ...; matrices are denoted by capital letters, e.g. A, B, ...; and tensors are written as calligraphic capitals, e.g. \mathcal{A}, \mathcal{T}, .... The i-th unit coordinate vector in R^n is denoted by e_i. If A = (a_{i_1...i_m}), 1 ≤ i_j ≤ n, j = 1, ..., m, then |A| = (|a_{i_1...i_m}|), 1 ≤ i_j ≤ n, j = 1, ..., m.

2. Preliminaries

A real m-th order n-dimensional tensor A = (a_{i_1 i_2 ... i_m}) is a multi-array of real entries a_{i_1 i_2 ... i_m}, where i_j ∈ [n] for j ∈ [m]. If the entries a_{i_1 i_2 ... i_m} are invariant under any permutation of their indices, then the tensor A is called a symmetric tensor. In this paper, we always consider symmetric tensors defined on R^n. The identity tensor I with order m and dimension n is given by I_{i_1...i_m} = 1 if i_1 = ... = i_m and I_{i_1...i_m} = 0 otherwise.

We first fix some symbols and recall some basic facts about tensors and polynomials. Let m, n ∈ N. Consider

S_{m,n} := {A : A is an m-th order n-dimensional symmetric tensor}.

Clearly, S_{m,n} is a linear space under the addition and scalar multiplication defined as below: for any t ∈ R, A = (a_{i_1...i_m}) and B = (b_{i_1...i_m}),

A + B = (a_{i_1...i_m} + b_{i_1...i_m}) and tA = (t a_{i_1...i_m}).

For A, B ∈ S_{m,n}, we define the inner product by

⟨A, B⟩ := \sum_{i_1,...,i_m=1}^n a_{i_1...i_m} b_{i_1...i_m}.

The corresponding norm is defined by

‖A‖ = (⟨A, A⟩)^{1/2} = ( \sum_{i_1,...,i_m=1}^n (a_{i_1...i_m})^2 )^{1/2}.

For a vector x ∈ R^n, we use x_i to denote its i-th component. Moreover, for a vector x ∈ R^n, we use x^m to denote the m-th order n-dimensional symmetric rank-one tensor induced by x, i.e.,

(x^m)_{i_1 i_2 ... i_m} = x_{i_1} x_{i_2} ... x_{i_m}, i_1, ..., i_m ∈ {1, ..., n}.

We note that an m-th order n-dimensional symmetric tensor uniquely defines an m-th degree homogeneous polynomial f_A on R^n: for all x = (x_1, ..., x_n)^T ∈ R^n,

(2.1) f_A(x) = Ax^m = \sum_{i_1, i_2, ..., i_m ∈ [n]} a_{i_1 i_2 ... i_m} x_{i_1} x_{i_2} ... x_{i_m}.

Conversely, any m-th degree homogeneous polynomial function f on R^n also uniquely corresponds to a symmetric tensor. Furthermore, a tensor A is called positive semi-definite (positive definite) if f_A(x) ≥ 0 (f_A(x) > 0) for all x ∈ R^n (x ∈ R^n \ {0}).

We now recall the following definitions of eigenvalues and eigenvectors of a tensor [36].

Definition 2.1. Let C be the complex field. Let A = (a_{i_1 i_2 ... i_m}) be an order m dimension n tensor. A pair (λ, x) ∈ C × (C^n \ {0}) is called an eigenvalue-eigenvector pair of the tensor A if

Ax^{m-1} = λ x^{[m-1]},

where Ax^{m-1} and x^{[m-1]} are n-dimensional column vectors given by

(Ax^{m-1})_i = \sum_{i_2,...,i_m=1}^n a_{i i_2 ... i_m} x_{i_2} ... x_{i_m}, 1 ≤ i ≤ n, and x^{[m-1]} = (x_1^{m-1}, ..., x_n^{m-1})^T ∈ R^n.

If the eigenvalue λ and the eigenvector x are real, then λ is called an H-eigenvalue of A and x is its corresponding H-eigenvector [36]. An important fact, which will be used frequently later on, is that an even order

symmetric tensor is positive semi-definite (definite) if and only if all H-eigenvalues of the tensor are nonnegative (positive).

Suppose that m is even. If f_A(x) in (2.1) is a sum-of-squares (SOS) polynomial, then we say A has an SOS tensor decomposition (or an SOS decomposition, for simplicity). If a given tensor has an SOS decomposition, then the tensor is positive semi-definite, but not vice versa.

Next, we recall a useful lemma which provides a test for verifying whether a homogeneous polynomial is a sum-of-squares polynomial. To do this, we introduce some basic notions. For x ∈ R^n, consider a homogeneous polynomial f(x) = \sum_α f_α x^α of degree m (m is an even number), where α = (α_1, ..., α_n) ∈ (N ∪ {0})^n, x^α = x_1^{α_1} ... x_n^{α_n} and |α| := \sum_{i=1}^n α_i = m. Let f_{m,i} be the coefficient associated with x_i^m. Let e_i be the i-th unit vector and let

(2.2) Ω_f = {α = (α_1, ..., α_n) ∈ (N ∪ {0})^n : f_α ≠ 0 and α ≠ m e_i, i = 1, ..., n}.

Then, f can be decomposed as f(x) = \sum_{i=1}^n f_{m,i} x_i^m + \sum_{α ∈ Ω_f} f_α x^α. Recall that 2N denotes the set consisting of all the even numbers. Define

\hat{f}(x) = \sum_{i=1}^n f_{m,i} x_i^m - \sum_{α ∈ Δ_f} |f_α| x^α,

where

(2.3) Δ_f := {α = (α_1, ..., α_n) ∈ Ω_f : f_α < 0 or α ∉ (2N ∪ {0})^n}.

Lemma 2.2. [9, Corollary 2.8] Let f be a homogeneous polynomial of degree m, where m is an even number. If \hat{f} is a polynomial which always takes nonnegative values, then f is a sum-of-squares polynomial.

2.1. SOS tensor cone and its dual cone. In this part, we study the cone consisting of all tensors that have an SOS decomposition, and its dual cone [30]. We use SOS_{m,n} to denote the cone consisting of all order m and dimension n tensors which have an SOS decomposition. The following simple lemma from [14] gives some basic properties of SOS_{m,n}.

Lemma 2.3. (cf. [14]) Let m, n ∈ N and let m be an even number. Then, SOS_{m,n} is a closed convex cone with dimension at most I(m,n) = \binom{n+m-1}{m}.

For a closed convex cone C, we recall that the dual cone of C in S_{m,n} is denoted by C^* and defined by C^* = {A ∈ S_{m,n} : ⟨A, C⟩ ≥ 0 for all C ∈ C}. Let M = (m_{i_1, i_2, ..., i_m}) ∈ S_{m,n}. We also define the symmetric tensor sym(M ⊗ M) ∈ S_{2m,n} by

sym(M ⊗ M) x^{2m} = (M x^m)^2 = \sum_{1 ≤ i_1,...,i_m, j_1,...,j_m ≤ n} m_{i_1...i_m} m_{j_1...j_m} x_{i_1} ... x_{i_m} x_{j_1} ... x_{j_m}.

Moreover, in the case where the degree m = 2, SOS_{2,n} and its dual cone are equal, and both reduce to the cone of positive semidefinite (n × n) matrices. Therefore, to avoid triviality, we consider the duality of the SOS tensor cone SOS_{m,n} in the case where m is an even number with m ≥ 4.

Proposition 2.4. (Duality between tensor cones) Let n ∈ N and let m be an even number with m ≥ 4. Then, we have

SOS_{m,n}^* = {A ∈ S_{m,n} : ⟨A, sym(M ⊗ M)⟩ ≥ 0 for all M ∈ S_{m/2,n}}

and SOS_{m,n} ⊄ SOS_{m,n}^*.

Proof. We define SOS^h_{m,n} to be the cone consisting of all m-th order n-dimensional symmetric tensors A such that f_A(x) := ⟨A, x^m⟩ is a polynomial which can be written as

sums of squares of finitely many homogeneous polynomials. We now see that indeed SOS^h_{m,n} = SOS_{m,n}. Clearly, SOS^h_{m,n} ⊆ SOS_{m,n}. To see the reverse inclusion, let A ∈ SOS_{m,n}. Then, there exist l ∈ N and real polynomials f_1, ..., f_l with degree at most m/2 such that ⟨A, x^m⟩ = \sum_{i=1}^l f_i(x)^2. In particular, for all t ≥ 0, we have

t^m ⟨A, x^m⟩ = ⟨A, (tx)^m⟩ = \sum_{i=1}^l f_i(tx)^2.

Dividing both sides by t^m and letting t → +∞, we see that ⟨A, x^m⟩ = \sum_{i=1}^l f_{i,m/2}(x)^2, where f_{i,m/2} is the degree-(m/2) homogeneous part of f_i, i = 1, ..., l. This shows that A ∈ SOS^h_{m,n}. Thus, SOS^h_{m,n} = SOS_{m,n}. It then follows that

(SOS_{m,n})^* = (SOS^h_{m,n})^*
= {A ∈ S_{m,n} : ⟨A, C⟩ ≥ 0 for all C ∈ SOS^h_{m,n}}
= {A ∈ S_{m,n} : ⟨A, C⟩ ≥ 0 for all C = \sum_{i=1}^l sym(M_i ⊗ M_i), M_i ∈ S_{m/2,n}, i = 1, ..., l}
= {A ∈ S_{m,n} : ⟨A, sym(M ⊗ M)⟩ ≥ 0 for all M ∈ S_{m/2,n}}.

We now show that SOS_{m,n} ⊄ SOS_{m,n}^* if m ≥ 4. Let f(x) = x_1^4 + x_2^4 + (1/4) x_3^4 + 6 x_1^2 x_2^2 + 6 x_1^2 x_3^2 + 6 x_2^2 x_3^2 and let A ∈ S_{4,3} be such that Ax^4 = f(x). Then, A has an SOS decomposition and

A_{1,1,1,1} = A_{2,2,2,2} = 1, A_{3,3,3,3} = 1/4, A_{1,1,3,3} = A_{1,1,2,2} = A_{2,2,3,3} = 1.

We now see that A ∉ SOS_{m,n}^*. To see this, we only need to find M ∈ S_{2,3} such that ⟨A, sym(M ⊗ M)⟩ < 0. Let M = diag(1, 1, -4). Then, sym(M ⊗ M) x^4 = (x^T M x)^2 = (x_1^2 + x_2^2 - 4 x_3^2)^2. Direct verification shows that sym(M ⊗ M) x^4 = x_1^4 + x_2^4 + 16 x_3^4 + 2 x_1^2 x_2^2 - 8 x_1^2 x_3^2 - 8 x_2^2 x_3^2. So,

sym(M ⊗ M)_{1,1,1,1} = sym(M ⊗ M)_{2,2,2,2} = 1, sym(M ⊗ M)_{3,3,3,3} = 16, sym(M ⊗ M)_{1,1,2,2} = 1/3, sym(M ⊗ M)_{1,1,3,3} = sym(M ⊗ M)_{2,2,3,3} = -4/3.

Therefore,

⟨A, sym(M ⊗ M)⟩ = 1 + 1 + (1/4)(16) + 6 (1 · 1/3) + 6 (1 · (-4/3)) + 6 (1 · (-4/3)) = -8 < 0,

and the desired result holds.

Question: It is known from polynomial optimization (see [22, Proposition 4.9] or [21]) that the dual cone of the cone consisting of all sum-of-squares polynomials (possibly nonhomogeneous) is the moment cone (that is, all sequences whose associated moment matrix is positive semi-definite). Can we link the dual cone of SOS_{m,n} to the moment matrix?
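The inner product in the counterexample above can be double-checked numerically. The following sketch (illustrative only; the helper names `sym_entries` and `entry` are ours) builds the symmetric coefficient tensor A of f by spreading each monomial coefficient over its index permutations, and evaluates \sum_{i,j,k,l} a_{ijkl} M_{ij} M_{kl}, which equals ⟨A, sym(M ⊗ M)⟩ because A is symmetric:

```python
from itertools import product
from collections import Counter
from math import factorial

def sym_entries(coeffs, m):
    """Spread each monomial coefficient evenly over its index permutations,
    giving the entries of the symmetric tensor A with A x^m = f(x)."""
    A = {}
    for idx, c in coeffs.items():
        perms = factorial(m)
        for k in Counter(idx).values():
            perms //= factorial(k)          # number of distinct permutations
        A[idx] = c / perms                  # value per individual entry
    return A

def entry(A, idx):
    return A.get(tuple(sorted(idx)), 0.0)

# f(x) = x1^4 + x2^4 + (1/4)x3^4 + 6x1^2x2^2 + 6x1^2x3^2 + 6x2^2x3^2
# (0-based index patterns, sorted)
coeffs = {(0, 0, 0, 0): 1.0, (1, 1, 1, 1): 1.0, (2, 2, 2, 2): 0.25,
          (0, 0, 1, 1): 6.0, (0, 0, 2, 2): 6.0, (1, 1, 2, 2): 6.0}
A = sym_entries(coeffs, m=4)

M = [[1, 0, 0], [0, 1, 0], [0, 0, -4]]      # M = diag(1, 1, -4)
val = sum(entry(A, (i, j, k, l)) * M[i][j] * M[k][l]
          for i, j, k, l in product(range(3), repeat=4))
print(val)  # -8.0
```

This reproduces ⟨A, sym(M ⊗ M)⟩ = -8 from the proof.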
Can the membership problem of SOS_{m,n} be solved in polynomial time?

3. SOS Decomposition of Several Classes of Structured Tensors

In this section, we examine the SOS decomposition of several classes of symmetric even order structured tensors, such as positive Cauchy tensors, weakly diagonally dominated tensors, B_0-tensors, double B-tensors, quasi-double B_0-tensors, MB_0-tensors, H-tensors, absolute tensors of positive semi-definite Z-tensors, and extended Z-tensors. For the SOS decomposition of strong Hankel tensors, see [7, 27].

3.1. Characterizing SOS decomposition for even order Cauchy tensors. Symmetric Cauchy tensors were first studied in [2]. Some checkable sufficient and necessary conditions for an even order symmetric Cauchy tensor to be positive semi-definite

or positive definite were provided in [2], which extends the matrix case established in [10]. Let c = (c_1, c_2, ..., c_n)^T ∈ R^n with c_{i_1} + c_{i_2} + ... + c_{i_m} ≠ 0 for all i_j ∈ {1, ..., n}, j = 1, ..., m. Let the real tensor C = (c_{i_1 i_2 ... i_m}) be defined by

c_{i_1 i_2 ... i_m} = 1 / (c_{i_1} + c_{i_2} + ... + c_{i_m}), i_j ∈ [n], j ∈ [m].

Then, we say that C is a symmetric Cauchy tensor with order m and dimension n, or simply a Cauchy tensor. The corresponding vector c ∈ R^n is called the generating vector of C.

To establish the SOS decomposition of Cauchy tensors, we will also need another class of tensors, called completely positive tensors, which have an SOS tensor decomposition in the even order case. A tensor A is called a completely decomposable tensor if there are vectors x_j ∈ R^n, j ∈ [r], such that A can be written as a sum of rank-one tensors generated by the vectors x_j, that is, A = \sum_{j ∈ [r]} x_j^m. If x_j ∈ R^n_+ for all j ∈ [r], then A is called a completely positive tensor [38]. It was shown that a strongly symmetric, hierarchically dominated nonnegative tensor is a completely positive tensor [38]. We now characterize the SOS decomposition and complete positivity for even order Cauchy tensors.

Theorem 3.1. Let c = (c_1, c_2, ..., c_n)^T ∈ R^n with c_{i_1} + c_{i_2} + ... + c_{i_m} ≠ 0 for all i_j ∈ {1, ..., n}, j = 1, ..., m. Let C be a Cauchy tensor generated by c with even order m and dimension n. Then, the following statements are equivalent.

(i) the Cauchy tensor C has an SOS tensor decomposition;
(ii) the Cauchy tensor C is positive semi-definite;
(iii) the entries of the generating vector c of the Cauchy tensor, c_i, i ∈ [n], are all positive;
(iv) the Cauchy tensor C is a completely positive tensor.

Proof. Since m is even, by the definitions of completely positive tensor, SOS tensor decomposition and positive semi-definite tensor, we can easily obtain (i) ⇒ (ii) and (iv) ⇒ (i). By Theorem 2.1 of [2], we know that C is positive semi-definite if and only if c_i > 0, i ∈ [n], and hence (ii) ⇔ (iii) holds. So, we only need to prove (iii) ⇒ (iv), that is, that any Cauchy tensor with positive generating vector is completely positive. Assume (iii) holds. Then, for any x ∈ R^n,

Cx^m = \sum_{i_1,...,i_m=1}^n x_{i_1} x_{i_2} ... x_{i_m} / (c_{i_1} + c_{i_2} + ... + c_{i_m})
     = \sum_{i_1,...,i_m=1}^n \int_0^1 t^{c_{i_1}+c_{i_2}+...+c_{i_m}-1} x_{i_1} x_{i_2} ... x_{i_m} dt
     = \int_0^1 ( \sum_{i_1,...,i_m=1}^n t^{c_{i_1}+c_{i_2}+...+c_{i_m}-1} x_{i_1} x_{i_2} ... x_{i_m} ) dt
     = \int_0^1 ( \sum_{i=1}^n t^{c_i - 1/m} x_i )^m dt.

By the definition of the Riemann integral, we have

Cx^m = \lim_{k → ∞} (1/k) \sum_{j=1}^k ( \sum_{i=1}^n (j/k)^{c_i - 1/m} x_i )^m.
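The integral identity and its Riemann-sum approximation can be sanity-checked numerically on a small instance; the sketch below is illustrative only (the generating vector, point x, and grid size are chosen arbitrarily):

```python
from itertools import product

m, n = 4, 2
c = [1.0, 2.0]          # positive generating vector, so C is PSD by Theorem 3.1
x = [1.0, 0.5]

# Exact value of Cx^m directly from the definition of the Cauchy tensor
exact = sum((x[i1] * x[i2] * x[i3] * x[i4]) / (c[i1] + c[i2] + c[i3] + c[i4])
            for i1, i2, i3, i4 in product(range(n), repeat=m))

# Right-endpoint Riemann sum (1/k) * sum_j ( sum_i (j/k)^(c_i - 1/m) x_i )^m
k = 10000
approx = sum(sum((j / k) ** (ci - 1 / m) * xi for ci, xi in zip(c, x)) ** m
             for j in range(1, k + 1)) / k

print(exact, approx)   # the two values agree to roughly 1/k
```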

Let C_k be the symmetric tensor such that

C_k x^m = (1/k) \sum_{j=1}^k ( \sum_{i=1}^n (j/k)^{c_i - 1/m} x_i )^m = \sum_{j=1}^k ( \sum_{i=1}^n ((j/k)^{c_i - 1/m} / k^{1/m}) x_i )^m = \sum_{j=1}^k ⟨u_j, x⟩^m,

where u_j = ( (j/k)^{c_1 - 1/m} / k^{1/m}, ..., (j/k)^{c_n - 1/m} / k^{1/m} ) ∈ R^n, j = 1, ..., k. Let CD_{m,n} denote the set consisting of all completely decomposable tensors with order m and dimension n. From [27, Theorem 1], CD_{m,n} is a closed convex cone when m is even. It then follows that C = \lim_{k → ∞} C_k is also a completely positive tensor.

3.2. Even order symmetric weakly diagonally dominated tensors have SOS decompositions. In this section, we establish that even order symmetric weakly diagonally dominated tensors have SOS decompositions. Firstly, we give the definition of weakly diagonally dominated tensors. To do this, we introduce an index set Δ_A associated with a tensor A. Let A be a tensor with order m and dimension n, and let f_A be its associated homogeneous polynomial, f_A(x) = Ax^m. We then define the index set Δ_A as Δ_f with f = f_A, as given in (2.3).

Definition 3.2. We say A is a diagonally dominated tensor if, for each i = 1, ..., n,

a_{i...i} ≥ \sum_{(i_2,...,i_m) ≠ (i,...,i)} |a_{i i_2 ... i_m}|.

We say A is a weakly diagonally dominated tensor if, for each i = 1, ..., n,

a_{i...i} ≥ \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}|.

Clearly, any diagonally dominated tensor is a weakly diagonally dominated tensor. However, the converse is, in general, not true.

Theorem 3.3. Let A be a symmetric weakly diagonally dominated tensor with order m and dimension n. Suppose that m is even. Then, A has an SOS tensor decomposition.

Proof. Denote I = {(i, ..., i) : 1 ≤ i ≤ n}. Let x ∈ R^n. Then,

Ax^m = \sum_{i=1}^n a_{i...i} x_i^m + \sum_{(i_1,...,i_m) ∉ I} a_{i_1 i_2 ... i_m} x_{i_1} x_{i_2} ... x_{i_m}
     = \sum_{i=1}^n ( a_{i...i} - \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| ) x_i^m + \sum_{i=1}^n ( \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| ) x_i^m + \sum_{(i_1,...,i_m) ∉ I} a_{i_1 i_2 ... i_m} x_{i_1} x_{i_2} ... x_{i_m}
     = \sum_{i=1}^n ( a_{i...i} - \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| ) x_i^m
       + [ \sum_{i=1}^n ( \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| ) x_i^m + \sum_{i=1}^n \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} a_{i i_2 ... i_m} x_i x_{i_2} ... x_{i_m} ]
       + \sum_{i=1}^n \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∉ Δ_A} a_{i i_2 ... i_m} x_i x_{i_2} ... x_{i_m}.

Define

h(x) = \sum_{i=1}^n ( \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| ) x_i^m + \sum_{i=1}^n \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} a_{i i_2 ... i_m} x_i x_{i_2} ... x_{i_m}.

We now show that h is a sum-of-squares polynomial. By Lemma 2.2, it suffices to show that

ĥ(x) := \sum_{i=1}^n ( \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| ) x_i^m - \sum_{i=1}^n \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| x_i x_{i_2} ... x_{i_m}

is a polynomial which always takes nonnegative values. As ĥ is a homogeneous polynomial of degree m on R^n, let Ĥ be a symmetric tensor with order m and dimension n such that ĥ(x) = Ĥ x^m. Since A is symmetric, the nonzero entries of Ĥ agree, up to absolute value, with the corresponding entries of A. Now, let λ be an arbitrary H-eigenvalue of Ĥ. From the Gershgorin theorem for eigenvalues of tensors [36], for some i,

λ ≥ Ĥ_{i...i} - \sum_{(i_2,...,i_m) ≠ (i,...,i)} |Ĥ_{i i_2 ... i_m}| = \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| - \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| = 0.

So, we must have λ ≥ 0. This shows that all H-eigenvalues of Ĥ are nonnegative, and so Ĥ is positive semi-definite [36]. Thus, ĥ is a polynomial which always takes nonnegative values, and hence h is a sum-of-squares polynomial.

Now, as A is a weakly diagonally dominated tensor and m is even,

\sum_{i=1}^n ( a_{i...i} - \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∈ Δ_A} |a_{i i_2 ... i_m}| ) x_i^m

is an SOS polynomial. Moreover, from the definition of Δ_A, for each (i_1, ..., i_m) ∉ Δ_A with (i_1, ..., i_m) ∉ I, we have a_{i_1...i_m} ≥ 0 and x_{i_1} ... x_{i_m} is a square term. Then,

\sum_{i=1}^n \sum_{(i_2,...,i_m) ≠ (i,...,i), (i,i_2,...,i_m) ∉ Δ_A} a_{i i_2 ... i_m} x_i x_{i_2} ... x_{i_m}

is also a sum-of-squares polynomial. Thus, A has an SOS tensor decomposition.
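Definition 3.2 is easy to check computationally. The sketch below tests the plain diagonal dominance condition (the stronger variant that ignores the index set Δ_A); the function name, the dictionary encoding of the tensor, and the example entries are our own illustrative choices:

```python
from itertools import product, permutations

def is_diagonally_dominated(A, n, m):
    """Check a_{i...i} >= sum_{(i2,...,im) != (i,...,i)} |a_{i i2 ... im}|.
    A maps index tuples to entries; missing tuples are treated as zero."""
    for i in range(n):
        diag = A.get((i,) * m, 0.0)
        off = sum(abs(A.get((i,) + rest, 0.0))
                  for rest in product(range(n), repeat=m - 1)
                  if rest != (i,) * (m - 1))
        if diag < off:
            return False
    return True

# Symmetric 4th order, 2-dimensional example with Ax^4 = x1^4 + x2^4 - (1/2)x1^2 x2^2:
# the coefficient -1/2 is spread over the 6 permutations of (0,0,1,1), -1/12 each.
A = {(0, 0, 0, 0): 1.0, (1, 1, 1, 1): 1.0}
for idx in set(permutations((0, 0, 1, 1))):
    A[idx] = -1.0 / 12

print(is_diagonally_dominated(A, n=2, m=4))  # True: 1 >= 3 * (1/12) for each i
```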

As a diagonally dominated tensor is weakly diagonally dominated, the following corollary follows immediately.

Corollary 3.4. Let A be a symmetric diagonally dominated tensor with even order m and dimension n. Then, A has an SOS tensor decomposition.

3.3. The absolute tensor of an even order symmetric positive semi-definite Z-tensor has an SOS decomposition. Let A be an order m dimension n tensor. If all off-diagonal elements of A are non-positive, then A is called a Z-tensor [44]. A Z-tensor A = (a_{i_1,...,i_m}) can be written as

(3.1) A = D - C,

where D is a diagonal tensor whose i-th diagonal element equals a_{ii...i}, i = 1, ..., n, and C is a nonnegative tensor (that is, a tensor with nonnegative entries) whose diagonal entries all equal zero. We now define the absolute tensor of A by |A| = D + C. Since all even order symmetric positive semi-definite Z-tensors have SOS decompositions [14, 15], a natural and interesting question is: do all absolute tensors of even order symmetric positive semi-definite Z-tensors have SOS decompositions? Below, we provide an answer to this question.

Theorem 3.5. Let A be a symmetric Z-tensor with even order m and dimension n defined as in (3.1). If A is positive semi-definite, then |A| has an SOS tensor decomposition.

Proof. Let A = (a_{i_1...i_m}) be a symmetric positive semi-definite Z-tensor. From (3.1), we have A = D - C, where D is a diagonal tensor whose diagonal entries are d_i := a_{i...i}, i ∈ [n], and C = (c_{i_1 i_2 ... i_m}) is a nonnegative tensor with zero diagonal entries. Define three index sets as follows:

I = {(i_1, i_2, ..., i_m) ∈ [n]^m : i_1 = i_2 = ... = i_m};
Ω = {(i_1, i_2, ..., i_m) ∈ [n]^m : c_{i_1 i_2 ... i_m} ≠ 0 and (i_1, i_2, ..., i_m) ∉ I};
Δ = {(i_1, i_2, ..., i_m) ∈ Ω : at least one index in (i_1, i_2, ..., i_m) appears an odd number of times}.

Let f(x) = |A| x^m and define a polynomial f̂ by

f̂(x) = \sum_{i=1}^n d_i x_i^m - \sum_{(i_1, i_2, ..., i_m) ∈ Δ} c_{i_1 i_2 ... i_m} x_{i_1} x_{i_2} ... x_{i_m}.

By Lemma 2.2, to see that the polynomial f(x) = |A| x^m is a sum-of-squares polynomial, we only need to show that f̂ always takes nonnegative values. To see this, as A is positive semi-definite, we have d_i ≥ 0. Since c_{i_1 i_2 ... i_m} ≥ 0, i_j ∈ [n], j ∈ [m], it follows that

f̂(x) = \sum_{i=1}^n d_i x_i^m - \sum_{(i_1,...,i_m) ∈ Δ} c_{i_1...i_m} x_{i_1} ... x_{i_m}
      = \sum_{i=1}^n d_i x_i^m - \sum_{(i_1,...,i_m) ∈ Ω} c_{i_1...i_m} x_{i_1} ... x_{i_m} + \sum_{(i_1,...,i_m) ∈ Ω \ Δ} c_{i_1...i_m} x_{i_1} ... x_{i_m}
      = Ax^m + \sum_{(i_1,...,i_m) ∈ Ω \ Δ} c_{i_1...i_m} x_{i_1} ... x_{i_m} ≥ 0.

Here, the last inequality follows from the facts that m is even, A is positive semi-definite, and x_{i_1} x_{i_2} ... x_{i_m} is a square term whenever (i_1, i_2, ..., i_m) ∈ Ω \ Δ. Thus, the desired result follows.
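Theorem 3.5 can be illustrated on a tiny instance. With f_A(x) = x_1^4 + x_2^4 - 2 x_1^2 x_2^2 = (x_1^2 - x_2^2)^2, the tensor A is a positive semi-definite Z-tensor, and |A| x^4 = x_1^4 + x_2^4 + 2 x_1^2 x_2^2 = (x_1^2 + x_2^2)^2 is visibly a sum of squares. A hedged numerical sketch (a grid check only, not a proof; values are illustrative):

```python
def fA(x1, x2):       # A x^4 for the PSD Z-tensor A
    return x1**4 + x2**4 - 2 * x1**2 * x2**2

def fAbs(x1, x2):     # |A| x^4: all off-diagonal signs flipped to +
    return x1**4 + x2**4 + 2 * x1**2 * x2**2

grid = [i / 10 for i in range(-20, 21)]
minA = min(fA(a, b) for a in grid for b in grid)
minAbs = min(fAbs(a, b) for a in grid for b in grid)
print(minA, minAbs)   # both are 0 up to floating-point rounding
```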

10 SOS TENSOR DECOMPOSITION: THEORY AND APPLICATIONS 3.4. SOS tensor decomposition for even order symmetric extended Z- tensors. In this subsection, we introduce a new class of symmetric tensor which extends symmetric Z-tensors to the cases where the off-diagonal elements can be positive, and examine its SOS tensor decomposition. Let f be a polynomial on R n with degree m. Let f m,i be the coefficient of f associated with x m i, i [n]. We say f is an extended Z-polynomial if there exist s N with s n and index sets Γ l {1,,n}, l = 1,,s with s l=1 Γ l = {1,,n} and Γ l1 Γ l2 = for all l 1 l 2 such that where n s f(x) = f m,i x m i + f αl x αl, α l Ω l Ω l = {α ([n] {0}) n α = m,x α = x i1 x i2 x im,{i 1,,i m } Γ l, and α me i, i = 1,,n} for each l = 1,,s and either one of the following two conditions holds: (1) f αl = 0 for all but one α l Ω l ; (2) f αl 0 for all α l Ω l. We now say a symmetric tensor A is an extended Z-tensor if its associated polynomial f A (x) = Ax m is an extended Z-polynomial. From the definition, it is clear that any Z-tensor is an extended Z-tensor with s = 1 and Γ 1 = {1,,n}. On the other hand, an extended Z-tensor allows a few elements of the off-diagonal elements to be positive, and so, an extended Z-tensor need not to be a Z-tensor. For example, consider a symmetric tensor A where its associated polynomial f A (x) = Ax m = x 6 1 +x 6 2 +x 6 3 +x 6 4 +4x 3 1x 3 2 +6x 2 3x 4 4. It can be easily see that A is an extended Z-tensor but not a Z-tensor (as there are positive off-diagonal elements). In [31], partially Z-tensors are introduced. There is no direct relation between these two concepts, except that both of them contain Z-tensors. But they do have intersection which is larger than the set of all Z-tensors. Actually, the example just discussed is not a Z-tensor, but it is an extended Z-tensor and a partially Z-tensor as well. We now see that any positive semi-definite extended Z-tensor has an SOS tensor decomposition. 
To achieve this, we recall the following useful lemma, which provides a simple criterion for determining whether a homogeneous polynomial with only one mixed term is a sum-of-squares polynomial or not.

Lemma 3.6. [9] Let $b_1, b_2, \dots, b_n \ge 0$ and $d \in \mathbb{N}$. Let $a_1, a_2, \dots, a_n \in \mathbb{N}$ be such that $\sum_{i=1}^n a_i = 2d$. Consider the homogeneous polynomial $f(x)$ defined by
\[
f(x) = b_1 x_1^{2d} + \cdots + b_n x_n^{2d} - \mu\, x_1^{a_1} \cdots x_n^{a_n}.
\]
Let $\mu_0 = 2d \prod_{a_i \neq 0,\ 1 \le i \le n} \left(\frac{b_i}{a_i}\right)^{\frac{a_i}{2d}}$. Then, the following statements are equivalent:
(i) $f$ is a nonnegative polynomial, i.e., $f(x) \ge 0$ for all $x \in \mathbb{R}^n$;
(ii) either $|\mu| \le \mu_0$, or $\mu \le \mu_0$ and all $a_i$ are even;
(iii) $f$ is an SOS polynomial.

Theorem 3.7. Let $A$ be an even order positive semi-definite extended Z-tensor. Then, $A$ has an SOS tensor decomposition.

Proof. Let $f_A(x) = Ax^m$. As $A$ is a positive semi-definite symmetric extended Z-tensor, there exist $s \in \mathbb{N}$ and index sets $\Gamma_l \subseteq \{1,\dots,n\}$, $l = 1,\dots,s$, with $\bigcup_{l=1}^s \Gamma_l = \{1,\dots,n\}$

HAIBIN CHEN, GUOYIN LI, AND LIQUN QI

and $\Gamma_{l_1} \cap \Gamma_{l_2} = \emptyset$ for all $l_1 \neq l_2$ such that, for all $x \in \mathbb{R}^n$,
\[
f(x) = \sum_{i=1}^n f_{m,i} x_i^m + \sum_{l=1}^s \sum_{\alpha_l \in \Omega_l} f_{\alpha_l} x^{\alpha_l},
\]
where, for each $l = 1,\dots,s$, either one of the following two conditions holds:
(1) $f_{\alpha_l} = 0$ for all but one $\alpha_l \in \Omega_l$;
(2) $f_{\alpha_l} \le 0$ for all $\alpha_l \in \Omega_l$.
Define, for each $l = 1,\dots,s$,
\[
h_l(x) := \sum_{i \in \Gamma_l} f_{m,i} x_i^m + \sum_{\alpha_l \in \Omega_l} f_{\alpha_l} x^{\alpha_l}.
\]
It follows that each $h_l$ is an extended Z-polynomial. Moreover, from the construction, $\sum_{l=1}^s h_l = f_A$, and so $\inf_{x \in \mathbb{R}^n} \sum_{l=1}^s h_l(x) = 0$. Note that each $h_l$ is also a homogeneous polynomial, and hence $\inf_{x \in \mathbb{R}^n} h_l(x) \le 0$. Noting that each $h_l$ is indeed a polynomial in $(x_i)_{i \in \Gamma_l}$, $\bigcup_{l=1}^s \Gamma_l = \{1,\dots,n\}$ and $\Gamma_{l_1} \cap \Gamma_{l_2} = \emptyset$ for all $l_1 \neq l_2$, we have
\[
\inf_{x \in \mathbb{R}^n} \sum_{l=1}^s h_l(x) = \sum_{l=1}^s \inf_{x \in \mathbb{R}^n} h_l(x).
\]
This enforces that $\inf_{x \in \mathbb{R}^n} h_l(x) = 0$ for each $l$. In particular, each $h_l$ is a polynomial which takes only nonnegative values. We now show that $h_l$, $1 \le l \le s$, are SOS polynomials. Indeed, if $f_{\alpha_l} = 0$ for all but one $\alpha_l \in \Omega_l$, then $h_l$ is a homogeneous polynomial with only one mixed term, and so Lemma 3.6 implies that $h_l$ is an SOS polynomial. On the other hand, if $f_{\alpha_l} \le 0$ for all $\alpha_l \in \Omega_l$, then $h_l$ corresponds to a Z-tensor, and so $h_l$ is also an SOS polynomial in this case because any positive semi-definite Z-tensor has an SOS tensor decomposition [14]. Thus, $f_A = \sum_{l=1}^s h_l$ is also an SOS polynomial, and hence the conclusion follows.

Remark 3.1. A close inspection of the above proof indicates that we have indeed shown that the associated polynomial $f_A(x) = Ax^m$ satisfies $f_A = \sum_{l=1}^s h_l$, where each $h_l$ is an SOS polynomial in $(x_i)_{i \in \Gamma_l}$.

3.5. Even order symmetric B_0-tensors have SOS decompositions. In this part, we show that even order symmetric $B_0$-tensors have SOS tensor decompositions. Recall that a tensor $A = (a_{i_1 i_2 \cdots i_m})$ with order $m$ and dimension $n$ is called a $B_0$-tensor [37] if
\[
\sum_{i_2,\dots,i_m=1}^n a_{i i_2 \cdots i_m} \ge 0 \quad \text{and} \quad \frac{1}{n^{m-1}} \sum_{i_2,\dots,i_m=1}^n a_{i i_2 \cdots i_m} \ge a_{i j_2 \cdots j_m} \ \text{for all } (j_2,\dots,j_m) \neq (i,\dots,i).
\]
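The two defining inequalities of a $B_0$-tensor can be verified directly by enumeration for small tensors. The sketch below is our own illustrative code (the sparse dict representation, with missing entries treated as zero, is an assumption of the sketch):

```python
# A direct check of the B0-tensor definition, for a tensor stored as a dict
# mapping index tuples (i, i2, ..., im) to entries; missing entries are 0.
from itertools import product

def is_b0_tensor(a, n, m):
    """Return True if a is a B0-tensor of order m and dimension n."""
    for i in range(n):
        # first condition: the i-th "row sum" is nonnegative
        row_sum = sum(a.get((i,) + js, 0.0) for js in product(range(n), repeat=m - 1))
        if row_sum < 0:
            return False
        # second condition: the row average dominates every off-diagonal entry
        avg = row_sum / n ** (m - 1)
        for js in product(range(n), repeat=m - 1):
            if js != (i,) * (m - 1) and avg < a.get((i,) + js, 0.0):
                return False
    return True

# Order-2 sanity checks: the identity matrix is B0 ...
ident = {(0, 0): 1.0, (1, 1): 1.0}
print(is_b0_tensor(ident, n=2, m=2))   # True: row average 0.5 >= off-diagonal 0
# ... but a large positive off-diagonal entry violates the averaging condition.
bad = {(0, 0): 1.0, (0, 1): 2.0, (1, 1): 1.0}
print(is_b0_tensor(bad, n=2, m=2))     # False: average 1.5 < a_{12} = 2
```

The enumeration costs $O(n^m)$ per row, so this is only practical for small $m$ and $n$, but it matches the definition term by term.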
To establish that a $B_0$-tensor has an SOS tensor decomposition, we first present the SOS tensor decomposition of the all-one-tensor. We say $E$ is an all-one-tensor if each of its elements is equal to one.

Lemma 3.8. Let $E$ be an even order all-one-tensor. Then, $E$ has an SOS tensor decomposition.

Proof. Let $E = (e_{i_1 i_2 \cdots i_m})$ be an all-one-tensor with even order $m$ and dimension $n$. For all $x \in \mathbb{R}^n$, one has
\[
E x^m = \sum_{i_1,i_2,\dots,i_m \in [n]} e_{i_1 i_2 \cdots i_m} x_{i_1} x_{i_2} \cdots x_{i_m} = \sum_{i_1,i_2,\dots,i_m \in [n]} x_{i_1} x_{i_2} \cdots x_{i_m} = (x_1 + x_2 + \cdots + x_n)^m \ge 0,
\]

which implies that $E$ has an SOS tensor decomposition.

Let $J \subseteq [n]$. $E_J$ is called a partially all-one-tensor if its elements are defined by $e_{i_1 i_2 \cdots i_m} = 1$ if $i_1, i_2, \dots, i_m \in J$, and $e_{i_1 i_2 \cdots i_m} = 0$ otherwise. Similar to Lemma 3.2, it is easy to check that all even order partially all-one-tensors have SOS decompositions. We also need the following characterization of $B_0$-tensors established in [37].

Lemma 3.9. Suppose that $A$ is a $B_0$-tensor with order $m$ and dimension $n$. Then either $A$ is a diagonally dominated symmetric M-tensor itself, or we have
\[
A = M + \sum_{k=1}^s h_k E_{J_k},
\]
where $M$ is a diagonally dominated symmetric M-tensor, $s$ is a positive integer, $h_k > 0$ and $J_k \subseteq \{1,\dots,n\}$ for $k = 1,\dots,s$, and $J_k \cap J_l = \emptyset$ for $k \neq l$.

From Theorem 3.3, Lemma 3.8 and Lemma 3.9, we have the following result.

Theorem 3.10. All even order symmetric $B_0$-tensors have SOS tensor decompositions.

Before we move on to the next part, we note that, stimulated by $B_0$-tensors in [37], symmetric double B-tensors, symmetric quasi-double $B_0$-tensors and symmetric $MB_0$-tensors have been studied in [24, 25]. Below, we briefly explain that, using a similar method of proof as above, these three classes of tensors all have SOS decompositions. To do this, let us recall the definitions of these three classes of tensors. For a real symmetric tensor $B = (b_{i_1 i_2 \cdots i_m})$ with order $m$ and dimension $n$, denote
\[
\beta_i(B) = \max_{j_2,\dots,j_m \in [n],\ (i,j_2,\dots,j_m) \notin I} \{0,\ b_{i j_2 \cdots j_m}\};
\]
\[
\Delta_i(B) = \sum_{j_2,\dots,j_m \in [n],\ (i,j_2,\dots,j_m) \notin I} \big(\beta_i(B) - b_{i j_2 \cdots j_m}\big);
\]
\[
\Delta_i^j(B) = \Delta_j(B) - \big(\beta_j(B) - b_{j i \cdots i}\big), \quad i \neq j.
\]
As defined in [24, Definition 3], $B$ is called a double B-tensor if $b_{i \cdots i} > \beta_i(B)$ for all $i \in [n]$, and for all $i, j \in [n]$, $i \neq j$,
\[
b_{i \cdots i} - \beta_i(B) \ge \Delta_i(B) \quad \text{and} \quad \big(b_{i \cdots i} - \beta_i(B)\big)\big(b_{j \cdots j} - \beta_j(B)\big) > \Delta_i(B)\, \Delta_j(B).
\]
If $b_{i \cdots i} > \beta_i(B)$ for all $i \in [n]$ and
\[
\big(b_{i \cdots i} - \beta_i(B)\big)\big(b_{j \cdots j} - \beta_j(B) - \Delta_i^j(B)\big) \ge \big(\beta_j(B) - b_{j i \cdots i}\big)\, \Delta_i(B),
\]
then the tensor $B$ is called a quasi-double $B_0$-tensor (see Definition 2 of [25]).
Let $A = (a_{i_1 i_2 \cdots i_m})$ be such that $a_{i_1 i_2 \cdots i_m} = b_{i_1 i_2 \cdots i_m} - \beta_{i_1}(B)$ for all $i_1 \in [n]$. If $A$ is an M-tensor, then $B$ is called an $MB_0$-tensor (see Definition 3 of [25]). It was shown in [25] that all quasi-double $B_0$-tensors are $MB_0$-tensors.

In [24], Li et al. proved that, for any symmetric double B-tensor $B$, either $B$ is a doubly strictly diagonally dominated (DSDD) Z-tensor, or $B$ can be decomposed as the sum of a DSDD Z-tensor and several positive multiples of partially all-one-tensors (see Theorem 6 of [24]). From Theorem 4 of [24], we know that an even order symmetric DSDD Z-tensor is positive definite. This, together with the fact that any positive semi-definite Z-tensor has an SOS tensor decomposition [14], implies that any even order symmetric double B-tensor $B$ has an SOS tensor decomposition. Moreover, from Theorem 7 of [25], we know that any symmetric $MB_0$-tensor is either an M-tensor itself or can be decomposed as the sum of an M-tensor and several positive multiples of partially all-one-tensors. As even order symmetric M-tensors are positive semi-definite Z-tensors [44], which have, in particular, SOS decompositions, we see that any even order symmetric $MB_0$-tensor also has an SOS tensor decomposition. Combining these and noting that any quasi-double $B_0$-tensor is an $MB_0$-tensor, we arrive at the following conclusion.

Theorem 3.11. Even order symmetric double B-tensors, even order symmetric quasi-double $B_0$-tensors and even order symmetric $MB_0$-tensors all have SOS tensor decompositions.

3.6. Even order symmetric H-tensors with nonnegative diagonal elements have SOS decompositions. In this part, we show that any even order symmetric H-tensor with nonnegative diagonal elements has an SOS tensor decomposition. Recall that, for an $m$th order $n$ dimensional tensor $A = (a_{i_1 i_2 \cdots i_m})$, its comparison tensor $M(A) = (m_{i_1 i_2 \cdots i_m})$ is defined by $m_{i \cdots i} = |a_{i \cdots i}|$ and $m_{i_1 i_2 \cdots i_m} = -|a_{i_1 i_2 \cdots i_m}|$ for all $i, i_1, \dots, i_m \in [n]$ with $(i_1, i_2, \dots, i_m) \notin I$.
Then, a tensor $A$ is called an H-tensor [6] if there exists a tensor $Z$ with nonnegative entries such that $M(A) = sI - Z$ and $s \ge \rho(Z)$, where $I$ is the identity tensor and $\rho(Z)$ is the spectral radius of $Z$, defined as the maximum modulus of all eigenvalues of $Z$. If $s > \rho(Z)$, then $A$ is called a nonsingular H-tensor. A characterization of nonsingular H-tensors was given in [6], which states that $A$ is a nonsingular H-tensor if and only if there exists an entrywise positive vector $y = (y_1, y_2, \dots, y_n) \in \mathbb{R}^n$ such that
\[
|a_{i \cdots i}|\, y_i^{m-1} > \sum_{(i,i_2,\dots,i_m) \notin I} |a_{i i_2 \cdots i_m}|\, y_{i_2} y_{i_3} \cdots y_{i_m}, \quad i \in [n].
\]
We note that the above definitions were first introduced in [6]. These were further examined in [17, 26], where the authors of [26] referred to nonsingular H-tensors simply as H-tensors and the authors of [17] referred to nonsingular H-tensors as strong H-tensors.

Theorem 3.12. Let $A = (a_{i_1 i_2 \cdots i_m})$ be a symmetric H-tensor with even order $m$ and dimension $n$. Suppose that all the diagonal elements of $A$ are nonnegative. Then, $A$ has an SOS tensor decomposition.

Proof. We first show that any nonsingular H-tensor with positive diagonal elements has an SOS tensor decomposition. Let $A = (a_{i_1 i_2 \cdots i_m})$ be a nonsingular H-tensor with even order $m$ and dimension $n$ such that $a_{i \cdots i} > 0$, $i \in [n]$. Then, there exists a vector $y = (y_1,\dots,y_n)^T \in \mathbb{R}^n$ with $y_i > 0$, $i = 1,\dots,n$, such that
\[
(3.2) \qquad a_{i \cdots i}\, y_i^{m-1} > \sum_{(i,i_2,\dots,i_m) \notin I} |a_{i i_2 \cdots i_m}|\, y_{i_2} y_{i_3} \cdots y_{i_m}, \quad i \in [n].
\]

To prove the conclusion, by Lemma 2.2, we only need to prove that
\[
\hat f_A(x) = \sum_{i \in [n]} a_{i \cdots i} x_i^m - \sum_{(i_1,i_2,\dots,i_m) \in \Delta} |a_{i_1 i_2 \cdots i_m}|\, x_{i_1} x_{i_2} \cdots x_{i_m} \ge 0, \quad x \in \mathbb{R}^n,
\]
where $\Delta$ denotes the set of off-diagonal index tuples with nonzero entries. From (3.2), we know that
\[
(3.3) \qquad \hat f_A(x) \ge \sum_{i \in [n]} \Big( \sum_{(i,i_2,\dots,i_m) \notin I} |a_{i i_2 \cdots i_m}|\, y_i^{1-m} y_{i_2} y_{i_3} \cdots y_{i_m} \Big) x_i^m - \sum_{(i_1,i_2,\dots,i_m) \in \Delta} |a_{i_1 i_2 \cdots i_m}|\, x_{i_1} x_{i_2} \cdots x_{i_m}.
\]
Here, for any fixed tuple $(i_1^0, i_2^0, \dots, i_m^0) \in \Delta$, assume that $(i_1^0, i_2^0, \dots, i_m^0)$ is constituted by $k$ distinct indices $j_1^0, j_2^0, \dots, j_k^0$, $k \le m$, which appear $s_1, s_2, \dots, s_k$ times in $(i_1^0, i_2^0, \dots, i_m^0)$ respectively, $s_l \in [m]$, $l \in [k]$. Then, one has $s_1 + s_2 + \cdots + s_k = m$. Without loss of generality, we denote $a = |a_{i_1^0 i_2^0 \cdots i_m^0}| > 0$. Let $\pi(i_1^0, i_2^0, \dots, i_m^0)$ be the set consisting of all permutations of $(i_1^0, i_2^0, \dots, i_m^0)$. So, on the right side of (3.3), the terms corresponding to the fixed tuple $(i_1^0, i_2^0, \dots, i_m^0)$ sum to
\[
\sum_{(j_1^0, i_2, \dots, i_m) \in \pi(i_1^0,\dots,i_m^0)} a\, y_{j_1^0}^{1-m} y_{i_2} y_{i_3} \cdots y_{i_m}\, x_{j_1^0}^m
+ \sum_{(j_2^0, i_2, \dots, i_m) \in \pi(i_1^0,\dots,i_m^0)} a\, y_{j_2^0}^{1-m} y_{i_2} y_{i_3} \cdots y_{i_m}\, x_{j_2^0}^m
+ \cdots
+ \sum_{(j_k^0, i_2, \dots, i_m) \in \pi(i_1^0,\dots,i_m^0)} a\, y_{j_k^0}^{1-m} y_{i_2} y_{i_3} \cdots y_{i_m}\, x_{j_k^0}^m
- \sum_{(i_1, i_2, \dots, i_m) \in \pi(i_1^0,\dots,i_m^0)} a\, x_{i_1} x_{i_2} \cdots x_{i_m}
\]
\[
= \binom{m-1}{s_1 - 1} \binom{m-s_1}{s_2} \binom{m-s_1-s_2}{s_3} \cdots \binom{m-s_1-\cdots-s_{k-1}}{s_k}\, a\, y_{j_1^0}^{s_1 - m} y_{j_2^0}^{s_2} \cdots y_{j_k^0}^{s_k}\, x_{j_1^0}^m
+ \binom{m-1}{s_2 - 1} \binom{m-s_2}{s_1} \binom{m-s_1-s_2}{s_3} \cdots \binom{m-s_1-\cdots-s_{k-1}}{s_k}\, a\, y_{j_2^0}^{s_2 - m} y_{j_1^0}^{s_1} y_{j_3^0}^{s_3} \cdots y_{j_k^0}^{s_k}\, x_{j_2^0}^m
+ \cdots
+ \binom{m-1}{s_k - 1} \binom{m-s_k}{s_1} \binom{m-s_k-s_1}{s_2} \cdots \binom{m-s_k-s_1-\cdots-s_{k-2}}{s_{k-1}}\, a\, y_{j_k^0}^{s_k - m} y_{j_1^0}^{s_1} \cdots y_{j_{k-1}^0}^{s_{k-1}}\, x_{j_k^0}^m
- \binom{m}{s_1} \binom{m-s_1}{s_2} \binom{m-s_1-s_2}{s_3} \cdots \binom{m-s_1-\cdots-s_{k-1}}{s_k}\, a\, x_{j_1^0}^{s_1} x_{j_2^0}^{s_2} \cdots x_{j_k^0}^{s_k}
\]
\[
= \frac{(m-1)!\, a\, y_{j_1^0}^{s_1} y_{j_2^0}^{s_2} \cdots y_{j_k^0}^{s_k}}{s_1! s_2! \cdots s_k!} \left[ s_1 \Big( \frac{x_{j_1^0}}{y_{j_1^0}} \Big)^m + s_2 \Big( \frac{x_{j_2^0}}{y_{j_2^0}} \Big)^m + \cdots + s_k \Big( \frac{x_{j_k^0}}{y_{j_k^0}} \Big)^m - m \Big( \frac{x_{j_1^0}}{y_{j_1^0}} \Big)^{s_1} \Big( \frac{x_{j_2^0}}{y_{j_2^0}} \Big)^{s_2} \cdots \Big( \frac{x_{j_k^0}}{y_{j_k^0}} \Big)^{s_k} \right] \ge 0,
\]
where the last inequality follows from the arithmetic-geometric mean inequality (recall that $m$ is even) and the fact that $y > 0$.
Thus, each off-diagonal tuple $(i_1, i_2, \dots, i_m)$ with $a_{i_1 i_2 \cdots i_m} \neq 0$ contributes a nonnegative value to the right side of (3.3), which implies that $\hat f_A(x) \ge 0$ for all $x \in \mathbb{R}^n$. Hence, by Lemma 2.2, $A$ has an SOS tensor decomposition.
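The key step above is the weighted arithmetic-geometric mean inequality $s_1 t_1^m + \cdots + s_k t_k^m \ge m\, t_1^{s_1} \cdots t_k^{s_k}$ for $t_l \ge 0$ with $s_1 + \cdots + s_k = m$, where $t_l$ plays the role of $x_{j_l^0}/y_{j_l^0}$. A quick numerical sanity check of this inequality for one exponent pattern (illustrative only; `bracket` is our own helper, not from the paper):

```python
# Evaluate s_1 t_1^m + ... + s_k t_k^m - m * t_1^{s_1} * ... * t_k^{s_k},
# which weighted AM-GM guarantees is nonnegative for t >= 0.
def bracket(t, s, m):
    prod = 1.0
    for tl, sl in zip(t, s):
        prod *= tl ** sl
    return sum(sl * tl ** m for tl, sl in zip(t, s)) - m * prod

s, m = (1, 3), 4            # exponent pattern of the monomial x1 * x2^3
vals = [bracket((0.1 * p, 0.1 * q), s, m)
        for p in range(1, 30) for q in range(1, 30)]
print(min(vals) >= -1e-9)    # True: every grid value is nonnegative (up to rounding)
```

Equality holds exactly on the diagonal $t_1 = t_2$, which is why the check allows a tiny negative rounding tolerance.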

Now, let $A$ be a general H-tensor with nonnegative diagonal elements. Then, for each $\epsilon > 0$, $A_\epsilon := A + \epsilon I$ is a nonsingular H-tensor with positive diagonal elements. Thus, $A_\epsilon \to A$ as $\epsilon \to 0$, and for each $\epsilon > 0$, $A_\epsilon$ has an SOS tensor decomposition. As $SOS_{m,n}$ is a closed convex cone, we see that $A$ also has an SOS tensor decomposition, and the desired result follows.

4. The SOS-Rank of SOS Tensor Decomposition

In this section, we study the SOS-rank of SOS tensor decomposition, which we formally define as follows. Let $A$ be a tensor with even order $m$ and dimension $n$, and suppose $A$ has an SOS tensor decomposition. As shown in Proposition 3.1, $SOS_{m,n} = SOS^h_{m,n}$, where $SOS_{m,n}$ is the SOS tensor cone and $SOS^h_{m,n}$ is the cone consisting of all $m$th-order $n$-dimensional symmetric tensors such that $f_A(x) := \langle A, x^m \rangle$ is a polynomial which can be written as a sum of squares of finitely many homogeneous polynomials. Thus, there exists $r \in \mathbb{N}$ such that the homogeneous polynomial $f_A(x) = Ax^m$ can be decomposed as
\[
f_A(x) = f_1^2(x) + f_2^2(x) + \cdots + f_r^2(x), \quad x \in \mathbb{R}^n,
\]
where $f_i(x)$, $i \in [r]$, are homogeneous polynomials with degree $\frac{m}{2}$. The minimum such $r$ is called the SOS-rank of $A$, and is denoted by $SOSrank(A)$. Let $C$ be a convex cone in the SOS tensor cone, that is, $C \subseteq SOS_{m,n}$. We define the SOS-width of the convex cone $C$ by
\[
SOS\text{-}width(C) = \sup\{SOSrank(A) : A \in C\}.
\]
Here, we do not consider the minimum of the SOS-rank over all possible tensors in the cone $C$, as it will always be zero. Recall that it was shown by Choi et al. in [4, Theorem 4.4] that an SOS homogeneous polynomial can be decomposed as a sum of at most $\Lambda$ squares of homogeneous polynomials, where
\[
(4.1) \qquad \Lambda = \left\lceil \frac{\sqrt{1+8a}-1}{2} \right\rceil \quad \text{and} \quad a = \binom{n+m-1}{m}.
\]
This immediately gives us the following.

Proposition 4.1. Let $A$ be a tensor with even order $m$ and dimension $n$, $m, n \in \mathbb{N}$. Suppose $A$ has an SOS tensor decomposition. Then, its SOS-rank satisfies $SOSrank(A) \le \Lambda$, where $\Lambda$ is given in (4.1).
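For concrete $(m, n)$, the bound (4.1) is easy to evaluate. The sketch below assumes the reading of (4.1) as the least integer $k$ with $k(k+1)/2 \ge a$, i.e. $\lceil(\sqrt{1+8a}-1)/2\rceil$:

```python
# Evaluate the Choi-Lam-Reznick style bound (4.1) on the SOS-rank.
import math

def sos_rank_bound(m, n):
    a = math.comb(n + m - 1, m)                  # number of degree-m monomials
    return math.ceil((math.sqrt(1 + 8 * a) - 1) / 2)

print(sos_rank_bound(2, 5))   # 5: in the matrix case the bound equals n
print(sos_rank_bound(4, 3))   # 5: a = C(6, 4) = 15, so Lambda = 5
print(sos_rank_bound(6, 3))   # 7: the bound grows roughly like n^(m/2)
```

The first call illustrates the remark following the proposition: for $m = 2$ one gets $a = n(n+1)/2$, hence $\Lambda = n$ exactly.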
In particular, $SOS\text{-}width(SOS_{m,n}) \le \Lambda$. In the matrix case, that is, $m = 2$, the upper bound $\Lambda$ equals the dimension $n$ of the symmetric tensor, and it is tight in this case. In general, however, the upper bound is of the order $n^{m/2}$ and need not be tight. Nevertheless, for a class of structured tensors with bounded exponent (BD-tensors) that have SOS decompositions, we show that their SOS-rank is less than or equal to the dimension $n$, which is significantly smaller than the upper bound in the above proposition. Moreover, in this case, the SOS-width of the associated BD-tensor cone can be determined explicitly. To do this, let us recall the definition of polynomials with bounded exponent and define the BD-tensors. Let $e \in \mathbb{N}$. Recall that $f$ is said to be a degree $m$ homogeneous polynomial on $\mathbb{R}^n$ with bounded exponent $e$ if
\[
f(x) = \sum_\alpha f_\alpha x^\alpha = \sum_\alpha f_\alpha x_1^{\alpha_1} \cdots x_n^{\alpha_n},
\]
where $0 \le \alpha_j \le e$ and $\sum_{j=1}^n \alpha_j = m$. We note that the degree 4 homogeneous polynomials on $\mathbb{R}^n$ with bounded exponent 2 are nothing but the biquadratic forms in dimension $n$.
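Membership in the bounded-exponent class is a purely combinatorial condition on the exponent tuples, so it can be checked directly. The helper below is our own illustration (the dict representation of a polynomial is an assumption of the sketch):

```python
# Check whether a polynomial, given as {exponent tuple: coefficient}, is a
# degree-m form in n variables with every individual exponent at most e.
def in_bd(poly, n, m, e):
    return all(len(alpha) == n and sum(alpha) == m and max(alpha) <= e
               for alpha in poly)

# A biquadratic-type example: degree 4, every exponent at most 2.
biquad = {(2, 2, 0): 1.0, (0, 2, 2): 1.0, (2, 0, 2): 1.0}
print(in_bd(biquad, n=3, m=4, e=2))             # True
print(in_bd({(4, 0, 0): 1.0}, n=3, m=4, e=2))   # False: exponent 4 exceeds e = 2
```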

Let us denote by $BD^e_{m,n}$ the set consisting of all degree $m$ homogeneous polynomials on $\mathbb{R}^n$ with bounded exponent $e$. An interesting result characterizing when a positive semi-definite (PSD) homogeneous polynomial with bounded exponent has an SOS tensor decomposition was established in [3] and can be stated as follows.

Lemma 4.2. Let $n \in \mathbb{N}$ with $n \ge 3$. Suppose $e, m$ are even numbers and $m \ge 4$.
(1) If $n \ge 4$, then $BD^e_{m,n} \cap PSD_{m,n} \subseteq SOS_{m,n}$ if and only if $m \le en - 2$;
(2) If $n = 3$, then $BD^e_{m,n} \cap PSD_{m,n} \subseteq SOS_{m,n}$ if and only if $m = 4$ or $m \le 3e - 4$.

Now, we say a symmetric tensor $A$ is a BD-tensor with order $m$, dimension $n$ and exponent $e$ if $f(x) = Ax^m$ is a degree $m$ homogeneous polynomial on $\mathbb{R}^n$ with bounded exponent $e$. We also define $BD^e_{m,n}$ to be the set consisting of all symmetric BD-tensors with order $m$, dimension $n$ and exponent $e$. It is clear that $BD^e_{m,n}$ is a convex cone.

Theorem 4.3. Let $n \in \mathbb{N}$ with $n \ge 3$. Suppose $e, m$ are even numbers and $m \ge 4$. Let $A$ be a BD-tensor with order $m$, dimension $n$ and exponent $e$. Suppose that $A$ has an SOS tensor decomposition. Then, we have $SOSrank(A) \le n$. Moreover, we have
\[
SOS\text{-}width(BD^e_{m,n} \cap SOS_{m,n}) = \begin{cases} 1 & \text{if } m = en, \\ n & \text{otherwise.} \end{cases}
\]

Proof. As $A$ is a BD-tensor and it has an SOS decomposition, the preceding lemma implies that either (i) $n \ge 4$ and $m \le en - 2$, (ii) $n = 3$ and $m = 4$, or (iii) $n = 3$ and $m \le 3e - 4$. We now divide the discussion into these three cases.

Suppose that Case (i) holds, i.e., $n \ge 4$ and $m \le en - 2$. From the construction, we have $m \le en$. If $m = en$, then $A$ has the form $a\, x_1^e \cdots x_n^e$. Here, $a \ge 0$ because $A$ has an SOS decomposition and $e$ is an even number. In this case, $SOSrank(A) = 1$. Now, let $m = en - 2$. Then,
\[
Ax^m = x_1^e \cdots x_n^e \sum_{(i,j) \in F} a_{ij}\, x_i^{-1} x_j^{-1}
\]
for some $a_{ij} \in \mathbb{R}$, $(i,j) \in F$, and some $F \subseteq \{1,\dots,n\} \times \{1,\dots,n\}$. As $e$ is an even number and $A$ has an SOS decomposition, we have $\sum_{(i,j) \in F} a_{ij}\, x_i^{-1} x_j^{-1} \ge 0$ for all $x_i \neq 0$ and $x_j \neq 0$.
Thus, by continuity, $Q(t_1,\dots,t_n) = \sum_{(i,j) \in F} a_{ij} t_i t_j$ is a positive semi-definite quadratic form, and so is a sum of at most $n$ squares of linear functions in $t_1,\dots,t_n$. Let $Q(t_1,\dots,t_n) = \sum_{k=1}^n [q_k(t_1,\dots,t_n)]^2$, where the $q_k$ are linear functions. Then,
\[
Ax^m = x_1^e \cdots x_n^e \sum_{k=1}^n \big[ q_k(x_1^{-1},\dots,x_n^{-1}) \big]^2 = \sum_{k=1}^n \Big( x_1^e \cdots x_n^e \big[ q_k(x_1^{-1},\dots,x_n^{-1}) \big]^2 \Big).
\]
Note that
\[
x_1^e \cdots x_n^e \big[ q_k(x_1^{-1},\dots,x_n^{-1}) \big]^2 = \Big[ x_1^{e/2} \cdots x_n^{e/2}\, q_k(x_1^{-1},\dots,x_n^{-1}) \Big]^2
\]
is the square of a polynomial. Thus, $SOSrank(A) \le n$ in this case.

Suppose that Case (ii) holds, i.e., $n = 3$ and $m = 4$. Then, by Hilbert's theorem [13], $SOSrank(A) \le 3 = n$.

Suppose that Case (iii) holds, i.e., $n = 3$ and $m \le 3e - 4$. In the cases $m = en - 2 = 3e - 2$ and $m = en = 3e$, the conclusion follows by a similar argument as in Case (i). The only remaining case is $m = 3e - 4$. In this case, as $A$ is a BD-tensor with order $m$, dimension 3 and exponent $e$, and $A$ has an SOS decomposition, we have
\[
Ax^m = x_1^e x_2^e x_3^e\, G(x_1^{-1}, x_2^{-1}, x_3^{-1}),
\]
where $G$ is a positive semi-definite form of dimension 3 and degree 4. It then follows from Hilbert's theorem [13] that $G(t_1,t_2,t_3)$ can be expressed as the sum of at most 3 squares of 3-dimensional quadratic forms. Thus, using a similar line of argument as in Case (i) and noting that $e \ge 4$ (as $m = 3e - 4$ and $m \ge 4$), we have $SOSrank(A) \le n = 3$.

Combining these three cases, we see that $SOSrank(A) \le n$, and $SOSrank(A) = 1$ if $m = en$. In particular, we have $SOS\text{-}width(BD^e_{m,n} \cap SOS_{m,n}) \le n$, and $SOS\text{-}width(BD^e_{m,n} \cap SOS_{m,n}) = 1$ if $m = en$. To see that these values are attained, we consider the homogeneous polynomial
\[
f_0(x) = \begin{cases} x_1^e \cdots x_n^e \sum_{i=1}^n x_i^{-2} & \text{if } n \ge 3 \text{ and } m = en - 2, \\ x_1^2 x_2^2 + x_2^2 x_3^2 + x_3^2 x_1^2 & \text{if } n = 3 \text{ and } m = 4, \\ x_1^e x_2^e x_3^e\, (x_1^{-2} x_2^{-2} + x_2^{-2} x_3^{-2} + x_3^{-2} x_1^{-2}) & \text{if } n = 3 \text{ and } m = 3e - 4, \end{cases}
\]
and its associated BD-tensor $A_0$ such that $f_0(x) = A_0 x^m$. It can be directly verified that
\[
SOSrank(A_0) = \begin{cases} n & \text{if } n \ge 3 \text{ and } m = en - 2, \\ 3 & \text{if } n = 3 \text{ and } m = 4, \\ 3 & \text{if } n = 3 \text{ and } m = 3e - 4. \end{cases}
\]
For example, in the case $n \ge 3$ and $m = en - 2$, to see $SOSrank(A_0) = n$, we only need to show $SOSrank(A_0) \ge n$. Suppose on the contrary that $SOSrank(A_0) \le n - 1$. Then, there exist $r \le n - 1$ and homogeneous polynomials $f_i$ with degree $m/2 = \frac{en}{2} - 1$ such that
\[
x_1^e \cdots x_n^e \sum_{i=1}^n x_i^{-2} = \sum_{i=1}^r f_i(x)^2.
\]
This implies that, for each $x = (x_1,\dots,x_n)$ with $x_i \neq 0$, $i = 1,\dots,n$,
\[
\sum_{i=1}^n x_i^{-2} = \sum_{i=1}^r \left[ \frac{f_i(x)}{x_1^{e/2} \cdots x_n^{e/2}} \right]^2.
\]
Letting $t_i = x_i^{-1}$, by continuity, we see that the quadratic form $\sum_{i=1}^n t_i^2$ can be written as a sum of at most $r$ squares of rational functions in $(t_1,\dots,t_n)$.
Then, the Cassels-Pfister theorem [8, Theorem 17.3] (see also [8, Corollary 17.6]) implies that the quadratic form $\sum_{i=1}^n t_i^2$ can be written as a sum of at most $r$ squares of polynomial functions in $(t_1,\dots,t_n)$, which is impossible. In the case $n = 3$ and $m = 4$, we only need to show $SOSrank(A_0) \ge 3$. Suppose on the contrary that $SOSrank(A_0) \le 2$. Then, there exist $a_i, b_i, c_i, d_i, e_i, f_i \in \mathbb{R}$, $i = 1, 2$, such that
\[
x_1^2 x_2^2 + x_2^2 x_3^2 + x_3^2 x_1^2 = (a_1 x_1^2 + b_1 x_2^2 + c_1 x_3^2 + d_1 x_1 x_2 + e_1 x_1 x_3 + f_1 x_2 x_3)^2 + (a_2 x_1^2 + b_2 x_2^2 + c_2 x_3^2 + d_2 x_1 x_2 + e_2 x_1 x_3 + f_2 x_2 x_3)^2.
\]

Comparing the coefficients gives us $a_1 = a_2 = b_1 = b_2 = c_1 = c_2 = 0$ and
\[
d_1^2 + d_2^2 = 1, \quad e_1^2 + e_2^2 = 1, \quad f_1^2 + f_2^2 = 1, \quad d_1 e_1 + d_2 e_2 = 0, \quad d_1 f_1 + d_2 f_2 = 0, \quad e_1 f_1 + e_2 f_2 = 0.
\]
From the last three equations, we see that one of $d_1, d_2, e_1, e_2, f_1, f_2$ must be zero. Let us assume, say, $d_1 = 0$. Then, the first equation shows $d_2 = \pm 1$, and hence $e_2 = 0$ (by the fourth equation). This implies that $e_1 = \pm 1$ and $f_2 = 0$. Again, we have $f_1 = \pm 1$, and hence $e_1 f_1 + e_2 f_2 = (\pm 1)(\pm 1) + 0 = \pm 1 \neq 0$. This leads to a contradiction. For the last case, suppose again by contradiction that $SOSrank(A_0) \le 2$. Then, there exist two homogeneous polynomials $f_i$ with degree $m/2 = \frac{3e}{2} - 2$ such that
\[
x_1^e x_2^e x_3^e\, (x_1^{-2} x_2^{-2} + x_2^{-2} x_3^{-2} + x_3^{-2} x_1^{-2}) = \sum_{i=1}^2 f_i(x)^2.
\]
This implies that, for each $x = (x_1, x_2, x_3)$ with $x_i \neq 0$, $i = 1, 2, 3$,
\[
x_1^{-2} x_2^{-2} + x_2^{-2} x_3^{-2} + x_3^{-2} x_1^{-2} = \sum_{i=1}^2 \left[ \frac{f_i(x)}{x_1^{e/2} x_2^{e/2} x_3^{e/2}} \right]^2.
\]
Letting $t_i = x_i^{-1}$ and using a similar line of argument as in the case $m = en - 2$, we see that the polynomial $t_1^2 t_2^2 + t_2^2 t_3^2 + t_3^2 t_1^2$ can be written as a sum of 2 squares of polynomials in $(t_1, t_2, t_3)$. This is impossible by the preceding case. Therefore, the conclusion follows.

Below, let us mention that calculating the exact SOS-rank of an SOS tensor decomposition is not a trivial task even for the identity tensor, and that it relates to open questions in algebraic geometry. To explain this, we recall that the identity tensor $I$ with order $m$ and dimension $n$ is given by $I_{i_1 \cdots i_m} = 1$ if $i_1 = \cdots = i_m$ and $I_{i_1 \cdots i_m} = 0$ otherwise. The identity tensor $I$ induces the polynomial $f_I(x) = I x^m = x_1^m + \cdots + x_n^m$. It is clear that $I$ has an SOS tensor decomposition when $m$ is even, and the corresponding SOS-rank of $I$ is less than or equal to $n$. It was conjectured by Reznick [39] that $f_I(x)$ cannot be written as a sum of $(n-1)$ squares, that is, $SOSrank(I) = n$.
The positive answer to this conjecture in the special case $m = n = 4$ was provided in [42, 43]. On the other hand, the answer in the general case is still open to the best of our knowledge. Moreover, this conjecture relates to another conjecture of Reznick [39] in the same paper, where he showed that the polynomial $f_R(x) = x_1^n + \cdots + x_n^n - n x_1 \cdots x_n$ can be written as a sum of $(n-1)$ squares whenever $n = 2^k$ for some $k \in \mathbb{N}$, and he conjectured that this estimate of the number of squares is sharp. Indeed, he also showed that this latter conjecture is true whenever the previous conjecture, that $f_I(x)$ cannot be written as a sum of $(n-1)$ squares, is true.
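For $n = 4 = 2^2$, a representation of $f_R$ by $n - 1 = 3$ squares can be written down explicitly. The identity used below is a standard algebraic one, included only as an illustration of Reznick's $n = 2^k$ result (it is not claimed to be the particular decomposition constructed in [39]):
\[
x_1^4 + x_2^4 + x_3^4 + x_4^4 - 4 x_1 x_2 x_3 x_4 = (x_1^2 - x_2^2)^2 + (x_3^2 - x_4^2)^2 + 2\,(x_1 x_2 - x_3 x_4)^2.
\]

```python
# Numerically verify the three-squares identity for f_R with n = 4.
def f_r(x1, x2, x3, x4):
    return x1**4 + x2**4 + x3**4 + x4**4 - 4 * x1 * x2 * x3 * x4

def three_squares(x1, x2, x3, x4):
    # 2*(x1*x2 - x3*x4)**2 is still a square of a real form: (sqrt(2)(x1x2 - x3x4))^2
    return ((x1**2 - x2**2)**2 + (x3**2 - x4**2)**2
            + 2 * (x1 * x2 - x3 * x4)**2)

pts = [(1, 2, 3, 4), (-1, 0.5, 2, -3), (0, 0, 0, 1), (1, 1, 1, 1)]
print(all(abs(f_r(*p) - three_squares(*p)) < 1e-9 for p in pts))   # True
```

Expanding the right-hand side confirms the identity symbolically: the cross terms $-2x_1^2x_2^2$ and $-2x_3^2x_4^2$ cancel against $2x_1^2x_2^2 + 2x_3^2x_4^2$, leaving exactly $-4x_1x_2x_3x_4$.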

5. Applications

In this section, we provide some applications of the SOS tensor decomposition of structured tensors, such as finding the minimum H-eigenvalue of an even order symmetric extended Z-tensor and testing the positive definiteness of a multivariate form. We also provide some numerical examples/experiments to support the theoretical findings. Throughout this section, all numerical experiments are performed on a desktop with 3.47 GHz quad-core Intel E5620 Xeon 64-bit CPUs and 4 GB RAM, equipped with Matlab 2015.

5.1. Finding the minimum H-eigenvalue of an even order symmetric extended Z-tensor. Finding the minimum eigenvalue of a tensor is an important topic in tensor computation and multilinear algebra, and has found numerous applications including automatic control and image processing [36]. Recently, it was shown that the minimum H-eigenvalue of an even order symmetric Z-tensor [15, 14] can be found by solving a sums-of-squares optimization problem, which can be equivalently reformulated as a semi-definite programming problem, and so can be solved efficiently. In [15], some upper and lower estimates for the minimum H-eigenvalue of general symmetric tensors with even order were provided via sums-of-squares programming problems; examples show that these estimates can be sharp in some cases. On the other hand, it was unknown in [14, 15] whether similar results continue to hold for some classes of symmetric tensors which are not Z-tensors, that is, for symmetric tensors with possibly positive off-diagonal entries. In this section, as an application of the derived SOS decompositions of structured tensors, we show that the class of even order symmetric extended Z-tensors is one such class. To present the conclusion, the following lemma plays an important role in our later analysis.

Lemma 5.1. [36] Let $A$ be a symmetric tensor with even order $m$ and dimension $n$.
Denote the minimum H-eigenvalue of $A$ by $\lambda_{\min}(A)$. Then, we have
\[
(5.1) \qquad \lambda_{\min}(A) = \min_{x \neq 0} \frac{Ax^m}{\|x\|_m^m} = \min_{\|x\|_m = 1} Ax^m,
\]
where $\|x\|_m = \left( \sum_{i=1}^n |x_i|^m \right)^{1/m}$.

Theorem 5.2. (Finding the minimum H-eigenvalue of an even order symmetric extended Z-tensor) Let $m$ be an even number and let $A$ be a symmetric extended Z-tensor with order $m$ and dimension $n$. Then, we have
\[
\lambda_{\min}(A) = \max_{\mu, r \in \mathbb{R}} \{\mu : f_A(x) - r(\|x\|_m^m - 1) - \mu \in \Sigma^2_m[x]\},
\]
where $f_A(x) = Ax^m$ and $\Sigma^2_m[x]$ is the set of all SOS polynomials with degree at most $m$.

Proof. Consider the following problem
\[
(P) \qquad \min\{Ax^m : \|x\|_m^m = 1\}
\]
and denote its global minimizer by $a = (a_1,\dots,a_n)^T \in \mathbb{R}^n$. Clearly, $\sum_{i=1}^n a_i^m = 1$. Then, $\lambda_{\min}(A) = f_A(a) = Aa^m$. It follows that, for all $x \in \mathbb{R}^n \setminus \{0\}$, $f_A(x) - \lambda_{\min}(A) \sum_{i=1}^n x_i^m = f_A(x) - f_A(a) \sum_{i=1}^n x_i^m$