SOCP Relaxation of Sensor Network Localization
SOCP Relaxation of Sensor Network Localization

Paul Tseng
Mathematics, University of Washington, Seattle
University of Vienna, June 19, 2006

Abstract: This is a talk given at the University of Vienna, 2006.
Talk Outline
- Problem description
- SDP and SOCP relaxations
- Properties of SDP and SOCP relaxations
- Performance of SOCP relaxation and efficient solution methods
- Conclusions & future directions
Sensor Network Localization

Basic problem: n points in R^d (d = 1, 2, 3).
Known: the last n − m points ("anchors") x_{m+1}, ..., x_n; the pairs of neighboring points, with a Euclidean distance estimate d_ij ≥ 0 for each (i, j) ∈ A, where A ⊆ {(i, j) : 1 ≤ i < j ≤ n}.
Estimate the first m points ("sensors").

History: graph realization, position estimation in wireless sensor networks, determining protein structures, ...
Optimization Problem Formulation

    υ_opt := min_{x_1,...,x_m} Σ_{(i,j)∈A} | ‖x_i − x_j‖² − d_ij² |

- The objective function is nonconvex.
- The problem is NP-hard (reduction from PARTITION).
- Use a convex (SDP, SOCP) relaxation.
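As a concrete check of this formulation, the objective can be evaluated directly in code. A minimal pure-Python sketch; the one-sensor, two-anchor instance below is hypothetical, chosen so that the true positions attain υ_opt = 0:

```python
import math

def snl_objective(sensors, anchors, dists):
    """Sum of | ||x_i - x_j||^2 - d_ij^2 | over the measured pairs.

    sensors: dict index -> point (tuple); anchors: dict index -> point;
    dists: dict (i, j) -> distance estimate d_ij.
    """
    pts = {**sensors, **anchors}
    total = 0.0
    for (i, j), d in dists.items():
        sq = sum((a - b) ** 2 for a, b in zip(pts[i], pts[j]))
        total += abs(sq - d ** 2)
    return total

# One sensor placed at its true position, two anchors, exact distances
sensors = {1: (0.0, math.sqrt(3))}
anchors = {2: (1.0, 0.0), 3: (-1.0, 0.0)}
dists = {(1, 2): 2.0, (1, 3): 2.0}
print(snl_objective(sensors, anchors, dists))  # ~0 at the true positions
```

Moving the sensor away from its true position makes the objective strictly positive, which is what the relaxations below try to approximate convexly.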
NP-hardness

PARTITION: given positive integers a_1, a_2, ..., a_n, is there a partition I_1, I_2 of {1,...,n} with Σ_{i∈I_1} a_i = Σ_{i∈I_2} a_i? (NP-complete)

Reduction to our problem with d = 1 (Saxe '79): let m = n − 1, x_n = 0, d_{n1} = a_1, d_{12} = a_2, ..., d_{n−1,n} = a_n. Then

    PARTITION is a yes-instance  ⟺  υ_opt = 0.

[This extends to d ≥ 2.]
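The reduction can be exercised directly: the 1-D chain through the anchor x_n = 0 has a zero-error realization exactly when the link lengths a_i admit signs summing to zero, i.e., exactly when PARTITION is a yes-instance. A brute-force sketch (the helper names are mine, not from the talk):

```python
from itertools import product

def partition_yes(a):
    """PARTITION instance a_1..a_n is a yes-instance iff some sign
    pattern s_i in {+1, -1} gives sum(s_i * a_i) == 0."""
    return any(sum(s * x for s, x in zip(signs, a)) == 0
               for signs in product((1, -1), repeat=len(a)))

def chain_realizable(a):
    """The chain x_n = 0, d_{n,1} = a_1, d_{1,2} = a_2, ..., d_{n-1,n} = a_n
    realizes with zero error on the line iff the signed walk
    0 -> +-a_1 -> +-a_2 -> ... returns to 0."""
    return partition_yes(a)

print(chain_realizable([1, 2, 3]))  # True: +1 + 2 - 3 = 0
print(chain_realizable([1, 2, 4]))  # False: total is odd, no signing sums to 0
```

This is of course exponential in n; the point of the reduction is that deciding υ_opt = 0 is as hard as PARTITION.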
SDP Relaxation

Let X := [x_1 ⋯ x_m], A := [x_{m+1} ⋯ x_n]. Then (Biswas & Ye '03)

    ‖x_i − x_j‖² = tr( b_ij b_ij^T [ X^T X  X^T ; X  I_d ] ),   with b_ij := [ I_m  0 ; 0  A ] (e_i − e_j).

Fact: [ Y  X^T ; X  I_d ] ⪰ 0 has rank d  ⟺  Y = X^T X.
Thus

    υ_opt = min_{X,Y} Σ_{(i,j)∈A} | tr( b_ij b_ij^T Z ) − d_ij² |
            s.t. Z = [ Y  X^T ; X  I_d ] ⪰ 0,  rank Z = d.

Drop the rank constraint:

    υ_sdp := min_{X,Y} Σ_{(i,j)∈A} | tr( b_ij b_ij^T Z ) − d_ij² |
             s.t. Z = [ Y  X^T ; X  I_d ] ⪰ 0.

Biswas and Ye gave a probabilistic interpretation of the SDP solution, and proposed a distributed (domain-partitioning) method for solving the SDP when n > 100.
SOCP Relaxation

A second-order cone program (SOCP) is easier to solve than an SDP.
Q: Is the SOCP relaxation a good approximation? Or a mixed SDP-SOCP relaxation?
Q: How can the SOCP relaxation be solved efficiently?
SOCP Relaxation

    υ_opt = min_{x_1,...,x_m, y_ij} Σ_{(i,j)∈A} | y_ij − d_ij² |
            s.t. y_ij = ‖x_i − x_j‖²  ∀ (i,j) ∈ A.

Relax each equality to an inequality:

    υ_socp = min_{x_1,...,x_m, y_ij} Σ_{(i,j)∈A} | y_ij − d_ij² |
             s.t. y_ij ≥ ‖x_i − x_j‖²  ∀ (i,j) ∈ A.

Each constraint y ≥ ‖x‖² is a second-order cone constraint: ‖(y − 1, 2x)‖ ≤ y + 1.
(Also Doherty, Pister & El Ghaoui '03.)
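The rewrite of y ≥ ‖x‖² as the cone constraint ‖(y − 1, 2x)‖ ≤ y + 1 follows from the identity (y + 1)² − (y − 1)² = 4y. A quick numeric sanity check of the equivalence (sampling away from the boundary to dodge floating-point ties):

```python
import math
import random

def quad_ineq(y, x):
    # y >= ||x||^2
    return y >= sum(t * t for t in x)

def soc_ineq(y, x):
    # ||(y - 1, 2x)|| <= y + 1
    return math.sqrt((y - 1) ** 2 + sum((2 * t) ** 2 for t in x)) <= y + 1

random.seed(0)
for _ in range(1000):
    y = random.uniform(-1.0, 5.0)
    x = [random.uniform(-2.0, 2.0) for _ in range(2)]
    if abs(y - sum(t * t for t in x)) > 1e-9:  # skip near-boundary ties
        assert quad_ineq(y, x) == soc_ineq(y, x)
print("y >= ||x||^2  <=>  ||(y-1, 2x)|| <= y+1 on all samples")
```

This representability is what makes each relaxed constraint expressible to a conic solver such as SeDuMi.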
Properties of SDP, SOCP Relaxations

Example: d = 2, n = 3, m = 1, anchors x_2 = (1, 0), x_3 = (−1, 0), d_12 = d_13 = 2.

Problem:

    0 = min_{x_1 = (α,β) ∈ R²} | (1 − α)² + β² − 4 | + | (−1 − α)² + β² − 4 |.
SDP relaxation:

    0 = min_{x_1 = (α,β) ∈ R², y ∈ R} | y − 2α − 3 | + | y + 2α − 3 |
        s.t. [ y  α  β ; α  1  0 ; β  0  1 ] ⪰ 0.

If the SDP is solved by an interior-point (IP) method, one likely obtains the analytic center.
SOCP relaxation:

    0 = min_{x_1 = (α,β) ∈ R², y, z ∈ R} | y − 4 | + | z − 4 |
        s.t. y ≥ (1 − α)² + β²,  z ≥ (−1 − α)² + β².

If the SOCP is solved by an IP method, one likely obtains the analytic center.
Properties of SDP & SOCP Relaxations

Fact 1: υ_socp ≤ υ_sdp. If υ_socp = υ_sdp, then {SOCP (x_1,...,x_m) solutions} ⊇ {SDP (x_1,...,x_m) solutions}.

Fact 2: If (x_1,...,x_m, y_ij) is the analytic-center solution of the SOCP, then

    x_i ∈ conv {x_j}_{j ∈ N(i)}  ∀ i ≤ m,   with N(i) := {j : (i,j) ∈ A}.
[Figures: optimal solution vs. SOCP solution found by an IP method (SeDuMi); m = 900, n = 1000, points are neighbors if their distance < .06.]
Fact 3: If X = [x_1 ⋯ x_m], Y is a relative-interior SDP solution (e.g., the analytic center), then for each i,

    ‖x_i‖² = Y_ii  ⟹  x_i appears in every SDP solution.

Fact 4: If (x_1,...,x_m, y_ij) is a relative-interior SOCP solution (e.g., the analytic center), then for each i,

    ‖x_i − x_j‖² = y_ij for some j ∈ N(i)  ⟹  x_i appears in every SOCP solution.
Error Bounds

What if the distances have errors? Suppose d̄_ij² = d_ij² + δ_ij, where δ_ij ∈ R and d_ij := ‖x_i^true − x_j^true‖ (with x_i^true = x_i for all i > m).

Fact 5: If (x_1,...,x_m, y_ij) is a relative-interior SOCP solution corresponding to (d̄_ij) and |δ_ij| ≤ δ̄ for all (i,j), then for each i,

    ‖x_i − x_j‖² = y_ij for some j ∈ N(i)  ⟹  ‖x_i − x_i^true‖ = O(√δ̄).

Fact 6: As δ_ij → 0, the analytic-center SOCP solution corresponding to (d̄_ij) converges to the analytic-center SOCP solution corresponding to (d_ij).
[Figure: error bounds for the analytic-center SOCP solution when distances have small errors.]
Solving the SOCP Relaxation I: IP Method

    min_{x_1,...,x_m, y_ij} Σ_{(i,j)∈A} | y_ij − d_ij² |
    s.t. y_ij ≥ ‖x_i − x_j‖²  ∀ (i,j) ∈ A.

Put into conic form:

    min Σ_{(i,j)∈A} (u_ij + v_ij)
    s.t. x_i − x_j − w_ij = 0        ∀ (i,j) ∈ A
         y_ij − u_ij + v_ij = d_ij²  ∀ (i,j) ∈ A
         α_ij = 1/2                  ∀ (i,j) ∈ A
         u_ij ≥ 0, v_ij ≥ 0, (α_ij, y_ij, w_ij) ∈ Rcone^{d+2}  ∀ (i,j) ∈ A

with Rcone^{d+2} := {(α, y, w) ∈ R × R × R^d : ‖w‖²/2 ≤ αy}.

Solve by an IP method, e.g., SeDuMi 1.05 (Sturm '01).
Solving the SOCP Relaxation II: Smoothing + Coordinate Gradient Descent

Since min_{y ≥ z} | y − d² | = max{0, z − d²}, the SOCP relaxation reduces to

    min_{x_1,...,x_m} Σ_{(i,j)∈A} max{0, ‖x_i − x_j‖² − d_ij²},

an unconstrained nonsmooth convex program.

Smooth approximation: max{0, t} ≈ µ h(t/µ) (µ > 0), where h is smooth convex with lim_{t→−∞} h(t) = 0 and lim_{t→∞} (h(t) − t) = 0. We use h(t) = ((t² + 4)^{1/2} + t)/2 (CHKS).
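The CHKS smoothing can be verified numerically: µh(t/µ) is a smooth upper bound on max{0, t}, and its uniform error is at most µ (the maximum gap, attained at t = 0, is µh(0) = µ). A small sketch:

```python
import math

def chks(t, mu):
    """Smooth approximation mu * h(t/mu) of max(0, t),
    with h(t) = (sqrt(t^2 + 4) + t) / 2 (CHKS)."""
    s = t / mu
    return mu * (math.sqrt(s * s + 4) + s) / 2

for t in (-1.0, -0.1, 0.0, 0.1, 1.0):
    exact = max(0.0, t)
    for mu in (1e-1, 1e-3, 1e-5):
        approx = chks(t, mu)
        assert approx >= exact       # smoothing over-estimates max{0, t}
        assert approx - exact <= mu  # uniform error at most mu
print("mu*h(t/mu) -> max{0, t} uniformly as mu -> 0")
```

This uniform O(µ) error is what justifies driving µ down gradually in the method that follows.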
Smoothed SOCP approximation:

    min f_µ(x_1,...,x_m) := Σ_{(i,j)∈A} µ h( (‖x_i − x_j‖² − d_ij²)/µ ).

Add a smoothed log-barrier term

    −µ Σ_{(i,j)∈A} log( µ h( (d_ij² − ‖x_i − x_j‖²)/µ ) ).

Solve the smooth approximation using coordinate gradient descent (SCGD): if ‖∇_{x_i} f_µ‖ = Ω(µ), update x_i by moving it along the Newton direction −[∇²_{x_i x_i} f_µ]^{-1} ∇_{x_i} f_µ with the Armijo stepsize rule, and re-iterate. Decrease µ when ‖∇_{x_i} f_µ‖ = O(µ) for all i. Here µ_init = 1e−5, µ_end = 1e−9, and µ is decreased by a factor of 10 each time. The code is in Fortran; the computation distributes easily.
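To illustrate the smoothed objective, here is plain fixed-step gradient descent on f_µ for a toy one-sensor instance (this omits the barrier term and the coordinate-wise Newton steps of SCGD; the instance, step size, and µ are mine). Note that f_µ only penalizes overshooting the measured distances, so the iterates settle inside the intersection of the distance balls rather than necessarily at the true position:

```python
import math

def fmu_and_grad(x, anchors, dists, mu):
    """f_mu = sum mu*h((||x - a||^2 - d^2)/mu), h(t) = (sqrt(t^2+4)+t)/2,
    and its gradient in the single sensor position x."""
    f, g = 0.0, [0.0, 0.0]
    for a, d in zip(anchors, dists):
        diff = (x[0] - a[0], x[1] - a[1])
        s = (diff[0] ** 2 + diff[1] ** 2 - d * d) / mu
        r = math.sqrt(s * s + 4)
        f += mu * (r + s) / 2
        hp = (s / r + 1) / 2  # h'(s)
        g[0] += hp * 2 * diff[0]
        g[1] += hp * 2 * diff[1]
    return f, g

anchors = [(1.0, 0.0), (-1.0, 0.0), (0.0, 2.0)]
truth = (0.3, 0.9)
dists = [math.dist(truth, a) for a in anchors]  # exact distances
x, mu, step = [0.0, 0.0], 1e-2, 2e-3
for _ in range(5000):
    _, g = fmu_and_grad(x, anchors, dists, mu)
    x = [x[0] - step * g[0], x[1] - step * g[1]]
f_final, _ = fmu_and_grad(x, anchors, dists, mu)
print(f_final)  # small: x ends up (approximately) inside all three balls
```

SCGD improves on this sketch by taking per-sensor Newton steps with Armijo line search and by shrinking µ in stages.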
Simulation Results

Uniformly generate x_1^true, ..., x_n^true in [0,1]²; m = 0.9n; two points are neighbors if their distance < radiorange. Set

    d_ij = ‖x_i^true − x_j^true‖ · max{0, 1 + ε_ij · nf},   ε_ij ~ N(0,1)

(Biswas & Ye '03). Solve the SOCP using SeDuMi 1.05 or SCGD.

Sensor i is uniquely positioned if y_ij − ‖x_i − x_j‖² ≤ 10⁻⁷ d_ij for some j ∈ N(i).
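This setup is easy to reproduce; a sketch of the instance generator in pure Python (0-based indices, first m points as sensors; the parameter values below are illustrative, and anchor-anchor pairs are skipped since both positions are known):

```python
import math
import random

def generate_instance(n, radiorange, nf, seed=0):
    """Random SNL instance in the unit square, following the setup above:
    m = 0.9n sensors, noise model d_ij = true_dist * max(0, 1 + eps*nf)."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    m = int(0.9 * n)
    edges = {}
    for i in range(n):
        for j in range(i + 1, n):
            true_d = math.dist(pts[i], pts[j])
            # keep the pair if within range and at least one endpoint is a sensor
            if true_d < radiorange and i < m:
                edges[(i, j)] = true_d * max(0.0, 1 + rng.gauss(0, 1) * nf)
    return pts, m, edges

pts, m, edges = generate_instance(n=100, radiorange=0.2, nf=0.001)
print(len(edges), "measured pairs;", m, "sensors,", 100 - m, "anchors")
```

The max{0, ·} in the noise model simply truncates any measurement that the multiplicative noise would drive negative.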
Table 1: radiorange = .06 for n = 1000, 2000 and .035 for n = 4000. Entries lost in transcription are shown as "…".

    n     m     nf   SeDuMi: cpu/m_up/Err_up   SCGD: cpu/m_up/Err_up
    1000  900   …    …/402/7.2e-4              0.4/365/3.7e-…
    1000  900   …    …/473/1.8e-3              3.3/451/1.5e-…
    1000  900   …    …/554/1.5e-2              2.2/518/1.1e-…
    2000  1800  …    …/1534/4.3e-4             1.3/1541/3.3e-…
    2000  1800  …    …/1464/3.6e-3             6.8/1466/3.6e-…
    2000  1800  …    …/1710/5.1e-2             3.7/1710/5.1e-…
    4000  3600  …    …/2851/4.0e-4             2.5/2864/3.2e-…
    4000  3600  …    …/2938/3.2e-…             …/2900/3.0e-…
    4000  3600  …    …/3073/1.0e-…             …/3033/9.1e-3

cpu times (sec) are on an HP DL360 workstation running Linux 3.5. m_up := number of uniquely positioned sensors. Err_up := max over uniquely positioned i of ‖x_i − x_i^true‖.
[Figures: true positions of sensors (dots) and anchors (circles) (m = 900, n = 1000); SOCP solutions found by SeDuMi and SCGD.]
Mixed SDP-SOCP Relaxation

Choose 0 ≤ l ≤ m. Let B := {(i,j) ∈ A : i ≤ l, j ≤ l}.

    min_{x_1,...,x_m, y_ij, Y} Σ_{(i,j)∈B} | tr( b_ij b_ij^T Z ) − d_ij² | + Σ_{(i,j)∈A\B} | y_ij − d_ij² |
    s.t. Z = [ Y  X^T ; X  I_d ] ⪰ 0,  X = [x_1 ⋯ x_l],
         y_ij ≥ ‖x_i − x_j‖²  ∀ (i,j) ∈ A \ B,

with b_ij := [ I_l  0 ; 0  A ] (e_i − e_j).

Easier to solve than the SDP? As good a relaxation? Properties?
Conclusions & Future Directions

- The SOCP relaxation may be a good pre-processor.
- Faster methods for solving the SOCP? Exploiting the network structure of the SOCP? (For d = 1, it is solvable by the ε-relaxation method (Bertsekas, Polymenakos & T. '97).)
- Error bound for the SDP relaxation?
- Additional (convex) constraints, e.g., upper bounds on ‖x_i − x_j‖?
- Other objective functions, e.g., replace the 2-norm by a p-norm (1 ≤ p ≤ ∞)? A p-order cone relaxation?
- Q: For any x_1, ..., x_n ∈ R^d, does

      arg min_x Σ_{i=1}^n ‖x − x_i‖_p^p ∈ conv {x_1, ..., x_n}   (1 < p < ∞)?

  A: Yes for d ≤ 2. No for d ≥ 3.
A Brief Overview of Practical Optimization Algorithms in the Context of Relaxation Zhouchen Lin Peking University April 22, 2018 Too Many Opt. Problems! Too Many Opt. Algorithms! Zero-th order algorithms:
More informationAn Extended Frank-Wolfe Method, with Application to Low-Rank Matrix Completion
An Extended Frank-Wolfe Method, with Application to Low-Rank Matrix Completion Robert M. Freund, MIT joint with Paul Grigas (UC Berkeley) and Rahul Mazumder (MIT) CDC, December 2016 1 Outline of Topics
More informationNon-convex optimization. Issam Laradji
Non-convex optimization Issam Laradji Strongly Convex Objective function f(x) x Strongly Convex Objective function Assumptions Gradient Lipschitz continuous f(x) Strongly convex x Strongly Convex Objective
More informationConvex Optimization M2
Convex Optimization M2 Lecture 8 A. d Aspremont. Convex Optimization M2. 1/57 Applications A. d Aspremont. Convex Optimization M2. 2/57 Outline Geometrical problems Approximation problems Combinatorial
More informationLECTURE 25: REVIEW/EPILOGUE LECTURE OUTLINE
LECTURE 25: REVIEW/EPILOGUE LECTURE OUTLINE CONVEX ANALYSIS AND DUALITY Basic concepts of convex analysis Basic concepts of convex optimization Geometric duality framework - MC/MC Constrained optimization
More informationFirst-order methods for structured nonsmooth optimization
First-order methods for structured nonsmooth optimization Sangwoon Yun Department of Mathematics Education Sungkyunkwan University Oct 19, 2016 Center for Mathematical Analysis & Computation, Yonsei University
More informationA Convex Approach for Designing Good Linear Embeddings. Chinmay Hegde
A Convex Approach for Designing Good Linear Embeddings Chinmay Hegde Redundancy in Images Image Acquisition Sensor Information content
More informationConvex Optimization and l 1 -minimization
Convex Optimization and l 1 -minimization Sangwoon Yun Computational Sciences Korea Institute for Advanced Study December 11, 2009 2009 NIMS Thematic Winter School Outline I. Convex Optimization II. l
More informationWhat s New in Active-Set Methods for Nonlinear Optimization?
What s New in Active-Set Methods for Nonlinear Optimization? Philip E. Gill Advances in Numerical Computation, Manchester University, July 5, 2011 A Workshop in Honor of Sven Hammarling UCSD Center for
More informationA Julia JuMP-based module for polynomial optimization with complex variables applied to ACOPF
JuMP-dev workshop 018 A Julia JuMP-based module for polynomial optimization with complex variables applied to ACOPF Gilles Bareilles, Manuel Ruiz, Julie Sliwak 1. MathProgComplex.jl: A toolbox for Polynomial
More informationGradient methods for minimizing composite functions
Math. Program., Ser. B 2013) 140:125 161 DOI 10.1007/s10107-012-0629-5 FULL LENGTH PAPER Gradient methods for minimizing composite functions Yu. Nesterov Received: 10 June 2010 / Accepted: 29 December
More information