Tutorial on Convex Optimization: Part II
1 Tutorial on Convex Optimization: Part II
Dr. Khaled Ardah, Communications Research Laboratory, TU Ilmenau, Dec. 18, 2018
2 Outline
- Convex Optimization Review
- Lagrangian Duality
- Applications
  - Optimal Power Allocation for Rate Maximization
  - Downlink Beamforming as SDP and SOCP
  - Uplink-Downlink Duality via Lagrangian Duality
- Disciplined Convex Programming and CVX
3 Convex Optimization Review
Mathematical optimization problem (P):
  min_x f_0(x)
  s.t. f_i(x) ≤ 0, i = 1, ..., m
       h_j(x) = 0, j = 1, ..., p
Variable x ∈ R^n, domain (feasible set) X = {x | f_i(x) ≤ 0 ∀i, h_j(x) = 0 ∀j}.
P is convex if: the objective f_0 and the set X are convex, and the h_j(x) are affine.
P remains convex if min is replaced by max, the inequality constraints become f_i(x) ≥ 0, and f_0 and all f_i are concave.
- Feasible solution x̃: x̃ ∈ X, with objective value f_0(x̃).
- Local optimal solution x̄: p̄ = f_0(x̄) ≤ f_0(x) for all x ∈ X with ‖x − x̄‖ ≤ ε.
- Global optimal solution x*: p* = f_0(x*) ≤ f_0(x) for all x ∈ X.
If P is convex, then every local optimal solution x̄ is also globally optimal.
4 Lagrangian Duality
Mathematical optimization problem (P), not necessarily convex:
  min_x f_0(x)
  s.t. f_i(x) ≤ 0, i = 1, ..., m
       h_j(x) = 0, j = 1, ..., p
Variable x ∈ R^n, domain X, optimal value p* = f_0(x*).
Lagrangian function (named after Joseph-Louis Lagrange):
  L(x, λ, ν) = f_0(x) + Σ_{i=1}^m λ_i f_i(x) + Σ_{j=1}^p ν_j h_j(x)
- L is a weighted sum of the objective and constraint functions
- λ_i is the Lagrange multiplier associated with f_i(x) ≤ 0
- ν_j is the Lagrange multiplier associated with h_j(x) = 0
- other names: weights, penalties, prices, ...
5 Lagrange dual function (problem)
Lagrange dual function:
  g(λ, ν) = min_x L(x, λ, ν) = min_x ( f_0(x) + Σ_{i=1}^m λ_i f_i(x) + Σ_{j=1}^p ν_j h_j(x) )
For each fixed x, L(x, λ, ν) is affine in (λ, ν); hence g(λ, ν), being a pointwise minimum of a family of affine functions, is always concave.
We say that (λ, ν) is dual feasible if λ ≥ 0 and g(λ, ν) is finite.
Lower-bound property: g(λ, ν) ≤ f_0(x̃) for every feasible x̃. Proof: if λ ≥ 0 and x̃ is feasible, then L(x̃, λ, ν) = f_0(x̃) + Σ_i λ_i f_i(x̃) + Σ_j ν_j h_j(x̃) ≤ f_0(x̃), since f_i(x̃) ≤ 0 and h_j(x̃) = 0; and g(λ, ν) ≤ L(x̃, λ, ν) by definition.
This means: for any dual feasible vector (λ, ν), the dual function serves as a lower bound on the optimal value p*.
6 Lagrange dual function (problem)
Dual problem (D):
  max_{(λ,ν)} g(λ, ν)
  s.t. λ ≥ 0
Variables: the vector (λ, ν); optimal value d* = g(λ*, ν*).
The dual problem D is always convex, regardless of the convexity of the original (primal) problem P.
Duality gap: e = p* − d*.
In general e ≥ 0 (weak duality), i.e., there may be a gap between the primal and dual optimal values.
If the primal problem P is convex and a constraint qualification (e.g., Slater's condition) holds, then strong duality holds and e = 0.
7 Optimality Conditions
Necessary conditions for x* to be a (local) optimal solution to the primal problem P: there exist (λ*, ν*) such that
- Primal feasibility: f_i(x*) ≤ 0, i = 1, ..., m; h_j(x*) = 0, j = 1, ..., p
- Dual feasibility: λ* ≥ 0
- Complementary slackness: λ_i* f_i(x*) = 0, i = 1, ..., m
- First-order optimality (stationarity):
  ∇_x L(x*, λ*, ν*) = ∇_x f_0(x*) + Σ_{i=1}^m λ_i* ∇_x f_i(x*) + Σ_{j=1}^p ν_j* ∇_x h_j(x*) = 0
8 Optimality Conditions
The optimality conditions above are called the Karush-Kuhn-Tucker (KKT) conditions.
In general, the KKT conditions are necessary but not sufficient.
If the problem is convex, the KKT conditions are also sufficient.
Remark:
- For an unconstrained optimization problem, the KKT conditions reduce to the first-order optimality condition ∇_x f_0(x*) = 0 alone; a local optimum must be attained at a stationary point.
- For constrained optimization problems, a (local) optimum is no longer necessarily attained at a stationary point of f_0; instead, it is attained at a KKT point.
9 Example
Solve the following problem:
  min x² + y² + 2z²
  s.t. 2x + 2y − 4z ≥ 8
10 Example
Solve the following problem:
  min x² + y² + 2z²
  s.t. 2x + 2y − 4z ≥ 8
The Lagrangian: L(x, y, z, λ) = x² + y² + 2z² + λ(8 − 2x − 2y + 4z)
Dual function: g(λ) = min_{x,y,z} L(x, y, z, λ). Setting the partial derivatives to zero:
  ∂L/∂x = 2x − 2λ = 0  →  x = λ
  ∂L/∂y = 2y − 2λ = 0  →  y = λ
  ∂L/∂z = 4z + 4λ = 0  →  z = −λ
Substituting x = y = λ, z = −λ into 2x + 2y − 4z = 8 gives 8λ = 8, so λ = 1 and (x, y, z) = (1, 1, −1).
What is the dual problem? Are the optimality conditions satisfied?
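As a quick sanity check (a minimal numpy sketch, not part of the original slides), the KKT point derived in this example can be verified numerically, together with strong duality:

```python
import numpy as np

# Primal: min x^2 + y^2 + 2 z^2  s.t.  2x + 2y - 4z >= 8
# KKT point derived above: x = y = lam, z = -lam, with lam = 1
lam = 1.0
x, y, z = lam, lam, -lam

# Stationarity: the gradient of the Lagrangian vanishes
grad_L = np.array([2 * x - 2 * lam, 2 * y - 2 * lam, 4 * z + 4 * lam])
assert np.allclose(grad_L, 0.0)

# Primal feasibility (the constraint is active) and dual feasibility
assert np.isclose(2 * x + 2 * y - 4 * z, 8.0) and lam >= 0
# Complementary slackness
assert np.isclose(lam * (8 - 2 * x - 2 * y + 4 * z), 0.0)

# Strong duality: g(lam) = 8*lam - 4*lam^2 equals the primal value at lam = 1
p_val = x**2 + y**2 + 2 * z**2
assert np.isclose(8 * lam - 4 * lam**2, p_val)   # both equal 4
```

Here g(λ) = 8λ − 4λ² is obtained by substituting x = y = λ, z = −λ back into the Lagrangian; maximizing it over λ ≥ 0 gives λ = 1 and d* = p* = 4.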
11 Example
Least-norm solution of linear equations:
  min_x x^T x
  s.t. Ax = b
Recall the optimal solution is x* = A† b = A^T (A A^T)^{-1} b. If A is very large, this closed-form solution cannot be used.
Lagrangian function: L(x, λ) = x^T x + λ^T (Ax − b)
Dual function: g(λ) = min_x L(x, λ). Setting ∇_x L(x, λ) = 2x + A^T λ = 0 gives x = −(1/2) A^T λ, which minimizes L(x, λ). Hence
  g(λ) = L(−(1/2) A^T λ, λ) = −(1/4) λ^T A A^T λ − b^T λ,
a concave function of λ.
Lower-bound property: p* ≥ −(1/4) λ^T A A^T λ − b^T λ for all λ.
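The lower-bound property on this slide can be checked numerically. The following small numpy sketch (illustrative, not from the slides) builds a random underdetermined system, evaluates the dual function at random multipliers, and confirms both weak duality and that the maximizing multiplier attains p*:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 20                       # wide A: underdetermined system Ax = b
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Least-norm solution x* = A^T (A A^T)^{-1} b
x_star = A.T @ np.linalg.solve(A @ A.T, b)
assert np.allclose(A @ x_star, b)            # feasible
p_star = x_star @ x_star

# Dual function g(lambda) = -(1/4) lam^T A A^T lam - b^T lam
def g(lam):
    return -0.25 * lam @ (A @ A.T) @ lam - b @ lam

# Weak duality: g(lam) <= p* for every lam
for _ in range(100):
    lam = rng.standard_normal(m)
    assert g(lam) <= p_star + 1e-9

# Strong duality: maximizing g gives lam* = -2 (A A^T)^{-1} b, and g(lam*) = p*
lam_star = -2.0 * np.linalg.solve(A @ A.T, b)
assert np.isclose(g(lam_star), p_star)
```

Setting ∇g(λ) = −(1/2) A A^T λ − b = 0 gives the maximizer λ* used above.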
12 Example
Standard-form LP:
  min_x c^T x
  s.t. Ax = b, x ≥ 0
Lagrangian function (multiplier ν for Ax = b, multiplier λ ≥ 0 for −x ≤ 0):
  L(x, λ, ν) = c^T x + ν^T (Ax − b) − λ^T x = −b^T ν + (c + A^T ν − λ)^T x
which is affine in x.
Dual function:
  g(λ, ν) = min_x L(x, λ, ν) = { −b^T ν  if c + A^T ν − λ = 0;  −∞ otherwise }
which is linear on the affine domain {(λ, ν) | c + A^T ν − λ = 0}.
Lower-bound property (eliminating λ ≥ 0): p* ≥ −b^T ν whenever c + A^T ν ≥ 0.
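The lower-bound certificate can be demonstrated directly: for any feasible x and any ν with c + A^T ν ≥ 0, one has c^T x = (c + A^T ν)^T x − ν^T Ax ≥ −b^T ν, since both factors in the first term are nonnegative. A tiny numpy sketch (illustrative; the data is constructed so that a dual-feasible ν exists by design):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 6
A = rng.standard_normal((m, n))
x0 = rng.uniform(0.1, 1.0, n)     # a strictly feasible point: x0 > 0
b = A @ x0                        # so that A x0 = b by construction

# Construct c so that nu is dual feasible by design:
# with s >= 0 and c = s - A^T nu, we get c + A^T nu = s >= 0.
nu = rng.standard_normal(m)
s = rng.uniform(0.0, 1.0, n)
c = s - A.T @ nu

assert np.all(c + A.T @ nu >= -1e-12)        # nu is dual feasible
# Weak duality at the feasible point x0: c^T x0 >= -b^T nu
assert c @ x0 >= -b @ nu - 1e-9
```

The same bound holds at every feasible x, so in particular p* ≥ −b^T ν.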
13 Optimal Power Allocation for Rate Maximization
14 Optimal Power Allocation for Rate Maximization
Assume we have n channels, where the i-th channel gain is α_i.
For each channel i, the transmit power is p_i.
The SNR of channel i (where σ is the noise power) is Γ_i = α_i p_i / σ.
The rate of channel i is then r_i = log(1 + Γ_i).
Our problem: find the power allocation vector p = [p_1, ..., p_n]^T that maximizes the sum rate subject to a maximum power constraint, i.e.,
  max_p Σ_{i=1}^n r_i = Σ_{i=1}^n log(1 + α_i p_i / σ)
  s.t. Σ_{i=1}^n p_i = p_max
       p_i ≥ 0, ∀i
15 Optimal Power Allocation for Rate Maximization
Lagrangian (multiplier µ for the total-power constraint, λ_i ≥ 0 for p_i ≥ 0):
  L(p, λ, µ) = Σ_{i=1}^n log(1 + α_i p_i / σ) − µ(Σ_{i=1}^n p_i − p_max) + Σ_{i=1}^n λ_i p_i
Taking the gradient w.r.t. p_i:
  ∇_{p_i} L(p, λ, µ) = α_i / (σ + α_i p_i) − µ + λ_i = 0
Thus, we have µ = α_i / (σ + α_i p_i) + λ_i.
From the complementary slackness condition, λ_i p_i = 0.
Case 1: λ_i = 0 and p_i > 0. Then
  µ = α_i / (σ + α_i p_i)  →  p_i = 1/µ − σ/α_i,  valid where 1/µ ≥ σ/α_i
Case 2: p_i = 0 and λ_i > 0. Then
  µ = α_i/σ + λ_i  →  λ_i = µ − α_i/σ > 0  →  µ > α_i/σ  →  1/µ < σ/α_i
16 Optimal Power Allocation for Rate Maximization
From the above, the optimal power allocation is
  p_i = max{1/µ − σ/α_i, 0} = [1/µ − σ/α_i]^+
where µ is chosen such that
  Σ_{i=1}^n [1/µ − σ/α_i]^+ = p_max
Remark: if α_i increases, σ/α_i decreases, so p_i increases. This is the classical water-filling solution: 1/µ acts as a common water level, and each channel is filled up to that level above its noise floor σ/α_i. Can we draw a diagram illustrating this relation?
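Since Σ_i [1/µ − σ/α_i]^+ is monotone in the water level 1/µ, the level can be found by bisection. A short numpy sketch (illustrative; the channel gains are made up for the example):

```python
import numpy as np

def waterfilling(alpha, sigma, p_max, tol=1e-12):
    """Find the water level 1/mu such that sum_i [1/mu - sigma/alpha_i]^+ = p_max."""
    floors = sigma / alpha                       # per-channel noise floors sigma/alpha_i
    def total_power(level):
        return np.sum(np.maximum(level - floors, 0.0))
    # total_power is 0 at level = min(floors) and >= p_max at min(floors) + p_max
    lo, hi = floors.min(), floors.min() + p_max
    while hi - lo > tol:                         # bisection on the water level
        mid = 0.5 * (lo + hi)
        if total_power(mid) < p_max:
            lo = mid
        else:
            hi = mid
    return np.maximum(0.5 * (lo + hi) - floors, 0.0)

alpha = np.array([2.0, 1.0, 0.5, 0.1])           # channel gains
p = waterfilling(alpha, sigma=1.0, p_max=4.0)    # -> [2.0, 1.5, 0.5, 0.0]
assert np.isclose(p.sum(), 4.0)                  # total power constraint met
assert np.all(p >= 0)
assert p[0] >= p[1] >= p[2] >= p[3]              # stronger channels get more power
```

Note that the weakest channel (α = 0.1, noise floor 10) lies above the water level 1/µ = 2.5 and is allocated zero power.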
17 Downlink Beamforming as SDP and SOCP
18 Downlink Beamforming as SDP and SOCP
A wireless network consisting of one Tx and K Rxs; the Tx has N antennas, while each Rx has one antenna.
Problem: minimize the transmit power subject to SINR targets.
First, the received signal at the k-th Rx is
  y_k = Σ_{j=1}^K h_k^H w_j s_j + n_k = h_k^H w_k s_k (desired signal) + Σ_{j≠k} h_k^H w_j s_j (interference) + n_k (noise)
- y_k ∈ C is the received signal
- h_j ∈ C^N is the channel between the Tx and the j-th Rx
- w_j ∈ C^N is the transmit beamforming vector for the j-th Rx
- E[|s_k|²] = 1, while E[s_k s_j*] = 0 for j ≠ k
Thus, the SINR at Rx k is given as
  Γ_k = |h_k^H w_k|² / (Σ_{j≠k} |h_k^H w_j|² + σ)
19 Downlink Beamforming as SDP and SOCP
The SINR at Rx k is given as
  Γ_k = |h_k^H w_k|² / (Σ_{j≠k} |h_k^H w_j|² + σ)
The QoS constraints require that Γ_k ≥ γ_k for all k.
Mathematical optimization problem (nonconvex):
  min_{w_k ∀k} Σ_k ‖w_k‖²
  s.t. |h_k^H w_k|² / (Σ_{j≠k} |h_k^H w_j|² + σ) ≥ γ_k, ∀k
Note that the transmit power for user k is represented by ‖w_k‖² = p_k.
In other problems, you may want to design the beamforming direction and the beamforming power independently: w_k = √p_k w̄_k with ‖w̄_k‖ = 1.
20 Downlink Beamforming as SDP and SOCP
Solve the above problem using the relaxed SDP. Note that
  ‖w_k‖² = w_k^H w_k = Tr(w_k w_k^H) = Tr(W_k), where W_k = w_k w_k^H ∈ C^{N×N}
  |h_k^H w_k|² = (h_k^H w_k)^H (h_k^H w_k) = w_k^H h_k h_k^H w_k = Tr(w_k w_k^H h_k h_k^H) = Tr(W_k H_k), where H_k = h_k h_k^H ∈ C^{N×N}
W_k and H_k are both rank-one matrices.
Rearrange the SINR constraints as
  |h_k^H w_k|² ≥ γ_k (Σ_{j≠k} |h_k^H w_j|² + σ)
and modify them using the above identities:
  Tr(W_k H_k) ≥ γ_k (Σ_{j≠k} Tr(H_k W_j) + σ)
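The two trace identities used in this lifting step are easy to verify numerically. A small numpy sketch (illustrative, with random complex vectors):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 4
w = rng.standard_normal(N) + 1j * rng.standard_normal(N)
h = rng.standard_normal(N) + 1j * rng.standard_normal(N)

W = np.outer(w, w.conj())          # W = w w^H, rank one
H = np.outer(h, h.conj())          # H = h h^H, rank one

# ||w||^2 = Tr(W)
assert np.isclose(np.linalg.norm(w) ** 2, np.trace(W).real)
# |h^H w|^2 = Tr(W H)
assert np.isclose(np.abs(h.conj() @ w) ** 2, np.trace(W @ H).real)
# both lifted matrices are rank one
assert np.linalg.matrix_rank(W) == 1 and np.linalg.matrix_rank(H) == 1
```

This is exactly why dropping the rank-one constraint is the only relaxation needed: all other quantities in the problem are already linear in the lifted variables W_k.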
21 Downlink Beamforming as SDP and SOCP
The original problem can be written as (nonconvex):
  min_{W_k ∀k} Σ_k Tr(W_k)
  s.t. Tr(W_k H_k) ≥ γ_k (Σ_{j≠k} Tr(H_k W_j) + σ), ∀k
       W_k ⪰ 0, rank(W_k) = 1, ∀k
The above problem is still nonconvex, due to the rank-one constraints.
Ignoring the rank-one constraints, the problem becomes a relaxed SDP, which is convex.
22 Downlink Beamforming as SDP and SOCP
This is based on the observation that an arbitrary phase rotation can be applied to each beamforming vector without affecting the SINR functions. Thus, h_k^H w_k can be chosen to be real without loss of generality.
Let W = [w_1, ..., w_K] ∈ C^{N×K}. Adding |h_k^H w_k|² to both sides of the rearranged SINR constraint, it becomes
  (1 + 1/γ_k) |h_k^H w_k|² ≥ Σ_{j=1}^K |h_k^H w_j|² + σ = ‖[h_k^H W, √σ]‖²
Because h_k^H w_k can be assumed real (and nonnegative), we can take the square root:
  √(1 + 1/γ_k) h_k^H w_k ≥ ‖[h_k^H W, √σ]‖
which is a second-order cone constraint. The original problem can thus be written as an SOCP (convex):
  min_{w_k ∀k} Σ_k ‖w_k‖²
  s.t. √(1 + 1/γ_k) h_k^H w_k ≥ ‖[h_k^H W, √σ]‖, ∀k
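The algebra behind this reformulation can be checked numerically: for any beamformers, the SINR constraint and the second-order cone constraint accept or reject exactly the same vectors. A numpy sketch (illustrative, with random channels and a common target γ):

```python
import numpy as np

rng = np.random.default_rng(3)
N, K = 4, 3
sigma, gamma = 1.0, 2.0
H = rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))  # row k is h_k^H
W = rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))  # column j is w_j

checks = []
for k in range(K):
    hk = H[k]
    # rotate the phase of w_k so that h_k^H w_k is real and nonnegative;
    # this leaves every |h_k^H w_j| (and hence every SINR) unchanged
    W[:, k] *= np.exp(-1j * np.angle(hk @ W[:, k]))
    gains = np.abs(hk @ W) ** 2                     # |h_k^H w_j|^2 for all j
    sinr_ok = gains[k] >= gamma * (gains.sum() - gains[k] + sigma)
    soc_lhs = np.sqrt(1 + 1 / gamma) * (hk @ W[:, k]).real
    soc_rhs = np.linalg.norm(np.append(hk @ W, np.sqrt(sigma)))
    checks.append(sinr_ok == (soc_lhs >= soc_rhs))
assert all(checks)   # both formulations agree for every user
```

The phase-rotation step mirrors the "without loss of generality" argument above: it makes h_k^H w_k real nonnegative so that the square root of the squared inequality is valid.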
23 Uplink-Downlink Duality via Lagrangian Duality
24 Uplink-Downlink Duality via Lagrangian Duality
In engineering design, we are interested not only in the numerical solution of a problem, but also in the structure of its optimal solution. The Lagrangian dual of the original problem often reveals such structure.
Uplink-downlink duality refers to the fact that the total transmit power required to satisfy a given set of SINR constraints in the downlink is equal to the total transmit power required to satisfy the same SINR constraints in the uplink:
  Σ_{k=1}^K p_k = Σ_{k=1}^K q_k
where p_k is the downlink power and q_k is the uplink power.
Note that p_k does not have to equal q_k individually. It is the sums of the powers that are equal!
25 Uplink-Downlink Duality via Lagrangian Duality
The original (downlink) optimization problem is
  min_{w_k ∀k} Σ_k ‖w_k‖² = Σ_k w_k^H w_k
  s.t. |h_k^H w_k|² / (Σ_{j≠k} |h_k^H w_j|² + σ) ≥ γ_k, ∀k
Lagrangian function:
  L(w, λ) = Σ_k w_k^H w_k − Σ_k λ_k ( (1/γ_k) |h_k^H w_k|² − Σ_{j≠k} |h_k^H w_j|² − σ )
          = Σ_k λ_k σ + Σ_k w_k^H [ I_N + Σ_{j≠k} λ_j h_j h_j^H − (λ_k/γ_k) h_k h_k^H ] w_k
Note the matrix I_N + Σ_{j≠k} λ_j h_j h_j^H − (λ_k/γ_k) h_k h_k^H: for the dual function min_w L(w, λ) to be bounded below, this matrix must be positive semidefinite for every k.
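The second line of the Lagrangian follows by collecting, for each w_k, the quadratic terms contributed by its own constraint and by the interference terms of all other users' constraints. This rearrangement can be verified numerically; a numpy sketch (illustrative, with random data):

```python
import numpy as np

rng = np.random.default_rng(4)
N, K = 4, 3
sigma = 1.0
gamma = np.array([1.0, 2.0, 0.5])
lam = rng.uniform(0.1, 1.0, K)
h = rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))   # h[k] = h_k
w = rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))   # w[k] = w_k

# Form 1: objective minus weighted SINR constraints
gains = np.abs(h.conj() @ w.T) ** 2          # gains[k, j] = |h_k^H w_j|^2
form1 = sum(w[k].conj() @ w[k] for k in range(K)).real
for k in range(K):
    interf = gains[k].sum() - gains[k, k]
    form1 -= lam[k] * (gains[k, k] / gamma[k] - interf - sigma)

# Form 2: sum_k lam_k * sigma + sum_k w_k^H B_k w_k, with
# B_k = I + sum_{j != k} lam_j h_j h_j^H - (lam_k / gamma_k) h_k h_k^H
A = sum(lam[j] * np.outer(h[j], h[j].conj()) for j in range(K))
form2 = lam.sum() * sigma
for k in range(K):
    B_k = (np.eye(N) + A - lam[k] * np.outer(h[k], h[k].conj())
           - (lam[k] / gamma[k]) * np.outer(h[k], h[k].conj()))
    form2 += (w[k].conj() @ B_k @ w[k]).real

assert np.isclose(form1, form2)              # both forms of L(w, lam) agree
```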
26 Uplink-Downlink Duality via Lagrangian Duality
The dual optimization problem:
  max_{λ_k ∀k} Σ_k λ_k σ
  s.t. Σ_j λ_j h_j h_j^H + I_N ⪰ (1 + 1/γ_k) λ_k h_k h_k^H, ∀k
27 Uplink-Downlink Duality via Lagrangian Duality
Let us now consider the uplink problem. The received uplink signal at the Rx, after receive beamforming for Tx k, is
  y_k = Σ_{j=1}^K √q_j w_k^H h_j s_j + w_k^H n_k = √q_k w_k^H h_k s_k (desired signal) + Σ_{j≠k} √q_j w_k^H h_j s_j (interference) + w_k^H n_k (noise)
The SINR of Tx k in the uplink is given as
  Γ_k = q_k |h_k^H w_k|² / (Σ_{j≠k} q_j |h_j^H w_k|² + w_k^H w_k σ)
The mathematical optimization problem:
  min_{q_k ∀k} Σ_k q_k
  s.t. q_k |h_k^H w_k|² / (Σ_{j≠k} q_j |h_j^H w_k|² + w_k^H w_k σ) ≥ γ_k, ∀k
28 Uplink-Downlink Duality via Lagrangian Duality
The optimal beamforming direction w_k is given by the MMSE receiver as
  w_k = ρ (Σ_j q_j h_j h_j^H + σ I)^{-1} h_k
where ρ is a normalization parameter such that ‖w_k‖ = 1.
After substituting this w_k into the SINR constraints and rearranging, we have
  min_{q_k ∀k} Σ_k q_k
  s.t. Σ_j q_j h_j h_j^H + σ I_N ⪰ (1 + 1/γ_k) q_k h_k h_k^H, ∀k
29 Uplink-Downlink Duality via Lagrangian Duality
The dual optimization problem of the downlink problem:
  max_{λ_k ∀k} Σ_k λ_k σ
  s.t. Σ_j λ_j h_j h_j^H + I_N ⪰ (1 + 1/γ_k) λ_k h_k h_k^H, ∀k
The uplink optimization problem:
  min_{q_k ∀k} Σ_k q_k
  s.t. Σ_j q_j h_j h_j^H + σ I_N ⪰ (1 + 1/γ_k) q_k h_k h_k^H, ∀k
Note that q_k = λ_k σ: scaling the dual constraints by σ makes the two feasible sets identical. Thus, the two problems are identical, except that the roles of maximization and minimization are reversed; at the optimum both attain the same total power, which is the uplink-downlink duality.
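The duality Σ p_k = Σ q_k can be demonstrated end to end. The numpy sketch below (illustrative, not from the slides) solves the uplink problem by the classical fixed-point power-control iteration with normalized MMSE receive beamformers, reuses the resulting directions as downlink transmit beamformers, solves the linear system that meets the downlink SINR targets with equality, and checks that the total powers coincide:

```python
import numpy as np

rng = np.random.default_rng(5)
N, K = 4, 3                                  # N Tx antennas, K single-antenna users
sigma = 1.0
gamma = np.array([1.0, 1.5, 0.8])            # per-user SINR targets
h = rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))   # h[k] = h_k

# Uplink: fixed-point power control with normalized MMSE receive beamformers
q = np.ones(K)
for _ in range(1000):
    C = sigma * np.eye(N) + sum(q[j] * np.outer(h[j], h[j].conj()) for j in range(K))
    w = np.linalg.solve(C, h.T)              # column k: MMSE direction for user k
    w /= np.linalg.norm(w, axis=0)           # enforce ||w_k|| = 1
    g = np.abs(h.conj() @ w) ** 2            # g[k, j] = |h_k^H w_j|^2
    for k in range(K):
        interf = sum(q[j] * g[j, k] for j in range(K) if j != k)
        q[k] = gamma[k] * (interf + sigma) / g[k, k]   # meet SINR with equality

# Downlink: reuse the directions w_k; the powers solve the linear system
#   p_k |h_k^H w_k|^2 / gamma_k - sum_{j != k} p_j |h_k^H w_j|^2 = sigma
M = -g.copy()
np.fill_diagonal(M, g.diagonal() / gamma)
p = np.linalg.solve(M, sigma * np.ones(K))

assert np.all(p > 0) and np.all(q > 0)
assert np.isclose(p.sum(), q.sum(), rtol=1e-6)   # total powers coincide
```

Per-user powers p_k and q_k generally differ; only the sums agree, exactly as stated on slide 24.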
30 Disciplined convex programming and CVX
31 Disciplined convex programming and CVX
- LP solvers: lots available (GLPK, Excel, Matlab's linprog, ...)
- Cone solvers: typically handle (combinations of) LP, SOCP, and SDP cones; several available (SDPT3, SeDuMi, CSDP, ...)
- General convex solvers: some available (CVXOPT, MOSEK, ...)
- You could write your own, or use some tricks to transform the problem into an equivalent one that has a standard form (e.g., LP, SDP)
- Modeling systems can partly automate this transformation step
32 CVX
- runs in Matlab, between the cvx_begin and cvx_end commands
- relies on the SDPT3 or SeDuMi (LP/SOCP/SDP) solvers
- refer to the user guide and online help for more info
- the CVX example library has more than a hundred examples
33 Example: Constrained norm minimization
- between cvx_begin and cvx_end, x is a CVX variable
- the statement subject to does nothing, but can be added for readability
- inequalities are treated elementwise
34 What CVX does
After cvx_end, CVX
- transforms the problem into an LP
- calls the solver SDPT3
- overwrites the (object) x with the (numeric) optimal value
- assigns the problem's optimal value to cvx_optval
- assigns the problem status (which here is Solved) to cvx_status
35 Some useful functions
More information5 Handling Constraints
5 Handling Constraints Engineering design optimization problems are very rarely unconstrained. Moreover, the constraints that appear in these problems are typically nonlinear. This motivates our interest
More informationDuality in Linear Programs. Lecturer: Ryan Tibshirani Convex Optimization /36-725
Duality in Linear Programs Lecturer: Ryan Tibshirani Convex Optimization 10-725/36-725 1 Last time: proximal gradient descent Consider the problem x g(x) + h(x) with g, h convex, g differentiable, and
More information1. f(β) 0 (that is, β is a feasible point for the constraints)
xvi 2. The lasso for linear models 2.10 Bibliographic notes Appendix Convex optimization with constraints In this Appendix we present an overview of convex optimization concepts that are particularly useful
More informationLagrangian Duality for Dummies
Lagrangian Duality for Dummies David Knowles November 13, 2010 We want to solve the following optimisation problem: f 0 () (1) such that f i () 0 i 1,..., m (2) For now we do not need to assume conveity.
More informationSupport Vector Machine (SVM) and Kernel Methods
Support Vector Machine (SVM) and Kernel Methods CE-717: Machine Learning Sharif University of Technology Fall 2015 Soleymani Outline Margin concept Hard-Margin SVM Soft-Margin SVM Dual Problems of Hard-Margin
More informationSupport Vector Machines
Support Vector Machines Support vector machines (SVMs) are one of the central concepts in all of machine learning. They are simply a combination of two ideas: linear classification via maximum (or optimal
More informationPrimal-Dual Interior-Point Methods for Linear Programming based on Newton s Method
Primal-Dual Interior-Point Methods for Linear Programming based on Newton s Method Robert M. Freund March, 2004 2004 Massachusetts Institute of Technology. The Problem The logarithmic barrier approach
More informationPrimal-dual Subgradient Method for Convex Problems with Functional Constraints
Primal-dual Subgradient Method for Convex Problems with Functional Constraints Yurii Nesterov, CORE/INMA (UCL) Workshop on embedded optimization EMBOPT2014 September 9, 2014 (Lucca) Yu. Nesterov Primal-dual
More informationSpace-Time Processing for MIMO Communications
Space-Time Processing for MIMO Communications Editors: Alex B. Gershman Dept. of ECE, McMaster University Hamilton, L8S 4K1, Ontario, Canada; & Dept. of Communication Systems, Duisburg-Essen University,
More information4. Algebra and Duality
4-1 Algebra and Duality P. Parrilo and S. Lall, CDC 2003 2003.12.07.01 4. Algebra and Duality Example: non-convex polynomial optimization Weak duality and duality gap The dual is not intrinsic The cone
More informationMinimum Mean Squared Error Interference Alignment
Minimum Mean Squared Error Interference Alignment David A. Schmidt, Changxin Shi, Randall A. Berry, Michael L. Honig and Wolfgang Utschick Associate Institute for Signal Processing Technische Universität
More informationDuality Uses and Correspondences. Ryan Tibshirani Convex Optimization
Duality Uses and Correspondences Ryan Tibshirani Conve Optimization 10-725 Recall that for the problem Last time: KKT conditions subject to f() h i () 0, i = 1,... m l j () = 0, j = 1,... r the KKT conditions
More informationGeneralization to inequality constrained problem. Maximize
Lecture 11. 26 September 2006 Review of Lecture #10: Second order optimality conditions necessary condition, sufficient condition. If the necessary condition is violated the point cannot be a local minimum
More informationSupport Vector Machines for Regression
COMP-566 Rohan Shah (1) Support Vector Machines for Regression Provided with n training data points {(x 1, y 1 ), (x 2, y 2 ),, (x n, y n )} R s R we seek a function f for a fixed ɛ > 0 such that: f(x
More informationSupport Vector Machines
Support Vector Machines Sridhar Mahadevan mahadeva@cs.umass.edu University of Massachusetts Sridhar Mahadevan: CMPSCI 689 p. 1/32 Margin Classifiers margin b = 0 Sridhar Mahadevan: CMPSCI 689 p.
More informationSupport Vector Machine (SVM) and Kernel Methods
Support Vector Machine (SVM) and Kernel Methods CE-717: Machine Learning Sharif University of Technology Fall 2014 Soleymani Outline Margin concept Hard-Margin SVM Soft-Margin SVM Dual Problems of Hard-Margin
More informationLecture Note 5: Semidefinite Programming for Stability Analysis
ECE7850: Hybrid Systems:Theory and Applications Lecture Note 5: Semidefinite Programming for Stability Analysis Wei Zhang Assistant Professor Department of Electrical and Computer Engineering Ohio State
More informationLinear classifiers selecting hyperplane maximizing separation margin between classes (large margin classifiers)
Support vector machines In a nutshell Linear classifiers selecting hyperplane maximizing separation margin between classes (large margin classifiers) Solution only depends on a small subset of training
More informationSum-Power Iterative Watefilling Algorithm
Sum-Power Iterative Watefilling Algorithm Daniel P. Palomar Hong Kong University of Science and Technolgy (HKUST) ELEC547 - Convex Optimization Fall 2009-10, HKUST, Hong Kong November 11, 2009 Outline
More informationELE539A: Optimization of Communication Systems Lecture 16: Pareto Optimization and Nonconvex Optimization
ELE539A: Optimization of Communication Systems Lecture 16: Pareto Optimization and Nonconvex Optimization Professor M. Chiang Electrical Engineering Department, Princeton University March 16, 2007 Lecture
More informationIntroduction to Optimization Techniques. Nonlinear Optimization in Function Spaces
Introduction to Optimization Techniques Nonlinear Optimization in Function Spaces X : T : Gateaux and Fréchet Differentials Gateaux and Fréchet Differentials a vector space, Y : a normed space transformation
More information