# 10-725/36-725 Optimization Midterm Exam



## 10-725/36-725 Optimization Midterm Exam — November 6, 2012

NAME: ________  ANDREW ID: ________

Instructions:

- This exam is 1 hr 20 min long.
- Except for a single two-sided sheet of notes, no other material or discussion is permitted.
- There are 11 one-sided sheets. Please use the other side for rough work.

## 1 Warm start [Shiva, 25 points]

### 1.1 Multiple choice [15 points]

Circle the correct answer in each of the following questions.

1. Which of the following functions is convex in $x \in \mathbb{R}^n$?
   (a) $\|x\|^{1/2}$  (b) $-\|x\|_2$  (c) $-\max_j x_j$  (d) $\min_i a_i^T x$  **[e]** $\log \sum_j \exp(x_j)$

2. What is the normal cone of the nonnegative orthant $\{x : x_i \ge 0\}$ at the origin?
   (a) the nonnegative orthant $\{x : x_i \ge 0\}$  **[b]** the nonpositive orthant $\{x : x_i \le 0\}$  (c) the line $y = x$  (d) the line $y = -x$  (e) the zero vector

3. In a nonsingular standard form linear program $\min c^T x$ s.t. $Ax = b$, $x \ge 0$, where $x, c \in \mathbb{R}^n$, $b \in \mathbb{R}^m$, and $A \in \mathbb{R}^{m \times n}$, how many basic variables are there?
   (a) $n$  **[b]** $m$  (c) $n - m$  (d) $n + m$  (e) $\max\{n, m\}$
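
The answer to question 1 is easy to sanity-check numerically. The sketch below (numpy assumed available; the random test points are arbitrary) verifies midpoint convexity of the log-sum-exp function, the marked answer (e):

```python
import numpy as np

rng = np.random.default_rng(0)

def logsumexp(x):
    # Stable evaluation of log sum_j exp(x_j), the convex answer (e) to Q1.
    m = x.max()
    return m + np.log(np.exp(x - m).sum())

# Midpoint convexity: f((x + y)/2) <= (f(x) + f(y))/2 must hold for every pair.
violations = 0
for _ in range(1000):
    x, y = rng.normal(size=5), rng.normal(size=5)
    if logsumexp((x + y) / 2) > (logsumexp(x) + logsumexp(y)) / 2 + 1e-12:
        violations += 1
```

A midpoint check over random pairs is of course only evidence, not a proof; the proof is the standard log-sum-exp convexity argument via the Hessian.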

4. If the dual (maximization) LP is feasible and unbounded, then the primal (minimization) LP is
   (a) nonsingular  **[b]** infeasible  (c) bounded  (d) unbounded  (e) zero

5. Given the general optimization program $\min f(x)$ s.t. $g(x) \le 0$, $h(x) = 0$, consider its Lagrangian $L(x, u, v)$, with $u$ and $v$ introduced for the inequality and equality constraints, respectively. Which of the following is true?
   (a) $L(x^*, u, v) = L(x, u^*, v^*)$ for primal-optimal $x^*$, dual-optimal $(u^*, v^*)$, and all $x, u, v$
   (b) $L$ is convex in $x$
   **[c]** $L$ is concave in $u$ and $v$
   (d) $f(x) \le L(x, u, v)$ for all $x, u, v$
   (e) $u_i h_i(x) = 0$ at dual-optimal $u$ for all $i$

### 1.2 Conjugate functions [5 points]

Derive the conjugate function $f^*(y)$ for each of the following functions.

1. $f(x) = 3x^2 + 4x$.

   $f^*(y) = \max_x \, xy - (3x^2 + 4x)$. By stationarity, $y - 6x - 4 = 0 \Rightarrow x = \frac{y-4}{6}$. Thus $f^*(y) = \frac{(y-4)^2}{12}$.

2. $f(x) = -\ln x + 2$.

   $f^*(y) = \max_x \, xy + \ln x - 2$. If $y \ge 0$, then clearly $f^*(y) = \infty$. Otherwise, by stationarity, $y + \frac{1}{x} = 0 \Rightarrow x = -\frac{1}{y}$. Thus $f^*(y) = -3 + \ln(-\tfrac{1}{y}) = -3 - \ln(-y)$.
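
Both closed-form conjugates can be checked by brute-force maximization of $xy - f(x)$ over a fine grid; a small numpy sketch (the grid ranges and test values of $y$ are arbitrary choices):

```python
import numpy as np

# Brute-force conjugate: f*(y) = max_x (xy - f(x)) over a fine grid.
xs = np.linspace(-50, 50, 2_000_001)

def conj1(y):
    # f(x) = 3x^2 + 4x
    return np.max(xs * y - (3 * xs**2 + 4 * xs))

# Claimed closed form: (y - 4)^2 / 12
err1 = max(abs(conj1(y) - (y - 4) ** 2 / 12) for y in (-2.0, 0.0, 3.0, 10.0))

xp = np.linspace(1e-4, 200, 2_000_001)   # domain of f is x > 0

def conj2(y):
    # f(x) = -ln(x) + 2, so f*(y) = max_x (xy + ln(x) - 2); finite only for y < 0
    return np.max(xp * y + np.log(xp) - 2)

# Claimed closed form for y < 0: -3 - ln(-y)
err2 = max(abs(conj2(y) - (-3 - np.log(-y))) for y in (-0.5, -1.0, -4.0))
```

For the chosen $y$ values the interior maximizers $x = \frac{y-4}{6}$ and $x = -\frac{1}{y}$ lie well inside the grids, so the grid maximum approximates the true supremum closely.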

### 1.3 Matrix differential [5 points]

Let $L = \|Ax - b\|_2^2$, where $A$ is a matrix and $x$ and $b$ are vectors. Derive $dL$ in terms of $dx$.

$L = (Ax - b)^T (Ax - b) = x^T A^T A x - x^T A^T b - b^T A x + b^T b$, so

$$dL = x^T A^T A \, dx + dx^T A^T A x - dx^T A^T b - b^T A \, dx = 2(Ax - b)^T A \, dx.$$
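
The collected differential can be confirmed against a finite perturbation; a minimal sketch with arbitrary random $A$, $b$, $x$, and a small $dx$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 4))
b = rng.normal(size=3)
x = rng.normal(size=4)
dx = 1e-6 * rng.normal(size=4)

L = lambda v: float(np.sum((A @ v - b) ** 2))

# The differential collapses to dL = 2 (Ax - b)^T A dx.
predicted = 2 * (A @ x - b) @ A @ dx
actual = L(x + dx) - L(x)
abs_err = abs(predicted - actual)   # second order in ||dx||, so tiny
```

The discrepancy is $O(\|dx\|^2)$, which is why a first-order differential matches a small finite perturbation so closely.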

## 2 First and second order methods [Wooyoung, 20 points]

Consider the function $f(x) = f(x_1, x_2) = (x_1 + x_2^2)^2$.

(a) [4 points] Derive the gradient of $f(x)$.

SOLUTION: $\nabla f(x) = \begin{pmatrix} 2x_1 + 2x_2^2 \\ 4x_1 x_2 + 4x_2^3 \end{pmatrix}$

(b) [3 points] At the point $x_0 = (0, 1)^T$, we consider the search direction $p = (1, -1)^T$. Show that $p$ is a descent direction.

SOLUTION: $\nabla f(x_0) = (2, 4)^T$, so $\nabla f(x_0)^T p = 2 - 4 = -2 < 0$. Therefore $p$ is a descent direction at $x_0$.

(c) [3 points] Find the stepsize $\alpha$ that minimizes $f(x_0 + \alpha p)$; that is, what is the result of this exact line search? Report $f(x_0 + \alpha p)$.

SOLUTION: We need to find the $\alpha$ that minimizes

$$f(x_0 + \alpha p) = f((\alpha, 1 - \alpha)^T) = (\alpha + (1 - \alpha)^2)^2 = (\alpha^2 - \alpha + 1)^2.$$

Minimizing $f(x_0 + \alpha p)$ with respect to $\alpha$ is equivalent to minimizing $g(\alpha) = \alpha^2 - \alpha + 1$, since $g(\alpha) > 0$, and $\alpha = \frac{1}{2}$ is its minimizer. Therefore $f(x_0 + \alpha p) = (\tfrac{3}{4})^2 = \tfrac{9}{16}$.
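
Parts (a)–(c) are all checkable in a few lines of numpy; this sketch compares the gradient against central differences (the probe point $(0.3, -0.7)$ is arbitrary) and minimizes the line-search objective over a grid:

```python
import numpy as np

f = lambda x1, x2: (x1 + x2**2) ** 2
grad = lambda x1, x2: np.array([2*x1 + 2*x2**2, 4*x1*x2 + 4*x2**3])

# (a) gradient vs. central differences at an arbitrary point
h = 1e-6
g_num = np.array([(f(0.3 + h, -0.7) - f(0.3 - h, -0.7)) / (2 * h),
                  (f(0.3, -0.7 + h) - f(0.3, -0.7 - h)) / (2 * h)])
grad_ok = np.allclose(g_num, grad(0.3, -0.7), atol=1e-4)

# (b), (c): slope along p = (1, -1) at x0 = (0, 1), then exact line search
slope = grad(0.0, 1.0) @ np.array([1.0, -1.0])   # -2, so p is a descent direction
alphas = np.linspace(0, 1, 100_001)
vals = (alphas + (1 - alphas) ** 2) ** 2
best_alpha, best_val = alphas[np.argmin(vals)], vals.min()
```

The grid minimum lands at $\alpha = \tfrac12$ with value $\tfrac{9}{16}$, matching the hand computation.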

(d) [5 points] Derive the Hessian of $f(x)$.

SOLUTION: $\nabla^2 f(x) = \begin{pmatrix} 2 & 4x_2 \\ 4x_2 & 4x_1 + 12x_2^2 \end{pmatrix}$

(e) [5 points] Run one Newton step with fixed stepsize $t = 1$ starting from $x_0 = (0, 1)^T$ to compute $x_1$. Show $x_1$ and $f(x_1)$. Hint: the inverse of a $2 \times 2$ matrix is

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1} = \frac{1}{ad - bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}.$$

SOLUTION: $\nabla^2 f(x_0) = \begin{pmatrix} 2 & 4 \\ 4 & 12 \end{pmatrix}$, so

$$\Delta x_{nt} = -(\nabla^2 f(x_0))^{-1} \nabla f(x_0) = -\frac{1}{8}\begin{pmatrix} 12 & -4 \\ -4 & 2 \end{pmatrix}\begin{pmatrix} 2 \\ 4 \end{pmatrix} = \begin{pmatrix} -1 \\ 0 \end{pmatrix}.$$

Therefore $x_1 = x_0 + t\,\Delta x_{nt} = (0, 1)^T + (-1, 0)^T = (-1, 1)^T$ and $f(x_1) = 0$.
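
The Newton step above can be reproduced mechanically; a short numpy sketch (using a linear solve in place of the explicit $2 \times 2$ inverse from the hint):

```python
import numpy as np

grad = lambda x: np.array([2*x[0] + 2*x[1]**2, 4*x[0]*x[1] + 4*x[1]**3])
hess = lambda x: np.array([[2.0, 4*x[1]], [4*x[1], 4*x[0] + 12*x[1]**2]])

x0 = np.array([0.0, 1.0])
step = -np.linalg.solve(hess(x0), grad(x0))   # Newton direction at x0
x1 = x0 + step                                # fixed stepsize t = 1
f_x1 = (x1[0] + x1[1]**2) ** 2
```

One step reaches $x_1 = (-1, 1)^T$ with $f(x_1) = 0$, a global minimizer, since $f \ge 0$ everywhere.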

## 3 Projection and convexity [Aadi, 25 points]

### 3.1 Projecting onto nullspaces

The nullspace of a matrix $A \in \mathbb{R}^{m \times n}$, $m < n$, is defined to be the set of all $x \in \mathbb{R}^n$ such that $Ax = 0$. The projection of a point $z$ onto a convex set $S$ is defined as the point $x \in S$ which is closest in Euclidean distance to $z$. Find a closed-form solution for the projection of $z$ onto the convex set $\{x : Ax = 0\}$. (You can assume $A$ is full rank, i.e., rank $m$.) To solve the problem, set it up as a constrained optimization, write out the Lagrangian, and derive the KKT conditions. [13 points]

Solution: The problem is $\min_x \|x - z\|_2^2$ such that $Ax = 0$, with Lagrangian $L(x, \lambda) = \|x - z\|_2^2 + \lambda^T A x$. The KKT conditions give $Ax = 0$ and $x - z + A^T \lambda = 0$ (absorbing a factor of 2 into $\lambda$). Multiplying the second condition by $A$ yields $Az = AA^T \lambda$, implying $\lambda = (AA^T)^{-1} A z$, which yields $x = z - A^T (AA^T)^{-1} A z$.

### 3.2 Understanding convexity

Consider the five functions $x, x^2, x^3, x^4, x^5$. Which of these functions are convex on $\mathbb{R}$? Which are strictly convex on $\mathbb{R}$? Which are strongly convex on $\mathbb{R}$? Which are strongly convex on $[0.5, 4.5]$? NO EXPLANATIONS REQUIRED! [12 points]

- Convex on $\mathbb{R}$: [3 pts] $x, x^2, x^4$
- Strictly convex on $\mathbb{R}$: [3 pts] $x^2, x^4$
- Strongly convex on $\mathbb{R}$: [3 pts] $x^2$
- Strongly convex on $[0.5, 4.5]$: [3 pts] $x^2, x^3, x^4, x^5$
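
The closed-form projection can be exercised on a random instance; this sketch (arbitrary random $A$ and $z$) checks that the formula lands in the nullspace and that no nearby nullspace point is closer to $z$:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 6))     # full row rank with probability 1
z = rng.normal(size=6)

def proj_null(v):
    # Closed form from the KKT conditions: v - A^T (A A^T)^{-1} A v
    return v - A.T @ np.linalg.solve(A @ A.T, A @ v)

x = proj_null(z)
in_nullspace = np.allclose(A @ x, 0)

# No other nullspace point should be closer to z than x.
closer_found = False
for _ in range(200):
    cand = x + 0.1 * proj_null(rng.normal(size=6))   # still satisfies A cand = 0
    if np.sum((cand - z) ** 2) < np.sum((x - z) ** 2) - 1e-9:
        closer_found = True
```

No candidate improves on $x$ because the residual $x - z = -A^T(AA^T)^{-1}Az$ lies in the row space of $A$, which is orthogonal to every nullspace direction.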

## 4 The Dantzig Selector [Kevin, 25 points]

The Dantzig selector is another regression technique for when the problem dimension is larger than the number of data points, like the LASSO. That is, we wish to learn a sparse $w \in \mathbb{R}^p$ such that $y \approx f(x) = w^T x$. Let $X$ be an $n \times p$ matrix of $n$ examples, $y$ an $n$-dimensional vector of targets, and $\epsilon > 0$ a supplied constant. The Dantzig selector is a solution to the following convex optimization problem:

$$\min_w \; \|w\|_1, \quad \text{subject to:} \qquad (1)$$
$$\|X^T(y - Xw)\|_\infty \le \epsilon. \qquad (2)$$

The L1-norm objective attempts to keep the weight vector $w$ sparse, as in the LASSO optimization. The constraints impose that no feature has a high dot product with the residual: either the prediction error is low, or no single feature can explain the remaining error.

Let's reformulate the optimization problem as a linear program by expanding the norms and introducing an auxiliary vector $t \in \mathbb{R}^p$. Here, $\mathbf{1}$ is the vector of all ones:

$$\min_{w,t} \; \mathbf{1}^T t, \quad \text{subject to:} \qquad (3)$$
$$X^T(y - Xw) \le \epsilon\mathbf{1}, \qquad (4)$$
$$-X^T(y - Xw) \le \epsilon\mathbf{1}, \qquad (5)$$
$$w \le t, \qquad (6)$$
$$-w \le t. \qquad (7)$$

(a) [6 points] Write the dual linear program of the reformulated program. Use $\alpha, \beta, \gamma, \delta \ge 0$ as your dual variables for the four sets of constraints (in that order).

Solution:

$$\max_{\alpha,\beta,\gamma,\delta \ge 0} \; y^T X(\alpha - \beta) - \epsilon \mathbf{1}^T(\alpha + \beta), \quad \text{subject to:} \qquad (8)$$
$$X^T X(\alpha - \beta) = \gamma - \delta, \qquad (9)$$
$$\gamma + \delta = \mathbf{1}. \qquad (10)$$

(b) [6 points] Rewrite this dual program by replacing $\alpha - \beta = q$ and $\gamma - \delta = z$. Your program will no longer be a linear program. You do not need to justify your answer. Hint 1: Given the constraint $z_i = \gamma_i - \delta_i$ and the constraints you derived on $\gamma_i$ and $\delta_i$, what is the largest that $z_i$ could be? What is the smallest that $z_i$ could be? Hint 2: Think about the norms and their duals in the original program.
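
Weak duality for the primal-dual pair above is easy to demonstrate numerically: any feasible $(w, t)$ and any feasible $(\alpha, \beta, \gamma, \delta)$ must satisfy dual objective $\le$ primal objective. A sketch on an arbitrary random instance (the construction of the feasible points is my own, not part of the exam):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 10, 5
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

# A feasible primal point (w, t): pick w, then choose eps large enough.
w = rng.normal(size=p)
eps = np.abs(X.T @ (y - X @ w)).max() + 0.1
t = np.abs(w)
primal_obj = t.sum()

# A feasible dual point built from any q with ||X^T X q||_inf <= 1.
q = rng.normal(size=p)
q /= np.abs(X.T @ X @ q).max() + 1.0        # scale so the constraint holds
z = X.T @ X @ q
gamma, delta = (1 + z) / 2, (1 - z) / 2     # gamma - delta = z, gamma + delta = 1
alpha, beta = np.maximum(q, 0), np.maximum(-q, 0)
dual_obj = y @ X @ (alpha - beta) - eps * (alpha + beta).sum()
```

The splits $\alpha - \beta = q$ and $\gamma - \delta = z$ used here are exactly the substitutions part (b) asks about: with $\gamma + \delta = \mathbf{1}$ and $\gamma, \delta \ge 0$, each $z_i$ ranges over $[-1, 1]$.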

Solution:

$$\max_{q,z} \; y^T X q - \epsilon\|q\|_1, \quad \text{subject to:} \qquad (11)$$
$$X^T X q = z, \qquad (12)$$
$$\|z\|_\infty \le 1. \qquad (13)$$

(c) [1 point] You should have one remaining linear constraint involving $X^T X$. Use it to eliminate $z$.

Solution:

$$\max_q \; y^T X q - \epsilon\|q\|_1, \quad \text{subject to:} \qquad (14)$$
$$\|X^T X q\|_\infty \le 1. \qquad (15)$$

Let $\hat{w}(\epsilon)$ be a LASSO solution with regularization parameter $\epsilon$,

$$\hat{w}(\epsilon) = \arg\min_w \; \|y - Xw\|_2^2/2 + \epsilon\|w\|_1, \qquad (16)$$

and write $w(\epsilon)$ for the solution to the Dantzig selector program with the same value of $\epsilon$. We will show how to relate $\hat{w}(\epsilon)$ to $w(\epsilon)$.

(d) [6 points] Write the KKT (i.e., first-order optimality) conditions for the LASSO optimization at $\hat{w}(\epsilon)$.

Solution:

$$X^T(y - X\hat{w}(\epsilon)) \in \epsilon \, \partial\|\hat{w}(\epsilon)\|_1. \qquad (17)$$

(e) [6 points] Using the KKT conditions, show that $\hat{w}(\epsilon)$ is a feasible point for the Dantzig selector optimization with parameter $\epsilon$.

Solution: Every subgradient of $\|\cdot\|_1$ has entries in $[-1, 1]$, so for all $j \in \{1, 2, \ldots, p\}$,

$$X_j^T(y - X\hat{w}(\epsilon)) \in \epsilon \, [-1, 1], \qquad (18)$$
$$|X_j^T(y - X\hat{w}(\epsilon))| \le \epsilon, \qquad (19)$$
$$\max_j |X_j^T(y - X\hat{w}(\epsilon))| \le \epsilon, \qquad (20)$$
$$\|X^T(y - X\hat{w}(\epsilon))\|_\infty \le \epsilon. \qquad (21)$$

We've now shown that the Dantzig selector finds solutions whose L1-norm is at least as small as that of the LASSO solution.
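
The feasibility claim in part (e) can be illustrated in the special case where $X$ has orthonormal columns, so that $X^T X = I$ and the LASSO solution reduces to soft-thresholding of $X^T y$; the random instance and $\epsilon$ below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 20, 4
# Orthonormal-column X: the LASSO solution is soft-thresholding of X^T y.
X, _ = np.linalg.qr(rng.normal(size=(n, p)))
y = rng.normal(size=n)
eps = 0.3

soft = lambda v, k: np.sign(v) * np.maximum(np.abs(v) - k, 0.0)
w_lasso = soft(X.T @ y, eps)

# Feasibility for the Dantzig selector constraint with the same eps:
residual_corr = np.abs(X.T @ (y - X @ w_lasso)).max()
feasible = residual_corr <= eps + 1e-12
```

Because $X^T X = I$ here, $X^T(y - Xw_{\text{lasso}}) = X^T y - w_{\text{lasso}}$, and each soft-thresholded coordinate leaves a residual correlation of magnitude at most $\epsilon$, exactly as the chain (18)–(21) predicts.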
