Introduction to Optimization and Complexity. Outline: Introduction; Problem formulation; Convexity reminder; Optimality conditions.
1 Olga Galinina. ELT Network Analysis and Dimensioning II. Department of Electronics and Communications Engineering, Tampere University of Technology, Tampere, Finland. January 29, 2014
4 A Bit of History...
Leonhard Euler (1707–1783): "nothing at all takes place in the Universe in which some rule of maximum or minimum does not appear"
5 Where Does Network Optimization Arise?
The optimization discipline deals with finding the maxima and minima of functions subject to some constraints.
- Transportation systems: transportation of goods over transportation networks; scheduling of fleets of airplanes
- Manufacturing systems: scheduling of goods for manufacturing; flow of manufactured items within inventory systems
- Communication systems: design and expansion of communication systems; flow of information across networks
- Energy systems, financial systems, and much more
6 Examples
Portfolio optimization: variables: amounts invested in different assets; constraints: budget, max./min. investment per asset, minimum return; objective: overall risk or return variance.
Device sizing in electronic circuits: variables: device widths and lengths; constraints: manufacturing limits, timing requirements, maximum area; objective: power consumption.
Data fitting: variables: model parameters; constraints: prior information, parameter limits; objective: measure of prediction error.
7 Conventional Design Method
Initial design → system analysis → is the design satisfactory? If no, correct the design based on experience.
Depends on the designer's intuition, experience, and skills. A trial-and-error method: not easy to apply to a complex system; does not always lead to the best possible design. Qualitative design.
9 Mathematical Model
Consider a (mathematical) optimization problem: minimize f(x), x ∈ Rⁿ, subject to x ∈ Ω.
Definition. The function f : Rⁿ → R is a real-valued function, called the objective function, or cost function.
Definition. The variables x = [x₁, ..., xₙ] are decision variables.
Definition. The optimal solution x⁰ has the smallest value of f(x) among all feasible vectors.
12 Constraint Set
Definition. The set Ω ⊆ Rⁿ is the constraint set or feasible set/region. Ω typically takes the form {x : hᵢ(x) = 0, gⱼ(x) ≤ 0}, where hᵢ(x), gⱼ(x) are constraint functions.
Definition. The above problem is the general form of a constrained optimization problem. If Ω = Rⁿ, we refer to the problem as unconstrained.
Definition. If Ω = ∅, the problem is infeasible; otherwise it is feasible.
15 Solving Optimization Problems
The general optimization problem is very difficult to solve; methods involve some compromise, e.g., very long computation time, or not always finding the solution.
Exceptions: certain problem classes can be solved efficiently and reliably:
- least-squares problems
- linear programming problems
- convex optimization problems
16 Least-Squares Problem
minimize ‖Ax − b‖₂²
Analytical solution: x⁰ = (AᵀA)⁻¹Aᵀb. Reliable and efficient algorithms and software; computation time proportional to n²k (A ∈ R^{k×n}), less if structured; a mature technology.
Using least-squares: least-squares problems are easy to recognize; a few standard techniques increase flexibility (e.g., including weights, adding regularization terms).
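The analytical solution above can be checked numerically. A minimal sketch, assuming NumPy is available (the matrix A and vector b here are synthetic examples, not from the slides): the normal-equations formula x⁰ = (AᵀA)⁻¹Aᵀb agrees with the library's QR/SVD-based solver.

```python
import numpy as np

# Tall system: A is k-by-n with k = 100 > n = 3, so Ax = b has no exact solution
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 3))
b = rng.standard_normal(100)

# Analytical solution via the normal equations: x0 = (A^T A)^{-1} A^T b
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Library least-squares routine (more numerically stable in practice)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_normal, x_lstsq))  # True
```

In practice the normal equations square the condition number of A, which is why mature solvers prefer QR or SVD factorizations.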
17 Linear Programming
minimize cᵀx subject to aᵢᵀx ≤ bᵢ, i = 1, ..., m
Solving linear programs: no analytical formula for the solution; reliable and efficient algorithms and software; computation time proportional to n²m if m ≥ n, less with structure; a mature technology.
Using linear programming: not as easy to recognize as least-squares problems; a few standard tricks are used to convert problems into linear programs (e.g., problems involving ℓ₁- or ℓ∞-norms, piecewise-linear functions).
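A small worked instance of the standard form above, assuming SciPy is available (the particular cost vector and constraints are an illustrative example): minimize x₁ + 2x₂ subject to x₁ ≥ 0, x₂ ≥ 0, x₁ + x₂ ≥ 1, rewritten with ≤ constraints as the slide's aᵢᵀx ≤ bᵢ form.

```python
from scipy.optimize import linprog

# minimize c^T x  s.t.  a_i^T x <= b_i:
#   -x1 <= 0,  -x2 <= 0,  -x1 - x2 <= -1   (i.e. x1, x2 >= 0 and x1 + x2 >= 1)
res = linprog(
    c=[1, 2],
    A_ub=[[-1, 0], [0, -1], [-1, -1]],
    b_ub=[0, 0, -1],
    bounds=[(None, None), (None, None)],  # constraints above handle nonnegativity
)
print(res.x, res.fun)  # optimum at (1, 0) with objective value 1
```

The optimum sits at a vertex of the feasible polyhedron, as linear programming theory predicts.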
18 Convex Optimization Problem
minimize f(x) subject to gᵢ(x) ≤ bᵢ, i = 1, ..., m
The objective and constraint functions are convex: gᵢ(α₁x + α₂y) ≤ α₁gᵢ(x) + α₂gᵢ(y) if α₁ + α₂ = 1, α₁ ≥ 0, α₂ ≥ 0.
Includes least-squares problems and linear programs as special cases.
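The defining inequality can be spot-checked numerically on sampled points. A sketch, assuming NumPy (the helper `is_convex_on_samples` and the sample grids are illustrative, not part of the slides); a grid check can only refute convexity, never prove it:

```python
import numpy as np

def is_convex_on_samples(g, pts, alphas, tol=1e-12):
    """Spot-check g(a*x + (1-a)*y) <= a*g(x) + (1-a)*g(y) on sample points."""
    for x in pts:
        for y in pts:
            for a in alphas:
                if g(a * x + (1 - a) * y) > a * g(x) + (1 - a) * g(y) + tol:
                    return False  # found a counterexample to convexity
    return True

pts = np.linspace(-3, 3, 13)
alphas = np.linspace(0, 1, 11)
print(is_convex_on_samples(np.square, pts, alphas))  # True: x^2 is convex
print(is_convex_on_samples(np.sin, pts, alphas))     # False: sin is not convex
```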
19 Convex Problems
Solving convex optimization problems: no analytical solution; reliable and efficient algorithms; computation time (roughly) proportional to max{n³, n²m, F}, where F is the cost of evaluating f(x) and its first and second derivatives; almost a technology.
Using convex optimization: often difficult to recognize; many tricks for transforming problems into convex form; surprisingly many problems can be solved via convex optimization.
20 Nonlinear Optimization
Traditional techniques for general nonconvex problems involve compromises.
Local optimization methods (nonlinear programming): find a point that minimizes f among feasible points near it; fast, can handle large problems; require an initial guess; provide no information about the distance to the (global) optimum.
Global optimization methods: find the (global) solution, but worst-case complexity grows exponentially with problem size.
These algorithms are often based on solving convex subproblems.
21 Brief History of Convex Optimization
Algorithms: 1947: simplex algorithm for linear programming (Dantzig); 1960s: early interior-point methods (Fiacco and McCormick, Dikin, ...); 1970s: ellipsoid method and other subgradient methods; 1980s: polynomial-time interior-point methods for linear programming (Karmarkar 1984); late 1980s–now: polynomial-time interior-point methods for nonlinear convex optimization (Nesterov and Nemirovski 1994).
Applications: before 1990: mostly in operations research, few in engineering; since 1990: many new applications in engineering (control, signal processing, communications, circuit design, ...); new problem classes.
22 Mental Break
A kitten has a 50/50 chance to be male or female. My cat just delivered two adorable kittens. My veterinarian said that at least one of them is female. What is the probability that the other kitten is a boy?
There are 4 variants: female-female, female-male, male-female, male-male. The last possibility is ruled out. Answer: 2/(4 − 1) = 2/3.
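The 2/3 answer can be confirmed by brute-force enumeration of the sample space (a small sketch in Python; variable names are illustrative):

```python
from itertools import product
from fractions import Fraction

# All equally likely sex assignments for two kittens
outcomes = list(product("FM", repeat=2))           # FF, FM, MF, MM

# Condition on the vet's statement: at least one kitten is female
at_least_one_female = [o for o in outcomes if "F" in o]

# Among those, count outcomes where the other kitten is male
other_is_male = [o for o in at_least_one_female if "M" in o]

p = Fraction(len(other_is_male), len(at_least_one_female))
print(p)  # 2/3
```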
24 Affine and Convex Sets
Definition. S ⊆ Rⁿ is affine if [x, y ∈ S, α ∈ R] implies αx + (1 − α)y ∈ S.
Definition. S ⊆ Rⁿ is convex if for all [x, y ∈ S, 0 < α < 1], z = αx + (1 − α)y ∈ S (z is a convex combination of x and y).
If x₁, ..., x_m ∈ Rⁿ, Σⱼ αⱼ = 1, αⱼ ≥ 0, then x = α₁x₁ + ... + α_m x_m is a convex combination of x₁, ..., x_m.
The intersection of (any number of) convex sets is convex.
26 Compact Sets
Let B_δ(x⁰) denote the open ball of radius δ centered at the point x⁰: B_δ(x⁰) = {x : ‖x − x⁰‖ < δ}.
Definition. A set S ⊆ Rⁿ is said to be open if for each point x⁰ ∈ S there is δ > 0 such that B_δ(x⁰) ⊆ S. A set S ⊆ Rⁿ is said to be closed if its complement Rⁿ \ S is open.
Definition. A set S is compact if each of its open covers has a finite subcover: for every {Cᵢ}_{i∈A} with S ⊆ ∪ᵢ Cᵢ there is a finite J ⊆ A such that S ⊆ ∪_{j∈J} Cⱼ.
Alternative: every sequence in S has a convergent subsequence whose limit lies in S.
Note: If S ⊆ Rⁿ is closed and bounded, then S is compact (Heine–Borel theorem).
28 Convex Functions
Definition. Let C ⊆ Rⁿ be a nonempty convex set. Then f : C → R is convex (on C) if for all x, y ∈ C and all α ∈ (0, 1):
f(αx + (1 − α)y) ≤ αf(x) + (1 − α)f(y)
If strict inequality holds whenever x ≠ y, then f is said to be strictly convex. The negative of a (strictly) convex function is called a (strictly) concave function.
29 Convex Functions
Operations preserving convexity:
- nonnegative multiple: αf is convex if f is convex, α ≥ 0
- sum: f₁ + f₂ is convex if f₁, f₂ are convex (extends to infinite sums, integrals)
- composition with an affine function: f(Ax + b) is convex if f is convex
Some univariate convex functions:
1. exponential f(x) = e^{αx} (for all real α)
2. powers f(x) = x^p for x ≥ 0, 1 ≤ p < ∞
3. powers of absolute value f(x) = |x|^p for p ≥ 1; also x^p for x > 0, p ≤ 0
Concave:
1. powers f(x) = x^p for x ≥ 0, 0 ≤ p ≤ 1
2. logarithm: f(x) = log x for x > 0.
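Two entries from the catalogue above can be sanity-checked with NumPy (a sketch; the helper `convexity_gap` and the sample grid are illustrative): the convexity gap is nonnegative for the convex exponential and nonpositive for the concave logarithm.

```python
import numpy as np

def convexity_gap(f, x, y, a):
    """a*f(x) + (1-a)*f(y) - f(a*x + (1-a)*y); nonnegative iff the convexity
    inequality holds at this triple (x, y, a)."""
    return a * f(x) + (1 - a) * f(y) - f(a * x + (1 - a) * y)

xs = np.linspace(0.1, 5, 20)   # positive grid so log is defined
a = 0.3
exp_convex  = all(convexity_gap(np.exp, x, y, a) >= -1e-12 for x in xs for y in xs)
log_concave = all(convexity_gap(np.log, x, y, a) <=  1e-12 for x in xs for y in xs)
print(exp_convex, log_concave)  # True True
```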
30 Differentials
f is differentiable if dom(f) is open and the gradient of f exists at each x ∈ dom(f):
∇f(x) = (∂f/∂x₁, ..., ∂f/∂xₙ)ᵀ
f is twice differentiable if dom(f) is open and the Hessian of f exists at each x ∈ dom(f): H = ∇²f(x), the n × n matrix with entries [∇²f(x)]ᵢⱼ = ∂²f/∂xᵢ∂xⱼ.
Note: Not all convex functions are differentiable.
31 First-Order Condition
Theorem (gradient inequality). A differentiable f is convex on a convex set C ⊆ Rⁿ iff for all x, y ∈ C:
f(y) ≥ f(x) + (∇f(x))ᵀ(y − x).
Theorem. Minimizing a differentiable convex function f(x) s.t. x ∈ C is equivalent to: find x* ∈ C such that (∇f(x*))ᵀ(y − x*) ≥ 0 for all y ∈ C (a variational inequality problem).
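The gradient inequality can be illustrated for f(x) = ‖x‖², whose gradient is 2x (a numerical sketch assuming NumPy; here the inequality reduces to ‖y − x‖² ≥ 0, so it must hold at every sampled pair):

```python
import numpy as np

# Check f(y) >= f(x) + grad f(x)^T (y - x) for f(x) = ||x||^2, grad f(x) = 2x
rng = np.random.default_rng(1)
ok = True
for _ in range(1000):
    x, y = rng.standard_normal(5), rng.standard_normal(5)
    lhs = y @ y                        # f(y)
    rhs = x @ x + (2 * x) @ (y - x)    # first-order lower bound at x
    ok = ok and (lhs >= rhs - 1e-12)
print(ok)  # True
```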
33 Second-Order Condition
Theorem. A twice differentiable f is convex on C ⊆ Rⁿ iff the Hessian matrix ∇²f(x) is positive semidefinite for all x ∈ C.
Note: If ∇²f(x) is positive definite for all x ∈ C, then f is strictly convex on C. The converse is false. Example: consider the function f(x) = x⁴.
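The slide's counterexample f(x) = x⁴ can be made concrete (a sketch; the random sample points are illustrative): the second derivative f''(x) = 12x² vanishes at 0, so the Hessian is not positive definite there, yet strict convexity still holds.

```python
import numpy as np

# f(x) = x**4: second derivative f''(x) = 12*x**2 is zero at the origin
f2 = lambda x: 12 * x**2
print(f2(0.0))  # 0.0 -- only positive SEMIdefinite at x = 0

# Yet f is strictly convex: strict inequality for x != y, 0 < a < 1
a = 0.5
rng = np.random.default_rng(2)
xs, ys = rng.standard_normal(100), rng.standard_normal(100)
strict = all(
    (a * x + (1 - a) * y) ** 4 < a * x**4 + (1 - a) * y**4
    for x, y in zip(xs, ys) if abs(x - y) > 1e-3  # skip near-equal pairs
)
print(strict)  # True
```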
35 Minimization
Find an optimal decision x* (a minimizer, WLOG):
Definition. x* ∈ Ω is a local minimizer (minimum) of f over Ω if there exists a neighborhood N(x*) such that f(x) ≥ f(x*) for all x ∈ Ω ∩ N(x*). Typically, N(x*) is just some open ball B_δ(x*).
Definition. x* ∈ Ω is a global minimizer (minimum) of f over Ω if f(x) ≥ f(x*) for all x ∈ Ω.
If we replace ≥ with > for x ≠ x*, then we have a strict local minimizer or a strict global minimizer, respectively. f(x*) is then the global minimum value.
37 The Method of Lagrange Multipliers
minimize f(x) s.t. cᵢ(x) = 0, i = 1, ..., m, x ∈ Rⁿ, m ≤ n
The Jacobian matrix of the mapping c(x) = (c₁(x), ..., c_m(x)) is the m × n matrix ∇c(x) with entries [∇c(x)]ᵢⱼ = ∂cᵢ/∂xⱼ.
Lagrange theorem. For a local minimizer x* and continuously differentiable f, c₁, ..., c_m, there exist multipliers y₁*, ..., y_m* such that
∇f(x*) − Σᵢ₌₁ᵐ yᵢ* ∇cᵢ(x*) = 0.
38 The Method of Lagrange Multipliers
Lagrange multipliers: y₁, ..., y_m. Lagrange function (Lagrangian):
L(x, y) = f(x) − Σᵢ₌₁ᵐ yᵢ cᵢ(x)
Partial gradients:
∇ₓL = (∂L/∂x₁, ..., ∂L/∂xₙ) = ∇f(x) − Σᵢ₌₁ᵐ yᵢ ∇cᵢ(x)
∇_y L = (∂L/∂y₁, ..., ∂L/∂y_m) = −c(x)
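Setting both partial gradients to zero solves an equality-constrained problem symbolically. A worked sketch assuming SymPy is available (the problem min x₁² + x₂² s.t. x₁ + x₂ = 1 is an illustrative example, not from the slides):

```python
import sympy as sp

x1, x2, y = sp.symbols("x1 x2 y", real=True)
f = x1**2 + x2**2            # objective
c = x1 + x2 - 1              # equality constraint c(x) = 0
L = f - y * c                # Lagrangian L(x, y) = f(x) - y*c(x)

# Stationarity grad_x L = 0 together with feasibility c(x) = 0
sol = sp.solve([sp.diff(L, x1), sp.diff(L, x2), c], [x1, x2, y], dict=True)
print(sol)  # x1 = x2 = 1/2 with multiplier y = 1
```

Here 2x₁ − y = 0, 2x₂ − y = 0, and x₁ + x₂ = 1 force x₁ = x₂ = 1/2 and y = 1, which is indeed the closest point of the line to the origin.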
39 The Karush–Kuhn–Tucker Theorem
minimize f(x) s.t. cᵢ(x) ≥ 0, i = 1, ..., m, and hᵢ(x) = 0, i = 1, ..., l
A constraint is active (binding) at x⁰ if cᵢ(x⁰) = 0.
Theorem. For a local minimizer x* and continuously differentiable f, cᵢ, hᵢ, there exist λ₀, λ₁, ..., λ_m, µ₁, ..., µ_l, not all zero, such that:
λ₀∇f(x*) − Σᵢ₌₁ᵐ λᵢ∇cᵢ(x*) − Σᵢ₌₁ˡ µᵢ∇hᵢ(x*) = 0
λᵢcᵢ(x*) = 0, i = 1, ..., m (complementary slackness)
λ₀, λᵢ ≥ 0, i = 1, ..., m (dual feasibility)
cᵢ(x*) ≥ 0, hᵢ(x*) = 0 (primal feasibility)
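The KKT conditions can be verified numerically on a small inequality-constrained problem, assuming SciPy is available (the quadratic objective and the constraint here are illustrative; note SciPy's `"ineq"` convention expects g(x) ≥ 0, matching the slide's cᵢ(x) ≥ 0 form):

```python
import numpy as np
from scipy.optimize import minimize

# minimize (x1 - 2)^2 + (x2 - 1)^2  s.t.  c(x) = 2 - x1 - x2 >= 0 (active at optimum)
f = lambda x: (x[0] - 2) ** 2 + (x[1] - 1) ** 2
cons = {"type": "ineq", "fun": lambda x: 2 - x[0] - x[1]}
res = minimize(f, x0=[0.0, 0.0], constraints=[cons])
x = res.x
print(np.round(x, 3))  # approximately [1.5, 0.5]: projection of (2, 1) onto the line

# KKT stationarity: grad f(x*) = lambda * grad c(x*) with grad c = (-1, -1), lambda >= 0
grad_f = np.array([2 * (x[0] - 2), 2 * (x[1] - 1)])   # approx (-1, -1)
lam = grad_f[0] / -1.0                                  # multiplier, approx 1
print(lam >= 0 and np.allclose(grad_f, [-lam, -lam], atol=1e-4))  # True
```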
40 Mental Break
How many apples at equal distances from each other can I have? Place three apples on a plane and add one more below the plane. They form a tetrahedron: 3 + 1 = 4.
42 Computational Complexity
Answers the questions: What is an efficient algorithm? How do we measure efficiency?
Computational complexity of an algorithm: a measure of how many steps the algorithm will require in the worst case for an input of a given size.
Computational complexity of a problem: classifying problems according to tractability or intractability.
44 Algorithms and Complexity
A problem, e.g. the Traveling Salesman Problem: given a graph with nodes and edges and costs associated with the edges, what is a least-cost closed walk (or tour) containing each of the nodes exactly once?
An instance of a problem: the graph contains nodes 1, 2, 3, 4, 5, 6, and edges (1, 2) with cost 10, (1, 3) with cost 14, ...
A problem can be thought of as a function p that maps an instance x to an output p(x) (an answer).
An algorithm: a finite procedure for computing p(x) for any given input x.
47 Measuring Computational Complexity
By counting the number of elementary operations: addition (a + b), subtraction (a − b), multiplication (a · b), finite-precision division (a / b), comparison of two numbers (a < b).
Or by the running time of the algorithm: a simple function of the input size that is a reasonably tight upper bound on the actual number of steps.
Big-O notation. We say that f(t) = O(g(t)), t ≥ 0, if there exist c > 0 and t₀ such that f(t) ≤ c·g(t) for all t ≥ t₀.
Examples: 100(t² + t) = O(t²), but t³ ≠ O(t²).
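The two Big-O examples above can be checked over a finite range (a sketch; the witness constant c = 200 and the range are illustrative choices): 100(t² + t) ≤ 200t² holds for every t ≥ 1, while t³ ≤ c·t² eventually fails for any fixed c.

```python
# Witness for 100*(t^2 + t) = O(t^2): c = 200 works for all t >= 1,
# since 100*(t^2 + t) <= 200*t^2  <=>  t <= t^2  <=>  t >= 1.
c = 200
bounded = all(100 * (t**2 + t) <= c * t**2 for t in range(1, 10_001))
print(bounded)  # True

# t^3 is NOT O(t^2): t^3 <= c*t^2 fails once t > c, for any fixed c
cubed_bounded = all(t**3 <= c * t**2 for t in range(1, 10_001))
print(cubed_bounded)  # False
```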
49 Complexity classes
Convex Optimization and Modeling Duality Theory and Optimality Conditions 5th lecture, 12.05.2010 Jun.-Prof. Matthias Hein Program of today/next lecture Lagrangian and duality: the Lagrangian the dual
More information5 Handling Constraints
5 Handling Constraints Engineering design optimization problems are very rarely unconstrained. Moreover, the constraints that appear in these problems are typically nonlinear. This motivates our interest
More informationNonlinear Programming and the Kuhn-Tucker Conditions
Nonlinear Programming and the Kuhn-Tucker Conditions The Kuhn-Tucker (KT) conditions are first-order conditions for constrained optimization problems, a generalization of the first-order conditions we
More informationICS-E4030 Kernel Methods in Machine Learning
ICS-E4030 Kernel Methods in Machine Learning Lecture 3: Convex optimization and duality Juho Rousu 28. September, 2016 Juho Rousu 28. September, 2016 1 / 38 Convex optimization Convex optimisation This
More informationE 600 Chapter 4: Optimization
E 600 Chapter 4: Optimization Simona Helmsmueller August 8, 2018 Goals of this lecture: Every theorem in these slides is important! You should understand, remember and be able to apply each and every one
More informationNumerical Optimization
Constrained Optimization Computer Science and Automation Indian Institute of Science Bangalore 560 012, India. NPTEL Course on Constrained Optimization Constrained Optimization Problem: min h j (x) 0,
More informationMathematical Foundations -1- Constrained Optimization. Constrained Optimization. An intuitive approach 2. First Order Conditions (FOC) 7
Mathematical Foundations -- Constrained Optimization Constrained Optimization An intuitive approach First Order Conditions (FOC) 7 Constraint qualifications 9 Formal statement of the FOC for a maximum
More informationChap 2. Optimality conditions
Chap 2. Optimality conditions Version: 29-09-2012 2.1 Optimality conditions in unconstrained optimization Recall the definitions of global, local minimizer. Geometry of minimization Consider for f C 1
More informationTutorial on Convex Optimization for Engineers Part II
Tutorial on Convex Optimization for Engineers Part II M.Sc. Jens Steinwandt Communications Research Laboratory Ilmenau University of Technology PO Box 100565 D-98684 Ilmenau, Germany jens.steinwandt@tu-ilmenau.de
More informationSubgradient. Acknowledgement: this slides is based on Prof. Lieven Vandenberghes lecture notes. definition. subgradient calculus
1/41 Subgradient Acknowledgement: this slides is based on Prof. Lieven Vandenberghes lecture notes definition subgradient calculus duality and optimality conditions directional derivative Basic inequality
More informationIntroduction to Mathematical Programming IE406. Lecture 10. Dr. Ted Ralphs
Introduction to Mathematical Programming IE406 Lecture 10 Dr. Ted Ralphs IE406 Lecture 10 1 Reading for This Lecture Bertsimas 4.1-4.3 IE406 Lecture 10 2 Duality Theory: Motivation Consider the following
More informationISM206 Lecture Optimization of Nonlinear Objective with Linear Constraints
ISM206 Lecture Optimization of Nonlinear Objective with Linear Constraints Instructor: Prof. Kevin Ross Scribe: Nitish John October 18, 2011 1 The Basic Goal The main idea is to transform a given constrained
More informationLECTURE 25: REVIEW/EPILOGUE LECTURE OUTLINE
LECTURE 25: REVIEW/EPILOGUE LECTURE OUTLINE CONVEX ANALYSIS AND DUALITY Basic concepts of convex analysis Basic concepts of convex optimization Geometric duality framework - MC/MC Constrained optimization
More informationLinear and non-linear programming
Linear and non-linear programming Benjamin Recht March 11, 2005 The Gameplan Constrained Optimization Convexity Duality Applications/Taxonomy 1 Constrained Optimization minimize f(x) subject to g j (x)
More informationThe Kuhn-Tucker Problem
Natalia Lazzati Mathematics for Economics (Part I) Note 8: Nonlinear Programming - The Kuhn-Tucker Problem Note 8 is based on de la Fuente (2000, Ch. 7) and Simon and Blume (1994, Ch. 18 and 19). The Kuhn-Tucker
More informationGeneralization to inequality constrained problem. Maximize
Lecture 11. 26 September 2006 Review of Lecture #10: Second order optimality conditions necessary condition, sufficient condition. If the necessary condition is violated the point cannot be a local minimum
More informationARE202A, Fall 2005 CONTENTS. 1. Graphical Overview of Optimization Theory (cont) Separating Hyperplanes 1
AREA, Fall 5 LECTURE #: WED, OCT 5, 5 PRINT DATE: OCTOBER 5, 5 (GRAPHICAL) CONTENTS 1. Graphical Overview of Optimization Theory (cont) 1 1.4. Separating Hyperplanes 1 1.5. Constrained Maximization: One
More informationNONLINEAR. (Hillier & Lieberman Introduction to Operations Research, 8 th edition)
NONLINEAR PROGRAMMING (Hillier & Lieberman Introduction to Operations Research, 8 th edition) Nonlinear Programming g Linear programming has a fundamental role in OR. In linear programming all its functions
More information4TE3/6TE3. Algorithms for. Continuous Optimization
4TE3/6TE3 Algorithms for Continuous Optimization (Duality in Nonlinear Optimization ) Tamás TERLAKY Computing and Software McMaster University Hamilton, January 2004 terlaky@mcmaster.ca Tel: 27780 Optimality
More informationConstrained Optimization
1 / 22 Constrained Optimization ME598/494 Lecture Max Yi Ren Department of Mechanical Engineering, Arizona State University March 30, 2015 2 / 22 1. Equality constraints only 1.1 Reduced gradient 1.2 Lagrange
More informationCS711008Z Algorithm Design and Analysis
CS711008Z Algorithm Design and Analysis Lecture 8 Linear programming: interior point method Dongbo Bu Institute of Computing Technology Chinese Academy of Sciences, Beijing, China 1 / 31 Outline Brief
More informationFinal Exam - Math Camp August 27, 2014
Final Exam - Math Camp August 27, 2014 You will have three hours to complete this exam. Please write your solution to question one in blue book 1 and your solutions to the subsequent questions in blue
More informationOptimization for Machine Learning
Optimization for Machine Learning (Problems; Algorithms - A) SUVRIT SRA Massachusetts Institute of Technology PKU Summer School on Data Science (July 2017) Course materials http://suvrit.de/teaching.html
More information1. f(β) 0 (that is, β is a feasible point for the constraints)
xvi 2. The lasso for linear models 2.10 Bibliographic notes Appendix Convex optimization with constraints In this Appendix we present an overview of convex optimization concepts that are particularly useful
More informationCONVEX FUNCTIONS AND OPTIMIZATION TECHINIQUES A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF
CONVEX FUNCTIONS AND OPTIMIZATION TECHINIQUES A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE IN MATHEMATICS SUBMITTED TO NATIONAL INSTITUTE OF TECHNOLOGY,
More informationSeptember Math Course: First Order Derivative
September Math Course: First Order Derivative Arina Nikandrova Functions Function y = f (x), where x is either be a scalar or a vector of several variables (x,..., x n ), can be thought of as a rule which
More informationOptimization. Charles J. Geyer School of Statistics University of Minnesota. Stat 8054 Lecture Notes
Optimization Charles J. Geyer School of Statistics University of Minnesota Stat 8054 Lecture Notes 1 One-Dimensional Optimization Look at a graph. Grid search. 2 One-Dimensional Zero Finding Zero finding
More informationConvex Optimization. Newton s method. ENSAE: Optimisation 1/44
Convex Optimization Newton s method ENSAE: Optimisation 1/44 Unconstrained minimization minimize f(x) f convex, twice continuously differentiable (hence dom f open) we assume optimal value p = inf x f(x)
More informationEconomics 101A (Lecture 3) Stefano DellaVigna
Economics 101A (Lecture 3) Stefano DellaVigna January 24, 2017 Outline 1. Implicit Function Theorem 2. Envelope Theorem 3. Convexity and concavity 4. Constrained Maximization 1 Implicit function theorem
More informationWritten Examination
Division of Scientific Computing Department of Information Technology Uppsala University Optimization Written Examination 202-2-20 Time: 4:00-9:00 Allowed Tools: Pocket Calculator, one A4 paper with notes
More informationCS 6820 Fall 2014 Lectures, October 3-20, 2014
Analysis of Algorithms Linear Programming Notes CS 6820 Fall 2014 Lectures, October 3-20, 2014 1 Linear programming The linear programming (LP) problem is the following optimization problem. We are given
More informationCONSTRAINED NONLINEAR PROGRAMMING
149 CONSTRAINED NONLINEAR PROGRAMMING We now turn to methods for general constrained nonlinear programming. These may be broadly classified into two categories: 1. TRANSFORMATION METHODS: In this approach
More information36106 Managerial Decision Modeling Linear Decision Models: Part II
1 36106 Managerial Decision Modeling Linear Decision Models: Part II Kipp Martin University of Chicago Booth School of Business January 20, 2014 Reading and Excel Files Reading (Powell and Baker): Sections
More information