IOE 611/Math 663: Nonlinear Programming

1. Introduction
- course logistics
- mathematical optimization
- least-squares and linear programming
- convex optimization
- example
- course goals and topics
- nonlinear optimization
- brief history of convex optimization

IOE 611/Math 663: Nonlinear Programming
- Lectures: Tue and Thu, 12:00-1:20 PM, 1005 Dow
- Instructors: Marina A. Epelman and Yiling Zhang (GSI); office hours as announced on Canvas, and by appointment
- Textbook: Convex Optimization by Stephen Boyd and Lieven Vandenberghe
- Unrestricted resources (excluding assignments, exams, solutions, and grades) and the calendar are at www-personal.umich.edu/~mepelman/teaching/ioe611/; everything else, plus announcements, is on Canvas
- Required background: working knowledge of linear algebra and analysis (review Appendix A of the book), a level of mathematical maturity, and willingness to program in Matlab; exposure to numerical computing, optimization, and its application fields is helpful but not required
- Please read the syllabus for complete course policies

Course logistics
- Lectures:
  - Slides posted prior to lectures
  - In class, a combination of slides (bring a copy, preferably one you can write on) and board writing (take notes! photographing the board is strongly discouraged)
- Problem sets weekly or so, including some computer modeling and programming
  - Not graded; discussed in weekly review sessions (times TBD via online survey)
  - Crucial in preparation for...
- Four take-home exams (20%, 20%, 30%, 30%)
  - Written solutions must be typed; LaTeX highly recommended
  - Computer code, when appropriate, uploaded to Canvas
  - For each exam, choose 60 consecutive hours within a pre-announced week-long period

Mathematical optimization

A (mathematical) optimization problem has the form

    minimize    f_0(x)
    subject to  f_i(x) ≤ b_i,  i = 1, ..., m

- x = (x_1, ..., x_n): optimization variables (or decision variables)
- f_0 : R^n → R: objective function
- f_i : R^n → R, i = 1, ..., m: constraint functions

An optimal solution x* has the smallest value of f_0 among all vectors that are feasible, i.e., that satisfy the constraints.
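To make the notation concrete, here is a tiny made-up instance of this standard form in Python; the functions, data, and candidate points below are purely illustrative and not part of the course materials.

```python
import numpy as np

# A made-up instance with n = 2 variables and m = 2 constraints:
#   minimize    f0(x) = (x1 - 1)^2 + (x2 - 2)^2
#   subject to  f1(x) = x1 + x2 <= 3,   f2(x) = -x1 <= 0
def f0(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

fs = [lambda x: x[0] + x[1], lambda x: -x[0]]   # constraint functions f_i
b = [3.0, 0.0]                                  # right-hand sides b_i

def is_feasible(x):
    return all(fi(x) <= bi for fi, bi in zip(fs, b))

# Compare a few candidate points: feasible points compete on f0 only.
for x in [np.array([1.0, 2.0]), np.array([1.0, 1.5]), np.array([-1.0, 0.0])]:
    print(x, "feasible:", is_feasible(x), "f0(x) =", f0(x))
```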

Examples

portfolio optimization: find the best way to invest in a set of n assets
- variables: amounts invested in different assets
- constraints: budget, max./min. investment per asset, minimum return
- objective: risk

device sizing in electronic circuits: choose the widths and lengths of devices in a circuit
- variables: device widths and lengths
- constraints: manufacturing limits, timing requirements, maximum area
- objective: power consumption

data fitting: from a family of potential models, find the model that best fits observed data and prior information
- variables: model parameters
- constraints: prior information, parameter limits
- objective: measure of misfit or prediction error

Solving optimization problems

solution method
- A solution method for a class of problems is an analytical solution or an algorithm that computes the solution (to a given accuracy), given an instance of a problem from the class.

general optimization problem
- very difficult to solve
- methods involve some compromise, e.g., very long computation time, or not always finding the solution

exceptions: certain problem classes can be solved efficiently and reliably
- least-squares problems
- linear programming problems
- convex optimization problems

Least-squares minimize kax bk 2 2 A 2 R k n, k n, rank(a) =n solving least-squares problems I analytical solution: x? =(A T A) 1 A T b I reliable and e cient algorithms and software I computation time proportional to n 2 k;lessifstructured I a mature technology using least-squares I least-squares problems are easy to recognize I a few standard techniques increase flexibility (e.g., including weights, adding regularization terms) IOE 611: Nonlinear Programming, Fall 2018 1. Introduction Page 1 7 Linear programming minimize c T x subject to ai T x apple b i, i =1,...,m solving linear programs I no analytical formula for solution I reliable and e cient algorithms and software I computation time in practice proportional to n 2 m if m n; less with structure I a mature technology using linear programming I not as easy to recognize as least-squares problems I a few standard tricks used to convert problems into linear programs (e.g., problems involving `1- or`1-norms, convex piecewise-linear functions) IOE 611: Nonlinear Programming, Fall 2018 1. Introduction Page 1 8

Convex optimization problem

    minimize    f_0(x)
    subject to  f_i(x) ≤ b_i,  i = 1, ..., m

- objective and constraint functions are convex: for i = 0, 1, ..., m,

      f_i(αx + βy) ≤ α f_i(x) + β f_i(y)

  for any x, y ∈ R^n and any α ≥ 0, β ≥ 0 such that α + β = 1
- includes least-squares problems and linear programs as special cases

solving convex optimization problems
- no analytical solution
- reliable and efficient algorithms
- computation time (roughly) proportional to max{n^3, n^2 m, F}, where F is the cost of evaluating the f_i's and their first and second derivatives
- almost a technology

using convex optimization
- often difficult to recognize
- many tricks for transforming problems into convex form
- surprisingly many problems can be solved via convex optimization
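The "almost a technology" claim is easiest to appreciate through a convex modeling tool. Here is a hypothetical sketch using the Python package CVXPY (not part of this course's materials, which are Matlab-based) on a small convex problem, a box-constrained least-squares fit, with random placeholder data.

```python
import cvxpy as cp
import numpy as np

# Random placeholder data for a convex problem that is neither a pure
# least-squares problem nor an LP: least-squares objective, box constraints.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

x = cp.Variable(5)
objective = cp.Minimize(cp.sum_squares(A @ x - b))   # convex objective
constraints = [x >= 0, x <= 1]                       # affine constraints
prob = cp.Problem(objective, constraints)
prob.solve()
print("optimal value:", prob.value)
print("optimal x:", x.value)
```

Since least-squares problems and linear programs are special cases of convex problems, the same few lines of modeling code cover those as well.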

Example

m lamps illuminating n (small, flat) patches

[Figure: lamp j with power p_j at distance r_kj and angle θ_kj from patch k, which receives illumination I_k]

The intensity I_k at patch k depends linearly on the lamp powers p_j:

    I_k = Σ_{j=1}^{m} a_kj p_j,   a_kj = r_kj^{−2} max{cos θ_kj, 0}

problem: achieve a desired illumination I_des with bounded lamp powers

    minimize    f_0(I) = max_{k=1,...,n} |log I_k − log I_des|
    subject to  I_k = Σ_{j=1}^{m} a_kj p_j,  k = 1, ..., n
                0 ≤ p_j ≤ p_max,  j = 1, ..., m

How to solve? (using techniques we know, or ad hoc approaches)

1. use uniform power: p_j = p, vary p
2. use least-squares:

       minimize  Σ_{k=1}^{n} (I_k − I_des)^2

   and round p_j if p_j > p_max or p_j < 0
3. use weighted least-squares:

       minimize  Σ_{k=1}^{n} (I_k − I_des)^2 + Σ_{j=1}^{m} w_j (p_j − p_max/2)^2

   iteratively adjusting the weights w_j until 0 ≤ p_j ≤ p_max
4. use linear programming:

       minimize    max_{k=1,...,n} |I_k − I_des|
       subject to  0 ≤ p_j ≤ p_max,  j = 1, ..., m

   which can be solved via linear programming

Of course, these are approximate (suboptimal) solutions.
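As a concrete illustration of heuristic 2 above, here is a small numpy sketch; the matrix A is a random placeholder for the coefficients a_kj, which in the real problem would come from the lamp/patch geometry.

```python
import numpy as np

# Made-up data: m lamps, n patches; A[k, j] stands in for
# a_kj = r_kj^{-2} * max{cos(theta_kj), 0}.
rng = np.random.default_rng(0)
m, n = 10, 30
A = rng.uniform(0.1, 1.0, (n, m))
I_des, p_max = 1.0, 1.0

# Heuristic 2: least-squares fit of A p ~= I_des * 1, then clip p into [0, p_max].
p_ls, *_ = np.linalg.lstsq(A, np.full(n, I_des), rcond=None)
p = np.clip(p_ls, 0.0, p_max)

# Evaluate the true objective max_k |log I_k - log I_des| at the clipped powers.
I = A @ p
print("objective at clipped least-squares powers:",
      np.max(np.abs(np.log(I) - np.log(I_des))))
```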

5. use convex optimization: the problem is equivalent to

       minimize    f_0(I) = max_{k=1,...,n} h(I_k / I_des)
       subject to  I_k = Σ_{j=1}^{m} a_kj p_j,  k = 1, ..., n
                   0 ≤ p_j ≤ p_max,  j = 1, ..., m

   with h(u) = max{u, 1/u}

   [Figure: graph of h(u) = max{u, 1/u} for 0 < u ≤ 4; h attains its minimum value 1 at u = 1]

   f_0 is convex because the maximum of convex functions is convex; the exact solution is obtained with effort that is a modest factor times the least-squares effort

additional constraints: does adding (1) or (2) below complicate the problem?

1. no more than half of total power is in any 10 lamps
2. no more than half of the lamps are on (p_j > 0)

- answer:
- moral: (untrained) intuition doesn't always work; without the proper background, very easy problems can appear quite similar to very difficult problems
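To make item 5 concrete, here is a hypothetical sketch of the equivalent convex formulation using the Python package CVXPY (again, not part of the course, which uses Matlab); A, I_des, and p_max are the same kind of random placeholders as in the earlier sketch.

```python
import cvxpy as cp
import numpy as np

# Random placeholders standing in for a_kj = r_kj^{-2} * max{cos(theta_kj), 0}.
rng = np.random.default_rng(0)
m, n = 10, 30
A = rng.uniform(0.1, 1.0, (n, m))
I_des, p_max = 1.0, 1.0

p = cp.Variable(m)
I = A @ p
# Objective f0(I) = max_k h(I_k / I_des) with h(u) = max{u, 1/u};
# cp.inv_pos(I) models 1/I_k on I_k > 0, and the maximum of convex
# functions is convex, so the model passes CVXPY's convexity rules.
objective = cp.Minimize(cp.max(cp.maximum(I / I_des, I_des * cp.inv_pos(I))))
constraints = [p >= 0, p <= p_max]
prob = cp.Problem(objective, constraints)
prob.solve()
print("optimal value of f0:", prob.value)
print("optimal lamp powers:", p.value)
```

Unlike the four heuristics, this returns the exact optimum of the illumination problem (up to solver tolerance).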

Course goals and topics

goals
1. recognize/formulate problems (such as the illumination problem) as convex optimization problems
2. develop code for problems of moderate size (e.g., 1000 lamps, 5000 patches)
3. characterize the optimal solution (optimal power distribution), give limits of performance, etc.

topics
1. convex sets, functions, optimization problems
2. examples and applications
3. algorithms

Nonlinear optimization

Traditional techniques for general nonconvex problems involve compromises.

local optimization methods (nonlinear programming)
- find a point that minimizes f_0 among feasible points near it; useful if the goal is to find a good point
- fast, can handle large problems
- require an initial guess
- provide no information about distance to the (global) optimum

global optimization methods
- find the (global) solution
- worst-case complexity grows exponentially with problem size

These algorithms are often based on solving convex subproblems.

Brief history of convex optimization

theory (convex analysis): 1900-1970

algorithms
- 1947: simplex algorithm for linear programming (Dantzig)
- 1960s: early interior-point methods (Fiacco & McCormick, Dikin, ...)
- 1970s: ellipsoid method and other subgradient methods
- 1980s: polynomial-time interior-point methods for linear programming (Karmarkar 1984)
- late 1980s-now: polynomial-time interior-point methods for nonlinear convex optimization (Nesterov & Nemirovski 1994)
- mid 1990s-now: accelerated first-order methods for large-scale and non-smooth convex optimization (Nesterov 2005)

applications
- before 1990: mostly in operations research; few in engineering
- since 1990: many new applications in engineering (control, signal processing, communications, circuit design, ...); new problem classes (semidefinite and second-order cone programming, robust optimization)