Math 273a: Optimization


Math 273a: Optimization
Instructor: Wotao Yin
Department of Mathematics, UCLA
Fall 2015
Online discussions on piazza.com

What is mathematical optimization?

Optimization models the goal of solving a problem in the optimal way. Examples:
- Running a business: maximize profit, minimize loss, maximize efficiency, or minimize risk.
- Design: minimize the weight of a bridge/truss and maximize its strength, within the design constraints.
- Planning: select a flight route to minimize the time or fuel consumption of an airplane.

Formal definition: to minimize (or maximize) a real function by deciding the values of free variables from within an allowed set.

Optimization is an essential tool in life, business, and engineering. Examples:
- Walmart pricing and logistics
- Airplane engineering, such as shape design and material selection
- Packing millions of transistors in a computer chip in a functional way

Achieving these requires analyzing many related variables and possibilities, taking advantage of tiny opportunities. This course will:
- cover some of the sophisticated mathematics behind optimization
- be closer to reality than most other math courses

Status of optimization

- Last few decades: astonishing improvements in computer hardware and software, which motivated great leaps in optimization modeling, algorithm design, and implementation.
- Solving certain optimization problems has become a standard technique and everyday practice in business, science, and engineering.
- It is now possible to solve certain optimization problems with thousands, millions, and even billions of variables.
- Optimization has become part of the undergraduate curriculum.
- Optimization (along with statistics) has been the foundation of machine learning and big-data analytics.
- Matlab has two optimization toolboxes...

Ingredients of successful optimization:
- modeling: turn a problem into one of the typical optimization formulations
- algorithms: an (iterative) procedure that leads you toward a solution (most optimization problems do not have a closed-form solution)
- software and, for some problems, hardware implementation: realize the algorithms and return numerical solutions

First examples

- Find two nonnegative numbers whose sum is at most 6 so that their product is a maximum.
- Find the largest area of a rectangular region whose perimeter is no greater than 100.
- Given a sequence of n numbers that are not all negative, find two indices so that the sum of the numbers between them (inclusive) is a maximum.
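As a quick numerical sanity check (not part of the original notes), the first example can be checked by a coarse grid search; calculus (or the AM-GM inequality) gives x = y = 3 with product 9:

```python
# Grid search over pairs (x, y) with x + y <= 6 and x, y >= 0.
# The maximum product should occur at x = y = 3 (product 9).
best = (0.0, 0.0, 0.0)  # (product, x, y)
steps = 600
for i in range(steps + 1):
    x = 6 * i / steps
    for j in range(steps + 1):
        y = 6 * j / steps
        if x + y <= 6 and x * y > best[0]:
            best = (x * y, x, y)

print(best)  # close to (9.0, 3.0, 3.0)
```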

Optimization formulation

    minimize_x   f(x)
    subject to   x in C

- "minimize" is often abbreviated as "min"
- the decision variable is typically stated under "minimize", unless obvious
- "subject to" is often shortened to "s.t."
- in linear and nonlinear optimization, the feasible set C is represented by

      h_i(x) = b_i,   i in E   (equality constraints)
      g_j(x) <= b_j,  j in I   (inequality constraints)

A quadratic program (QP)

    minimize   f(x) = (x_1 - 1)^2 + (x_2 - 1)^2

Elements: decision variables, parameters, constraint, feasible set, objective.

A linearly constrained quadratic program (QP)

    minimize    f(x) = (x_1 - 1)^2 + (x_2 - 1)^2
    subject to  x_1 + x_2 = 3

Elements: decision variables, parameters, constraint, feasible set, objective.
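A minimal numerical sketch (not from the notes) for this constrained QP: eliminate the equality constraint by substituting x_2 = 3 - x_1, then run plain gradient descent on the reduced one-variable objective. The minimizer is x = (1.5, 1.5).

```python
# Solve: minimize (x1-1)^2 + (x2-1)^2  subject to  x1 + x2 = 3.
# Sketch: substitute x2 = 3 - x1, giving the reduced objective
#   g(x1) = (x1 - 1)^2 + (2 - x1)^2,  g'(x1) = 2(x1 - 1) - 2(2 - x1).

def grad(x1):
    return 2 * (x1 - 1) - 2 * (2 - x1)

x1 = 0.0          # arbitrary starting point
step = 0.1        # fixed step size, small enough for this quadratic
for _ in range(200):
    x1 -= step * grad(x1)

x2 = 3 - x1
print(x1, x2)     # converges to (1.5, 1.5)
```

Elimination works here because the constraint is a single linear equation; for general constraints one would use Lagrange multipliers or a constrained solver instead.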

Linear program (LP)

From Griva-Nash-Sofer 1.2:

    minimize    f(x) = -(2x_1 + x_2)
    subject to  x_1 + x_2 <= 1
                x_1 >= 0, x_2 >= 0

Elements: decision variables, constraints, feasible set, objective.
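A tiny sketch (not from the notes) of a basic LP fact: if an LP has an optimum, it is attained at a vertex of the feasible polygon. Reading the LP above as minimize -(2x_1 + x_2) subject to x_1 + x_2 <= 1, x_1, x_2 >= 0, the feasible region is a triangle whose vertices we can list by hand:

```python
# The feasible region of x1 + x2 <= 1, x1 >= 0, x2 >= 0 is a triangle
# with vertices (0,0), (1,0), (0,1). An LP optimum, if it exists,
# is attained at one of them, so we just compare objective values.
vertices = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]

def f(x1, x2):
    return -(2 * x1 + x2)

best = min(vertices, key=lambda v: f(*v))
print(best, f(*best))   # (1.0, 0.0) with objective -2.0
```

Enumerating vertices only works for toy problems; the simplex method walks between vertices systematically instead.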

Nonlinear program (NLP)

    minimize    f(x) = (x_1 + x_2)^2
    subject to  x_1 x_2 >= 0
                -2 <= x_1 <= 1
                -2 <= x_2 <= 1

Elements: decision variables, feasible set, objective, (box) constraints, global minimizer vs local minimizer.

Global vs local solution

- "Solution" means optimal solution.
- Global solution x*: f(x*) <= f(x) for all x in C.
- Local solution x*: there exists delta > 0 such that f(x*) <= f(x) for all x in C with ||x - x*|| <= delta.
- A (global or local) solution x* is unique if the inequality holds strictly as <.
- In general, it is difficult to tell whether a local solution is global, because algorithms can only check nearby points and have no clue about the behavior farther away. Hence, "a solution" may refer to a local solution.
- A local solution to a convex program is globally optimal. An LP is convex.
- A stationary point (where the derivative is zero) is also known as a solution, but it can be a maximizer, minimizer, or saddle point.
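A toy illustration (not from the notes, using an assumed example function): f(x) = x^4 - 3x^2 + x has two local minimizers, and gradient descent, which only checks nearby points, converges to whichever one its starting point is near -- it cannot tell a local solution from the global one.

```python
# f(x) = x^4 - 3x^2 + x has local minimizers near x ~ -1.30 (global)
# and x ~ 1.13 (local only). Gradient descent lands in whichever
# basin of attraction it starts in.

def f(x):
    return x**4 - 3 * x**2 + x

def fprime(x):
    return 4 * x**3 - 6 * x + 1

def descend(x, step=0.01, iters=5000):
    for _ in range(iters):
        x -= step * fprime(x)
    return x

left = descend(-1.0)    # lands near the global minimizer, x ~ -1.30
right = descend(1.0)    # lands near the merely local minimizer, x ~ 1.13
print(left, f(left))
print(right, f(right))
```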

This course

Most of this course focuses on finding local solutions and, for convex programs, global solutions. This may seem odd, but there are good reasons:
- Asking for global solutions is computationally intractable in general.
- Most global optimization algorithms (which often take a long time to run) seek the global solution by finding local solutions.
- Many useful problems are convex, that is, a local solution is global.
- In some applications, a local solution is an improvement over an existing point; local solutions are OK.

We do not cover discrete optimization

Most of this course focuses on problems with continuous variables, such as length, volume, strength, weight, and time. Many useful variables are discrete, such as a yes/no decision or the number of flights to dispatch. Discrete variables take binary or integer values, or values from a discrete set; such problems are called discrete optimization. In discrete problems, we cannot just check nearby points or simply round non-integers to integers (exceptions exist, though). Continuous optimization is solved as a subproblem in some discrete optimization methods, though not always.

Quiz

Question: There are n numbers a_1, ..., a_n with at least one a_i >= 0. Find 1 <= s <= t <= n so that sum_{i=s}^t a_i is a maximum.

Idea: create x in {0,1}^n of the form [0, ..., 0, 1, ..., 1, 0, ..., 0] such that sum_{i=s}^t a_i = sum_{i=1}^n a_i x_i.

Quiz

Question: There are n numbers a_1, ..., a_n with at least one a_i >= 0. Find 1 <= s <= t <= n so that sum_{i=s}^t a_i is a maximum.

Solution:

    maximize     sum_{i=1}^n a_i x_i
    over         x, y, z in {0,1}^n
    subject to   x_0 = x_{n+1} = 0
                 sum_{i=1}^n x_i >= 1                 (select at least one index)
                 x_i - x_{i-1} <= y_i,  i = 1,...,n   (y_i marks a 0-to-1 change)
                 x_i - x_{i+1} <= z_i,  i = 1,...,n   (z_i marks a 1-to-0 change)
                 sum_{i=1}^n y_i = 1
                 sum_{i=1}^n z_i = 1

Quiz

Combinatorial algorithm: there is an O(n) algorithm that solves the problem in a single pass over the data.
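The one-pass O(n) algorithm for the maximum-sum contiguous subarray is commonly known as Kadane's algorithm. A sketch in Python (not from the notes):

```python
# Kadane's algorithm: maximum-sum contiguous subarray in one pass, O(n).
# best_end tracks the best sum of a subarray ending at the current index.
def max_subarray(a):
    best = best_end = a[0]
    s = t = start = 0          # s, t: inclusive indices of the best subarray
    for i in range(1, len(a)):
        if best_end < 0:       # better to restart at i than to extend
            best_end = a[i]
            start = i
        else:
            best_end += a[i]
        if best_end > best:
            best = best_end
            s, t = start, i
    return best, s, t

print(max_subarray([2, -3, 4, -1, 2, -5, 1]))  # (5, 2, 4)
```

Compared with the integer-programming formulation above, this exploits the problem's structure directly; it is a reminder that a generic optimization model is not always the fastest route to a solution.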