Computational Optimization. Mathematical Programming Fundamentals 1/25 (revised)


1 Computational Optimization Mathematical Programming Fundamentals 1/25 (revised)

2 If you don't know where you are going, you probably won't get there. -from some book I read in eighth grade
If you do get there, you won't know it. -Dr. Bennett's amendment
Mathematical Programming Theory tells us: how to formulate a model, strategies for solving the model, how to know when we have found an optimal solution, and how hard it is to solve the model. Let's start with the basics.

3 Line Segment: Let x ∈ R^n and y ∈ R^n. The points on the line segment joining x and y are { z : z = λx + (1 − λ)y, 0 ≤ λ ≤ 1 }.

4 Convex Sets: A set S is convex if the line segment joining any two points in the set is also in the set, i.e., for any x, y ∈ S, λx + (1 − λ)y ∈ S for all 0 ≤ λ ≤ 1.
[Figure: examples of convex and non-convex sets]
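As a quick sanity check of this definition (a minimal sketch of my own, not from the slides; the sets and helper names are illustrative), one can sample pairs of points in a set S and test whether random convex combinations stay in S. Such a randomized test can only refute convexity, never prove it:

```python
import numpy as np

def is_probably_convex(in_S, sampler, trials=1000, rng=None):
    """Randomized check of convexity: draw x, y in S and test whether
    lam*x + (1-lam)*y stays in S. Can only refute convexity."""
    rng = np.random.default_rng(rng)
    for _ in range(trials):
        x, y = sampler(rng), sampler(rng)
        lam = rng.uniform()
        if not in_S(lam * x + (1 - lam) * y):
            return False
    return True

# Unit ball (convex) vs. union of two disjoint balls (not convex).
ball = lambda z: np.linalg.norm(z) <= 1.0
ball_sampler = lambda rng: rng.uniform(-1, 1, 2) * 0.5   # points inside the ball

union = lambda z: np.linalg.norm(z - 3) <= 1 or np.linalg.norm(z + 3) <= 1
union_sampler = lambda rng: rng.choice([-3.0, 3.0]) + rng.uniform(-0.5, 0.5, 2)

print(is_probably_convex(ball, ball_sampler))     # True
print(is_probably_convex(union, union_sampler))   # False (with high probability)
```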

5 Favorite Convex Sets
Circle (ball) with center c and radius r: { x : ‖x − c‖ ≤ r }
Linear equalities (planes): for A ∈ R^(m×n), b ∈ R^m, { x ∈ R^n : Ax = b }
Linear inequalities (polyhedrons): for A ∈ R^(m×n), b ∈ R^m, { x ∈ R^n : Ax ≤ b }

6 Convex Sets: Is the intersection of two convex sets convex? Yes. Is the union of two convex sets convex? No; for example, the union of two disjoint intervals on the real line is not convex.

7 Convex Functions: A function f is (strictly) convex on a convex set S if and only if for any x, y ∈ S, f(λx + (1 − λ)y) ≤ (<) λf(x) + (1 − λ)f(y) for all 0 ≤ λ ≤ 1 (for the strict case, 0 < λ < 1 and x ≠ y).
[Figure: the chord value λf(x) + (1 − λ)f(y) lies above the function value f(λx + (1 − λ)y)]
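A hedged numeric illustration of the defining inequality (my own sketch, not from the slides): search random triples (x, y, λ) for a violation, for f(t) = t² (convex, no violation) and f(t) = sin t (not convex on R, violation found quickly).

```python
import numpy as np

def violates_convexity(f, lo, hi, trials=10000, rng=None):
    """Search for x, y, lam with f(lam*x + (1-lam)*y) > lam*f(x) + (1-lam)*f(y).
    Returns a violating triple, or None if none was found."""
    rng = np.random.default_rng(rng)
    for _ in range(trials):
        x, y, lam = rng.uniform(lo, hi), rng.uniform(lo, hi), rng.uniform()
        if f(lam * x + (1 - lam) * y) > lam * f(x) + (1 - lam) * f(y) + 1e-12:
            return x, y, lam
    return None

print(violates_convexity(lambda t: t**2, -5, 5))    # None: no violation found
print(violates_convexity(np.sin, -5, 5) is None)    # False: sin is not convex
```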

8 Concave Functions: A function f is (strictly) concave on a convex set S if and only if −f is (strictly) convex on S.
[Figure: graphs of f and −f]

9 (Strictly) Convex, Concave, or none of the above?
[Figure: five example functions; answers: none of the above, concave, convex, concave, strictly convex]

10 Favorite Convex Functions
Linear functions: f(x) = w'x = Σ_{i=1..n} wi xi where x ∈ R^n, e.g., f(x1, x2) = x1 + x2.
Certain quadratic functions, depending on the choice of Q (the Hessian matrix): f(x) = x'Qx + w'x + c, e.g., f(x1, x2) = x1² + x2².
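Whether such a quadratic is convex comes down to Q being positive semidefinite. A small sketch (my own illustration, with made-up matrices) checks this via eigenvalues:

```python
import numpy as np

def quadratic_is_convex(Q, tol=1e-10):
    """f(x) = x'Qx + w'x + c is convex iff the symmetric part of Q is
    positive semidefinite, i.e., all of its eigenvalues are >= 0."""
    Qs = (Q + Q.T) / 2              # x'Qx depends only on the symmetric part
    return bool(np.all(np.linalg.eigvalsh(Qs) >= -tol))

print(quadratic_is_convex(np.eye(2)))             # True:  f = x1^2 + x2^2
print(quadratic_is_convex(np.diag([1.0, -1.0])))  # False: a saddle
```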

11 Convexity of the function affects the optimization algorithm.

12 Convexity of the constraints affects the optimization algorithm: min f(x) subject to x ∈ S.
[Figure: direction of steepest descent with S convex vs. S not convex]

13 Convex Program: min f(x) subject to x ∈ S, where f and S are convex. Convexity makes optimization nice; many practical problems are convex programs; convex programs are also used as subproblems for nonconvex programs.

14 Theorem: Global Solution of a Convex Program. If x* is a local minimizer of a convex programming problem, then x* is also a global minimizer. Furthermore, if the objective is strictly convex, then x* is the unique global minimizer. Proof: by contradiction.
[Figure: x* and a point y with f(y) < f(x*)]

15 Proof by contradiction: Suppose x* is a local but not a global minimizer, i.e., there exists y s.t. f(y) < f(x*). Then for all 0 < ε < 1,
f(εx* + (1 − ε)y) ≤ εf(x*) + (1 − ε)f(y) < εf(x*) + (1 − ε)f(x*) = f(x*).
As ε → 1, the point εx* + (1 − ε)y comes arbitrarily close to x*, so every neighborhood of x* contains points with strictly smaller objective value. Contradiction: x* is not a local min. You try the uniqueness claim in the strict case.

16 Problems with nonconvex objective: min f(x) subject to x ∈ [a, b]. If f is strictly convex, the problem has a unique global minimum; if f is not convex, the problem can have several local minima.
[Figure: strictly convex f with unique minimizer x* in [a, b]; nonconvex f with two local minima x and x*]

17 Problems with nonconvex set: min f(x) subject to x ∈ [a, b] ∪ [c, d].
[Figure: feasible set of two disjoint intervals; a local minimizer x in [a, b] and the global minimizer x* in [c, d]]

18 Multivariate Calculus
For x ∈ R^n, f(x) = f(x1, x2, x3, x4, ..., xn).
The gradient of f:
∇f(x) = ( ∂f(x)/∂x1, ∂f(x)/∂x2, ..., ∂f(x)/∂xn )
The Hessian of f is the n×n matrix of second partial derivatives:
∇²f(x) =
[ ∂²f(x)/∂x1²     ∂²f(x)/∂x1∂x2   ...   ∂²f(x)/∂x1∂xn ]
[ ∂²f(x)/∂x2∂x1   ∂²f(x)/∂x2²     ...   ∂²f(x)/∂x2∂xn ]
[ ...                                                  ]
[ ∂²f(x)/∂xn∂x1   ∂²f(x)/∂xn∂x2   ...   ∂²f(x)/∂xn²   ]
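A small numerical sketch (my own, not from the slides): approximate the gradient and Hessian by central differences. This is handy for checking hand-computed derivatives like the ones in the exercises below; the helper names are my own.

```python
import numpy as np

def num_grad(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def num_hess(f, x, h=1e-4):
    """Central-difference approximation of the Hessian of f at x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n); e[j] = h
        H[:, j] = (num_grad(f, x + e, h) - num_grad(f, x - e, h)) / (2 * h)
    return (H + H.T) / 2          # symmetrize to reduce rounding noise

# Example: f(x1, x2) = x1^2 + 3*x1*x2
f = lambda x: x[0]**2 + 3 * x[0] * x[1]
print(num_grad(f, [1.0, 2.0]))    # ~ [8, 3]
print(num_hess(f, [1.0, 2.0]))    # ~ [[2, 3], [3, 0]]
```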

19 For example: f(x) = x1⁴ + x2³ + e^(3x1) + 4x1x2
∇f(x) = ( 4x1³ + 3e^(3x1) + 4x2, 3x2² + 4x1 )'
∇²f(x) =
[ 12x1² + 9e^(3x1)   4   ]
[ 4                  6x2 ]
At x = [0, 1]': ∇f(x) = (7, 3)' and ∇²f(x) = [ 9  4 ; 4  6 ].

20 Quadratic Functions
Form: x ∈ R^n, Q ∈ R^(n×n), b ∈ R^n,
f(x) = (1/2) x'Qx − b'x = (1/2) Σ_{i=1..n} Σ_{j=1..n} Qij xi xj − Σ_{j=1..n} bj xj
Gradient:
∂f(x)/∂xk = Qkk xk + (1/2) Σ_{i≠k} Qik xi + (1/2) Σ_{j≠k} Qkj xj − bk
          = Σ_j Qkj xj − bk, assuming Q symmetric.
So ∇f(x) = Qx − b and ∇²f(x) = Q.
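A quick, self-contained check of these identities (an illustrative sketch with a random symmetric Q, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
Q = rng.normal(size=(n, n)); Q = (Q + Q.T) / 2    # random symmetric Q
b = rng.normal(size=n)
x = rng.normal(size=n)

f = lambda z: 0.5 * z @ Q @ z - b @ z

# Central-difference gradient to compare against the closed form Qx - b.
h, g_fd = 1e-6, np.zeros(n)
for i in range(n):
    e = np.zeros(n); e[i] = h
    g_fd[i] = (f(x + e) - f(x - e)) / (2 * h)

print(np.allclose(g_fd, Q @ x - b, atol=1e-5))    # True: gradient is Qx - b
```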

21 Taylor Series Expansion about x* - 1D Case. Let x = x* + p. Then
f(x) = f(x* + p) = f(x*) + p f'(x*) + (1/2) p² f''(x*) + (1/3!) p³ f'''(x*) + ... + (1/n!) pⁿ f⁽ⁿ⁾(x*) + ...
Equivalently,
f(x) = f(x*) + (x − x*) f'(x*) + (1/2)(x − x*)² f''(x*) + (1/3!)(x − x*)³ f'''(x*) + ... + (1/n!)(x − x*)ⁿ f⁽ⁿ⁾(x*) + ...

22 Taylor Series Example: Let f(x) = exp(−x); compute the Taylor series expansion about x* = 0.
f(x) = f(x*) + (x − x*) f'(x*) + (1/2)(x − x*)² f''(x*) + (1/3!)(x − x*)³ f'''(x*) + ... + (1/n!)(x − x*)ⁿ f⁽ⁿ⁾(x*) + ...
     = e^(−x*) − (x − x*) e^(−x*) + (1/2)(x − x*)² e^(−x*) − (1/3!)(x − x*)³ e^(−x*) + ... + (−1)ⁿ (1/n!)(x − x*)ⁿ e^(−x*) + ...
     = 1 − x + x²/2 − x³/3! + ... + (−1)ⁿ xⁿ/n! + ...
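A short numeric illustration (mine, not the lecture's): the partial sums of this series converge to exp(−x).

```python
import math

def taylor_exp_neg(x, n_terms):
    """Partial sum of the Taylor series of exp(-x) about 0."""
    return sum((-x) ** k / math.factorial(k) for k in range(n_terms))

x = 1.5
for n in (2, 4, 8, 16):
    print(n, taylor_exp_neg(x, n), math.exp(-x))
# The partial sums approach exp(-1.5) ~ 0.22313 as n grows.
```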

23 First Order Taylor Series Approximation. Let x = x* + p. Then
f(x) = f(x* + p) = f(x*) + p'∇f(x*) + ‖p‖ α(x*, p), where lim_{p→0} α(x*, p) = 0.
Says that a linear approximation of a function works well locally:
f(x) ≈ f(x*) + (x − x*)'∇f(x*)
[Figure: f and its tangent (linear) approximation at x*]

24 Second Order Taylor Series Approximation. Let x = x* + p. Then
f(x) = f(x* + p) = f(x*) + p'∇f(x*) + (1/2) p'∇²f(x*) p + ‖p‖² α(x*, p), where lim_{p→0} α(x*, p) = 0.
Says that a quadratic approximation of a function works even better locally:
f(x) ≈ f(x*) + (x − x*)'∇f(x*) + (1/2)(x − x*)'∇²f(x*)(x − x*)
[Figure: f and its quadratic approximation at x*]
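A sketch of the error scaling (my own example, f(x) = eˣ at x* = 0): the first-order error shrinks like p², the second-order error like p³.

```python
import math

f, x_star = math.exp, 0.0
f0, g0, H0 = 1.0, 1.0, 1.0           # f, f', f'' of exp at 0

for p in (1e-1, 1e-2, 1e-3):
    lin = f0 + p * g0                # first order approximation
    quad = lin + 0.5 * H0 * p * p    # second order approximation
    print(p, abs(f(x_star + p) - lin), abs(f(x_star + p) - quad))
# First error column ~ p^2 / 2; second ~ p^3 / 6.
```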

25 Theorem.1 Taylor s Theorem version Suppose f is cont diff, f ( x+ p) = f( x) + f( x+ tp)' p for some t [0,1]. If f is twice cont. diff, f ( x+ p) = f( x) + f( x)' p+ p' f( x+ tp)' p for some t [0,1]. 1 Also called Mean Value Theorem

26 Taylor Series Approximation Exercise. Consider the function f(x1, x2) = x1³ + 5x1²x2 + 7x1x2² + 2x2² and x* = [−2, 3]'.
Compute the gradient and Hessian.
What is the first order TSA about x*?
What is the second order TSA about x*?
Evaluate both TSAs at y = [−1.9, 3.2]' and compare with f(y).

27 Exercise (fill in)
function: f(x1, x2) = x1³ + 5x1²x2 + 7x1x2² + 2x2²
gradient: ∇f(x) = ?    ∇f(x*) = [ , ]'
Hessian: ∇²f(x) = ?    ∇²f(x*) = ?
First order TSA: g(x) = f(x*) + (x − x*)'∇f(x*) = ?
Second order TSA: h(x) = f(x*) + (x − x*)'∇f(x*) + (1/2)(x − x*)'∇²f(x*)(x − x*) = ?
f(y) − g(y) = ?    f(y) − h(y) = ?

28 Exercise
function: f(x1, x2) = x1³ + 5x1²x2 + 7x1x2² + 2x2²
gradient: ∇f(x) = ( 3x1² + 10x1x2 + 7x2², 5x1² + 14x1x2 + 4x2 )'    ∇f(x*) = [15, −52]'
Hessian:
∇²f(x) =
[ 6x1 + 10x2    10x1 + 14x2 ]        ∇²f(x*) = [ 18   22 ]
[ 10x1 + 14x2   14x1 + 4    ]                  [ 22  −24 ]

29 Exercise
First order TSA: g(x) = f(x*) + (x − x*)'∇f(x*)
Second order TSA: h(x) = f(x*) + (x − x*)'∇f(x*) + (1/2)(x − x*)'∇²f(x*)(x − x*)
f(y) − g(y) = −64.811 − (−64.9) = 0.089
f(y) − h(y) = −64.811 − (−64.85) = 0.039
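These numbers can be checked mechanically; the following small script (my own verification, not part of the slides) reproduces them:

```python
import numpy as np

f = lambda x: x[0]**3 + 5*x[0]**2*x[1] + 7*x[0]*x[1]**2 + 2*x[1]**2
grad = lambda x: np.array([3*x[0]**2 + 10*x[0]*x[1] + 7*x[1]**2,
                           5*x[0]**2 + 14*x[0]*x[1] + 4*x[1]])
hess = lambda x: np.array([[6*x[0] + 10*x[1], 10*x[0] + 14*x[1]],
                           [10*x[0] + 14*x[1], 14*x[0] + 4]])

xs = np.array([-2.0, 3.0])                # expansion point x*
y = np.array([-1.9, 3.2])
d = y - xs

g = f(xs) + d @ grad(xs)                  # first order TSA evaluated at y
h = g + 0.5 * d @ hess(xs) @ d            # second order TSA evaluated at y
print(grad(xs), hess(xs))                 # [15. -52.] and [[18 22], [22 -24]]
print(f(y) - g, f(y) - h)                 # ~0.089 and ~0.039
```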

30 General Optimization Algorithm
Specify some initial guess x0.
For k = 0, 1, 2, ...
  If xk is optimal, then stop.
  Determine a descent direction pk.
  Determine an improved estimate of the solution: x_{k+1} = xk + λk pk.
The last step is a one-dimensional search problem called a line search; a sketch of the whole template follows.
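As an illustration of this template (a minimal sketch assuming the steepest-descent direction and a backtracking line search; not claimed to be the specific algorithm used in the course):

```python
import numpy as np

def descent(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic descent method: negative-gradient direction plus a
    backtracking line search enforcing sufficient decrease."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:     # "if x_k is optimal then stop"
            break
        p = -g                          # descent direction
        lam = 1.0
        while f(x + lam * p) > f(x) + 1e-4 * lam * g @ p:
            lam *= 0.5                  # backtrack until sufficient decrease
        x = x + lam * p
    return x

# Minimize f(x) = (x1 - 1)^2 + 10*(x2 + 2)^2 starting from the origin.
f = lambda x: (x[0] - 1)**2 + 10 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
print(descent(f, grad, [0.0, 0.0]))     # ~ [1, -2]
```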

31 Descent Directions: If the directional derivative is negative, ∇f(x)'d < 0, then a line search along d will lead to a decrease in the function.
[Figure: gradient [8, 2] and descent direction [0, −1], with ∇f(x)'d < 0]

32 Descent directions create decrease. Let d'∇f(x) < 0; then there exists λ̄ > 0 such that f(x + λd) < f(x) for 0 < λ ≤ λ̄.
Proof: f(x + λd) = f(x) + λ d'∇f(x) + λ‖d‖ α(x, λd), so
( f(x + λd) − f(x) ) / λ = d'∇f(x) + ‖d‖ α(x, λd) < 0
for λ sufficiently small, since d'∇f(x) < 0 and α(x, λd) → 0.

33 Negative Gradient. An important fact to know is that the negative gradient always points downhill. Let d = −∇f(x) (with ∇f(x) ≠ 0); then there exists λ̄ > 0 such that f(x + λd) < f(x) for 0 < λ ≤ λ̄.
Proof: f(x + λd) = f(x) + λ d'∇f(x) + λ‖d‖ α(x, λd), so
( f(x + λd) − f(x) ) / λ = d'∇f(x) + ‖d‖ α(x, λd) < 0
for λ sufficiently small, since d'∇f(x) < 0 and α(x, λd) → 0.

34 [Figure-only slide]

35 Notes on the negative gradient: If the gradient is nonzero, then the negative gradient defines a descent direction. By substitution of d = −∇f(x):
d'∇f(x) = −∇f(x)'∇f(x) < 0 if ∇f(x) ≠ 0.

36 Directional Derivative
f'(x; d) = lim_{λ→0} ( f(x + λd) − f(x) ) / λ = ∇f(x)'d
The limit always exists when the function is convex.
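A last small sketch (my own example): the difference quotient approaches ∇f(x)'d as λ shrinks.

```python
import numpy as np

f = lambda x: x[0]**2 + 3 * x[0] * x[1]            # sample smooth function
grad = lambda x: np.array([2 * x[0] + 3 * x[1], 3 * x[0]])

x = np.array([1.0, 2.0])
d = np.array([0.0, -1.0])

exact = grad(x) @ d                                 # directional derivative
for lam in (1e-1, 1e-3, 1e-5):
    print(lam, (f(x + lam * d) - f(x)) / lam, exact)
# The quotient tends to grad(x)'d = -3 as lam -> 0.
```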

37 Assignment Read chapter 3 in NW
