Basic Concepts in Optimization Part I: Local and Global Optima


Outline

Basic Concepts in Optimization Part I: Local and Global Optima
Benoît Chachuat <benoit@mcmaster.ca>
McMaster University, Department of Chemical Engineering
Optimization in Chemical Engineering

- Local and Global Optima
- Numerical Methods: Improving Search
- Notions of Convexity

Local Optima

Neighborhood
The neighborhood N_δ(x*) of a point x* consists of all nearby points, that is, all points within a small distance δ > 0 of x*:
    N_δ(x*) = { x : ||x - x*|| < δ }

Local Optimum
A point x* is a [strict] local minimum for the function f : IR^n → IR on the set S if it is feasible (x* ∈ S) and if sufficiently small neighborhoods surrounding it contain no points that are both feasible and [strictly] lower in objective value:
    ∃ δ > 0 : f(x*) ≤ f(x), ∀ x ∈ S ∩ N_δ(x*)
    [ ∃ δ > 0 : f(x*) < f(x), ∀ x ∈ S ∩ N_δ(x*) \ {x*} ]

(Figure: illustration of a strict local minimum x*, with f(x*) < f(x) for all x ∈ S ∩ N_δ(x*) \ {x*}.)
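The local-minimum definition above is easy to probe numerically on a small example. Below is a minimal sketch (added here, not part of the original slides) that samples points in a δ-neighborhood of a candidate point and checks the defining inequality; it assumes an unconstrained problem (S = IR^n), uses NumPy, and takes an illustrative quadratic objective whose minimizer is known.

```python
import numpy as np

def is_local_min_candidate(f, x_star, delta=1e-3, n_samples=1000, strict=False, seed=0):
    """Sample points in the delta-neighborhood of x_star and check that none
    has a (strictly) lower objective value. Assumes S = IR^n (no constraints)."""
    rng = np.random.default_rng(seed)
    x_star = np.asarray(x_star, dtype=float)
    f_star = f(x_star)
    for _ in range(n_samples):
        # random point x with ||x - x_star|| < delta
        direction = rng.normal(size=x_star.shape)
        direction /= np.linalg.norm(direction)
        x = x_star + rng.uniform(0.0, delta) * direction
        if strict:
            if f(x) <= f_star and not np.allclose(x, x_star):
                return False
        elif f(x) < f_star:
            return False
    return True

# Illustrative example: x* = (1, 2) for f(x) = (x1 - 1)^2 + (x2 - 2)^2
f = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.0)**2
print(is_local_min_candidate(f, [1.0, 2.0], strict=True))   # True (up to sampling)
```

A sampling check like this can only provide evidence, not a proof; it is meant to make the quantifiers in the definition concrete.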

Global Optima

Global Optimum
A point x* is a [strict] global minimum for the function f : IR^n → IR on the set S if it is feasible (x* ∈ S) and if no other feasible solution has a [strictly] lower objective value:
    f(x*) ≤ f(x), ∀ x ∈ S
    [ f(x*) < f(x), ∀ x ∈ S \ {x*} ]

(Figure: illustration of a strict global minimum x*.)

Remarks:
- Global minima are always local minima.
- Local minima may not be global minima.
- Analogous definitions hold for local/global optima of maximization problems.

Global vs. Local Optima
Class Exercise: Identify the various types of minima and maxima for f on S = [x_min, x_max]. (Figure: graph of f on the interval [x_min, x_max].)

How to Find Optima? Review: Three Methods for Optimization
- Graphical Solutions: great display, and multiple optima can be seen; but impractical for nearly all practical problems.
- Analytical Solutions (e.g., Newton, Euler, etc.): exact solution and easy analysis of changes in (uncertain) parameters; but not possible for most practical problems.
- Numerical Solutions: the only practical method for complex models; but only local optima are guaranteed, and assessing the effect of (uncertain) parameters is challenging.
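The remark that numerical solutions only guarantee local optima can be illustrated with a small experiment. The sketch below (an illustration added here, not part of the slides) runs SciPy's general-purpose local solver scipy.optimize.minimize from two different starting points on an example one-dimensional objective with two local minima; the objective and the starting points are arbitrary choices for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

# A one-dimensional objective with two local minima (illustrative choice)
f = lambda x: x[0]**4 - 3.0*x[0]**2 + x[0]

for x0 in (-2.0, 2.0):
    res = minimize(f, [x0])
    print(f"start {x0:+.1f} -> x = {res.x[0]:+.4f}, f = {res.fun:.4f}")

# The two starts converge to different local minima; the numerical method alone
# cannot certify which one (if either) is the global minimum.
```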

Numerical Optimization: The Dilemma!

Consider an optimization problem in two variables,
    min_{x1, x2} f(x1, x2),
for a given objective f (a surface plot of f is shown on the slide).

Typically, only some local information about the objective function is known, usually at a current point x = (x1, x2). Question: which move do I make next? (Figure: contour plot of f(x1, x2) with the current point marked.)

Numerical Optimization: The Basic Approach

Improving Search
Improving search methods are numerical algorithms that begin at a feasible solution to a given optimization model and advance along a search path of feasible points with ever-improving objective value.

Direction-Step Paradigm
At the current point x(k), how do I decide:
- the direction of change?
- the magnitude of change?
- whether further improvement is possible?

The Basic Equation
Improving search advances from the current point x(k) to a new point x(k+1) as
    x(k+1) = x(k) + α Δx,
i.e., component-wise, x_i(k+1) = x_i(k) + α Δx_i for i = 1, ..., n, where:
- Δx defines a move direction of solution change at x(k) (Δx ≠ 0);
- α > 0 determines a move magnitude, i.e., how far to pursue this direction.
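One concrete instance of the direction-step update x(k+1) = x(k) + α Δx is steepest descent, where Δx = -∇f(x(k)) and α is a fixed step size. The slides leave the choice of direction and step open, so the sketch below is only a minimal illustration under these assumptions; the quadratic example objective is likewise an assumption.

```python
import numpy as np

def improving_search(grad_f, x0, alpha=0.1, tol=1e-6, max_iter=1000):
    """Direction-step paradigm with the steepest-descent choice dx = -grad f(x)
    and a fixed step size alpha (one possible instance, not the general scheme)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        dx = -grad_f(x)                 # move direction at x(k)
        if np.linalg.norm(dx) < tol:    # no further improvement detected: stop
            break
        x = x + alpha * dx              # x(k+1) = x(k) + alpha * dx
    return x, k

# Illustrative example: minimize f(x) = (x1 - 1)^2 + (x2 - 2)^2
grad_f = lambda x: np.array([2.0*(x[0] - 1.0), 2.0*(x[1] - 2.0)])
x_opt, iters = improving_search(grad_f, x0=[0.0, 0.0])
print(x_opt, iters)   # converges to approximately (1, 2)
```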

Direction of Change: Improving Directions

A vector Δx ∈ IR^n is an improving direction at the current point x(k) if the objective function value at x(k) + α Δx is superior to that of x(k) for all α > 0 sufficiently small (stated here for a maximization problem):
    ∃ ᾱ > 0 : f(x(k) + α Δx) > f(x(k)), ∀ α ∈ (0, ᾱ]

(Figure: contour plot showing the current point, an improving direction, and the sets of improving directions at x(k) and at x(k+1).)

Direction of Change (cont'd): Feasible Directions

A vector Δx ∈ IR^n is a feasible direction at the current point x(k) if the point x(k) + α Δx violates no model constraint for all α > 0 sufficiently small:
    ∃ ᾱ > 0 : x(k) + α Δx ∈ S, ∀ α ∈ (0, ᾱ]

(Figure: feasible region with the sets of feasible directions and improving directions at two points.)

Optimality Criterion
Necessary Condition of Optimality (NCO): no optimization model solution at which an improving feasible direction is available can be a local optimum.
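Both definitions can be checked numerically at a given point by testing the defining inequalities over a range of small α values. The sketch below is a finite-sample proxy for "all α sufficiently small" (an illustration added here, not part of the slides); the example objective, constraint, point, and directions are assumptions, with constraints written in the g(x) ≤ 0 convention.

```python
import numpy as np

def is_improving_direction(f, x, dx, alphas=np.logspace(-6, -2, 20), minimize=True):
    """Check f(x + alpha*dx) < f(x) (or > for maximization) over small alpha > 0."""
    fx = f(x)
    vals = np.array([f(x + a*dx) for a in alphas])
    return bool(np.all(vals < fx)) if minimize else bool(np.all(vals > fx))

def is_feasible_direction(constraints, x, dx, alphas=np.logspace(-6, -2, 20)):
    """constraints: list of functions g, with feasibility meaning g(x) <= 0."""
    return all(all(g(x + a*dx) <= 0 for g in constraints) for a in alphas)

# Illustrative example at x(k) = (1, 1): f(x) = x1^2 + x2^2, constraint x1 + x2 - 2 <= 0
f = lambda x: x[0]**2 + x[1]**2
g = [lambda x: x[0] + x[1] - 2.0]
x_k = np.array([1.0, 1.0])
print(is_improving_direction(f, x_k, np.array([-1.0, -1.0])))   # True (objective decreases)
print(is_feasible_direction(g, x_k, np.array([1.0, 1.0])))      # False (leaves the feasible set)
```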

Continuous Improving Search Algorithm

Step 0: Initialization. Choose any starting feasible point x(0) and let the index k = 0.
Step 1: Move Direction. If no improving feasible direction exists at the current point x(k), stop. Otherwise, construct an improving feasible direction at x(k) as Δx(k+1).
Step 2: Step Size. If there is no limit on the step sizes for which direction Δx(k+1) continues to both improve the objective function and retain feasibility, stop: the model is unbounded. Otherwise, choose the largest such step size α(k+1).
Step 3: Update. Set x(k+1) = x(k) + α(k+1) Δx(k+1), increment the index k = k + 1, and return to Step 1.

A Word of Caution!
A point at which no improving feasible direction is available may not be a local optimum! (Figure: feasible region where the sets of feasible directions and improving directions do not intersect at a non-optimal point.)
Remarks:
- This basic algorithm may terminate at a suboptimal point.
- Moreover, it does not distinguish between local and global optima.

Finding Optima
Class Exercise: Determine whether each of the following points is apparently a local/global minimum, a local/global maximum, or neither. (Figure: graph of f with several labeled candidate points.)

Convex Sets
A set S ⊆ IR^n is said to be convex if every point on the line segment connecting any two points x, y in S is itself in S:
    γ x + (1 - γ) y ∈ S, ∀ x, y ∈ S, ∀ γ ∈ (0, 1)
Nonconvex set: some points on the line segment connecting x, y ∈ S do not lie in S.
Nonconnected sets are nonconvex; e.g., the discrete set {0, 1, 2, ...}.
(Figure: examples of a convex set and a nonconvex set.)
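The convexity definition can also be probed numerically: sample pairs of points in a candidate set S and check whether their convex combinations remain in S. The sketch below is an added illustration (not part of the slides); it can only refute convexity, never prove it, and the unit disk and annulus are assumed example sets.

```python
import numpy as np

def appears_convex(in_set, sampler, n_pairs=2000, n_gammas=9, seed=0):
    """Monte-Carlo check of the convexity definition: for sampled x, y in S and
    gamma in (0, 1), test whether gamma*x + (1 - gamma)*y is also in S.
    A False result refutes convexity; True only means no violation was found."""
    rng = np.random.default_rng(seed)
    gammas = np.linspace(0.1, 0.9, n_gammas)
    for _ in range(n_pairs):
        x, y = sampler(rng), sampler(rng)
        if not (in_set(x) and in_set(y)):
            continue
        for g in gammas:
            if not in_set(g*x + (1 - g)*y):
                return False
    return True

# Illustrative sets in IR^2: the unit disk (convex) vs. an annulus (nonconvex)
sampler = lambda rng: rng.uniform(-1.5, 1.5, size=2)
disk    = lambda x: np.linalg.norm(x) <= 1.0
annulus = lambda x: 0.5 <= np.linalg.norm(x) <= 1.0
print(appears_convex(disk, sampler))      # True
print(appears_convex(annulus, sampler))   # False
```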

Convex and Concave Functions

Convex Functions
A function f : S → IR, defined on a convex set S ⊆ IR^n, is said to be convex on S if the line segment connecting f(x) and f(y) at any two points x, y ∈ S lies above the function between x and y:
    f(γ x + (1 - γ) y) ≤ γ f(x) + (1 - γ) f(y), ∀ x, y ∈ S, ∀ γ ∈ (0, 1)
Strict convexity: f(γ x + (1 - γ) y) < γ f(x) + (1 - γ) f(y), ∀ x, y ∈ S with x ≠ y, ∀ γ ∈ (0, 1).

Concave Functions
f is said to be [strictly] concave on S if (-f) is [strictly] convex on S, i.e.,
    f(γ x + (1 - γ) y) ≥ [>] γ f(x) + (1 - γ) f(y), ∀ x, y ∈ S, ∀ γ ∈ (0, 1)

Convex and Concave Functions (cont'd)
(Figures: the case of a strictly convex function on a convex set S, and the case of a function that is nonconvex on IR yet convex on a convex subset S.)

Sets Defined by Constraints
Define the set S = { x ∈ IR^n : g(x) ≤ 0 }, with g a convex function on IR^n. Then S is a convex set.
Why? Consider any two points x, y ∈ S. By the convexity of g,
    g(γ x + (1 - γ) y) ≤ γ g(x) + (1 - γ) g(y), ∀ γ ∈ (0, 1).
Since g(x) ≤ 0 and g(y) ≤ 0,
    γ g(x) + (1 - γ) g(y) ≤ 0, ∀ γ ∈ (0, 1).
Therefore, γ x + (1 - γ) y ∈ S for every γ ∈ (0, 1); i.e., S is convex.
Class Exercise: Give a condition on g for the following set to be convex: S = { x ∈ IR^n : g(x) ≥ 0 }.

Sets Defined by Constraints (cont'd)
What is the condition on h for the following set to be convex: S = { x ∈ IR^n : h(x) = 0 }?
The set S is guaranteed to be convex if h is affine. (Figure: a nonlinear equality constraint h(x) = 0 whose solution set is nonconvex; some points on the segment connecting two feasible points x, y are not in S.)

Convex Sets Defined by Constraints
Consider the set
    S = { x ∈ IR^n : g_1(x) ≤ 0, ..., g_m(x) ≤ 0, h_1(x) = 0, ..., h_p(x) = 0 }.
Then S is convex if:
- g_1, ..., g_m are convex on IR^n, and
- h_1, ..., h_p are affine.
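The convex-function inequality at the start of this page lends itself to the same kind of Monte-Carlo check as the convex-set definition. The sketch below (an added illustration, not from the slides) tests f(γx + (1 - γ)y) ≤ γf(x) + (1 - γ)f(y) on sampled pairs and mixing weights; the example functions and sampling box are assumptions, and a True result only means no violation was found.

```python
import numpy as np

def appears_convex_fn(f, sampler, n_pairs=2000, n_gammas=9, seed=0):
    """Monte-Carlo check of f(g*x + (1-g)*y) <= g*f(x) + (1-g)*f(y).
    Can only refute convexity, never prove it."""
    rng = np.random.default_rng(seed)
    gammas = np.linspace(0.1, 0.9, n_gammas)
    for _ in range(n_pairs):
        x, y = sampler(rng), sampler(rng)
        for g in gammas:
            if f(g*x + (1 - g)*y) > g*f(x) + (1 - g)*f(y) + 1e-12:
                return False
    return True

# Illustrative examples on the box [-3, 3]^2
sampler = lambda rng: rng.uniform(-3.0, 3.0, size=2)
print(appears_convex_fn(lambda x: np.dot(x, x), sampler))    # True  (||x||^2 is convex)
print(appears_convex_fn(lambda x: np.sin(x[0]), sampler))    # False (sin is not convex here)
```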

Convexity and Global Optimality

Consider the constrained program:
    min f(x)
    s.t. g_j(x) ≤ 0, j = 1, ..., m
         h_j(x) = 0, j = 1, ..., p
If f and g_1, ..., g_m are convex on IR^n, and h_1, ..., h_p are affine, then this program is said to be a convex program.

Sufficient Condition for Global Optimality
A [strict] local minimum of a convex program is also a [strict] global minimum.
On the other hand, a nonconvex program may or may not have local optima that are not global optima.
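The sufficient condition can be observed on a small convex program: every local solve should land on the same (global) minimizer regardless of where it starts. The sketch below (an added illustration, not part of the slides) assumes SciPy is available and uses an arbitrary convex quadratic objective with one affine inequality constraint; note that SciPy's 'ineq' convention is c(x) ≥ 0, so the g(x) ≤ 0 constraint is passed with a sign change.

```python
import numpy as np
from scipy.optimize import minimize

# Convex program: min (x1 - 2)^2 + (x2 - 1)^2  s.t.  x1 + x2 - 1 <= 0
f = lambda x: (x[0] - 2.0)**2 + (x[1] - 1.0)**2
# SciPy's 'ineq' constraints mean c(x) >= 0, so negate g(x) = x1 + x2 - 1 <= 0
cons = [{'type': 'ineq', 'fun': lambda x: -(x[0] + x[1] - 1.0)}]

for x0 in ([-5.0, 5.0], [0.0, 0.0], [10.0, -3.0]):
    res = minimize(f, x0, constraints=cons)
    print(np.round(res.x, 4), round(float(res.fun), 4))

# All starting points converge to approximately x = (1, 0): for a convex program,
# any local minimum is also a global minimum.
```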
