Methodological Foundations of Biomedical Informatics (BMSC-GA 4449) Optimization


1 Methodological Foundations of Biomedical Informatics (BMSC-GA 4449) Optimization

2 Optimization A set of techniques for finding the values of variables at which the objective function attains its critical (minimal or maximal) values, within given constraints on the domain. Classes of techniques: closed form vs. numerical; local vs. global.

3 A simple example f(x) = (1/3)x^3 - x on [-2.5, 2.5]. The derivative f'(x) = x^2 - 1 has two roots, x_1 = 1 and x_2 = -1. The second derivative is f''(x) = 2x. At x_1, f''(x) is positive => x_1 is a (local) minimum. At x_2, f''(x) is negative => x_2 is a (local) maximum.

4 A simple example [plot of f(x) on the interval] We have only discovered local extrema. We failed to check the boundaries of the interval for possible extreme points. In fact, x_3 = -2.5 is the global minimum and x_4 = 2.5 is the global maximum.
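
A minimal sketch of the boundary check in Python, using the function and interval from slide 3 (the candidate list is our own bookkeeping):

```python
# Compare f at the critical points and at the boundaries of [-2.5, 2.5]
# for f(x) = x**3 / 3 - x.
def f(x):
    return x**3 / 3 - x

candidates = [-1.0, 1.0, -2.5, 2.5]   # roots of f'(x) plus the two boundaries
print(min(candidates, key=f))         # -2.5, the global minimum
print(max(candidates, key=f))         #  2.5, the global maximum
```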

5 Constrained Optimization The previous example was a case of free (unconstrained) optimization, because it did not place any restrictions on the domain of the objective function. If there are restrictions on the domain, the optimization is called constrained, and such are the optima we are looking for in these cases.

6 Constrained Optimization Graphically, the difference between the free optima and the constrained optima can be shown as: [figure: a surface with its free maximum, a constraint curve, and the constrained maximum along it]

7 [figure repeated: constrained maximum, free maximum, constraint] The free optimum occurs at the peak of the surface. If we specify a specific relationship between the variables x_1 and x_2 (a constraint), then the search for an optimum is restricted to a slice of the surface. The constrained maximum occurs at the peak of the slice.

8 Approaches to constrained optimization Substitution method Lagrange multipliers method

9 Substitution Method Suppose we would like to optimize the function f(x_1, x_2) = 5 x_1 x_2. If there are no constraints on x_1 and x_2, the function can grow without bound, so it has no maximum. Now suppose we have an additive restriction on the independent variables: 100 = 2 x_1 + x_2. We can use the constraint to express one variable in terms of the other: x_2 = 100 - 2 x_1.

10 Substitution Method Now substitute the solution into the objective function: f(x_1, x_2) = g(x_1) = 5 x_1 (100 - 2 x_1) = -10 x_1^2 + 500 x_1. We are left with a free maximum problem, which we can solve in the usual way. Compute the derivative: dg/dx_1 = -20 x_1 + 500. Solve for the zero of the derivative: -20 x_1 + 500 = 0 => 500 = 20 x_1 => x_1 = 25. Check whether this critical point is a maximum or a minimum: d^2g/dx_1^2 = -20 < 0 => maximum. Substitute back to find the solution for x_2: x_2 = 100 - 2 x_1 = 100 - 2 * 25 = 50.
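
A minimal symbolic sketch of the same computation, assuming SymPy is available:

```python
# Reproduce the substitution-method computation above symbolically.
import sympy as sp

x1 = sp.symbols('x1')
g = 5 * x1 * (100 - 2 * x1)                 # objective after substituting x2 = 100 - 2*x1

x1_star = sp.solve(sp.diff(g, x1), x1)[0]   # zero of the first derivative -> 25
assert sp.diff(g, x1, 2) < 0                # second derivative -20 < 0 -> maximum

x2_star = 100 - 2 * x1_star                 # back-substitute -> 50
print(x1_star, x2_star)                     # 25 50
```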

11 Substitution method Although the substitution method is straightforward, it is hard to generalize to arbitrarily complex constraints. It may be hard or impossible to use the constraints to express one variable in terms of the others.

12 Lagrange multipliers Equality constraints: optimize f(x) subject to g_i(x) = 0, 1 <= i <= k. The g_i can be arbitrary functions of the independent variables. Method of Lagrange multipliers: convert to a higher-dimensional problem. Minimize the augmented function q(x, λ) = f(x) + Σ_i λ_i g_i(x) over (x_1, ..., x_n; λ_1, ..., λ_k).

13 Lagrange Multipliers Given the augmented function, the first order conditions for optimization are as follows:

$$\frac{\partial q}{\partial x_j} = \frac{\partial f}{\partial x_j} + \sum_{i=1}^{k} \lambda_i \frac{\partial g_i}{\partial x_j} = 0, \qquad j = 1, \ldots, n$$

$$\frac{\partial q}{\partial \lambda_i} = g_i(x) = 0, \qquad i = 1, \ldots, k$$

This is a simultaneous system of n + k equations in n + k unknowns.

14 Lagrange Multipliers Second order conditions are determined from the Hessian matrix of q evaluated at the critical points:

$$H(x, \lambda) = \begin{pmatrix} \nabla^2_{xx} q & \nabla^2_{x \lambda} q \\ \nabla^2_{\lambda x} q & \nabla^2_{\lambda \lambda} q \end{pmatrix}$$

the (n + k) x (n + k) matrix of all second partial derivatives of q with respect to (x_1, ..., x_n; λ_1, ..., λ_k).

15 Lagrange Multipliers If the determinant |H| at the critical point is positive, the point is a local maximum. If |H| at the critical point is negative, the point is a local minimum. (This sign rule is for the bordered Hessian with two variables and one constraint, as in the example below.)

16 Lagrange Multipliers: Example Using the previous example: q(x_1, x_2, λ) = 5 x_1 x_2 + λ(100 - 2 x_1 - x_2), where the constraint 100 = 2 x_1 + x_2 is expressed as g(x_1, x_2) = 100 - 2 x_1 - x_2 = 0. This gives rise to the following system of equations in 3 unknowns:

∂q/∂x_1 = 5 x_2 - 2λ = 0
∂q/∂x_2 = 5 x_1 - λ = 0
∂q/∂λ = 100 - 2 x_1 - x_2 = 0

17 Lagrange Multipliers: Example Solve the system, e.g. by Gaussian elimination. In matrix form A y = b:

$$\begin{pmatrix} 0 & 5 & -2 \\ 5 & 0 & -1 \\ 2 & 1 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \lambda \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 100 \end{pmatrix}$$

Elimination gives x_1 = 25, x_2 = 50, λ = 125, in agreement with the substitution method.
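
A minimal numeric sketch of the same solve, assuming NumPy is available:

```python
# Solve the first-order system above as a linear system A y = b.
import numpy as np

# Rows: 5*x2 - 2*lam = 0;  5*x1 - lam = 0;  2*x1 + x2 = 100
A = np.array([[0.0, 5.0, -2.0],
              [5.0, 0.0, -1.0],
              [2.0, 1.0,  0.0]])
b = np.array([0.0, 0.0, 100.0])

x1, x2, lam = np.linalg.solve(A, b)
print(x1, x2, lam)   # 25.0 50.0 125.0
```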

18 Lagrange Multipliers: Example The Hessian matrix for this example is

$$H(x_1, x_2, \lambda) = \begin{pmatrix} 0 & 5 & -2 \\ 5 & 0 & -1 \\ -2 & -1 & 0 \end{pmatrix}$$

Note that it does not depend on any of the independent variables (why?). The determinant is positive: expanding along the first row, |H| = 0 - 5·(-2) + (-2)·(-5) = 10 + 10 = 20 > 0, so the critical point (x_1, x_2) = (25, 50) is a local maximum.
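
A one-line numeric check of this determinant, again assuming NumPy:

```python
import numpy as np

H = np.array([[ 0.0,  5.0, -2.0],
              [ 5.0,  0.0, -1.0],
              [-2.0, -1.0,  0.0]])
print(np.linalg.det(H))   # ~20.0 > 0 -> the critical point is a maximum
```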

19 Convexity A function f is convex on a region [a, b] if for any two points x_1, x_2 in [a, b] and any t in [0, 1]: f(t x_1 + (1 - t) x_2) <= t f(x_1) + (1 - t) f(x_2). [figure: the chord between (x_1, f(x_1)) and (x_2, f(x_2)) lies above the graph of f]
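
A minimal sketch that spot-checks the inequality numerically; the convex function f(x) = x^2 is an illustrative choice of ours, not from the slides:

```python
# Randomized spot-check of f(t*x1 + (1-t)*x2) <= t*f(x1) + (1-t)*f(x2).
import random

def f(x):
    return x * x          # an illustrative convex function

for _ in range(1000):
    x1, x2 = random.uniform(-5, 5), random.uniform(-5, 5)
    t = random.random()
    assert f(t * x1 + (1 - t) * x2) <= t * f(x1) + (1 - t) * f(x2) + 1e-12
print("inequality held on all random samples")
```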

20 Fixed point iterations Suppose we have a function f(x). Given x_0, define the sequence x_1 = f(x_0), x_2 = f(x_1) = f(f(x_0)), ... If the sequence converges to x*, then f has a fixed point: x* = f(x*). Idea: design functions whose fixed points correspond to the minimum we seek. [plot of f(x) vs. x]
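
A minimal sketch of the idea. The iterated map g(x) = x - 0.1 f'(x) is an illustrative choice (not from the slides); its fixed points are exactly the critical points of the earlier example f(x) = x^3/3 - x:

```python
# Fixed-point iteration on g(x) = x - 0.1 * f'(x) with f'(x) = x**2 - 1;
# g(x*) = x* exactly when f'(x*) = 0.
def g(x):
    return x - 0.1 * (x**2 - 1)

x = 0.5                  # starting guess
for _ in range(100):
    x = g(x)
print(x)                 # converges to 1.0, the local minimum from slide 3
```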

21 Convergence Rate at which a fixed point algorithm approaches its fixed point x*:

lim_{k→∞} |x_{k+1} - x*| / |x_k - x*|^q = μ

At q = 1, 0 < μ < 1: linear convergence. At q = 1, μ = 1: sublinear convergence (e.g. logarithmic). At q = 1, μ = 0: superlinear convergence. At q = 2, μ > 0: quadratic convergence. At q = 3, μ > 0: cubic convergence.
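
A minimal sketch estimating μ in the linear (q = 1) case, reusing the illustrative map from the previous sketch:

```python
# Estimate mu = lim |x_{k+1} - x*| / |x_k - x*| for g(x) = x - 0.1*(x**2 - 1).
g = lambda x: x - 0.1 * (x**2 - 1)

x, x_star = 0.5, 1.0
for _ in range(30):
    ratio = abs(g(x) - x_star) / abs(x - x_star)
    x = g(x)
print(ratio)   # ~0.8 = |g'(1)|: linear convergence with mu = 0.8
```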

22 Classification of Optimization Methods Use function values only (e.g. the Nelder-Mead simplex method). Use function + gradient (most descent methods). Use function + gradient + Hessian (Newton's method).
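
A minimal sketch, assuming SciPy, that exercises one method from each class; the test function, its gradient, and its Hessian are illustrative choices of ours:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative function with minimum at (1, 1).
f = lambda v: (v[0] - 1)**2 + 10 * (v[1] - v[0]**2)**2
grad = lambda v: np.array([2 * (v[0] - 1) - 40 * v[0] * (v[1] - v[0]**2),
                           20 * (v[1] - v[0]**2)])
hess = lambda v: np.array([[2 - 40 * (v[1] - v[0]**2) + 80 * v[0]**2, -40 * v[0]],
                           [-40 * v[0], 20.0]])
x0 = np.zeros(2)

print(minimize(f, x0, method='Nelder-Mead').x)                     # values only
print(minimize(f, x0, jac=grad, method='BFGS').x)                  # + gradient
print(minimize(f, x0, jac=grad, hess=hess, method='Newton-CG').x)  # + Hessian
```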

23 26 Newton's Method [four slides of graphical illustration only]

27 Newton's Method Suppose we want to minimize f(x). We use Newton's method to find the roots of its derivative, i.e. solve g(x) = f'(x) = 0. At each step:

x_{k+1} = x_k - g(x_k)/g'(x_k) = x_k - f'(x_k)/f''(x_k)

Requires 1st and 2nd derivatives. Quadratic convergence.
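
A minimal sketch of this update on the running example f(x) = x^3/3 - x from slide 3:

```python
# Newton's method for minimizing f(x) = x**3/3 - x: find a root of
# f'(x) = x**2 - 1 using x_{k+1} = x_k - f'(x_k)/f''(x_k).
def fprime(x):
    return x**2 - 1

def fsecond(x):
    return 2 * x

x = 2.0                      # starting point
for _ in range(6):
    x = x - fprime(x) / fsecond(x)
print(x)                     # ~1.0, the local minimum; convergence is quadratic
```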

28 Multidimensional Optimization

29 Multi-Dimensional Optimization Important in many areas: fitting a model to measured data; finding the best design in some parameter space. Hard in general: weird shapes (multiple extrema, saddles, curved or elongated valleys, etc.); can't bracket a minimum the way we can in one dimension.

30 Newton's Method in Multiple Dimensions Replace the 1st derivative with the gradient and the 2nd derivative with the Hessian:

$$\nabla f = \begin{pmatrix} \partial f / \partial x_1 \\ \vdots \\ \partial f / \partial x_n \end{pmatrix}, \qquad H = \begin{pmatrix} \frac{\partial^2 f}{\partial x_1^2} & \cdots & \frac{\partial^2 f}{\partial x_1 \partial x_n} \\ \vdots & \ddots & \vdots \\ \frac{\partial^2 f}{\partial x_n \partial x_1} & \cdots & \frac{\partial^2 f}{\partial x_n^2} \end{pmatrix}$$

31 Newton's Method in Multiple Dimensions With the gradient in place of the 1st derivative and the Hessian in place of the 2nd, the update becomes

x_{k+1} = x_k - H(x_k)^{-1} ∇f(x_k)

Tends to be extremely fragile unless the function is very smooth and the starting point is close to the minimum.
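
A minimal sketch of this update, assuming NumPy; the function f(x, y) = x^4 + y^2 is an illustrative choice, not from the slides:

```python
# Multidimensional Newton step: x <- x - H(x)^{-1} grad f(x).
import numpy as np

def grad(v):
    x, y = v
    return np.array([4 * x**3, 2 * y])

def hess(v):
    x, y = v
    return np.array([[12 * x**2, 0.0],
                     [0.0,       2.0]])

v = np.array([1.0, 1.0])
for _ in range(20):
    v = v - np.linalg.solve(hess(v), grad(v))  # solve H d = grad rather than inverting H
print(v)   # approaches the minimum at (0, 0)
```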

32 34 Shortest distance between two ellipses [worked graphical example across three slides]

