CoE 3SK3 Computer Aided Engineering Tutorial: Unconstrained Optimization

CoE 3SK3 Computer Aided Engineering Tutorial: Unconstrained Optimization. Jie Cao (caoj23@grads.ece.mcmaster.ca), Department of Electrical and Computer Engineering, McMaster University. Feb. 2, 2010

Outline: 1. Introduction (Optimization, Methods) 2. One-dimensional Unconstrained Optimization (Different techniques) 3. Multi-dimensional Unconstrained Optimization (Basic Concepts) 4. Summary

Optimization. What are unconstrained and constrained optimization problems? Unconstrained optimization: min_x F(x) or max_x F(x). Constrained optimization: min_x F(x) or max_x F(x), subject to g(x) = 0 and/or h(x) < 0 or h(x) > 0.

Methods: Ways to get there. One-dimensional Unconstrained Optimization: Analytical method, Golden-section search method, Newton's method. Multi-dimensional Unconstrained Optimization: Analytical method, Gradient method - steepest ascent (descent) method, Newton's method.

Different techniques: Analytical method. Question 13.2 (5th Edition). Given f(x) = -1.5x^6 - 2x^4 + 12x: (1) Plot the function. (2) Use analytical methods to prove that the function is concave for all values of x. (3) Differentiate the function and then use a root-location method to solve for the maximum f(x) and the corresponding value of x.
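
A minimal SymPy sketch of this analytical approach (the library usage and helper names are my own illustration, not part of the original slides): differentiate f, check the sign of f''(x), and locate the maximum from the root of f'(x).

```python
# Sketch for Prob. 13.2, assuming SymPy is available.
import sympy as sp

x = sp.symbols('x', real=True)
f = -1.5*x**6 - 2*x**4 + 12*x

fp = sp.diff(f, x)       # f'(x)  = -9x^5 - 8x^3 + 12
fpp = sp.diff(f, x, 2)   # f''(x) = -45x^4 - 24x^2 <= 0 for all x, so f is concave

# Locate the maximum by finding the root of f'(x) = 0 (starting the search near x = 1).
x_star = sp.nsolve(fp, 1.0)
print("x* =", x_star, " f(x*) =", f.subs(x, x_star))
```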

Different techniques: Golden-section search method. Question 13.3 (5th Edition). Solve for the value of x that maximizes f(x) in Prob. 13.2 using the golden-section search. Employ initial guesses of x_l = 0 and x_u = 2 and perform three iterations.
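
A minimal golden-section search sketch for this problem; the simple variant below re-evaluates f at both interior points every iteration, and the variable names are my own, not from the slides.

```python
# Golden-section search for the maximum of f on [0, 2], three iterations (Prob. 13.3).
import math

def f(x):
    return -1.5*x**6 - 2*x**4 + 12*x

R = (math.sqrt(5) - 1) / 2      # golden ratio conjugate, ~0.61803

xl, xu = 0.0, 2.0
for i in range(3):              # three iterations, as the problem asks
    d = R * (xu - xl)
    x1, x2 = xl + d, xu - d     # interior points (x1 > x2)
    if f(x1) > f(x2):
        xl = x2                 # maximum lies in [x2, xu]
    else:
        xu = x1                 # maximum lies in [xl, x1]
    print(f"iteration {i+1}: xl = {xl:.4f}, xu = {xu:.4f}")
```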

Different techniques: Golden Ratio in Art and Architecture. Golden Ratio in Art: An Old Man by Leonardo da Vinci; The Vitruvian Man by Leonardo da Vinci; Mona Lisa by Leonardo da Vinci; Holy Family by Michelangelo; Crucifixion by Raphael; Self-Portrait by Rembrandt... Golden Ratio in Architecture: The Great Pyramid; the Parthenon; Porch of the Maidens, Acropolis, Athens; Chartres Cathedral...

Different techniques: Newton's method. Question 13.5 (5th Edition). Repeat Prob. 13.3 but use Newton's method. Employ an initial guess of x_0 = 2 and perform three iterations.
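
A minimal Newton's-method sketch for this problem; the derivative expressions below follow from f(x) = -1.5x^6 - 2x^4 + 12x, and the loop structure is my own illustration.

```python
# Newton's method for the maximum of f, starting from x_0 = 2, three iterations (Prob. 13.5).
def fp(x):    # f'(x) for f(x) = -1.5x^6 - 2x^4 + 12x
    return -9*x**5 - 8*x**3 + 12

def fpp(x):   # f''(x)
    return -45*x**4 - 24*x**2

x = 2.0                          # initial guess x_0 = 2
for i in range(3):               # three iterations, as the problem asks
    x = x - fp(x) / fpp(x)       # Newton update for an extremum: drive f'(x) to zero
    print(f"iteration {i+1}: x = {x:.5f}, f'(x) = {fp(x):.5f}")
```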

Different techniques: Let's have a try. Question 13.11 (5th Edition). Consider the following function: f(x) = 3 + 6x + 5x^2 + 3x^3 + 4x^4. Locate the minimum by finding the root of the derivative of this function. Use bisection with initial guesses of x_l = -2 and x_u = 1.
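
A minimal bisection sketch for this problem; f'(x) = 6 + 10x + 9x^2 + 16x^3 is derived from the given f(x), and the bracket handling and iteration count are my own choices.

```python
# Bisection on f'(x) to locate the minimum of f(x) = 3 + 6x + 5x^2 + 3x^3 + 4x^4 (Prob. 13.11).
def dfdx(x):
    return 6 + 10*x + 9*x**2 + 16*x**3

xl, xu = -2.0, 1.0               # initial bracket: f'(xl) < 0 < f'(xu)
for i in range(20):
    xr = 0.5 * (xl + xu)         # midpoint
    if dfdx(xl) * dfdx(xr) < 0:
        xu = xr                  # root lies in [xl, xr]
    else:
        xl = xr                  # root lies in [xr, xu]
print("minimum located near x =", 0.5 * (xl + xu))
```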

Basic Concepts: What do you need? 1. Gradient vector 2. Directional derivative 3. Hessian matrix

Basic Concepts: Directional derivative. Question 14.2 (5th Edition). Find the directional derivative of f(x, y) = x^2 + 2y^2 at x = 2 and y = 2 in the direction of h = 2i + 3j.
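
A minimal sketch of this computation; normalizing h to a unit vector is my assumption (an unnormalized direction vector would simply scale the result).

```python
# Directional derivative of f(x, y) = x^2 + 2y^2 at (2, 2) along h = 2i + 3j (Prob. 14.2).
import math

def grad_f(x, y):
    return (2*x, 4*y)            # gradient of f(x, y) = x^2 + 2y^2

gx, gy = grad_f(2.0, 2.0)        # (4, 8)
hx, hy = 2.0, 3.0                # direction h = 2i + 3j
norm = math.hypot(hx, hy)        # |h| = sqrt(13)
dir_deriv = (gx*hx + gy*hy) / norm
print("directional derivative =", dir_deriv)   # 32/sqrt(13), about 8.875
```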

Basic Concepts: Gradient vector and Hessian matrix. Question 14.3 (5th Edition). Find the gradient vector and Hessian matrix for each of the following functions: (1) f(x, y) = 3xy^2 + 2e^(xy) (2) f(x, y, z) = 2x^2 + y^2 + z^2 (3) f(x, y) = ln(x^2 + 3xy + 2y^2)
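
A minimal SymPy sketch for the first of these functions; the symbolic-library usage is my own illustration, not part of the original slides.

```python
# Gradient vector and Hessian matrix of f(x, y) = 3xy^2 + 2e^(xy) (Prob. 14.3, part 1).
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = 3*x*y**2 + 2*sp.exp(x*y)

grad = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])   # gradient vector
H = sp.hessian(f, (x, y))                          # Hessian matrix
sp.pprint(grad)
sp.pprint(H)
```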

Summary. One-dimensional Unconstrained Optimization: Analytical method, Golden-section search method, Newton's method. Multi-dimensional Unconstrained Optimization: Some basic concepts, Analytical method, Gradient method - steepest ascent (descent) method, Newton's method.
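
A minimal steepest-descent sketch of the multi-dimensional gradient method listed above; the test function f(x, y) = x^2 + 2y^2, the fixed step size, and the iteration count are my own choices for illustration (steepest ascent is the same update with the sign flipped).

```python
# Steepest descent with a fixed step size on f(x, y) = x^2 + 2y^2.
def grad(x, y):
    return (2*x, 4*y)            # gradient of f(x, y) = x^2 + 2y^2

x, y = 2.0, 2.0                  # starting point
alpha = 0.1                      # fixed step size (a line search would normally choose this)
for i in range(20):
    gx, gy = grad(x, y)
    x, y = x - alpha*gx, y - alpha*gy   # move against the gradient
print("approximate minimizer:", (x, y))  # approaches the true minimizer (0, 0)
```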

Thank you