Optimization II: Unconstrained Multivariable


Optimization II: Unconstrained Multivariable
CS 205A: Mathematical Methods for Robotics, Vision, and Graphics
Justin Solomon

Unconstrained Multivariable Problems
minimize $f : \mathbb{R}^n \to \mathbb{R}$

Recall
$\nabla f(\vec{x})$: direction of steepest ascent

Recall
$-\nabla f(\vec{x})$: direction of steepest descent

Observation
If $\nabla f(\vec{x}) \neq \vec{0}$, then for sufficiently small $\alpha > 0$, $f(\vec{x} - \alpha \nabla f(\vec{x})) < f(\vec{x})$

Gradient Descent Algorithm
Iterate until convergence:
1. $g_k(t) \equiv f(\vec{x}_k - t \nabla f(\vec{x}_k))$
2. Find $t^* \geq 0$ minimizing (or at least decreasing) $g_k$
3. $\vec{x}_{k+1} \equiv \vec{x}_k - t^* \nabla f(\vec{x}_k)$
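
A minimal sketch of this iteration in Python with NumPy. The function name gradient_descent, the step-halving rule, and the test function are illustrative choices, not part of the slides; a proper line search (next slide) would replace the halving loop.

```python
import numpy as np

def gradient_descent(f, grad_f, x0, tol=1e-6, max_iters=1000):
    """Minimize f by stepping along -grad f(x) with a simple step-halving rule."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:          # stopping condition: gradient near 0
            break
        t = 1.0
        # shrink t until f decreases (cf. the Observation slide); a Wolfe-style
        # line search is the standard refinement
        while f(x - t * g) >= f(x) and t > 1e-12:
            t *= 0.5
        x = x - t * g
    return x

# usage: minimize f(x, y) = (x - 1)^2 + 4 y^2; iterates approach (1, 0)
f = lambda x: (x[0] - 1.0) ** 2 + 4.0 * x[1] ** 2
grad_f = lambda x: np.array([2.0 * (x[0] - 1.0), 8.0 * x[1]])
print(gradient_descent(f, grad_f, [5.0, 3.0]))
```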

Stopping Condition
$\nabla f(\vec{x}_k) \approx \vec{0}$
Don't forget: check optimality!

Line Search
$g_k(t) \equiv f(\vec{x}_k - t \nabla f(\vec{x}_k))$
One-dimensional optimization
Don't have to minimize completely: Wolfe conditions
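
A hedged sketch of a backtracking line search enforcing the sufficient-decrease (Armijo) part of the Wolfe conditions. The constants c, shrink, and the test function are illustrative choices, not values from the slides.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, d, t0=1.0, c=1e-4, shrink=0.5):
    """Return a step length t satisfying the sufficient-decrease condition
    f(x + t d) <= f(x) + c t <grad f(x), d>, where d is a descent direction."""
    fx = f(x)
    slope = np.dot(grad_f(x), d)      # negative for a descent direction
    t = t0
    while f(x + t * d) > fx + c * t * slope:
        t *= shrink
        if t < 1e-12:                 # give up; direction unusable numerically
            break
    return t

# usage with the steepest-descent direction d = -grad f(x)
f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad_f = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
x = np.array([1.0, 1.0])
d = -grad_f(x)
print(backtracking_line_search(f, grad_f, x, d))
```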

Newton's Method (again!)
$f(\vec{x}) \approx f(\vec{x}_k) + \nabla f(\vec{x}_k)^\top (\vec{x} - \vec{x}_k) + \frac{1}{2} (\vec{x} - \vec{x}_k)^\top H_f(\vec{x}_k) (\vec{x} - \vec{x}_k)$
$\implies \vec{x}_{k+1} = \vec{x}_k - [H_f(\vec{x}_k)]^{-1} \nabla f(\vec{x}_k)$
Consideration: What if $H_f$ is not positive (semi-)definite?
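
A minimal sketch of this Newton iteration, assuming $H_f$ is available and positive definite at each iterate; the Newton step solves a linear system rather than forming the inverse explicitly. The function names and the quadratic test problem are illustrative.

```python
import numpy as np

def newton_minimize(grad_f, hess_f, x0, tol=1e-8, max_iters=50):
    """Newton's method for minimization: solve H_f(x_k) p = -grad f(x_k),
    then set x_{k+1} = x_k + p. Assumes H_f(x_k) is positive definite;
    otherwise p may fail to be a descent direction."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(hess_f(x), -g)   # Newton step, no explicit inverse
        x = x + p
    return x

# usage on f(x, y) = (x - 2)^2 + 3 (y + 1)^2: converges in one step to (2, -1)
f_grad = lambda x: np.array([2.0 * (x[0] - 2.0), 6.0 * (x[1] + 1.0)])
f_hess = lambda x: np.diag([2.0, 6.0])
print(newton_minimize(f_grad, f_hess, [0.0, 0.0]))
```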

Motivation
$\nabla f$ might be hard to compute, but $H_f$ is harder
$H_f$ might be dense: $n^2$ entries

Quasi-Newton Methods
Approximate derivatives to avoid expensive calculations, e.g. secant method, Broyden's method, ...

Common Optimization Assumption
$\nabla f$ known; $H_f$ unknown or hard to compute

Quasi-Newton Optimization
$\vec{x}_{k+1} = \vec{x}_k - \alpha_k B_k^{-1} \nabla f(\vec{x}_k)$, where $B_k \approx H_f(\vec{x}_k)$
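
One iteration of this update, sketched in Python under the assumption that some Hessian approximation $B_k$ is already available; the helper name quasi_newton_step is illustrative, not from the slides.

```python
import numpy as np

def quasi_newton_step(x_k, grad_k, B_k, alpha_k=1.0):
    """One quasi-Newton step x_{k+1} = x_k - alpha_k B_k^{-1} grad f(x_k),
    solving B_k p = -grad f(x_k) rather than inverting B_k."""
    p = np.linalg.solve(B_k, -grad_k)
    return x_k + alpha_k * p

# with B_k = I this reduces to gradient descent with step length alpha_k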

Warning: <advanced material>
See Nocedal & Wright

Broyden-Style Update
$B_{k+1} (\vec{x}_{k+1} - \vec{x}_k) = \nabla f(\vec{x}_{k+1}) - \nabla f(\vec{x}_k)$
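
A sketch of the classical Broyden rank-1 update, one way to satisfy this secant condition; it does not preserve symmetry, which motivates the considerations on the next slide. The notation $s_k = \vec{x}_{k+1} - \vec{x}_k$, $y_k = \nabla f(\vec{x}_{k+1}) - \nabla f(\vec{x}_k)$ is standard but not from the slides.

```python
import numpy as np

def broyden_update(B_k, s_k, y_k):
    """Rank-1 Broyden update: B_{k+1} = B_k + (y_k - B_k s_k) s_k^T / (s_k^T s_k).
    Satisfies the secant condition B_{k+1} s_k = y_k, but is not symmetric in general."""
    r = y_k - B_k @ s_k
    return B_k + np.outer(r, s_k) / np.dot(s_k, s_k)

# sanity check of the secant condition on arbitrary data
B = np.eye(3)
s = np.array([1.0, 0.5, -2.0])
y = np.array([0.3, -1.0, 0.7])
print(np.allclose(broyden_update(B, s, y) @ s, y))   # True
```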

Additional Considerations
$B_k$ should be symmetric
$B_k$ should be positive (semi-)definite

Davidon-Fletcher-Powell (DFP)
$\min_{B_{k+1}} \|B_{k+1} - B_k\|$
s.t. $B_{k+1} = B_{k+1}^\top$
$B_{k+1} (\vec{x}_{k+1} - \vec{x}_k) = \nabla f(\vec{x}_{k+1}) - \nabla f(\vec{x}_k)$
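
For a suitable weighted norm this problem has a closed-form solution, the standard DFP update; a sketch with the same $s_k$, $y_k$ notation as above, assuming the curvature condition $y_k^\top s_k > 0$ holds (cf. Nocedal & Wright).

```python
import numpy as np

def dfp_update(B_k, s_k, y_k):
    """DFP update of the Hessian approximation:
    B_{k+1} = (I - rho y s^T) B_k (I - rho s y^T) + rho y y^T,  rho = 1 / (y^T s).
    Symmetric and satisfies the secant condition B_{k+1} s_k = y_k;
    stays positive definite as long as y_k^T s_k > 0."""
    rho = 1.0 / np.dot(y_k, s_k)
    I = np.eye(len(s_k))
    V = I - rho * np.outer(y_k, s_k)
    return V @ B_k @ V.T + rho * np.outer(y_k, y_k)

# sanity check: symmetry and the secant condition
B = np.eye(3)
s = np.array([1.0, 0.5, -2.0])
y = np.array([2.0, 1.0, -1.0])          # y^T s > 0
B_next = dfp_update(B, s, y)
print(np.allclose(B_next, B_next.T), np.allclose(B_next @ s, y))   # True True
```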

Observation
$\|B_{k+1} - B_k\|$ small does not mean $\|B_{k+1}^{-1} - B_k^{-1}\|$ is small
Idea: Try to approximate $B_k^{-1}$ directly

BFGS Update
$\min_{H_{k+1}} \|H_{k+1} - H_k\|$
s.t. $H_{k+1} = H_{k+1}^\top$
$\vec{x}_{k+1} - \vec{x}_k = H_{k+1} (\nabla f(\vec{x}_{k+1}) - \nabla f(\vec{x}_k))$
State of the art!
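
For a suitable weighted norm, the solution of this problem is the familiar BFGS update of the inverse-Hessian approximation $H_k$; a sketch with the same $s_k$, $y_k$ notation, again assuming $y_k^\top s_k > 0$.

```python
import numpy as np

def bfgs_inverse_update(H_k, s_k, y_k):
    """BFGS update of the inverse-Hessian approximation:
    H_{k+1} = (I - rho s y^T) H_k (I - rho y s^T) + rho s s^T,  rho = 1 / (y^T s).
    Satisfies the secant condition H_{k+1} y_k = s_k."""
    rho = 1.0 / np.dot(y_k, s_k)
    I = np.eye(len(s_k))
    V = I - rho * np.outer(s_k, y_k)
    return V @ H_k @ V.T + rho * np.outer(s_k, s_k)

# sanity check: the secant condition x_{k+1} - x_k = H_{k+1} (grad_{k+1} - grad_k)
H = np.eye(3)
s = np.array([0.4, -1.2, 0.1])
y = np.array([1.0, -0.5, 0.3])          # y^T s > 0
print(np.allclose(bfgs_inverse_update(H, s, y) @ y, s))   # True
```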

Lots of Missing Details
Choice of norm $\|\cdot\|$
Limited-memory alternative (L-BFGS)
Next