
1 Solving Nonlinear Equations & Optimization

(One Dimension) Problem: for a function f(x), find x_0 such that f(x_0) = 0.

2 One Root: The Bisection Method

This one is guaranteed to converge, at least to a singularity if not an actual root.
1. Start with a and b such that f(a) and f(b) have opposite signs.
2. Choose the midpoint c = a + (b − a)/2.
3. If f(c) has a sign opposite of f(a), then set b = c. Otherwise, set a = c.
4. Repeat until the desired tolerance is attained.
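A minimal R sketch of this procedure (not from the original slides; the function name bisect and the tolerance default are illustrative):

bisect <- function(f, a, b, tol = 1e-8) {
  if (f(a) * f(b) > 0) stop("f(a) and f(b) must have opposite signs")
  while ((b - a) / 2 > tol) {
    mid <- a + (b - a) / 2                    # midpoint c
    if (f(a) * f(mid) < 0) b <- mid else a <- mid   # keep the sign-change half
  }
  (a + b) / 2
}
bisect(function(x) x^2 - 2, 0, 2)   # converges to sqrt(2), about 1.414214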

3 One Root: Brent's Method

Brackets the root with a local quadratic interpolation of three points.
At a given iteration, if the next computed point falls outside of the bracketing interval, a bisection step is used instead.
It is the method underlying uniroot in R. More details in Press et al.
Brent's is the method most highly recommended by NR for single nonlinear root-finding.
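For example, uniroot (which, as noted, implements Brent's method) finds the fixed point of cos:

uniroot(function(x) cos(x) - x, interval = c(0, 1))$root   # about 0.739085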

4 One Root: Newton's Method

Local linear approximation using f'(x). Steps:
With first guess x_0, compute f(x_0) and f'(x_0), the slope of the approximating line.
The next guess x_1 is the root of the tangent line extending from (x_0, f(x_0)).
Iterate until convergence: x_{n+1} = x_n − f(x_n)/f'(x_n).
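A minimal R sketch of this iteration (illustrative names; fp is the analytic derivative f'):

newton <- function(f, fp, x0, tol = 1e-10, maxit = 100) {
  x <- x0
  for (i in seq_len(maxit)) {
    step <- f(x) / fp(x)     # f(x_n) / f'(x_n)
    x <- x - step            # root of the tangent line at x_n
    if (abs(step) < tol) break
  }
  x
}
newton(function(x) x^2 - 2, function(x) 2 * x, 1)   # about 1.414214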

5 A Comparison

Method      Requires f'?   Guaranteed?   Convergence
Bisection   No             Yes           Linear
Brent's     No             Almost        Superlinear
Newton's    Yes            No            Quadratic*

* If close. These same relative trade-offs exist for higher-dimensional procedures.

6 Optimization in One Dimension

Problem: for a function f(x), find x_m such that f(x_m) > f(x) (a maximum) or f(x_m) < f(x) (a minimum) for all x ≠ x_m.
We'll focus on minima, since finding a max for f(x) is equivalent to finding a min for −f(x).
Global versus local: multiple extrema; boundaries.

7 One-dimensional: Golden Section Search

An analogue to the bisection method for finding roots. Proceeds as follows:
Begin with 3 points x_1 < x_2 < x_3 that are thought to contain a local minimum.
Choose a new point x_0 such that x_1 < x_0 < x_3.
Form a new bracketing interval based on the relative values of f(x_0) and f(x_2). For example, if x_0 < x_2, then the new interval is [x_0, x_3] if f(x_0) > f(x_2), or it's [x_1, x_2] if f(x_0) < f(x_2).
Iterate until convergence.

8 What does Golden mean?

The question is: following the steps on the previous slide, how do we select x_0? The answer is: we make a choice that guarantees a proportional reduction in the width of the interval at each step.
For example, if x_0 < x_2, then for this to happen regardless of the value of f(x_0) we need to satisfy x_0 − x_1 = x_3 − x_2 = α(x_3 − x_1), where α represents the proportion of the interval eliminated at each step.
To get the same reduction at the next iteration, the points must also satisfy x_2 − x_0 = α(x_3 − x_0) = α[(x_3 − x_1) − α(x_3 − x_1)], so x_2 − x_0 = α(1 − α)(x_3 − x_1).
Since (x_0 − x_1) + (x_2 − x_0) + (x_3 − x_2) = x_3 − x_1, it follows that 2α + α(1 − α) = 1, a quadratic whose only solution satisfying 0 < α < 1 is α = (3 − √5)/2 ≈ 0.382.
Hence, the proportion of the interval remaining after each iteration is 1 − α = (√5 − 1)/2 ≈ 0.618, which is known as the Golden Mean.
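As a quick numerical check of this derivation (not from the original slides), in R:

alpha <- (3 - sqrt(5)) / 2        # proportion eliminated each step, about 0.382
1 - alpha                         # proportion remaining, about 0.618 (the Golden Mean)
2 * alpha + alpha * (1 - alpha)   # equals 1, confirming alpha solves the quadratic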

9 How do we use the value α?

Start with an interval [x_1, x_3] thought to contain the min.
Select the interior points x_0 = x_1 + α(x_3 − x_1) and x_2 = x_3 − α(x_3 − x_1).
Evaluate f(x_0) and f(x_2).
If f(x_0) < f(x_2), the new interval is [x_1, x_2] and the next point selected is x_1 + α(x_2 − x_1).
If f(x_0) > f(x_2), the new interval is [x_0, x_3] and the next point selected is x_3 − α(x_3 − x_0).
Iterate.
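Putting these steps together, a minimal R sketch of golden section search (illustrative function name; assumes f is unimodal on [a, b]):

golden <- function(f, a, b, tol = 1e-8) {
  alpha <- (3 - sqrt(5)) / 2
  x0 <- a + alpha * (b - a); x2 <- b - alpha * (b - a)
  f0 <- f(x0); f2 <- f(x2)
  while (b - a > tol) {
    if (f0 < f2) {                       # min is in [a, x2]
      b <- x2; x2 <- x0; f2 <- f0
      x0 <- a + alpha * (b - a); f0 <- f(x0)
    } else {                             # min is in [x0, b]
      a <- x0; x0 <- x2; f0 <- f2
      x2 <- b - alpha * (b - a); f2 <- f(x2)
    }
  }
  (a + b) / 2
}
golden(function(x) (x - 2)^2, 0, 5)      # about 2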

10 Brent's Method

Works in a manner analogous to Brent's for root-finding: local quadratic interpolation, with a safety net in case new points fall outside of the bracket.
Too complicated to describe here (a lot of housekeeping computations), although you can find out more in NR.
The method used by R's optimize function.
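For example:

optimize(function(x) (x - 2)^2 + 1, interval = c(0, 5))
# returns $minimum (about 2) and $objective (about 1)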

11 Solving Several Nonlinear Equations

The problem is to find solutions for a system of the form:

f_1(x_1, ..., x_p) = 0
f_2(x_1, ..., x_p) = 0
...
f_p(x_1, ..., x_p) = 0

12 Options

Multivariate Newton's, or Newton-Raphson (NR).
Modified NR: line searches and backtracking.
Multivariate secant method: Broyden's method.
Similar trade-offs apply as we discussed with one equation, in terms of convergence and knowledge of the Jacobian.

13 Why is finding several roots such a problem?

"There are no good, general methods for solving systems of more than one nonlinear equation" (from NR).
Often, the functions f_1, f_2, ..., f_p have nothing to do with each other.
Finding solutions means identifying where the p zero contours (in general, the p zero hypersurfaces of dimension p − 1) simultaneously intersect. These can be difficult to home in on without some insight into how the p functions relate to one another.
See the example on the following slide, with p = 2.

14 Reproduced from Numerical Recipes: [figure not reproduced in this transcription: the zero contours of two functions of two variables]

15 Developing a multivariate linear approximation

Let F denote the entire vector of p functions f_i, and let x = (x_1, ..., x_p) denote the entire vector of values x_i, for i = 1, ..., p.
Taylor series expansion of f_i in a neighborhood of x:

f_i(x + δ) = f_i(x) + Σ_{j=1}^{p} (∂f_i/∂x_j) δ_j + O(δ²)

Note that the partial derivatives in this equation arise from the Jacobian matrix J of F. So in matrix notation we have:

F(x + δ) = F(x) + Jδ + O(δ²)

16 Newton-Raphson

From the expansion on the previous slide, neglecting terms of order δ² and higher, and setting F(x + δ) equal to zero, we obtain a set of linear equations for the corrections δ that move each function simultaneously closer to zero:

Jδ = −F,

which can be solved using LU decomposition. This gives us an iterative approach, correcting and updating a solution:

x_new = x_old + δ,

which we iterate to convergence (i.e., until either the 1-norm or the ∞-norm of δ is close to zero).
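A minimal R sketch of this iteration (illustrative names; the arguments F and J are functions returning the vector of function values and the Jacobian matrix, and solve() performs the LU-based solve):

newton_system <- function(F, J, x, tol = 1e-10, maxit = 50) {
  for (i in seq_len(maxit)) {
    delta <- solve(J(x), -F(x))        # solve J delta = -F
    x <- x + delta                     # x_new = x_old + delta
    if (max(abs(delta)) < tol) break   # infinity-norm convergence check
  }
  x
}
fvec <- function(x) c(x[1]^2 + x[2]^2 - 2, x[1] - x[2])   # root at (1, 1)
jac  <- function(x) matrix(c(2 * x[1], 2 * x[2], 1, -1), 2, 2, byrow = TRUE)
newton_system(fvec, jac, c(2, 0.5))    # about (1, 1)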

17 Evaluating the Jacobian

As we often cannot easily evaluate the Jacobian analytically, a conventional option is numerical differentiation. Numerical evaluation of the Jacobian relies on finite difference equations. The approximate value of the (i, j)th element of J is given by:

J_ij ≈ [f_i(x + h_j e_j) − f_i(x)] / h_j,

where h_j is some very small number and e_j represents a vector with 1 at the jth position and zeroes everywhere else.
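In R, a forward-difference Jacobian along these lines (illustrative name; a single fixed h is used for all coordinates):

num_jacobian <- function(F, x, h = sqrt(.Machine$double.eps)) {
  F0 <- F(x)
  J <- matrix(0, length(F0), length(x))
  for (j in seq_along(x)) {
    e <- numeric(length(x)); e[j] <- 1        # unit vector e_j
    J[, j] <- (F(x + h * e) - F0) / h         # [f_i(x + h e_j) - f_i(x)] / h
  }
  J
}
num_jacobian(function(x) c(x[1]^2 + x[2]^2 - 2, x[1] - x[2]), c(1, 1))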

18 Modified Newton-Raphson

Note that a full Newton step can be represented as δ = −J⁻¹F. When we are not close enough to the solution, this step is not guaranteed to decrease the value of the function.
How do we know if we should take the full step? One strategy is to require that the step decrease the inner product F·F, which is the same requirement as trying to minimize g = (F·F)/2. Another is to note that the Newton step is a descent direction for g:

∇g · δ = (F·J)(−J⁻¹F) = −F·F < 0.

19 Strategy: Modified Newton-Raphson (continued)

i. Define p = δ, and a Newton iteration as x_new = x_old + λp, where a full Newton step specifies λ = 1.
ii. If g is reduced, then go to the next iteration.
iii. If g is not reduced, then backtrack, selecting some λ < 1.
The value of λ for a conventional backtrack is selected to ensure that the average rate of decrease of g is at least some fraction of the initial rate of decrease, and that the rate of decrease of g at the new value of x is at least some fraction of the rate for the old value of x.
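A minimal R sketch of steps i-iii, using simple step-halving rather than the polynomial interpolation NR actually recommends (illustrative names; c1 is the usual "sufficient decrease" fraction):

newton_backtrack <- function(F, J, x, tol = 1e-10, maxit = 50, c1 = 1e-4) {
  g <- function(x) 0.5 * sum(F(x)^2)          # g = (F.F)/2
  for (i in seq_len(maxit)) {
    p <- solve(J(x), -F(x))                   # full Newton direction
    slope <- -2 * g(x)                        # grad g . p = -F.F
    lambda <- 1                               # try the full step first
    while (g(x + lambda * p) > g(x) + c1 * lambda * slope) {
      lambda <- lambda / 2                    # backtrack until g decreases enough
    }
    x <- x + lambda * p
    if (max(abs(lambda * p)) < tol) break
  }
  x
}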

20 Multidimensional Optimization

The problem: find a minimum for the function f(x_1, ..., x_p).
Note that in many statistical applications the functions we wish to optimize (e.g., loglikelihoods) are convex, and hence fairly well behaved.
Also, in terms of the various approaches, the options involve trade-offs between rate of convergence and information about the gradient and Hessian. The latter two can often be numerically evaluated.

21 Strategies

1. Newton-Raphson applied to the gradient.
2. Nelder-Mead Simplex Method (no gradient required).
3. Powell's Method.
4. Conjugate Gradient Methods.
5. Variable Metric Methods.

22 Nelder-Mead Simplex Approach

A simplex is a figure with p + 1 vertices in p dimensions: a triangle in two dimensions, or a tetrahedron in three dimensions.
Start with a set of p + 1 points that define a finite simplex (i.e., one having finite volume).
The simplex method then takes a series of reflective steps, moving the highest point (where f is largest) through the opposite face of the simplex to a lower point.
Steps are designed to preserve the volume, but the simplex may expand (lengthen) where feasible to facilitate convergence. When the simplex reaches a valley floor, it takes contractive steps.
The NR implementation descriptively refers to this routine as "amoeba".
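In R, optim uses Nelder-Mead by default, so no gradient is needed; for example, on the Rosenbrock valley:

optim(par = c(-1.2, 1),
      fn = function(x) (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2)$par
# converges toward the minimum at (1, 1)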

23 Possible simplex moves: [figure not reproduced in this transcription: reflection, reflection-and-expansion, and contraction steps]

24 Powell's Method (aka Direction Set Methods)

We know how to minimize a function of one variable. Given a one-dimensional approach, a direction set method proceeds as follows:
Start at a point x_0 = (x_1, ..., x_p).
Consider a set of vector directions n_1, n_2, ..., n_p (e.g., these might arise from the gradient of f).
In the direction n_1, find the scalar λ that minimizes f(x_0 + λn_1) using a one-dimensional method. Replace x_0 with x_0 + λn_1.
Iterate through n_2, ..., n_p, and continue iterating until convergence.
Note that you can use whatever one-dimensional optimization routine you want (say, Brent's or the Golden Section Search).
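A minimal R sketch of the simplest direction set, cycling through the coordinate directions with optimize (Brent's method) doing each line minimization; full Powell's method additionally updates the direction set, which is omitted here, and the fixed line-search interval is a simplification:

direction_set <- function(f, x0, tol = 1e-8, maxit = 100) {
  p <- length(x0)
  for (it in seq_len(maxit)) {
    x_old <- x0
    for (j in seq_len(p)) {
      n <- numeric(p); n[j] <- 1                     # direction n_j
      lambda <- optimize(function(l) f(x0 + l * n),
                         interval = c(-10, 10))$minimum
      x0 <- x0 + lambda * n                          # replace x0 with x0 + lambda n_j
    }
    if (max(abs(x0 - x_old)) < tol) break
  }
  x0
}
direction_set(function(x) (x[1] - 1)^2 + (x[2] + 2)^2, c(0, 0))   # about (1, -2)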

25 Conjugate Gradient Methods

If you can compute the gradient, it turns out that you can enjoy substantial computational savings over a direction set method.
The idea is to choose directions based on the gradient, but it turns out that the path of steepest descent (i.e., given a current guess x_i for the minimum, the negative gradient evaluated at x_i) is not a good direction. See the figure on the following slide.
Instead, a set of conjugate directions is derived such that we do not just proceed down the new gradient, but in a direction that is conjugate to the old gradient, and conjugate to all previous directions traversed.
Note: given the symmetric Hessian H, two vectors n_i and n_j are said to be conjugate if n_iᵀ H n_j = 0.

26 Problems with Steepest Descent

(a) In a long, narrow valley, steepest descent takes many steps to reach the valley floor.
(b) For a single magnified step, the direction begins perpendicular to the contours, but winds up parallel to the local contours when the minimum along that direction is reached.
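Conjugate gradients avoid this zig-zag behavior; in R, the method is available through optim (Fletcher-Reeves is the default variant):

optim(c(-1.2, 1),
      fn = function(x) (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2,
      method = "CG")$par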

27 Quasi-Newton Methods

Similar to conjugate gradient methods, in the sense that we are accumulating information from p successive line minimizations, using gradient information to find the minimum of a quadratic form.
Quasi-Newton methods can be thought of as a means of applying Newton-Raphson to the gradient, without the need for the Hessian. Using N-R with the gradient, given a current guess x_i, the next guess is given by:

x_{i+1} = x_i − H⁻¹ ∇f(x_i)

Note that with quasi-Newton, we start out with a positive-definite matrix used as an approximation to the Hessian. Successive iterations update this approximation, which converges to the actual Hessian.
The most common implementations of this approach are the so-called Davidon-Fletcher-Powell (DFP) and Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithms.
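In R, BFGS is available through optim; the gradient argument gr is optional (a finite-difference approximation is used if it is omitted). For example:

f  <- function(x) (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
gr <- function(x) c(-2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2),
                    200 * (x[2] - x[1]^2))
optim(c(-1.2, 1), f, gr, method = "BFGS")$par   # about (1, 1)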

28 Newton-Raphson in R: [code listing not reproduced in this transcription]

29 Simplex and Quasi-Newton Methods in R: [code listing not reproduced in this transcription] (Dr. Corcoran, STAT 6550)
