Introduction to Simulation - Lecture 10. Modified Newton Methods. Jacob White


1 Introduction to Simulation - Lecture 10
Modified Newton Methods
Jacob White
Thanks to Deepak Ramaswamy, Jaime Peraire, Michal Rewienski, and Karen Veroy

2 Outline
Damped Newton Schemes
- Globally convergent if the Jacobian is nonsingular
- Difficulty with singular Jacobians
Introduce Continuation Schemes
- Problem with source/load stepping
- More general continuation scheme
Improving Continuation Efficiency
- Better first guess for each continuation step
- Arc-length continuation

3 Multidimensional Newton Method: Newton Algorithm
Newton algorithm for solving F(x) = 0:
x^0 = initial guess, k = 0
Repeat {
  Compute F(x^k), J_F(x^k)
  Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
  x^{k+1} = x^k + Δx^{k+1}
  k = k + 1
} Until ||Δx^{k+1}|| and ||F(x^{k+1})|| are small enough
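A minimal sketch of this multidimensional Newton iteration in Python/NumPy. The function names, tolerances, and the small example system are illustrative assumptions, not part of the lecture:

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Basic multidimensional Newton: solve F(x) = 0 given the Jacobian J(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        dx = np.linalg.solve(J(x), -Fx)   # J_F(x^k) dx = -F(x^k)
        x = x + dx
        if np.linalg.norm(dx) < tol and np.linalg.norm(F(x)) < tol:
            return x
    raise RuntimeError("Newton did not converge")

# Illustrative system: F(x) = [x0^2 + x1 - 3, x0 + x1^2 - 5], solution (1, 2)
F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])
print(newton(F, J, [1.0, 1.0]))
```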

4 Multidimensional Newton Method: Multidimensional Convergence Theorem
Theorem statement. If
a) ||J_F(x)^{-1}|| ≤ β  (inverse is bounded)
b) ||J_F(x) - J_F(y)|| ≤ ℓ ||x - y||  (derivative is Lipschitz continuous)
then Newton's method converges given a sufficiently close initial guess.

5 Multidimensional Newton Method: Convergence Theorem Implications
If a function's first derivative never goes to zero, and its second derivative is never too large, then Newton's method can be used to find the zero of the function, provided you already know the answer (i.e., can start sufficiently close to it).
Need a way to develop Newton methods which converge regardless of the initial guess!

6 Non-converging Case
[1-D picture: f(x) with iterates x^0, x^1 overshooting and failing to converge]
Limiting the changes in x might improve convergence.

7 Newton Method with Limiting: Newton Algorithm
Newton algorithm for solving F(x) = 0, with the update limited:
x^0 = initial guess, k = 0
Repeat {
  Compute F(x^k), J_F(x^k)
  Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
  x^{k+1} = x^k + limited(Δx^{k+1})
  k = k + 1
} Until ||Δx^{k+1}|| and ||F(x^{k+1})|| are small enough
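One simple way to realize the "limited" update above is to cap the norm of the Newton step. The cap value and this particular rule are assumptions for illustration only; the lecture develops the line-search (damped) version next:

```python
import numpy as np

def limited(dx, max_step=1.0):
    """Scale the Newton step down if its norm exceeds max_step (assumed cap)."""
    norm = np.linalg.norm(dx)
    return dx if norm <= max_step else dx * (max_step / norm)

# Inside the Newton loop:  x = x + limited(dx)
```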

8 Newton Method with Limiting: Damped Newton Scheme
General damping scheme:
Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
x^{k+1} = x^k + α^k Δx^{k+1}
Key idea (line search): pick α^k to minimize
||F(x^k + α^k Δx^{k+1})||^2 = F(x^k + α^k Δx^{k+1})^T F(x^k + α^k Δx^{k+1})
The method performs a one-dimensional search in the Newton direction.

9 Newton Method with Limiting: Damped Newton Convergence Theorem
If
a) ||J_F(x)^{-1}|| ≤ β  (inverse is bounded)
b) ||J_F(x) - J_F(y)|| ≤ ℓ ||x - y||  (derivative is Lipschitz continuous)
then there exists a set of α^k's in (0, 1] such that
||F(x^{k+1})|| = ||F(x^k + α^k Δx^{k+1})|| ≤ γ ||F(x^k)||  with γ < 1
Every step reduces ||F||: global convergence!

10 Newton Method with Limiting: Damped Newton Nested Iteration
x^0 = initial guess, k = 0
Repeat {
  Compute F(x^k), J_F(x^k)
  Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
  Find α^k in (0, 1] such that ||F(x^k + α^k Δx^{k+1})|| is minimized
  x^{k+1} = x^k + α^k Δx^{k+1}
  k = k + 1
} Until ||Δx^{k+1}|| and ||F(x^{k+1})|| are small enough
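A sketch of this damped Newton nested iteration in Python/NumPy. The simple backtracking rule used for the one-dimensional search is an assumption, not the search method prescribed by the slides:

```python
import numpy as np

def damped_newton(F, J, x0, tol=1e-10, max_iter=100):
    """Damped Newton: take the Newton direction, then shrink alpha until ||F|| decreases."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        dx = np.linalg.solve(J(x), -Fx)           # Newton direction
        alpha, f0 = 1.0, np.linalg.norm(Fx)
        while np.linalg.norm(F(x + alpha * dx)) >= f0 and alpha > 1e-8:
            alpha *= 0.5                          # crude backtracking line search
        x = x + alpha * dx
        if np.linalg.norm(alpha * dx) < tol and np.linalg.norm(F(x)) < tol:
            return x
    raise RuntimeError("Damped Newton did not converge")
```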

11 Newton Method with Limiting: Damped Newton Example
[Circuit: a source driving a resistor and a diode; the diode satisfies I_d = I_s (e^{V_d / V_t} - 1), the resistor I_r = V_r / R]
Nodal equations with numerical values: f(v) = 0, where each entry of f sums the resistor current and the diode exponential at a node.

12 Newton Method with Limiting: Damped Newton Example (cont.)
[The nodal equation f(v) = 0 from the previous slide, with the damped Newton iteration applied to it]
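To make the example concrete, here is a hedged one-unknown version in Python. The element values (Is = 1e-15 A, Vt = 25 mV, R = 1 kΩ, Vs = 1 V) are illustrative assumptions rather than the numbers on the slide, but they show why an undamped step on the diode exponential overshoots and why damping fixes it:

```python
import numpy as np

# Assumed illustrative element values (not the numbers from the slide)
Is, Vt, R, Vs = 1e-15, 0.025, 1e3, 1.0

def f(v):      # nodal equation: diode current plus current through R toward the source
    return Is * (np.exp(v / Vt) - 1.0) + (v - Vs) / R

def dfdv(v):   # its derivative (the 1x1 Jacobian)
    return (Is / Vt) * np.exp(v / Vt) + 1.0 / R

v = 0.0
for _ in range(200):
    dv = -f(v) / dfdv(v)                 # Newton direction
    alpha, f0 = 1.0, abs(f(v))
    while abs(f(v + alpha * dv)) >= f0 and alpha > 1e-12:
        alpha *= 0.5                     # damping keeps the exponential from overshooting
    v += alpha * dv
    if abs(alpha * dv) < 1e-12:
        break
print(v)   # forward diode drop, roughly 0.6 to 0.7 V with these assumed values
```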

13 Newton Method with Limiting: Damped Newton Nested Iteration
x^0 = initial guess, k = 0
Repeat {
  Compute F(x^k), J_F(x^k)
  Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
  Find α^k in (0, 1] such that ||F(x^k + α^k Δx^{k+1})|| is minimized
  x^{k+1} = x^k + α^k Δx^{k+1}
  k = k + 1
} Until ||Δx^{k+1}|| and ||F(x^{k+1})|| are small enough
How can one find the damping coefficients?

14 Newton Method with Limiting: Damped Newton Theorem Proof
By definition of the Newton iteration,
x^{k+1} = x^k - α^k J_F(x^k)^{-1} F(x^k)   (Newton direction)
Multidimensional mean value lemma:
||F(x) - F(y) - J_F(y)(x - y)|| ≤ (ℓ/2) ||x - y||^2
Combining,
||F(x^{k+1})|| ≤ ||F(x^k) - α^k J_F(x^k) J_F(x^k)^{-1} F(x^k)|| + (ℓ/2) (α^k)^2 ||J_F(x^k)^{-1} F(x^k)||^2

15 Newton Method with Limiting: Damped Newton Theorem Proof (cont.)
From the previous slide,
||F(x^{k+1})|| ≤ ||F(x^k) - α^k J_F(x^k) J_F(x^k)^{-1} F(x^k)|| + (ℓ/2) (α^k)^2 ||J_F(x^k)^{-1} F(x^k)||^2
Combining terms and moving scalars out of norms,
||F(x^{k+1})|| ≤ (1 - α^k) ||F(x^k)|| + (ℓ/2) (α^k)^2 ||J_F(x^k)^{-1} F(x^k)||^2
Using the Jacobian bound and splitting the norm,
||F(x^{k+1})|| ≤ (1 - α^k + (ℓ β^2 / 2) (α^k)^2 ||F(x^k)||) ||F(x^k)||
This yields a quadratic in the damping coefficient.

16 Newton Method with Limiting: Damped Newton Theorem Proof (cont. II)
Simplifying the quadratic from the previous slide, there are two cases.
Case 1: (ℓ β^2 / 2) ||F(x^k)|| < 1. Pick α^k = 1 (standard Newton); then
||F(x^{k+1})|| ≤ (ℓ β^2 / 2) ||F(x^k)|| · ||F(x^k)|| < ||F(x^k)||
Case 2: (ℓ β^2 / 2) ||F(x^k)|| > 1. Pick α^k = 1 / (ℓ β^2 ||F(x^k)||); then
||F(x^{k+1})|| ≤ (1 - 1 / (2 ℓ β^2 ||F(x^k)||)) ||F(x^k)|| < ||F(x^k)||

17 Newton Method with Limiting: Damped Newton Theorem Proof (cont. III)
Combining the results from the previous slide,
||F(x^{k+1})|| ≤ γ^k ||F(x^k)||
The above result does imply ||F(x^{k+1})|| ≤ ||F(x^k)||, but it is not yet a convergence theorem: for the case where (ℓ β^2 / 2) ||F(x^k)|| > 1,
γ^k = 1 - 1 / (2 ℓ β^2 ||F(x^k)||)
is not good enough, since we need a γ independent of k. Because the ||F(x^k)|| do not increase, ||F(x^k)|| ≤ ||F(x^0)||, so γ^k ≤ 1 - 1 / (2 ℓ β^2 ||F(x^0)||) = γ < 1.
Note the proof technique: first show that the iterates do not increase ||F||, then use the non-increasing fact to prove convergence.

18 Newton Method with Limiting: Damped Newton Nested Iteration
x^0 = initial guess, k = 0
Repeat {
  Compute F(x^k), J_F(x^k)
  Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
  Find α^k in (0, 1] such that ||F(x^k + α^k Δx^{k+1})|| is minimized
  x^{k+1} = x^k + α^k Δx^{k+1}
  k = k + 1
} Until ||Δx^{k+1}|| and ||F(x^{k+1})|| are small enough
Many approaches to finding α^k.

19 Newton Method with Limiting: Damped Newton Singular Jacobian Problem
[1-D picture: f(x) with a local minimum above zero; iterates x^0, x^1, x^2 stall there]
Damped Newton methods push iterates to local minima of ||F||; they find the points where the Jacobian is singular.

20 Continuation Schemes: Basic Concepts, Source or Load Stepping
Newton converges given a close initial guess.
- Generate a sequence of problems.
- Make sure the previous problem generates a good initial guess for the next problem.
Heat-conducting bar example:
1. Start with the heat off; T = 0 is a very close initial guess.
2. Increase the heat slightly; T = 0 is still a good initial guess.
3. Increase the heat again, using the previous solution as the initial guess, and so on.

21 Continuation Schemes: Basic Concepts, General Setting
Solve F(x(λ), λ) = 0, where:
a) F(x(0), 0) = 0 is easy to solve  (starts the continuation)
b) F(x(1), 1) = F(x)  (ends the continuation: x(1) solves the original problem)
c) x(λ) is sufficiently smooth  (hard to ensure!)
[Figure: x(λ) versus λ for 0 ≤ λ ≤ 1; a discontinuous jump in x(λ) is disallowed]

22 Continuation Schemes: Basic Concepts, Template Algorithm
Solve F(x(0), 0); x_prev(λ) = x(0)
δλ = 0.01, λ = δλ
While λ < 1 {
  x^0(λ) = x_prev(λ)
  Try to solve F(x(λ), λ) = 0 with Newton
  If Newton converged: x_prev(λ) = x(λ), λ = λ + δλ, δλ = 2 δλ
  Else: δλ = δλ / 2, λ = λ_prev + δλ
}
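A sketch of this continuation template in Python, reusing the damped_newton routine from earlier. The clamping of λ to 1 and the stall check are assumptions added so the loop terminates cleanly:

```python
import numpy as np

def continuation(F, J, x0, newton, dlam0=0.01):
    """Step lambda from 0 toward 1, doubling the step on success and halving on failure."""
    x_prev, lam_prev, dlam = np.asarray(x0, dtype=float), 0.0, dlam0
    while lam_prev < 1.0:
        lam = min(lam_prev + dlam, 1.0)     # assumed clamp so the last step lands on 1
        try:
            x = newton(lambda y: F(y, lam), lambda y: J(y, lam), x_prev)
            x_prev, lam_prev, dlam = x, lam, 2.0 * dlam   # converged: accept and grow step
        except RuntimeError:
            dlam = 0.5 * dlam                              # failed: shrink step and retry
            if dlam < 1e-12:
                raise RuntimeError("continuation stalled")
    return x_prev
```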

23 Continuation Schemes: Basic Concepts, Source/Load Stepping Examples
[Circuit: source Vs in series with resistor R driving a diode; node voltage v]
Source stepping: f(v(λ), λ) = i_diode(v) + (v - λ Vs) / R = 0
∂f(v, λ)/∂v = ∂i_diode(v)/∂v + 1/R   (not λ dependent!)
Load stepping (struts and joints): f_L(x, λ) = 0, with the mechanical load scaled by λ:
f_x(x, y) = 0,  f_y(x, y) + λ f_l = 0
Source/load stepping does not alter the Jacobian.
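A sketch of the source-stepped diode residual in Python; the diode model and element values are illustrative assumptions (the slide leaves i_diode and the numbers generic). Note that only the residual, not the derivative, depends on λ:

```python
import numpy as np

# Assumed illustrative element values
Is, Vt, R, Vs = 1e-15, 0.025, 1e3, 1.0

def i_diode(v):
    return Is * (np.exp(v / Vt) - 1.0)

def f(v, lam):
    # Source stepping: only the source term is scaled by lambda
    return i_diode(v) + (v - lam * Vs) / R

def dfdv(v, lam):
    # The Jacobian does not depend on lambda: source stepping does not alter it
    return (Is / Vt) * np.exp(v / Vt) + 1.0 / R
```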

24 Continuation Schemes: Jacobian Altering Scheme Description
F(x(λ), λ) = λ F(x(λ)) + (1 - λ)(x(λ) - x^0)
Observations:
λ = 0: F(x(0), 0) = x(0) - x^0 = 0, so x(0) = x^0, and ∂F/∂x = I.
  The problem is easy to solve and the Jacobian is definitely nonsingular.
λ = 1: F(x(1), 1) = F(x(1)), and ∂F/∂x = J_F(x).
  Back to the original problem and the original Jacobian.
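A sketch of the Jacobian-altering (homotopy) map and its Jacobian in Python. Wrapping it as a factory that takes x0 as the starting guess is an assumption about how one might organize the code for use with the continuation loop above:

```python
import numpy as np

def homotopy(F, J, x0):
    """Build F_tilde(x, lam) = lam*F(x) + (1-lam)*(x - x0) and its Jacobian."""
    x0 = np.asarray(x0, dtype=float)
    n = x0.size

    def F_tilde(x, lam):
        return lam * F(x) + (1.0 - lam) * (x - x0)

    def J_tilde(x, lam):
        # lam = 0 gives the identity (nonsingular); lam = 1 gives the original Jacobian
        return lam * J(x) + (1.0 - lam) * np.eye(n)

    return F_tilde, J_tilde
```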

25 Continuation Schemes: Jacobian Altering Scheme, Basic Algorithm
Solve F(x(0), 0); x_prev(λ) = x(0)
δλ = 0.01, λ = δλ
While λ < 1 {
  x^0(λ) = x_prev(λ) + ?
  Try to solve F(x(λ), λ) = 0 with Newton
  If Newton converged: x_prev(λ) = x(λ), λ = λ + δλ, δλ = 2 δλ
  Else: δλ = δλ / 2, λ = λ_prev + δλ
}

26 Continuation Schemes: Jacobian Altering Scheme, Initial Guess for Each Step
x^0(λ + δλ) = x(λ)
[Figure: x(λ) versus λ; the initial-guess error is the gap between x(λ) and x(λ + δλ)]

27 Continuation Schemes: Jacobian Altering Scheme, Update Improvement
Have F(x(λ), λ) = 0 from the last step's Newton.
Expanding to first order about (x(λ), λ):
F(x(λ + δλ), λ + δλ) ≈ F(x(λ), λ) + (∂F(x(λ), λ)/∂x)(x(λ + δλ) - x(λ)) + (∂F(x(λ), λ)/∂λ) δλ = 0
Better guess for the next step's Newton:
x^0(λ + δλ) = x(λ) - δλ [∂F(x(λ), λ)/∂x]^{-1} ∂F(x(λ), λ)/∂λ

28 Continuation Schemes: Jacobian Altering Scheme, Update Improvement (cont.)
If F(x(λ), λ) = λ F(x(λ)) + (1 - λ)(x(λ) - x^0)
then ∂F(x(λ), λ)/∂λ = F(x(λ)) - (x(λ) - x^0), which is easily computed.

29 Continuation Schemes: Jacobian Altering Scheme, Update Improvement (cont. II)
x^0(λ + δλ) = x(λ) - δλ [∂F(x(λ), λ)/∂x]^{-1} ∂F(x(λ), λ)/∂λ
[Figure: graphically, the improved guess follows the tangent of x(λ) from λ to λ + δλ instead of staying at x(λ)]
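A sketch of this tangent (first-order) predictor in Python. It assumes the homotopy of the earlier sketch, so ∂F/∂λ = F(x) - (x - x0), and the argument names are illustrative:

```python
import numpy as np

def predictor_step(F, J, x0, x_lam, lam, dlam):
    """First-order guess x^0(lam+dlam) = x(lam) - dlam * (dF/dx)^{-1} dF/dlam
    for the homotopy F_tilde = lam*F(x) + (1-lam)*(x - x0)."""
    dF_dlam = F(x_lam) - (x_lam - x0)                       # easily computed (slide 28)
    J_tilde = lam * J(x_lam) + (1.0 - lam) * np.eye(x_lam.size)
    dx_dlam = np.linalg.solve(J_tilde, -dF_dlam)            # tangent dx/dlam
    return x_lam + dlam * dx_dlam
```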

30 Continuation Schemes: Jacobian Altering Scheme, Still Can Have Problems
[Figure: x(λ) folds back on itself, so plain λ steps fail at the turning point]
- Must switch from increasing to decreasing λ at the fold (arc-length steps)
- Must switch back to increasing λ afterwards (λ steps)

31 Continuation Schemes: Jacobian Altering Scheme, Arc-length Steps
[Figure: the x(λ) curve traced in steps of fixed arc length rather than fixed δλ]
Must solve for λ as well:
F(x(λ), λ) = 0
(λ - λ_prev)^2 + ||x - x_prev||^2 - arc^2 = 0

32 Continuation Schemes: Jacobian Altering Scheme, Arc-length Steps by Newton
Apply Newton to the augmented system:
[ ∂F(x^k, λ^k)/∂x      ∂F(x^k, λ^k)/∂λ ] [ Δx^{k+1} ]     [ F(x^k, λ^k)                                    ]
[ 2(x^k - x_prev)^T    2(λ^k - λ_prev)  ] [ Δλ^{k+1} ] = - [ (λ^k - λ_prev)^2 + ||x^k - x_prev||^2 - arc^2 ]
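A sketch of one Newton step for the augmented (bordered) arc-length system in Python. The assembly follows the block form above; the argument names and the expectation that dF_dlam is supplied as a function are illustrative assumptions:

```python
import numpy as np

def arclength_newton_step(F, J_x, dF_dlam, x, lam, x_prev, lam_prev, arc):
    """One Newton update on the bordered system
    [F(x, lam); (lam - lam_prev)^2 + ||x - x_prev||^2 - arc^2] = 0."""
    n = x.size
    r_top = F(x, lam)
    r_bot = (lam - lam_prev) ** 2 + np.dot(x - x_prev, x - x_prev) - arc ** 2
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = J_x(x, lam)              # dF/dx block (may be singular at a turning point)
    A[:n, n] = dF_dlam(x, lam)           # dF/dlam column
    A[n, :n] = 2.0 * (x - x_prev)        # derivative of the arc constraint w.r.t. x
    A[n, n] = 2.0 * (lam - lam_prev)     # derivative of the arc constraint w.r.t. lam
    delta = np.linalg.solve(A, -np.concatenate([r_top, [r_bot]]))
    return x + delta[:n], lam + delta[n]
```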

33 Continuation Schemes: Jacobian Altering Scheme, Arc-length Turning Point
[Figure: turning point on the x(λ) curve. What happens here?]
At the turning point the upper left-hand block ∂F(x, λ)/∂x is singular, but the full bordered matrix
[ ∂F(x, λ)/∂x       ∂F(x, λ)/∂λ  ]
[ 2(x - x_prev)^T   2(λ - λ_prev) ]
can still be nonsingular, so the arc-length Newton step remains well defined.

34 Summary
Damped Newton Schemes
- Globally convergent if the Jacobian is nonsingular
- Difficulty with singular Jacobians
Introduce Continuation Schemes
- Problem with source/load stepping
- More general continuation scheme
Improving Efficiency
- Better first guess for each continuation step
- Arc-length continuation
