Multivariate Newton's Method
1 Nonlinear Systems
   derivation of the method
   examples with Julia

2 Nonlinear Optimization
   computing the critical points with Newton's method

MCS 471 Lecture 6(b), Numerical Analysis
Jan Verschelde, 29 June 2018
Newton's method for nonlinear systems

Consider a system of two equations in two variables:

$$\begin{cases} f(x, y) = 0 \\ g(x, y) = 0. \end{cases}$$

Suppose we have an approximation $(x_0, y_0)$ for a solution and we would like to compute $\Delta x$ and $\Delta y$ so that $x_1 = x_0 + \Delta x$ and $y_1 = y_0 + \Delta y$ satisfy the system:

$$\begin{cases} f(x_1, y_1) = f(x_0 + \Delta x, y_0 + \Delta y) = 0 \\ g(x_1, y_1) = g(x_0 + \Delta x, y_0 + \Delta y) = 0. \end{cases}$$

How to compute $\Delta x$ and $\Delta y$?
Taylor series in two variables

$$f(x_0 + \Delta x, y_0 + \Delta y) = f(x_0, y_0) + f_x(x_0, y_0)\,\Delta x + f_y(x_0, y_0)\,\Delta y + \cdots$$
$$g(x_0 + \Delta x, y_0 + \Delta y) = g(x_0, y_0) + g_x(x_0, y_0)\,\Delta x + g_y(x_0, y_0)\,\Delta y + \cdots$$

where $f_x(x_0, y_0)$ and $f_y(x_0, y_0)$ are the partial derivatives of $f$ with respect to $x$ and $y$ evaluated at $(x_0, y_0)$; $g_x(x_0, y_0)$ and $g_y(x_0, y_0)$ are the partial derivatives of $g$ with respect to $x$ and $y$ evaluated at $(x_0, y_0)$; and the $\cdots$ represent the higher order terms in the series, in $(\Delta x)^2$, $(\Delta x)(\Delta y)$, and $(\Delta y)^2$. Because $\Delta x$ and $\Delta y$ are already small numbers, the higher order terms are even smaller.
in matrix format

Because $f(x_0 + \Delta x, y_0 + \Delta y) = 0$ and $g(x_0 + \Delta x, y_0 + \Delta y) = 0$:

$$0 = f(x_0, y_0) + f_x(x_0, y_0)\,\Delta x + f_y(x_0, y_0)\,\Delta y + \cdots$$
$$0 = g(x_0, y_0) + g_x(x_0, y_0)\,\Delta x + g_y(x_0, y_0)\,\Delta y + \cdots$$

we drop the higher order terms and solve for $\Delta x$ and $\Delta y$:

$$\begin{bmatrix} f_x(x_0, y_0) & f_y(x_0, y_0) \\ g_x(x_0, y_0) & g_y(x_0, y_0) \end{bmatrix} \begin{bmatrix} \Delta x \\ \Delta y \end{bmatrix} = - \begin{bmatrix} f(x_0, y_0) \\ g(x_0, y_0) \end{bmatrix}.$$

The solution $(\Delta x, \Delta y)$ of the linear system updates $x_0$ and $y_0$:

$$x_1 := x_0 + \Delta x \quad \text{and} \quad y_1 := y_0 + \Delta y.$$
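Not on the slides: a minimal Julia sketch of this update for one step on a general 2-by-2 system, where the caller supplies F (returning the residual vector $[f, g]$) and J (returning the 2-by-2 Jacobian matrix):

    # One Newton step: solve the linear system above and update the point.
    function newton_step_2d(F, J, x0, y0)
        delta = J(x0, y0) \ (-F(x0, y0))   # (dx, dy) from J*delta = -F
        return x0 + delta[1], y0 + delta[2]
    end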
the Jacobian matrix

Given a system of $n$ equations in $m$ unknowns, $\mathbf{f}(\mathbf{x}) = \mathbf{0}$, with $\mathbf{f} = (f_1, f_2, \ldots, f_n)$ and $\mathbf{x} = (x_1, x_2, \ldots, x_m)$,

$$\mathbf{f}(\mathbf{x}) = \begin{cases} f_1(x_1, x_2, \ldots, x_m) = 0 \\ f_2(x_1, x_2, \ldots, x_m) = 0 \\ \quad \vdots \\ f_n(x_1, x_2, \ldots, x_m) = 0, \end{cases}$$

the Jacobian matrix of $\mathbf{f}$ is the matrix of all first order partial derivatives:

$$J_{\mathbf{f}} = \begin{bmatrix} \dfrac{\partial f_1}{\partial x_1} & \dfrac{\partial f_1}{\partial x_2} & \cdots & \dfrac{\partial f_1}{\partial x_m} \\ \dfrac{\partial f_2}{\partial x_1} & \dfrac{\partial f_2}{\partial x_2} & \cdots & \dfrac{\partial f_2}{\partial x_m} \\ \vdots & \vdots & & \vdots \\ \dfrac{\partial f_n}{\partial x_1} & \dfrac{\partial f_n}{\partial x_2} & \cdots & \dfrac{\partial f_n}{\partial x_m} \end{bmatrix}$$
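Not on the slides: in Julia, one way to obtain the Jacobian numerically is automatic differentiation; this sketch assumes the ForwardDiff package and uses the example system of the next slide:

    using ForwardDiff
    # f maps the vector (x, y) to the residual vector (f, g).
    f(v) = [exp(v[1]) - v[2], v[1]*v[2] - exp(v[1])]
    J = ForwardDiff.jacobian(f, [0.9, 2.5])   # the 2x2 Jacobian at (0.9, 2.5)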
a numerical example

Consider the system

$$\mathbf{f}(x, y) = \begin{cases} e^x - y = 0 \\ xy - e^x = 0 \end{cases} \qquad \mathbf{f}(1, e) = \mathbf{0}.$$

Let us do one step with Newton's method, starting at $(0.9, 2.5)$. The Jacobian matrix is

$$J_{\mathbf{f}} = \begin{bmatrix} e^x & -1 \\ y - e^x & x \end{bmatrix}, \qquad A = J_{\mathbf{f}}(0.9, 2.5) = \begin{bmatrix} 2.4596 & -1.0 \\ 4.04\mathrm{E}{-2} & 9.0\mathrm{E}{-1} \end{bmatrix},$$

$$\mathbf{f}(0.9, 2.5) = \begin{bmatrix} -4.04\mathrm{E}{-2} \\ -2.096\mathrm{E}{-1} \end{bmatrix}.$$

Solving $A \begin{bmatrix} \Delta x \\ \Delta y \end{bmatrix} = -\mathbf{f}(0.9, 2.5)$ gives

$$\Delta x \approx 1.1\mathrm{E}{-1}, \quad x_1 = x_0 + \Delta x \approx 1.009\mathrm{E}{+0},$$
$$\Delta y \approx 2.3\mathrm{E}{-1}, \quad y_1 = y_0 + \Delta y \approx 2.728\mathrm{E}{+0}.$$
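Checking this step with the newton_step_2d sketch from above (assumed definitions, not the lecture's own code):

    # The example system, its Jacobian, and one Newton step from (0.9, 2.5).
    F(x, y) = [exp(x) - y, x*y - exp(x)]
    Jac(x, y) = [exp(x) -1.0; (y - exp(x)) x]
    x1, y1 = newton_step_2d(F, Jac, 0.9, 2.5)   # ≈ (1.0091, 2.7280)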
computing the Jacobian matrix with SymPy

    using SymPy
    x, y = Sym("x"), Sym("y")

    function SymPyDerivatives(f::String, g::String)
    #
    # Returns the string representations of all
    # partial derivatives of the expressions in x and y,
    # given in the strings f and g.
    #
        formf = parse(f)          # Julia 0.6; use Meta.parse on Julia >= 1.0
        evalformf = eval(formf)
        fx = diff(evalformf, x)
        fy = diff(evalformf, y)
        formg = parse(g)
        evalformg = eval(formg)
        gx = diff(evalformg, x)
        gy = diff(evalformg, y)
        return [string(fx) string(fy); string(gx) string(gy)]
    end
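Hypothetical usage on the example system $e^x - y = 0$, $xy - e^x = 0$:

    jac = SymPyDerivatives("exp(x) - y", "x*y - exp(x)")
    # a 2x2 array of strings, e.g. ["exp(x)" "-1"; "y - exp(x)" "x"]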
code for one Newton step

    # Julia 0.6; on Julia >= 1.0, first do: using LinearAlgebra, Printf
    function NewtonStep(fun::Array{SymPy.Sym,1}, jac::Array{SymPy.Sym,2},
                        x0::Float64, y0::Float64)
    #
    # Runs one step with Newton's method.
    #
        valfun = -SymPyFun(fun, x0, y0)            # -f evaluated at (x0, y0)
        nfx = norm(valfun)                         # norm of the residual
        valmat = SymPyMatrixEvaluate(jac, x0, y0)  # Jacobian at (x0, y0)
        update = valmat\valfun                     # solve the linear system
        ndx = norm(update)                         # norm of the update
        x1 = x0 + update[1]
        y1 = y0 + update[2]
        sfx = @sprintf("%.2e", nfx)   # the format strings did not survive
        sdx = @sprintf("%.2e", ndx)   # transcription; "%.2e" is assumed
        sx1 = @sprintf("%.2e", x1)
        sy1 = @sprintf("%.2e", y1)
        println(" $sfx $sdx $sx1 $sy1 ")
        return [x1, y1]
    end
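The helpers SymPyFun and SymPyMatrixEvaluate are defined elsewhere in the lecture; minimal sketches, assuming they evaluate SymPy expressions at the point $(a, b)$ and return Julia floats:

    # Assumed helper: evaluate a vector of expressions at (a, b).
    SymPyFun(fun, a, b) = [N(subs(f, (x, a), (y, b))) for f in fun]

    # Assumed helper: evaluate a matrix of expressions at (a, b).
    SymPyMatrixEvaluate(jac, a, b) =
        [N(subs(jac[i, j], (x, a), (y, b)))
         for i in 1:size(jac, 1), j in 1:size(jac, 2)]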
we observe quadratic convergence

The printed output has four columns:
1 the norm of the residual,
2 the norm of the update,
3 the value for x,
4 the value for y.

Observe the quadratic convergence: the norm of the residual starts at $2.13\mathrm{E}{-1} = \|\mathbf{f}(0.9, 2.5)\|_2$ and is roughly squared in each step, so the number of correct digits in $(x, y)$ about doubles with every iteration as the iterates converge to $(1, e)$.
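A sketch of a driver that produces such a table, assuming the definitions above:

    # Build the example system symbolically and run five Newton steps,
    # printing one row (residual, update, x, y) per step.
    fun = [exp(x) - y, x*y - exp(x)]
    jac = [diff(f, v) for f in fun, v in [x, y]]
    point = [0.9, 2.5]
    for k in 1:5
        point = NewtonStep(fun, jac, point[1], point[2])
    end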
nonlinear optimization

Consider the minimization or maximization of a function in $n$ variables, $f(x_1, x_2, \ldots, x_n)$, with $\mathbf{x} = (x_1, x_2, \ldots, x_n)$. The minima and maxima occur where the gradient

$$\nabla f = \left( \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \ldots, \frac{\partial f}{\partial x_n} \right)$$

vanishes. All critical points satisfy

$$\frac{\partial f}{\partial x_1}(\mathbf{x}) = 0, \quad \frac{\partial f}{\partial x_2}(\mathbf{x}) = 0, \quad \ldots, \quad \frac{\partial f}{\partial x_n}(\mathbf{x}) = 0.$$
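Not from the slides: with the SymPy setup above, the gradient system of a sample function can be handed to the same Newton code. The function below is hypothetical, chosen so its critical points are easy to check:

    # Critical points of fobj = x^3/3 - x + y^2 lie at (1, 0) and (-1, 0).
    fobj = x^3/3 - x + y^2
    grad = [diff(fobj, x), diff(fobj, y)]           # the gradient system
    hess = [diff(g, v) for g in grad, v in [x, y]]  # Jacobian of the gradient
    p = [0.8, 0.2]                                  # start near (1, 0)
    for k in 1:5
        p = NewtonStep(grad, hess, p[1], p[2])
    end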
the Hessian

If we take the Jacobian matrix of the gradient system

$$\frac{\partial f}{\partial x_1}(\mathbf{x}) = 0, \quad \frac{\partial f}{\partial x_2}(\mathbf{x}) = 0, \quad \ldots, \quad \frac{\partial f}{\partial x_n}(\mathbf{x}) = 0,$$

then we arrive at the matrix of all second partial derivatives of $f$, the Hessian:

$$\begin{bmatrix} \dfrac{\partial^2 f}{\partial x_1^2}(\mathbf{x}) & \dfrac{\partial^2 f}{\partial x_1 \partial x_2}(\mathbf{x}) & \cdots & \dfrac{\partial^2 f}{\partial x_1 \partial x_n}(\mathbf{x}) \\ \dfrac{\partial^2 f}{\partial x_2 \partial x_1}(\mathbf{x}) & \dfrac{\partial^2 f}{\partial x_2^2}(\mathbf{x}) & \cdots & \dfrac{\partial^2 f}{\partial x_2 \partial x_n}(\mathbf{x}) \\ \vdots & \vdots & & \vdots \\ \dfrac{\partial^2 f}{\partial x_n \partial x_1}(\mathbf{x}) & \dfrac{\partial^2 f}{\partial x_n \partial x_2}(\mathbf{x}) & \cdots & \dfrac{\partial^2 f}{\partial x_n^2}(\mathbf{x}) \end{bmatrix}$$

If the second partial derivatives of $f$ are continuous, then the matrix is symmetric. Close to a minimum, the matrix is positive definite.
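A sketch, assuming the definitions from the previous sketches, that tests these two properties numerically at the computed critical point $(1, 0)$ of the sample function:

    using LinearAlgebra
    H = SymPyMatrixEvaluate(hess, 1.0, 0.0)  # Hessian of fobj at (1, 0)
    issymmetric(H)   # true: the mixed second partials agree
    isposdef(H)      # true: (1, 0) is a local minimum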
More information