Nonlinear Multigrid and Domain Decomposition Methods
1 Nonlinear Multigrid and Domain Decomposition Methods. Rolf Krause, Institute of Computational Science, Università della Svizzera italiana. Stepping Stone Symposium, Geneva, 2017
2 Non-linear problems: Non-linear problem. H, W Banach spaces, F : D ⊂ H → W, D open, F ∈ C¹(D). Find u* ∈ D such that F(u*) = 0. Construct a sequence of iterates u_k ∈ D, k ≥ 0, via u_{k+1} = u_k + α_k c_k with u_k → u*. Here c_k is the correction or step and α_k > 0 the steplength. Special case F = ∇J: the solution may not be unique. Efficiency and global convergence? R. Krause (Università della Svizzera italiana) Nonlinear Domain Decomposition Methods 2
3 Non-linear problems: Newton's method. Newton's method replaces F by the linear model F(u_k + c_k) ≈ F(u_k) + F'(u_k) c_k = 0, leading to the Newton correction c_k = -F'(u_k)^{-1} F(u_k) and the Newton update u_{k+1} = u_k + α_k c_k, (1) with α_k > 0 the damping or line-search parameter for the Newton correction. F'(u_k) is the Fréchet derivative / Jacobian. Here: H = R^n. Invariance under affine transformations [Ortega 70; Deuflhard, Heindl 79; Deuflhard 11]. First linearize, then solve.
4 Non-linear problems: (Inexact) Newton Method
1: procedure (I)N(F, u_0 ∈ D, TOL) ▷ non-linearity, start value, and tolerance
2: k = 0
3: while ‖F(u_k)‖ > TOL or ‖u_{k+1} - u_k‖ > TOL do
4: Solve (approximately) F'(u_k) c_k = -F(u_k)
5: Determine α_k ▷ damping / line search
6: u_{k+1} = u_k + α_k c_k ▷ update
7: k ← k + 1
8: end while
9: return u_k ▷ solution found
10: end procedure
The direction of the correction c_k is given by the (approximate) solution of F'(u_k) c_k = -F(u_k). Close to u*, α_k = 1 and exact solution ensure quadratic convergence. Stopping criterion: residual based or error based. Use multigrid/domain decomposition for solving. For strong non-linearities, α_k might deteriorate.
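The loop above fits in a few lines of code. The following is a minimal sketch of a damped Newton iteration with residual-based backtracking; the particular test system, Jacobian, and halving rule are illustrative choices, not taken from the talk:

```python
import numpy as np

def damped_newton(F, J, u0, tol=1e-10, max_iter=50):
    """Newton iteration for F(u) = 0 with a simple monotonicity test:
    the steplength alpha_k is halved until the residual norm decreases."""
    u = np.array(u0, dtype=float)
    for _ in range(max_iter):
        r = F(u)
        if np.linalg.norm(r) < tol:
            break
        c = np.linalg.solve(J(u), -r)        # Newton correction c_k
        alpha = 1.0                          # damping / line search
        while alpha > 1e-10 and np.linalg.norm(F(u + alpha * c)) >= np.linalg.norm(r):
            alpha *= 0.5
        u = u + alpha * c                    # update u_{k+1}
    return u

# Toy system with one strong nonlinearity; root at (0, 0).
F = lambda u: np.array([np.exp(u[0]) - 1.0, u[0] + 2.0 * u[1]])
Jac = lambda u: np.array([[np.exp(u[0]), 0.0], [1.0, 2.0]])
u_star = damped_newton(F, Jac, [3.0, 1.0])
print(np.round(u_star, 8))   # close to [0, 0]
```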
5 Non-linear problems: Properties of the Newton Direction [Deuflhard 11]. General level function: T(u | A) = ½ ‖A F(u)‖₂², A regular, with gradient ∇T(u | A) = F'(u)^T A^T A F(u). The natural choice A = F'(u)^{-1} leads to ∇T(u | F'(u)^{-1}) = F'(u)^{-1} F(u). For J(u) = ½ (Au, u) - (f, u) we get -F'(u)^{-1} F(u) = -(∇²J(u))^{-1} ∇J(u) = A^{-1}(f - Au) = A^{-1}(Au* - Au). The Newton correction is the direction of steepest descent for T(· | F'(u)^{-1}). A damping strategy for the exact Newton method can be derived using T(· | A). J convex and quadratic: the Newton step leads to the minimizer u* = u + A^{-1}(Au* - Au). Isolines of J and T(· | I) will form our energy landscape.
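The steepest-descent property can be checked numerically: with A = F'(u)^{-1} held fixed, the finite-difference gradient of T(· | A) at u should equal F'(u)^{-1} F(u), i.e. the negative Newton correction. A small sketch with an arbitrary smooth test map (my choice, not from the slides):

```python
import numpy as np

# Arbitrary smooth map with a regular Jacobian at the test point.
def F(u):
    return np.array([u[0]**3 + u[1] - 1.0, np.sin(u[1]) + u[0]])

def Jac(u):
    return np.array([[3.0 * u[0]**2, 1.0],
                     [1.0, np.cos(u[1])]])

u = np.array([0.7, -0.3])
A = np.linalg.inv(Jac(u))            # natural choice A = F'(u)^{-1}, frozen at u

def T(v):                            # level function T(v | A) = 0.5 ||A F(v)||^2
    w = A @ F(v)
    return 0.5 * w @ w

# Central finite-difference gradient of T at u
h = 1e-6
gradT = np.array([(T(u + h * e) - T(u - h * e)) / (2 * h) for e in np.eye(2)])

newton_correction = np.linalg.solve(Jac(u), -F(u))
# The Newton direction is the steepest-descent direction of T(. | A) at u:
print(np.allclose(gradT, -newton_correction, atol=1e-5))
```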
6 Non-linear problems: Parallel solution of non-linear problems. Handling the nonlinearity:
- Newton: first linearize, then decompose (multigrid, DD as inner solver)
- nonlinear DD: first decompose, then linearize; choice of sub-space/sub-domain model, global communication, choice of coarse space, convergence control
- intermediate approach: inexact solution after linearization
- constraints, non-smooth energies, ...
7 Additive and Multiplicative Trust-Region Methods: Nonlinear Domain Decomposition Scheme
8 Trust-Region Methods: NDM, Additive non-linear decomposition
1: procedure NDM((V_p, I_p, P_p, F_p)_{p=1,...,P}, x_0 ∈ X, TOL) ▷ decomposition, start value, and tolerance
2: k = 0
3: while ‖F(x_k)‖ > TOL do
4: for p = 1, ..., P do
5: Solve (approximately) F_p(P_p x_k + c_p) = 0
6: end for
7: x_{k+1} = x_k + Σ_{p=1}^{P} I_p c_p
8: k ← k + 1 ▷ update
9: end while
10: return x_k ▷ solution found
11: end procedure
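A minimal serial sketch of this additive scheme (a nonlinear block Jacobi iteration): each local solve freezes the other subdomains' values, drives the restricted residual to zero with an inner Newton loop, and the corrections are then summed. The model problem, the two-block splitting, and all iteration counts are illustrative assumptions:

```python
import numpy as np

n = 8
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1D Laplacian stencil
b = np.ones(n)
F = lambda x: A @ x + x**3 - b                            # toy nonlinear problem
Jac = lambda x: A + np.diag(3.0 * x**2)

blocks = [np.arange(0, n // 2), np.arange(n // 2, n)]     # two non-overlapping subdomains

x = np.zeros(n)
for k in range(200):                                      # outer NDM iterations
    corrections = np.zeros(n)
    for idx in blocks:                                    # "parallel" phase (sequential here)
        c = np.zeros(len(idx))
        for _ in range(10):                               # local Newton: F_p(P_p x + c_p) = 0
            x_loc = x.copy()
            x_loc[idx] += c
            r = F(x_loc)[idx]
            if np.linalg.norm(r) < 1e-12:
                break
            Jp = Jac(x_loc)[np.ix_(idx, idx)]
            c += np.linalg.solve(Jp, -r)
        corrections[idx] = c                              # I_p c_p
    x = x + corrections                                   # x_{k+1} = x_k + sum_p I_p c_p
    if np.linalg.norm(F(x)) < 1e-10:
        break

print(np.linalg.norm(F(x)))
```

Note the slow outer convergence typical of block Jacobi without a coarse space; this is exactly the "global communication / coarse space" issue raised on the previous slide.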
9 Non-linear problems: Newton Path. F(ū(λ)) = (1 - λ) F(u_0), T(ū(λ) | A) = (1 - λ)² T(u_0 | A), dū/dλ = -F'(ū)^{-1} F(u_0), ū(0) = u_0, ū(1) = u*, dū/dλ |_{λ=0} = -F'(u_0)^{-1} F(u_0) = c_0. The path connects the start value and the nearest solution, or collapses (F' singular), or ends at ∂D [Davidenko 53; Deuflhard 72, 11]. The Newton correction (in the first step) is tangent to the Newton path.
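The Davidenko differential equation above can be integrated numerically to trace the path from u_0 to u*. A sketch with explicit Euler on a hypothetical smooth 2x2 system (my example, not the slide's), whose Jacobian is regular everywhere and whose solution is u* = (1, 1):

```python
import numpy as np

# Hypothetical test system: solution u* = (1, 1), det F'(u) = 6 u0^2 + 1 > 0 everywhere.
F = lambda u: np.array([u[0]**3 - u[1], u[0] + 2.0 * u[1] - 3.0])
Jac = lambda u: np.array([[3.0 * u[0]**2, -1.0], [1.0, 2.0]])

u0 = np.array([2.0, 2.0])
F0 = F(u0)

# Explicit Euler on du/dl = -F'(u)^{-1} F(u_0), from lambda = 0 to 1
u = u0.copy()
n_steps = 2000
for _ in range(n_steps):
    u = u - (1.0 / n_steps) * np.linalg.solve(Jac(u), F0)

# Along the exact path F(u(lambda)) = (1 - lambda) F(u_0), so u(1) = u*.
print(np.round(u, 3), np.linalg.norm(F(u)))
```

The residual at the endpoint measures the discretization error of the Euler steps; refining n_steps drives ū(1) toward u*.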
10 Non-linear problems: Newton Path, DD Examples. F₁(x₁, x₂) = (x₁ x₂)³ - x₂³, F₂(x₁, x₂) = x₁ + 2x₂ - 3. The exact solution is u* = [1, 1]^T. INB: inexact Newton with backtracking. ASPIN: standard ASPIN. Exact-ASPIN: ASPIN with analytical Jacobian for the preconditioned system. [Figures: contours of F(x) with the Newton solution and the Newton paths for INB, ASPIN, and Exact-ASPIN, for the initial guesses x_0 = (2, 2) and x_0 = (0, 2).]
11 State of the Art Globalization Strategies: Trust-Region Methods. Globalization strategy: trust-region method. Newton step: solve s ∈ R^n: ψ(s) = ½ ⟨s, Bs⟩ + ⟨∇J(u), s⟩ → min! subject to ‖s‖_∞ ≤ Δ, u + s ∈ B (the feasible set). Here B is a symmetric approximation to the Hessian (quasi-Newton method), i.e. ψ is a quadratic approximation to a nonlinear function. Acceptance criterion: ρ = (J(u + s) - J(u)) / ψ(s); if ρ is large enough, then u = u + s. The trust-region radius Δ is updated by means of ρ. Theorem: If ψ(s) → min! is solved sufficiently well + compactness of level sets ⇒ convergence to first-/second-order critical points. C. Groß, R. Krause Nonlinearly Preconditioned Globalization Strategies 5
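A compact sketch of such a trust-region loop. For simplicity it uses the Euclidean norm, an unconstrained objective, and a crude model solver (full Newton step when it fits inside the region, Cauchy point otherwise); the acceptance thresholds 0.1/0.25/0.75 are common textbook choices, not taken from the talk:

```python
import numpy as np

def tr_step(g, B, delta):
    """Approximate model minimizer: Newton step if it fits inside the trust
    region and is a descent direction, otherwise the Cauchy point."""
    try:
        sN = np.linalg.solve(B, -g)
        if g @ sN < 0 and np.linalg.norm(sN) <= delta:
            return sN
    except np.linalg.LinAlgError:
        pass
    gBg = g @ B @ g
    t = delta / np.linalg.norm(g)
    if gBg > 0:
        t = min(t, (g @ g) / gBg)            # unconstrained steepest-descent minimizer
    return -t * g

def trust_region(J, gradJ, hessJ, u, delta=1.0, tol=1e-8, max_iter=500):
    for _ in range(max_iter):
        g = gradJ(u)
        if np.linalg.norm(g) < tol:
            break
        B = hessJ(u)
        s = tr_step(g, B, delta)
        pred = g @ s + 0.5 * s @ B @ s       # psi(s): predicted change (< 0)
        rho = (J(u + s) - J(u)) / pred       # acceptance ratio
        if rho >= 0.1:
            u = u + s                        # accept the step
        if rho >= 0.75:
            delta *= 2.0                     # good model: widen the region
        elif rho < 0.25:
            delta *= 0.5                     # poor agreement: shrink the region
    return u

# Rosenbrock: standard nonconvex test function with minimizer (1, 1)
J = lambda u: (1 - u[0])**2 + 100 * (u[1] - u[0]**2)**2
gradJ = lambda u: np.array([-2 * (1 - u[0]) - 400 * u[0] * (u[1] - u[0]**2),
                            200 * (u[1] - u[0]**2)])
hessJ = lambda u: np.array([[2 - 400 * (u[1] - 3 * u[0]**2), -400 * u[0]],
                            [-400 * u[0], 200.0]])

u_min = trust_region(J, gradJ, hessJ, np.array([-1.2, 1.0]))
print(np.round(u_min, 6))
```

Rejected steps only shrink Δ, so the iteration cannot jam: a small enough region makes the quadratic model accurate and the step acceptable.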
12 Nonlinear Additive Preconditioning: Non-linear Additive Preconditioning Algorithms. Idea: instead of computing search directions s with a parallel solver, solve local nonlinear minimization problems employing globalization strategies. After the parallel phase, the corrections must be combined into a global search direction. Similar approaches: Parallel Variable Distribution [Ferris, Mangasarian 94]: asynchronous solution of local minimization problems, but unduly expensive computation of damping parameters. ASPIN method [Cai, Keyes 02]: asynchronous solution of local minimization problems + global post-smoothing, but formulation based on first-order conditions.
13 Nonlinear Additive Preconditioning: Concept [G, Krause 2009]
1. Decompose R^n such that R^n = ∪_k D_k, where D_k ⊂ R^n.
2. Compute several s_k ∈ D_k by means of H_k(P_k u_G + s_k) < H_k(P_k u_G), where H_k is a subset objective function, u_G is the current global iterate, and P_k : R^n → D_k.
3. Combine the subset corrections s_k as follows: u_G = u_G + Σ_k I_k s_k, where I_k : D_k → R^n.
4. Post-smoothing employing a globalization strategy.
14 Nonlinear Additive Preconditioning: Local Objective Function [G, Krause 2009]. Non-linear local functions: H_k(u) = J_k(u) + ⟨R_k ∇J(u_G) - ∇J_k(P_k u_G), u - P_k u_G⟩, where u_G is the current global iterate, J_k is an arbitrary local objective function, given a priori, and R_k = (I_k)^T. Property of the coupling term ⟨R_k ∇J(u_G) - ∇J_k(P_k u_G), u - P_k u_G⟩: the first local Newton correction solves in the direction of the restricted current global gradient (cf. [Nash 00; Gratton et al. 06]).
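The stated property rests on a first-order consistency condition: by construction, ∇H_k(P_k u_G) = R_k ∇J(u_G), so the first local step is driven by the restricted global gradient regardless of the choice of J_k. A numerical check with quadratic objectives; every concrete choice below (dimensions, the SPD matrix, J_k as the restricted J) is an illustrative assumption:

```python
import numpy as np

n, idx = 6, np.arange(3)                       # global dim; subset D_k = first 3 dofs
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n))
Q = M.T @ M + n * np.eye(n)                    # SPD global Hessian
f = rng.standard_normal(n)

gradJ = lambda u: Q @ u - f                    # gradient of J(u) = 0.5 u'Qu - f'u

# Local objective J_k, chosen a priori (here: J restricted to the subset dofs)
Qk, fk = Q[np.ix_(idx, idx)], f[idx]
Jk = lambda v: 0.5 * v @ Qk @ v - fk @ v
gradJk = lambda v: Qk @ v - fk

R = np.zeros((len(idx), n))
R[np.arange(len(idx)), idx] = 1.0              # R_k = I_k^T (Boolean restriction)
P = R                                          # here P_k = R_k

uG = rng.standard_normal(n)                    # current global iterate

def H(v):   # H_k(v) = J_k(v) + <R_k grad J(u_G) - grad J_k(P_k u_G), v - P_k u_G>
    coupling = R @ gradJ(uG) - gradJk(P @ uG)
    return Jk(v) + coupling @ (v - P @ uG)

# Finite-difference gradient of H_k at the initial local iterate P_k u_G
h, v0 = 1e-6, P @ uG
gradH = np.array([(H(v0 + h * e) - H(v0 - h * e)) / (2 * h) for e in np.eye(len(idx))])
print(np.allclose(gradH, R @ gradJ(uG), atol=1e-5))   # first-order consistency
```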
15 A Non-linear Additively Preconditioned Trust-Region Method: Non-linear Additively Preconditioned Trust-Region Strategy (APTS) [G, Krause 2009]
1. Parallel computation: solve s_k ∈ D_k: H_k(P_k u_G + s_k) < H_k(P_k u_G), w.r.t. ‖I_k s_k‖_∞ ≤ Δ_G, employing a trust-region method; Δ_G is the global trust-region radius.
2. Combination: if Σ_k I_k s_k is good enough: u_G = u_G + Σ_k I_k s_k.
3. Smoothing: compute some global trust-region steps and go to (1).
16 A Non-linear Additively Preconditioned Trust-Region Method: Ensuring Convergence [G, Krause 2009]. Measuring the quality of the additively computed correction s = Σ_k I_k s_k: ρ = (J(u_G) - J(u_G + Σ_k I_k s_k)) / Σ_k (H_k(P_k u_G) - H_k(P_k u_G + s_k)). Acceptance and increase of Δ_G if ρ is large enough (otherwise Σ_k I_k s_k is disposed of and Δ_G is decreased).
17 A Non-linear Additively Preconditioned Trust-Region Method: Convergence to First-Order Critical Points. Theorem: If the respective objective functions are sufficiently smooth and a local minimizer exists, then the APTS method computes a sequence converging to a first-order critical point. Moreover, for any domain decomposition, the APTS algorithm computes a first-order critical point without global smoothing.
18 Examples: Unconstrained Example, APTS. Energy-optimal deformation of the geometry; first-order conditions (‖∇J(u)‖₂) after each iteration; comparison APTS vs. standard trust-region (4 preconditioning iterations, 4 global smoothing iterations). 140,000 unknowns, 7 processors.
19 Examples: Constrained Example, APTS. Energy-optimal deformation of the geometry; first-order conditions (‖D(u)∇J(u)‖₂) after each iteration; comparison APTS vs. standard trust-region (4 preconditioning iterations, 4 global smoothing iterations). 900,000 unknowns, 7 processors.
20 Examples: Constrained Example, APLS. Energy-optimal deformation of the geometry; first-order conditions (‖D(u)∇J(u)‖₂) after each iteration; comparison APLS vs. standard line search (4 preconditioning iterations, 4 global smoothing iterations). 900,000 unknowns, 8 processors.
21 Examples: Unconstrained Example, APLS. Energy-optimal deformation of the geometry; first-order conditions (‖∇J(u)‖₂) after each iteration; comparison APLS vs. standard line search (4 preconditioning iterations, 4 global smoothing iterations). 140,000 unknowns, 8 processors.
22 Additive and Multiplicative Trust-Region Methods: Nonlinear Domain Decomposition Scheme
23 Non-linear problems: Non-linear Multigrid
1: procedure FAS (two level)(F_h, F_H, u_h^0, I_H^h, P_h^H, V_h, V_H, TOL) ▷ non-linearity, start value, and tolerance
2: k = 0
3: while ‖F_h(u_h^k)‖ > TOL or ‖u_h^{k+1} - u_h^k‖ > TOL do
4: Solve (approximately) F_h(u_h^k + c_h) = 0 in V_h ▷ fine-grid solve
5: ū_h = u_h^k + c_h
6: u_H^0 = P_h^H(ū_h)
7: Solve (approximately) F_H(u_H^0 + c_H) = F_H(u_H^0) - (I_H^h)^T F_h(ū_h) ▷ coarse solve
8: u_h^{k+1} = ū_h + I_H^h c_H ▷ update
9: k ← k + 1
10: end while
11: return u_h^k ▷ solution found
12: end procedure
Local convergence [Reusken 87] and investigation [Brabazon, Hubbard, Jimack 14]. Global convergence, gradient operators (damping) [Hackbusch, Reusken 89]. Global convergence, non-convex minimization (TR) [Gratton et al. 08; Groß, K 09]. Linearize in function space [Deuflhard, Weiser 97, 98], nested iteration [Bank, Rose 82]. F = ∇J: multilevel optimization (MG/OPT) [Nash 99, ...]. F = ∇J + convex: monotone multigrid [Kornhuber 94; Kornhuber, K 01, ...].
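A self-contained two-grid FAS sketch for the 1D model problem -u'' + u³ = f with homogeneous Dirichlet conditions. The smoother (pointwise nonlinear Gauss-Seidel), the transfer operators (full weighting and linear interpolation), and all sweep counts are illustrative choices, not the talk's setup; note how the coarse right-hand side implements line 7, F_H(w) = F_H(u_H^0) - R F_h(ū_h):

```python
import numpy as np

def residual(u, f, h):
    """F(u)_i = (-u_{i-1} + 2 u_i - u_{i+1}) / h^2 + u_i^3 - f_i, zero BCs."""
    U = np.concatenate(([0.0], u, [0.0]))
    return (-U[:-2] + 2.0 * U[1:-1] - U[2:]) / h**2 + u**3 - f

def nonlinear_gs(u, f, h, sweeps):
    """Pointwise nonlinear Gauss-Seidel: one scalar Newton update per point."""
    U = np.concatenate(([0.0], u, [0.0]))
    for _ in range(sweeps):
        for i in range(1, len(U) - 1):
            r = (-U[i - 1] + 2.0 * U[i] - U[i + 1]) / h**2 + U[i]**3 - f[i - 1]
            U[i] -= r / (2.0 / h**2 + 3.0 * U[i]**2)
    return U[1:-1]

def restrict(v):
    """Full weighting: fine interior (2m+1 points) -> coarse interior (m points)."""
    return 0.25 * (v[0:-2:2] + 2.0 * v[1:-1:2] + v[2::2])

def prolong(c):
    """Linear interpolation: coarse interior (m points) -> fine interior (2m+1)."""
    v = np.zeros(2 * len(c) + 1)
    v[1:-1:2] = c
    v[2::2] += 0.5 * c
    v[0:-2:2] += 0.5 * c
    return v

n, h = 31, 1.0 / 32.0                     # fine: 31 interior points, coarse: 15
x = np.arange(1, n + 1) * h
f = np.pi**2 * np.sin(np.pi * x) + np.sin(np.pi * x)**3   # manufactured source
u = np.zeros(n)

for _ in range(25):                                        # FAS two-grid cycles
    u = nonlinear_gs(u, f, h, sweeps=2)                    # pre-smoothing
    u0H = restrict(u)                                      # coarse initial iterate
    # FAS coarse equation F_H(w) = F_H(u0H) - R F_h(u), rewritten with source g_H
    gH = residual(u0H, np.zeros_like(u0H), 2 * h) - restrict(residual(u, f, h))
    w = nonlinear_gs(u0H.copy(), gH, 2 * h, sweeps=100)    # approximate coarse solve
    u = u + prolong(w - u0H)                               # coarse-level correction
    u = nonlinear_gs(u, f, h, sweeps=2)                    # post-smoothing

print(np.linalg.norm(residual(u, f, h)))
```

If the fine residual is zero, the coarse solution is w = u_H^0 and the correction vanishes, which is the defining consistency property of FAS.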
24 On the Projection Operator: Projection vs. Restriction [G, K 09]. Note: for our examples from nonlinear mechanics, restriction almost never yields initial coarse-level iterates for which the objective function satisfies the standard line-search assumptions. R. Krause (ICS Lugano) On the Application of Nonlinear Preconditioning Strategies on Parallel Systems 15
25 On the Projection Operator: Projection vs. Restriction [G, K 09]. The influence of the coarse-level correction in nonlinear multigrid or domain decomposition methods.
26 Additive and Multiplicative Trust-Region Methods: RMTR strategy [Gratton et al. 2008; Gratton et al. 2009; Groß, K 2009]. The RMTR method:
1. Compute m₁ pre-smoothing trust-region steps to approximately solve H_k(u_k) < H_k(P_{k+1} u_{k+1}) w.r.t. u_k ∈ B_k, ‖u_k‖ ≤ Δ_k.
2. If k is not the coarsest level: compute B_{k-1} and H_{k-1}, set u_{k-1,0} = P_k u_{k,m₁}, call RMTR on level k - 1 and receive a correction s_{k-1}. Then set u_{k,m₁+1} = u_{k,m₁} + I_{k-1} s_{k-1} if ρ = (H_k(u_{k,m₁}) - H_k(u_{k,m₁} + I_{k-1} s_{k-1})) / (H_{k-1}(P_k u_{k,m₁}) - H_{k-1}(P_k u_{k,m₁} + s_{k-1})) is large enough, and u_{k,m₁+1} = u_{k,m₁} otherwise. Update the trust region Δ_{k,m₁+1}.
3. Compute m₂ post-smoothing trust-region steps to approximately solve H_k(u_k) < H_k(u_{k,m₁+1}) w.r.t. u_k ∈ B_k, ‖u_k‖ ≤ Δ_k.
4. Return the final iterate.
27 Examples: Ogden Materials, Propagation of Correction, Standard Approach. J(u) = ∫ [ d tr(E) + λ/2 (tr(E))² + (μ - d) tr(E²) + d Φ(det(I + ∇u)) ] dx, with E = ½ (∇u + ∇uᵀ + ∇uᵀ ∇u), Φ(v) = ln(v), d > 0.
28 Examples: Ogden Materials, Propagation of Correction, Multilevel Approach. Stresses and deformations after each V-cycle of the multilevel trust-region method.
29 Additive and Multiplicative Trust-Region Methods: MPTS [Groß, K 2009]. MPTS: a generalization of RMTR. Almost arbitrary domain decomposition methods are possible: multigrid methods, alternating domain decomposition methods, and nonlinear Jacobi methods. Convergence to first-order critical points. Theorem: If the search directions/corrections are chosen sufficiently well, and the norms of the gradients and of B are bounded on a compact set, then MPTS is globally convergent. Even more: global convergence can be guaranteed without global smoothing if an (overlapping) domain decomposition is employed.
30 Examples: Convergence and Control. Efficiency: the multilevel ansatz significantly increases convergence speed. Highly non-linear boundary value problem; norm of the gradient vs. number of iterations for the different approaches. Solution strategy: projected CG + non-linear Gauß-Seidel for the quadratic problems (fixed number of iterations for each). Reassembling of the Hessian after every 4 steps; adaptive strategies possible.
31 Numerical Examples, APTS/MPTS: Cylinder Contact Problem, Performance of Trust-Region Methods. Energy-optimal displacements; first-order conditions ‖∇J(u)‖₂ after each trust-region step; comparison between sequential trust-region, APTS, MPTS, and combined APTS/MPTS = AMPTS (F ≙ 4 local trust-region steps on each D_k, 4 global trust-region steps in order to compute s).
32 Numerical Examples: Unconstrained Minimization Problem, Line-Search Method. Energy-optimal displacements; first-order conditions ‖∇J(u)‖₂ after each cycle; comparison between sequential line search, APLS, MPLS, and combined APLS/MPLS = AMPLS (F ≙ 4 local line-search steps on each D_k, 4 global line-search steps in order to compute s). 330,999 unknowns, 8 processors. C. Groß, R. Krause (University of Lugano) On the Application of Nonlinear Preconditioning Strategies 29
33 Numerical Results, GASPIN: Comparisons. Evolution of the objective function J(u_i) and the norm of the gradient ‖g_i‖ for globalized ASPIN employing different numbers of processors (240, 480, 960, and 1920 cores). Timing table (computation time in seconds, per core count): overall time; solver for the global TR problem; solver for the local QP problem; assembling; nonlinear iterations.
More informationOn nonlinear adaptivity with heterogeneity
On nonlinear adaptivity with heterogeneity Jed Brown jed@jedbrown.org (CU Boulder) Collaborators: Mark Adams (LBL), Matt Knepley (UChicago), Dave May (ETH), Laetitia Le Pourhiet (UPMC), Ravi Samtaney (KAUST)
More informationSome new developements in nonlinear programming
Some new developements in nonlinear programming S. Bellavia C. Cartis S. Gratton N. Gould B. Morini M. Mouffe Ph. Toint 1 D. Tomanos M. Weber-Mendonça 1 Department of Mathematics,University of Namur, Belgium
More informationCONVERGENCE PROPERTIES OF COMBINED RELAXATION METHODS
CONVERGENCE PROPERTIES OF COMBINED RELAXATION METHODS Igor V. Konnov Department of Applied Mathematics, Kazan University Kazan 420008, Russia Preprint, March 2002 ISBN 951-42-6687-0 AMS classification:
More informationSF2822 Applied Nonlinear Optimization. Preparatory question. Lecture 9: Sequential quadratic programming. Anders Forsgren
SF2822 Applied Nonlinear Optimization Lecture 9: Sequential quadratic programming Anders Forsgren SF2822 Applied Nonlinear Optimization, KTH / 24 Lecture 9, 207/208 Preparatory question. Try to solve theory
More informationMathematical optimization
Optimization Mathematical optimization Determine the best solutions to certain mathematically defined problems that are under constrained determine optimality criteria determine the convergence of the
More informationTermination criteria for inexact fixed point methods
Termination criteria for inexact fixed point methods Philipp Birken 1 October 1, 2013 1 Institute of Mathematics, University of Kassel, Heinrich-Plett-Str. 40, D-34132 Kassel, Germany Department of Mathematics/Computer
More informationTrust Region Methods. Lecturer: Pradeep Ravikumar Co-instructor: Aarti Singh. Convex Optimization /36-725
Trust Region Methods Lecturer: Pradeep Ravikumar Co-instructor: Aarti Singh Convex Optimization 10-725/36-725 Trust Region Methods min p m k (p) f(x k + p) s.t. p 2 R k Iteratively solve approximations
More information4 damped (modified) Newton methods
4 damped (modified) Newton methods 4.1 damped Newton method Exercise 4.1 Determine with the damped Newton method the unique real zero x of the real valued function of one variable f(x) = x 3 +x 2 using
More informationarxiv: v1 [math.oc] 1 Jul 2016
Convergence Rate of Frank-Wolfe for Non-Convex Objectives Simon Lacoste-Julien INRIA - SIERRA team ENS, Paris June 8, 016 Abstract arxiv:1607.00345v1 [math.oc] 1 Jul 016 We give a simple proof that the
More informationAn Iterative Descent Method
Conjugate Gradient: An Iterative Descent Method The Plan Review Iterative Descent Conjugate Gradient Review : Iterative Descent Iterative Descent is an unconstrained optimization process x (k+1) = x (k)
More informationAcceleration of a Domain Decomposition Method for Advection-Diffusion Problems
Acceleration of a Domain Decomposition Method for Advection-Diffusion Problems Gert Lube 1, Tobias Knopp 2, and Gerd Rapin 2 1 University of Göttingen, Institute of Numerical and Applied Mathematics (http://www.num.math.uni-goettingen.de/lube/)
More informationNewton s Method and Efficient, Robust Variants
Newton s Method and Efficient, Robust Variants Philipp Birken University of Kassel (SFB/TRR 30) Soon: University of Lund October 7th 2013 Efficient solution of large systems of non-linear PDEs in science
More informationLecture 14: October 17
1-725/36-725: Convex Optimization Fall 218 Lecture 14: October 17 Lecturer: Lecturer: Ryan Tibshirani Scribes: Pengsheng Guo, Xian Zhou Note: LaTeX template courtesy of UC Berkeley EECS dept. Disclaimer:
More informationHigher-Order Methods
Higher-Order Methods Stephen J. Wright 1 2 Computer Sciences Department, University of Wisconsin-Madison. PCMI, July 2016 Stephen Wright (UW-Madison) Higher-Order Methods PCMI, July 2016 1 / 25 Smooth
More informationPart 2: Linesearch methods for unconstrained optimization. Nick Gould (RAL)
Part 2: Linesearch methods for unconstrained optimization Nick Gould (RAL) minimize x IR n f(x) MSc course on nonlinear optimization UNCONSTRAINED MINIMIZATION minimize x IR n f(x) where the objective
More informationVasil Khalidov & Miles Hansard. C.M. Bishop s PRML: Chapter 5; Neural Networks
C.M. Bishop s PRML: Chapter 5; Neural Networks Introduction The aim is, as before, to find useful decompositions of the target variable; t(x) = y(x, w) + ɛ(x) (3.7) t(x n ) and x n are the observations,
More informationOn Nonlinear Dirichlet Neumann Algorithms for Jumping Nonlinearities
On Nonlinear Dirichlet Neumann Algorithms for Jumping Nonlinearities Heiko Berninger, Ralf Kornhuber, and Oliver Sander FU Berlin, FB Mathematik und Informatik (http://www.math.fu-berlin.de/rd/we-02/numerik/)
More informationLinear and Non-Linear Preconditioning
and Non- martin.gander@unige.ch University of Geneva June 2015 Invents an Iterative Method in a Letter (1823), in a letter to Gerling: in order to compute a least squares solution based on angle measurements
More informationMethods that avoid calculating the Hessian. Nonlinear Optimization; Steepest Descent, Quasi-Newton. Steepest Descent
Nonlinear Optimization Steepest Descent and Niclas Börlin Department of Computing Science Umeå University niclas.borlin@cs.umu.se A disadvantage with the Newton method is that the Hessian has to be derived
More informationNumerical solutions of nonlinear systems of equations
Numerical solutions of nonlinear systems of equations Tsung-Ming Huang Department of Mathematics National Taiwan Normal University, Taiwan E-mail: min@math.ntnu.edu.tw August 28, 2011 Outline 1 Fixed points
More informationAM 205: lecture 19. Last time: Conditions for optimality Today: Newton s method for optimization, survey of optimization methods
AM 205: lecture 19 Last time: Conditions for optimality Today: Newton s method for optimization, survey of optimization methods Optimality Conditions: Equality Constrained Case As another example of equality
More informationA Multilevel Proximal Algorithm for Large Scale Composite Convex Optimization
A Multilevel Proximal Algorithm for Large Scale Composite Convex Optimization Panos Parpas Department of Computing Imperial College London www.doc.ic.ac.uk/ pp500 p.parpas@imperial.ac.uk jointly with D.V.
More informationNonlinear equations and optimization
Notes for 2017-03-29 Nonlinear equations and optimization For the next month or so, we will be discussing methods for solving nonlinear systems of equations and multivariate optimization problems. We will
More informationCS 542G: Robustifying Newton, Constraints, Nonlinear Least Squares
CS 542G: Robustifying Newton, Constraints, Nonlinear Least Squares Robert Bridson October 29, 2008 1 Hessian Problems in Newton Last time we fixed one of plain Newton s problems by introducing line search
More informationLinear and Non-Linear Preconditioning
and Non- martin.gander@unige.ch University of Geneva July 2015 Joint work with Victorita Dolean, Walid Kheriji, Felix Kwok and Roland Masson (1845): First Idea of After preconditioning, it takes only three
More informationNewton s Method. Javier Peña Convex Optimization /36-725
Newton s Method Javier Peña Convex Optimization 10-725/36-725 1 Last time: dual correspondences Given a function f : R n R, we define its conjugate f : R n R, f ( (y) = max y T x f(x) ) x Properties and
More informationMath 411 Preliminaries
Math 411 Preliminaries Provide a list of preliminary vocabulary and concepts Preliminary Basic Netwon s method, Taylor series expansion (for single and multiple variables), Eigenvalue, Eigenvector, Vector
More informationUnconstrained minimization: assumptions
Unconstrained minimization I terminology and assumptions I gradient descent method I steepest descent method I Newton s method I self-concordant functions I implementation IOE 611: Nonlinear Programming,
More informationAn Accelerated Block-Parallel Newton Method via Overlapped Partitioning
An Accelerated Block-Parallel Newton Method via Overlapped Partitioning Yurong Chen Lab. of Parallel Computing, Institute of Software, CAS (http://www.rdcps.ac.cn/~ychen/english.htm) Summary. This paper
More informationThe Steepest Descent Algorithm for Unconstrained Optimization
The Steepest Descent Algorithm for Unconstrained Optimization Robert M. Freund February, 2014 c 2014 Massachusetts Institute of Technology. All rights reserved. 1 1 Steepest Descent Algorithm The problem
More informationStabilization and Acceleration of Algebraic Multigrid Method
Stabilization and Acceleration of Algebraic Multigrid Method Recursive Projection Algorithm A. Jemcov J.P. Maruszewski Fluent Inc. October 24, 2006 Outline 1 Need for Algorithm Stabilization and Acceleration
More informationPart 3: Trust-region methods for unconstrained optimization. Nick Gould (RAL)
Part 3: Trust-region methods for unconstrained optimization Nick Gould (RAL) minimize x IR n f(x) MSc course on nonlinear optimization UNCONSTRAINED MINIMIZATION minimize x IR n f(x) where the objective
More informationSOLVING MESH EIGENPROBLEMS WITH MULTIGRID EFFICIENCY
SOLVING MESH EIGENPROBLEMS WITH MULTIGRID EFFICIENCY KLAUS NEYMEYR ABSTRACT. Multigrid techniques can successfully be applied to mesh eigenvalue problems for elliptic differential operators. They allow
More information2. Quasi-Newton methods
L. Vandenberghe EE236C (Spring 2016) 2. Quasi-Newton methods variable metric methods quasi-newton methods BFGS update limited-memory quasi-newton methods 2-1 Newton method for unconstrained minimization
More informationCoordinate Update Algorithm Short Course Proximal Operators and Algorithms
Coordinate Update Algorithm Short Course Proximal Operators and Algorithms Instructor: Wotao Yin (UCLA Math) Summer 2016 1 / 36 Why proximal? Newton s method: for C 2 -smooth, unconstrained problems allow
More informationProgramming, numerics and optimization
Programming, numerics and optimization Lecture C-3: Unconstrained optimization II Łukasz Jankowski ljank@ippt.pan.pl Institute of Fundamental Technological Research Room 4.32, Phone +22.8261281 ext. 428
More information