A Line Search Multigrid Method for Large-Scale Nonlinear Optimization


1 A Line Search Multigrid Method for Large-Scale Nonlinear Optimization. Zaiwen Wen, Donald Goldfarb. Department of Industrial Engineering and Operations Research, Columbia University. 2008 SIAM Conference on Optimization.

2 What's New? Multilevel/multigrid methods at the 2008 SIAM Conference on Optimization: one plenary talk (Multiscale Optimization), one dedicated session, four sessions on PDE-based problems, which are mainly handled by multigrid methods, and almost 20 invited and contributed talks.

3 Statement of Problem
Consider solving the problem $\min_{u \in V} F(u)$. This is an infinite-dimensional problem: $F$ is a functional, and $F$ has closely related representations $\{f_h\}$ on a hierarchy of discretization levels $h$. Discretize-then-optimize scheme: $\min f_h$. Solutions of $\{\min f_h\}$ might have similar structures. Figure: solution structure of $F(u) = \int_\Omega \sqrt{1 + \|\nabla u(x)\|^2}\,dx$, shown as the MG solution at (a) Level 4, (b) Level 5, (c) Level 6.

4 Sources of Problems
Applications in nonlinear PDEs and image processing:
$\min_{u \in U} F(u) = \int_\Omega L(\nabla u, u, x)\,dx$.
PDE-constrained optimization: optimal control problems and inverse problems. Example: find a local volatility $\sigma(t, x)$ such that the prices $C(T, S)$ from the Black-Scholes PDE match the prices observed on the market:
$\min\; J(C, \sigma) := \sum_{i \in I} \big( C(S_i, T_i) - z(S_i, T_i) \big)^2 + \alpha J_r(\sigma)$
s.t. $\partial_\tau C - \tfrac{\sigma^2}{2} K^2 \partial_{KK} C + (r - q) K \partial_K C + qC = 0$, $\quad C(0, K) = (S - K)^+$, $K > 0$, $\tau \in (0, +\infty)$,
where $J_r(\sigma)$ is the regularization term.

5 Mesh-Refinement Method
Diagram: a hierarchy from the coarsest-level problem up to the finest-level problem, with prolongation operators mapping coarse solutions to finer levels and restriction operators mapping fine-level information to coarser levels. The mesh-refinement method solves the coarsest-level problem first and prolongs each solution to serve as the initial point on the next finer level.

6 Linear Multigrid Methods
Multigrid method for solving $A_h x_h = b_h$, $A_h \succ 0$.
Inter-grid operations: restriction $R_h$, prolongation $P_h$.
Smoothing: reducing high-frequency errors efficiently.
Coarse-grid correction: let $A_H = R_h A_h P_h$.
Algorithm (multigrid cycle): $x_{h,k+1} = \mathrm{MGCYCLE}(h, A_h, b_h, x_{h,k})$
- Pre-smoothing: compute $\bar{x}_{h,k} = x_{h,k} + B_h (b_h - A_h x_{h,k})$.
- Coarse-grid correction: compute the residual $r_{h,k} = b_h - A_h \bar{x}_{h,k}$ and restrict it: $r_{H,k} = R_h r_{h,k}$.
- Solve the coarse-grid residual equation $A_H e_{H,k} = r_{H,k}$: if $H = N_0$, solve $e_{H,k} = A_H^{-1} r_{H,k}$; else call $e_{H,k} = \mathrm{MGCYCLE}(H, A_H, r_{H,k}, 0)$.
- Interpolate the correction, $e_{h,k} = P_h e_{H,k}$, and compute the new approximate solution $x_{h,k+1} = \bar{x}_{h,k} + e_{h,k}$.
Diagram: smoothing on the fine level, restriction ($R_h$, $R_H$) down to the coarsest level, prolongation ($P_H$, $P_h$) back up.
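As a concrete illustration of this cycle, here is a minimal V-cycle sketch for the 1D Poisson model problem (our own illustrative code, not from the talk): the smoother $B_h$ is weighted Jacobi, restriction is full weighting, prolongation is linear interpolation, and the coarse operator is the rediscretized stencil, which coincides with the Galerkin product $R_h A_h P_h$ for this choice.

```python
import numpy as np

def poisson_matrix(n):
    """1D Poisson matrix (-u'' with Dirichlet BCs) on n interior points, h = 1/(n+1)."""
    h = 1.0 / (n + 1)
    return (2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

def restrict(r):
    """Full-weighting restriction from n = 2m+1 to m interior points."""
    return 0.25*r[0:-1:2] + 0.5*r[1::2] + 0.25*r[2::2]

def prolong(ec):
    """Linear interpolation from m to 2m+1 interior points."""
    m = ec.size
    e = np.zeros(2*m + 1)
    e[1::2] = ec
    e[2:-1:2] = 0.5*(ec[:-1] + ec[1:])
    e[0], e[-1] = 0.5*ec[0], 0.5*ec[-1]
    return e

def mgcycle(A, b, x, nu=3, coarsest=7):
    D = np.diag(A)
    for _ in range(nu):                          # pre-smoothing: weighted Jacobi
        x = x + (2.0/3.0) * (b - A @ x) / D
    r = b - A @ x                                # residual
    if b.size <= coarsest:                       # coarsest level: direct solve
        return x + np.linalg.solve(A, r)
    rc = restrict(r)                             # restrict the residual
    Ac = poisson_matrix(rc.size)                 # coarse operator (= R_h A_h P_h here)
    ec = mgcycle(Ac, rc, np.zeros_like(rc), nu, coarsest)
    return x + prolong(ec)                       # interpolate correction and update

n = 2**7 - 1
A, b, x = poisson_matrix(n), np.ones(n), np.zeros(n)
for k in range(10):
    x = mgcycle(A, b, x)
print(np.linalg.norm(b - A @ x))                 # residual after 10 V-cycles
```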

7 Previous Work: Methods for Solving Nonlinear PDEs
- Global linearization method (Newton's method)
- Local linearization method, such as the full approximation scheme (Brandt, Hackbusch)
- Combination of global and local linearization methods (Irad Yavneh, Gregory Dardyk)
- Projection-based multilevel method (Stephen McCormick, Thomas Manteuffel, Oliver Rohrle, John Ruge)
- Multigrid methods for obstacle problems (Ralf Kornhuber, Carsten Gräser)
- ...

8 Previous Work: Multigrid Methods for Optimization
- Traditional optimization methods where the system of linear equations is solved by multigrid methods (A. Borzi, K. Kunisch, Volker Schulz, Thomas Dreyer, Bernd Maar, U. M. Ascher, E. Haber)
- The multigrid optimization framework proposed by Nash and Lewis (extensions given by A. Borzi)
- The recursive trust-region method for unconstrained and box-constrained optimization (S. Gratton, A. Sartenaer, P. Toint, M. Weber, D. Tomanos, M. Mouffe, M. Ulbrich, S. Ulbrich, B. Loesch)
- Multigrid methods for image processing (Raymond Chan, Ke Chen, Xue-Cheng Tai, Tony Chan, Eldad Haber, Jan Modersitzki)
- ...

9 General Idea
Consider $\min_{x_h \in V_h} f_h(x_h)$ on a class of nested spaces $V_{N_0} \subset \cdots \subset V_{N-1} \subset V_N$.
General idea of line search: $x_{h,k+1} = x_{h,k} + \alpha_{h,k} d_{h,k}$, $d_{h,k} \in V_h$, where $d_{h,k}$ is the search direction and $\alpha_{h,k}$ is the step size.
General idea of using coarse-grid information: $d_{h,k} = \arg\min_{d_H \in V_H} f_h(x_{h,k} + d_H)$.

10 An illustration of the multigrid optimization framework. A direct search direction (Taylor iteration) is generated on the current level; a recursive search direction is generated from the coarser levels. Diagram: levels versus iterations, with recursive steps moving between levels via $P_h$ and $R_h$ and direct steps staying on the current level.

11 Construction of the Recursive Search Direction
If the conditions (Gratton, Sartenaer, Toint)
$\|R_h g_{h,k}\| \ge \kappa \|g_{h,k}\|$ and $\|R_h g_{h,k}\| \ge \epsilon_h$
hold, we compute the recursive search direction
$d_{h,k} = P_h d_H = P_h (x_{H,i} - x_{H,0}) = P_h \Big( \sum_{j=0}^{i-1} \alpha_{H,j} d_{H,j} \Big)$.
Two major difficulties:
1. The calculation of $f_h(x_{h,k} + d_H)$ might be expensive.
2. The recursive direction might not be a descent direction.

12 Construction of the Coarse-Level Model
Coarse-level model (Nash, 2000):
$\min_{x_H} \; \big\{ \psi_H(x_H) \equiv f_H(x_H) - (v_H)^\top x_H \big\}$,
where $v_H = \nabla f_{H,0} - R_h g_{h,k}$ and $g_{h,k} = \nabla \psi_h(x_{h,k})$. Hence $g_{H,0} = \nabla f_{H,0} - v_H = R_h g_{h,k}$: the coarse model reproduces the restricted fine-level gradient at its starting point. Define $v_N = 0$, so that $f_N(x_N) = \psi_N(x_N)$.
The scheme is compatible with the FAS scheme. Second-order coherence for the Hessian can also be enforced.

13 Properties of the Recursive Search Direction $d_{h,k}$
Assumption: $\sigma_h P_h^\top = R_h$.
First-order coherence: $g_{H,0} = R_h g_{h,k}$ and $(d_{h,k})^\top g_{h,k} = (d_H)^\top g_{H,0}$.
If $f(x)$ is convex, the direction $d_{h,k}$ is a descent direction, $(d_{h,k})^\top g_{h,k} < 0$, and the directional derivative satisfies $(d_{h,k})^\top g_{h,k} \le -(\psi_{H,0} - \psi_{H,i})$.

14 Failure of the Recursive Search Direction
Consider the Rosenbrock function
$\varphi(x) = 100 (x_2 - x_1^2)^2 + (1 - x_1)^2$
with local minimizer $x^* = (1, 1)$ and initial point $x_0 = (-0.5, 0.5)$. Search along the direction $x^* - x_0$? But
$\nabla \varphi(x_0)^\top (x^* - x_0) = (47, 50)(1.5, 0.5)^\top = 95.5 > 0$,
so $x^* - x_0$ is an ascent direction at $x_0$. Figure: contour plot showing $x^*$, $x_k$ and $g_k$.
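This computation is easy to verify numerically; a quick check (our sketch, not from the slides):

```python
import numpy as np

def rosenbrock_grad(x):
    """Gradient of phi(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2."""
    g1 = -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0])
    g2 = 200.0 * (x[1] - x[0]**2)
    return np.array([g1, g2])

x_star = np.array([1.0, 1.0])
x0 = np.array([-0.5, 0.5])
g0 = rosenbrock_grad(x0)        # -> [47., 50.]
d = x_star - x0                 # -> [1.5, 0.5]
print(g0 @ d)                   # 95.5 > 0: d is an ascent direction at x0
```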

15 Obtaining a Descent Recursive Direction
- Convexify the function $f(x)$ on the coarser level, for example by setting $\tilde{\psi}_H = \psi_H + \lambda_H \|x_H - x_{H,0}\|^2$.
- Trust-region method: $\min m_h(s_{h,k})$ subject to $s_{h,k} \in \mathcal{B}_h$.
- Line search method, but checking the descent condition at each step.
- Non-monotone line search.
- Watch-dog technique.

16 A New Line Search Scheme
Given the direction $d_{h,k} = P_h d_{H,i} = P_h (x_{H,i} - x_{H,0})$, one can check $g_{h,k}^\top d_{h,k}$ on the coarser levels because of the first-order coherence:
$g_{h,k}^\top d_{h,k} = g_{H,0}^\top d_{H,i}$.
Since $\psi_H(x_{H,i}) \ge \psi_H(x_{H,0}) + g_{H,0}^\top (x_{H,i} - x_{H,0})$, we check
$\psi_H(x_{H,i}) > \psi_{H,0} + \rho_2\, g_{H,0}^\top d_{H,i} = \psi_{H,0} + \rho_2\, g_{h,k}^\top d_{h,k}$.
Then
$g_{h,k}^\top d_{h,k} < \tfrac{1}{\rho_2} \big( \psi_H(x_{H,i}) - \psi_{H,0} \big) \le 0$.
Sufficient reduction of the function value (Armijo condition):
$\psi_h(x_{h,k} + \alpha_{h,k} d_{h,k}) \le \psi_{h,k} + \rho_1 \alpha_{h,k} (g_{h,k})^\top d_{h,k}$.

17 A New Line Search Scheme
Line search conditions on the coarser levels:
(A) $\psi_h(x_{h,k} + \alpha_{h,k} d_{h,k}) \le \psi_{h,k} + \rho_1 \alpha_{h,k} (g_{h,k})^\top d_{h,k}$.
(B) $\psi_h(x_{h,k} + \alpha_{h,k} d_{h,k}) > \psi_{h,0} + \rho_2\, g_{h,0}^\top (x_{h,k} + \alpha_{h,k} d_{h,k} - x_{h,0})$.
Condition (B) is similar to the Goldstein rule if $k = 0$ and $\rho_2 = 1 - \rho_1$.
Algorithm (backtracking line search):
Step 1. Given $\bar\alpha > 0$, $0 < \rho_1 < 1/2$ and $\rho_1 \le \rho_2 < 1$, let $\alpha^{(0)} = \bar\alpha$ and set $l = 0$.
Step 2. If ($h = N$ and condition (A) is satisfied) or ($h < N$ and both conditions (A) and (B) are satisfied), return $\alpha_{h,k} = \alpha^{(l)}$.
Step 3. Set $\alpha^{(l+1)} = \tau \alpha^{(l)}$, where $\tau \in (0, 1)$; set $l = l + 1$ and go to Step 2.
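A compact sketch of this backtracking procedure (our illustration: the callable psi, the anchors psi0, g0, x0 of the current minimization sequence, and the default parameter values are our own choices, not the talk's):

```python
def backtracking(psi, x, g, d, psi0, g0, x0, finest,
                 alpha=1.0, rho1=1e-3, rho2=0.999, tau=0.5, max_halvings=60):
    """Backtracking line search: Armijo condition (A) on every level;
    below the finest level, the coarse-level lower bound (B) as well."""
    psi_k, gtd = psi(x), g @ d
    for _ in range(max_halvings):
        x_trial = x + alpha * d
        A_ok = psi(x_trial) <= psi_k + rho1 * alpha * gtd               # condition (A)
        B_ok = finest or psi(x_trial) > psi0 + rho2 * (g0 @ (x_trial - x0))  # condition (B)
        if A_ok and B_ok:
            break
        alpha *= tau                                                    # shrink the step
    return alpha
```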

18 Existence of the Step Size
The first step size $\alpha_{h,0}$ always exists! Suppose $\alpha_{h,i}$ for $i = 0, \dots, k$ exist; then $\alpha_{h,k+1}$ also exists.
Define $\phi_{h,k} := \psi_{h,0} + \rho_2\, g_{h,0}^\top (x_{h,k} - x_{h,0})$. Condition (B) at the previous step gives
$\psi_{h,k} > \psi_{h,0} + \rho_2\, g_{h,0}^\top (x_{h,k-1} + \alpha_{h,k-1} d_{h,k-1} - x_{h,0}) = \phi_{h,k}$,
and condition (B) for the new step reads
$\psi_h(x_{h,k} + \alpha d_{h,k}) > \phi_{h,k} + \rho_2\, \alpha\, g_{h,0}^\top d_{h,k}$.
Since $\psi_{h,k} > \phi_{h,k}$, for small $\alpha$ the Armijo line $\psi_{h,k} + \rho_1 \alpha\, g_{h,k}^\top d_{h,k}$ stays above the lower line $\phi_{h,k} + \rho_2 \alpha\, g_{h,0}^\top d_{h,k}$, so an interval of acceptable step sizes exists. Figure: the curve $\psi_h(x_{h,k} + \alpha d_{h,k})$ between the two bounding lines, with trial steps $\tau, \tau^2, \tau^3, \tau^4$. Exit MG if the step size is too small.

19 Choice of Direct Search Direction
We require that a direct search direction satisfy the direct search condition
$\|d_{h,k}\| \le \beta_T \|g_{h,k}\|$ and $(d_{h,k})^\top g_{h,k} \le -\eta_T \|g_{h,k}\|^2$.
The following directions satisfy the condition (in the convex case):
- Steepest descent direction $d_{h,k} = -g_{h,k}$.
- Exact Newton direction $d_{h,k} = -G_{h,k}^{-1} g_{h,k}$.
- Inexact Newton direction generated by the conjugate gradient method.
Other possible choices (see the sketch below):
- Inexact Newton direction generated by the linear multigrid method.
- Limited-memory BFGS direction.
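For the last option, the limited-memory BFGS direction is typically computed with the standard two-loop recursion; a minimal sketch (ours, not from the slides), using stored pairs $s_i = x_{i+1} - x_i$, $y_i = g_{i+1} - g_i$:

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: returns d = -H g, where H approximates the inverse
    Hessian built from the most recent (s_i, y_i) pairs; H0 = gamma * I."""
    q = g.astype(float).copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest to oldest
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if s_list:                                             # initial scaling gamma = s'y / y'y
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest to newest
        b = (y @ q) / (y @ s)
        q += (a - b) * s
    return -q
```

With empty history this reduces to the steepest descent direction $-g$.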

20 The Algorithm
Algorithm: $x_h^* = \mathrm{MGLS}(h, x_{h,0}, g_{h,0})$
Step 1. Given $\kappa > 0$, $\epsilon_h > 0$, $\xi > 0$ and an integer $K$.
Step 2. If $h < N$, compute $v_h = \nabla f_{h,0} - g_{h,0}$ (so that $\nabla \psi_{h,0} = g_{h,0}$); else set $v_h = 0$ and compute $g_{h,0} = \nabla f_{h,0}$.
Step 3. For $k = 0, 1, 2, \dots$:
3.1. If $\|g_{h,k}\| \le \epsilon_h$, or if $h < N$ and $k \ge K$, return the solution $x_{h,k}$.
3.2. If $h = N_0$, or $\|R_h g_{h,k}\| < \kappa \|g_{h,k}\|$, or $\|R_h g_{h,k}\| < \epsilon_h$:
(Direct search direction computation) Compute a descent search direction $d_{h,k}$ on the current level.
Else:
(Recursive search direction computation) Call $x_{H,i} = \mathrm{MGLS}(h-1, R_h x_{h,k}, R_h g_{h,k})$ to return a solution (or approximate solution) $x_{H,i}$ of $\min_{x_H} \psi_H(x_H)$; compute $d_{h,k} = P_h d_{H,i} = P_h (x_{H,i} - R_h x_{h,k})$.
3.3. Call the backtracking line search algorithm to obtain a step size $\alpha_{h,k}$, and set $x_{h,k+1} = x_{h,k} + \alpha_{h,k} d_{h,k}$.
3.4. If $\alpha_{h,k} \le \xi$ and $h < N$, return the solution $x_{h,k+1}$.
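Putting the pieces together, here is a schematic, self-contained sketch of the MGLS recursion (ours, heavily simplified: the direct step is plain steepest descent, the line search checks only the Armijo condition (A) with a safeguard against non-descent recursive directions, and `levels[h]` bundles the level-$h$ function, gradient, restriction and prolongation):

```python
import numpy as np

def mgls(h, x, g0, levels, kappa=1e-4, eps=1e-6, K=20, xi=1e-12, rho1=1e-3):
    """Schematic MGLS. levels[h] = (f, grad, R, P); level 0 is the coarsest.
    On coarser levels psi_h(z) = f_h(z) - v' z with v = grad f_h(x) - g0,
    so the gradient of psi_h at the entry point equals the restricted gradient g0."""
    f, grad, R, P = levels[h]
    finest = h == len(levels) - 1
    v = np.zeros_like(x) if finest else grad(x) - g0
    psi = lambda z: f(z) - v @ z
    for k in range(1000):
        g = grad(x) - v
        if np.linalg.norm(g) <= eps or (not finest and k >= K):
            break
        Rg = None if h == 0 else R @ g
        if h == 0 or np.linalg.norm(Rg) < kappa * np.linalg.norm(g):
            d = -g                                  # direct step (steepest descent here)
        else:
            xH = mgls(h - 1, R @ x, Rg, levels, kappa, eps, K, xi, rho1)
            d = P @ (xH - R @ x)                    # recursive step
        gtd = g @ d
        if gtd >= 0:                                # safeguard: fall back to -g
            d, gtd = -g, -(g @ g)
        alpha, p0 = 1.0, psi(x)
        while psi(x + alpha * d) > p0 + rho1 * alpha * gtd and alpha > 1e-16:
            alpha *= 0.5                            # Armijo only; the talk also checks (B)
        x = x + alpha * d
        if alpha <= xi and not finest:
            break
    return x

# tiny two-level demo on the quadratic f(x) = 0.5 x'Ax - b'x, grad f = Ax - b
n = 7
A = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
P = np.zeros((n, 3))
for j in range(3):
    P[2*j, j] = P[2*j + 2, j] = 0.5
    P[2*j + 1, j] = 1.0
R = 0.5 * P.T                                       # R = sigma_h P'
Ac, bc = R @ A @ P, R @ b                           # Galerkin coarse quadratic
levels = [(lambda z: 0.5*z@Ac@z - bc@z, lambda z: Ac@z - bc, None, None),
          (lambda z: 0.5*z@A@z - b@z,  lambda z: A@z - b,  R, P)]
x = mgls(1, np.zeros(n), None, levels)
print(np.linalg.norm(A @ x - b))                    # small residual
```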

21 Convergence for Convex Functions
Let $\rho_2 = 1$. Assume $f_h(x)$ is twice continuously differentiable and uniformly convex, and suppose the direct search condition is satisfied by all direct search steps. Then:
1. The step size $\alpha_{h,k}$ is bounded below by a constant.
2. $-d_{h,k}^\top g_{h,k} \ge \eta_h \|g_{h,k}\|^2$ and $\cos(\theta_{h,k}) \ge \delta_h$, where $\theta_{h,k}$ is the angle between $d_{h,k}$ and $-g_{h,k}$.
3. $\{x_{N,k}\}$ converges to the unique minimizer $x_N^*$ of $f_N(x_N)$.
4. The rate of convergence is at least R-linear.
5. For any $\epsilon > 0$, after at most
$\tau = \log\big( (f_N(x_{N,0}) - f_N(x_N^*)) / \epsilon \big) / \log(1/c)$
iterations, where $0 < c = 1 - \chi \alpha \eta_N^2 < 1$, we have $f_N(x_{N,k}) - f_N(x_N^*) \le \epsilon$.

22 Convergence for General Nonconvex Functions
Let $\rho_1 \le \rho_2 < 1$. Assume that all direct search directions $d_{h,k}$ satisfy the direct search condition, that the level set $D_h = \{x_h : \psi_h(x_h) \le \psi_h(x_{h,0})\}$ is bounded, and that the objective function $\psi_h$ is continuously differentiable with Lipschitz continuous gradient $\nabla \psi_h$. Then:
1. The step size and the directional derivative of the first iteration of each minimization sequence on the coarser levels can be bounded from below by the norm of the gradient raised to some finite power.
2. At the uppermost level, $\lim_{k \to \infty} \|\nabla f_N(x_{N,k})\| = 0$.

23 Implementation Issues
- Discretization of the problems.
- Choice of the direct search direction.
- Prolongation and restriction operators.
- Full multigrid method (diagram: levels versus iterations, with recursive steps via $P_h$ and $R_h$ and direct steps on each level); a sketch follows below.
- Sharing information between different minimization sequences:
$P^{(l)}: \; \min \psi_h^{(l)}(x_h) = f_h(x_h) - (v_h^{(l)})^\top x_h$,
$P^{(l+1)}: \; \min \psi_h^{(l+1)}(x_h) = f_h(x_h) - (v_h^{(l+1)})^\top x_h$.
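The full multigrid strategy itself is a short driver loop; a sketch (ours; the callables optimize_on_level and prolong are placeholders for a level solver such as MGLS and the interpolation operator):

```python
def full_multigrid(num_levels, x_coarsest, optimize_on_level, prolong):
    """Full multigrid strategy: solve the coarsest problem accurately, then
    repeatedly interpolate the solution to the next finer level and use it
    as the initial point for a few optimization cycles there."""
    x = optimize_on_level(0, x_coarsest)       # coarsest level, solved accurately
    for h in range(1, num_levels):
        x = prolong(h, x)                      # interpolate level h-1 -> h
        x = optimize_on_level(h, x)            # e.g. one MGLS smoothing pass (FMLS)
    return x
```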

24 Test Examples
We compared the following three algorithms:
1. The standard L-BFGS method, denoted by L-BFGS.
2. The mesh refinement technique, denoted by MRLS.
3. The full multigrid algorithm with one smoothing step, denoted by FMLS.
Implementation details: the problems are discretized by finite differences, and the initial point in the FMLS algorithm is taken to be the zero vector. For the MGLS algorithm we set $\kappa = 10^{-4}$, $\epsilon_h = 10^{-5} / 5^{N-h}$, $K = 100$, $\rho_1 = 10^{-3}$, $\rho_2 = 1 - \rho_1$. The number of gradient and step-difference pairs stored by the L-BFGS method was set to 5.

25 Minimal Surface Problem
Consider the minimal surface problem
$\min f(u) = \int_\Omega \sqrt{1 + \|\nabla u(x)\|^2}\, dx \qquad (1)$
s.t. $u(x) \in K = \{ u \in H^1(\Omega) : u(x) = u_\Omega(x) \text{ for } x \in \partial\Omega \}$,
where $\Omega = [0,1] \times [0,1]$ and
$u_\Omega(x) = \begin{cases} y(1-y), & x = 0, 1, \\ x(1-x), & y = 0, 1. \end{cases}$
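A straightforward finite-difference discretization of this objective, usable as a single-level baseline with L-BFGS (our own discretization choices: forward differences on a uniform grid; not necessarily the talk's exact scheme):

```python
import numpy as np
from scipy.optimize import minimize

def minimal_surface(u_int, n):
    """Discretized objective: sum over cells of sqrt(1 + ux^2 + uy^2) * h^2 on an
    n x n grid over [0,1]^2, with interior unknowns u_int and boundary data u_Omega."""
    h = 1.0 / (n - 1)
    t = np.linspace(0.0, 1.0, n)
    U = np.zeros((n, n))                      # U[i, j] ~ u(x_i, y_j)
    U[1:-1, 1:-1] = u_int.reshape(n - 2, n - 2)
    U[0, :] = U[-1, :] = t * (1 - t)          # x = 0, 1 edges: y(1 - y)
    U[:, 0] = U[:, -1] = t * (1 - t)          # y = 0, 1 edges: x(1 - x)
    ux = (U[1:, :-1] - U[:-1, :-1]) / h       # forward differences per cell
    uy = (U[:-1, 1:] - U[:-1, :-1]) / h
    return np.sum(np.sqrt(1.0 + ux**2 + uy**2)) * h * h

n = 17                                        # single-level solve with L-BFGS
res = minimize(minimal_surface, np.zeros((n - 2)**2), args=(n,), method='L-BFGS-B')
print(res.fun)
```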

26 Minimal Surface Problem
Table: summary of computational costs for L-BFGS, FMLS and MRLS (columns: h, nls, nfe, nge, $\|g_N\|_2$, CPU; final gradient norms on the order of 1e-06). Here h indicates the level; "nls", "nfe" and "nge" denote the total number of line searches, function evaluations and gradient evaluations at that level, respectively. We also report the total CPU time measured in seconds and the accuracy attained, measured by the Euclidean norm $\|g_N\|_2$ of the gradient at the final iteration.

27 Minimal Surface Problem
Figure: iteration history of the FMLS method (level versus iteration).

28 Nonlinear PDE
Consider the nonlinear PDE
$-\Delta u + \lambda u e^u = f$ in $\Omega$, $\quad u = 0$ on $\partial\Omega$, $\qquad (2)$
where $\lambda = 10$, $\Omega = [0,1] \times [0,1]$,
$f = \Big( \big( 9\pi^2 + \lambda e^{(x^2 - x^3)\sin(3\pi y)} \big)(x^2 - x^3) + 6x - 2 \Big) \sin(3\pi y)$,
and the exact solution is $u^* = (x^2 - x^3)\sin(3\pi y)$. The corresponding variational problem is
$\min F(u) = \int_\Omega \tfrac{1}{2} \|\nabla u\|^2 + \lambda (u e^u - e^u) - f u \; dx$.
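The manufactured right-hand side can be verified symbolically; a quick check (ours, using SymPy):

```python
import sympy as sp

x, y = sp.symbols('x y')
lam = 10
u = (x**2 - x**3) * sp.sin(3 * sp.pi * y)
# f from the PDE: -Laplacian(u) + lam * u * exp(u)
f = -(sp.diff(u, x, 2) + sp.diff(u, y, 2)) + lam * u * sp.exp(u)
# f as stated on the slide
f_claim = ((9*sp.pi**2 + lam*sp.exp((x**2 - x**3)*sp.sin(3*sp.pi*y)))*(x**2 - x**3)
           + 6*x - 2) * sp.sin(3*sp.pi*y)
print(sp.simplify(f - f_claim))   # -> 0
```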

29 Nonlinear PDE
Table: summary of computational costs for L-BFGS, FMLS and MRLS (columns: h, nls, nfe, nge, $\|g_N\|_2$, CPU; final gradient norms on the order of 1e-06). Here h indicates the level; "nls", "nfe" and "nge" denote the total number of line searches, function evaluations and gradient evaluations at that level, respectively. We also report the total CPU time measured in seconds and the accuracy attained, measured by the Euclidean norm $\|g_N\|_2$ of the gradient at the final iteration.

30 Nonlinear PDE
Figure: iteration history of the FMLS method (level versus iteration).

31 Summary
- Interpretation of multigrid in optimization.
- A new globally convergent line search algorithm, obtained by imposing a new condition in a modified backtracking line search procedure.
Outlook:
- More efficient direct search directions?
- Multigrid for constrained optimization?
Thanks to S. Nash and M. Lewis for the multigrid framework, P. Toint and his group for the recursive TR method, and Y. Yuan for the subspace minimization.
