Optimization with nonnegativity constraints
1 Optimization with nonnegativity constraints
Arie Verhoeven, CASA Seminar, May 30, 2007
2 Seminar: Inverse problems
1. Introduction (Yves van Gennip, February 21)
2. Regularization strategies (Miguel Patricio, March 3)
3. Regularization by Galerkin methods (Hans Groot, March 21)
4. Inverse eigenvalue problems (Marco Veneroni, April 4)
5. Image deblurring (Willem Dijkstra, April 18)
6. Parameter identification (Nico van der Aa, May 2)
7. Total variation regularization (Mark van Kraaij, May 23)
8. Optimization with nonnegativity constraints (Arie Verhoeven, May 30)
9. ?? (Martijn Slob, June 13)
10. ?? (Marc Noot, June 20)
3 Outline
1. Introduction
2. Theory of constrained optimization
3. Numerical variational methods
4. Iterative nonnegative regularization methods
5. Numerical test results
6. Conclusions
5 Formulation of the problem
Discretized mathematical model (via the discrete Fourier transform): $f^{\mathrm{true}}_i = f^{\mathrm{true}}(x_i)$, $g^{\mathrm{true}} = K f^{\mathrm{true}}$.
Measurements, e.g. from astronomical imaging, satisfy $d_i \sim \mathrm{Poisson}(g^{\mathrm{true}}_i) + \mathrm{Normal}(0, \sigma^2)$.
Goal: find $f$ for given $d$ and $K$.
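To make the setup concrete, the sketch below simulates such data in NumPy. The grid size, the Gaussian blur matrix K, and the noise level sigma are hypothetical choices, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128
x = (np.arange(n) + 0.5) / n                 # grid points x_i in (0, 1)

# Hypothetical Gaussian blur as the discrete operator K (row-normalized)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 0.01 ** 2))
K /= K.sum(axis=1, keepdims=True)

f_true = np.where((x > 0.1) & (x < 0.25), 750.0, 0.0)   # stand-in signal
g_true = K @ f_true                                     # g_true = K f_true

sigma = 5.0
d = rng.poisson(g_true) + rng.normal(0.0, sigma, n)     # Poisson + Gaussian noise
```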
6 Least squares method with Tikhonov regularization
The least squares solution minimizes the functional
$$J_{\mathrm{ls}}(f) = \tfrac{1}{2} \|Kf - d\|^2 + \tfrac{\alpha}{2} \|f\|^2.$$
The minimizer is given by
$$f^{\mathrm{ls}}_\alpha = (K^T K + \alpha I)^{-1} K^T d.$$
The regularization parameter $\alpha$ is selected to minimize $\|f^{\mathrm{ls}}_\alpha - f^{\mathrm{true}}\|$. But this approach does not work in general, because $f^{\mathrm{true}}$ is unknown.
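A minimal sketch of this closed-form minimizer, assuming K is given as a dense NumPy array:

```python
import numpy as np

def tikhonov_ls(K, d, alpha):
    """Closed-form minimizer of J_ls: (K^T K + alpha I)^{-1} K^T d."""
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + alpha * np.eye(n), K.T @ d)
```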
7 The L-curve method
Define $X(\alpha) = \log \|Kf_\alpha - d\|^2$ and $Y(\alpha) = \log \|f_\alpha\|^2$. With Tikhonov regularization, $X$ and $Y$ are smooth functions. Then $\alpha$ can be selected to maximize the curvature function
$$\kappa(\alpha) = \frac{\ddot X(\alpha)\,\dot Y(\alpha) - \dot X(\alpha)\,\ddot Y(\alpha)}{\big(\dot X(\alpha)^2 + \dot Y(\alpha)^2\big)^{3/2}}.$$
Note that this selected point corresponds to the "corner" of the L-curve, which plots $X$ against $Y$. Although this method is nonconvergent, it can be used to improve the value of $\alpha$ without knowledge of $f^{\mathrm{true}}$.
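One way to realize this rule is to evaluate X and Y on a user-supplied grid of candidate values and approximate the derivatives by finite differences; a sketch, with the grid and the log-alpha parametrization as my assumptions (curvature is invariant under reparametrization):

```python
import numpy as np

def lcurve_alpha(K, d, alphas):
    """Return the alpha from the candidate array `alphas` that maximizes
    the L-curve curvature, with derivatives taken in log(alpha)."""
    n = K.shape[1]
    X, Y = np.empty(len(alphas)), np.empty(len(alphas))
    for j, a in enumerate(alphas):
        f = np.linalg.solve(K.T @ K + a * np.eye(n), K.T @ d)
        X[j] = np.log(np.sum((K @ f - d) ** 2))
        Y[j] = np.log(np.sum(f ** 2))
    t = np.log(alphas)
    Xd, Yd = np.gradient(X, t), np.gradient(Y, t)
    Xdd, Ydd = np.gradient(Xd, t), np.gradient(Yd, t)
    kappa = (Xdd * Yd - Xd * Ydd) / (Xd ** 2 + Yd ** 2) ** 1.5
    return alphas[np.argmax(kappa)]
```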
8 Nonnegatively constrained minimization
For astronomical imaging it is well known that $f_i \geq 0$. Thus it is more accurate to solve instead
$$\min_f J_{\mathrm{ls}}(f) \quad \text{subject to } f \geq 0.$$
Even better is to include the stochastic information of the measurements. Then we solve the following constrained Poisson likelihood minimization problem
$$\min_f J_{\mathrm{lhd}}(f) \quad \text{subject to } f \geq 0,$$
where, with $g = Kf$,
$$J_{\mathrm{lhd}}(f) = \sum_{i=1}^n (g_i + \sigma^2) - \sum_{i=1}^n \big(\max\{d_i, 0\} + \sigma^2\big) \log(g_i + \sigma^2) + \frac{\alpha}{2} \|f\|^2.$$
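A direct NumPy evaluation of this functional might look as follows; it assumes $g_i + \sigma^2 > 0$, which holds for $f \geq 0$ and a nonnegative kernel:

```python
import numpy as np

def J_lhd(f, K, d, sigma2, alpha):
    """Regularized Poisson likelihood functional from the slide."""
    g = K @ f
    w = np.maximum(d, 0.0) + sigma2          # (max{d_i, 0} + sigma^2)
    return (np.sum(g + sigma2)
            - np.sum(w * np.log(g + sigma2))
            + 0.5 * alpha * np.sum(f ** 2))
```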
9 Example
$$f^{\mathrm{true}}(x) = \begin{cases} 750, & 0.1 < x < 0.25, \\ 250, & 0.3 < x < 0.32, \\ (x - 0.75)(0.85 - x), & 0.75 < x < 0.85, \\ 0, & \text{otherwise.} \end{cases}$$
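For illustration, this test signal can be sampled on a grid as below; the grid size is a hypothetical choice.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 512)
f_true = np.select(
    [(0.10 < x) & (x < 0.25),
     (0.30 < x) & (x < 0.32),
     (0.75 < x) & (x < 0.85)],
    [750.0, 250.0, (x - 0.75) * (0.85 - x)],  # the three nonzero pieces
    default=0.0,
)
```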
10 Constrained likelihood minimization
12 Optimization
Consider the inequality constrained minimization problem
$$\min_{f \in \mathbb{R}^n} J(f) \quad \text{subject to } c(f) \geq 0.$$
Active set of indices: $A(f) = \{\, i \mid c_i(f) = 0 \,\}$.
13 KKT conditions
Karush-Kuhn-Tucker (first order necessary) conditions for inequality constrained minimization: there exists a vector $\lambda^*$ such that
$$\operatorname{grad} J(f^*) - \sum_{i=1}^m \lambda_i^* \operatorname{grad} c_i(f^*) = 0$$
and
$$\lambda_i^* \geq 0, \quad c_i(f^*) \geq 0, \quad \lambda_i^* c_i(f^*) = 0.$$
We define the projection of $f$ onto $C$ as
$$P_C(f) = \arg\min_{v \in C} \|f - v\|.$$
For closed convex $C$, the operator $P_C$ is well defined and continuous.
14 Nonnegativity constraints
Now we consider the problem
$$\min_{f \in \mathbb{R}^n} J(f) \quad \text{subject to } f \geq 0.$$
If $J$ is continuously differentiable and $f^*$ is a local minimizer, it follows that $\lambda^* = \operatorname{grad} J(f^*)$. Then we obtain
$$\frac{\partial J}{\partial f_i}(f^*) \geq 0, \quad f_i^* \geq 0, \quad f_i^* \, \frac{\partial J}{\partial f_i}(f^*) = 0.$$
A point which satisfies these conditions is a critical point, but it need not be a minimizer if, e.g., $J$ is not strictly convex.
15 Nonnegativity constraints II
For nonnegativity constraints, we define the feasible set $C = \{f \mid f \geq 0\}$. The projected gradient $\nabla_C J : C \to \mathbb{R}^n$ satisfies
$$[\nabla_C J]_i = \begin{cases} \frac{\partial J}{\partial f_i}(f), & f_i > 0, \\ \min\big\{0, \frac{\partial J}{\partial f_i}(f)\big\}, & f_i = 0. \end{cases}$$
Thus $f^*$ is a critical point if and only if $\nabla_C J(f^*) = 0$. For this $C$ we define
$$P(f) = \arg\min_{v \geq 0} \|v - f\|.$$
If $f^*$ is a local minimizer, then $f^* = P(f^* - \tau \operatorname{grad} J(f^*))$ for any $\tau > 0$.
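Both objects are simple to realize for the nonnegative orthant; a minimal NumPy sketch:

```python
import numpy as np

def project(f):
    """P(f): the closest nonnegative vector to f."""
    return np.maximum(f, 0.0)

def projected_gradient(f, grad):
    """[grad_C J]_i: the full partial where f_i > 0, and only the
    negative part of the partial where f_i = 0."""
    on_boundary = (f == 0.0)
    pg = grad.copy()
    pg[on_boundary] = np.minimum(0.0, grad[on_boundary])
    return pg
```

Then f is a critical point exactly when `projected_gradient(f, gradJ(f))` vanishes.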
17 Gradient projection method
ν := 0; f_0 := nonnegative initial guess;
begin
  p_ν := −grad J(f_ν);
  τ_ν := arg min_{τ>0} J(P(f_ν + τ p_ν));
  f_{ν+1} := P(f_ν + τ_ν p_ν);
  ν := ν + 1;
end
This generalized steepest descent method converges linearly to the global minimizer if J is strictly convex, coercive, and Lipschitz continuous.
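A minimal Python sketch of this iteration, with the exact line search replaced by a simple backtracking rule (one of several reasonable choices, not the rule from the slide):

```python
import numpy as np

def gradient_projection(J, gradJ, f0, iters=100):
    """Gradient projection with a backtracking projected line search."""
    f = np.maximum(f0, 0.0)                  # start from a feasible point
    for _ in range(iters):
        p = -gradJ(f)                        # steepest descent direction
        Jf = J(f)
        tau = 1.0
        while J(np.maximum(f + tau * p, 0.0)) > Jf and tau > 1e-12:
            tau *= 0.5                       # backtrack until J decreases
        f = np.maximum(f + tau * p, 0.0)     # f_{nu+1} = P(f_nu + tau p_nu)
    return f
```

For example, J and gradJ could be built from the Tikhonov functional defined earlier.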
18 Projected Newton method
ν := 0; f_0 := nonnegative initial guess;
begin
  g_ν := grad J(f_ν);
  identify active set A_ν;
  H_R := reduced Hessian at f_ν;
  s := −H_R^{-1} g_ν;
  τ_ν := arg min_{τ>0} J(P(f_ν + τ s));
  f_{ν+1} := P(f_ν + τ_ν s);
  ν := ν + 1;
end
The reduced Hessian equals
$$[H_R]_{ij} = \begin{cases} \delta_{ij}, & i \in A(f) \text{ or } j \in A(f), \\ \frac{\partial^2 J}{\partial f_i \partial f_j}, & \text{otherwise.} \end{cases}$$
If the active set can be correctly identified, this algorithm will be locally quadratically convergent.
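A sketch of one reduced-Newton direction for a dense Hessian; the rule used here to guess the active set is a common heuristic, not specified on the slide:

```python
import numpy as np

def projected_newton_step(f, grad, hess, tol=1e-12):
    """Form H_R as on the slide (identity on rows/columns touching the
    active set), zero the corresponding gradient entries, and solve
    for the direction s = -H_R^{-1} g."""
    active = (f <= tol) & (grad > 0)          # heuristic active-set guess
    H = hess.copy()
    H[active, :] = 0.0
    H[:, active] = 0.0
    H[active, active] = 1.0                   # [H_R]_ij = delta_ij on A(f)
    g = grad.copy()
    g[active] = 0.0
    return np.linalg.solve(H, -g)
```

The resulting s is then used in the projected line search f_{ν+1} = P(f_ν + τs).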
19 Gradient projection-reduced Newton method
ν := 0; f_0 := nonnegative initial guess;
begin
  Gradient projection stage:
    p_GP := −grad J(f_ν);
    τ_GP := arg min_{τ>0} J(P(f_ν + τ p_GP));
    f_ν^GP := P(f_ν + τ_GP p_GP);
  Reduced Newton stage:
    identify active set A(f_ν^GP);
    g_R := reduced gradient at f_ν^GP;
    H_R := reduced Hessian at f_ν^GP;
    s := −H_R^{-1} g_R;
    τ_RN := arg min_{τ>0} J(P(f_ν^GP + τ s));
    f_{ν+1} := P(f_ν^GP + τ_RN s);
  ν := ν + 1;
end
Here the reduced gradient is
$$[g_R(f)]_i = \begin{cases} 0, & i \in A(f), \\ \frac{\partial J}{\partial f_i}(f), & \text{otherwise.} \end{cases}$$
This algorithm combines the global convergence of gradient projection with the locally quadratic rate of projected Newton. For large-scale problems the linear system $H_R s = -g_R$ could be solved by an iterative method like the conjugate gradient method.
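The CG option can be implemented matrix-free; a sketch using SciPy, assuming only a Hessian-vector product `hessvec` is available and that the reduced Hessian is positive definite:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def reduced_newton_direction(grad, hessvec, active):
    """Solve H_R s = -g_R by CG; hessvec(v) is assumed to return
    (nabla^2 J) v at the current iterate."""
    n = grad.size
    g_R = np.where(active, 0.0, grad)        # reduced gradient from the slide

    def Hr_mv(v):
        w = hessvec(np.where(active, 0.0, v))
        return np.where(active, v, w)        # identity block on the active set

    H_R = LinearOperator((n, n), matvec=Hr_mv, dtype=float)
    s, info = cg(H_R, -g_R)
    return s
```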
21 Richardson-Lucy iteration
Iterative methods use the iteration count as regularization parameter. Consider
$$J(f) = \sum_{i=1}^m d_i \log [Kf]_i.$$
Approximation of the maximizer by the Richardson-Lucy iteration:
$$f_j^{\nu+1} = \frac{f_j^\nu}{k_j} \sum_{i=1}^m k_{ij} \, \frac{d_i}{\sum_{l=1}^n k_{il} f_l^\nu}, \quad \text{where } k_j = \sum_{l=1}^m k_{lj}.$$
We get a sequence of approximations to the maximizer of $J(f)$ subject to
$$\sum_{i=1}^m [Kf]_i = \sum_{i=1}^m d_i.$$
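A minimal NumPy sketch of this iteration; the flat initial guess and the small eps guard in the denominator are my additions. Iterates stay nonnegative whenever f_0, K, and d are nonnegative.

```python
import numpy as np

def richardson_lucy(K, d, iters=50, eps=1e-12):
    """Richardson-Lucy iteration as on the slide."""
    k = K.sum(axis=0)                        # k_j = sum_l k_{lj}
    f = np.ones(K.shape[1])                  # flat nonnegative start (assumed)
    for _ in range(iters):
        ratio = d / (K @ f + eps)            # d_i / sum_l k_{il} f_l^nu
        f = (f / k) * (K.T @ ratio)          # multiplicative update
    return f
```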
22 Modified steepest descent
ν := 0; f_0 := nonnegative initial guess;
g_0 := K^T (K f_0 − d); γ := (g_0, f_0 . g_0);
begin
  p_ν := −f_ν . g_ν;
  u := K p_ν;
  τ_bndry := min{ −[f_ν]_i / [p_ν]_i : [p_ν]_i < 0 };
  τ_ν := min{ γ / (u, u), τ_bndry };
  f_{ν+1} := f_ν + τ_ν p_ν;
  g_{ν+1} := g_ν + τ_ν K^T u;
  γ := (g_{ν+1}, f_{ν+1} . g_{ν+1});
  ν := ν + 1;
end
Here "." denotes the componentwise product and (·,·) the Euclidean inner product; τ_ν is the smaller of the unconstrained minimizer along p_ν and the largest step that keeps f nonnegative.
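A sketch of this iteration for J(f) = ½‖Kf − d‖², so grad J = K^T(Kf − d); the early-exit threshold is my addition.

```python
import numpy as np

def modified_steepest_descent(K, d, iters=100):
    """Componentwise-scaled steepest descent on the nonnegative orthant."""
    f = np.ones(K.shape[1])                  # nonnegative initial guess
    g = K.T @ (K @ f - d)
    for _ in range(iters):
        p = -f * g                           # p_nu = -f_nu . g_nu
        gamma = np.dot(g, f * g)             # gamma = (g, f . g) >= 0
        if gamma <= 1e-15:                   # f . g ~ 0: critical point
            break
        u = K @ p
        tau_unc = gamma / np.dot(u, u)       # unconstrained step along p
        neg = p < 0
        tau_bd = np.min(-f[neg] / p[neg]) if neg.any() else np.inf
        tau = min(tau_unc, tau_bd)           # stay in the nonnegative orthant
        f = f + tau * p
        g = g + tau * (K.T @ u)              # g = K^T(Kf - d) at the new f
    return f
```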
24 Numerical test results: 1D example (figures)
25 Numerical test results: 2D example (figures)
27 Summary
- Nonnegativity constraints
- Theory of constrained optimization
- Variational methods: gradient projection method, projected Newton method, gradient projection-reduced Newton method, gradient projection-CG method
- Iterative methods: Richardson-Lucy iteration, modified steepest descent algorithm
- Numerical test results
28 Conclusions
- Optimization with nonnegativity constraints often leads to more accurate reconstructions, e.g. with fewer unwanted oscillations.
- Optimizing the Poisson likelihood is more accurate than least squares.
- Iterative methods are preferable if no good a priori value of the regularization parameter is available.
- Variational regularization methods are more flexible, because they allow the use of prior information about the solution and constraints.
30 Literature
- C.R. Vogel: Computational Methods for Inverse Problems, SIAM, Philadelphia, 2002.
- J. Nocedal and S.J. Wright: Numerical Optimization, Springer-Verlag, New York.
- S.G. Nash and A. Sofer: Linear and Nonlinear Programming, McGraw-Hill, New York.
- H.W. Engl, M. Hanke and A. Neubauer: Regularization of Inverse Problems, Kluwer Academic Publishers, Dordrecht.
- W.H. Richardson: Bayesian-based iterative methods for image restoration, Journal of the Optical Society of America, 62 (1972).
- L.B. Lucy: An iterative method for the rectification of observed distributions, Astronomical Journal, 79 (1974).
31 Questions?
More information