NatSciLab - Numerical Software: Introduction to MATLAB
Slide 1: NatSciLab - Numerical Software: Introduction to MATLAB
Onur Oktay, Jacobs University Bremen, Spring 2010
Slide 2: Outline
1. Optimization with Matlab
2. Least Squares problem
   - Linear Least Squares Method
   - Nonlinear Least Squares Method
3. ℓ1-norm minimization
Slide 4: Optimization Toolbox - Function list

Minimization:
- bintprog: binary integer programming
- fgoalattain: multiobjective goal attainment
- fminbnd: minimize a single-variable function on a fixed interval
- fmincon: constrained minimization
- fminimax: minimax constraint problem
- fminsearch: derivative-free unconstrained minimization
- fminunc: unconstrained minimization
- fseminf: semi-infinitely constrained minimization
- linprog: solve linear programming problems
- quadprog: solve quadratic programming problems

Least Squares (Curve Fitting):
- lsqcurvefit: nonlinear least-squares data fitting
- lsqlin: constrained linear least-squares data fitting
- lsqnonlin: nonlinear least-squares data fitting
- lsqnonneg: least squares with a nonnegativity constraint
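All of these solvers share a similar calling convention: pass the problem data (a function handle or matrices), a starting point where needed, and optional options. As a minimal sketch of that convention (the toy problem data below are made up, not from the slides), here is linprog on a small linear program:

>> f = [-1; -2];               % minimize f'*x = -x1 - 2*x2
>> A = [1 1; 2 1]; b = [4; 5]; % subject to A*x <= b
>> lb = [0; 0];                % and x >= 0
>> x = linprog(f, A, b, [], [], lb)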
Slide 5: Optimization - Constrained minimization

minimize E(u) subject to
  A_1 u = b_1      (linear equality constraint)
  A_2 u ≤ b_2      (linear inequality constraint)
  G(u) = 0         (nonlinear equality constraint)
  H(u) ≤ 0         (nonlinear inequality constraint)
  r_1 ≤ u ≤ r_2    (domain constraint)

E : R^N → R is the objective function to be minimized.

Linear constraints:
- u is a solution to the linear equation A_1 u = b_1.
- Each entry of the vector A_2 u − b_2 is ≤ 0.

Nonlinear constraints:
- n equality constraints G(u) = (g_1(u), g_2(u), ..., g_n(u)), with g_1(u) = 0, g_2(u) = 0, ..., g_n(u) = 0.
- m inequality constraints H(u) = (h_1(u), h_2(u), ..., h_m(u)), with h_1(u) ≤ 0, h_2(u) ≤ 0, ..., h_m(u) ≤ 0.
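This problem shape maps directly onto fmincon, whose arguments follow the same order: inequalities, equalities, bounds, then the nonlinear constraint function returning [H(u), G(u)]. A hedged sketch of the call (the objective, matrices, and bounds below are placeholders, not from the slides):

>> E = @(u) u(1)^2 + u(2)^2;                  % objective to minimize
>> A2 = [1 1]; b2 = 1;                        % linear inequality A2*u <= b2
>> A1 = []; b1 = [];                          % no linear equality in this sketch
>> r1 = [-2; -2]; r2 = [2; 2];                % bounds r1 <= u <= r2
>> nonlcon = @(u) deal(u(1)*u(2) - 0.5, []);  % returns [H(u), G(u)]; no G here
>> u0 = [0; 0];
>> u = fmincon(E, u0, A2, b2, A1, b1, r1, r2, nonlcon)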
Slide 6: Optimization - fminunc

fminunc finds a local minimum of a function E : R^N → R, starting at an initial point x_0 ∈ R^N. This is generally referred to as unconstrained nonlinear optimization.

Usage: x = fminunc(f, x0, options)
- f is a function handle
- x0 is the starting point
- options [OPTIONAL]: optimization options (see optimset)
- x is a local minimum of the input function.

Simple example:
>> f = @(x) x(1)^2 + x(2)^2;  % anonymous function handle
>> x0 = [1,1];
>> x = fminunc(f,x0);
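fminunc can also return the attained function value and a convergence flag; a short sketch of this fuller calling form (not shown on the slide):

>> [x, fval, exitflag] = fminunc(f, x0);
>> % fval is f evaluated at the returned point; exitflag > 0 indicates convergence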
Slide 7: Optimization - optimset

We use optimset to set the optimization options. For example, if the gradient Df and the Hessian D^2f are available:
>> options = optimset('GradObj','on','Hessian','on')
Both are set to 'off' by default.

When GradObj and Hessian are 'on', the input function f.m must return
- the function value f as the first output,
- the gradient value Df as the second output,
- the Hessian value D^2f as the third output.
This will speed up the calculations.

Remember that, for a function F : R^N → R,
- the gradient DF = (∂F/∂x_1, ∂F/∂x_2, ..., ∂F/∂x_N) is the vector of partial derivatives of F,
- the Hessian D^2F = [∂^2F/∂x_i ∂x_j] is the N×N matrix of second partial derivatives of F.
Slide 8: Optimization - optimset

Some (not all) of the other optimset options:

Display:
- 'off' displays no output.
- 'iter' displays output at each iteration.
- 'notify' displays output only if the algorithm does not converge.
- 'final' (default) displays just the final output.

FunValCheck:
- 'on' displays an error when the objective function returns a complex number, Inf, or NaN.
- 'off' (default) displays no error.

MaxFunEvals: maximum number of function evaluations. Default = 200*numberOfVariables.
MaxIter: maximum number of iterations. Default = 200*numberOfVariables.
TolFun: termination tolerance on the function value.
TolX: termination tolerance on x. Default = 10^-4.
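Several options can be combined in one optimset call as parameter/value pairs; a small sketch with illustrative values (not from the slides):

>> options = optimset('Display','iter', 'TolFun',1e-8, 'TolX',1e-8, 'MaxIter',500);
>> x = fminunc(f, x0, options);   % the solver now prints progress at each iteration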
Slide 9: Optimization - fminunc

Example - sumsin.m:

function [f, Df, D2f] = sumsin(x)
% For simplicity, we assume for the moment that the input is a vector.
f = sum( sin(x).^2 );          % objective value
if nargout > 1                 % compute Df only if it is requested
    Df = 2*sin(x).*cos(x);     % gradient, entrywise = sin(2*x)
end
if nargout > 2                 % compute D2f only if it is requested
    D2f = diag( 2*cos(2*x) );  % Hessian (diagonal, since f is separable)
end

>> options = optimset('GradObj','on','Hessian','on');
>> x0 = [4,1];
>> x = fminunc(@sumsin,x0,options);
Slide 10: Optimization - fminsearch

Use fminsearch if E : R^N → R is not differentiable, e.g., E(x) = sum(abs(x)). fminsearch finds a local minimum of a function E : R^N → R, starting at a point x_0 ∈ R^N.

Usage: x = fminsearch(f, x0, options)
- f is a function handle
- x0 is the starting point
- options [OPTIONAL]: optimization options (see optimset)
- x is a local minimum of the input function.

Example:
>> f = @(x) x(1)^2 + x(2)^2;  % anonymous function handle
>> x0 = [4,0];
>> x = fminsearch(f,x0);
>> options = optimset('TolFun',1e-6);
>> xx = fminsearch(@sumsin, x0, options);
Slide 11: Outline
1. Optimization with Matlab
2. Least Squares problem
   - Linear Least Squares Method
   - Nonlinear Least Squares Method
3. ℓ1-norm minimization
Slide 12: Description of the Least Squares problem

We start with
- Data: {(x_k, y_k), k = 1, 2, ..., r}.
- Goal: fit the data to a model Y = f(X, c), where
  - f is the modeling function,
  - c = (c_1, c_2, ..., c_N) is the set of parameters to be determined by LS.

Least squares problem: find c = (c_1, c_2, ..., c_N) which minimizes the sum of squares

  E(c) = Σ_{k=1}^{r} ( y_k − f(x_k, c) )^2

among all possible choices (unconstrained) of parameters c ∈ R^N.

We say that the LS method is linear if f is of the form

  f(X, c) = Σ_{n=1}^{N} c_n f_n(X).
Slide 13: Example 1
- Data: week3-example1.mat
- Model: f(X, c) = c_1 + c_2 X + c_3 X^2

The LS method gave values of c_1, c_2, c_3 matching the fitted curve shown below.

[Figure: data points (x_k, y_k) together with the fitted LS model Y = X^2 − 2X + 3.]
Slide 14: Linear Least Squares Method

The LS method is linear if f is of the form

  f(X, c) = Σ_{n=1}^{N} c_n f_n(X).

Linear LS can be written in matrix form. Designate
- x = [x_1; x_2; ...; x_r], y = [y_1; y_2; ...; y_r], c = [c_1; c_2; ...; c_N] as column vectors,
- A = [f_1(x), f_2(x), ..., f_N(x)] as an r×N matrix, where f_n applies to x entrywise: f_n(x) = [f_n(x_1); f_n(x_2); ...; f_n(x_r)].

Now f(x, c) = A c, and

  E(c) = Σ_{k=1}^{r} ( y_k − f(x_k, c) )^2 = ||y − A c||_2^2.

Then the linear LS problem is: find c = (c_1, c_2, ..., c_N) which minimizes

  E(c) = ||y − Ac||_2^2, with ||y − Ac||_2 = norm(y − A*c, 2),

among all possible choices (unconstrained) of parameters c ∈ R^N.
Slide 15: Linear Least Squares Method

  y − f(x, c) =
  [ y_1 ]   [ f_1(x_1)  f_2(x_1)  ...  f_N(x_1) ] [ c_1 ]
  [ y_2 ]   [ f_1(x_2)  f_2(x_2)  ...  f_N(x_2) ] [ c_2 ]
  [ y_3 ] − [ f_1(x_3)  f_2(x_3)  ...  f_N(x_3) ] [  :  ]
  [  :  ]   [    :         :      ...     :     ] [ c_N ]
  [ y_r ]   [ f_1(x_r)  f_2(x_r)  ...  f_N(x_r) ]

The best method to solve the linear LS problem is mldivide (the backslash operator).

Back to Example 1:
>> load week3-example1.mat
>> x = data(:,1); y = data(:,2); r = length(x);
>> A = [ones(r,1), x, x.^2];   % f_1(x) = 1, f_2(x) = x, f_3(x) = x^2
>> c = A \ y                   % LS solution
>> yls = A*c; LSerror = y - A*c;
>> figure(1); plot(x,y,'o'); hold on; plot(x, yls, 'r')
>> figure(2); plot(1:r, LSerror);
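A brief note on why mldivide is preferred here (a standard numerical-linear-algebra point, not spelled out on the slide): for a rectangular full-rank A, the backslash operator solves the LS problem via a QR factorization, which avoids explicitly forming A'*A and is numerically more stable than solving the normal equations:

>> c1 = A \ y;             % QR-based least-squares solution
>> c2 = (A'*A) \ (A'*y);   % normal equations: same c in exact arithmetic, less stable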
Slide 16: Linear Least Squares Method

Example 2 - Cost minimization. A producer wants to model its total cost TC as a function of the number of items produced, X. TC is the sum of the electricity usage Y and the maintenance costs Z.

- {(x_k, y_k, z_k), k = 1, ..., r} is gathered over a period of r days.
- We want to fit this data to the model f(X, c) = Y + Z, with
  Y = c_1 + c_2 X^2,  Z = c_3 log(X) + c_4 sin(0.01πX).
- f(X, c) = [1, X^2, log(X), sin(0.01πX)] c, so the LS problem is linear.

>> load week3-example2.mat
>> x = data2(:,1); y = data2(:,2); z = data2(:,3); r = length(x);
>> A = [ones(r,1), x.^2, log(x), sin(0.01*pi*x)];
>> TC = y + z;
>> c = A \ TC
>> yls = A(:,[1,2]) * c([1;2]); LSerrory = y - yls;
>> zls = A(:,[3,4]) * c([3;4]); LSerrorz = z - zls;
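As a quick visual check of this fit, in the spirit of Example 1's plots (a usage sketch, not from the slides):

>> figure(1); plot(x, TC, 'o'); hold on; plot(x, A*c, 'r')  % data vs fitted total cost
>> figure(2); plot(1:r, TC - A*c)                           % LS residual over the r days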
Slide 17: Nonlinear Least Squares Method

When f is not linear, we directly minimize E : R^N → R,

  E(c) = Σ_{k=1}^{r} ( y_k − f(x_k, c) )^2.

A twice differentiable function E : R^N → R has a local minimum at c = (c_1, c_2, ..., c_N) if
- the gradient, the vector of the 1st partial derivatives at c, is zero, and
- the Hessian, the matrix of the 2nd partial derivatives at c, has all positive eigenvalues.

Use, e.g., fminunc or lsqnonlin. Provide the gradient and the Hessian if possible; this will speed up the calculations. If E : R^N → R is not differentiable, use fminsearch.

Now, let's see examples of how to use fminunc, lsqnonlin, and fminsearch with nonlinear LS problems.
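lsqnonlin is tailored to exactly this problem shape: instead of the scalar sum of squares E(c), it takes a handle returning the residual vector (y_k − f(x_k, c)) and forms the sum of squares internally. A minimal sketch, assuming data vectors x, y and the cosine model of the next slide (the starting point is illustrative):

>> res = @(c) y - (c(1) + cos(c(2)*x + c(3)));  % residual vector, one entry per data point
>> c0 = [1; 2; 3];
>> c = lsqnonlin(res, c0);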
Slide 18: Nonlinear Least Squares Method

Example 3. Given {(x_k, y_k), k = 1, ..., r} in week3-example3.mat, we want to fit this data to the model Y = f(X, c), where

  f(X, c) = c_1 + cos(c_2 X + c_3).

Form

  E(c_1, c_2, c_3) = Σ_{k=1}^{r} ( y_k − c_1 − cos(c_2 x_k + c_3) )^2.

Write a Matlab function E.m whose inputs are c, x, y. Then use a Matlab function to find the minimum of E. For example:
>> c0 = [1,2,3]; c = fminsearch(@(c) E(c,x,y), c0)
>> c0 = [1,0,0]; c = fminunc(@(c) E(c,x,y), c0)
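A vectorized sketch of such an E.m (one plausible implementation; the slides do not show the file itself):

function v = E(c, x, y)
% Sum of squared residuals for the model Y = c(1) + cos(c(2)*X + c(3)).
v = sum( (y - c(1) - cos(c(2)*x + c(3))).^2 );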
Slide 19: Outline
1. Optimization with Matlab
2. Least Squares problem
   - Linear Least Squares Method
   - Nonlinear Least Squares Method
3. ℓ1-norm minimization
Slide 20: ℓ1-norm minimization

We use ℓ1-norm minimization if we know that the residual vector y − f(x,c) has only a few nonzero entries.

ℓ1-norm minimization problem: find c = (c_1, c_2, ..., c_N) which minimizes

  E(c) = Σ_{k=1}^{r} | y_k − f(x_k, c) |

among all possible choices (unconstrained) of parameters c ∈ R^N.

In such cases, ℓ1-norm minimization usually gives a better outcome than LS: squaring the residuals lets a few large corrupted entries dominate E, while absolute values do not.
Slide 21: ℓ1-norm minimization

Example 4. Given {(x_k, y_k), k = 1, ..., r} in week3-example4.mat, we know that the data strictly obeys the model Y = f(X, c),

  f(X, c) = c_1 + cos(c_2 X + c_3),

except that, when collecting the data, a few of the y_k's were corrupted. We want to find the unknown parameters c_1, c_2, c_3. We use fminsearch, since E is not differentiable everywhere.

>> E = @(c,x,y) sum(abs( c(1) + cos(c(2)*x + c(3)) - y ));
>> c0 = [1,2,1]; c = fminsearch(@(c) E(c,x,y), c0)
Slide 22: ℓ1-norm minimization - Example 4 (result)

[Figure: the data y_k together with the recovered model Y = 4 + cos(2x − 3).]
Slide 23: Recommended Reading
Scientific Computing with MATLAB and Octave by Quarteroni & Saleri, Chapter 3, Sections 3.1 and 3.4.