SF2822 Applied nonlinear optimization, final exam Saturday December

SF2822 Applied nonlinear optimization, final exam Saturday December 15 2007, 8.00-13.00. Examiner: Anders Forsgren, tel. 790 71 27.

Allowed tools: Pen/pencil, ruler and rubber; plus a calculator provided by the department.

Solution methods: Unless otherwise stated in the text, the problems should be solved by systematic methods which do not become unrealistic for large problems. If you use methods other than those taught in the course, you must explain them carefully.

Note! Personal number must be written on the title page. Write only one exercise per sheet. Number the pages and write your name on each page.

22 points are sufficient for a passing grade. For 20-21 points, a completion to a passing grade may be made within three weeks from the date when the results of the exam are announced.

1. Consider the problem

   (NLP)   minimize    2e^(x_1) + (x_2 - x_1)^2 + x_3^2
           subject to  x_1 x_2 x_3 >= 2,
                       x_1 + x_3 <= c,
                       x >= 0,

   where c is a constant. Let x~ denote the given point.

   (a) Is there any value of c such that x~ satisfies the first-order necessary optimality conditions for (NLP)? (6p)

   (b) Is there any value of c such that x~ is a global minimizer to (NLP)? (4p)

2. Consider the quadratic program (QP) defined by

   (QP)    minimize    (1/2) x^T H x + c^T x
           subject to  Ax >= b,

   with the numerical data H, c, A and b as given.
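For Exercise 1, recall the general form of the first-order necessary (KKT) conditions for a problem of the form "minimize f(x) subject to g(x) >= 0". The sketch below is the standard textbook form, not the official exam solution; the multiplier sign follows the convention L(x, λ) = f(x) − λ^T g(x) stated in the remark to Exercise 3:

```latex
\[
\begin{aligned}
  \nabla f(\bar{x}) - \nabla g(\bar{x})^{T} \lambda &= 0, \\
  g(\bar{x}) &\ge 0, \\
  \lambda &\ge 0, \\
  \lambda_i \, g_i(\bar{x}) &= 0, \quad i = 1, \dots, m,
\end{aligned}
\]
```

where \(\nabla g(\bar{x})\) denotes the Jacobian of the constraints at \(\bar{x}\). Checking whether some value of c makes the given point stationary amounts to checking whether nonnegative multipliers exist that satisfy these equations with complementarity.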

The problem is illustrated geometrically in the figure below.

   (a) Solve (QP) by an active-set method. Start at the point x^(0) = (5 0)^T with exactly one constraint in the working set, namely x_1 <= 5. You need not compute any numerical values, but you may utilize the fact that the problem is two-dimensional and give a purely geometric solution. Illustrate your iterations in the figure corresponding to Exercise 2a, which is appended at the end. Motivate each step carefully. (5p)

   (b) Solve (QP) with the same method as in Exercise 2a and with the same starting point x^(0) = (5 0)^T, but with x_2 >= 0 as the only constraint in the working set instead. Illustrate your iterations in the figure corresponding to Exercise 2b, which is appended at the end. Motivate each step carefully. (5p)

3. Consider the nonlinear program

   (NLP)   minimize    f(x)
           subject to  g_i(x) >= 0, i = 1, 2, 3,
                       x ∈ R^2,

   where f : R^2 -> R and g_i : R^2 -> R, i = 1, 2, 3, are twice continuously differentiable. Assume specifically that we start at a given point x^(0), where the numerical values of f(x^(0)), ∇f(x^(0)) and ∇²f(x^(0)), together with g_i(x^(0)), ∇g_i(x^(0)) and ∇²g_i(x^(0)) for i = 1, 2, 3, are given; in particular g_1(x^(0)) = 2 and g_3(x^(0)) = 4.
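In an active-set method of the kind Exercise 2 calls for, each iteration solves an equality-constrained subproblem in which the working-set constraints are held as equalities. A generic sketch (standard form; the working set W_k and constraint rows a_i are the usual notation, not data from this exam):

```latex
\[
\begin{aligned}
  \min_{p}\ \ & \tfrac{1}{2}\, p^{T} H p + (H x_k + c)^{T} p \\
  \text{s.t.}\ \ & a_i^{T} p = 0, \quad i \in \mathcal{W}_k .
\end{aligned}
\]
```

If the solution is p_k = 0, one inspects the signs of the working-set multipliers: all admissible signs means optimality, otherwise a constraint with a wrong-signed multiplier is deleted. If p_k ≠ 0, one takes the largest step α_k ≤ 1 along p_k that keeps the remaining constraints feasible (a ratio test), and adds a blocking constraint to the working set when α_k < 1. These are the steps a careful geometric solution should make visible in the figure.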

In addition, assume that the initial estimate of the Lagrange multipliers, λ^(0), is chosen as λ^(0) = (1 2 3)^T.

   (a) Assume that we want to solve (NLP) by a sequential quadratic programming method, starting at x^(0), λ^(0). Formulate the quadratic program which is to be solved at the first iteration. Insert numerical values in your quadratic program. Show how you would use the optimal solution of the quadratic program to generate x^(1) and λ^(1). (We assume that no linesearch is needed.) (5p)

   Note: You need not solve the quadratic program.

   (b) Assume that we want to solve (NLP) by a primal-dual interior method, starting at x^(0), λ^(0). Formulate a linear system of equations which is to be solved at the first iteration. Insert numerical values in your linear equations. Show how you would use the solution of the linear equations to generate x^(1) and λ^(1). (5p)

   Note: You need not solve the linear equations. If you choose to introduce slack variables s, assign suitable initial values to s^(0) and show also how you would generate s^(1).

   Remark: In accordance with the notation of the textbook, the sign of λ is chosen such that L(x, λ) = f(x) - λ^T g(x).

4. Derive the expression for the symmetric rank-one update C_k in a quasi-Newton update B_{k+1} = B_k + C_k. (10p)

5. Consider the equality-constrained quadratic program (EQP) defined by

   (EQP)   minimize    (1/2) x^T H x + c^T x
           subject to  Ax = b,

   with the numerical data H and c as given. We will consider two sets of A and b.

   (a) For the first choice of A and b, with b = 2, compute a point that satisfies the first-order necessary optimality conditions for (EQP). (3p)

   (b) Show that the point computed in Exercise 5a is not a local minimizer to the corresponding equality-constrained quadratic program. (2p)
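The symmetric rank-one update of Exercise 4 can be checked numerically. A minimal Python sketch, with illustrative values of B, s and y (not data from this exam): the update C_k = (y − B_k s)(y − B_k s)^T / ((y − B_k s)^T s) is the symmetric rank-one correction for which the secant condition B_{k+1} s = y holds.

```python
# Symmetric rank-one (SR1) quasi-Newton update, B_{k+1} = B_k + C_k with
#   C_k = (y - B_k s)(y - B_k s)^T / ((y - B_k s)^T s).
# Pure-Python sketch with illustrative data (not the exam's matrices).

def matvec(B, v):
    """Matrix-vector product B v, for B stored as a list of rows."""
    return [sum(B[i][j] * v[j] for j in range(len(v))) for i in range(len(B))]

def sr1_update(B, s, y):
    """Return B + (y - B s)(y - B s)^T / ((y - B s)^T s); keep B if the denominator vanishes."""
    r = [yi - Bsi for yi, Bsi in zip(y, matvec(B, s))]   # residual y - B s
    denom = sum(ri * si for ri, si in zip(r, s))          # (y - B s)^T s
    if abs(denom) < 1e-12:                                # update undefined: skip it
        return B
    n = len(B)
    return [[B[i][j] + r[i] * r[j] / denom for j in range(n)] for i in range(n)]

B = [[2.0, 0.0], [0.0, 1.0]]   # current symmetric approximation B_k
s = [1.0, 1.0]                 # step s_k = x_{k+1} - x_k
y = [3.0, 2.0]                 # gradient change y_k
B1 = sr1_update(B, s, y)
print(matvec(B1, s))           # secant condition B_{k+1} s = y  ->  [3.0, 2.0]
```

The secant condition is the heart of the derivation asked for in Exercise 4: requiring B_{k+1} s = y with a symmetric rank-one C_k = σ u u^T forces u to be proportional to y − B_k s, which pins down C_k up to the scalar fixed by the condition itself.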

   (c) For the second choice of A and b, compute a point that satisfies the first-order necessary optimality conditions for (EQP). (2p)

   (d) Show that the point computed in Exercise 5c is a global minimizer to the corresponding equality-constrained quadratic program. (3p)

   Note: In Exercise 5, you need not solve the linear systems of equations that arise in a systematic way.

Good luck!
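For Exercise 5, the first-order necessary optimality conditions of (EQP) can be written as one KKT linear system (standard form, with x* the candidate point and λ* its multipliers; not the exam's official solution):

```latex
\[
\begin{pmatrix} H & A^{T} \\ A & 0 \end{pmatrix}
\begin{pmatrix} x^{*} \\ -\lambda^{*} \end{pmatrix}
=
\begin{pmatrix} -c \\ b \end{pmatrix}.
\]
```

What separates parts (b) and (d) is the second-order condition: with Z a basis for the null space of A, the stationary point x* is a global minimizer of (EQP) precisely when the reduced Hessian Z^T H Z is positive semidefinite, and fails to be even a local minimizer when Z^T H Z has a negative eigenvalue.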

Name: ................. Personal number: ................. Page number: .................

Figure for Exercise 2a:

Figure for Exercise 2b: