IE 5531 Practice Midterm #2

Prof. John Gunnar Carlsson

November 23, 2010

Problem 1: Nonlinear programming

You are a songwriter who writes Top 40 style songs for the radio. Each song you write can be described by a feature vector x, which encodes information about the song (for example, the number of times the word 'love' is used, the number of guitar riffs, the average duration of the song, and so forth). Each song you write will be a hit with probability

    p(x) = \frac{\exp(c^T x + d)}{1 + \exp(c^T x + d)}

where c and d are known parameters (this function is commonly used in logistic regression, which you may have seen previously; note that p lies between 0 and 1 by construction). In writing songs, you must also obey restrictions imposed by the local radio station, which we can express as the system Ax \le b.

1. Consider the problem of choosing x so as to maximize the probability of making a hit song while obeying the radio station's restrictions. Write an equivalent minimization problem with a convex objective function f(\cdot), using a transformation of the objective function (hint: the function \log(e^t + 1) is convex).

2. Suppose that c = (1, 2) and d = 1. Sketch some level sets of f(\cdot).

3. Suppose that a hit song will generate a profit w^T x + q, where q is a positive constant such that there exists a feasible x satisfying w^T x + q > 0 (a non-hit song generates a profit of 0). Write the problem of maximizing the expected profit due to this song. Is there an equivalent convex optimization problem for this?

Solution: Take the negative logarithm of p(x):

    f(x) := -\log p(x) = -\log \exp(c^T x + d) + \log(1 + \exp(c^T x + d)) = -(c^T x + d) + \log(e^t + 1),

where we set t = c^T x + d. By the hint, f is the sum of an affine function and a convex function and is therefore convex, so an equivalent problem is to minimize f(x) subject to Ax \le b.

For part 2, since f(x) depends on x only through c^T x + d, the level sets of f are exactly the level sets of c^T x + d, which are straight lines of the form x_1 + 2x_2 = \kappa for various constants \kappa.

For part 3, the problem of maximizing the expected profit is

    maximize (w^T x + q) p(x) subject to Ax \le b.

Again, taking the negative logarithm of the objective function, the problem becomes

    minimize -\log\big((w^T x + q) p(x)\big) = -\log(w^T x + q) - \log p(x) = -\log(w^T x + q) - (c^T x + d) + \log(1 + \exp(c^T x + d)),

which is a sum of convex functions (the negative logarithm of a positive affine function is convex where w^T x + q > 0) and therefore yields an equivalent convex optimization problem.
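As a sanity check on part 1, the convexified problem can be handed to a generic solver. The sketch below is an illustration added to these solutions, not part of the original exam; the constraint data A and b are hypothetical stand-ins, since the exam leaves them abstract.

```python
import numpy as np
from scipy.optimize import minimize

# Data c, d from part 2 of the problem; A and b are hypothetical stand-ins
# for the radio station's restrictions Ax <= b.
c = np.array([1.0, 2.0])
d = 1.0
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])  # x1 + x2 <= 4, x1 >= 0, x2 >= 0
b = np.array([4.0, 0.0, 0.0])

def f(x):
    """Convexified objective f(x) = -log p(x) = -(c^T x + d) + log(1 + exp(c^T x + d))."""
    t = c @ x + d
    return -t + np.log1p(np.exp(t))

# SLSQP reads "ineq" constraints as fun(x) >= 0, so b - Ax >= 0 encodes Ax <= b.
res = minimize(f, x0=np.array([0.5, 0.5]),
               constraints=[{"type": "ineq", "fun": lambda x: b - A @ x}])
print(res.x)             # maximizer of the hit probability
print(np.exp(-res.fun))  # the achieved hit probability p(x*)
```

Because f is convex and the constraints are linear, any local solution the solver returns is globally optimal, which is exactly what the transformation buys us.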

Problem 2: Multi-firm alliance revisited

Suppose you are the mayor of Minneapolis and there are three firms in your city: firm 1 (capital), firm 2 (labor), and firm 3 (technology). The three firms are considering possible cooperations. Let x_1, x_2, x_3 denote the input of firms 1, 2, and 3, respectively. The payoff function is

    f(x_1, x_2, x_3) = x_1 + x_3 + (x_1 + x_3)(3x_2 - 2x_2^2).

1. Assume each input variable x_1, x_2, x_3 takes values in [0, 1]; that is, 0 \le x_i \le 1 for i = 1, 2, 3. Derive the KKT conditions for maximizing the social profit function f(x_1, x_2, x_3). Also show that x_1 = 1, x_2 = 0.75, and x_3 = 1 is an optimal solution. (Hint: the KKT conditions are NOT sufficient for this non-convex program; you should look for additional arguments to identify global optimality.)

2. Suppose the social profit will be assigned to each firm proportionally to its input, so each firm i (i = 1 corresponds to firm 1, and so on) will get profit

    \frac{x_i}{x_1 + x_2 + x_3} f(x_1, x_2, x_3), i = 1, 2, 3.

Suppose we do not consider the cost of the input. Show that if x_1 = 1 and x_3 = 1 are fixed, then x_2 = 0.75 is NOT an optimal strategy for firm 2 if firm 2 aims only to maximize its own profit, which is \frac{x_2}{x_2 + 2} f(1, x_2, 1).

3. From part 2, you may have realized that under this mechanism the firms will never cooperate towards the social optimum. So you change the input rules so that each firm either inputs x_i = 1 or inputs x_i = 0. For example, if firm 1 and firm 2 form a sub-alliance, then their total payoff is f(1, 1, 0) = 2. The payoffs of all possible sub-alliances S \subseteq {1, 2, 3} are listed in Table 1.

    S            f(S)
    (empty)       0
    {1}           1
    {2}           0
    {3}           1
    {1, 2}        2
    {2, 3}        2
    {1, 3}        2
    {1, 2, 3}     4

    Table 1: Payoffs of all possible sub-alliances.

Obviously, the grand alliance maximizes the social payoff (= 4). We know the core is the set of payoff allocation vectors (z_1, z_2, z_3) under the grand alliance such that no subgroup can do better by deserting the grand alliance. Write out the expression for the core using the given data. Also show that this core is nonempty.

4. Due to the recent economic recession, your city has run out of budget, so you have to ask the grand alliance to pay a tax of T. However, you still want to maintain the grand alliance; in other words, you want to make sure that the core remains nonempty. Under this condition, find the maximal T you can charge the grand alliance. (The grand alliance has payoff f({1, 2, 3}) - T after tax, but we suppose any proper sub-alliance S of {1, 2, 3} is exempt from tax and still has payoff f(S).)

Solution: First, note that 3x_2 - 2x_2^2 \ge 0 for x_2 \in [0, 1]. Therefore the social objective function is monotonically increasing as a function of x_1 and x_3, so x_1 = 1 and x_3 = 1 at any optimum. With x_1 = x_3 = 1 the objective becomes 2 + 2(3x_2 - 2x_2^2), and 3x_2 - 2x_2^2 is maximized where its derivative 3 - 4x_2 vanishes, namely at x_2 = 0.75, as desired.

Next, suppose that x_1 = x_3 = 1 and consider the profit of firm 2, g(x_2) = \frac{x_2}{x_2 + 2}(2 + 6x_2 - 4x_2^2). The derivative of firm 2's profit at the point x_2 = 0.75 is g'(0.75) = 8.5/(2.75)^2 \approx 1.12 > 0, and therefore 0.75 is not a local maximizer of firm 2's profit. Indeed, if we set x_2 = 1 then firm 2 receives a profit of 4/3 \approx 1.33, as compared with a profit of approximately 1.16 at x_2 = 0.75.

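The derivative claim above is easy to verify numerically. The following sketch (an illustration added to these solutions, using only the problem data) evaluates firm 2's profit and a finite-difference derivative at x_2 = 0.75:

```python
import numpy as np

def f(x1, x2, x3):
    """Social payoff function from Problem 2."""
    return x1 + x3 + (x1 + x3) * (3.0 * x2 - 2.0 * x2 ** 2)

def firm2_profit(x2):
    """Firm 2's proportional share x2/(x2 + 2) * f(1, x2, 1) when x1 = x3 = 1."""
    return x2 / (x2 + 2.0) * f(1.0, x2, 1.0)

h = 1e-6  # central finite-difference step
deriv = (firm2_profit(0.75 + h) - firm2_profit(0.75 - h)) / (2.0 * h)
print(deriv)               # about 1.12 > 0: firm 2 gains by raising x2 past 0.75
print(firm2_profit(0.75))  # about 1.16
print(firm2_profit(1.0))   # 4/3 = 1.33..., so x2 = 1 beats x2 = 0.75 for firm 2
```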
The core must satisfy

    z_1 + z_2 + z_3 = 4
    z_1 \ge 1, z_2 \ge 0, z_3 \ge 1
    z_1 + z_2 \ge 2, z_2 + z_3 \ge 2, z_1 + z_3 \ge 2.

It is nonempty because, for example, the allocation z_1 = 1.5, z_2 = 1, z_3 = 1.5 satisfies every one of these constraints.

Finally, to determine the maximum tax, we can formulate the linear program

    maximize T
    s.t. z_1 + z_2 + z_3 + T = 4
         z_1 \ge 1, z_2 \ge 0, z_3 \ge 1
         z_1 + z_2 \ge 2, z_2 + z_3 \ge 2, z_1 + z_3 \ge 2.

Omitting the redundant constraint z_1 + z_3 \ge 2 (it is implied by z_1 \ge 1 and z_3 \ge 1), substituting T = 4 - (z_1 + z_2 + z_3), and removing the constant 4 from the objective function, this simplifies to

    minimize z_1 + z_2 + z_3
    s.t. z_1 \ge 1, z_3 \ge 1, z_2 \ge 0
         z_1 + z_2 \ge 2, z_2 + z_3 \ge 2.

Note that z_1 + z_2 \ge 2 and z_3 \ge 1 together imply z_1 + z_2 + z_3 \ge 3. Therefore a lower bound on the new objective function is 3, which implies a tax of at most T = 1 unit. This is indeed attainable, by setting z_1 = z_2 = z_3 = 1, and therefore the optimal tax is T = 1 unit.
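The simplified linear program is small enough to check by hand, but it can also be verified with an off-the-shelf LP solver. The sketch below (added for illustration) feeds the core constraints to scipy.optimize.linprog, which expects inequalities in the form A_ub @ z <= b_ub:

```python
import numpy as np
from scipy.optimize import linprog

# minimize z1 + z2 + z3 over the core constraints; each ">=" row
# is negated to fit linprog's A_ub @ z <= b_ub convention.
c = np.ones(3)
A_ub = -np.array([
    [1.0, 0.0, 0.0],  # z1 >= 1
    [0.0, 0.0, 1.0],  # z3 >= 1
    [1.0, 1.0, 0.0],  # z1 + z2 >= 2
    [0.0, 1.0, 1.0],  # z2 + z3 >= 2
    [1.0, 0.0, 1.0],  # z1 + z3 >= 2 (redundant, kept for completeness)
])
b_ub = -np.array([1.0, 1.0, 2.0, 2.0, 2.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, None)] * 3)
print(res.x)        # the allocation z = (1, 1, 1)
print(4 - res.fun)  # maximal tax T = 4 - (z1 + z2 + z3) = 1
```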

Problem 3: Interior point methods

In addition to solving linear programming problems, the barrier function method also works for solving quadratic programming problems. Consider the following quadratic programming problem:

    minimize (x_1 - 3)^2 - x_2
    s.t. 2 - x_1 - x_2 \ge 0
         x_1 \ge 0    (1)

1. Solve this quadratic programming problem using the KKT conditions.

Next, consider the following optimization problem:

    minimize \phi_\mu(x_1, x_2) = (x_1 - 3)^2 - x_2 - \mu(\log(2 - x_1 - x_2) + \log x_1)

2. Find the unconstrained minimal solution of \phi_\mu(x_1, x_2) for any given \mu > 0. Note: since x_1, x_2 are functions of \mu, the optimal solution can be written as (x_1(\mu), x_2(\mu)), where \mu is a parameter.

3. For the above problem, would the minimum solution be a local or a global optimum? Why?

4. Verify that (x_1(\mu), x_2(\mu)) converges to the solution of part 1 as \mu \to 0.

Solution: With multipliers \lambda_1 and \lambda_2 for the two constraints, the KKT conditions are

    2(x_1 - 3) + \lambda_1 - \lambda_2 = 0
    -1 + \lambda_1 = 0
    \lambda_1(2 - x_1 - x_2) = 0
    \lambda_2 x_1 = 0

together with \lambda_1, \lambda_2 \ge 0 and feasibility. Since there are only two constraints, by trial and error (take \lambda_2 = 0; then \lambda_1 = 1 > 0 forces x_1 + x_2 = 2) we find that the optimal solution has x_1 = 5/2 and x_2 = -1/2.

The optimality conditions for the barrier problem are

    2(x_1 - 3) + \frac{\mu}{2 - x_1 - x_2} - \frac{\mu}{x_1} = 0
    -1 + \frac{\mu}{2 - x_1 - x_2} = 0

The second equation gives 2 - x_1 - x_2 = \mu, i.e. x_2(\mu) = 2 - x_1(\mu) - \mu. Substituting this into the first equation yields 2(x_1 - 3) + 1 - \mu/x_1 = 0, that is, 2x_1^2 - 5x_1 - \mu = 0, whose positive root is

    x_1(\mu) = \frac{5 + \sqrt{25 + 8\mu}}{4}.

The minimizer is a global minimizer because the objective function \phi_\mu is convex: (x_1 - 3)^2 and -x_2 are convex, and the negated logarithms of the (affine, hence concave) constraint functions are convex. Finally, as \mu \to 0 we find that x_1(\mu) \to 5/2 and x_2(\mu) \to -1/2, as desired.
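Part 4 can also be checked numerically: the closed-form central path found above is easy to evaluate for a decreasing sequence of barrier parameters. The following sketch (added for illustration) does exactly that:

```python
import numpy as np

def central_path(mu):
    """Closed-form minimizer (x1(mu), x2(mu)) of the barrier function phi_mu."""
    x1 = (5.0 + np.sqrt(25.0 + 8.0 * mu)) / 4.0  # positive root of 2 x1^2 - 5 x1 - mu = 0
    x2 = 2.0 - x1 - mu                           # from -1 + mu/(2 - x1 - x2) = 0
    return x1, x2

for mu in [1.0, 0.1, 0.01, 0.001]:
    print(mu, central_path(mu))  # approaches the KKT solution (2.5, -0.5) as mu -> 0
```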

Problem 4: KKT system

Consider the nonlinear program

    minimize xy
    s.t. (x - 3)^2 + (y - 2)^2 = 1

1. Write the KKT conditions for optimality of this problem.

2. We know Newton's method can be used to find roots of an equation. How can it be applied to find the KKT points here? To that end, apply one iteration of Newton's method starting from the point (x_0, y_0, \lambda_0) = (1, 1, 1).

Solution: The KKT stationarity conditions, plus the feasibility condition, are

    y + 2\lambda(x - 3) = 0
    x + 2\lambda(y - 2) = 0
    (x - 3)^2 + (y - 2)^2 - 1 = 0

The Jacobian matrix of this system is

    J = [ 2\lambda    1           2(x - 3) ]
        [ 1           2\lambda    2(y - 2) ]
        [ 2(x - 3)    2(y - 2)    0        ]

and the iteration scheme is

    x_{k+1} = x_k - J^{-1} F(x_k),

where x = (x, y, \lambda) and

    F(x, y, \lambda) = ( y + 2\lambda(x - 3), x + 2\lambda(y - 2), (x - 3)^2 + (y - 2)^2 - 1 ).

At x_0 = (1, 1, 1) we have F(x_0) = (-3, -1, 4) and

    J = [ 2   1  -4 ]
        [ 1   2  -2 ]
        [-4  -2   0 ].

Solving J \Delta = F(x_0) gives \Delta = (-7/6, 1/3, 1/4), so one iteration yields

    x_1 = x_0 - \Delta = (13/6, 2/3, 3/4) \approx (2.17, 0.67, 0.75).
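The iteration is straightforward to reproduce in code. The sketch below (an illustration added to these solutions) runs a few Newton steps on the KKT system with numpy:

```python
import numpy as np

def F(v):
    """KKT residual for: minimize xy subject to (x - 3)^2 + (y - 2)^2 = 1."""
    x, y, lam = v
    return np.array([y + 2.0 * lam * (x - 3.0),
                     x + 2.0 * lam * (y - 2.0),
                     (x - 3.0) ** 2 + (y - 2.0) ** 2 - 1.0])

def J(v):
    """Jacobian of F with respect to (x, y, lambda)."""
    x, y, lam = v
    return np.array([[2.0 * lam, 1.0, 2.0 * (x - 3.0)],
                     [1.0, 2.0 * lam, 2.0 * (y - 2.0)],
                     [2.0 * (x - 3.0), 2.0 * (y - 2.0), 0.0]])

v = np.array([1.0, 1.0, 1.0])  # starting point (x0, y0, lambda0)
for k in range(4):
    v = v - np.linalg.solve(J(v), F(v))  # Newton step: v <- v - J^{-1} F(v)
    print(k + 1, v)
# The first printed iterate is approximately (2.1667, 0.6667, 0.75).
```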
