Concave programming

1 Introduction

Concave programming is another special case of the general constrained optimization problem
\[
\max_{x \in X} f(x) \quad \text{subject to} \quad g(x) \le 0
\]
in which the objective function $f$ is concave and the constraint functions $g_j$ are convex. For such problems, an alternative derivation of the Kuhn-Tucker conditions is possible, providing yet another perspective on the Lagrangean method.

2 Derivation

Consider the family of optimization problems
\[
\max_{x \in X} f(x) \quad \text{subject to} \quad g(x) \le c
\]
where the parameter $c$ measures the available resources. The value function
\[
v(c) = \max\{\, f(x) : g(x) \le c \,\}
\]
summarizes what can be achieved with different amounts of the resource $c$. The set of attainable outcomes is given by its hypograph
\[
A = \{\, (c, z) \in Y \times \mathbb{R} : z \le v(c) \,\}
\]
Since $v$ is concave (Theorem 3.1), its hypograph $A$ is convex (Exercise 3.125). Let $z^* = v(0)$ and define
\[
B = \{\, (c, z) \in Y \times \mathbb{R} : c \le 0,\ z \ge z^* \,\}
\]
$B$ is convex, and its interior is nonempty and disjoint from $A$.
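To make the value function construction concrete, here is a minimal numerical sketch (added for illustration, not part of the original notes). It evaluates $v(c)$ for the small concave program with objective $f(x) = \ln(1+x_1) + \ln(1+x_2)$ and resource constraint $x_1 + x_2 \le c$, $x \ge 0$, and spot-checks the concavity of $v$ asserted above; the particular objective and the use of scipy's SLSQP solver are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Toy concave program: maximize f(x) = ln(1 + x1) + ln(1 + x2)
# subject to g(x) = x1 + x2 - c <= 0 and x >= 0.
# Its value function v(c) should be concave in the resource level c.

def f(x):
    return np.log(1.0 + x[0]) + np.log(1.0 + x[1])

def value_function(c):
    # maximize f  <=>  minimize -f; scipy's "ineq" constraints mean fun(x) >= 0
    cons = [{"type": "ineq", "fun": lambda x, c=c: c - (x[0] + x[1])}]
    res = minimize(lambda x: -f(x), x0=[0.0, 0.0],
                   bounds=[(0.0, None), (0.0, None)], constraints=cons)
    return -res.fun

cs = np.linspace(0.0, 5.0, 11)
vs = np.array([value_function(c) for c in cs])

# Concavity spot-check: v at the midpoint of two resource levels should be at
# least the average of the endpoint values (up to solver tolerance).
for i in range(len(cs)):
    for j in range(i + 1, len(cs)):
        mid = 0.5 * (cs[i] + cs[j])
        assert value_function(mid) >= 0.5 * (vs[i] + vs[j]) - 1e-6
print("v(c) on a grid:", np.round(vs, 3))
```

For this example the optimum splits the resource equally, so $v(c) = 2\ln(1 + c/2)$, which is indeed concave; its hypograph is the set $A$ used in the argument that follows.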

[Figure: the convex sets $A$ (the hypograph of $v$) and $B$ in the $(c, z)$ plane, with the separating hyperplane through $(0, z^*)$.]

There exists a hyperplane separating $A$ and $B$ (Corollary 3.2.1), the equation of which is $z = z^* + \lambda^T c$, or
\[
z - \lambda^T c = z^* \tag{1}
\]
with
\[
z^* = z^* - \lambda^T 0 \ge z - \lambda^T c \quad \text{for every } (c, z) \in A \tag{2}
\]
That is, $(0, z^*)$ maximizes the linear functional $z - \lambda^T c$ on $A$.

By definition, for every $(c, z) \in A$ there exists some $x \in X$ such that $f(x) = z$ and $g(x) \le c$. Let $x^*$ be such that $g(x^*) \le 0$ and $f(x^*) = z^*$. Substituting in (2),
\[
f(x^*) \ge f(x) - \lambda^T g(x) \quad \text{for every } x \in X
\]
The right-hand side of this inequality is precisely the Lagrangean of the constrained optimization problem. At the optimal solution $(0, z^*)$, the Lagrangean is maximized over $X$. In particular, $x^* \in X$, and therefore
\[
f(x^*) \ge f(x^*) - \lambda^T g(x^*)
\]
which implies that $\lambda^T g(x^*) \ge 0$. But $\lambda \ge 0$ (Exercise 1) and $g(x^*) \le 0$, and therefore we must have
\[
\lambda_j g_j(x^*) = 0 \quad \text{for every } j = 1, 2, \dots, m
\]

2.1 Concave programming theorem

Theorem 5.6  Suppose that $f$ is concave, the $g_j$ are convex, and there exists an $\hat{x} \in X$ for which $g(\hat{x}) < 0$. Then $x^*$ is a global solution of
\[
\max_{x \in X} f(x) \quad \text{subject to} \quad g(x) \le 0
\]
if and only if there exist multipliers $\lambda = (\lambda_1, \lambda_2, \dots, \lambda_m) \ge 0$ such that the Lagrangean $L(x, \lambda) = f(x) - \sum_j \lambda_j g_j(x)$ is maximized on $X$ at $x^*$ and $\lambda_j g_j(x^*) = 0$ for every $j$.
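As a quick numerical check on Theorem 5.6 (an added illustration, not from the original text), the sketch below solves a small concave program directly, recovers a multiplier $\lambda \ge 0$ from the stationarity condition, and verifies that $x^*$ maximizes the Lagrangean over the whole space while $\lambda^T g(x^*) = 0$. The quadratic objective, the single linear constraint, and the scipy-based solver are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Concave program: max f(x) = -(x1-2)^2 - (x2-2)^2  s.t.  g(x) = x1 + x2 - 2 <= 0.
# Theorem 5.6: x* is optimal iff some lambda >= 0 makes x* maximize the
# Lagrangean L(x, lambda) = f(x) - lambda * g(x) over all of X, with
# complementary slackness lambda * g(x*) = 0.

f = lambda x: -(x[0] - 2.0) ** 2 - (x[1] - 2.0) ** 2
g = lambda x: x[0] + x[1] - 2.0

# Solve the constrained problem directly (maximize f <=> minimize -f).
res = minimize(lambda x: -f(x), x0=[0.0, 0.0],
               constraints=[{"type": "ineq", "fun": lambda x: -g(x)}])
x_star = res.x                       # approximately (1, 1)

# Multiplier from stationarity grad f = lambda * grad g, where grad g = (1, 1).
lam = 2.0 * (2.0 - x_star[0])        # approximately 2

# (i) complementary slackness, (ii) x* maximizes the Lagrangean over all of R^2.
assert lam >= 0.0 and abs(lam * g(x_star)) < 1e-5
unconstrained = minimize(lambda x: -(f(x) - lam * g(x)), x0=[0.0, 0.0])
assert np.allclose(unconstrained.x, x_star, atol=1e-4)
print("x* =", np.round(x_star, 4), " lambda =", round(lam, 4))
```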

3 Net benefit approach

An economic approach to constrained maximization is to assign a price $\lambda_j$ to each constraint and maximize the net benefit
\[
L(x, \lambda) = f(x) - \sum_j \lambda_j g_j(x)
\]
where $g_j(x)$ measures the resources used and $f(x)$ the total benefit attained. If $f$ is concave and $g$ is convex, the Lagrangean is concave, and the Kuhn-Tucker conditions identify those policies which maximize the net benefit. This is the fundamental economic insight of the Lagrangean method.

Example  An electricity company operates $n$ generating plants. At each plant $i$, it costs $c_i(x_i)$ to produce $x_i$ units of electricity. If the company aims to meet electricity demand $D$ at minimum cost, its optimization problem is
\[
\min_{x} \sum_i c_i(x_i) \quad \text{subject to} \quad \sum_i x_i = D
\]
The Lagrangean is
\[
L(x, \lambda) = \sum_i c_i(x_i) - \lambda \Bigl( \sum_i x_i - D \Bigr) = \sum_i c_i(x_i) + \lambda \Bigl( D - \sum_i x_i \Bigr) \tag{3}
\]
Suppose that the company has the option of purchasing electricity from outside suppliers at a price $\lambda$. Then the Lagrangean (3) represents the sum of the cost of producing electricity at its own plants and the cost of purchasing electricity from outside. For any given price $\lambda$, the company will choose an optimal mix of its own production and outside purchases so as to minimize total cost. The first-order conditions for a minimum of (3) are
\[
D_{x_i} L = D_{x_i} c_i - \lambda = 0 \tag{4}
\]
Optimality requires that the company utilize each plant up to the level at which its marginal cost equals the alternative price $\lambda$. As the price increases, the proportion of demand which it satisfies from its own resources will increase. At some price $\lambda$, the company will be induced to fill the total demand $D$ from its own production. This is the shadow price, which arises from the solution of (4) together with the constraint $\sum_i x_i = D$, and is the marginal cost of producing the total demand $D$ from the company's own plants.
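A small numerical version of this example may help (added here; the quadratic plant costs and the demand figure are assumed data, not from the notes). With $c_i(x_i) = a_i x_i + \tfrac{1}{2} b_i x_i^2$, condition (4) equates each plant's marginal cost $a_i + b_i x_i$ to $\lambda$, and the demand constraint then pins down the shadow price in closed form; nonnegativity of the $x_i$ is ignored, so an interior solution is assumed.

```python
import numpy as np

# Dispatch sketch for the electricity example, with quadratic plant costs
# c_i(x_i) = a_i*x_i + 0.5*b_i*x_i**2, so marginal cost is a_i + b_i*x_i.
# The first-order condition (4) sets each marginal cost equal to the shadow
# price lambda; the constraint sum(x_i) = D then determines lambda.

a = np.array([1.0, 2.0, 4.0])   # marginal-cost intercepts (assumed data)
b = np.array([0.5, 0.4, 0.3])   # marginal-cost slopes (assumed data)
D = 30.0                        # total demand (assumed)

# x_i(lambda) = (lambda - a_i) / b_i, so sum_i x_i = D is linear in lambda.
lam = (D + np.sum(a / b)) / np.sum(1.0 / b)
x = (lam - a) / b

print("shadow price lambda =", round(lam, 3))
print("dispatch x =", np.round(x, 3), " total =", round(x.sum(), 3))
print("marginal costs =", np.round(a + b * x, 3))   # all equal to lambda
```

The same price also decentralizes the decision, as explored in Exercise 3 of the Homework below: a plant manager maximizing the plant's own profit $\lambda x_i - c_i(x_i)$ at this price chooses exactly the output prescribed by (4).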

4 Summary of necessary and sufficient conditions

The following table collects the first-order conditions for the classes of problems considered so far: the first column gives the necessary conditions, the second the corresponding conditions when the choice variables are restricted to be nonnegative ($x \ge 0$), and the third the assumptions under which the conditions are also sufficient.

\[
\begin{array}{llll}
 & \text{Necessary} & \text{Necessary } (x \ge 0) & \text{Sufficient} \\
\text{Unconstrained} & \nabla f = 0 & \nabla f \le 0,\ \nabla f^T x^* = 0 & f \text{ pseudoconcave} \\
\text{Equality constraints } g(x) = 0 & \nabla L = 0 & \nabla L \le 0,\ \nabla L^T x^* = 0 & f \text{ pseudoconcave, } g \text{ quasiconvex} \\
\text{Inequality constraints } g(x) \le 0 & \nabla L = 0,\ \lambda \ge 0,\ \lambda^T g(x^*) = 0 & \nabla L \le 0,\ \nabla L^T x^* = 0,\ \lambda \ge 0,\ \lambda^T g(x^*) = 0 & f \text{ pseudoconcave, } g \text{ quasiconvex} \\
\text{Concave programming} & x^* \text{ maximizes } L & x^* \text{ maximizes } L & f \text{ concave, } g \text{ convex}
\end{array}
\]

5 Homework

1. In general, the equation of the separating hyperplane in (1) is
\[
\alpha z - \lambda^T c = \bar{c}
\]
for some constant $\bar{c} \in \mathbb{R}$. Show that $\alpha \ge 0$ and $\lambda \ge 0$. [Hint: consider the point $(0, z^* + 1) \in B$.]

2. The constraint function $g$ satisfies the Slater constraint qualification condition if there exists $\hat{x} \in X$ with $g(\hat{x}) < 0$. Show that this implies that $\alpha > 0$. Therefore, without loss of generality, we can set $\alpha = 1$, as in (1) (see Remark 5.8).

3. Show how the shadow price can be used to decentralize the running of the power company, leaving the production level at each plant to be determined locally.

4. (Pareto optimality) Suppose that there are $n$ agents, each of whom has preferences over a set $X$. The preferences of agent $i$ can be represented by the concave function $u_i : X \to \mathbb{R}$. A choice $x^* \in X$ is Pareto optimal if it is impossible to make any agent better off without making another agent worse off. Show that $x^*$ is Pareto optimal if and only if it maximizes a weighted average of the individual utility functions, that is,
\[
x^* \in \arg\max_{x \in X} \sum_i \alpha_i u_i(x)
\]
for some weights $\alpha_1, \alpha_2, \dots, \alpha_n$.

5. (Peak-load pricing) Suppose a public utility supplies a service, the demand for which varies with the time of day. For simplicity, assume that demand in each period is independent of the price in other periods. The inverse demand function for each period is $p_i(y_i)$. Assume that the marginal production costs $c_i$ are constant, independent of capacity and independent across periods. Further assume that the marginal cost of capacity $c_0$ is constant. With these assumptions, the total cost function is
\[
c(y, Y) = \sum_i c_i y_i + c_0 Y
\]
The objective is to determine outputs $y_i$ (and hence prices $p_i$) and production capacity $Y$ to maximize social welfare, as measured by total consumer and producer surplus. In any period $i$, total surplus is measured by the area between the demand and cost curves, that is,
\[
S_i(y, Y) = \int_0^{y_i} \bigl( p_i(\tau) - c_i \bigr)\, d\tau
\]
and so total surplus is
\[
S(y, Y) = \sum_i \int_0^{y_i} \bigl( p_i(\tau) - c_i \bigr)\, d\tau - c_0 Y \tag{5}
\]
The optimization problem is to choose nonnegative $y_i$ and $Y$ so as to maximize (5) subject to the constraints
\[
y_i \le Y, \qquad i = 1, 2, \dots, n
\]
Show that it is optimal to price at marginal cost during off-peak periods and to extract a premium during peak periods, where the total premium is equal to the marginal cost of capacity $c_0$. Furthermore, show that under this pricing rule the enterprise will break even. Note that
\[
D_{y_i} S_i(y, Y) = D_{y_i} \int_0^{y_i} \bigl( p_i(\tau) - c_i \bigr)\, d\tau = p_i(y_i) - c_i
\]
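Before turning to the solutions, here is a small numerical illustration of Exercises 1 and 2 (added for this write-up). It uses the toy problem from the first sketch, whose value function is $v(c) = 2\ln(1 + c/2)$; with the normalization $\alpha = 1$, the separating hyperplane through $(0, z^*)$ has slope $\lambda = v'(0) = 1 \ge 0$, and the separation inequalities over $A$ and $B$ can be checked on a grid. The grid-based check and the specific problem are assumptions of the illustration.

```python
import numpy as np

# For the toy problem above, v(c) = 2*ln(1 + c/2) (derived analytically), so the
# supporting hyperplane of the hypograph A at (0, z*) has alpha = 1 and slope
# lambda = v'(0) = 1. Points of A satisfy z - lambda*c <= z*, points of B the
# reverse inequality, which is exactly the separation used in the derivation.

v = lambda c: 2.0 * np.log(1.0 + c / 2.0)
lam = 1.0                      # v'(0) >= 0, as Exercise 1 requires
z_star = v(0.0)

# A-side: v(c) - lambda*c <= z* for every feasible resource level c >= 0.
cs = np.linspace(0.0, 20.0, 2001)
assert np.all(v(cs) - lam * cs <= z_star + 1e-12)

# B-side: z - lambda*c >= z* whenever c <= 0 and z >= z*.
assert all(z - lam * c >= z_star
           for c in [-2.0, -1.0, -0.5, 0.0]
           for z in [z_star, z_star + 1.0, z_star + 5.0])
print("separation verified with alpha = 1 > 0 and lambda =", lam)
```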

Solutions

1. The point $(0, z^* + 1)$ belongs to $B$, and therefore
\[
\alpha(z^* + 1) - \lambda^T 0 \ge \bar{c} = \alpha z^* - \lambda^T 0
\]
which implies that $\alpha \ge 0$. Similarly, let $\{e_1, e_2, \dots, e_m\}$ denote the standard basis for $\mathbb{R}^m$ (Example 1.79). For any $j = 1, 2, \dots, m$, the point $(0 - e_j, z^*)$ (which corresponds to decreasing resource $j$ by one unit) belongs to $B$, and therefore, again by the separation inequality,
\[
\alpha z^* - \lambda^T(0 - e_j) = \alpha z^* + \lambda_j \ge \alpha z^* - \lambda^T 0 = \bar{c}
\]
which implies that $\lambda_j \ge 0$.

2. Let $\hat{c} = g(\hat{x}) < 0$ and $\hat{z} = f(\hat{x})$. Suppose $\alpha = 0$. Then at least one component of $\lambda$ must be nonzero, since the normal $(\alpha, \lambda)$ of the hyperplane cannot vanish; that is, $\lambda \ge 0$ with $\lambda \ne 0$, and therefore, since $\hat{c} < 0$,
\[
\lambda^T \hat{c} < 0 \tag{1}
\]
But $(\hat{c}, \hat{z}) \in A$, and the separation inequality (2), written with the general normal $(\alpha, \lambda)$, implies
\[
\alpha \hat{z} - \lambda^T \hat{c} \le \alpha z^* - \lambda^T 0
\]
so $\alpha = 0$ implies $\lambda^T \hat{c} \ge 0$, contradicting (1). Therefore, we conclude that $\alpha > 0$.

3. The Lagrangean (3),
\[
L(x, \lambda) = \sum_i c_i(x_i) + \lambda \Bigl( D - \sum_i x_i \Bigr)
\]
can be rewritten as
\[
L(x, \lambda) = \lambda D - \sum_i \bigl( \lambda x_i - c_i(x_i) \bigr)
\]
The $i$th term in the sum is the net profit of plant $i$ if its output is valued at the price $\lambda$. Therefore, if the company undertakes to buy electricity from its plants at the price $\lambda$ and instructs each plant manager to produce so as to maximize the plant's net profit, each manager will be induced to choose an output level which maximizes the profit of the company as a whole. This is the case whether the price $\lambda$ is the market price at which the company can buy electricity from external suppliers or the shadow price determined by the need to satisfy the total demand $D$. In this way, the shadow price $\lambda$ can be used to decentralize the production decision.

4. $x^*$ is Pareto optimal if and only if it solves the problem
\[
\max_{x \in X} u_1(x) \quad \text{subject to} \quad u_i(x) \ge u_i(x^*), \quad i = 2, 3, \dots, n
\]
which can be written in standard form as
\[
\max_{x \in X} u_1(x) \quad \text{subject to} \quad g_i(x) = u_i(x^*) - u_i(x) \le 0, \quad i = 2, 3, \dots, n
\]
Since $u_i$ is concave, $g_i$ is convex for every $i$. Assuming the constraint qualification condition is satisfied (see below), the problem satisfies the conditions of the concave programming theorem (Theorem 5.6). Therefore, $x^*$ is Pareto optimal if and only if there exist multipliers $\alpha_2, \alpha_3, \dots, \alpha_n \ge 0$ such that the Lagrangean $L(x, \alpha) = u_1(x) - \sum_{i=2}^n \alpha_i g_i(x)$ is maximized on $X$ at $x^*$, that is,
\[
L(x^*, \alpha) = u_1(x^*) - \sum_{i=2}^n \alpha_i \bigl( u_i(x^*) - u_i(x^*) \bigr) \ge u_1(x) - \sum_{i=2}^n \alpha_i \bigl( u_i(x^*) - u_i(x) \bigr) = L(x, \alpha) \quad \text{for every } x \in X
\]
which implies that
\[
u_1(x^*) + \sum_{i=2}^n \alpha_i u_i(x^*) \ge u_1(x) + \sum_{i=2}^n \alpha_i u_i(x) \quad \text{for every } x \in X
\]
Letting $\alpha_1 = 1$ gives the desired result.

The constraint qualification condition requires that there exist $\hat{x} \in X$ such that $g_i(\hat{x}) < 0$, or $u_i(\hat{x}) > u_i(x^*)$, for every $i = 2, 3, \dots, n$. Suppose this is satisfied for only a subset of the agents. Without loss of generality, suppose that there exists $\hat{x} \in X$ such that $g_i(\hat{x}) < 0$, or $u_i(\hat{x}) > u_i(x^*)$, for every $i = 2, 3, \dots, m$ with $m < n$. If $x^*$ is Pareto optimal with respect to agents $1, 2, \dots, m$, it solves the problem
\[
\max_{x \in X} u_1(x) \quad \text{subject to} \quad u_i(x) \ge u_i(x^*), \quad i = 2, 3, \dots, m
\]
which can be written in standard form as
\[
\max_{x \in X} u_1(x) \quad \text{subject to} \quad g_i(x) = u_i(x^*) - u_i(x) \le 0, \quad i = 2, 3, \dots, m
\]
This problem satisfies the conditions of the concave programming theorem (Theorem 5.6), including the constraint qualification condition. Therefore, there exist multipliers $\alpha_2, \alpha_3, \dots, \alpha_m \ge 0$ such that the Lagrangean $L(x, \alpha) = u_1(x) - \sum_{i=2}^m \alpha_i g_i(x)$ is maximized on $X$ at $x^*$, that is,
\[
L(x^*, \alpha) = u_1(x^*) - \sum_{i=2}^m \alpha_i \bigl( u_i(x^*) - u_i(x^*) \bigr) \ge u_1(x) - \sum_{i=2}^m \alpha_i \bigl( u_i(x^*) - u_i(x) \bigr) = L(x, \alpha) \quad \text{for every } x \in X
\]
which implies that
\[
u_1(x^*) + \sum_{i=2}^m \alpha_i u_i(x^*) \ge u_1(x) + \sum_{i=2}^m \alpha_i u_i(x) \quad \text{for every } x \in X
\]
Letting $\alpha_1 = 1$ and $\alpha_i = 0$ for every $i > m$ gives the desired result, namely
\[
x^* \in \arg\max_{x \in X} \sum_i \alpha_i u_i(x) \tag{4}
\]

Conversely, suppose $x^*$ satisfies (4) for some weights $\alpha_1, \alpha_2, \dots, \alpha_n$. Assume to the contrary that $x^*$ is not Pareto optimal. That is, there exists $x \in X$ such that
\[
u_i(x) \ge u_i(x^*) \quad \text{for every } i = 1, 2, \dots, n
\]
with
\[
u_i(x) > u_i(x^*) \quad \text{for at least one } i
\]
Adding, this implies that
\[
\sum_i \alpha_i u_i(x) > \sum_i \alpha_i u_i(x^*)
\]
contradicting (4).
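The characterization in Solution 4 can be visualized with a tiny example (added here, not part of the original solutions): a one-dimensional choice $x \in [0, 1]$ and two agents with concave quadratic utilities. Maximizers of weighted sums $\alpha_1 u_1 + \alpha_2 u_2$ are checked for Pareto optimality by brute-force search over a grid; the utilities and the grid are assumptions of the illustration.

```python
import numpy as np

# Two agents choosing a common x in [0, 1]; both utilities are concave, so the
# weighted-sum characterization of Pareto optimality from Solution 4 applies.

u = [lambda x: -(x - 0.2) ** 2,   # agent 1 prefers x near 0.2
     lambda x: -(x - 0.8) ** 2]   # agent 2 prefers x near 0.8

grid = np.linspace(0.0, 1.0, 1001)

def pareto_optimal(x_star, tol=1e-9):
    """No grid point makes some agent strictly better off without hurting another."""
    for x in grid:
        if all(ui(x) >= ui(x_star) - tol for ui in u) and \
           any(ui(x) > ui(x_star) + tol for ui in u):
            return False
    return True

# Every maximizer of a weighted sum should come out Pareto optimal.
for alpha in [(1.0, 0.0), (0.7, 0.3), (0.5, 0.5), (0.2, 0.8), (0.0, 1.0)]:
    w = alpha[0] * u[0](grid) + alpha[1] * u[1](grid)
    x_star = grid[np.argmax(w)]
    print(f"weights {alpha}: maximizer x* = {x_star:.3f}, "
          f"Pareto optimal: {pareto_optimal(x_star)}")
```

Varying the weights traces out the Pareto set, here the interval $[0.2, 0.8]$ between the two agents' bliss points.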

5. The utility's optimization problem is
\[
\max_{y, Y \ge 0} S(y, Y) = \sum_i \int_0^{y_i} \bigl( p_i(\tau) - c_i \bigr)\, d\tau - c_0 Y \quad \text{subject to} \quad g_i(y, Y) = y_i - Y \le 0, \quad i = 1, 2, \dots, n
\]
The demand independence assumption ensures that the objective function $S$ is concave, since its Hessian
\[
H_S = \begin{pmatrix}
Dp_1 & 0 & \cdots & 0 & 0 \\
0 & Dp_2 & \cdots & 0 & 0 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & \cdots & Dp_n & 0 \\
0 & 0 & \cdots & 0 & 0
\end{pmatrix}
\]
is nonpositive definite (Exercise 3.96), each $Dp_i \le 0$ since demand curves slope downward. The constraints are linear and hence convex. Moreover, there exists a point $(0, 1)$ such that for every $i = 1, 2, \dots, n$
\[
g_i(0, 1) = 0 - 1 < 0
\]
Therefore the problem satisfies the conditions of Theorem 5.6. The optimal solution $(y^*, Y^*)$ satisfies the Kuhn-Tucker conditions; that is, there exist multipliers $\lambda_1, \lambda_2, \dots, \lambda_n$ such that for every period $i = 1, 2, \dots, n$
\[
D_{y_i} L = p_i(y_i) - c_i - \lambda_i \le 0, \qquad y_i \ge 0, \qquad y_i \bigl( p_i(y_i) - c_i - \lambda_i \bigr) = 0 \tag{5}
\]
\[
y_i \le Y, \qquad \lambda_i \ge 0, \qquad \lambda_i (Y - y_i) = 0
\]
and that capacity be chosen such that
\[
D_Y L = -c_0 + \sum_i \lambda_i \le 0, \qquad Y \ge 0, \qquad Y \Bigl( \sum_i \lambda_i - c_0 \Bigr) = 0 \tag{6}
\]
where $L$ is the Lagrangean
\[
L(y, Y, \lambda) = \sum_i \int_0^{y_i} \bigl( p_i(\tau) - c_i \bigr)\, d\tau - c_0 Y - \sum_i \lambda_i (y_i - Y)
\]
In off-peak periods ($y_i < Y$), complementary slackness requires that $\lambda_i = 0$, and therefore from (5),
\[
p_i(y_i) = c_i
\]
assuming $y_i > 0$. In peak periods ($y_i = Y$),
\[
p_i(y_i) = c_i + \lambda_i
\]

We conclude that it is optimal to price at marginal cost in off-peak periods and to charge a premium during peak periods. Furthermore, (6) implies (assuming $Y > 0$) that the total premium is equal to the marginal capacity cost:
\[
\sum_i \lambda_i = c_0
\]
Furthermore, note that
\[
\sum_i \lambda_i y_i = \sum_{\text{peak}} \lambda_i y_i + \sum_{\text{off-peak}} \lambda_i y_i = \sum_{y_i = Y} \lambda_i y_i + 0 = \sum_{y_i = Y} \lambda_i Y = \sum_i \lambda_i Y = c_0 Y
\]
since $\lambda_i = 0$ in off-peak periods. Therefore, the utility's total revenue is
\[
R(y, Y) = \sum_i p_i(y_i) y_i = \sum_i (c_i + \lambda_i) y_i = \sum_i c_i y_i + \sum_i \lambda_i y_i = \sum_i c_i y_i + c_0 Y = c(y, Y)
\]
Under the optimal pricing policy, revenue equals cost and the utility breaks even.
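The peak-load pricing conclusions can be confirmed numerically for a concrete specification (an added illustration assuming linear inverse demands $p_i(y) = A_i - B_i y$; the numbers are not from the notes). The sketch maximizes (5) subject to $y_i \le Y$ with scipy and then checks that the off-peak price equals marginal cost, that the peak premia sum to $c_0$, and that revenue equals cost.

```python
import numpy as np
from scipy.optimize import minimize

# Peak-load pricing check with linear inverse demands p_i(y) = A_i - B_i*y.
A = np.array([10.0, 20.0])      # period 1 = off-peak, period 2 = peak (assumed)
B = np.array([1.0, 1.0])
c = np.array([2.0, 2.0])        # constant marginal production costs c_i
c0 = 3.0                        # marginal cost of capacity

def neg_surplus(z):
    y, Y = z[:2], z[2]
    # S(y, Y) = sum_i integral_0^{y_i} (p_i(t) - c_i) dt - c0*Y, integrated in closed form
    return -(np.sum(A * y - 0.5 * B * y ** 2 - c * y) - c0 * Y)

cons = [{"type": "ineq", "fun": lambda z, i=i: z[2] - z[i]} for i in range(2)]  # y_i <= Y
res = minimize(neg_surplus, x0=[1.0, 1.0, 1.0],
               bounds=[(0.0, None)] * 3, constraints=cons)
y, Y = res.x[:2], res.x[2]
prices = A - B * y

print("outputs y =", np.round(y, 3), " capacity Y =", round(Y, 3))
print("prices    =", np.round(prices, 3), " marginal costs =", c)
print("peak premia sum to c0:", round(np.sum((prices - c)[y > Y - 1e-4]), 3))
revenue, cost = np.sum(prices * y), np.sum(c * y) + c0 * Y
print("revenue =", round(revenue, 3), " cost =", round(cost, 3))   # break even
```

Analytically, the optimum for these numbers is $y = (8, 15)$ with $Y = 15$ and prices $(2, 5)$: the off-peak price equals marginal cost, the single peak premium of 3 equals $c_0$, and revenue and cost both equal 91, so the utility breaks even.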