
Microeconomics I © Leopold Sögner Department of Economics and Finance Institute for Advanced Studies Stumpergasse 56 1060 Wien Tel: +43-1-59991 182 soegner@ihs.ac.at http://www.ihs.ac.at/~soegner September, 2012

Motivation (1) Motivation: Sufficient conditions for a maximum in the Kuhn-Tucker problem: Suppose that there are no nonlinear equality constraints and that each inequality constraint is given by a quasiconvex function. Suppose that the objective function satisfies ∇f(x)·(x′ − x) > 0 for any x and x′ with f(x′) > f(x). If x* satisfies the Kuhn-Tucker conditions, then x* is a global maximizer (see Mas-Colell, Theorem M.K.3). Jehle/Reny: Chapter A1.4. Mas-Colell, Chapter M.C

Concave Functions (1) Consider a convex subset A of Rⁿ. Definition - Concave Function: A function f : A → R is concave if f(νx′ + (1 − ν)x) ≥ νf(x′) + (1 − ν)f(x) for all ν ∈ [0, 1]. If strict inequality > holds for ν ∈ (0, 1) and x′ ≠ x, then f is strictly concave. This last inequality can be rewritten with z = x′ − x and α = ν: f(x + αz) ≥ αf(x′) + (1 − α)f(x). If f is (strictly) concave then −f is (strictly) convex.
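The defining inequality can be checked numerically on a finite grid. A minimal Python sketch, with f(x) = −x² (concave) and f(x) = x³ (not concave) chosen here for illustration:

```python
# Minimal sketch: check the concavity inequality
#   f(nu*x' + (1-nu)*x) >= nu*f(x') + (1-nu)*f(x)
# on a finite grid; functions, grid and tolerance are illustrative choices.
def is_concave_on_samples(f, xs, nus, tol=1e-12):
    return all(
        f(nu * x1 + (1 - nu) * x2) >= nu * f(x1) + (1 - nu) * f(x2) - tol
        for x1 in xs for x2 in xs for nu in nus
    )

xs = [i / 10 for i in range(-20, 21)]   # sample points in [-2, 2]
nus = [i / 10 for i in range(11)]       # sample weights in [0, 1]
print(is_concave_on_samples(lambda x: -x * x, xs, nus))  # True: -x^2 is concave
print(is_concave_on_samples(lambda x: x ** 3, xs, nus))  # False: x^3 is not
```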

Concave Functions (2) Theorem - Tangents and Concave Functions: If f is continuously differentiable and concave, then f(x′) ≤ f(x) + ∇f(x)·(x′ − x) (and vice versa); strict < holds for all x′ ≠ x if f is strictly concave. [Theorem M.C.1] For the univariate case this implies that the tangent line lies above the function graph of f(x); strictly above for x′ ≠ x with strictly concave functions.
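The tangent bound can be illustrated numerically as well. A small sketch, assuming the concave example f(x) = log(1 + x) for x > −1, with derivative 1/(1 + x):

```python
import math

# Sketch: for the concave f(x) = log(1 + x), x > -1, verify the tangent bound
#   f(x') <= f(x) + f'(x) * (x' - x)
# on a few sample points (function chosen for illustration).
f = lambda x: math.log(1 + x)
df = lambda x: 1 / (1 + x)      # derivative of log(1 + x)

pts = [0.0, 0.5, 1.0, 2.0, 5.0]
tangent_above = all(
    f(xp) <= f(x) + df(x) * (xp - x) + 1e-12
    for x in pts for xp in pts
)
print(tangent_above)  # True
```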

Concave Functions (3) Proof (⇒): For α ∈ (0, 1] the definition of a concave function implies: f(x′) = f(x + z) ≤ f(x) + [f(x + αz) − f(x)]/α. If f is differentiable, the limit of the last term as α → 0 exists, such that f(x + z) ≤ f(x) + ∇f(x)·z.

Concave Functions (4) Proof (⇐): Suppose that f(x + z) ≤ f(x) + ∇f(x)·z held for a non-concave function. Since f(.) is not concave, f(x + z) − f(x) > [f(x + αz) − f(x)]/α for some x, z and α ∈ (0, 1]. Taking the limit results in f(x + z) − f(x) > ∇f(x)·z, i.e. we arrive at a contradiction.

Concave Functions (5) Theorem - Hessian and Concave Functions: If f is twice continuously differentiable, then f is concave if and only if the Hessian matrix D²f(x) is negative semidefinite for all x ∈ A. If D²f(x) is negative definite for all x ∈ A, then f is strictly concave (the converse need not hold, e.g. f(x) = −x⁴ is strictly concave although D²f(0) = 0). [Theorem M.C.2]
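In the 2×2 case negative semidefiniteness can be checked from the sign conditions on the diagonal entries and the determinant. A sketch, with f(x, y) = −(x² + xy + y²) chosen for illustration; its constant Hessian is [[−2, −1], [−1, −2]]:

```python
# Sketch: a symmetric 2x2 matrix [[a, b], [b, c]] is negative semidefinite
# iff a <= 0, c <= 0 and a*c - b*b >= 0.  Illustrative example:
# f(x, y) = -(x^2 + x*y + y^2) has constant Hessian [[-2, -1], [-1, -2]].
def is_nsd_2x2(a, b, c):
    return a <= 0 and c <= 0 and a * c - b * b >= 0

print(is_nsd_2x2(-2, -1, -2))  # True: f is concave
print(is_nsd_2x2(2, 1, -1))    # False: indefinite matrix
```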

Concave Functions (6) Proof (⇒): A Taylor expansion of f(x′) around the point α = 0 results in f(x + αz) = f(x) + ∇f(x)·(αz) + (α²/2)·z⊤D²f(x + β(α)z)z, with β(α) ∈ [0, α]. By the former theorem we know that f(x + αz) − f(x) − ∇f(x)·(αz) ≤ 0 for concave functions, so z⊤D²f(x + β(α)z)z ≤ 0. For arbitrarily small α we get z⊤D²f(x)z ≤ 0.

Concave Functions (7) Proof (⇐): If the right-hand side of f(x + αz) − f(x) − ∇f(x)·(αz) = (α²/2)·z⊤D²f(x + β(α)z)z is ≤ 0, then so is the left-hand side. By the former theorem f is concave.

Quasiconcave Functions (1) Definition - Quasiconcave Function: A function f : A → R is quasiconcave if f(νx′ + (1 − ν)x) ≥ min{f(x′), f(x)} for all ν ∈ [0, 1]. If > holds for ν ∈ (0, 1) and x′ ≠ x, it is said to be strictly quasiconcave. Quasiconvexity is defined by f(νx′ + (1 − ν)x) ≤ max{f(x′), f(x)}. If f is quasiconcave then −f is quasiconvex. If f is concave then f is quasiconcave, but not vice versa. E.g. f(x) = √x for x > 0 is concave and also quasiconcave; f(x) = x³ is quasiconcave but not concave.
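The example f(x) = x³ can be confirmed numerically against the defining inequality. A minimal sketch over an illustrative sample grid:

```python
# Sketch: confirm numerically that f(x) = x^3 satisfies the quasiconcavity
# inequality f(nu*x1 + (1-nu)*x2) >= min{f(x1), f(x2)} on a sample grid,
# even though x^3 is not concave.
def is_quasiconcave_on_samples(f, xs, nus, tol=1e-12):
    return all(
        f(nu * x1 + (1 - nu) * x2) >= min(f(x1), f(x2)) - tol
        for x1 in xs for x2 in xs for nu in nus
    )

xs = [i / 4 for i in range(-8, 9)]   # sample points in [-2, 2]
nus = [i / 10 for i in range(11)]
print(is_quasiconcave_on_samples(lambda x: x ** 3, xs, nus))  # True
```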

Quasiconcave Functions (2) Transformation property: A positive monotone transformation of a quasiconcave function results in a quasiconcave function. Definition - Superior Set: S(x) := {x′ ∈ A | f(x′) ≥ f(x)} is called the superior set of x (upper contour set of x). Note that with x_ν = νx1 + (1 − ν)x2 and t = f(x): if f(x_ν) ≥ min{f(x1), f(x2)}, then f(x1) ≥ t and f(x2) ≥ t imply f(x_ν) ≥ t.

Quasiconcave Functions (3) Theorem - Quasiconcave Functions and Convex Sets: The function f is quasiconcave if and only if S(x) is convex for all x ∈ A.

Quasiconcave Functions (4) Sufficient condition: If f is quasiconcave then S(x) is convex. Consider x1 and x2 in S(x) and set t = f(x). We need to show that x_ν = νx1 + (1 − ν)x2 ∈ S(x). Since f(x1) ≥ t and f(x2) ≥ t, quasiconcavity of f implies f(x_ν) ≥ min{f(x1), f(x2)} ≥ t. Therefore x_ν ∈ S(x); i.e. the set S(x) is convex.

Quasiconcave Functions (5) Necessary condition: If S(x) is convex for all x ∈ A, then f has to be quasiconcave. W.l.o.g. assume that f(x1) ≥ f(x2), with x1 and x2 in A. By assumption S(x) is convex for every x, so in particular S(x2) is convex. Since f(x1) ≥ f(x2), we get x1 ∈ S(x2) and hence x_ν ∈ S(x2). From the definition of S(x2) we conclude that f(x_ν) ≥ f(x2) = min{f(x1), f(x2)}. Therefore f has to be quasiconcave.
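The theorem can be illustrated numerically: for a quasiconcave f, convex combinations of points in S(x) stay in S(x). A sketch with the illustrative choice f(x) = −|x|, whose superior set S(1) is the interval [−1, 1]:

```python
# Sketch: for the quasiconcave f(x) = -|x| the superior set
# S(1) = {x : f(x) >= f(1)} = [-1, 1].  Check on sample points that convex
# combinations of members of S(1) stay in S(1).
f = lambda x: -abs(x)

members = [i / 10 for i in range(-10, 11)]   # sample points of S(1)
superior_set_convex = all(
    f(nu * a + (1 - nu) * b) >= f(1.0)
    for a in members for b in members for nu in [0.25, 0.5, 0.75]
)
print(superior_set_convex)  # True
```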

Quasiconcave Functions (6) Theorem - Gradients and Quasiconcave Functions: If f is continuously differentiable and quasiconcave, then ∇f(x)·(x′ − x) ≥ 0 whenever f(x′) ≥ f(x) (and vice versa). [Theorem M.C.3] If ∇f(x)·(x′ − x) > 0 whenever f(x′) ≥ f(x) and x′ ≠ x, then f(x) is strictly quasiconcave. If f(x) is strictly quasiconcave and if ∇f(x) ≠ 0 for all x ∈ A, then ∇f(x)·(x′ − x) > 0 whenever f(x′) ≥ f(x) and x′ ≠ x.
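The gradient condition can be checked on a grid. A sketch with the quasiconcave function f(x, y) = x·y on the positive orthant (an illustrative choice), where ∇f(x, y) = (y, x):

```python
# Sketch: check the gradient condition of Theorem M.C.3,
#   grad f(x) . (x' - x) >= 0 whenever f(x') >= f(x),
# for the quasiconcave f(x, y) = x*y on the positive orthant.
f = lambda x, y: x * y
grad = lambda x, y: (y, x)

pts = [(a / 2, b / 2) for a in range(1, 7) for b in range(1, 7)]
gradient_condition = all(
    grad(x, y)[0] * (xp - x) + grad(x, y)[1] * (yp - y) >= -1e-12
    for (x, y) in pts for (xp, yp) in pts
    if f(xp, yp) >= f(x, y)
)
print(gradient_condition)  # True
```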

Quasiconcave Functions (7) Proof (⇒): For f(x′) ≥ f(x) and α ∈ (0, 1] the definition of a quasiconcave function implies: [f(x + α(x′ − x)) − f(x)]/α ≥ 0. If f is differentiable, then the limit as α → 0 exists, such that ∇f(x)·z ≥ 0, with z = x′ − x.

Quasiconcave Functions (8) Proof (⇐): Suppose that ∇f(x)·z ≥ 0 holds whenever f(x + z) ≥ f(x), but f is not quasiconcave. Then f(x + αz) − f(x) < 0 for some such x, z and α ∈ (0, 1], so that [f(x + αz) − f(x)]/α < 0. Taking the limit results in a contradiction.

Quasiconcave Functions (9) Theorem - Hessian Matrix and Quasiconcave Functions: Suppose f is twice continuously differentiable. f(x) is quasiconcave if and only if D²f(x) is negative semidefinite on the subspace {z | ∇f(x)·z = 0}, i.e. z⊤D²f(x)z ≤ 0 whenever ∇f(x)·z = 0. [Theorem M.C.4] If the Hessian D²f(x) is negative definite on the subspace {z | ∇f(x)·z = 0} for every x ∈ A, then f(x) is strictly quasiconcave.
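The theorem can again be illustrated with f(x, y) = x·y on the positive orthant: ∇f(x, y) = (y, x), so the subspace {z | ∇f(x)·z = 0} is spanned by z = (x, −y), and the Hessian of f is [[0, 1], [1, 0]]. A sketch:

```python
# Sketch: Theorem M.C.4 for the illustrative f(x, y) = x*y on the positive
# orthant.  grad f = (y, x), so {z : grad f . z = 0} is spanned by (x, -y);
# the Hessian is [[0, 1], [1, 0]], hence z' D^2f z = 2*z1*z2 = -2*x*y <= 0.
def quad_form_on_subspace(x, y):
    z1, z2 = x, -y      # grad f . z = y*x + x*(-y) = 0
    return 2 * z1 * z2  # z' [[0, 1], [1, 0]] z

print(all(quad_form_on_subspace(x / 2, y / 2) <= 0
          for x in range(1, 7) for y in range(1, 7)))  # True
```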

Quasiconcave Functions (10) Proof (⇒): If f is quasiconcave then f(x_ν) ≥ f(x) whenever f(x1) ≥ f(x), so ∇f(x)·(αz) ≥ 0 has to hold. Thus f(x1) − f(x) ≥ 0 and the above theorem imply: ∇f(x)·z ≥ 0, where z = x1 − x. A first order Taylor series expansion of f in α (at α = 0), with Lagrange remainder, results in f(x + αz) = f(x) + ∇f(x)·(αz) + (α²/2)·z⊤D²f(x + β(α)z)z.

Quasiconcave Functions (11) Apply this to x1, x with f(x1) ≥ f(x): f(x + αz) − f(x) − ∇f(x)·(αz) = (α²/2)·z⊤D²f(x + β(α)z)z. If z = x1 − x fulfills ∇f(x)·(x1 − x) = 0, the above inequality still has to hold. This implies (α²/2)·z⊤D²f(x + β(α)z)z ≤ 0.

Quasiconcave Functions (12) To fulfill this requirement on the subspace {z | ∇f(x)·z = 0}, where ∇f(x)·(αz) = 0, a negative semidefinite Hessian of f(x) on this subspace is required. Proof (⇐): In the above equation a negative semidefinite Hessian implies that ....

Envelope Theorem (1) Consider f(x; q), where x ∈ R^N are variables and q ∈ R^S are parameters. We look at the constrained maximization problem max_x f(x; q) s.t. g_m(x; q) ≤ b_m, m = 1, ..., M. Assume that the solution of this optimization problem, x = x(q), is an (at least locally) differentiable function in a neighborhood of the q̄ considered. v(q) = f(x(q); q) is the maximum value function associated with this problem.

Envelope Theorem (2) With no constraints (M = 0) and S = N = 1 the chain rule yields: dv(q̄)/dq = [∂f(x(q̄); q̄)/∂x]·[dx(q̄)/dq] + ∂f(x(q̄); q̄)/∂q. With an unconstrained maximization problem the first order condition ∂f(x(q̄); q̄)/∂x = 0 results in dv(q̄)/dq = ∂f(x(q̄); q̄)/∂q.
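The unconstrained case can be verified numerically. A sketch with the illustrative objective f(x; q) = −x² + qx, so that x(q) = q/2, v(q) = q²/4, and ∂f/∂q = x:

```python
# Sketch: envelope theorem in the unconstrained case, for the illustrative
# objective f(x; q) = -x^2 + q*x.  The maximizer is x(q) = q/2, so
# v(q) = q^2/4, and the theorem predicts dv/dq = df/dq = x evaluated at x(q).
def v(q):
    x = q / 2            # argmax_x of -x^2 + q*x
    return -x * x + q * x

q, h = 3.0, 1e-6
dv_numeric = (v(q + h) - v(q - h)) / (2 * h)   # central difference
df_dq_at_opt = q / 2                           # df/dq = x, at x(q) = q/2
print(abs(dv_numeric - df_dq_at_opt) < 1e-6)   # True
```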

Envelope Theorem (3) [Theorem M.L.1] Consider the value function v(q) for the above constrained maximization problem. Assume that v(q) is differentiable at q̄ ∈ R^S and that (λ1, ..., λM) are the Lagrange multipliers associated with the maximizer solution x(q̄) at q̄. In addition, assume that the set of binding inequality constraints remains unaltered in a neighborhood of q̄. Then ∂v(q̄)/∂q_s = ∂f(x(q̄); q̄)/∂q_s − Σ_{m=1}^{M} λ_m · ∂g_m(x(q̄); q̄)/∂q_s, for s = 1, ..., S.

Envelope Theorem (4) Proof: In the constrained case the chain rule yields: ∂v(q̄)/∂q_s = Σ_{n=1}^{N} [∂f(x(q̄); q̄)/∂x_n]·[∂x_n(q̄)/∂q_s] + ∂f(x(q̄); q̄)/∂q_s. The first order conditions tell us ∂f(x(q̄); q̄)/∂x_n = Σ_{m=1}^{M} λ_m · ∂g_m(x(q̄); q̄)/∂x_n.

Envelope Theorem (5) In addition we observe Σ_{n=1}^{N} [∂g_m(x(q̄); q̄)/∂x_n]·[∂x_n(q̄)/∂q_s] + ∂g_m(x(q̄); q̄)/∂q_s = 0 if a constraint is binding; if not, the multiplier λ_m is zero.

Envelope Theorem (6) Plugging in and changing the order of summation results in: ∂v(q̄)/∂q_s = Σ_{m=1}^{M} λ_m Σ_{n=1}^{N} [∂g_m(x(q̄); q̄)/∂x_n]·[∂x_n(q̄)/∂q_s] + ∂f(x(q̄); q̄)/∂q_s and ∂v(q̄)/∂q_s = −Σ_{m=1}^{M} λ_m · ∂g_m(x(q̄); q̄)/∂q_s + ∂f(x(q̄); q̄)/∂q_s. Remark: remember that the Lagrangian of the problem is L(x, λ; q) = f(x; q) − Σ_m λ_m g_m(x; q). Hence we get ∂v(q̄)/∂q_s by means of the partial derivative of the Lagrangian with respect to q_s, evaluated at q̄.
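The constrained formula can be verified numerically as well. A sketch using the illustrative problem max x1·x2 s.t. p1·x1 + p2·x2 ≤ w with parameter q = w, where the interior solution is x_i(w) = w/(2·p_i) and the multiplier is λ = w/(2·p1·p2):

```python
# Sketch: constrained envelope theorem on the illustrative problem
#   max x1*x2  s.t.  g = p1*x1 + p2*x2 - w <= 0,   parameter q = w.
# Interior solution: x_i(w) = w/(2*p_i), so v(w) = w^2/(4*p1*p2) and
# lam = w/(2*p1*p2).  The theorem gives
#   dv/dw = df/dw - lam * dg/dw = 0 - lam*(-1) = lam.
p1, p2, w = 2.0, 5.0, 20.0

def v(wealth):
    x1, x2 = wealth / (2 * p1), wealth / (2 * p2)  # interior maximizer
    return x1 * x2

lam = w / (2 * p1 * p2)
h = 1e-6
dv_numeric = (v(w + h) - v(w - h)) / (2 * h)   # central difference
print(abs(dv_numeric - lam) < 1e-6)  # True
```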