Calculus and optimization
- Duane Russell
These notes essentially correspond to mathematical appendix 2 in the text.

1 Functions of a single variable

Now that we have defined functions, we turn our attention to calculus. A function $f$ is differentiable if it is continuous and "smooth," with no breaks or kinks. The derivative, $f'(x)$, gives the slope, or instantaneous rate of change, of $f(x)$. If $f'(x)$ is itself differentiable, then we can find the second derivative, $f''(x)$, too. The second derivative is related to the curvature of the original function, as it tells how the first derivative is changing. Table 1 provides some common rules of differentiation.

We now state some equivalence relations for concave functions.

Theorem 1 Let $D$ be a nondegenerate interval of real numbers on which $f$ is twice continuously differentiable. The following statements are equivalent:

1. $f$ is concave.
2. $f''(x) \leq 0$ for all $x \in D$.
3. For all $x^0 \in D$: $f(x) \leq f(x^0) + f'(x^0)(x - x^0)$.

Moreover, if $f''(x) < 0$ for all $x \in D$, then $f$ is strictly concave.

From our prior theorem relating convex and concave functions, we also obtain a theorem for convex functions by reversing the inequalities and changing the word "concave" to "convex."

2 Functions of several variables

Many times we will work with functions of several variables. In consumer theory, consumers have utility functions which are functions of the quantities consumed of various goods; in producer theory, firms have cost functions which are functions of the quantities of inputs used to produce a good. Since there are multiple variables, it is sometimes easier to think in terms of the slope with respect to one particular variable, holding the other variables constant.

Definition 2 Let $y = f(x_1, \ldots, x_n)$.
The partial derivative of $f$ with respect to $x_i$ is defined as:

$$\frac{\partial f(\mathbf{x})}{\partial x_i} \equiv \lim_{h \to 0} \frac{f(x_1, \ldots, x_i + h, \ldots, x_n) - f(x_1, \ldots, x_i, \ldots, x_n)}{h} \tag{1}$$

Constants: $\frac{d}{dx}(c) = 0$
Sums: $\frac{d}{dx}\left(f(x) \pm g(x)\right) = f'(x) \pm g'(x)$
Power rule: $\frac{d}{dx}(x^n) = nx^{n-1}$
Product rule: $\frac{d}{dx}\left(f(x)\,g(x)\right) = f(x)g'(x) + f'(x)g(x)$
Quotient rule: $\frac{d}{dx}\left(\frac{f(x)}{g(x)}\right) = \frac{g(x)f'(x) - f(x)g'(x)}{(g(x))^2}$
Chain rule: $\frac{d}{dx}\left(f(g(x))\right) = f'(g(x))\,g'(x)$
ln rule: $\frac{d}{dx}(\ln x) = \frac{1}{x}$

Table 1: Common rules of differentiation
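As a quick sanity check on Table 1 and the limit definition of the derivative, here is a small Python sketch; the particular functions $x^2$ and $\sin x$ are my own illustration, not from the notes.

```python
import math

# Verify the product rule from Table 1 numerically, using the difference
# quotient (f(x + h) - f(x)) / h for small h to approximate the derivative.
def f(x): return x ** 2
def g(x): return math.sin(x)

def diff(fn, x, h=1e-6):
    # Difference quotient approximating the limit definition of f'(x).
    return (fn(x + h) - fn(x)) / h

x = 1.3
product = lambda t: f(t) * g(t)
numeric = diff(product, x)
# Product rule: (fg)'(x) = f(x)g'(x) + f'(x)g(x), with f' = 2x, g' = cos.
exact = f(x) * math.cos(x) + 2 * x * g(x)
assert abs(numeric - exact) < 1e-4
```

The same pattern (difference quotient versus the rule's closed form) works for the quotient and chain rules as well.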
The first thing to notice is that there are $n$ partial derivatives. In essence, to find the partial derivative with respect to a particular variable, simply treat the other $n - 1$ variables as constants. If $f(x_1, x_2) = x_1^2 + 3x_1x_2 - x_2^2$, then we have:

$$\frac{\partial f}{\partial x_1} = 2x_1 + 3x_2 \qquad \frac{\partial f}{\partial x_2} = 3x_1 - 2x_2 \tag{2}$$

If we collect the $n$ partial derivatives into a row vector, we obtain the gradient of the function. For our example the gradient, denoted $\nabla f(\mathbf{x})$, is:

$$\nabla f(\mathbf{x}) = \begin{bmatrix} 2x_1 + 3x_2 & 3x_1 - 2x_2 \end{bmatrix} \tag{3}$$

We can also consider the partial derivatives of the gradient, as the second partials of the original function should provide information about the curvature of the original function. Note that if one takes the partial derivatives of a row vector of length $n$, the result will be an $n \times n$ matrix of second partials. This matrix of second partial derivatives is called the Hessian matrix. For our example $f(x_1, x_2) = x_1^2 + 3x_1x_2 - x_2^2$, the Hessian matrix, $H(\mathbf{x})$, is:

$$H(\mathbf{x}) = \begin{bmatrix} 2 & 3 \\ 3 & -2 \end{bmatrix} \tag{4}$$

A theorem which we will be able to invoke later is Young's Theorem, which basically states that the order in which the partial derivatives are taken does not affect the resulting second partial derivative:

Theorem 3 (Young's Theorem) For any twice differentiable function $f$,

$$\frac{\partial^2 f}{\partial x_i \partial x_j} = \frac{\partial^2 f}{\partial x_j \partial x_i} \tag{5}$$

We can see in our example that $\frac{\partial^2 f}{\partial x_1 \partial x_2} = \frac{\partial^2 f}{\partial x_2 \partial x_1} = 3$. While we will not get into the details of curvature, it should be helpful to state some results:

Theorem 4 Let $D$ be a convex subset of $\mathbb{R}^n$ with a nonempty interior on which $f$ is twice continuously differentiable. The following statements are equivalent:

1. $f$ is concave.
2. $H(\mathbf{x})$ is negative semidefinite for all $\mathbf{x}$ in $D$.
3. For all $\mathbf{x}^0 \in D$: $f(\mathbf{x}) \leq f(\mathbf{x}^0) + \nabla f(\mathbf{x}^0)(\mathbf{x} - \mathbf{x}^0)$.

2.1 Homogeneous functions

A homogeneous function is a real-valued function which behaves in a particular way as all variables are increased simultaneously and in the same proportion.
Definition 5 A real-valued function $f$ is called homogeneous of degree $k$ if

$$f(t\mathbf{x}) \equiv t^k f(\mathbf{x}) \quad \text{for all } t > 0 \tag{6}$$

If a function is homogeneous of degree 0, then increasing all the variables in the same proportion will leave the value of the function unchanged. If the function is homogeneous of degree 1, then increasing all the variables in the same proportion will increase the value of the function in the same proportion (if all the variables are doubled, the value of the function doubles, etc.).
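Several of the claims above can be checked numerically on the running example $f(x_1, x_2) = x_1^2 + 3x_1x_2 - x_2^2$ from section 2: the gradient and Hessian formulas, Young's Theorem, and (since every term has total degree 2) homogeneity of degree 2 in the sense of Definition 5. A minimal sketch using only the standard library:

```python
# Numerical checks on the running example f(x1, x2) = x1^2 + 3*x1*x2 - x2^2.
def f(x1, x2):
    return x1**2 + 3 * x1 * x2 - x2**2

def partial(fn, i, x, h=1e-5):
    # Central-difference partial derivative with respect to argument i.
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return (fn(*xp) - fn(*xm)) / (2 * h)

x = (1.0, 2.0)
# Gradient: [2*x1 + 3*x2, 3*x1 - 2*x2] = [8, -1] at (1, 2).
assert abs(partial(f, 0, x) - 8) < 1e-6
assert abs(partial(f, 1, x) - (-1)) < 1e-6

# Hessian via partials of the partials; should be [[2, 3], [3, -2]].
H = [[partial(lambda *z: partial(f, j, z), i, x, h=1e-4) for j in (0, 1)]
     for i in (0, 1)]
assert abs(H[0][1] - 3) < 1e-3
assert abs(H[0][1] - H[1][0]) < 1e-3   # Young's Theorem: f_12 = f_21

# Homogeneity of degree 2: f(t*x) = t**2 * f(x) for t > 0.
for t in (0.5, 2.0, 10.0):
    assert abs(f(t * x[0], t * x[1]) - t**2 * f(*x)) < 1e-9
```

Because $f$ is a quadratic polynomial, the central differences here are essentially exact; for general functions they are only approximations.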
3 Optimization

Suppose we have a single-variable function $f(x)$. Assume that $f(x)$ is differentiable. The function $f(x)$ achieves a local maximum at $x^*$ if $f(x^*) \geq f(x)$ for all $x$ in some neighborhood of $x^*$. The function $f(x)$ achieves a global maximum at $x^*$ if $f(x^*) \geq f(x)$ for all $x$ in the domain of the function. The local maximum is unique if $f(x^*) > f(x)$ for all $x \neq x^*$ in some neighborhood of $x^*$, and the global maximum is unique if $f(x^*) > f(x)$ for all $x \neq x^*$ in the domain of the function. Similarly, the function achieves a local minimum at $\tilde{x}$ if $f(\tilde{x}) \leq f(x)$ for all $x$ in some neighborhood of $\tilde{x}$, and a global minimum at $\tilde{x}$ if $f(\tilde{x}) \leq f(x)$ for all $x$ in the domain of the function. If the inequalities are strict, then the minimum is unique.

What follows are the first-order necessary conditions and second-order necessary conditions for the optima of a general twice continuously differentiable function of one variable:

Theorem 6 Let $f(x)$ be a twice continuously differentiable function of one variable. Then $f(x)$ reaches a local interior

1. maximum at $x^*$ $\Rightarrow$ $f'(x^*) = 0$ and $f''(x^*) \leq 0$;
2. minimum at $\tilde{x}$ $\Rightarrow$ $f'(\tilde{x}) = 0$ and $f''(\tilde{x}) \geq 0$.

3.1 Optima for real-valued functions of multiple variables

Suppose that $D \subseteq \mathbb{R}^n$ and let $f : D \to \mathbb{R}$ be a twice continuously differentiable real-valued function of $n$ variables. Intuitively, a local maximum is achieved at $\mathbf{x}^*$ if no small movement away from $\mathbf{x}^*$ results in the function increasing. The first-order necessary conditions for a local interior optimum at $\mathbf{x}^*$ are as follows:

Theorem 7 If the differentiable function $f(\mathbf{x})$ reaches a local interior maximum or minimum at $\mathbf{x}^*$, then $\mathbf{x}^*$ solves the system of simultaneous equations:

$$\frac{\partial f(\mathbf{x}^*)}{\partial x_1} = 0, \quad \frac{\partial f(\mathbf{x}^*)}{\partial x_2} = 0, \quad \ldots, \quad \frac{\partial f(\mathbf{x}^*)}{\partial x_n} = 0 \tag{7}$$

The second-order necessary conditions are as follows:

Theorem 8 Let $f(\mathbf{x})$ be twice continuously differentiable.

1. If $f(\mathbf{x})$ reaches a local interior maximum at $\mathbf{x}^*$, then $H(\mathbf{x}^*)$ is negative semidefinite.
2. If $f(\mathbf{x})$ reaches a local interior minimum at $\tilde{\mathbf{x}}$, then $H(\tilde{\mathbf{x}})$ is positive semidefinite.
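Theorems 7 and 8 can be illustrated with a small numerical sketch; the function $f(x_1, x_2) = -(x_1 - 1)^2 - (x_2 + 2)^2$ is my own example, chosen because it visibly peaks at $(1, -2)$.

```python
# f has a (global) interior maximum at x* = (1, -2): the gradient should
# vanish there (Theorem 7) and the Hessian should be negative semidefinite
# (Theorem 8).
def f(x1, x2):
    return -(x1 - 1) ** 2 - (x2 + 2) ** 2

def partial(fn, i, x, h=1e-6):
    # Central-difference partial derivative with respect to argument i.
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return (fn(*xp) - fn(*xm)) / (2 * h)

x_star = (1.0, -2.0)
grad = [partial(f, i, x_star) for i in (0, 1)]
assert all(abs(g) < 1e-8 for g in grad)   # first-order conditions (7)

# The Hessian is diag(-2, -2): both diagonal entries negative and the
# determinant positive, so it is negative definite, as Theorem 8 requires.
H = [[-2.0, 0.0], [0.0, -2.0]]
assert H[0][0] < 0 and H[0][0] * H[1][1] - H[0][1] * H[1][0] > 0

# x* beats nearby points, confirming a local maximum:
assert f(*x_star) >= f(1.1, -2.0) and f(*x_star) >= f(1.0, -1.9)
```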
Again, we will not get into the details of negative and positive semidefiniteness, but the function will be strictly concave if the principal minors of the Hessian matrix alternate in sign, beginning with a negative sign. It will be strictly convex if the principal minors of the Hessian matrix are all positive.

3.2 Constrained optimization

Thus far we have focused on unconstrained optimization. However, in many problems there are constraints which must be met. They may be equality constraints, such that we are optimizing a particular function $f(x_1, x_2)$ subject to the equality constraint $g(x_1, x_2) = 0$. They may be constraints as simple as nonnegativity constraints, so that $x_1 \geq 0$ and $x_2 \geq 0$. They may be more complex inequality constraints, such that we are optimizing the function $f(x_1, x_2)$ subject to $g(x_1, x_2) \geq 0$. The function $g(x_1, x_2)$ may be linear or nonlinear.
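The principal-minor test mentioned above is mechanical for a $2 \times 2$ Hessian, where the leading principal minors are the top-left entry and the determinant. A small sketch (the matrix values are my own illustration):

```python
# Classify a 2x2 Hessian by its leading principal minors.
def leading_minors_2x2(H):
    return [H[0][0], H[0][0] * H[1][1] - H[0][1] * H[1][0]]

def classify(H):
    m1, m2 = leading_minors_2x2(H)
    if m1 < 0 and m2 > 0:        # minors alternate in sign, starting negative
        return "strictly concave"
    if m1 > 0 and m2 > 0:        # all minors positive
        return "strictly convex"
    return "neither"

assert classify([[-2, 0], [0, -3]]) == "strictly concave"
assert classify([[2, 0], [0, 3]]) == "strictly convex"
# The Hessian of the earlier example, [[2, 3], [3, -2]], has determinant
# -13 < 0, so that function is neither concave nor convex (a saddle shape).
assert classify([[2, 3], [3, -2]]) == "neither"
```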
3.2.1 Equality constraints

Formally, an optimization problem with an equality constraint is set up as:

$$\max_{x_1, x_2} f(x_1, x_2) \quad \text{s.t.} \quad g(x_1, x_2) = 0 \tag{8}$$

The function which we are optimizing, $f$ in this instance, is called the objective function. The variables that are being chosen, $x_1$ and $x_2$ in this problem, are the choice variables. The function $g(x_1, x_2)$ is called the constraint function. With equality-constrained optimization problems, the problem can be solved by substitution. If we solve for one of the variables in the constraint function, say $x_2$, we would have a new function:

$$x_2 = \tilde{g}(x_1) \tag{9}$$

Substitute this directly into the objective function and the problem becomes:

$$\max_{x_1} f(x_1, \tilde{g}(x_1)) \tag{10}$$

and then we are maximizing a function of a single variable. We then set the first derivative equal to zero and find the critical value for $x_1$. This first-order condition is:

$$\frac{\partial f(x_1, \tilde{g}(x_1))}{\partial x_1} + \frac{\partial f(x_1, \tilde{g}(x_1))}{\partial x_2} \frac{d\tilde{g}(x_1)}{dx_1} = 0 \tag{11}$$

Once $x_1$ is known, we find $x_2$ by using $x_2 = \tilde{g}(x_1)$. For simple problems this substitution method works well; when the constraint functions are complex or there are multiple choice variables (or constraints), this method becomes tedious.

3.2.2 Lagrange's method

Solving unconstrained optimization problems is (relatively) easy. The idea that Lagrange had was to turn constrained optimization problems into unconstrained optimization problems. Our problem from before is:

$$\max_{x_1, x_2} f(x_1, x_2) \quad \text{s.t.} \quad g(x_1, x_2) = 0 \tag{12}$$

Now, multiply the constraint by the variable $\lambda$ (why? that comes later). Add that product to the objective function, and we have created a new function called the Lagrangian function (or Lagrangian).
The Lagrangian is:

$$\mathcal{L}(x_1, x_2, \lambda) = f(x_1, x_2) + \lambda g(x_1, x_2) \tag{13}$$

The first-order necessary conditions for optimizing the Lagrangian are the set of partial derivatives set equal to zero:

$$\frac{\partial \mathcal{L}}{\partial x_1} = \frac{\partial f(x_1^*, x_2^*)}{\partial x_1} + \lambda^* \frac{\partial g(x_1^*, x_2^*)}{\partial x_1} = 0 \tag{14}$$

$$\frac{\partial \mathcal{L}}{\partial x_2} = \frac{\partial f(x_1^*, x_2^*)}{\partial x_2} + \lambda^* \frac{\partial g(x_1^*, x_2^*)}{\partial x_2} = 0 \tag{15}$$

$$\frac{\partial \mathcal{L}}{\partial \lambda} = g(x_1^*, x_2^*) = 0 \tag{16}$$

The idea of Lagrange's method is that if we solve these three equations simultaneously for $x_1$, $x_2$, and $\lambda$, we will have found a critical point of $f(x_1, x_2)$ along the constraint $g(x_1, x_2) = 0$. While we will not go through all of the details to prove why this method works, the book provides a nice discussion. Also, as of now we do not know whether these critical values are maxima or minima, but some comments on this topic will be made shortly.

Note that Lagrange's method can be used for any number of variables ($n$) and any number of constraints ($m$) so long as the number of constraints is less than the number of variables ($m < n$). There is still the question of whether a solution actually exists for a problem and whether the $\lambda$ variables exist. While these are important questions, a general discussion of these concepts is more than is needed here. When we discuss particular economic optimization problems, we will discuss these concepts in slightly more detail. The formal statement of Lagrange's theorem is here (note that $\boldsymbol{\lambda}^*$ is a vector of the $\lambda$ variables):
Theorem 9 Let $f(\mathbf{x})$ and $g_j(\mathbf{x})$, $j = 1, \ldots, m$, be continuously differentiable real-valued functions over some domain $D \subseteq \mathbb{R}^n$. Let $\mathbf{x}^*$ be an interior point of $D$ and suppose that $\mathbf{x}^*$ is an optimum (maximum or minimum) of $f$ subject to the constraints $g_j(\mathbf{x}^*) = 0$, $j = 1, \ldots, m$. If the gradient vectors $\nabla g_j(\mathbf{x}^*)$, $j = 1, \ldots, m$, are linearly independent, then there exist $m$ unique numbers $\lambda_j^*$, $j = 1, \ldots, m$, such that

$$\frac{\partial \mathcal{L}(\mathbf{x}^*, \boldsymbol{\lambda}^*)}{\partial x_i} = \frac{\partial f(\mathbf{x}^*)}{\partial x_i} + \sum_{j=1}^{m} \lambda_j^* \frac{\partial g_j(\mathbf{x}^*)}{\partial x_i} = 0 \quad \text{for } i = 1, \ldots, n \tag{17}$$

So this theorem guarantees us the existence of the $\lambda$ variables. As for determining whether the constrained function is attaining a maximum or a minimum, one can check the determinant of the bordered Hessian of the Lagrangian function. The bordered Hessian is simply the Hessian matrix of the Lagrangian function bordered by the first-order partial derivatives of the constraint equation and a zero. The bordered Hessian for a problem with two choice variables and one constraint is:

$$\bar{H} = \begin{bmatrix} L_{11} & L_{12} & g_1 \\ L_{21} & L_{22} & g_2 \\ g_1 & g_2 & 0 \end{bmatrix}$$

where $L_{ij} = \frac{\partial^2 \mathcal{L}}{\partial x_i \partial x_j}$ and $g_i = \frac{\partial g}{\partial x_i}$. As a practical matter, in this course we will not be concerned with determining the sign of the determinant of the bordered Hessian matrix, but you are encouraged to read the text for more detail.

3.2.3 Inequality constraints

In some problems we will have inequality constraints to contend with. Many times we assume that the choice variables must be nonnegative, or $x_i \geq 0$. Look at the following pictures and attempt to characterize the solution to the problem:

[Figure: three panels. Case 1: $f'(x^*) < 0$ with $x^* = 0$. Case 2: $f'(x^*) = 0$ with $x^* = 0$. Case 3: $f'(x^*) = 0$ with $x^* > 0$.]

In case 1 we have $x^* = 0$ and $f'(x^*) < 0$. In case 2 we have $x^* = 0$ and $f'(x^*) = 0$. In case 3 we have $x^* > 0$ and $f'(x^*) = 0$. In all three of these cases we have $x^*[f'(x^*)] = 0$. But this is not the only condition we need, because in case 3, when $\tilde{x} = 0$ we have $\tilde{x}[f'(\tilde{x})] = 0$ but $\tilde{x}$ is NOT a maximum. So the necessary conditions we need to have hold are:

$$f'(x^*) \leq 0 \tag{18}$$
$$x^*[f'(x^*)] = 0 \tag{19}$$
$$x^* \geq 0 \tag{20}$$
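The three conditions above can be checked mechanically at any candidate point; the example functions below are my own illustration, not from the notes.

```python
# Check the necessary conditions (18)-(20) for maximizing f over x >= 0.
def check_max_conditions(fprime, x_star, tol=1e-9):
    return (fprime(x_star) <= tol and                 # (18) f'(x*) <= 0
            abs(x_star * fprime(x_star)) <= tol and   # (19) x*[f'(x*)] = 0
            x_star >= -tol)                           # (20) x* >= 0

# Case 1: f(x) = -(x + 1)**2 on x >= 0 peaks at the corner x* = 0,
# where f'(0) = -2 < 0.
assert check_max_conditions(lambda x: -2 * (x + 1), 0.0)

# Case 3: f(x) = -(x - 2)**2 has an interior maximum at x* = 2, where f' = 0.
assert check_max_conditions(lambda x: -2 * (x - 2), 2.0)

# For that same case-3 function, the corner x = 0 fails the conditions,
# since f'(0) = 4 > 0: this is the "x-tilde is NOT a maximum" situation.
assert not check_max_conditions(lambda x: -2 * (x - 2), 0.0)
```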
The necessary conditions for a minimum are fairly similar to those for a maximum. We have:

$$f'(x^*) \geq 0 \tag{21}$$
$$x^*[f'(x^*)] = 0 \tag{22}$$
$$x^* \geq 0 \tag{23}$$

3.2.4 Kuhn-Tucker conditions

A general problem that we might encounter is:

$$\max_{x_1, x_2} f(x_1, x_2) \quad \text{s.t.} \quad g(x_1, x_2) \geq 0 \tag{24}$$

This is a nonlinear programming problem, whereas in a linear programming problem we have a linear function which is optimized subject to linear constraints. We make no such restrictions here. We can construct the Lagrangian to find:

$$\mathcal{L}(x_1, x_2, \lambda) = f(x_1, x_2) + \lambda[g(x_1, x_2)]. \tag{25}$$

The Kuhn-Tucker necessary conditions for this problem are:

$$\frac{\partial \mathcal{L}}{\partial x_1} = \frac{\partial f}{\partial x_1} + \lambda \frac{\partial g}{\partial x_1} = 0 \tag{26}$$
$$\frac{\partial \mathcal{L}}{\partial x_2} = \frac{\partial f}{\partial x_2} + \lambda \frac{\partial g}{\partial x_2} = 0 \tag{27}$$
$$\lambda g(x_1, x_2) = 0 \tag{28}$$
$$\lambda \geq 0, \quad g(x_1, x_2) \geq 0 \tag{29}$$

Now, there are many times in which we might restrict our choice variables $x_1$ and $x_2$ to be greater than or equal to 0. In those instances, our problem would be:

$$\max_{x_1 \geq 0, \; x_2 \geq 0} f(x_1, x_2) \quad \text{s.t.} \quad g(x_1, x_2) \geq 0 \tag{30}$$

Technically, the Lagrangian is:

$$\mathcal{L}(x_1, x_2, \lambda_1, \lambda_2, \lambda_3) = f(x_1, x_2) + \lambda_1[g(x_1, x_2)] + \lambda_2 x_1 + \lambda_3 x_2 \tag{31}$$

The Kuhn-Tucker necessary conditions for this type of problem are:

$$\frac{\partial \mathcal{L}}{\partial x_1} = 0, \quad \frac{\partial \mathcal{L}}{\partial x_2} = 0$$
$$\lambda_1 \geq 0, \quad g(x_1, x_2) \geq 0, \quad \lambda_1 g(x_1, x_2) = 0$$
$$\lambda_2 \geq 0, \quad x_1 \geq 0, \quad \lambda_2 x_1 = 0$$
$$\lambda_3 \geq 0, \quad x_2 \geq 0, \quad \lambda_3 x_2 = 0$$

Technically one would have to check all these conditions to see which constraints are binding and which are not. For most consumer and firm problems that we will solve, we will be able to "know" that $x_1 > 0$ and $x_2 > 0$ by looking at the objective function.
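To tie the constrained-optimization machinery together, here is a worked sketch on a hypothetical problem of my own (not from the notes): $\max x_1 x_2$ subject to $g(x_1, x_2) = 10 - x_1 - x_2 \geq 0$ and $x_1, x_2 \geq 0$. Since the objective rises in both variables, the resource constraint binds, and the Lagrange first-order conditions for the binding constraint pin down the solution; the code then verifies the full Kuhn-Tucker conditions there.

```python
# With the constraint binding, L = x1*x2 + lam1*(10 - x1 - x2) gives the
# first-order conditions
#   dL/dx1   = x2 - lam1      = 0
#   dL/dx2   = x1 - lam1      = 0
#   dL/dlam1 = 10 - x1 - x2   = 0
# whose solution is x1 = x2 = lam1 = 5.
def foc_equality(x1, x2, lam1):
    return (x2 - lam1, x1 - lam1, 10 - x1 - x2)

assert foc_equality(5.0, 5.0, 5.0) == (0.0, 0.0, 0.0)

# Full Kuhn-Tucker check using the three-multiplier Lagrangian
#   L = x1*x2 + lam1*(10 - x1 - x2) + lam2*x1 + lam3*x2,
# with lam2 = lam3 = 0 because the nonnegativity constraints are slack.
def kuhn_tucker_holds(x1, x2, lam1, lam2, lam3, tol=1e-9):
    g = 10 - x1 - x2
    stationarity = (abs(x2 - lam1 + lam2) < tol and
                    abs(x1 - lam1 + lam3) < tol)
    feasible = (g >= -tol and x1 >= -tol and x2 >= -tol and
                lam1 >= 0 and lam2 >= 0 and lam3 >= 0)
    # Complementary slackness: each multiplier times its constraint is zero.
    slack = (abs(lam1 * g) < tol and abs(lam2 * x1) < tol
             and abs(lam3 * x2) < tol)
    return stationarity and feasible and slack

assert kuhn_tucker_holds(5.0, 5.0, 5.0, 0.0, 0.0)

# Nearby feasible points do worse, consistent with a maximum at (5, 5):
assert 4.5 * 5.5 < 25.0 and 4.0 * 6.0 < 25.0
```

This is exactly the situation the notes describe at the end: by looking at the objective function we "know" $x_1 > 0$ and $x_2 > 0$, so only the resource constraint needs a positive multiplier.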
More informationOptimality Conditions
Chapter 2 Optimality Conditions 2.1 Global and Local Minima for Unconstrained Problems When a minimization problem does not have any constraints, the problem is to find the minimum of the objective function.
More informationthere is no special reason why the value of y should be fixed at y = 0.3. Any y such that
25. More on bivariate functions: partial erivatives integrals Although we sai that the graph of photosynthesis versus temperature in Lecture 16 is like a hill, in the real worl hills are three-imensional
More informationDynamics of Knowledge Creation and Transfer: The Two Person Case
Dynamics of Knowlege Creation an Transfer: The Two Person Case Marcus Berliant an Masahisa Fujita June 8, 2006 Abstract This paper presents a micro-moel of knowlege creation an transfer for a couple. Our
More information1 Definition of the derivative
Math 20A - Calculus by Jon Rogawski Chapter 3 - Differentiation Prepare by Jason Gais Definition of the erivative Remark.. Recall our iscussion of tangent lines from way back. We now rephrase this in terms
More informationand from it produce the action integral whose variation we set to zero:
Lagrange Multipliers Monay, 6 September 01 Sometimes it is convenient to use reunant coorinates, an to effect the variation of the action consistent with the constraints via the metho of Lagrange unetermine
More informationDifferentiability, Computing Derivatives, Trig Review. Goals:
Secants vs. Derivatives - Unit #3 : Goals: Differentiability, Computing Derivatives, Trig Review Determine when a function is ifferentiable at a point Relate the erivative graph to the the graph of an
More informationAdvanced Partial Differential Equations with Applications
MIT OpenCourseWare http://ocw.mit.eu 18.306 Avance Partial Differential Equations with Applications Fall 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.eu/terms.
More information1 Lecture 20: Implicit differentiation
Lecture 20: Implicit ifferentiation. Outline The technique of implicit ifferentiation Tangent lines to a circle Derivatives of inverse functions by implicit ifferentiation Examples.2 Implicit ifferentiation
More informationx = c of N if the limit of f (x) = L and the right-handed limit lim f ( x)
Limit We say the limit of f () as approaches c equals L an write, lim L. One-Sie Limits (Left an Right-Hane Limits) Suppose a function f is efine near but not necessarily at We say that f has a left-hane
More informationSection 7.1: Integration by Parts
Section 7.1: Integration by Parts 1. Introuction to Integration Techniques Unlike ifferentiation where there are a large number of rules which allow you (in principle) to ifferentiate any function, the
More informationChapter 2 Lagrangian Modeling
Chapter 2 Lagrangian Moeling The basic laws of physics are use to moel every system whether it is electrical, mechanical, hyraulic, or any other energy omain. In mechanics, Newton s laws of motion provie
More informationChapter 3 Notes, Applied Calculus, Tan
Contents 3.1 Basic Rules of Differentiation.............................. 2 3.2 The Prouct an Quotient Rules............................ 6 3.3 The Chain Rule...................................... 9 3.4
More informationG j dq i + G j. q i. = a jt. and
Lagrange Multipliers Wenesay, 8 September 011 Sometimes it is convenient to use reunant coorinates, an to effect the variation of the action consistent with the constraints via the metho of Lagrange unetermine
More informationApplications of the Wronskian to ordinary linear differential equations
Physics 116C Fall 2011 Applications of the Wronskian to orinary linear ifferential equations Consier a of n continuous functions y i (x) [i = 1,2,3,...,n], each of which is ifferentiable at least n times.
More informationOutline. MS121: IT Mathematics. Differentiation Rules for Differentiation: Part 1. Outline. Dublin City University 4 The Quotient Rule
MS2: IT Mathematics Differentiation Rules for Differentiation: Part John Carroll School of Mathematical Sciences Dublin City University Pattern Observe You may have notice the following pattern when we
More informationTable of Common Derivatives By David Abraham
Prouct an Quotient Rules: Table of Common Derivatives By Davi Abraham [ f ( g( ] = [ f ( ] g( + f ( [ g( ] f ( = g( [ f ( ] g( g( f ( [ g( ] Trigonometric Functions: sin( = cos( cos( = sin( tan( = sec
More informationImplicit Differentiation. Lecture 16.
Implicit Differentiation. Lecture 16. We are use to working only with functions that are efine explicitly. That is, ones like f(x) = 5x 3 + 7x x 2 + 1 or s(t) = e t5 3, in which the function is escribe
More informationSolutions to Math 41 Second Exam November 4, 2010
Solutions to Math 41 Secon Exam November 4, 2010 1. (13 points) Differentiate, using the metho of your choice. (a) p(t) = ln(sec t + tan t) + log 2 (2 + t) (4 points) Using the rule for the erivative of
More informationII. An Application of Derivatives: Optimization
Anne Sibert Autumn 2013 II. An Application of Derivatives: Optimization In this section we consider an important application of derivatives: finding the minimum and maximum points. This has important applications
More informationChapter 1. Functions, Graphs, and Limits
Review for Final Exam Lecturer: Sangwook Kim Office : Science & Tech I, 226D math.gmu.eu/ skim22 Chapter 1. Functions, Graphs, an Limits A function is a rule that assigns to each objects in a set A exactly
More informationMath 251 Notes. Part I.
Math 251 Notes. Part I. F. Patricia Meina May 6, 2013 Growth Moel.Consumer price inex. [Problem 20, page 172] The U.S. consumer price inex (CPI) measures the cost of living base on a value of 100 in the
More informationLecture 10: October 30, 2017
Information an Coing Theory Autumn 2017 Lecturer: Mahur Tulsiani Lecture 10: October 30, 2017 1 I-Projections an applications In this lecture, we will talk more about fining the istribution in a set Π
More informationOptimization Tutorial 1. Basic Gradient Descent
E0 270 Machine Learning Jan 16, 2015 Optimization Tutorial 1 Basic Gradient Descent Lecture by Harikrishna Narasimhan Note: This tutorial shall assume background in elementary calculus and linear algebra.
More informationSchrödinger s equation.
Physics 342 Lecture 5 Schröinger s Equation Lecture 5 Physics 342 Quantum Mechanics I Wenesay, February 3r, 2010 Toay we iscuss Schröinger s equation an show that it supports the basic interpretation of
More informationCalculus I Announcements
Slie 1 Calculus I Announcements Office Hours: Amos Eaton 309, Monays 12:50-2:50 Exam 2 is Thursay, October 22n. The stuy guie is now on the course web page. Start stuying now, an make a plan to succee.
More informationCalculus Example Exam Solutions
Calculus Example Exam Solutions. Limits (8 points, 6 each) Evaluate the following limits: p x 2 (a) lim x 4 We compute as follows: lim p x 2 x 4 p p x 2 x +2 x 4 p x +2 x 4 (x 4)( p x + 2) p x +2 = p 4+2
More informationFinal Exam: Sat 12 Dec 2009, 09:00-12:00
MATH 1013 SECTIONS A: Professor Szeptycki APPLIED CALCULUS I, FALL 009 B: Professor Toms C: Professor Szeto NAME: STUDENT #: SECTION: No ai (e.g. calculator, written notes) is allowe. Final Exam: Sat 1
More information