CALCULUS FOR THE MORNING COMMUTE


1 CALCULUS FOR THE MORNING COMMUTE
Prof. David Bernstein, Department of Computer Science, JMU
Department of Mathematics and Statistics Fall 2007 Colloquia

2 Calculus for the Morning Commute -1 Motivation
Some History: The shortest path problem is easy to solve. Twenty-five years ago it became clear that we could develop route guidance systems if we only had network data.
The Breakthrough (in the U.S.): The Census Bureau digitized the entire country.
Today: A few companies maintain and market (very similar) street databases, and many companies develop and market route guidance systems.
A Thought Experiment: What would happen if everyone used such a system to determine their route from home to work?

3 Calculus for the Morning Commute -2 Notation
The Network:
N denotes the set of nodes
A denotes the set of arcs (or links)
W ⊆ N² denotes the set of origin-destination pairs
R_w denotes the (nonempty) set of routes connecting w ∈ W
R = ∪_{w ∈ W} R_w (with the R_w pairwise disjoint)
c : Z_+^{|R|} → R_+^{|R|} denotes the vector of route cost functions
Demand and Flows:
D = (D_w : w ∈ W) ∈ Z_{++}^{|W|} denotes the demand vector
h = (h_r : r ∈ R) ∈ Z_+^{|R|} denotes the route flow vector
H_D = { h ∈ Z_+^{|R|} : Σ_{r ∈ R_w} h_r = D_w, w ∈ W } denotes the set of feasible route flow patterns
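The notation above can be sketched directly in code. This is a minimal illustration, not from the talk: the names (routes_by_od, demand, is_feasible) and the single OD pair are assumptions made for the example.

```python
# R_w: routes serving each OD pair w; D_w: demand per OD pair.
routes_by_od = {
    ("home", "work"): ["r1", "r2", "r3"],
}
demand = {("home", "work"): 12}

def is_feasible(h, routes_by_od, demand):
    """h is in H_D iff route flows are nonnegative integers that
    sum to D_w for every OD pair w."""
    if any(flow < 0 or flow != int(flow) for flow in h.values()):
        return False
    return all(sum(h[r] for r in routes) == demand[w]
               for w, routes in routes_by_od.items())

h = {"r1": 3, "r2": 9, "r3": 0}
print(is_feasible(h, routes_by_od, demand))  # True
```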

4 Calculus for the Morning Commute -3 Behavioral Model
The Key Behavioral Assumption: Each commuter attempts to minimize her/his own travel cost, given the behavior of other commuters.
The Equilibrium Model: A flow pattern h ∈ H_D is said to be a (Nash) network equilibrium iff:
h_r > 0 ⇒ c_r(h) ≤ c_s(h + 1_s − 1_r) for all s ∈ R_w
for all r ∈ R_w and w ∈ W.
An Interpretation: In equilibrium, no commuter has any incentive to change her/his route.
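The equilibrium test can be sketched as a direct check of the condition above: for every used route r, shifting one commuter to any alternative s (the flow pattern h + 1_s − 1_r) must not lower that commuter's cost. The cost function and route names below are illustrative assumptions.

```python
def is_nash_equilibrium(h, routes_by_od, cost):
    """h: dict route -> integer flow; cost(r, h) returns c_r(h)."""
    for w, routes in routes_by_od.items():
        for r in routes:
            if h[r] > 0:
                for s in routes:
                    if s == r:
                        continue
                    shifted = dict(h)
                    shifted[r] -= 1   # one commuter leaves r ...
                    shifted[s] += 1   # ... and joins s: h + 1_s - 1_r
                    if cost(r, h) > cost(s, shifted):
                        return False  # that commuter would be better off on s
    return True

# Two parallel routes whose cost equals their own flow (an illustrative choice):
od = {("o", "d"): ["r1", "r2"]}
cost = lambda r, h: h[r]
print(is_nash_equilibrium({"r1": 2, "r2": 2}, od, cost))  # True: 2 <= 3
print(is_nash_equilibrium({"r1": 4, "r2": 0}, od, cost))  # False: 4 > 1
```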

5 Calculus for the Morning Commute -4 The Important Questions
Existence: When will such an equilibrium exist?
Uniqueness: When will there be exactly one equilibrium?
Stability: When will the equilibrium (or set of equilibria) be stable?
Computation: Can we find equilibria efficiently (using a numerical algorithm)?

6 Calculus for the Morning Commute -5 A Simplifying Assumption
Something a Computer Scientist Should Never Do: Consider the limiting case.
Operationalizing this Simplification: The equilibrium condition:
h_r > 0 ⇒ c_r(h) ≤ c_s(h + 1_s − 1_r) for all s ∈ R_w
becomes:
h_r > 0 ⇒ c_r(h) ≤ c_s(h + ε1_s − ε1_r) for all s ∈ R_w
and we consider the limit as ε → 0.

7 Calculus for the Morning Commute -6 Starting Again
Notation:
Replace Z with R above
Let c_w(h) = min{c_s(h) : s ∈ R_w}
The Equilibrium Model: A flow pattern h ∈ H_D is said to be a continuous network equilibrium iff:
h_r > 0 ⇒ c_r(h) = c_w(h)
for all r ∈ R_w and w ∈ W. The set of such flow patterns is denoted by CNE(c, D).
Interpretation: In equilibrium, the cost on all used routes (connecting an OD pair) must be minimal (and, hence, equal).
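The continuous condition is even simpler to check than the discrete one: every used route must attain the minimal cost c_w(h) for its OD pair. A sketch, with an illustrative two-route cost function and tolerance (both are assumptions, not from the talk):

```python
def is_wardrop_equilibrium(h, routes_by_od, cost, tol=1e-9):
    """h: list/dict of route flows; cost(r, h) returns c_r(h)."""
    for w, routes in routes_by_od.items():
        c_w = min(cost(r, h) for r in routes)   # c_w(h): minimal cost for w
        for r in routes:
            if h[r] > tol and cost(r, h) > c_w + tol:
                return False   # a used route costs more than the minimum
    return True

# Illustrative data: two routes, c_0 = 2 + h_0, c_1 = 4 + h_1, demand D = 6.
od = {("o", "d"): [0, 1]}
cost = lambda r, h: [2 + h[0], 4 + h[1]][r]
print(is_wardrop_equilibrium([4.0, 2.0], od, cost))   # True: both cost 6
print(is_wardrop_equilibrium([6.0, 0.0], od, cost))   # False: 8 > 4
```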

8 Calculus for the Morning Commute -7 A Graphical Approach
The Network: a single origin and destination connected by two routes.
A Simple Case: [figure: the cost curves c_1(h_1) and c_2(h_2) plotted over the flow split, from h_1 = 0 (h_2 = D) at one end to h_1 = D (h_2 = 0) at the other; the equilibrium is at the intersection of the two curves]

9 Calculus for the Morning Commute -8 Some Interesting Cases
No Intersection: [figure: the cost curves c_1(h_1) and c_2(h_2) do not cross on 0 ≤ h_1 ≤ D, so the equilibrium lies at a corner, with all flow on one route]
Discontinuous Costs: [figure: a cost curve with a jump discontinuity, so the curves can fail to intersect even though neither lies entirely above the other]

10 Calculus for the Morning Commute -9 Some Interesting Cases (cont.)
Many Points of Equal Cost: [figure: the cost curves c_1(h_1) and c_2(h_2) coincide over an interval, so there is a continuum of equilibria]
No Intersection: [figure: another configuration in which the cost curves do not cross on 0 ≤ h_1 ≤ D]
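For the two-route picture, an interior equilibrium is the flow split where c_1(h_1) = c_2(D − h_1), and the corner cases correspond to the "no intersection" figures. A bisection sketch (the cost functions are illustrative, and increasing costs are assumed so the cost gap is monotone):

```python
def two_route_equilibrium(c1, c2, D, iters=60):
    """Find h1 in [0, D] with c1(h1) = c2(D - h1), assuming c1 and c2
    are increasing; returns a corner solution if the curves never cross."""
    gap = lambda h1: c1(h1) - c2(D - h1)
    if gap(0.0) >= 0:   # route 1 never cheaper: all flow on route 2
        return 0.0
    if gap(D) <= 0:     # route 2 never cheaper: all flow on route 1
        return D
    lo, hi = 0.0, D     # gap(lo) < 0 < gap(hi): bisect for the crossing
    for _ in range(iters):
        mid = (lo + hi) / 2
        if gap(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Example: c1(x) = 1 + x, c2(x) = 3 + x, D = 10  ->  h1 = 6, both costs 7
print(two_route_equilibrium(lambda x: 1 + x, lambda x: 3 + x, 10.0))
```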

11 Calculus for the Morning Commute -10 Lagrange Multipliers: A Review
The Theorem: Let f : R^n → R and h : R^n → R be appropriately continuous and differentiable functions. Let x ∈ R^n with h(x) = c, and let S denote the level set for h with value c. If f restricted to S has an optimum at x, then there exists a λ ∈ R such that:
∇f(x) − λ∇h(x) = 0
An Interpretation: Let L(x) = f(x) − λh(x) and consider the critical points of L.
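The gradient condition can be checked numerically at a known optimum. The example (maximize f(x, y) = xy on the level set x + y = 4, with optimum at (2, 2) and λ = 2) is an illustrative choice, not from the talk:

```python
def grad(func, x, eps=1e-6):
    """Central-difference gradient of func at x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        g.append((func(xp) - func(xm)) / (2 * eps))
    return g

f = lambda v: v[0] * v[1]      # objective
h = lambda v: v[0] + v[1]      # constraint function, level value c = 4
x_star = [2.0, 2.0]            # the constrained optimum
gf, gh = grad(f, x_star), grad(h, x_star)
lam = gf[0] / gh[0]            # recover lambda from the first coordinate
# grad f(x*) - lambda * grad h(x*) should vanish:
print(all(abs(gf[i] - lam * gh[i]) < 1e-4 for i in range(2)))  # True
```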

12 Calculus for the Morning Commute -11 Karush-Kuhn-Tucker Conditions
Motivation:
L involves problems of the form:
min_x f(x) s.t. h_i(x) = 0 for i = 1, ..., l
KKT deal with problems of the form:
min_x f(x) s.t. h_i(x) = 0 for i = 1, ..., l and g_i(x) ≤ 0 for i = 1, ..., m
The Necessary Conditions: For appropriately continuous and differentiable functions, if x solves the problem above then there exist scalars μ_i and λ_i such that:
∇f(x) + Σ_{i=1}^m μ_i ∇g_i(x) + Σ_{i=1}^l λ_i ∇h_i(x) = 0
μ_i g_i(x) = 0 for i = 1, ..., m
μ_i ≥ 0 for i = 1, ..., m
The Sufficient Conditions: Require appropriate convexity of f, g_i, and h_i.
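The four KKT requirements (stationarity, complementarity, dual and primal feasibility) can be verified at a candidate point. The one-dimensional problem below, min x² subject to 1 − x ≤ 0, is an illustrative choice with known solution x* = 1 and multiplier μ = 2:

```python
x_star = 1.0
mu = 2.0
grad_f = 2 * x_star       # f(x) = x^2, so f'(x) = 2x
grad_g = -1.0             # g(x) = 1 - x, so g'(x) = -1

stationarity = abs(grad_f + mu * grad_g) < 1e-12   # f' + mu * g' = 0
complementarity = abs(mu * (1 - x_star)) < 1e-12   # mu * g(x*) = 0
dual_feasibility = mu >= 0
primal_feasibility = (1 - x_star) <= 0

print(all([stationarity, complementarity,
           dual_feasibility, primal_feasibility]))  # True
```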

13 Calculus for the Morning Commute -12 Back to the Continuous Equilibrium Problem
An Optimization Problem:
min_h V(h)
s.t. Σ_{r ∈ R_w} h_r = D_w for all w ∈ W
h_r ≥ 0 for all r ∈ R
The KKT Conditions:
∂_r V(h) − μ_w − η_r = 0 for all r ∈ R_w and w ∈ W
η_r h_r = 0 for all r ∈ R
η_r ≥ 0 for all r ∈ R

14 Calculus for the Morning Commute -13 Using the KKT Conditions
∂_r V(h) − μ_w − η_r = 0 for all r ∈ R_w and w ∈ W
η_r h_r = 0 for all r ∈ R
η_r ≥ 0 for all r ∈ R
Some Simple Results:
It follows from η_r ≥ 0 and ∂_r V(h) − μ_w − η_r = 0 that ∂_r V(h) ≥ μ_w
It follows from η_r h_r = 0 that h_r > 0 ⇒ η_r = 0
It follows that h_r > 0 ⇒ ∂_r V(h) = μ_w
It Would Be Nice If: ∂_r V(h) = c_r(h) for all r ∈ R (i.e., ∇V(h) = c(h))
Because Then:
μ_w would be c_w(h)
A minimizer of V(h) subject to h ∈ H_D would be an equilibrium

15 Calculus for the Morning Commute -14 In Order to Move Ahead
An Important Question: Given a function c : R_+^{|R|} → R_+^{|R|}, under what conditions does there exist a function V ∈ C¹(R_+^{|R|}) with ∇V = c?
This Question Arises Elsewhere: In physics, V is called a potential function for the gradient vector field c.

16 Calculus for the Morning Commute -15 In Order to Move Ahead (cont.)
One Important Sufficient Condition: Path Independence [Stokes' Theorem]
A Potential Function: V(h) = ∫_0^h c(ω) · dω
A More Easily Verified Sufficient Condition:
Lipschitz continuous costs (i.e., ‖c(x) − c(y)‖ ≤ K‖x − y‖, where ‖·‖ denotes the max norm)
Symmetry (i.e., ∂_r c_s(h) = ∂_s c_r(h) for almost all h where both derivatives exist) [Frobenius' Theorem]
A Usable Formulation, the Rectilinear Path of Integration:
V(h) = Σ_{i=1}^{|R|} ∫_0^{h_i} c_i(h_1, h_2, ..., h_{i−1}, x, 0, ..., 0) dx
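The rectilinear path of integration translates directly into a numerical routine: build V(h) one coordinate at a time, integrating c_i along (h_1, ..., h_{i−1}, x, 0, ..., 0). A sketch with separable (hence trivially symmetric) costs; the midpoint rule, step count, and cost functions are all illustrative assumptions:

```python
def potential(c_list, h, steps=2000):
    """V(h) = sum_i integral_0^{h_i} c_i(h_1,...,h_{i-1}, x, 0,...,0) dx,
    approximated with the midpoint rule. Each c_list[i] takes a full flow
    vector and returns the cost on route i."""
    V = 0.0
    for i in range(len(h)):
        dx = h[i] / steps
        for k in range(steps):
            # point on the rectilinear path: earlier coordinates fixed,
            # current coordinate at the midpoint, later coordinates zero
            point = list(h[:i]) + [(k + 0.5) * dx] + [0.0] * (len(h) - i - 1)
            V += c_list[i](point) * dx
    return V

# Separable example: c_1(h) = 1 + h_1, c_2(h) = 2 + h_2.
c_list = [lambda h: 1 + h[0], lambda h: 2 + h[1]]
h = [3.0, 4.0]
# Closed form: (3 + 9/2) + (8 + 16/2) = 7.5 + 16 = 23.5
print(round(potential(c_list, h), 3))  # 23.5
```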

17 Calculus for the Morning Commute -16 How Does This Help?
Existence: An equilibrium will exist whenever a minimum exists.
Uniqueness: The objective function is, in general, not strictly convex, so equilibrium route flows are not unique. However, if route costs are additive we can use the following function of link flows:
Σ_{i=1}^{|A|} ∫_0^{f_i} t_i(f_1, f_2, ..., f_{i−1}, x, 0, ..., 0) dx
which is strictly convex if the link cost function t is strictly monotone.

18 Calculus for the Morning Commute -17 How Does This Help? (cont.)
Calculation: Consider feasible descent algorithms: we need to calculate an initial feasible solution, and then we need to iteratively find feasible descent directions. Can we?
Stability: We need a behaviorally meaningful adjustment process.
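One classical feasible-descent scheme for this problem is the method of successive averages, a Frank-Wolfe-style sketch (an illustrative choice, not claimed to be the talk's algorithm): start from any feasible h, repeatedly load all demand onto the currently cheapest route (an extreme point of H_D, hence a feasible direction), and average with a diminishing step 1/k.

```python
def msa(costs, D, n_routes, iters=500):
    """Method of successive averages on one OD pair with demand D."""
    h = [D / n_routes] * n_routes            # initial feasible solution
    for k in range(1, iters + 1):
        c = [costs[i](h) for i in range(n_routes)]
        best = c.index(min(c))               # cheapest route right now
        target = [D if i == best else 0.0 for i in range(n_routes)]
        step = 1.0 / k                       # diminishing step size
        h = [(1 - step) * h[i] + step * target[i] for i in range(n_routes)]
    return h

# Illustrative data: c1 = 1 + h1^2, c2 = 1 + h2, c3 = 15 + h3, D = 12,
# whose equilibrium is (3, 9, 0).
costs = [lambda h: 1 + h[0] ** 2, lambda h: 1 + h[1], lambda h: 15 + h[2]]
h = msa(costs, 12.0, 3)
print([round(x, 1) for x in h])   # approaches (3, 9, 0)
```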

19 Calculus for the Morning Commute -18 A Behavioral Adjustment Mechanism
Notation:
ḣ_rs denotes the switching rate from r to s (where r, s ∈ R_w), and ḣ_r = Σ_{s ∈ R_w \ r} (ḣ_sr − ḣ_rs).
ḣ_rs is assumed to be determined by some route switching process a_rs(h) ≥ 0 with an associated adjustment operator a_r(h) = Σ_{s ∈ R_w \ r} (a_sr − a_rs).
A function p : R_+ → R^{|R|} satisfying ṗ(t) = a[p(t)], t ∈ R_+, is called an adjustment path for a.
Behavioral Assumptions: A c-adjustment operator a must satisfy, for all D and r, s ∈ R_w:
(Rationality) a_rs(h) > 0 ⇒ c_r(h) > c_s(h)
(Feasibility) p(0) ∈ H_D ⇒ p ⊆ H_D
(Persistency) a(h) = 0 ⇔ h ∈ CNE(c, D)
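One rational switching process can be simulated directly. The sketch below uses proportional switching (flow from r to s at rate h_r [c_r(h) − c_s(h)]_+, in the spirit of Smith's dynamics; this particular process, the Euler integration, and the step sizes are illustrative assumptions). Feasibility holds automatically because every switch moves flow between routes of the same OD pair.

```python
def adjust(costs, h, dt=0.01, steps=5000):
    """Euler-integrate hdot = a(h) for a proportional switching process."""
    n = len(h)
    h = list(h)
    for _ in range(steps):
        c = [costs[i](h) for i in range(n)]
        hdot = [0.0] * n
        for r in range(n):
            for s in range(n):
                if r == s:
                    continue
                rate = h[r] * max(0.0, c[r] - c[s])  # switching rate r -> s
                hdot[r] -= rate
                hdot[s] += rate
        h = [h[i] + dt * hdot[i] for i in range(n)]
    return h

# Illustrative data with equilibrium (3, 9, 0):
costs = [lambda h: 1 + h[0] ** 2, lambda h: 1 + h[1], lambda h: 15 + h[2]]
print([round(x, 2) for x in adjust(costs, [4.0, 4.0, 4.0])])
```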

20 Calculus for the Morning Commute -19 Adjustments and Stability
Notation:
The Hausdorff distance from a point h ∈ R^{|R|} to a nonempty set S ⊆ R^{|R|} is given by ρ(h, S) = inf{‖h − x‖ : x ∈ S}.
The neighborhood of S in H_D is given by H_D(S, ε) = {h ∈ H_D : ρ(h, S) ≤ ε}.
The set of minima is given by MIN(V_c, S) = {h ∈ S : V_c(h) = min_{g ∈ S} V_c(g)}.
Definitions:
A set S ⊆ H_D is locally V_c-minimal iff there exists some ε > 0 such that S = MIN[V_c, H_D(S, ε)], and S is isolated iff S = KKT(V_c, H_D) ∩ H_D(S, ε).
A set S ⊆ H_D is asymptotically a-stable iff there exists some ε > 0 such that p(0) ∈ H_D(S, ε) ⇒ lim_{t→∞} ρ[p(t), S] = 0, and for each ε > 0 there is some α_ε ∈ (0, ε] such that p(0) ∈ H_D(S, α_ε) ⇒ p(R_+) ⊆ H_D(S, ε).

21 Calculus for the Morning Commute -20 A Stability Property
Theorem: If c is a gradient cost structure with cost potential V_c, then for all c-adjustment processes a and demand patterns D, each isolated, locally V_c-minimal set S ⊆ H_D is asymptotically a-stable.
Sketch of the Proof:
1. Show that V_c is a strict Liapunov function for every c-adjustment process a (i.e., that V̇_c(h) ≤ 0 for all h ∈ H_D and that V̇_c(h) = 0 ⇔ a(h) = 0).
2. Show that if there exists a Liapunov function V_c for a on H_D, then every locally V_c-minimal set is a-stable.
3. Show that all c-adjustment processes a for gradient cost structures eventually converge to network equilibria.

22 Calculus for the Morning Commute -21 An Interesting Adjustment Process
Notation:
For any x ∈ R, let x_+ = max{0, x}
δ_0(0) = 1 and δ_0(x) = 0 for all x > 0
The Process:
a*_rs(h) = h_r [c_r(h) − c_s(h)]_+ δ_0[c_s(h) − c_w(h)]
a*_r(h) = Σ_{s ∈ R_w \ r} [a*_sr(h) − a*_rs(h)]
What's Interesting About It? It involves discontinuities at every point where the set of minimal-cost routes in R_w changes (for any w ∈ W).

23 Calculus for the Morning Commute -22 A Solution Concept for Discontinuous Dif. Eqs.
Notation:
Given x ∈ R^n and a nonempty set S ⊆ R^n, the Hausdorff distance from x to S is defined as ρ(x, S) = inf{‖x − y‖ : y ∈ S}
For any ε > 0, closed set X ⊆ R^n, and nonempty set S ⊆ X, the ε-neighborhood of S in X is defined as X(S, ε) = {x ∈ X : ρ(x, S) ≤ ε}
conv(S) denotes the convex hull of the set S ⊆ R^n
cconv(S) denotes the closure of the convex hull of the set S ⊆ R^n
Getting Started:
If the image set of a* is denoted by a*[H_D(h, ε)] = {a*(g) : g ∈ H_D(h, ε)}, then cconv(a*[H_D(h, ε)]) contains all of the convex combinations of vectors in a*[H_D(h, ε)]
∩_{ε>0} cconv(a*[H_D(h, ε)]) must contain the limits of such convex combinations as the size of the neighborhood goes to zero

24 Calculus for the Morning Commute -23 A Solution Concept for Disc. Dif. Eqs. (cont.)
A Reminder: A function f : [a, b] → R^n is said to be absolutely continuous on [a, b] if, given ε > 0, there exists a δ > 0 such that Σ_{i=1}^n ‖f(y_i) − f(z_i)‖ < ε for every finite collection {(y_i, z_i) : i = 1, ..., n} of nonoverlapping intervals with Σ_{i=1}^n |y_i − z_i| < δ
Krasovskij Solutions to a*: An absolutely continuous function p : R_+ → R_+^{|R|} is a solution to a* iff, for almost all t ∈ R_+:
ṗ(t) ∈ ∩_{ε>0} cconv(a*[H_D(p(t), ε)])
An Observation: Over regions of H_D where a* is continuous, this reduces to ṗ(t) = a*[p(t)] (i.e., the classical Carathéodory solution)

25 Calculus for the Morning Commute -24 An Example
One OD Pair with Three Routes:
D = 12
c_1(h) = 1 + h_1²
c_2(h) = 1 + h_2
c_3(h) = 15 + h_3
Solving: The unique equilibrium is h* = (3, 9, 0)
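The claimed equilibrium is easy to verify against the continuous definition: used routes share the minimal cost, and the unused route is no cheaper. A quick check in code:

```python
h = (3.0, 9.0, 0.0)
c = (1 + h[0] ** 2, 1 + h[1], 15 + h[2])      # route costs at h*
assert sum(h) == 12                            # feasibility: flows sum to D
c_min = min(c)
# every used route attains the minimal cost:
assert all(abs(c[i] - c_min) < 1e-12 for i in range(3) if h[i] > 0)
assert c[2] >= c_min                           # unused route 3 is not cheaper
print(c)   # (10.0, 10.0, 15.0)
```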

26 Calculus for the Morning Commute -25 An Example
[figure: the feasible flows in the (h_1, h_2)-plane with D = 12, the curve C, and a starting point p(0) off the curve, with candidate movement directions labeled 1, 2, 3]
C is the set of points with c_1(h) = c_2(h)
At p(0), commuters on route 3 will switch to 1 and 2, so p moves in direction 1
This movement collapses the set of minimum-cost routes from {1, 2} to {2}, so commuters instantly stop switching to 1, moving p in direction 2
This shifts the set of minimum-cost routes to {1}, so commuters instantly stop switching to 2 and start switching to 1, moving p in direction 3

27 Calculus for the Morning Commute -26 An Example (cont.)
Conclusion:
No direction of movement can persist for more than an instant (this is called a chattering regime)
Commuters are continuously switching to 1 or 2 (or both), so p is moving up and to the right
Interpretation: The cone of feasible direction vectors corresponds to ∩_{ε>0} cconv(a*[H_D(p(0), ε)]) (i.e., the Krasovskij solutions)
Observations:
The only absolutely continuous adjustment path starting at a point on C is the path with trajectory along C
ṗ(t) must be almost everywhere tangent to C
a*[p(t)] is always parallel to direction 1 for any point p(t) ∈ C
So ṗ(t) is nowhere equal to a*[p(t)]

28 Calculus for the Morning Commute -27 Lower-Semicontinuous Costs
Definition of Equilibrium:
The classical definition (Wardrop) doesn't work
Other definitions (Dafermos, Heydecker) allow collaboration
The most obvious alternative (Nash) doesn't work
Modeling "Smallness":
liminf_{ε→0} c_r(h + ε1_r − ε1_s) = lim_{ε→0} inf{c_r(h + α1_r − α1_s) : 0 < α < min(ε, h_s)}
New Definition: h is a user equilibrium iff:
h_r > 0 ⇒ c_r(h) ≤ liminf_{ε→0} c_s(h + ε1_s − ε1_r)

29 Calculus for the Morning Commute -28 Lower-Semicontinuous Costs (cont.)
One Result:
If c is lower semicontinuous then every Wardrop eqm is a user eqm
If c is upper semicontinuous then every user eqm is a Wardrop eqm
Another Result: If c is lower semicontinuous, and flow shifts between routes only create discontinuities on those links where the flow changes, then a user equilibrium exists

30 Calculus for the Morning Commute -29 SRD Equilibrium
The Issue: Drivers simultaneously choose both a path p and a departure time t ∈ [0, T].
Fluid Approximation: The departure rate on path p at time t is denoted by h_p(t) ∈ R_+

31 Calculus for the Morning Commute -30 SRD Equilibrium (cont.)
Notation:
h(t) = (h_r(t) : r ∈ R)
H_D = { h : Σ_{r ∈ R_w} ∫_0^T h_r(t) dν(t) = D_w, w ∈ W } (where ν is the Lebesgue measure on [0, T]).
Assumptions:
Each departure rate pattern h gives rise to a (time-varying) traffic pattern x(t) = (x_a(t) : a ∈ A).
The relationship between h and x is driven by d_a(t), the time needed to traverse arc a when entered at time t.
Link travel times are determined by the link occupancies at the time the link is entered (i.e., the number of vehicles ahead of you on a link when you enter).

32 Calculus for the Morning Commute -31 SRD Equilibrium (cont.)
The time needed to traverse route r when departing from the origin at time t is given by:
d_r(t, h) = d_{a_1^r}[x_{a_1^r}(t)] + d_{a_2^r}[x_{a_2^r}(τ_{a_1^r}(t))] + ... + d_{a_{m(r)}^r}[x_{a_{m(r)}^r}(τ_{a_{m(r)−1}^r}(t))]
where
τ_{a_1^r}(t) = t + d_{a_1^r}[x_{a_1^r}(t)] for all r ∈ R
τ_{a_i^r}(t) = τ_{a_{i−1}^r}(t) + d_{a_i^r}[x_{a_i^r}(τ_{a_{i−1}^r}(t))] for all r ∈ R, i ∈ [2, m(r)].
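The recursion is just a forward pass along the route: enter each link at the exit time of the previous one. A sketch in which each link's delay is a function of its entry time directly (standing in for d_a[x_a(t)]); the delay functions and the two-link route are illustrative assumptions:

```python
def path_travel_time(link_delays, depart):
    """link_delays: the route's link delay functions t -> delay, in order.
    Returns the total route travel time d_r for the given departure time."""
    tau = depart                  # entry time into the next link
    for d_a in link_delays:
        tau += d_a(tau)           # tau_{a_i} = tau_{a_{i-1}} + d_{a_i}(tau_{a_{i-1}})
    return tau - depart

# Two links: a constant 5-minute link, then a link whose delay grows with
# entry time (e.g. a queue building up): d(t) = 2 + 0.1 * t.
route = [lambda t: 5.0, lambda t: 2.0 + 0.1 * t]
print(path_travel_time(route, depart=10.0))   # 5 + (2 + 0.1 * 15) = 8.5
```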

33 Calculus for the Morning Commute -32 SRD Equilibrium (cont.)
Travelers may arrive early or late but incur a schedule cost:
α is the dollar penalty for early arrival
β is the dollar penalty for late arrival
[T−, T+] is the set of equally acceptable arrival times
The schedule cost is given by:
Φ_r(t, h) = α[T− − (t + d_r(t, h))] if t + d_r(t, h) < T−
Φ_r(t, h) = 0 if T− ≤ t + d_r(t, h) ≤ T+
Φ_r(t, h) = β[(t + d_r(t, h)) − T+] if t + d_r(t, h) > T+
Letting γ denote the value of travel time, total cost is given by:
c_r(t, h) = γ d_r(t, h) + Φ_r(t, h) for all r ∈ R
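The piecewise schedule cost and the generalized cost translate into a few lines of code. All parameter values in the example are illustrative, not from the talk:

```python
def schedule_cost(arrival, T_minus, T_plus, alpha, beta):
    """Phi: early arrivals pay alpha per unit time, late arrivals pay beta,
    arrivals inside [T_minus, T_plus] pay nothing."""
    if arrival < T_minus:
        return alpha * (T_minus - arrival)   # early
    if arrival > T_plus:
        return beta * (arrival - T_plus)     # late
    return 0.0                               # on time

def total_cost(depart, travel_time, T_minus, T_plus, alpha, beta, gamma):
    """c_r(t, h) = gamma * d_r + Phi, with gamma the value of travel time."""
    arrival = depart + travel_time
    return gamma * travel_time + schedule_cost(arrival, T_minus, T_plus,
                                               alpha, beta)

# Leave at t = 480 min, 30-minute trip, acceptable window [500, 520],
# alpha = 0.5, beta = 2.0 (being late hurts more), gamma = 1.0:
print(total_cost(480, 30, 500, 520, 0.5, 2.0, 1.0))   # 30 + 0 = 30.0
```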

34 Calculus for the Morning Commute -33 SRD Equilibrium (cont.)
Letting μ_r(h) = ess inf{c_r(t, h) : t ∈ [0, T]}, the relevant lower bound on achievable costs for a w-commuter is given by μ_w(h) = min{μ_r(h) : r ∈ R_w}, and we can define an equilibrium as follows:
Definition 1 (SRD Equilibrium) A departure rate pattern h ∈ H_D is said to be a simultaneous route and departure-time choice equilibrium (SRD equilibrium) for D if and only if (iff) h satisfies the following condition for all w ∈ W and r ∈ R_w:
h_r(t) > 0 ⇒ c_r(t, h) = μ_w(h) for ν-almost all t ∈ [0, T]

35 Calculus for the Morning Commute -34 SRD Equilibrium (cont.)
It is possible to formulate the SRD equilibrium problem as an infinite-dimensional variational inequality:
Theorem 1 A departure rate pattern ĥ ∈ H_D is an SRD equilibrium for c if and only if
Σ_{r ∈ R} ∫_0^T c_r(t, ĥ)[h_r(t) − ĥ_r(t)] dν(t) ≥ 0 (1)
for all h ∈ H_D.
This result is quite general, and does not depend on the form of d_a or c_r.
This result allows us to consider questions of existence and solution algorithms.

36 Calculus for the Morning Commute -35 The Big Open Questions
SRD Equilibrium:
Existence and Uniqueness
Efficient Algorithms
Information Provision:
How will users respond to traffic forecasts?
How do we incorporate their responses into our forecasts?


More information

SECTION C: CONTINUOUS OPTIMISATION LECTURE 9: FIRST ORDER OPTIMALITY CONDITIONS FOR CONSTRAINED NONLINEAR PROGRAMMING

SECTION C: CONTINUOUS OPTIMISATION LECTURE 9: FIRST ORDER OPTIMALITY CONDITIONS FOR CONSTRAINED NONLINEAR PROGRAMMING Nf SECTION C: CONTINUOUS OPTIMISATION LECTURE 9: FIRST ORDER OPTIMALITY CONDITIONS FOR CONSTRAINED NONLINEAR PROGRAMMING f(x R m g HONOUR SCHOOL OF MATHEMATICS, OXFORD UNIVERSITY HILARY TERM 5, DR RAPHAEL

More information

Written Examination

Written Examination Division of Scientific Computing Department of Information Technology Uppsala University Optimization Written Examination 202-2-20 Time: 4:00-9:00 Allowed Tools: Pocket Calculator, one A4 paper with notes

More information

Convex Optimization Notes

Convex Optimization Notes Convex Optimization Notes Jonathan Siegel January 2017 1 Convex Analysis This section is devoted to the study of convex functions f : B R {+ } and convex sets U B, for B a Banach space. The case of B =

More information

An introduction to Mathematical Theory of Control

An introduction to Mathematical Theory of Control An introduction to Mathematical Theory of Control Vasile Staicu University of Aveiro UNICA, May 2018 Vasile Staicu (University of Aveiro) An introduction to Mathematical Theory of Control UNICA, May 2018

More information

1 Functions and Graphs

1 Functions and Graphs 1 Functions and Graphs 1.1 Functions Cartesian Coordinate System A Cartesian or rectangular coordinate system is formed by the intersection of a horizontal real number line, usually called the x axis,

More information

Stochastic Programming Math Review and MultiPeriod Models

Stochastic Programming Math Review and MultiPeriod Models IE 495 Lecture 5 Stochastic Programming Math Review and MultiPeriod Models Prof. Jeff Linderoth January 27, 2003 January 27, 2003 Stochastic Programming Lecture 5 Slide 1 Outline Homework questions? I

More information

Convex Optimization. Newton s method. ENSAE: Optimisation 1/44

Convex Optimization. Newton s method. ENSAE: Optimisation 1/44 Convex Optimization Newton s method ENSAE: Optimisation 1/44 Unconstrained minimization minimize f(x) f convex, twice continuously differentiable (hence dom f open) we assume optimal value p = inf x f(x)

More information

DUALIZATION OF SUBGRADIENT CONDITIONS FOR OPTIMALITY

DUALIZATION OF SUBGRADIENT CONDITIONS FOR OPTIMALITY DUALIZATION OF SUBGRADIENT CONDITIONS FOR OPTIMALITY R. T. Rockafellar* Abstract. A basic relationship is derived between generalized subgradients of a given function, possibly nonsmooth and nonconvex,

More information

Convex Optimization Lecture 6: KKT Conditions, and applications

Convex Optimization Lecture 6: KKT Conditions, and applications Convex Optimization Lecture 6: KKT Conditions, and applications Dr. Michel Baes, IFOR / ETH Zürich Quick recall of last week s lecture Various aspects of convexity: The set of minimizers is convex. Convex

More information

Locally convex spaces, the hyperplane separation theorem, and the Krein-Milman theorem

Locally convex spaces, the hyperplane separation theorem, and the Krein-Milman theorem 56 Chapter 7 Locally convex spaces, the hyperplane separation theorem, and the Krein-Milman theorem Recall that C(X) is not a normed linear space when X is not compact. On the other hand we could use semi

More information

Departure time choice equilibrium problem with partial implementation of congestion pricing

Departure time choice equilibrium problem with partial implementation of congestion pricing Departure time choice equilibrium problem with partial implementation of congestion pricing Tokyo Institute of Technology Postdoctoral researcher Katsuya Sakai 1 Contents 1. Introduction 2. Method/Tool

More information

Existence of minimizers

Existence of minimizers Existence of imizers We have just talked a lot about how to find the imizer of an unconstrained convex optimization problem. We have not talked too much, at least not in concrete mathematical terms, about

More information

Date: July 5, Contents

Date: July 5, Contents 2 Lagrange Multipliers Date: July 5, 2001 Contents 2.1. Introduction to Lagrange Multipliers......... p. 2 2.2. Enhanced Fritz John Optimality Conditions...... p. 14 2.3. Informative Lagrange Multipliers...........

More information

IE 5531: Engineering Optimization I

IE 5531: Engineering Optimization I IE 5531: Engineering Optimization I Lecture 12: Nonlinear optimization, continued Prof. John Gunnar Carlsson October 20, 2010 Prof. John Gunnar Carlsson IE 5531: Engineering Optimization I October 20,

More information

Division of the Humanities and Social Sciences. Supergradients. KC Border Fall 2001 v ::15.45

Division of the Humanities and Social Sciences. Supergradients. KC Border Fall 2001 v ::15.45 Division of the Humanities and Social Sciences Supergradients KC Border Fall 2001 1 The supergradient of a concave function There is a useful way to characterize the concavity of differentiable functions.

More information

CO 250 Final Exam Guide

CO 250 Final Exam Guide Spring 2017 CO 250 Final Exam Guide TABLE OF CONTENTS richardwu.ca CO 250 Final Exam Guide Introduction to Optimization Kanstantsin Pashkovich Spring 2017 University of Waterloo Last Revision: March 4,

More information

Convexity in R n. The following lemma will be needed in a while. Lemma 1 Let x E, u R n. If τ I(x, u), τ 0, define. f(x + τu) f(x). τ.

Convexity in R n. The following lemma will be needed in a while. Lemma 1 Let x E, u R n. If τ I(x, u), τ 0, define. f(x + τu) f(x). τ. Convexity in R n Let E be a convex subset of R n. A function f : E (, ] is convex iff f(tx + (1 t)y) (1 t)f(x) + tf(y) x, y E, t [0, 1]. A similar definition holds in any vector space. A topology is needed

More information

CSCI : Optimization and Control of Networks. Review on Convex Optimization

CSCI : Optimization and Control of Networks. Review on Convex Optimization CSCI7000-016: Optimization and Control of Networks Review on Convex Optimization 1 Convex set S R n is convex if x,y S, λ,µ 0, λ+µ = 1 λx+µy S geometrically: x,y S line segment through x,y S examples (one

More information

Constrained Optimization

Constrained Optimization 1 / 22 Constrained Optimization ME598/494 Lecture Max Yi Ren Department of Mechanical Engineering, Arizona State University March 30, 2015 2 / 22 1. Equality constraints only 1.1 Reduced gradient 1.2 Lagrange

More information

Inequality Constraints

Inequality Constraints Chapter 2 Inequality Constraints 2.1 Optimality Conditions Early in multivariate calculus we learn the significance of differentiability in finding minimizers. In this section we begin our study of the

More information

Conservation laws and some applications to traffic flows

Conservation laws and some applications to traffic flows Conservation laws and some applications to traffic flows Khai T. Nguyen Department of Mathematics, Penn State University ktn2@psu.edu 46th Annual John H. Barrett Memorial Lectures May 16 18, 2016 Khai

More information

A SHIFTED PRIMAL-DUAL INTERIOR METHOD FOR NONLINEAR OPTIMIZATION

A SHIFTED PRIMAL-DUAL INTERIOR METHOD FOR NONLINEAR OPTIMIZATION A SHIFTED RIMAL-DUAL INTERIOR METHOD FOR NONLINEAR OTIMIZATION hilip E. Gill Vyacheslav Kungurtsev Daniel. Robinson UCSD Center for Computational Mathematics Technical Report CCoM-18-1 February 1, 2018

More information

Convex functions, subdifferentials, and the L.a.s.s.o.

Convex functions, subdifferentials, and the L.a.s.s.o. Convex functions, subdifferentials, and the L.a.s.s.o. MATH 697 AM:ST September 26, 2017 Warmup Let us consider the absolute value function in R p u(x) = (x, x) = x 2 1 +... + x2 p Evidently, u is differentiable

More information

A Brief Review on Convex Optimization

A Brief Review on Convex Optimization A Brief Review on Convex Optimization 1 Convex set S R n is convex if x,y S, λ,µ 0, λ+µ = 1 λx+µy S geometrically: x,y S line segment through x,y S examples (one convex, two nonconvex sets): A Brief Review

More information

September Math Course: First Order Derivative

September Math Course: First Order Derivative September Math Course: First Order Derivative Arina Nikandrova Functions Function y = f (x), where x is either be a scalar or a vector of several variables (x,..., x n ), can be thought of as a rule which

More information

Lecture 5: Gradient Descent. 5.1 Unconstrained minimization problems and Gradient descent

Lecture 5: Gradient Descent. 5.1 Unconstrained minimization problems and Gradient descent 10-725/36-725: Convex Optimization Spring 2015 Lecturer: Ryan Tibshirani Lecture 5: Gradient Descent Scribes: Loc Do,2,3 Disclaimer: These notes have not been subjected to the usual scrutiny reserved for

More information

Optimization and Optimal Control in Banach Spaces

Optimization and Optimal Control in Banach Spaces Optimization and Optimal Control in Banach Spaces Bernhard Schmitzer October 19, 2017 1 Convex non-smooth optimization with proximal operators Remark 1.1 (Motivation). Convex optimization: easier to solve,

More information

Kaisa Joki Adil M. Bagirov Napsu Karmitsa Marko M. Mäkelä. New Proximal Bundle Method for Nonsmooth DC Optimization

Kaisa Joki Adil M. Bagirov Napsu Karmitsa Marko M. Mäkelä. New Proximal Bundle Method for Nonsmooth DC Optimization Kaisa Joki Adil M. Bagirov Napsu Karmitsa Marko M. Mäkelä New Proximal Bundle Method for Nonsmooth DC Optimization TUCS Technical Report No 1130, February 2015 New Proximal Bundle Method for Nonsmooth

More information

Intersection Models and Nash Equilibria for Traffic Flow on Networks

Intersection Models and Nash Equilibria for Traffic Flow on Networks Intersection Models and Nash Equilibria for Traffic Flow on Networks Alberto Bressan Department of Mathematics, Penn State University bressan@math.psu.edu (Los Angeles, November 2015) Alberto Bressan (Penn

More information

Introduction to Machine Learning Prof. Sudeshna Sarkar Department of Computer Science and Engineering Indian Institute of Technology, Kharagpur

Introduction to Machine Learning Prof. Sudeshna Sarkar Department of Computer Science and Engineering Indian Institute of Technology, Kharagpur Introduction to Machine Learning Prof. Sudeshna Sarkar Department of Computer Science and Engineering Indian Institute of Technology, Kharagpur Module - 5 Lecture - 22 SVM: The Dual Formulation Good morning.

More information

Motivation. Lecture 2 Topics from Optimization and Duality. network utility maximization (NUM) problem:

Motivation. Lecture 2 Topics from Optimization and Duality. network utility maximization (NUM) problem: CDS270 Maryam Fazel Lecture 2 Topics from Optimization and Duality Motivation network utility maximization (NUM) problem: consider a network with S sources (users), each sending one flow at rate x s, through

More information

Math 273a: Optimization Convex Conjugacy

Math 273a: Optimization Convex Conjugacy Math 273a: Optimization Convex Conjugacy Instructor: Wotao Yin Department of Mathematics, UCLA Fall 2015 online discussions on piazza.com Convex conjugate (the Legendre transform) Let f be a closed proper

More information

Lecture 15 Newton Method and Self-Concordance. October 23, 2008

Lecture 15 Newton Method and Self-Concordance. October 23, 2008 Newton Method and Self-Concordance October 23, 2008 Outline Lecture 15 Self-concordance Notion Self-concordant Functions Operations Preserving Self-concordance Properties of Self-concordant Functions Implications

More information

Mathematical optimization

Mathematical optimization Optimization Mathematical optimization Determine the best solutions to certain mathematically defined problems that are under constrained determine optimality criteria determine the convergence of the

More information

Assignment 1: From the Definition of Convexity to Helley Theorem

Assignment 1: From the Definition of Convexity to Helley Theorem Assignment 1: From the Definition of Convexity to Helley Theorem Exercise 1 Mark in the following list the sets which are convex: 1. {x R 2 : x 1 + i 2 x 2 1, i = 1,..., 10} 2. {x R 2 : x 2 1 + 2ix 1x

More information

Convex Optimization. Prof. Nati Srebro. Lecture 12: Infeasible-Start Newton s Method Interior Point Methods

Convex Optimization. Prof. Nati Srebro. Lecture 12: Infeasible-Start Newton s Method Interior Point Methods Convex Optimization Prof. Nati Srebro Lecture 12: Infeasible-Start Newton s Method Interior Point Methods Equality Constrained Optimization f 0 (x) s. t. A R p n, b R p Using access to: 2 nd order oracle

More information

Algorithms for Constrained Optimization

Algorithms for Constrained Optimization 1 / 42 Algorithms for Constrained Optimization ME598/494 Lecture Max Yi Ren Department of Mechanical Engineering, Arizona State University April 19, 2015 2 / 42 Outline 1. Convergence 2. Sequential quadratic

More information

Lagrangian road pricing

Lagrangian road pricing Lagrangian road pricing Vianney Boeuf 1, Sébastien Blandin 2 1 École polytechnique Paristech, France 2 IBM Research Collaboratory, Singapore vianney.boeuf@polytechnique.edu, sblandin@sg.ibm.com Keywords:

More information

Convex Analysis and Optimization Chapter 2 Solutions

Convex Analysis and Optimization Chapter 2 Solutions Convex Analysis and Optimization Chapter 2 Solutions Dimitri P. Bertsekas with Angelia Nedić and Asuman E. Ozdaglar Massachusetts Institute of Technology Athena Scientific, Belmont, Massachusetts http://www.athenasc.com

More information

Outline. Roadmap for the NPP segment: 1 Preliminaries: role of convexity. 2 Existence of a solution

Outline. Roadmap for the NPP segment: 1 Preliminaries: role of convexity. 2 Existence of a solution Outline Roadmap for the NPP segment: 1 Preliminaries: role of convexity 2 Existence of a solution 3 Necessary conditions for a solution: inequality constraints 4 The constraint qualification 5 The Lagrangian

More information

Analysis Finite and Infinite Sets The Real Numbers The Cantor Set

Analysis Finite and Infinite Sets The Real Numbers The Cantor Set Analysis Finite and Infinite Sets Definition. An initial segment is {n N n n 0 }. Definition. A finite set can be put into one-to-one correspondence with an initial segment. The empty set is also considered

More information

Descent methods. min x. f(x)

Descent methods. min x. f(x) Gradient Descent Descent methods min x f(x) 5 / 34 Descent methods min x f(x) x k x k+1... x f(x ) = 0 5 / 34 Gradient methods Unconstrained optimization min f(x) x R n. 6 / 34 Gradient methods Unconstrained

More information

Contents Ordered Fields... 2 Ordered sets and fields... 2 Construction of the Reals 1: Dedekind Cuts... 2 Metric Spaces... 3

Contents Ordered Fields... 2 Ordered sets and fields... 2 Construction of the Reals 1: Dedekind Cuts... 2 Metric Spaces... 3 Analysis Math Notes Study Guide Real Analysis Contents Ordered Fields 2 Ordered sets and fields 2 Construction of the Reals 1: Dedekind Cuts 2 Metric Spaces 3 Metric Spaces 3 Definitions 4 Separability

More information

CONSTRAINT QUALIFICATIONS, LAGRANGIAN DUALITY & SADDLE POINT OPTIMALITY CONDITIONS

CONSTRAINT QUALIFICATIONS, LAGRANGIAN DUALITY & SADDLE POINT OPTIMALITY CONDITIONS CONSTRAINT QUALIFICATIONS, LAGRANGIAN DUALITY & SADDLE POINT OPTIMALITY CONDITIONS A Dissertation Submitted For The Award of the Degree of Master of Philosophy in Mathematics Neelam Patel School of Mathematics

More information

Algorithms for constrained local optimization

Algorithms for constrained local optimization Algorithms for constrained local optimization Fabio Schoen 2008 http://gol.dsi.unifi.it/users/schoen Algorithms for constrained local optimization p. Feasible direction methods Algorithms for constrained

More information

Lecture 7: September 17

Lecture 7: September 17 10-725: Optimization Fall 2013 Lecture 7: September 17 Lecturer: Ryan Tibshirani Scribes: Serim Park,Yiming Gu 7.1 Recap. The drawbacks of Gradient Methods are: (1) requires f is differentiable; (2) relatively

More information

Subgradients. subgradients and quasigradients. subgradient calculus. optimality conditions via subgradients. directional derivatives

Subgradients. subgradients and quasigradients. subgradient calculus. optimality conditions via subgradients. directional derivatives Subgradients subgradients and quasigradients subgradient calculus optimality conditions via subgradients directional derivatives Prof. S. Boyd, EE392o, Stanford University Basic inequality recall basic

More information