Convex Optimization: Conjugate, Subdifferential, Proximation
1 Lecture Notes, HCI, Chapter 6
Convex Optimization: Conjugate, Subdifferential, Proximation
Bastian Goldlücke, Computer Vision Group, Technical University of Munich
2 Overview
1 Conjugate functionals; Subdifferential calculus
2 Moreau's theorem (Moreau's theorem, Fixed points, Subgradient descent)
3 Summary
4 Affine functions
In this tutorial, we interpret elements of the dual space in a very geometric way, as the slopes of affine functions.
Definition. Let ϕ ∈ V* and c ∈ R. Then an affine function on V is given by h_{ϕ,c} : v ↦ ⟨v, ϕ⟩ − c. We call ϕ the slope and c the intercept of h_{ϕ,c}.
[Figure: graph of h_{ϕ,c} over V, with the direction [ϕ, −1] and the intercept −c on the R-axis marked.]
5 Affine functions
We would like to find the largest affine function below f. For this, consider for each x ∈ V the affine function which passes through (x, f(x)):
h_{ϕ,c}(x) = f(x)  ⟺  ⟨x, ϕ⟩ − c = f(x)  ⟺  c = ⟨x, ϕ⟩ − f(x).
[Figure: epi(f) with the affine function h_{ϕ, ⟨x,ϕ⟩ − f(x)} touching the graph of f at the point (x, f(x)).]
To get the largest affine function below f, we have to pass to the supremum. The intercept of this function is called the conjugate functional of f.
6 Conjugate functionals
Definition. Let f ∈ conv(V). Then the conjugate functional f* : V* → R ∪ {∞} is defined as
f*(ϕ) := sup_{x ∈ V} [⟨x, ϕ⟩ − f(x)].
An immediate consequence of the definition is Fenchel's inequality: let f ∈ conv(V). Then for all x ∈ V and ϕ ∈ V*,
⟨x, ϕ⟩ ≤ f(x) + f*(ϕ).
Equality holds if and only if ϕ belongs to the subdifferential ∂f(x).
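The definition can be checked numerically. Below is a small sketch of my own (not part of the slides): for f(x) = x²/2 on V = R, the conjugate is known in closed form, f*(ϕ) = ϕ²/2, and a grid approximation of the supremum recovers it; Fenchel's inequality is then spot-checked on a few points.

```python
def conjugate(f, phi, grid):
    """Grid approximation of f*(phi) = sup_x [x*phi - f(x)]."""
    return max(x * phi - f(x) for x in grid)

f = lambda x: 0.5 * x * x
grid = [i / 100.0 for i in range(-500, 501)]   # x in [-5, 5], step 0.01

# The supremum is attained at x = phi, so the grid value matches phi^2 / 2.
for phi in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    approx = conjugate(f, phi, grid)
    exact = 0.5 * phi * phi
    assert abs(approx - exact) < 1e-3, (phi, approx, exact)

# Fenchel's inequality: <x, phi> <= f(x) + f*(phi) for all x, phi.
for x in [-3.0, 0.5, 2.0]:
    for phi in [-1.0, 0.0, 2.5]:
        assert x * phi <= f(x) + 0.5 * phi * phi + 1e-12
```

The grid must be wide enough to contain the maximizer x = ϕ; outside that range the approximation would undershoot the true supremum.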
7 Conjugate functionals: geometric interpretation of the conjugate
[Figure: epi(f) supported from below by the affine function h_{ϕ, f*(ϕ)} with direction [ϕ, −1]; the largest affine function with slope ϕ below f intersects the R-axis at −f*(ϕ).]
8 Conjugate functionals: example, the conjugate of an indicator function
Let K ⊂ V be convex, and δ_K be its indicator function. Then
δ_K*(ϕ) = sup_{x ∈ V} [⟨x, ϕ⟩ − δ_K(x)] = sup_{x ∈ K} ⟨x, ϕ⟩ = σ_K(ϕ),
i.e. the conjugate of an indicator function is the support functional.
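As a numerical illustration (the interval K and the helper name are my own choice, not from the slides): for K = [−1, 2] ⊂ R, the conjugate of δ_K approximated on a grid of K agrees with the support functional, which for an interval is attained at an endpoint.

```python
def support(K_grid, phi):
    """sigma_K(phi) = sup_{x in K} <x, phi>, approximated on a grid of K."""
    return max(x * phi for x in K_grid)

K_grid = [i / 100.0 for i in range(-100, 201)]  # K = [-1, 2]

# Closed form: the supremum sits at an endpoint of the interval.
for phi in [-3.0, -1.0, 0.0, 0.5, 4.0]:
    exact = 2.0 * phi if phi >= 0 else -1.0 * phi
    assert abs(support(K_grid, phi) - exact) < 1e-9
```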
9 Properties of the conjugate functional
If f is convex, then f* has the following remarkable property. The proof is not difficult (exercise).
Theorem. Let f ∈ conv(V). Then f* is closed and convex.
This will ultimately lead to a scenario similar to the one we had for the minimum norm problem: the dual problem of a convex optimization problem always attains its extremum, even if the primal problem does not.
10 The epigraph of f*
By definition, we have
(ϕ, t) ∈ epi(f*)  ⟺  t ≥ sup_{x ∈ V} [⟨x, ϕ⟩ − f(x)].
If we define for each ϕ ∈ V* and t ∈ R the affine functional h_{ϕ,t}(x) = ⟨x, ϕ⟩ − t, then the epigraph of f* can be written as
epi(f*) = {(ϕ, t) ∈ V* × R : f ≥ h_{ϕ,t}}.
In other words, the epigraph of f* consists of all pairs (ϕ, t) such that the affine function h_{ϕ,t} lies below f. This insight will yield the interesting relationship f** = f for closed convex functionals.
11 Second conjugate
The epigraph of f* consists of all pairs (ϕ, c) such that h_{ϕ,c} lies below f. It almost completely characterizes f. The reason for the "almost" is that you can recover f only up to closure.
Theorem. Let f ∈ conv(V) be closed and V be reflexive, i.e. V** = V. Then f** = f.
For the proof, note that
f(x) = sup_{h_{ϕ,c} ≤ f} h_{ϕ,c}(x) = sup_{(ϕ,c) ∈ epi(f*)} h_{ϕ,c}(x) = sup_{ϕ ∈ V*} [⟨x, ϕ⟩ − f*(ϕ)] = f**(x).
The first equality is intuitive, but surprisingly difficult to show; it is a consequence of the Hahn-Banach theorem applied to the epigraph of f.
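The relationship f** = f can be spot-checked numerically (my own sketch; it assumes the grids are wide enough that the relevant suprema are attained inside them): for the closed convex function f(x) = |x|, whose conjugate is the indicator of [−1, 1], conjugating twice on a grid recovers |x|.

```python
def conj(f, grid):
    """Return the grid-approximated conjugate as a function of phi."""
    return lambda phi: max(x * phi - f(x) for x in grid)

grid = [i / 50.0 for i in range(-250, 251)]    # [-5, 5], step 0.02

f = abs
f_star = conj(f, grid)        # approx. delta_{[-1,1]}: 0 inside, grows outside
f_star_star = conj(f_star, grid)

for x in [-2.0, -0.5, 0.0, 1.5, 3.0]:
    assert abs(f_star_star(x) - abs(x)) < 1e-6, (x, f_star_star(x))
```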
12 Example: second conjugate of an indicator function
Directly from the definition, we get the following
Proposition. Let K ⊂ V be convex and δ_K its indicator function. Then the support function of K is the conjugate of δ_K, i.e. σ_K = δ_K*. In addition, if K is closed, then δ_K** = δ_K, i.e. σ_K* = δ_K.
The latter is correct because, obviously, if K is closed, then so is K × R₊ = epi(δ_K).
13 The conjugate of J
Let K ⊂ L²(Ω) be the following closed convex set:
K = cl { div(ξ) : ξ ∈ C¹_c(Ω, Rⁿ), ‖ξ‖_∞ ≤ 1 }.
Note that the space L²(Ω) is a Hilbert space, thus K is also a subset of its dual space.
Proposition. For every u, v ∈ L²(Ω),
J(u) := σ_K(u) = sup_{v ∈ K} ⟨u, v⟩ = δ_K*(u),
J*(v) = σ_K*(v) = δ_K**(v) = δ_K(v) = { 0 if v ∈ K, +∞ otherwise }.
14 The subdifferential
Definition. Let f ∈ conv(V). A vector ϕ ∈ V* is called a subgradient of f at x ∈ V if
f(y) ≥ f(x) + ⟨y − x, ϕ⟩ for all y ∈ V.
The set of all subgradients of f at x is called the subdifferential ∂f(x).
Geometrically speaking, ϕ is a subgradient if the graph of the affine function h(y) = f(x) + ⟨y − x, ϕ⟩ lies below the epigraph of f. Note that also h(x) = f(x), so it touches the epigraph.
15 The subdifferential
Example: the subdifferential of f : x ↦ |x| at 0 is ∂f(0) = [−1, 1].
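This example can be verified directly from the subgradient inequality (a small check of my own): ϕ is a subgradient of |·| at 0 exactly when |y| ≥ ϕ·y holds for all y.

```python
def is_subgradient_at_zero(phi, test_points):
    """Check the subgradient inequality |y| >= f(0) + (y - 0)*phi for f = abs."""
    return all(abs(y) >= phi * y - 1e-12 for y in test_points)

ys = [i / 10.0 for i in range(-50, 51)]

assert is_subgradient_at_zero(-1.0, ys)
assert is_subgradient_at_zero(0.3, ys)
assert is_subgradient_at_zero(1.0, ys)
assert not is_subgradient_at_zero(1.1, ys)   # slope too steep on the right
assert not is_subgradient_at_zero(-1.2, ys)  # slope too steep on the left
```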
16 Subdifferential and derivatives
The subdifferential is a generalization of the Fréchet derivative (or the gradient in finite dimensions), in the following sense.
Theorem (subdifferential and Fréchet derivative). Let f ∈ conv(V) be Fréchet differentiable at x ∈ V. Then ∂f(x) = {df(x)}.
The proof of the theorem is surprisingly involved; it requires relating the subdifferential to one-sided directional derivatives. We will not explore these relationships in this lecture.
17 Relationship between subgradient and conjugate
[Figure: epi(f) touched from below by the affine function h_{ϕ, f*(ϕ)} at the point (x, f(x)).]
Here, we can see the equivalence
ϕ ∈ ∂f(x)  ⟺  h_{ϕ, f*(ϕ)}(y) = f(x) + ⟨y − x, ϕ⟩  ⟺  f*(ϕ) = ⟨x, ϕ⟩ − f(x).
18 The subdifferential and duality
The previously seen relationship between subgradients and the conjugate functional can be summarized in the following theorem.
Theorem. Let f ∈ conv(V) and x ∈ V. Then the following conditions on a vector ϕ ∈ V* are equivalent:
ϕ ∈ ∂f(x).
x ∈ argmax_{y ∈ V} [⟨y, ϕ⟩ − f(y)].
f(x) + f*(ϕ) = ⟨x, ϕ⟩.
If, furthermore, f is closed, then more conditions can be added to this list:
x ∈ ∂f*(ϕ).
ϕ ∈ argmax_{ψ ∈ V*} [⟨x, ψ⟩ − f*(ψ)].
19 Formal proof of the theorem
The equivalences are easy to see. Rewriting the subgradient definition, one sees that ϕ ∈ ∂f(x) means
⟨x, ϕ⟩ − f(x) ≥ ⟨y, ϕ⟩ − f(y) for all y ∈ V.
This implies the first equivalence. Since the supremum over all y ∈ V on the right-hand side is f*(ϕ), we get the second, together with the Fenchel inequality. If f is closed, then f** = f, thus we get f**(x) + f*(ϕ) = ⟨x, ϕ⟩. This is equivalent to the last two conditions, using the same arguments as above on the conjugate functional.
20 Variational principle for convex functionals
As a corollary of the previous theorem, we obtain a generalized variational principle for convex functionals. It is a necessary and sufficient condition for the (global) extremum.
Corollary (variational principle for convex functionals). Let f ∈ conv(V). Then x̂ is a global minimum of f if and only if 0 ∈ ∂f(x̂). Furthermore, if f is closed, then x̂ is a global minimum if and only if x̂ ∈ ∂f*(0), i.e. minimizing a functional is the same as computing the subdifferential of the conjugate functional at 0.
To see this, just set ϕ = 0 in the previous theorem.
22 Moreau's theorem
For the remainder of the lecture, we will assume that the underlying space is a Hilbert space H, for example L²(Ω).
Theorem (geometric Moreau). Let f be convex and closed on the Hilbert space H, which we identify with its dual. Then for every z ∈ H there is a unique decomposition
z = x̂ + ϕ with ϕ ∈ ∂f(x̂),
and the unique x̂ in this decomposition can be computed with the proximation
prox_f(z) := argmin_{x ∈ H} { (1/2)‖x − z‖²_H + f(x) }.
This is a corollary to Theorem 31.5 in Rockafellar, page 339 (of 423). The actual theorem has somewhat more content, but is very technical and quite hard to digest. The above is the essential consequence.
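For a one-dimensional sketch (my own illustration, with H = R and f(x) = |x|; not from the lecture), the proximation can be computed by brute force on a grid and compared with its known closed form, the soft-thresholding operator.

```python
def prox_grid(f, z, grid):
    """Brute-force argmin of 0.5*(x - z)^2 + f(x) over a grid."""
    return min(grid, key=lambda x: 0.5 * (x - z) ** 2 + f(x))

def shrink(z):
    """Closed-form prox of f = |.|: soft thresholding at level 1."""
    s = abs(z) - 1.0
    return (1.0 if z > 0 else -1.0) * s if s > 0 else 0.0

grid = [i / 100.0 for i in range(-500, 501)]
for z in [-3.0, -0.7, 0.0, 0.4, 2.5]:
    assert abs(prox_grid(abs, z, grid) - shrink(z)) < 1e-6
```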
23 Proof of Moreau's theorem
The correctness of the theorem is not too hard to see: if x̂ = prox_f(z), then
x̂ ∈ argmin_{x ∈ H} { (1/2)‖x − z‖²_H + f(x) }  ⟺  0 ∈ x̂ − z + ∂f(x̂)  ⟺  z ∈ x̂ + ∂f(x̂).
Existence and uniqueness of the proximation follow because the functional is closed, strictly convex and coercive.
24 The geometry of the graph of ∂f
We will see in a moment that prox_f is continuous. In particular, the map z ↦ (prox_f(z), z − prox_f(z)) is a continuous map from H into the graph of ∂f,
graph(∂f) := {(x, ϕ) : x ∈ H, ϕ ∈ ∂f(x)} ⊂ H × H,
with continuous inverse (x, ϕ) ↦ x + ϕ. The theorem of Moreau now says that this map is one-to-one. In particular, H ≅ graph(∂f), i.e. the sets are homeomorphic, so graph(∂f) is always connected.
Another corollary of Moreau's theorem is that
z = prox_f(z) + prox_{f*}(z).
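The decomposition z = prox_f(z) + prox_{f*}(z) can be illustrated in one dimension (my own sketch, with f(x) = |x|): prox_f is soft thresholding and, since f* = δ_{[−1,1]}, prox_{f*} is the projection onto [−1, 1].

```python
def soft(z):
    """prox_f for f = |.|: soft thresholding at level 1."""
    s = abs(z) - 1.0
    return (1.0 if z > 0 else -1.0) * s if s > 0 else 0.0

def project(z):
    """prox_{f*} for f* = delta_{[-1,1]}: projection onto [-1, 1]."""
    return max(-1.0, min(1.0, z))

# Moreau decomposition: the two prox values always sum back to z.
for z in [-4.2, -1.0, -0.3, 0.0, 0.9, 1.0, 7.5]:
    assert abs(soft(z) + project(z) - z) < 1e-12
```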
25 The proximation operator is continuous
Proposition. Let f be a convex and closed functional on the Hilbert space H. Then prox_f is Lipschitz with constant 1, i.e. for all z₀, z₁ in H,
‖prox_f(z₀) − prox_f(z₁)‖_H ≤ ‖z₀ − z₁‖_H.
We will prove this in an exercise.
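A quick numerical spot check of the 1-Lipschitz property (my own, using the prox of f(x) = |x|, soft thresholding, as a concrete instance):

```python
import itertools

def soft(z):
    """prox of f = |.|: soft thresholding at level 1."""
    s = abs(z) - 1.0
    return (1.0 if z > 0 else -1.0) * s if s > 0 else 0.0

zs = [i / 7.0 for i in range(-30, 31)]
for z0, z1 in itertools.product(zs, zs):
    assert abs(soft(z0) - soft(z1)) <= abs(z0 - z1) + 1e-12
```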
26 Fixed points of the proximation operator
Proposition. Let f be closed and convex on the Hilbert space H. Let ẑ be a fixed point of the proximation operator prox_f, i.e. ẑ = prox_f(ẑ). Then ẑ is a minimizer of f. In particular, it also follows that ẑ ∈ (I − prox_f)⁻¹(0).
To prove this, just note that because of Moreau's theorem, if ẑ is a fixed point,
ẑ ∈ prox_f(ẑ) + ∂f(ẑ) = ẑ + ∂f(ẑ), hence 0 ∈ ∂f(ẑ).
27 Subgradient descent
Let λ > 0, z ∈ H and x = prox_{λf}(z). Then
z ∈ x + λ∂f(x)  ⟺  x ∈ z − λ∂f(x).
In particular, we have the following interesting observation: the proximation operator prox_{λf} computes an implicit subgradient descent step of step size λ for the functional f. Implicit here means that the subgradient is evaluated not at the original, but at the new location. This improves the stability of the descent.
Note that if subgradient descent converges, then it converges to a fixed point ẑ of I − λ∂f; in particular, ẑ is a minimizer of the functional f.
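The implicit-descent interpretation suggests iterating z ← prox_{λf}(z) (the proximal point iteration). A minimal sketch of my own, for f(x) = |x − 2| on R, whose prox is soft thresholding around 2; the iteration converges to the fixed point, which is the minimizer x = 2:

```python
def prox_shift_abs(z, lam):
    """prox_{lam*f}(z) for f(x) = |x - 2|: soft thresholding around 2."""
    d = z - 2.0
    s = abs(d) - lam
    return 2.0 + ((1.0 if d > 0 else -1.0) * s if s > 0 else 0.0)

z, lam = -10.0, 0.5
for _ in range(100):            # implicit subgradient descent steps
    z = prox_shift_abs(z, lam)

assert abs(z - 2.0) < 1e-9     # converged to the minimizer of f
```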
28 Summary
Convex optimization deals with finding minima of convex functionals, which can be non-differentiable.
The generalization of the variational principle for a convex functional is the condition that, at a minimum, zero must be an element of the subdifferential.
Efficient optimization methods rely heavily on the concept of duality and the conjugate functional. We will see that it allows us to transform convex minimization problems into saddle-point problems, which are sometimes easier to handle.
Implicit subgradient descent for convex functionals can be computed by evaluating the proximation operator, which means solving another minimization problem.
29 References
Boyd and Vandenberghe, Convex Optimization, Cambridge University Press 2004. Excellent recent introduction to convex optimization. Reads very well; available online for free.
Rockafellar, Convex Analysis, Princeton University Press 1970. Classical introduction to convex analysis and optimization. Somewhat technical and not too easy to read, but very exhaustive.
1/33 Lecture: Duality of LP, SOCP and SDP Zaiwen Wen Beijing International Center For Mathematical Research Peking University http://bicmr.pku.edu.cn/~wenzw/bigdata2017.html wenzw@pku.edu.cn Acknowledgement:
More informationarxiv: v1 [math.oc] 21 Apr 2016
Accelerated Douglas Rachford methods for the solution of convex-concave saddle-point problems Kristian Bredies Hongpeng Sun April, 06 arxiv:604.068v [math.oc] Apr 06 Abstract We study acceleration and
More informationAdaptive discretization and first-order methods for nonsmooth inverse problems for PDEs
Adaptive discretization and first-order methods for nonsmooth inverse problems for PDEs Christian Clason Faculty of Mathematics, Universität Duisburg-Essen joint work with Barbara Kaltenbacher, Tuomo Valkonen,
More informationA Geometric Framework for Nonconvex Optimization Duality using Augmented Lagrangian Functions
A Geometric Framework for Nonconvex Optimization Duality using Augmented Lagrangian Functions Angelia Nedić and Asuman Ozdaglar April 15, 2006 Abstract We provide a unifying geometric framework for the
More informationALGORITHMS FOR MINIMIZING DIFFERENCES OF CONVEX FUNCTIONS AND APPLICATIONS
ALGORITHMS FOR MINIMIZING DIFFERENCES OF CONVEX FUNCTIONS AND APPLICATIONS Mau Nam Nguyen (joint work with D. Giles and R. B. Rector) Fariborz Maseeh Department of Mathematics and Statistics Portland State
More informationOn the convergence rate of a forward-backward type primal-dual splitting algorithm for convex optimization problems
On the convergence rate of a forward-backward type primal-dual splitting algorithm for convex optimization problems Radu Ioan Boţ Ernö Robert Csetnek August 5, 014 Abstract. In this paper we analyze the
More informationHAMILTON-JACOBI THEORY AND PARAMETRIC ANALYSIS IN FULLY CONVEX PROBLEMS OF OPTIMAL CONTROL. R. T. Rockafellar 1
HAMILTON-JACOBI THEORY AND PARAMETRIC ANALYSIS IN FULLY CONVEX PROBLEMS OF OPTIMAL CONTROL R. T. Rockafellar 1 Department of Mathematics, Box 354350 University of Washington, Seattle, WA 98195-4350 rtr@math.washington.edu
More informationLecture 3: Lagrangian duality and algorithms for the Lagrangian dual problem
Lecture 3: Lagrangian duality and algorithms for the Lagrangian dual problem Michael Patriksson 0-0 The Relaxation Theorem 1 Problem: find f := infimum f(x), x subject to x S, (1a) (1b) where f : R n R
More informationA Dykstra-like algorithm for two monotone operators
A Dykstra-like algorithm for two monotone operators Heinz H. Bauschke and Patrick L. Combettes Abstract Dykstra s algorithm employs the projectors onto two closed convex sets in a Hilbert space to construct
More informationLagrange Relaxation: Introduction and Applications
1 / 23 Lagrange Relaxation: Introduction and Applications Operations Research Anthony Papavasiliou 2 / 23 Contents 1 Context 2 Applications Application in Stochastic Programming Unit Commitment 3 / 23
More informationDesign of optimal RF pulses for NMR as a discrete-valued control problem
Design of optimal RF pulses for NMR as a discrete-valued control problem Christian Clason Faculty of Mathematics, Universität Duisburg-Essen joint work with Carla Tameling (Göttingen) and Benedikt Wirth
More informationON PROXIMAL POINT-TYPE ALGORITHMS FOR WEAKLY CONVEX FUNCTIONS AND THEIR CONNECTION TO THE BACKWARD EULER METHOD
ON PROXIMAL POINT-TYPE ALGORITHMS FOR WEAKLY CONVEX FUNCTIONS AND THEIR CONNECTION TO THE BACKWARD EULER METHOD TIM HOHEISEL, MAXIME LABORDE, AND ADAM OBERMAN Abstract. In this article we study the connection
More informationOptimality Conditions for Nonsmooth Convex Optimization
Optimality Conditions for Nonsmooth Convex Optimization Sangkyun Lee Oct 22, 2014 Let us consider a convex function f : R n R, where R is the extended real field, R := R {, + }, which is proper (f never
More informationNon-smooth Non-convex Bregman Minimization: Unification and New Algorithms
JOTA manuscript No. (will be inserted by the editor) Non-smooth Non-convex Bregman Minimization: Unification and New Algorithms Peter Ochs Jalal Fadili Thomas Brox Received: date / Accepted: date Abstract
More informationDUALIZATION OF SUBGRADIENT CONDITIONS FOR OPTIMALITY
DUALIZATION OF SUBGRADIENT CONDITIONS FOR OPTIMALITY R. T. Rockafellar* Abstract. A basic relationship is derived between generalized subgradients of a given function, possibly nonsmooth and nonconvex,
More informationIn Progress: Summary of Notation and Basic Results Convex Analysis C&O 663, Fall 2009
In Progress: Summary of Notation and Basic Results Convex Analysis C&O 663, Fall 2009 Intructor: Henry Wolkowicz November 26, 2009 University of Waterloo Department of Combinatorics & Optimization Abstract
More informationIdentifying Active Constraints via Partial Smoothness and Prox-Regularity
Journal of Convex Analysis Volume 11 (2004), No. 2, 251 266 Identifying Active Constraints via Partial Smoothness and Prox-Regularity W. L. Hare Department of Mathematics, Simon Fraser University, Burnaby,
More informationChapter 2 Convex Analysis
Chapter 2 Convex Analysis The theory of nonsmooth analysis is based on convex analysis. Thus, we start this chapter by giving basic concepts and results of convexity (for further readings see also [202,
More informationPrimal/Dual Decomposition Methods
Primal/Dual Decomposition Methods Daniel P. Palomar Hong Kong University of Science and Technology (HKUST) ELEC5470 - Convex Optimization Fall 2018-19, HKUST, Hong Kong Outline of Lecture Subgradients
More informationSequential Pareto Subdifferential Sum Rule And Sequential Effi ciency
Applied Mathematics E-Notes, 16(2016), 133-143 c ISSN 1607-2510 Available free at mirror sites of http://www.math.nthu.edu.tw/ amen/ Sequential Pareto Subdifferential Sum Rule And Sequential Effi ciency
More informationBrézis - Haraux - type approximation of the range of a monotone operator composed with a linear mapping
Brézis - Haraux - type approximation of the range of a monotone operator composed with a linear mapping Radu Ioan Boţ, Sorin-Mihai Grad and Gert Wanka Faculty of Mathematics Chemnitz University of Technology
More informationCO 250 Final Exam Guide
Spring 2017 CO 250 Final Exam Guide TABLE OF CONTENTS richardwu.ca CO 250 Final Exam Guide Introduction to Optimization Kanstantsin Pashkovich Spring 2017 University of Waterloo Last Revision: March 4,
More informationChapter 3. Characterization of best approximations. 3.1 Characterization of best approximations in Hilbert spaces
Chapter 3 Characterization of best approximations In this chapter we study properties which characterite solutions of the approximation problem. There is a big difference in the treatment of this question
More informationConvex Optimization and Modeling
Convex Optimization and Modeling Duality Theory and Optimality Conditions 5th lecture, 12.05.2010 Jun.-Prof. Matthias Hein Program of today/next lecture Lagrangian and duality: the Lagrangian the dual
More informationA user s guide to Lojasiewicz/KL inequalities
Other A user s guide to Lojasiewicz/KL inequalities Toulouse School of Economics, Université Toulouse I SLRA, Grenoble, 2015 Motivations behind KL f : R n R smooth ẋ(t) = f (x(t)) or x k+1 = x k λ k f
More information