Conditioning for optimization problems under general perturbations
(M. Bianchi, G. Kassay, R. Pini, Conditioning for optimization problems under general perturbations, Nonlinear Analysis, 2012)
Dipartimento di Matematica, Università degli Studi di Genova
February 28, 2012

Contents
1. Introduction to conditioning
   - The case of a linear equation in a finite-dimensional space
   - The case of a generalized equation
   - Condition number for a scalar optimization problem
2.–3.
   - Sensitivity result
   - Distance theorem
   - Strongly convex programming

Introduction to conditioning

The goal of the analysis of a parametric (optimization) problem is to study how a change in the data affects the solution of the given problem.

Qualitative approach: continuity properties of the solution map (stability analysis).
Quantitative approach: Lipschitzian properties of the solution map (sensitivity analysis and condition number theory).

Given a parametric (optimization) problem, a condition number essentially gives the following information:
- an upper bound for the ratio of the size of the solution error to the size of the data error (Lipschitz modulus);
- how far the initial problem can be perturbed while preserving its good properties.

The case of a linear equation in a finite-dimensional space

Let $A \in L(\mathbb{R}^n, \mathbb{R}^n)$ and consider the equation $Ax = p$. If $A$ is nonsingular, then the unique solution is $x = A^{-1}p$, and

$$\|A^{-1}(p) - A^{-1}(p')\| \le \|A^{-1}\| \, \|p - p'\|.$$

Theorem (Eckart–Young). Let $A \in L(\mathbb{R}^n, \mathbb{R}^n)$ be nonsingular. Then

$$\min_{B \in L(\mathbb{R}^n, \mathbb{R}^n)} \{\|B\| : A + B \text{ singular}\} = \frac{1}{\|A^{-1}\|}.$$
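A quick numerical check of the Eckart–Young distance (a minimal sketch; the matrix below is an arbitrary illustrative choice, not from the talk). In the spectral norm, $1/\|A^{-1}\|$ equals the smallest singular value $\sigma_{\min}(A)$, and a rank-one perturbation of norm $\sigma_{\min}$ already makes $A$ singular:

```python
import numpy as np

# Arbitrary nonsingular example matrix.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# In the spectral norm, 1/||A^{-1}|| is the smallest singular value of A.
U, s, Vt = np.linalg.svd(A)
sigma_min = s[-1]
print(1.0 / np.linalg.norm(np.linalg.inv(A), 2), sigma_min)  # equal values

# Eckart-Young perturbation: B = -sigma_min * u_min v_min^T has norm
# sigma_min and makes A + B singular, attaining the minimal distance.
B = -sigma_min * np.outer(U[:, -1], Vt[-1, :])
print(np.linalg.norm(B, 2))   # = sigma_min
print(np.linalg.det(A + B))   # ~ 0: A + B is singular
```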

The case of a generalized equation

Let $X$ and $P$ be Banach spaces and let $F : X \rightrightarrows P$ be a set-valued map. The set of solutions of the generalized equation $p \in F(x)$ is

$$S_F(p) = \{x \in X : p \in F(x)\} = F^{-1}(p),$$

and it is nonempty for $p \in F(X)$.

i) How does $S_F(p)$ behave with respect to perturbations of $p$? (Lipschitzian property)
ii) How does $S_F(p)$ behave with respect to perturbations of $F$? (Distance-type theorem)

(A. L. Dontchev, A. S. Lewis and R. T. Rockafellar, The radius of metric regularity, Trans. AMS, 2002)

Fix a point $(\bar x, \bar p) \in \operatorname{gph}(F)$.

Definition (Metric regularity). The map $F$ is said to be metrically regular around $(\bar x, \bar p)$ when there exists $k > 0$ such that

$$d(x, F^{-1}(p)) \le k \, d(p, F(x)) \qquad (1.1)$$

for all $(x, p)$ close to $(\bar x, \bar p)$. The infimum of all $k$ such that (1.1) holds is the regularity modulus of $F$ around $(\bar x, \bar p)$, denoted $\operatorname{reg}(F; \bar x \mid \bar p)$.

For a single-valued linear map $A$ in a finite-dimensional space, the regularity modulus is the same for all $(x, p) \in \operatorname{gph}(A)$ and is given by $\operatorname{reg}(A) = \|A^{-1}\|$.
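For a nonsingular linear map the estimate (1.1) holds globally with $k = \|A^{-1}\|$, since $\|x - A^{-1}p\| = \|A^{-1}(Ax - p)\|$. A minimal numerical sketch of this special case (illustrative matrix, random test points):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
reg_A = np.linalg.norm(np.linalg.inv(A), 2)  # reg(A) = ||A^{-1}||

# For linear A, F^{-1}(p) = {A^{-1} p}, so (1.1) reads
# ||x - A^{-1} p|| <= reg(A) * ||A x - p||; check it at random points.
for _ in range(1000):
    x, p = rng.standard_normal((2, 2))
    lhs = np.linalg.norm(x - np.linalg.solve(A, p))
    rhs = reg_A * np.linalg.norm(A @ x - p)
    assert lhs <= rhs + 1e-12
print("(1.1) holds with k = reg(A) =", reg_A)
```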

$B_X(x, r)$ denotes the open ball in $X$ centered at $x$ with radius $r$, and $\bar B_X(x, r)$ its closure.

Definition (Aubin property). We say that $F$ has the Aubin property around $(\bar x, \bar p)$ when there exist $k > 0$ and a neighbourhood $V$ of $\bar p$ such that

$$F(x') \cap V \subset F(x) + k \, \|x - x'\| \, \bar B_P(0, 1) \qquad (1.2)$$

for all $x, x'$ close to $\bar x$. The infimum of all $k$ such that (1.2) holds is the Lipschitz modulus of $F$ around $(\bar x, \bar p)$, denoted $\operatorname{lip}(F; \bar x \mid \bar p)$.

Remark. If $F$ is single-valued around $\bar x$, the Aubin property is the ordinary Lipschitz continuity of the map around $\bar x$:

$$\operatorname{lip}(F; \bar x) = \operatorname{lip}(F; \bar x \mid F(\bar x)) = \limsup_{\substack{x, x' \to \bar x \\ x \ne x'}} \frac{\|F(x) - F(x')\|}{\|x - x'\|}.$$

Remark. Notice that $\operatorname{reg}(F; \bar x \mid \bar p) = \operatorname{lip}(F^{-1}; \bar p \mid \bar x)$.

Given a map $F$ with $\operatorname{reg}(F; \bar x \mid \bar p) < +\infty$, how far can we perturb it, by a map $g$ in a suitable class, before losing metric regularity of $F + g$ around a suitable point of its graph?

Theorem (Estimates for Lipschitz perturbations). Consider a map $F : X \rightrightarrows P$ and $(\bar x, \bar p) \in \operatorname{gph}(F)$ at which $\operatorname{gph}(F)$ is locally closed. Let $k$ and $\mu$ be nonnegative constants such that $\operatorname{reg}(F; \bar x \mid \bar p) \le k$ and $k\mu < 1$. Then for any function $g : X \to P$ with $\operatorname{lip}(g; \bar x) < \mu$ we have

$$\operatorname{reg}(g + F; \bar x \mid \bar p + g(\bar x)) \le \frac{k}{1 - k\mu}.$$
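In the linear case the theorem reduces to the classical Neumann-series bound $\|(A + B)^{-1}\| \le \|A^{-1}\|/(1 - \|A^{-1}\|\,\|B\|)$ whenever $\|A^{-1}\|\,\|B\| < 1$; a minimal sketch with illustrative data:

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
k = np.linalg.norm(np.linalg.inv(A), 2)  # reg(A) = ||A^{-1}||

# Linear specialization of the Lipschitz-perturbation estimate:
# if mu = ||B|| satisfies k * mu < 1, then
# reg(A + B) = ||(A + B)^{-1}|| <= k / (1 - k * mu).
for _ in range(1000):
    B = rng.standard_normal((2, 2))
    B *= rng.uniform(0.0, 0.99) / (k * np.linalg.norm(B, 2))  # force k*||B|| < 1
    mu = np.linalg.norm(B, 2)
    assert np.linalg.norm(np.linalg.inv(A + B), 2) <= k / (1 - k * mu) + 1e-9
print("perturbation bound verified for random B with k*||B|| < 1")
```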

Corollary (Distance-type result). Consider a map $F : X \rightrightarrows P$ and $(\bar x, \bar p) \in \operatorname{gph}(F)$ at which $\operatorname{gph}(F)$ is locally closed. Let $g : X \to P$ be a Lipschitz map around $\bar x$. Then

$$\inf_g \{\operatorname{lip}(g; \bar x) : F + g \text{ is not metrically regular around } (\bar x, \bar p + g(\bar x))\} \ge \frac{1}{\operatorname{reg}(F; \bar x \mid \bar p)}.$$

Condition number for a scalar optimization problem

(T. Zolezzi, Condition number theorems in optimization, SIAM J. Optim., 2003)

$E$ is a real Banach space and $E^*$ is its dual; $\langle \cdot, \cdot \rangle$ denotes the duality pairing. $B(x, r)$ (resp. $B^*(p, r)$) is the closed ball in $E$ (resp. $E^*$) with center $x$ (resp. $p$) and radius $r$. $C^{1,1}$ denotes the class of Fréchet differentiable functions whose Fréchet derivative $Df$ is a Lipschitz function.

Given $f \in C^{1,1}(B(0, L))$ and $p \in E^*$, consider the problem

$$\min_{B(0, L)} f_p = \min_{B(0, L)} \big( f - \langle p, \cdot \rangle \big) \qquad (O_p)$$

and the solution map

$$m : B^*(0, r) \subset E^* \to E, \qquad m(p) = \operatorname{argmin}(B(0, L), f_p).$$

The condition number $\operatorname{cond}(f)$ is defined, under the assumption that the map $m$ is single-valued close to $0$, as

$$\operatorname{cond}(f) = \limsup_{\substack{p, q \to 0 \\ p \ne q}} \frac{\|m(p) - m(q)\|}{\|p - q\|}.$$

If $m(0) = 0$, i.e. $\operatorname{argmin}(B(0, L), f) = \{0\}$, then $\operatorname{cond}(f) = \operatorname{lip}(m; 0) = \operatorname{reg}(m^{-1}; 0 \mid 0)$. Furthermore, if $m(p)$ lies in the interior of $B(0, L)$ for sufficiently small $p$, then $Df(m(p)) = p$ and

$$\operatorname{cond}(f) = \operatorname{lip}((Df)^{-1}; 0 \mid 0).$$
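For a strongly convex quadratic the condition number is available in closed form, which makes the definition concrete. A sketch under the hypothetical choice $f(u) = \frac{1}{2}\langle Qu, u \rangle$ with $Q$ symmetric positive definite: then $m(p) = Q^{-1}p$ for small $p$, so $\operatorname{cond}(f) = \|Q^{-1}\| = 1/\lambda_{\min}(Q)$:

```python
import numpy as np

# Hypothetical example: f(u) = 0.5 u^T Q u on B(0, L), Q symmetric positive
# definite. For small p the minimizer of f - <p, .> is interior and solves
# Df(m(p)) = Q m(p) = p, so m(p) = Q^{-1} p and cond(f) = ||Q^{-1}||.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
rng = np.random.default_rng(2)

ratios = []
for _ in range(2000):
    p, q = 1e-3 * rng.standard_normal((2, 2))
    mp, mq = np.linalg.solve(Q, p), np.linalg.solve(Q, q)
    ratios.append(np.linalg.norm(mp - mq) / np.linalg.norm(p - q))

print(max(ratios))                    # approaches cond(f) from below
print(1 / np.linalg.eigvalsh(Q)[0])   # cond(f) = ||Q^{-1}|| = 1/lambda_min(Q)
```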

Class $T_1$: functions $f \in C^{1,1}(B(0, L))$ such that:
- $\operatorname{argmin}(B(0, L), f) = \{0\}$;
- $\operatorname{argmin}(B(0, L), f_p) \ne \emptyset$ for small $p$;
- $p \mapsto \operatorname{argmin}(B(0, L), f_p)$ is upper hemicontinuous at $p = 0$.

Class $W_1$: functions $f : B(0, L) \to \mathbb{R}$ such that:
- $\operatorname{argmin}(B(0, L), f_p)$ is a singleton for small $p$;
- $\operatorname{cond}(f) < +\infty$.

Define $I_1 = \{g \in T_1 : g \notin W_1\}$ (ill-conditioned problems).

The class $C^{1,1}(B(0, L))$ is endowed with the pseudodistance

$$d(f_1, f_2) = \sup_{\substack{u, v \in B(0, L) \\ u \ne v}} \frac{\|Df_1(u) - Df_2(u) - Df_1(v) + Df_2(v)\|}{\|u - v\|}.$$

Theorem (Distance-type theorem). Let $f \in T_1 \cap W_1$ be such that $Df$ is one-to-one near $0$. Let $g \in T_1$ be such that $d(f, g) < 1/\operatorname{cond}(f)$. Then $g \in W_1$.

Corollary. Let $f \in T_1 \cap W_1$ be such that $Df$ is one-to-one near $0$. Then

$$\operatorname{dist}(f, I_1) \ge \frac{1}{\operatorname{cond}(f)}.$$

Definition (Condition number). Given $f \in C^{1,1}(B(0, L))$ such that
- $Df(0) = 0$,
- $Df$ is one-to-one near $0$,
we define the condition number as the positive extended real number

$$\hat c(f) = \limsup_{\substack{u, v \to 0 \\ u \ne v}} \frac{\|u - v\|}{\|Df(u) - Df(v)\|}.$$

Notice that $\hat c(f) \ge \frac{1}{L_{Df}} > 0$, where $L_{Df}$ is the Lipschitz constant of $Df$ on $B(0, L)$.

Assuming that $Df$ is open at $0$, i.e. $0 \in \operatorname{int} Df(B(0, s))$ for all $s$ small enough, $\hat c(f)$ is just the Lipschitz modulus $\operatorname{lip}((Df)^{-1}; 0)$ of $(Df)^{-1}$ at $0$:

$$\hat c(f) = \limsup_{\substack{p, q \to 0 \\ p \ne q}} \frac{\|(Df)^{-1}(p) - (Df)^{-1}(q)\|}{\|p - q\|}.$$

Notice that $\hat c(f) < +\infty$ if and only if $(Df)^{-1}$ is Lipschitz near $0$.
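A one-dimensional sketch of this identity, with the hypothetical example $f(u) = u^2/2 + u^4/4$: then $Df(u) = u + u^3$ is strictly increasing and open at $0$, and one expects $\hat c(f) = \operatorname{lip}((Df)^{-1}; 0) = 1/f''(0) = 1$:

```python
import numpy as np

# Hypothetical 1-D example: f(u) = u^2/2 + u^4/4, Df(u) = u + u^3.
# Since Df(u) - Df(v) = (u - v)(1 + u^2 + uv + v^2), the ratio below is
# always <= 1 and tends to 1/f''(0) = 1 as u, v -> 0.
Df = lambda u: u + u**3

rng = np.random.default_rng(3)
best = 0.0
for _ in range(5000):
    u, v = 1e-2 * rng.standard_normal(2)
    best = max(best, abs(u - v) / abs(Df(u) - Df(v)))
print(best)  # close to (and never above) 1
```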

Perturbed optimization problems

Given a function $f \in C^{1,1}(B(0, L))$ and a perturbation function $g : E^* \times B(0, L) \to \mathbb{R}$, we consider for each $p \in E^*$ the class of perturbed optimization problems

$$\min_{u \in B(0, L)} F_g(p, u) = \min_{u \in B(0, L)} \big( f(u) - g(p, u) \big) \qquad (OP(F_g, p))$$

satisfying the assumptions:
i) $g(0, \cdot) = 0$;
ii) $g(p, \cdot) \in C^{1,1}(B(0, L))$ for each $p \in E^*$;
iii) $\sup_{u \in B(0, L)} \|Dg(p, u)\| \to 0$ as $p \to 0$, where $Dg$ is the Fréchet derivative of $g$ with respect to the second argument.

The class $T_g$

Fix a perturbation function $g$. The class $T_g$ contains all the functions $f \in C^{1,1}(B(0, L))$ such that:
- $\operatorname{Argmin}(B(0, L), f) = \{0\}$;
- $\operatorname{Argmin}(B(0, L), F_g(p, \cdot))$ is nonempty for small $p$;
- the map $p \mapsto \operatorname{Argmin}(B(0, L), F_g(p, \cdot))$ is upper hemicontinuous at $p = 0$.

The class $W_g$

The class $W_g$ contains all functions $f \in T_g$ such that:
- $Df$ is one-to-one in a neighbourhood of $0$;
- $Df$ satisfies the openness property;
- $\hat c(f) < +\infty$;
- there exists $0 < s < L$ such that $\operatorname{Argmin}(B(0, L), F_g(p, \cdot)) \cap B(0, s)$ is a singleton for every sufficiently small $p$.

The functions in $W_g$ give rise to what will be called a well-conditioned optimization problem.

Denote by $m(p)$ the set $\operatorname{Argmin}(B(0, L), F_g(p, \cdot)) \cap B(0, s)$ for small $p$. For a function $f \in W_g$, the map $p \mapsto m(p)$ is single-valued if we restrict our attention to a neighbourhood of $0$ in $E^*$, and continuous at $p = 0$, with $m(0) = \{0\}$. The case where the map is locally constant, i.e. $m(p) = \{0\}$ for all $p$ in a neighbourhood of $0$, is not interesting; we can rule it out by assuming that $Dg(p, 0) \ne 0$ whenever $p \ne 0$.

Theorem 1 (Sensitivity). Suppose that the Fréchet derivative $Dg : E^* \times B(0, L) \to E^*$ of the perturbation function $g$ is Lipschitz, i.e. there exists $L > 0$ such that

$$\|Dg(p, u) - Dg(q, v)\| \le L(\|p - q\| + \|u - v\|).$$

Take $f \in W_g$ such that $\hat c(f) < 1/L$. If $Dg(p, 0) \ne 0$ whenever $p \ne 0$, then, for every $0 < \epsilon < (1 - L\hat c(f))/L$, there exists a neighbourhood of $0 \in E^*$ on which the map $p \mapsto m(p)$ is Lipschitz with constant

$$L_m \le \frac{(\hat c(f) + \epsilon) L}{1 - \hat c(f) L - L \epsilon}.$$
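A sketch of Theorem 1 in the simplest setting, under assumed data: the quadratic $f(u) = \frac{1}{2}\langle Qu, u \rangle$ and the tilt perturbation $g(p, u) = \langle p, u \rangle$, for which $Dg(p, u) = p$, so the Lipschitz hypothesis holds with $L = 1$ and $m(p) = Q^{-1}p$; the exact Lipschitz constant of $m$ should then respect the bound of the theorem:

```python
import numpy as np

# Assumed data: f(u) = 0.5 u^T Q u with lambda_min(Q) > 1, and the tilt
# g(p, u) = <p, u>, so Dg(p, u) = p is Lipschitz with L = 1 and
# c(f) = ||Q^{-1}|| = 1/lambda_min(Q) < 1/L, as Theorem 1 requires.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
c_f = 1.0 / np.linalg.eigvalsh(Q)[0]  # c(f) = ||Q^{-1}||
L = 1.0
eps = 1e-3                            # any 0 < eps < (1 - L*c(f))/L works

L_m = np.linalg.norm(np.linalg.inv(Q), 2)          # exact Lipschitz constant of m
bound = (c_f + eps) * L / (1 - c_f * L - L * eps)  # Theorem 1 bound
print(L_m, "<=", bound, ":", L_m <= bound)         # True
```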

Theorem 2 (Distance-type theorem). Let $f \in W_g$, and suppose that there exists $\gamma > 1$ such that, for $p$ small enough,

$$L_{Dg(p, \cdot)} \, \hat c(f) < \frac{1}{\gamma}. \qquad (3.1)$$

Denote by $h$ any function in the class $T_g$ such that

$$d(f, h) < \frac{\gamma - 1}{\gamma} \, \frac{1}{\hat c(f)}.$$

Then $h \in W_g$.

Notice that condition (3.1) holds for some $\gamma > 1$ if the Fréchet derivative $Dg : E^* \times B(0, L) \to E^*$ of the perturbation function is Lipschitz and $\hat c(f) < 1/L$.

Corollary. Let $f \in W_g$, and suppose that there exists $\gamma > 1$ such that, for $p$ small enough,

$$L_{Dg(p, \cdot)} \, \hat c(f) < \frac{1}{\gamma}.$$

If $h \in T_g \setminus W_g$, then

$$d(f, h) \ge \frac{\gamma - 1}{\gamma} \, \frac{1}{\hat c(f)}.$$

Strongly convex functions

$E$ is a reflexive Banach space.

Definition. Let $f : C \subset E \to \mathbb{R}$ be a function on the convex set $C$. We say that $f$ is strongly convex with convexity modulus $\alpha > 0$ if

$$f((1 - t)u + tv) \le (1 - t) f(u) + t f(v) - \alpha (1 - t) t \|u - v\|^2, \qquad \forall u, v \in C, \ t \in [0, 1].$$

If $f$ is Fréchet differentiable, strong convexity is equivalent to the strong monotonicity of $Df$ on $C$, i.e.

$$\langle Df(u) - Df(v), u - v \rangle \ge 2\alpha \|u - v\|^2, \qquad \forall u, v \in C.$$
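A quick numerical check of the equivalence, again with a hypothetical quadratic: for $f(u) = \frac{1}{2}\langle Qu, u \rangle$ one has $\langle Df(u) - Df(v), u - v \rangle = (u - v)^\top Q (u - v) \ge \lambda_{\min}(Q)\|u - v\|^2$, so in the convention above $f$ is strongly convex with modulus $\alpha = \lambda_{\min}(Q)/2$:

```python
import numpy as np

# Hypothetical example: f(u) = 0.5 u^T Q u, Df(u) = Q u. Strong monotonicity
# <Df(u) - Df(v), u - v> >= 2*alpha*||u - v||^2 holds with
# alpha = lambda_min(Q)/2, and then c(f) <= 1/(2*alpha) = 1/lambda_min(Q).
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
alpha = np.linalg.eigvalsh(Q)[0] / 2.0

rng = np.random.default_rng(4)
for _ in range(1000):
    u, v = rng.standard_normal((2, 2))
    lhs = (Q @ u - Q @ v) @ (u - v)
    assert lhs >= 2 * alpha * (u - v) @ (u - v) - 1e-9
print("strong monotonicity verified with 2*alpha =", 2 * alpha)
```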

Notice that strong monotonicity of $Df$ trivially implies that $Df$ is one-to-one near $0$; moreover,

$$\hat c(f) \le \limsup_{\substack{u, v \to 0 \\ u \ne v}} \frac{\|u - v\|^2}{\langle Df(u) - Df(v), u - v \rangle} \le \frac{1}{2\alpha}.$$

Proposition 1. Let $f$ be a function in $C^{1,1}(B(0, L))$ such that $Df(0) = 0$. Suppose that $f$ is strongly convex with convexity modulus $\alpha > 0$. Then $Df$ is open at $0$.

Problems stable under perturbation

Let us now consider a perturbation function $g : E^* \times B(0, L) \to \mathbb{R}$ satisfying the following conditions:
i) $g(0, \cdot) = 0$;
ii) $g(p, \cdot) \in C^{1,1}(B(0, L))$, and it is a concave function, for all small $p \in E^*$;
iii) the mapping $(p, u) \mapsto Dg(p, u)$ is continuous at $(0, 0)$.

These assumptions entail that the parametric optimization problem is stable under perturbation (see Kassay and Kolumbán, 2000).

Proposition 2. Let $f$ be a function in $C^{1,1}(B(0, L))$ such that $Df(0) = 0$, and assume that $f$ is strongly convex. Consider a perturbation function $g : E^* \times B(0, L) \to \mathbb{R}$ satisfying i)–iii). Then the function $f$ belongs to $W_g$.

Main references

1. M. Bianchi, G. Kassay and R. Pini, Conditioning for optimization problems under general perturbations, Nonlinear Analysis, Vol. 75, pp. 37–45, 2012.
2. A. L. Dontchev, A. S. Lewis and R. T. Rockafellar, The radius of metric regularity, Trans. AMS, 2002.
3. A. L. Dontchev and R. T. Rockafellar, Implicit Functions and Solution Mappings, Springer, 2009.
4. G. Kassay and J. Kolumbán, Multivalued parametric variational inequalities with α-pseudomonotone maps, J. Optim. Theory Appl., Vol. 107, pp. 35–50, 2000.
5. T. Zolezzi, Condition number theorems in optimization, SIAM J. Optim., Vol. 14, pp. 507–516, 2003.