Lecture 6 - Convex Sets


Definition. A set $C \subseteq \mathbb{R}^n$ is called convex if for any $x, y \in C$ and any $\lambda \in [0, 1]$, the point $\lambda x + (1 - \lambda)y$ belongs to $C$.

The definition is equivalent to saying that for any $x, y \in C$, the line segment $[x, y]$ is contained in $C$.

[Figure: examples of convex sets and nonconvex sets]

Amir Beck, Introduction to Nonlinear Optimization, Lecture Slides - Convex Sets
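As a quick numerical illustration of the definition (not part of the original slides), the sketch below samples pairs of points from the closed unit disk and checks that every point on the segment between them stays in the disk:

```python
import random

def in_disk(p, c=(0.0, 0.0), r=1.0):
    """Membership test for the closed Euclidean ball B[c, r] in R^2."""
    return (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 <= r ** 2 + 1e-12

def segment_in_set(x, y, member, samples=50):
    """Check that lam*x + (1-lam)*y stays in the set for lam in [0, 1]."""
    for k in range(samples + 1):
        lam = k / samples
        z = (lam * x[0] + (1 - lam) * y[0], lam * x[1] + (1 - lam) * y[1])
        if not member(z):
            return False
    return True

def random_disk_point():
    """Rejection-sample a point of the unit disk."""
    while True:
        p = (random.uniform(-1, 1), random.uniform(-1, 1))
        if in_disk(p):
            return p

random.seed(0)
for _ in range(100):
    assert segment_in_set(random_disk_point(), random_disk_point(), in_disk)
```

Sampling cannot prove convexity, but replacing `in_disk` with the indicator of a nonconvex set (e.g. an annulus) makes the check fail immediately.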

Examples of Convex Sets

Lines: a line in $\mathbb{R}^n$ is a set of the form $L = \{z + td : t \in \mathbb{R}\}$, where $z, d \in \mathbb{R}^n$ and $d \neq 0$.

Segments: the closed segment $[x, y]$ and the open segment $(x, y)$ for $x, y \in \mathbb{R}^n$ ($x \neq y$).

The whole space $\mathbb{R}^n$.

A hyperplane is a set of the form $H = \{x \in \mathbb{R}^n : a^T x = b\}$, where $a \in \mathbb{R}^n \setminus \{0\}$ and $b \in \mathbb{R}$. The associated half-space is the set $H^- = \{x \in \mathbb{R}^n : a^T x \le b\}$. Both hyperplanes and half-spaces are convex sets.
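The convexity of a half-space follows from linearity: $a^T(\lambda x + (1-\lambda)y) = \lambda a^T x + (1-\lambda) a^T y \le b$. A small sketch (an illustration, not from the slides; the particular $a$, $b$, $x$, $y$ are made up) verifies this numerically:

```python
def dot(a, x):
    """Inner product a^T x."""
    return sum(ai * xi for ai, xi in zip(a, x))

def combo(x, y, lam):
    """Convex combination lam*x + (1-lam)*y."""
    return tuple(lam * xi + (1 - lam) * yi for xi, yi in zip(x, y))

# half-space H^- = {x : a^T x <= b}
a, b = (1.0, 2.0, -1.0), 4.0
x = (1.0, 1.0, 0.0)   # a^T x = 3 <= 4
y = (0.0, 2.0, 1.0)   # a^T y = 3 <= 4
for k in range(11):
    lam = k / 10
    z = combo(x, y, lam)
    # linearity gives a^T z = lam*a^T x + (1-lam)*a^T y <= b
    assert dot(a, z) <= b + 1e-12
```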

Convexity of Balls

Lemma. Let $c \in \mathbb{R}^n$ and $r > 0$. Then the open ball $B(c, r) = \{x \in \mathbb{R}^n : \|x - c\| < r\}$ and the closed ball $B[c, r] = \{x \in \mathbb{R}^n : \|x - c\| \le r\}$ are convex.

Note that the norm $\|\cdot\|$ is an arbitrary norm defined over $\mathbb{R}^n$.

Proof. In class.
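The proof rests on the triangle inequality and homogeneity of the norm, which hold for any norm. The sketch below (not from the slides) checks the convexity bound $\|\lambda x + (1-\lambda)y\| \le \lambda\|x\| + (1-\lambda)\|y\| \le r$ for the $\ell_1$ and $\ell_\infty$ norms:

```python
def norm_l1(x):
    """l1 norm: sum of absolute values."""
    return sum(abs(xi) for xi in x)

def norm_linf(x):
    """l-infinity norm: largest absolute value."""
    return max(abs(xi) for xi in x)

def check_ball_convexity(norm, x, y, r=1.0, steps=20):
    """If ||x|| <= r and ||y|| <= r, every convex combination z obeys
    ||z|| <= lam*||x|| + (1-lam)*||y|| <= r."""
    assert norm(x) <= r and norm(y) <= r
    for k in range(steps + 1):
        lam = k / steps
        z = tuple(lam * a + (1 - lam) * b for a, b in zip(x, y))
        if norm(z) > r + 1e-12:
            return False
    return True

assert check_ball_convexity(norm_l1, (0.5, -0.5), (-0.25, 0.75))
assert check_ball_convexity(norm_linf, (1.0, -1.0), (-1.0, 1.0))
```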

$\ell_1$, $\ell_2$ and $\ell_\infty$ balls

[Figure: the unit balls of the $\ell_1$, $\ell_2$ and $\ell_\infty$ norms in $\mathbb{R}^2$]

Convexity of Ellipsoids

An ellipsoid is a set of the form
$$E = \{x \in \mathbb{R}^n : x^T Q x + 2 b^T x + c \le 0\},$$
where $Q \in \mathbb{R}^{n \times n}$ is positive semidefinite, $b \in \mathbb{R}^n$ and $c \in \mathbb{R}$.

Lemma. $E$ is convex.

Proof. Write $E = \{x \in \mathbb{R}^n : f(x) \le 0\}$, where $f(x) \equiv x^T Q x + 2 b^T x + c$. Take $x, y \in E$ and $\lambda \in [0, 1]$; then $f(x) \le 0$ and $f(y) \le 0$. The vector $z = \lambda x + (1 - \lambda)y$ satisfies
$$z^T Q z = \lambda^2 x^T Q x + (1 - \lambda)^2 y^T Q y + 2\lambda(1 - \lambda) x^T Q y.$$
By the Cauchy-Schwarz inequality and the arithmetic-geometric mean inequality,
$$x^T Q y \le \|Q^{1/2} x\| \cdot \|Q^{1/2} y\| = \sqrt{x^T Q x}\sqrt{y^T Q y} \le \tfrac{1}{2}\left(x^T Q x + y^T Q y\right),$$
and hence
$$z^T Q z \le \lambda x^T Q x + (1 - \lambda) y^T Q y.$$
Therefore,
$$f(z) = z^T Q z + 2 b^T z + c \le \lambda x^T Q x + (1 - \lambda) y^T Q y + 2\lambda b^T x + 2(1 - \lambda) b^T y + \lambda c + (1 - \lambda) c = \lambda f(x) + (1 - \lambda) f(y) \le 0.$$
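The chain of inequalities says that $f$ is convex whenever $Q$ is positive semidefinite. A small sketch (not from the slides; the particular $Q$, $b$, $c$ and points are made up) checks $f(z) \le \lambda f(x) + (1-\lambda) f(y)$ for a $2 \times 2$ positive definite $Q$:

```python
def quad_f(x, Q, b, c):
    """f(x) = x^T Q x + 2 b^T x + c for x in R^2."""
    qx = (Q[0][0] * x[0] + Q[0][1] * x[1], Q[1][0] * x[0] + Q[1][1] * x[1])
    return x[0] * qx[0] + x[1] * qx[1] + 2 * (b[0] * x[0] + b[1] * x[1]) + c

Q = ((2.0, 1.0), (1.0, 2.0))   # positive definite (eigenvalues 1 and 3)
b = (-1.0, 0.0)
c = -1.0

x, y = (0.5, 0.0), (0.0, -0.25)
assert quad_f(x, Q, b, c) <= 0 and quad_f(y, Q, b, c) <= 0   # x, y in E
for k in range(11):
    lam = k / 10
    z = (lam * x[0] + (1 - lam) * y[0], lam * x[1] + (1 - lam) * y[1])
    # convexity of f for PSD Q: f(z) <= lam*f(x) + (1-lam)*f(y) <= 0
    lhs = quad_f(z, Q, b, c)
    rhs = lam * quad_f(x, Q, b, c) + (1 - lam) * quad_f(y, Q, b, c)
    assert lhs <= rhs + 1e-12
```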

Algebraic Operations Preserving Convexity

Lemma. Let $C_i \subseteq \mathbb{R}^n$ be a convex set for any $i \in I$, where $I$ is an index set (possibly infinite). Then the set $\bigcap_{i \in I} C_i$ is convex.

Proof. In class.

Example: the set $P = \{x \in \mathbb{R}^n : Ax \le b\}$, where $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$, is called a convex polyhedron, and it is indeed convex. Why?
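The "why" is the intersection lemma: $P$ is the intersection of the $m$ half-spaces $\{x : a_i^T x \le b_i\}$, each of which is convex. A short sketch (an illustration, not from the slides; the triangle below is made up) checks that convex combinations of feasible points stay feasible:

```python
def feasible(A, b, x, tol=1e-12):
    """x satisfies Ax <= b, i.e. x lies in every half-space {x : a_i^T x <= b_i}."""
    return all(sum(aij * xj for aij, xj in zip(row, x)) <= bi + tol
               for row, bi in zip(A, b))

# P = {x in R^2 : x1 + x2 <= 2, -x1 <= 0, -x2 <= 0} (a triangle)
A = [(1.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
b = [2.0, 0.0, 0.0]

x, y = (2.0, 0.0), (0.0, 2.0)
assert feasible(A, b, x) and feasible(A, b, y)
for k in range(11):
    lam = k / 10
    z = (lam * x[0] + (1 - lam) * y[0], lam * x[1] + (1 - lam) * y[1])
    assert feasible(A, b, z)   # each half-space is convex, so the intersection is
```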

Algebraic Operations Preserving Convexity

Preservation under addition, cartesian product, and forward and inverse linear mappings.

Theorem.
1. Let $C_1, C_2, \ldots, C_k \subseteq \mathbb{R}^n$ be convex sets and let $\mu_1, \mu_2, \ldots, \mu_k \in \mathbb{R}$. Then the set $\mu_1 C_1 + \mu_2 C_2 + \cdots + \mu_k C_k$ is convex.
2. Let $C_i \subseteq \mathbb{R}^{k_i}$, $i = 1, \ldots, m$, be convex sets. Then the cartesian product $C_1 \times C_2 \times \cdots \times C_m = \{(x_1, x_2, \ldots, x_m) : x_i \in C_i,\ i = 1, 2, \ldots, m\}$ is convex.
3. Let $M \subseteq \mathbb{R}^n$ be a convex set and let $A \in \mathbb{R}^{m \times n}$. Then the image $A(M) = \{Ax : x \in M\}$ is convex.
4. Let $D \subseteq \mathbb{R}^m$ be convex and let $A \in \mathbb{R}^{m \times n}$. Then the inverse image $A^{-1}(D) = \{x \in \mathbb{R}^n : Ax \in D\}$ is convex.
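Part 3 follows from linearity: the image of a convex combination is the same convex combination of the images. A sketch (not from the slides; the map and segment below are made up) verifies $A(\lambda u + (1-\lambda)v) = \lambda Au + (1-\lambda)Av$ numerically:

```python
def matvec(A, x):
    """Matrix-vector product for A given as a list of rows."""
    return tuple(sum(aij * xj for aij, xj in zip(row, x)) for row in A)

A = [(1.0, 2.0), (0.0, 1.0), (3.0, -1.0)]   # a linear map R^2 -> R^3

# M is the segment [u, v] in R^2, a convex set
u, v = (1.0, 0.0), (0.0, 1.0)
Au, Av = matvec(A, u), matvec(A, v)
for k in range(11):
    lam = k / 10
    x = tuple(lam * ui + (1 - lam) * vi for ui, vi in zip(u, v))   # x in M
    Ax = matvec(A, x)
    # by linearity, Ax is the same convex combination of Au and Av
    expect = tuple(lam * p + (1 - lam) * q for p, q in zip(Au, Av))
    assert all(abs(a - e) < 1e-12 for a, e in zip(Ax, expect))
```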

Convex Combinations

Given $m$ points $x_1, x_2, \ldots, x_m \in \mathbb{R}^n$, a convex combination of these $m$ points is a vector of the form $\lambda_1 x_1 + \lambda_2 x_2 + \cdots + \lambda_m x_m$, where $\lambda_1, \lambda_2, \ldots, \lambda_m$ are nonnegative numbers satisfying $\lambda_1 + \lambda_2 + \cdots + \lambda_m = 1$.

A convex set is defined by the property that any convex combination of two points from the set is also in the set. We will now show that a convex combination of any number of points from a convex set is in the set.

Convex Combinations

Theorem. Let $C \subseteq \mathbb{R}^n$ be a convex set and let $x_1, x_2, \ldots, x_m \in C$. Then for any $\lambda \in \Delta_m$ (the unit simplex $\Delta_m = \{\lambda \in \mathbb{R}^m_+ : \sum_{i=1}^m \lambda_i = 1\}$), the relation $\sum_{i=1}^m \lambda_i x_i \in C$ holds.

Proof by induction on $m$. For $m = 1$ the result is obvious. The induction hypothesis is that for any $m$ vectors $x_1, x_2, \ldots, x_m \in C$ and any $\lambda \in \Delta_m$, the vector $\sum_{i=1}^m \lambda_i x_i$ belongs to $C$. We will now prove the theorem for $m + 1$ vectors. Suppose that $x_1, x_2, \ldots, x_{m+1} \in C$ and that $\lambda \in \Delta_{m+1}$. We will show that $z \equiv \sum_{i=1}^{m+1} \lambda_i x_i \in C$. If $\lambda_{m+1} = 1$, then $z = x_{m+1} \in C$ and the result obviously follows. If $\lambda_{m+1} < 1$, then
$$z = \sum_{i=1}^{m} \lambda_i x_i + \lambda_{m+1} x_{m+1} = (1 - \lambda_{m+1}) \underbrace{\sum_{i=1}^{m} \frac{\lambda_i}{1 - \lambda_{m+1}} x_i}_{v} + \lambda_{m+1} x_{m+1}.$$
By the induction hypothesis $v \in C$, and hence $z = (1 - \lambda_{m+1})v + \lambda_{m+1} x_{m+1} \in C$.
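The induction step is constructive: an $m$-point combination can be evaluated as a nest of two-point combinations. A small sketch (not from the slides) mirrors this reduction and compares it against direct evaluation:

```python
def two_point_reduction(points, lam):
    """Evaluate sum_i lam_i x_i as nested two-point combinations
    z = (1 - lam_m)*v + lam_m*x_m, mirroring the induction step."""
    if len(points) == 1:
        return points[0]
    lam_last = lam[-1]
    if lam_last == 1.0:
        return points[-1]
    # renormalized weights of the first m-1 points define v
    inner = [l / (1 - lam_last) for l in lam[:-1]]
    v = two_point_reduction(points[:-1], inner)
    return tuple((1 - lam_last) * vi + lam_last * xi
                 for vi, xi in zip(v, points[-1]))

pts = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
lam = [0.25, 0.25, 0.5]
z = two_point_reduction(pts, lam)
# direct evaluation of the convex combination for comparison
direct = tuple(sum(l * p[d] for l, p in zip(lam, pts)) for d in range(2))
assert all(abs(a - b) < 1e-12 for a, b in zip(z, direct))
```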

The Convex Hull

Definition. Let $S \subseteq \mathbb{R}^n$. The convex hull of $S$, denoted by $\mathrm{conv}(S)$, is the set comprising all the convex combinations of vectors from $S$:
$$\mathrm{conv}(S) \equiv \left\{ \sum_{i=1}^{k} \lambda_i x_i : x_1, x_2, \ldots, x_k \in S,\ \lambda \in \Delta_k \right\}.$$

[Figure: a nonconvex set $C$ and its convex hull $\mathrm{conv}(C)$]

The Convex Hull

Lemma. Let $S \subseteq \mathbb{R}^n$. If $S \subseteq T$ for some convex set $T$, then $\mathrm{conv}(S) \subseteq T$. In this sense, $\mathrm{conv}(S)$ is the smallest convex set containing $S$.

Proof. Suppose that $S \subseteq T$ for some convex set $T$. To prove that $\mathrm{conv}(S) \subseteq T$, take $z \in \mathrm{conv}(S)$. There exist $x_1, x_2, \ldots, x_k \in S \subseteq T$ (where $k$ is a positive integer) and $\lambda \in \Delta_k$ such that $z = \sum_{i=1}^{k} \lambda_i x_i$. Since $x_1, x_2, \ldots, x_k \in T$ and $T$ is convex, it follows that $z \in T$, showing the desired result.

Carathéodory Theorem

Theorem. Let $S \subseteq \mathbb{R}^n$ and let $x \in \mathrm{conv}(S)$. Then there exist $x_1, x_2, \ldots, x_{n+1} \in S$ such that $x \in \mathrm{conv}(\{x_1, x_2, \ldots, x_{n+1}\})$; that is, there exists $\lambda \in \Delta_{n+1}$ such that
$$x = \sum_{i=1}^{n+1} \lambda_i x_i.$$

Proof. Let $x \in \mathrm{conv}(S)$. Then there exist $x_1, x_2, \ldots, x_k \in S$ and $\lambda \in \Delta_k$ such that $x = \sum_{i=1}^{k} \lambda_i x_i$. We can assume that $\lambda_i > 0$ for all $i = 1, 2, \ldots, k$. If $k \le n + 1$, the result is proven. Otherwise, if $k \ge n + 2$, then the vectors $x_2 - x_1, x_3 - x_1, \ldots, x_k - x_1$, being more than $n$ vectors in $\mathbb{R}^n$, are necessarily linearly dependent, so there exist $\mu_2, \mu_3, \ldots, \mu_k$, not all zeros, such that $\sum_{i=2}^{k} \mu_i (x_i - x_1) = 0$.

Proof of Carathéodory Theorem (Cont.)

Defining $\mu_1 \equiv -\sum_{i=2}^{k} \mu_i$, we obtain that $\sum_{i=1}^{k} \mu_i x_i = 0$, where not all of the coefficients $\mu_1, \mu_2, \ldots, \mu_k$ are zeros and $\sum_{i=1}^{k} \mu_i = 0$. Hence there exists an index $i$ for which $\mu_i < 0$. Let $\alpha \in \mathbb{R}_+$. Then
$$x = \sum_{i=1}^{k} \lambda_i x_i = \sum_{i=1}^{k} \lambda_i x_i + \alpha \sum_{i=1}^{k} \mu_i x_i = \sum_{i=1}^{k} (\lambda_i + \alpha \mu_i) x_i. \quad (1)$$
We have $\sum_{i=1}^{k} (\lambda_i + \alpha \mu_i) = 1$, so (1) is a convex combination representation iff
$$\lambda_i + \alpha \mu_i \ge 0 \quad \text{for all } i = 1, \ldots, k. \quad (2)$$
Since $\lambda_i > 0$ for all $i$, it follows that (2) is satisfied for all $\alpha \in [0, \varepsilon]$, where $\varepsilon = \min_{i : \mu_i < 0} \left\{ \frac{\lambda_i}{-\mu_i} \right\}$.

Proof of Carathéodory Theorem (Cont.)

If we substitute $\alpha = \varepsilon$, then (2) still holds, but $\lambda_j + \varepsilon \mu_j = 0$ for $j \in \mathrm{argmin}_{i : \mu_i < 0} \left\{ \frac{\lambda_i}{-\mu_i} \right\}$. This means that we have found a representation of $x$ as a convex combination of $k - 1$ (or fewer) vectors. This process can be carried on until a representation of $x$ as a convex combination of no more than $n + 1$ vectors is derived.

Example

For $n = 2$, consider the four vectors
$$x_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \quad x_2 = \begin{pmatrix} 1 \\ 2 \end{pmatrix}, \quad x_3 = \begin{pmatrix} 2 \\ 1 \end{pmatrix}, \quad x_4 = \begin{pmatrix} 2 \\ 2 \end{pmatrix},$$
and let $x \in \mathrm{conv}(\{x_1, x_2, x_3, x_4\})$ be given by
$$x = \frac{1}{8} x_1 + \frac{1}{4} x_2 + \frac{1}{2} x_3 + \frac{1}{8} x_4 = \begin{pmatrix} 13/8 \\ 11/8 \end{pmatrix}.$$
Find a representation of $x$ as a convex combination of no more than 3 vectors. In class.
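The elimination step in the proof is fully constructive, so it can be carried out exactly. The sketch below (not from the slides) uses exact rational arithmetic and a small Gaussian-elimination routine to find a dependence $\mu$ with $\sum_i \mu_i x_i = 0$ and $\sum_i \mu_i = 0$, then applies the $\alpha$-step until at most $n + 1$ points remain:

```python
from fractions import Fraction

def null_vector(rows):
    """Return a nonzero rational vector v with A v = 0, for a matrix A
    (list of rows of Fractions) with more columns than rows."""
    A = [row[:] for row in rows]
    m, n = len(A), len(A[0])
    pivots, r = [], 0
    for c in range(n):
        piv = next((i for i in range(r, m) if A[i][c] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        pv = A[r][c]
        A[r] = [val / pv for val in A[r]]
        for i in range(m):
            if i != r and A[i][c] != 0:
                f = A[i][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
        if r == m:
            break
    free = next(c for c in range(n) if c not in pivots)  # a free column exists
    v = [Fraction(0)] * n
    v[free] = Fraction(1)
    for i, c in enumerate(pivots):
        v[c] = -A[i][free]
    return v

def caratheodory(points, lam):
    """Reduce x = sum_i lam_i x_i to a convex combination of <= n+1 points."""
    n = len(points[0])
    while True:
        keep = [i for i, l in enumerate(lam) if l > 0]
        points = [points[i] for i in keep]
        lam = [lam[i] for i in keep]
        k = len(points)
        if k <= n + 1:
            return points, lam
        # rows: coordinates of the points, plus a row of ones; a null vector
        # mu then satisfies sum mu_i x_i = 0 and sum mu_i = 0
        rows = [[Fraction(points[j][d]) for j in range(k)] for d in range(n)]
        rows.append([Fraction(1)] * k)
        mu = null_vector(rows)
        alpha = min(lam[i] / (-mu[i]) for i in range(k) if mu[i] < 0)
        lam = [lam[i] + alpha * mu[i] for i in range(k)]   # some lam_j becomes 0

pts = [(1, 1), (1, 2), (2, 1), (2, 2)]
weights = [Fraction(1, 8), Fraction(1, 4), Fraction(1, 2), Fraction(1, 8)]
new_pts, new_w = caratheodory(pts, weights)
```

For the example above this yields $x = \frac{3}{8} x_1 + \frac{1}{4} x_3 + \frac{3}{8} x_4$, a convex combination of three vectors, as the theorem guarantees.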

Convex Cones

A set $S$ is called a cone if it satisfies the following property: for any $x \in S$ and $\lambda \ge 0$, the inclusion $\lambda x \in S$ is satisfied. The following lemma shows that there is a very simple and elegant characterization of convex cones.

Lemma. A set $S$ is a convex cone if and only if the following properties hold:
A. $x, y \in S \Rightarrow x + y \in S$.
B. $x \in S,\ \lambda \ge 0 \Rightarrow \lambda x \in S$.

Proof. Simple exercise.

Examples of Convex Cones

The polyhedral cone $C = \{x \in \mathbb{R}^n : Ax \le 0\}$, where $A \in \mathbb{R}^{m \times n}$.

The Lorentz cone (or ice cream cone):
$$L^n = \left\{ \begin{pmatrix} x \\ t \end{pmatrix} \in \mathbb{R}^{n+1} : \|x\| \le t,\ x \in \mathbb{R}^n,\ t \in \mathbb{R} \right\}.$$

The cone of nonnegative polynomials: the set consisting of all possible coefficients of polynomials of degree $n - 1$ which are nonnegative over $\mathbb{R}$:
$$K^n = \{x \in \mathbb{R}^n : x_1 t^{n-1} + x_2 t^{n-2} + \cdots + x_{n-1} t + x_n \ge 0 \ \text{for all } t \in \mathbb{R}\}.$$

[Figure: the Lorentz cone in $\mathbb{R}^3$]
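The two properties of the lemma are easy to check for the Lorentz cone: closure under addition follows from the triangle inequality, and closure under nonnegative scaling from the homogeneity of the norm. A sketch (not from the slides; the sample points are made up):

```python
import math

def in_lorentz(p):
    """Membership in L^n: p = (x_1, ..., x_n, t) with ||x||_2 <= t."""
    *x, t = p
    return math.sqrt(sum(xi * xi for xi in x)) <= t + 1e-12

u = (3.0, 4.0, 5.0)   # ||(3, 4)|| = 5 <= 5
v = (0.0, 1.0, 2.0)   # ||(0, 1)|| = 1 <= 2
assert in_lorentz(u) and in_lorentz(v)

# property A: closed under addition (via the triangle inequality)
s = tuple(a + b for a, b in zip(u, v))
assert in_lorentz(s)

# property B: closed under nonnegative scaling
for lam in (0.0, 0.5, 2.0):
    assert in_lorentz(tuple(lam * a for a in u))
```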

The Conic Hull

Definition. Given $m$ points $x_1, x_2, \ldots, x_m \in \mathbb{R}^n$, a conic combination of these $m$ points is a vector of the form $\lambda_1 x_1 + \lambda_2 x_2 + \cdots + \lambda_m x_m$, where $\lambda \in \mathbb{R}^m_+$.

The definition of the conic hull is now quite natural.

Definition. Let $S \subseteq \mathbb{R}^n$. Then the conic hull of $S$, denoted by $\mathrm{cone}(S)$, is the set comprising all the conic combinations of vectors from $S$:
$$\mathrm{cone}(S) \equiv \left\{ \sum_{i=1}^{k} \lambda_i x_i : x_1, x_2, \ldots, x_k \in S,\ \lambda \in \mathbb{R}^k_+ \right\}.$$

Similarly to the convex hull, the conic hull of a set $S$ is the smallest convex cone containing $S$.

Lemma. Let $S \subseteq \mathbb{R}^n$. If $S \subseteq T$ for some convex cone $T$, then $\mathrm{cone}(S) \subseteq T$.

Representation Theorem for Conic Hulls

A result similar to the Carathéodory theorem holds for conic hulls.

Conic Representation Theorem. Let $S \subseteq \mathbb{R}^n$ and let $x \in \mathrm{cone}(S)$. Then there exist $k$ linearly independent vectors $x_1, x_2, \ldots, x_k \in S$ such that $x \in \mathrm{cone}(\{x_1, x_2, \ldots, x_k\})$; that is, there exists $\lambda \in \mathbb{R}^k_+$ such that
$$x = \sum_{i=1}^{k} \lambda_i x_i.$$
In particular, $k \le n$.

The proof is very similar to the proof of the Carathéodory theorem; see page 107 of the book.

Basic Feasible Solutions

Consider the convex polyhedron
$$P = \{x \in \mathbb{R}^n : Ax = b,\ x \ge 0\} \quad (A \in \mathbb{R}^{m \times n},\ b \in \mathbb{R}^m),$$
where the rows of $A$ are assumed to be linearly independent. The above is a standard formulation of the constraints of a linear programming problem.

Definition. $\bar{x}$ is a basic feasible solution (abbreviated bfs) of $P$ if the columns of $A$ corresponding to the indices of the positive values of $\bar{x}$ are linearly independent.

Example. Consider the linear system
$$x_1 + x_2 + x_3 = 6, \quad x_2 + x_4 = 3, \quad x_1, x_2, x_3, x_4 \ge 0.$$
Find all the basic feasible solutions. In class.
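One way to work the example (an illustration, not from the slides) is to enumerate all choices of two basis columns, solve the resulting $2 \times 2$ system exactly by Cramer's rule, and keep the nonnegative solutions:

```python
from fractions import Fraction
from itertools import combinations

# the example system: A x = b, x >= 0
A = [[1, 1, 1, 0],
     [0, 1, 0, 1]]
b = [6, 3]

def solve_2x2(B, rhs):
    """Solve B y = rhs by Cramer's rule; return None if B is singular."""
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    if det == 0:
        return None
    y1 = Fraction(rhs[0] * B[1][1] - B[0][1] * rhs[1], det)
    y2 = Fraction(B[0][0] * rhs[1] - rhs[0] * B[1][0], det)
    return (y1, y2)

bfs = set()
for i, j in combinations(range(4), 2):
    B = [[A[0][i], A[0][j]], [A[1][i], A[1][j]]]
    y = solve_2x2(B, b)
    if y is None or y[0] < 0 or y[1] < 0:
        continue  # singular basis or infeasible (negative) solution
    x = [Fraction(0)] * 4
    x[i], x[j] = y
    bfs.add(tuple(x))
```

This produces the four basic feasible solutions $(3,3,0,0)$, $(6,0,0,3)$, $(0,3,3,0)$ and $(0,0,6,3)$; the candidate basis $\{a_2, a_4\}$ is rejected because it gives $x_4 = -3 < 0$, and $\{a_1, a_3\}$ because those columns are linearly dependent.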

Existence of bfs's

Theorem. Let $P = \{x \in \mathbb{R}^n : Ax = b,\ x \ge 0\}$, where $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$. If $P \neq \emptyset$, then it contains at least one bfs.

Proof. $P \neq \emptyset$ implies $b \in \mathrm{cone}(\{a_1, a_2, \ldots, a_n\})$, where $a_i$ denotes the $i$-th column of $A$. By the conic representation theorem, there exist indices $i_1 < i_2 < \cdots < i_k$ and $k$ numbers $y_{i_1}, y_{i_2}, \ldots, y_{i_k} \ge 0$ such that $b = \sum_{j=1}^{k} y_{i_j} a_{i_j}$ and $a_{i_1}, a_{i_2}, \ldots, a_{i_k}$ are linearly independent. Denote $\bar{x} = \sum_{j=1}^{k} y_{i_j} e_{i_j}$. Then obviously $\bar{x} \ge 0$, and in addition
$$A\bar{x} = \sum_{j=1}^{k} y_{i_j} A e_{i_j} = \sum_{j=1}^{k} y_{i_j} a_{i_j} = b.$$
Therefore, $\bar{x}$ is contained in $P$, and the columns of $A$ corresponding to the indices of the positive components of $\bar{x}$ are linearly independent, meaning that $P$ contains a bfs.

Topological Properties of Convex Sets

Theorem. Let $C \subseteq \mathbb{R}^n$ be a convex set. Then $\mathrm{cl}(C)$ is a convex set.

Proof. Let $x, y \in \mathrm{cl}(C)$ and let $\lambda \in [0, 1]$. There exist sequences $\{x_k\}_{k \ge 0} \subseteq C$ and $\{y_k\}_{k \ge 0} \subseteq C$ for which $x_k \to x$ and $y_k \to y$ as $k \to \infty$. (*)
By the convexity of $C$, $\lambda x_k + (1 - \lambda) y_k \in C$ for any $k \ge 0$. (**)
Since $\lambda x_k + (1 - \lambda) y_k \to \lambda x + (1 - \lambda) y$, combining (*) and (**) gives $\lambda x + (1 - \lambda) y \in \mathrm{cl}(C)$.

The Line Segment Principle

Theorem. Let $C$ be a convex set and assume that $\mathrm{int}(C) \neq \emptyset$. Suppose that $x \in \mathrm{int}(C)$ and $y \in \mathrm{cl}(C)$. Then $(1 - \lambda)x + \lambda y \in \mathrm{int}(C)$ for any $\lambda \in [0, 1)$.

Proof. There exists $\varepsilon > 0$ such that $B(x, \varepsilon) \subseteq C$. Let $z = (1 - \lambda)x + \lambda y$. We will show that $B(z, (1 - \lambda)\varepsilon) \subseteq C$. Let $w \in B(z, (1 - \lambda)\varepsilon)$. Since $y \in \mathrm{cl}(C)$, there exists $w_1 \in C$ such that
$$\|w_1 - y\| < \frac{(1 - \lambda)\varepsilon - \|w - z\|}{\lambda}. \quad (3)$$
Set $w_2 = \frac{1}{1 - \lambda}(w - \lambda w_1)$. Then
$$\|w_2 - x\| = \left\| \frac{w - \lambda w_1}{1 - \lambda} - x \right\| = \frac{1}{1 - \lambda}\|(w - z) + \lambda(y - w_1)\| \le \frac{1}{1 - \lambda}\left(\|w - z\| + \lambda \|w_1 - y\|\right) \overset{(3)}{<} \varepsilon.$$
Hence, since $B(x, \varepsilon) \subseteq C$, it follows that $w_2 \in C$. Finally, since $w = \lambda w_1 + (1 - \lambda) w_2$ with $w_1, w_2 \in C$, we have that $w \in C$.

Convexity of the Interior

Theorem. Let $C \subseteq \mathbb{R}^n$ be a convex set. Then $\mathrm{int}(C)$ is convex.

Proof. If $\mathrm{int}(C) = \emptyset$, then the theorem is obviously true. Otherwise, let $x_1, x_2 \in \mathrm{int}(C)$ and let $\lambda \in (0, 1)$. By the line segment principle, $\lambda x_1 + (1 - \lambda)x_2 \in \mathrm{int}(C)$, establishing the convexity of $\mathrm{int}(C)$.

Combination of Closure and Interior

Lemma. Let $C$ be a convex set with a nonempty interior. Then
1. $\mathrm{cl}(\mathrm{int}(C)) = \mathrm{cl}(C)$.
2. $\mathrm{int}(\mathrm{cl}(C)) = \mathrm{int}(C)$.

Proof of 1. Obviously, $\mathrm{cl}(\mathrm{int}(C)) \subseteq \mathrm{cl}(C)$ holds. To prove the opposite inclusion, let $x \in \mathrm{cl}(C)$ and $y \in \mathrm{int}(C)$. By the line segment principle,
$$x_k = \frac{1}{k} y + \left(1 - \frac{1}{k}\right)x \in \mathrm{int}(C) \quad \text{for any } k \ge 1.$$
Since $x$ is the limit (as $k \to \infty$) of the sequence $\{x_k\}_{k \ge 1} \subseteq \mathrm{int}(C)$, it follows that $x \in \mathrm{cl}(\mathrm{int}(C))$.

For the proof of 2, see pages 109-110 of the book (Lemma 6.30(b)).

Compactness of the Convex Hull of Compact Sets

Theorem. Let $S \subseteq \mathbb{R}^n$ be a compact set. Then $\mathrm{conv}(S)$ is compact.

Proof. There exists $M > 0$ such that $\|x\| \le M$ for any $x \in S$. Let $y \in \mathrm{conv}(S)$. By the Carathéodory theorem, there exist $x_1, x_2, \ldots, x_{n+1} \in S$ and $\lambda \in \Delta_{n+1}$ for which $y = \sum_{i=1}^{n+1} \lambda_i x_i$, and therefore
$$\|y\| = \left\| \sum_{i=1}^{n+1} \lambda_i x_i \right\| \le \sum_{i=1}^{n+1} \lambda_i \|x_i\| \le M \sum_{i=1}^{n+1} \lambda_i = M,$$
establishing the boundedness of $\mathrm{conv}(S)$.

To prove the closedness of $\mathrm{conv}(S)$, let $\{y_k\}_{k \ge 1} \subseteq \mathrm{conv}(S)$ be a sequence converging to $y \in \mathbb{R}^n$. There exist $x_1^k, x_2^k, \ldots, x_{n+1}^k \in S$ and $\lambda^k \in \Delta_{n+1}$ such that
$$y_k = \sum_{i=1}^{n+1} \lambda_i^k x_i^k. \quad (4)$$

Proof (Cont.)

By the compactness of $S$ and $\Delta_{n+1}$, the sequence $\{(\lambda^k, x_1^k, x_2^k, \ldots, x_{n+1}^k)\}_{k \ge 1}$ has a convergent subsequence $\{(\lambda^{k_j}, x_1^{k_j}, x_2^{k_j}, \ldots, x_{n+1}^{k_j})\}_{j \ge 1}$, whose limit will be denoted by $(\lambda, x_1, x_2, \ldots, x_{n+1})$, with $\lambda \in \Delta_{n+1}$ and $x_1, x_2, \ldots, x_{n+1} \in S$. Taking the limit $j \to \infty$ in
$$y_{k_j} = \sum_{i=1}^{n+1} \lambda_i^{k_j} x_i^{k_j},$$
we obtain that $y = \sum_{i=1}^{n+1} \lambda_i x_i \in \mathrm{conv}(S)$, as required.

Example: $S = \{(0, 0)^T\} \cup \{(x, y)^T : xy \ge 1\}$.
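The example shows why compactness (not just closedness) is needed: $S$ is closed but unbounded, and its convex hull fails to be closed. Restricting attention to the hyperbola branch with $x, y > 0$ for concreteness (an assumption made here for illustration, not stated on the slide), the points $(1, \varepsilon)$ lie in $\mathrm{conv}(S)$ for every $\varepsilon > 0$, yet their limit $(1, 0)$ does not:

```python
import math

def hull_point(eps):
    """Convex combination of (0, 0) and a point (a, 1/a) on the hyperbola
    giving exactly (1, eps): take lam = sqrt(eps), a = 1/sqrt(eps)."""
    lam = math.sqrt(eps)
    a = 1.0 / lam
    # lam*(a, 1/a) + (1 - lam)*(0, 0)
    return (lam * a, lam * (1.0 / a))

for eps in (1e-2, 1e-4, 1e-8):
    x, y = hull_point(eps)
    assert abs(x - 1.0) < 1e-12 and abs(y - eps) < 1e-15
# the points (1, eps) lie in conv(S) for every eps > 0, but their limit
# (1, 0) is not a convex combination of points of S
```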

Closedness of the Conic Hull of a Finite Set

Theorem. Let $a_1, a_2, \ldots, a_k \in \mathbb{R}^n$. Then $\mathrm{cone}(\{a_1, a_2, \ldots, a_k\})$ is closed.

Proof. By the conic representation theorem, each element of $\mathrm{cone}(\{a_1, a_2, \ldots, a_k\})$ can be represented as a conic combination of a linearly independent subset of $\{a_1, a_2, \ldots, a_k\}$. Therefore, if $S_1, S_2, \ldots, S_N$ are all the subsets of $\{a_1, a_2, \ldots, a_k\}$ comprising linearly independent vectors, then
$$\mathrm{cone}(\{a_1, a_2, \ldots, a_k\}) = \bigcup_{i=1}^{N} \mathrm{cone}(S_i).$$
Since a finite union of closed sets is closed, it is enough to show that $\mathrm{cone}(S_i)$ is closed for any $i \in \{1, 2, \ldots, N\}$. Indeed, let $i \in \{1, 2, \ldots, N\}$ and write $S_i = \{b_1, b_2, \ldots, b_m\}$, where $b_1, b_2, \ldots, b_m$ are linearly independent. Then $\mathrm{cone}(S_i) = \{By : y \in \mathbb{R}^m_+\}$, where $B$ is the matrix whose columns are $b_1, b_2, \ldots, b_m$.

Proof (Cont.)

Suppose that $x_k \in \mathrm{cone}(S_i)$ for all $k \ge 1$ and that $x_k \to x$. There exist $y_k \in \mathbb{R}^m_+$ such that
$$x_k = B y_k. \quad (5)$$
Since $B$ has full column rank, $y_k = (B^T B)^{-1} B^T x_k$. Taking the limit as $k \to \infty$ in the last equation, we obtain that $y_k \to \bar{y}$, where $\bar{y} = (B^T B)^{-1} B^T x$. Since $y_k \in \mathbb{R}^m_+$ for all $k$, also $\bar{y} \in \mathbb{R}^m_+$. Thus, taking the limit in (5), we conclude that $x = B\bar{y}$ with $\bar{y} \in \mathbb{R}^m_+$, and hence $x \in \mathrm{cone}(S_i)$.

Extreme Points

Definition. Let $S \subseteq \mathbb{R}^n$ be a convex set. A point $x \in S$ is called an extreme point of $S$ if there do not exist $x_1, x_2 \in S$ ($x_1 \neq x_2$) and $\lambda \in (0, 1)$ such that $x = \lambda x_1 + (1 - \lambda) x_2$. The set of extreme points is denoted by $\mathrm{ext}(S)$.

For example, the set of extreme points of a convex polytope consists of all its vertices.

[Figure: a polytope with vertices $x_1, x_2, x_3, x_4$ as its extreme points]
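The definition can be probed numerically: a point fails to be extreme exactly when it is the midpoint of some small segment contained in the set. For the unit square, probing a few fixed directions is enough to separate the corners from edge and interior points (a sketch, not from the slides; the finite direction set suffices only for this axis-aligned example):

```python
SQUARE_CORNERS = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]

def in_square(p):
    """Membership in the unit square [0, 1] x [0, 1]."""
    return 0.0 <= p[0] <= 1.0 and 0.0 <= p[1] <= 1.0

def looks_extreme(p, dirs=((1, 0), (0, 1), (1, 1), (1, -1)), step=1e-3):
    """p is NOT extreme if p is the midpoint of a short segment [x1, x2]
    that lies inside the square; probe a few directions for such a segment."""
    for d in dirs:
        x1 = (p[0] + step * d[0], p[1] + step * d[1])
        x2 = (p[0] - step * d[0], p[1] - step * d[1])
        if in_square(x1) and in_square(x2):
            return False   # p = (x1 + x2)/2 with x1, x2 in the square
    return True

assert all(looks_extreme(c) for c in SQUARE_CORNERS)   # corners are extreme
assert not looks_extreme((0.5, 0.0))                   # an edge midpoint is not
assert not looks_extreme((0.5, 0.5))                   # neither is the center
```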

Equivalence Between bfs's and Extreme Points

Theorem. Let $P = \{x \in \mathbb{R}^n : Ax = b,\ x \ge 0\}$, where $A \in \mathbb{R}^{m \times n}$ has linearly independent rows and $b \in \mathbb{R}^m$. Then $\bar{x}$ is a basic feasible solution of $P$ if and only if it is an extreme point of $P$.

For the proof, see Theorem 6.34 in the book.

Krein-Milman Theorem

Theorem. Let $S \subseteq \mathbb{R}^n$ be a compact convex set. Then $S = \mathrm{conv}(\mathrm{ext}(S))$.