Convex Optimization in Computer Vision: Segmentation and Multiview 3D Reconstruction


1 Convex Optimization in Computer Vision: Segmentation and Multiview 3D Reconstruction
Yiyong Feng and Daniel P. Palomar
The Hong Kong University of Science and Technology (HKUST)
ELEC 5470 - Convex Optimization, Fall semester, HKUST, Hong Kong

2 Outline of Lecture
1. Problem Setup
   - Motivation
   - General Problem Formulation
2. Convex Relaxation and Optimality
3. Applications: Image Segmentation and 3D Reconstruction
   - Segmentation
   - Multiview 3D Reconstruction

3 Outline of Lecture (highlighting Part 1: Problem Setup)

4 Motivation: Segmentation
Given an image, can we partition it into different regions?

5 Motivation: Multiview Reconstruction
Can we reconstruct a 3D object from multiview 2D images?

6 Motivation → Formulation?
How can we formulate the above two problems mathematically?

7 Basic Concepts
Denote the space of interest by $\Omega \subset \mathbb{R}^d$:
- 2D image segmentation: $d = 2$
- 3D reconstruction: $d = 3$
Find a partition (into Caccioppoli sets) with $k+1$ regions $\Omega_i$, $i = 0, \ldots, k$, such that $\bigcup_{i=0}^{k} \Omega_i = \Omega$ and $\Omega_s \cap \Omega_t = \emptyset$ for all $s \neq t$.
Weighted perimeter: $\mathrm{Per}_g(\Omega_i; \Omega) \triangleq \int_\Omega g(x)\,|D \mathbf{1}_{\{x \in \Omega_i\}}|$, where $D$ denotes the distributional derivative.
Roughly speaking:
- if $\Omega_i$ is in 2D, then it is the total weighted LENGTH of $\Omega_i$'s boundary CURVE!
- if $\Omega_i$ is in 3D, then it is the total weighted AREA of $\Omega_i$'s boundary SURFACE!
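To make the weighted perimeter concrete, here is a minimal NumPy sketch of its discrete counterpart on a pixel grid, approximating $|D\mathbf{1}|$ by forward differences; the function name and discretization are our own illustration, not from the slides.

```python
import numpy as np

def weighted_perimeter(mask, g):
    """Discrete Per_g(Omega_i; Omega): sum over pixels of
    g(x) * |forward-difference gradient of the indicator 1_{Omega_i}|."""
    ind = mask.astype(float)
    gx = np.zeros_like(ind)
    gy = np.zeros_like(ind)
    gx[:, :-1] = ind[:, 1:] - ind[:, :-1]   # horizontal jumps of the indicator
    gy[:-1, :] = ind[1:, :] - ind[:-1, :]   # vertical jumps of the indicator
    return (g * np.hypot(gx, gy)).sum()

# A 4x4 square in a 16x16 grid with g = 1: the result is ~15.4,
# close to the true boundary length of 16.
mask = np.zeros((16, 16), dtype=bool)
mask[6:10, 6:10] = True
print(weighted_perimeter(mask, np.ones((16, 16))))
```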

8 Estimated Parameter Functions
- $g$: nonnegative inhomogeneous metric. One example: $g(x) = \frac{1}{1 + |\nabla I(x)|}$, where $I(x)$ is the intensity or color value of point $x$.
- $f_i$: nonnegative weight functions that denote how likely a point is to belong to the region $\Omega_i$. One example: $f_i(x) = -\log P_i(x)$, the negative log-likelihood of observing that a certain point $x$ belongs to $\Omega_i$. Note that the smaller $f_i(x)$ is, the more likely $x \in \Omega_i$!
Usually, the $f_i$ and $g$ functions can be estimated in different ways using different information.
Here, we focus on the optimization part and assume the $f_i$'s and $g$ are already given!
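As a concrete illustration, the sketch below builds both ingredients from a grayscale image: an edge-stopping metric $g$ and squared-error data terms $f_i$ (a Gaussian negative log-likelihood up to constants). The function names and the choice of region means are our assumptions.

```python
import numpy as np

def edge_metric(I):
    """g(x) = 1 / (1 + |grad I(x)|): small across strong edges,
    so region boundaries are encouraged to align with them."""
    gy, gx = np.gradient(I.astype(float))
    return 1.0 / (1.0 + np.hypot(gx, gy))

def region_costs(I, means):
    """f_i(x) = (I(x) - c_i)^2 for assumed region means c_i
    (equals a Gaussian negative log-likelihood up to constants)."""
    return np.stack([(I.astype(float) - c) ** 2 for c in means])

I = np.random.rand(64, 64)                 # toy image with values in [0, 1]
g = edge_metric(I)
f = region_costs(I, means=[0.2, 0.8])      # shape (2, 64, 64): f_0 and f_1
```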

9 General Problem Formulation
Define the regularized energy (with trade-off parameter $\nu > 0$) as
$$E(\{\Omega_i\}) = \sum_{i=0}^{k} \int_{\Omega_i} f_i(x)\,dx + \nu\,\frac{1}{2} \sum_{i=0}^{k} \mathrm{Per}_g(\Omega_i; \Omega)$$
where
- first term: prior likelihood
- second term: regularity via the weighted perimeter of the partition $\{\Omega_i\}$
Goal: minimize the regularized energy
$$\begin{array}{ll} \underset{\{\Omega_i\}}{\text{minimize}} & E(\{\Omega_i\}) \\ \text{subject to} & \bigcup_{i=0}^{k} \Omega_i = \Omega, \quad \Omega_s \cap \Omega_t = \emptyset \ \forall s \neq t \end{array}$$

10 Outline of Lecture (highlighting Part 2: Convex Relaxation and Optimality)

11 Convex Relaxation Part
As pointed out, the $f_i$'s and $g$ are assumed to be given.
We focus on the convex relaxation part (the red dashed region of the pipeline figure).

12 Implicit Representation, Case k = 1 (2 Regions)
Consider now only TWO regions.
Clearly we just need one set $\Omega_1$, and $\Omega_0$ will simply be its complement.
Represent the partition implicitly by the indicator function $\theta : \Omega \to \{0, 1\}$ of $\Omega_1$, i.e., $\theta = \mathbf{1}_{\Omega_1}$ and $1 - \theta = \mathbf{1}_{\Omega_0}$.

13 Implicit Reformulation, Case k = 1 (2 Regions)
The regularized energy minimization amounts to
$$\begin{array}{ll} \underset{\Omega_1}{\text{minimize}} & \int_\Omega \left( f_1(x)\,\mathbf{1}_{\Omega_1} + f_0(x)\,(1 - \mathbf{1}_{\Omega_1}) \right) dx + \nu\,\mathrm{Per}_g(\Omega_1; \Omega) \\ \text{subject to} & \Omega_1 \subseteq \Omega \end{array}$$
where $\mathrm{Per}_g(\Omega_1; \Omega) = \int_\Omega g(x)\,|D\mathbf{1}_{\{x \in \Omega_1\}}| = \int_\Omega g(x)\,|D\theta(x)|$.
Equivalent implicit representation:
$$\begin{array}{ll} \underset{\theta}{\text{minimize}} & E(\theta) \triangleq \int_\Omega \left( f_1(x)\,\theta(x) + f_0(x)\,(1 - \theta(x)) \right) dx + \nu \int_\Omega g(x)\,|D\theta(x)| \\ \text{subject to} & \theta \in \Theta \end{array}$$
where $\Theta \triangleq \{\theta : \Omega \to \{0, 1\}\}$ is a set of mappings!

14 Convexity, Case k = 1 (2 Regions)
For any $\alpha \in [0, 1]$ and any $\theta^1, \theta^2 \in \Theta$,
$$D\left( \alpha \theta^1 + (1 - \alpha)\theta^2 \right) = \alpha D\theta^1 + (1 - \alpha) D\theta^2,$$
so by the triangle inequality
$$\left| D\left( \alpha \theta^1 + (1 - \alpha)\theta^2 \right) \right| \le \alpha |D\theta^1| + (1 - \alpha) |D\theta^2|.$$
Then, since the data term is linear in $\theta$, we must have
$$E\left( \alpha \theta^1 + (1 - \alpha)\theta^2 \right) \le \alpha E(\theta^1) + (1 - \alpha) E(\theta^2).$$
The objective $E(\theta)$ is convex in $\theta$!

15 Convexity, Case k = 1 (2 Regions)
Recall that $\Theta = \{\theta \mid \theta : \Omega \to \{0, 1\}\}$.
For any distinct $\theta^1, \theta^2 \in \Theta$, obviously for any $\alpha \in (0, 1)$,
$$\alpha \theta^1 + (1 - \alpha)\theta^2 \notin \Theta.$$
Thus, $\Theta$ is nonconvex! The problem is nonconvex!

16 Convex Relaxation, Case k = 1 (2 Regions)
Relax $\Theta$ to the convex set $\tilde{\Theta} \triangleq \{\theta \mid \theta : \Omega \to [0, 1]\}$.
Regularized energy minimization relaxation:
$$\begin{array}{ll} \underset{\theta}{\text{minimize}} & \int_\Omega \left( f_1(x)\,\theta(x) + f_0(x)\,(1 - \theta(x)) \right) dx + \nu \int_\Omega g(x)\,|D\theta(x)| \\ \text{subject to} & \theta \in \tilde{\Theta} \end{array}$$
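On a discrete pixel grid, this relaxed problem can be solved with a first-order primal-dual scheme in the style of Chambolle and Pock. The sketch below is our own minimal illustration, writing $f \triangleq f_1 - f_0$ and dropping the constant (as on the pre-proof slide below); the discretization, step sizes, and iteration count are assumptions, not prescriptions from the slides.

```python
import numpy as np

def grad(u):
    """Forward differences with Neumann boundary (same pattern as the perimeter sketch)."""
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def div(px, py):
    """Negative adjoint of grad (backward-difference divergence)."""
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[:, 0] = px[:, 0]; dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]; dx[:, -1] = -px[:, -2]
    dy[0, :] = py[0, :]; dy[1:-1, :] = py[1:-1, :] - py[:-2, :]; dy[-1, :] = -py[-2, :]
    return dx + dy

def relaxed_two_region(f, g, nu=1.0, iters=500):
    """Primal-dual solver for  min_{theta in [0,1]} <f, theta> + nu * sum g |grad theta|,
    with f = f1 - f0. The weighted TV term is dualized as
    max_{|xi(x)| <= nu*g(x)} <grad theta, xi>."""
    theta = np.zeros_like(f); theta_bar = theta.copy()
    px = np.zeros_like(f); py = np.zeros_like(f)
    tau = sigma = 1.0 / np.sqrt(8.0)        # tau * sigma * ||grad||^2 <= 1 in 2D
    for _ in range(iters):
        # Dual ascent, then project each xi(x) onto the ball of radius nu*g(x).
        gx, gy = grad(theta_bar)
        px += sigma * gx; py += sigma * gy
        scale = np.maximum(1.0, np.hypot(px, py) / (nu * g + 1e-12))
        px /= scale; py /= scale
        # Primal descent, then project pointwise onto [0, 1].
        theta_new = np.clip(theta + tau * (div(px, py) - f), 0.0, 1.0)
        theta_bar = 2.0 * theta_new - theta
        theta = theta_new
    return theta
```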

17 Optimality, Case k = 1 (2 Regions)
Proposition. Thresholding a minimizer $\theta^\star(x)$ of the relaxed problem leads to another optimal solution:
$$\theta^{\mathrm{opt}}(x) = \mathbf{1}_{\{\theta^\star(x) \ge \mu\}} = \begin{cases} 1, & \text{if } \theta^\star(x) \ge \mu \\ 0, & \text{if } \theta^\star(x) < \mu \end{cases}$$
for any threshold $\mu \in (0, 1)$.
But now observe that $\mathbf{1}_{\{\theta^\star(x) \ge \mu\}}$ is also a minimizer of the original binary problem. Why? Because it is binary, hence feasible for the original problem, while attaining the optimal value of the relaxation, which lower-bounds the original problem.
Thus, the convex relaxation is tight!
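In practice the proposition translates into a one-line post-processing step. A toy usage of the solver sketched above (the image, region means, and $\nu$ are arbitrary choices for illustration):

```python
import numpy as np
# Assumes relaxed_two_region and edge_metric from the sketches above.
I = np.zeros((64, 64)); I[16:48, 16:48] = 1.0     # toy image: a bright square
f = (I - 1.0) ** 2 - (I - 0.0) ** 2               # f = f1 - f0, region means 1 and 0
theta_star = relaxed_two_region(f, edge_metric(I), nu=0.5)
labels = (theta_star >= 0.5).astype(int)          # any mu in (0, 1) works in theory
```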

18 Pre-Proof, Case k = 1 (2 Regions)
Rewrite:
$$\begin{aligned} E(\theta) &= \int_\Omega \left( f_1(x)\,\theta(x) + f_0(x)\,(1 - \theta(x)) \right) dx + \nu \int_\Omega g(x)\,|D\theta(x)| \\ &= \int_\Omega \left( (f_1(x) - f_0(x))\,\theta(x) + f_0(x) \right) dx + \nu \int_\Omega g(x)\,|D\theta(x)| \\ &= \int_\Omega f(x)\,\theta(x)\,dx + \nu \int_\Omega g(x)\,|D\theta(x)| + C \end{aligned}$$
where $f(x) \triangleq f_1(x) - f_0(x)$ and $C \triangleq \int_\Omega f_0(x)\,dx$.
Ignoring the constant term, we will simply denote
$$E(\theta) \triangleq \int_\Omega f(x)\,\theta(x)\,dx + \nu \int_\Omega g(x)\,|D\theta(x)|.$$

19 Proof, Case k = 1 (2 Regions) I
Proof.
1. Layer cake formula: $\theta(x) = \int_0^1 \mathbf{1}_{\{\theta(x) \ge \mu\}}\,d\mu$.
2. Coarea formula and the fact that $\theta(x) \in [0, 1]$:
$$\int_\Omega g(x)\,|D\theta(x)| = \int_{-\infty}^{\infty} \mathrm{Per}_g(\{x : \theta(x) \ge \mu\}; \Omega)\,d\mu = \int_0^1 \mathrm{Per}_g(\{x : \theta(x) \ge \mu\}; \Omega)\,d\mu = \int_0^1 \int_\Omega g(x)\,|D\mathbf{1}_{\{\theta(x) \ge \mu\}}|\,d\mu.$$
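The layer cake formula is easy to sanity-check numerically; this little verification is our illustration, not part of the proof:

```python
import numpy as np
theta = np.random.rand(5)                          # theta(x) in [0, 1] at 5 points
mus = np.linspace(0.0, 1.0, 10001)
layers = (theta[:, None] >= mus[None, :]).astype(float)
layer_cake = np.trapz(layers, mus, axis=1)         # int_0^1 1{theta(x) >= mu} dmu
assert np.allclose(layer_cake, theta, atol=1e-3)   # recovers theta(x)
```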

20 Proof, Case k = 1 (2 Regions) II
3. Then
$$\begin{aligned} E(\theta) &= \int_\Omega f(x)\,\theta(x)\,dx + \nu \int_\Omega g(x)\,|D\theta(x)| \\ &= \int_\Omega f(x) \int_0^1 \mathbf{1}_{\{\theta(x) \ge \mu\}}\,d\mu\,dx + \nu \int_0^1 \int_\Omega g(x)\,|D\mathbf{1}_{\{\theta(x) \ge \mu\}}|\,d\mu \\ &= \int_0^1 \int_\Omega f(x)\,\mathbf{1}_{\{\theta(x) \ge \mu\}}\,dx\,d\mu + \nu \int_0^1 \int_\Omega g(x)\,|D\mathbf{1}_{\{\theta(x) \ge \mu\}}|\,d\mu \\ &= \int_0^1 \left( \int_\Omega f(x)\,\mathbf{1}_{\{\theta(x) \ge \mu\}}\,dx + \nu \int_\Omega g(x)\,|D\mathbf{1}_{\{\theta(x) \ge \mu\}}| \right) d\mu \\ &= \int_0^1 E\left( \mathbf{1}_{\{\theta(x) \ge \mu\}} \right) d\mu. \end{aligned}$$

21 Proof, Case k = 1 (2 Regions) III
4. Since $\theta^\star$ is a minimizer of the relaxed problem, for any $\mu \in (0, 1)$ we have
$$E(\theta^\star(x)) \le E\left( \mathbf{1}_{\{\theta^\star(x) \ge \mu\}} \right).$$
5. Assume there exists $\mu_0 \in (0, 1)$ such that $\mathbf{1}_{\{\theta^\star(x) \ge \mu_0\}}$ is NOT a minimizer of the relaxed problem; then we must have
$$E(\theta^\star(x)) < E\left( \mathbf{1}_{\{\theta^\star(x) \ge \mu_0\}} \right).$$
6. Hence,
$$E(\theta^\star(x)) = \int_0^1 E(\theta^\star(x))\,d\mu < \int_0^1 E\left( \mathbf{1}_{\{\theta^\star(x) \ge \mu\}} \right) d\mu = E(\theta^\star(x)),$$
which leads to a contradiction!

22 Proof, Case k = 1 (2 Regions) IV
7. Therefore, we can conclude that for any $\mu \in (0, 1)$,
$$E(\theta^\star(x)) = E\left( \mathbf{1}_{\{\theta^\star(x) \ge \mu\}} \right)$$
holds, i.e., $\mathbf{1}_{\{\theta^\star(x) \ge \mu\}}$ is a minimizer of the relaxed problem; being binary, it must also be a minimizer of the original nonconvex problem!

23 From Case k = 1 to k ≥ 2
Consider now more than TWO regions.
How can we represent a partition implicitly and obtain the equivalent implicit reformulation?

24 Implicit Representation, Case k ≥ 2 (≥ 3 Regions)
Labeling function $u : \Omega \to \{0, 1, \ldots, k\}$:
$$u(x) = l \iff x \in \Omega_l.$$
$k$ binary functions $\theta(x) = (\theta_1(x), \ldots, \theta_k(x))$:
$$\theta_i(x) = \begin{cases} 1 & \text{if } u(x) \ge i \\ 0 & \text{otherwise} \end{cases}$$
and $1 \ge \theta_1(x) \ge \cdots \ge \theta_k(x) \ge 0$.
$u(x)$ and $\theta(x)$ are in one-to-one correspondence via
$$u(x) = \sum_{i=1}^{k} \theta_i(x).$$
Then we can use the $k$ binary functions $\theta(x)$ to denote a partition $\{\Omega_i\}_{i=0}^{k}$ implicitly!
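A minimal sketch of this one-to-one correspondence on a discrete label image (the function names are ours):

```python
import numpy as np

def labels_to_theta(u, k):
    """theta_i(x) = 1 if u(x) >= i, for i = 1..k; returns shape (k,) + u.shape."""
    return np.stack([(u >= i).astype(float) for i in range(1, k + 1)])

def theta_to_labels(theta):
    """Recover u(x) = sum_i theta_i(x)."""
    return theta.sum(axis=0).astype(int)

u = np.array([[0, 1], [2, 1]])               # toy 2x2 labeling with k = 2
theta = labels_to_theta(u, k=2)
assert (theta_to_labels(theta) == u).all()   # the round trip is exact
```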

25 Implicit Representation, Case k ≥ 2 (≥ 3 Regions)
Example: $k = 2$, i.e., 3 regions.
Relationship between $\theta(x)$ and a partition $\{\Omega_i\}_{i=0}^{k}$:
$$\begin{aligned} \mathbf{1}_{\{x \in \Omega_0\}} &= 1 - \theta_1(x) \\ \mathbf{1}_{\{x \in \Omega_i\}} &= \theta_i(x) - \theta_{i+1}(x), \quad i = 1, \ldots, k-1 \\ \mathbf{1}_{\{x \in \Omega_k\}} &= \theta_k(x) \end{aligned}$$
This implicit representation contains Case $k = 1$ as a special case!

26 Implicit Representation, Case k ≥ 2 (≥ 3 Regions)
Recall that the regularized energy is
$$E(\{\Omega_i\}) = \sum_{i=0}^{k} \int_{\Omega_i} f_i(x)\,dx + \nu\,\frac{1}{2} \sum_{i=0}^{k} \mathrm{Per}_g(\Omega_i; \Omega).$$
For simplicity, denote $\theta_0(x) \equiv 1$ and $\theta_{k+1}(x) \equiv 0$. Since $\mathbf{1}_{\{x \in \Omega_i\}} = \theta_i(x) - \theta_{i+1}(x)$ under this convention, the first term can be rewritten as
$$\sum_{i=0}^{k} \int_{\Omega_i} f_i(x)\,dx = \sum_{i=0}^{k} \int_\Omega \left( \theta_i(x) - \theta_{i+1}(x) \right) f_i(x)\,dx.$$
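A discrete version of this rewritten data term, with the padding convention $\theta_0 \equiv 1$, $\theta_{k+1} \equiv 0$ (our own sketch; `f` stacks the $k+1$ data terms):

```python
import numpy as np

def multilabel_data_term(theta, f):
    """sum_{i=0}^{k} <theta_i - theta_{i+1}, f_i> with theta_0 = 1, theta_{k+1} = 0.
    theta: shape (k, H, W); f: shape (k+1, H, W)."""
    ones = np.ones((1,) + theta.shape[1:])
    zeros = np.zeros((1,) + theta.shape[1:])
    pad = np.concatenate([ones, theta, zeros])    # theta_0, ..., theta_{k+1}
    return sum(((pad[i] - pad[i + 1]) * f[i]).sum() for i in range(f.shape[0]))
```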

27 Implicit Representation, Case k ≥ 2 (≥ 3 Regions)
The second term (whose proof is rather involved and omitted here) is
$$\frac{1}{2} \sum_{i=0}^{k} \mathrm{Per}_g(\Omega_i; \Omega) = \sup_{\Xi \in \mathcal{K}} \sum_{i=1}^{k} \int_\Omega \theta_i(x)\,\mathrm{div}\,\xi_i(x)\,dx$$
where $\Xi \triangleq (\xi_1, \ldots, \xi_k)$ is the dual variable with $\xi_i : \Omega \to \mathbb{R}^d$, and $\Xi$ is constrained to the set
$$\mathcal{K} \triangleq \left\{ \Xi : \Omega \to \mathbb{R}^{d \times k} \;:\; \left| \sum_{i=i_1}^{i_2} \xi_i(x) \right| \le g(x), \ \forall x \in \Omega, \ 1 \le i_1 \le i_2 \le k \right\}.$$
$\mathcal{K}$ is convex since it is an intersection of balls.

28 Implicit Reformulation, Case k ≥ 2 (≥ 3 Regions)
Equivalent implicit reformulation:
$$\begin{array}{ll} \underset{\theta}{\text{minimize}} & E(\theta) = \sum_{i=0}^{k} \int_\Omega \left( \theta_i(x) - \theta_{i+1}(x) \right) f_i(x)\,dx + \nu \sup_{\Xi \in \mathcal{K}} \sum_{i=1}^{k} \int_\Omega \theta_i(x)\,\mathrm{div}\,\xi_i(x)\,dx \\ \text{subject to} & \theta \in \Theta \end{array}$$
where
$$\Theta \triangleq \left\{ \theta : \Omega \to \{0, 1\}^k, \ 1 \ge \theta_1(x) \ge \cdots \ge \theta_k(x) \ge 0, \ \forall x \in \Omega \right\}.$$
Again, the objective is convex (a pointwise supremum of functions linear in $\theta$), but the feasible set is nonconvex!

29 Convex Relaxation, Case k ≥ 2 (≥ 3 Regions)
Relax $\Theta$ to
$$\tilde{\Theta} \triangleq \left\{ \theta : \Omega \to [0, 1]^k, \ 1 \ge \theta_1(x) \ge \cdots \ge \theta_k(x) \ge 0, \ \forall x \in \Omega \right\}.$$
Convex relaxation problem:
$$\begin{array}{ll} \underset{\theta}{\text{minimize}} & E(\theta) = \sum_{i=0}^{k} \int_\Omega \left( \theta_i(x) - \theta_{i+1}(x) \right) f_i(x)\,dx + \nu \sup_{\Xi \in \mathcal{K}} \sum_{i=1}^{k} \int_\Omega \theta_i(x)\,\mathrm{div}\,\xi_i(x)\,dx \\ \text{subject to} & \theta \in \tilde{\Theta} \end{array}$$
We do not have a coarea formula for the second term, so we cannot draw the same conclusion as in Case $k = 1$ that the relaxation is tight.

30 Optimality, Case k ≥ 2 (≥ 3 Regions)
Proposition. Let $\theta^\star(x) \in \tilde{\Theta}$ be a minimizer of the relaxed problem, and construct the thresholded binary solution $\mathbf{1}_{\{\theta^\star(x) \ge \mu\}} \in \Theta$. Let $\theta^{\mathrm{gopt}}(x) \in \Theta$ be the true global minimizer of the original problem. Then:
$$E\left( \mathbf{1}_{\{\theta^\star(x) \ge \mu\}} \right) - E\left( \theta^{\mathrm{gopt}}(x) \right) \le E\left( \mathbf{1}_{\{\theta^\star(x) \ge \mu\}} \right) - E(\theta^\star(x)).$$
Proof. The bound follows directly from
$$E(\theta^\star(x)) \le E\left( \theta^{\mathrm{gopt}}(x) \right) \le E\left( \mathbf{1}_{\{\theta^\star(x) \ge \mu\}} \right).$$
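The right-hand side is computable after solving the relaxation, so it serves as an a posteriori optimality certificate. A sketch in the two-region notation, reusing the `grad` and `relaxed_two_region` helpers and the `f`, `g` arrays from the earlier sketches (for $k \ge 2$ one would evaluate the multi-label energy instead):

```python
import numpy as np

def two_region_energy(theta, f, g, nu):
    """Discrete E(theta) = <f, theta> + nu * sum g |grad theta|."""
    gx, gy = grad(theta)
    return (f * theta).sum() + nu * (g * np.hypot(gx, gy)).sum()

theta_star = relaxed_two_region(f, g, nu=0.5)
theta_bin = (theta_star >= 0.5).astype(float)
# Computable upper bound on the suboptimality of the thresholded solution:
gap = two_region_energy(theta_bin, f, g, 0.5) - two_region_energy(theta_star, f, g, 0.5)
```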

31 Optimality, Case k ≥ 2 (≥ 3 Regions)
The choice of $\mu$ might differ slightly from problem to problem, but usually it is around 0.5.
Experimental examples show that the regularized energy $E(\mathbf{1}_{\{\theta^\star(x) \ge \mu\}})$ is fairly robust to the choice of $\mu$.
In many real-world experiments, the bound is actually zero or near zero, so for these examples the solutions are essentially optimal!

32 Summary on Convex Relaxation
- 2 regions: tight
- ≥ 3 regions: bounded

33 Outline of Lecture (highlighting Part 3: Applications: Image Segmentation and 3D Reconstruction)

34 Applications
We focus on two applications in computer vision:
- Image segmentation
- Multiview 3D reconstruction

35 Segmentation: Examples (example figures omitted)

36 Segmentation: Notations
Basic notations:
- $\Omega \subset \mathbb{R}^2$: plane that contains the scene of interest
- Partition: $\Omega_i$, $i = 0, \ldots, k$
Estimated parameter functions, for a pixel $x$:
- $f_i(x) \ge 0$: denotes how likely a point is to belong to the region $\Omega_i$; two examples:
  - $f_i(x) = (I(x) - c_i)^2$: squared error of the input image $I(x)$ to some mean intensity $c_i$, where $I(x)$ is the intensity or color value of $x$
  - $f_i(x) = -\log P_i(I(x))$: negative log-likelihood of observing a certain intensity or color value
- $g(x) \ge 0$: inhomogeneous metric; one example $g(x) = \frac{1}{1 + |\nabla I(x)|}$

37 Segmentation: Problem Formulation
Minimize the regularized energy:
$$\begin{array}{ll} \underset{\{\Omega_i\}}{\text{minimize}} & \sum_{i=0}^{k} \int_{\Omega_i} f_i(x)\,dx + \nu\,\frac{1}{2} \sum_{i=0}^{k} \mathrm{Per}_g(\Omega_i; \Omega) \\ \text{subject to} & \bigcup_{i=0}^{k} \Omega_i = \Omega, \quad \Omega_s \cap \Omega_t = \emptyset \ \forall s \neq t \end{array}$$
It is the same as the general formulation introduced before.
Optimality:
- two regions, i.e., case $k = 1$: the convex relaxation is tight!
- more than two regions, i.e., case $k \ge 2$: the convex relaxation is not tight but closely bounded!

38 Segmentation: Numerical Results
Sunflower. Input: 2D image ⟹ output: 10-label segmentation (result figures omitted)

39 Segmentation: Numerical Results (result figures omitted)

40 Reconstruction: Example I (example figures omitted)

41 Reconstruction: Example II (example figures omitted)

42 Reconstruction: Notations I
Basic notations:
- $\Omega \subset \mathbb{R}^3$: volume that contains the scene of interest
- $S$: a certain surface
- $R^S_{\mathrm{obj}}$: the interior of $S$
- $R^S_{\mathrm{bck}}$: the exterior of $S$, i.e., the background
- $V = R^S_{\mathrm{obj}} \cup R^S_{\mathrm{bck}}$

43 Reconstruction: Notations II
Estimated parameter functions, for a voxel $x$:
- $\rho_{\mathrm{obj}}(x)$: describes the likelihood of voxel $x$ being in $R^S_{\mathrm{obj}}$; a smaller value indicates a larger probability; one example: $\rho_{\mathrm{obj}}(x) = -\log P_{\mathrm{obj}}(x)$
- $\rho_{\mathrm{bck}}(x)$: describes the likelihood of voxel $x$ being in $R^S_{\mathrm{bck}}$; a smaller value indicates a larger probability; one example: $\rho_{\mathrm{bck}}(x) = -\log P_{\mathrm{bck}}(x)$
- $\rho(x)$: photoconsistency measure; a smaller value indicates that voxel $x$ is more likely to lie on the surface; one example:
$$\rho(x) = \exp\left( -\tau \sum_j \mathrm{VOTE}_j(x) \right),$$
where $\tau > 0$ is a rate-of-decay parameter (say $\tau = 0.15$) and $\mathrm{VOTE}_j(x)$ denotes the vote from view (i.e., camera) $j$ on whether voxel $x$ is on the surface or not
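A minimal sketch of the photoconsistency example above, assuming binary per-view votes stored as a NumPy array (the vote representation is our assumption):

```python
import numpy as np

def photoconsistency(votes, tau=0.15):
    """rho(x) = exp(-tau * sum_j VOTE_j(x)).
    votes: shape (n_views, D, H, W), with VOTE_j(x) = 1 if view j
    deems voxel x photoconsistent (on the surface), else 0."""
    return np.exp(-tau * votes.sum(axis=0))

votes = (np.random.rand(8, 16, 16, 16) > 0.5).astype(float)  # toy votes from 8 views
rho = photoconsistency(votes)  # small where many views agree the voxel is on the surface
```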

44 Reconstruction: Problem Formulation
Regularized energy (with trade-off parameter $\nu > 0$):
$$E(S) = \int_{R^S_{\mathrm{obj}}} \rho_{\mathrm{obj}}(x)\,dx + \int_{R^S_{\mathrm{bck}}} \rho_{\mathrm{bck}}(x)\,dx + \nu \int_S \rho(x)\,dS$$
- first two terms: prior likelihood
- last term: smoothness and photoconsistency
Regularized energy minimization problem:
$$S^\star = \arg\min_{S \subset \Omega} E(S)$$
$R^S_{\mathrm{obj}}$ and $R^S_{\mathrm{bck}}$ play the roles of $\Omega_0$ and $\Omega_1$; $\rho_{\mathrm{obj}}$ and $\rho_{\mathrm{bck}}$ are the $f_i$'s; $\rho$ is $g$.
Note: this is Case $k = 1$ with 2 regions!

45 Reconstruction: Reformulation
Represent $S$ implicitly by the indicator function $\theta : \Omega \to \{0, 1\}$ of $R^S_{\mathrm{bck}}$, i.e., $\theta = \mathbf{1}_{R^S_{\mathrm{bck}}}$ and $1 - \theta = \mathbf{1}_{R^S_{\mathrm{obj}}}$.
Equivalent formulation (binary problem):
$$\begin{array}{ll} \underset{\theta}{\text{minimize}} & E(\theta) = \int_\Omega \left( \rho_{\mathrm{obj}}(x)\,(1 - \theta(x)) + \rho_{\mathrm{bck}}(x)\,\theta(x) \right) dx + \nu \int_\Omega \rho(x)\,|D\theta| \\ \text{subject to} & \theta \in \Theta \end{array}$$
where $\Theta \triangleq \{\theta \mid \theta : \Omega \to \{0, 1\}\}$.
Convex relaxation: $\tilde{\Theta} \triangleq \{\theta \mid \theta : \Omega \to [0, 1]\}$.
Optimality: thresholding the minimizer of the convex relaxation gives us a minimizer of the original problem!

46 Reconstruction: Numerical Results
Input: multiview 2D images ⟹ output: reconstructed 3D model (result figures omitted)

47 For Further Reading
- Daniel Cremers, Thomas Pock, Kalin Kolev, and Antonin Chambolle, "Convex relaxation techniques for segmentation, stereo and multiview reconstruction," in Markov Random Fields for Vision and Image Processing, MIT Press, 2011.
- Antonin Chambolle, Daniel Cremers, and Thomas Pock, "A convex approach to minimal partitions," SIAM Journal on Imaging Sciences, 5(4):1113-1158, 2012.
- Kalin Kolev, Maria Klodt, Thomas Brox, and Daniel Cremers, "Continuous global optimization in multiview 3D reconstruction," International Journal of Computer Vision, 84(1):80-96, 2009.

48 Thanks For more information visit: palomar
