Some new randomized methods for control and optimization
1 Some new randomized methods for control and optimization
B.T. Polyak, Institute for Control Sciences, Russian Academy of Sciences, Moscow, Russia
June 2009, with E. Gryazina, P. Scherbakov
Workshop in honor of B. Barmish on the occasion of his 60th birthday
B.T. Polyak (ICS, Moscow) Randomized methods June 2009 1 / 41
2 CONGRATULATIONS!
Bob, it's been a long time since we worked together, but I remember that time with delight. I remember your eyes shining when we discovered the e-E picture for randomized robustness... Remain as young and vital as ever!
3 Outline
- Historical remarks
- Two problems: generation of points in a set, and optimization
- Ideal Monte Carlo and convergence estimates
- Markov Chain Monte Carlo
- Boundary oracle
- Hit-and-Run
- Shake-and-Bake
- Exploiting barriers
- Applications to control
- Applications to optimization
- Conclusions
4 Historical remarks
- First random search methods for optimization: Rastrigin (1960s)
- Randomized methods for control: Stengel and Ray (1991), Barmish and Polyak (1996), Tempo, Calafiore, Dabbene (2004)
- Revival of randomized approaches for optimization: Bertsimas and Vempala (2004), Dabbene, Shcherbakov and Polyak (2008)
7 Two main problems
1. Generation of points in a set X ⊂ Rⁿ. The set can be nonconvex and disconnected. Applications: generate stabilizing controllers and calculate performance specifications; generate perturbations and check robustness.
2. Convex optimization: min cᵀx s.t. x ∈ X, where X is a convex bounded closed set in Rⁿ with nonempty interior.
8 Ideal Monte Carlo
x_1, x_2, ..., x_N — independent uniformly distributed points in X.
Cutting-plane method for optimization:
1. Set X_1 = X.
2. Generate uniform x_1, x_2, ..., x_N ∈ X_k.
3. Find f_k = min_i cᵀx_i.
4. Set X_{k+1} = X_k ∩ {x : cᵀx ≤ f_k}; go to Step 2.
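The cutting-plane scheme above can be sketched in code. The box test set, the rejection-based realization of X_{k+1}, and all parameter values below are illustrative assumptions, not from the slides:

```python
import random

def mc_cutting_plane(c, lo, hi, N=50, iters=20, seed=0):
    """Sketch of the ideal Monte Carlo cutting-plane scheme on a
    (hypothetical) box {x : lo[i] <= x[i] <= hi[i]}.  The cut
    X_{k+1} = X_k ∩ {x : c^T x <= f_k} is realized by rejection,
    so the loop stops once the cut set gets too small to sample."""
    rng = random.Random(seed)
    n = len(c)
    f_k = float("inf")                 # current record value (defines the cut)
    for _ in range(iters):
        vals, attempts = [], 0
        while len(vals) < N and attempts < 20000:
            attempts += 1
            x = [rng.uniform(lo[i], hi[i]) for i in range(n)]
            v = sum(ci * xi for ci, xi in zip(c, x))
            if v <= f_k:               # keep only points inside the current cut
                vals.append(v)
        if len(vals) < N:              # rejection became too expensive: stop
            break
        f_k = min(vals)                # Step 3: f_k = min_i c^T x_i
    return f_k
```

For min x₁ + x₂ over the unit square (true minimum 0) the returned value shrinks geometrically for the first few cuts and then stalls when rejection sampling saturates, which is exactly why MCMC schemes are introduced below.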
9 Convergence estimates
f* = min_{x∈X} cᵀx, f̄ = max_{x∈X} cᵀx, h = f̄ − f*.
Theorem. After k iterations of the algorithm,
E[f_k] − f* ≤ h qᵏ,  q = (1/n) B(N + 1, 1/n),
where B(a, b) is the Euler beta function.
Dabbene, Shcherbakov, Polyak, 47th CDC, 2008.
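The one-step (k = 1) form of this bound can be checked numerically; the unit-box test problem and the constants below are assumptions chosen for illustration:

```python
import math, random

def beta(a, b):
    """Euler beta function B(a, b) via the gamma function."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

def check_bound(n=2, N=10, trials=2000, seed=0):
    """Empirical check of E[f_1] - f* <= h q with q = (1/n) B(N+1, 1/n)
    on a hypothetical test problem: min x_1 + x_2 over the unit box
    in R^2, so f* = 0 and h = 2.  Returns (empirical mean, bound)."""
    rng = random.Random(seed)
    q = (1.0 / n) * beta(N + 1, 1.0 / n)
    emp = sum(min(rng.random() + rng.random() for _ in range(N))
              for _ in range(trials)) / trials
    return emp, 2.0 * q
```

With these parameters the empirical mean of the best of N = 10 samples sits comfortably below the theoretical bound h·q.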
10 Radon theorem and the center-of-gravity method
Case of special interest: N = 1, k = 1.
Theorem. Let x_1 be a random point uniformly distributed in X. Then
E[cᵀx_1] − f* ≤ h (1 − 1/(n + 1)).
Since E[x_1] = g (the center of gravity of X),
cᵀg − f* ≤ h (1 − 1/(n + 1))  [Radon theorem (1916)].
Center-of-gravity method: x_k = g_k, X_{k+1} = X_k ∩ {x : cᵀx ≤ cᵀg_k}.
11 Implementable random algorithms: MCMC
How to implement the Monte Carlo method?
- X is a simple set (box, ball, simplex, etc.)
- Rejection method
- Markov chain Monte Carlo schemes (random walks in X)
P. Diaconis, The Markov chain Monte Carlo revolution, Bull. of the AMS, 2009, Vol. 46, No. 2.
15 Boundary oracle
Given x ∈ X and a vector d specifying a direction in Rⁿ, let
L = {t ∈ R : x + td ∈ X}.
Boundary oracle: for convex sets L = (t_min, t_max), where
t_min = inf{t : x + td ∈ X},  t_max = sup{t : x + td ∈ X}.
Complete boundary oracle: L = {t ∈ R : x + td ∈ X} together with the inner normals to X at the boundary points.
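For a set cut out by linear inequalities the oracle is a one-pass ratio test. A minimal sketch, assuming X = {x : Ax ≤ b} with x strictly inside (the representation and names are illustrative):

```python
def boundary_oracle_polyhedron(A, b, x, d):
    """Boundary oracle for the convex set X = {x : A x <= b}
    (rows a_i of A, scalars b_i).  Returns (t_lo, t_hi) such that
    x + t d stays in X exactly for t in (t_lo, t_hi).
    Assumes x is strictly interior, so every slack is positive."""
    t_lo, t_hi = float("-inf"), float("inf")
    for a_i, b_i in zip(A, b):
        ad = sum(aj * dj for aj, dj in zip(a_i, d))            # a_i^T d
        slack = b_i - sum(aj * xj for aj, xj in zip(a_i, x))   # b_i - a_i^T x > 0
        if ad > 0:
            t_hi = min(t_hi, slack / ad)    # constraint hit moving forward
        elif ad < 0:
            t_lo = max(t_lo, slack / ad)    # constraint hit moving backward
        # ad == 0: this constraint never becomes active along d
    return t_lo, t_hi
```

Each constraint contributes at most one endpoint of the chord, so the oracle costs O(mn) per direction.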
17 Boundary oracle is available for numerous sets
- LMI set: X = {x ∈ Rⁿ : A₀ + ∑_{i=1}ⁿ x_i A_i ⪯ 0}
- LMI-constrained set of symmetric matrices P: X = {P : AP + PAᵀ + C ⪯ 0, P ⪰ 0}
- Quadratic matrix inequality set: X = {P : AP + PAᵀ + PBBᵀP + C ⪯ 0, P ⪰ 0}
- Linear algebraic inequality set: X = {x ∈ Rⁿ : c_iᵀx ≤ a_i, i = 1, ..., m}
18 Hit-and-Run
1. i = 0, x₀ ∈ X.
2. Choose a random direction d uniformly distributed on the unit sphere.
3. x_{i+1} = x_i + t d, where t is uniformly distributed on L = (t_min, t_max).
4. L is updated with respect to x_{i+1}; go to Step 2.
Theorem (Turchin (1971), Smith (1984)). Let X be bounded and open, or coincide with the closure of its interior. Then for any measurable set A ⊂ X the probability P_i(A) = P(x_i ∈ A | x₀) satisfies
|P_i(A) − P(A)| ≤ qⁱ,
where P(A) = Vol(A)/Vol(X) and q < 1 does not depend on x₀.
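The four steps above can be sketched directly for a box, where the boundary oracle is explicit. The box test set, starting point, and step count are assumptions for illustration:

```python
import math, random

def hit_and_run_box(a, n_steps=1000, seed=1):
    """Hit-and-Run random walk inside the box {x : |x_i| <= a_i}
    (a simple set with an explicit boundary oracle); a sketch.
    Returns the list of visited points."""
    rng = random.Random(seed)
    n = len(a)
    x = [0.0] * n                       # interior starting point
    pts = []
    for _ in range(n_steps):
        # Step 2: direction uniform on the unit sphere (normalized Gaussian)
        d = [rng.gauss(0.0, 1.0) for _ in range(n)]
        nrm = math.sqrt(sum(di * di for di in d))
        d = [di / nrm for di in d]
        # Boundary oracle for the box: |x_i + t d_i| <= a_i for all i
        t_lo, t_hi = -float("inf"), float("inf")
        for xi, di, ai in zip(x, d, a):
            if di > 0:
                t_hi = min(t_hi, (ai - xi) / di)
                t_lo = max(t_lo, (-ai - xi) / di)
            elif di < 0:
                t_lo = max(t_lo, (ai - xi) / di)
                t_hi = min(t_hi, (-ai - xi) / di)
        # Step 3: uniform point on the chord L = (t_lo, t_hi)
        t = rng.uniform(t_lo, t_hi)
        x = [xi + t * di for xi, di in zip(x, d)]
        pts.append(x)
    return pts
```

On a well-conditioned box the empirical mean of the walk quickly approaches the true center, consistent with the theorem's geometric convergence.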
19 Shake-and-Bake: an alternative way of generating points
Points are asymptotically uniformly distributed on the boundary of X. The complete boundary oracle is exploited.
- LMI set X = {x ∈ Rⁿ : A₀ + ∑_{i=1}ⁿ x_i A_i ⪯ 0}: normal components n_i = (A_i e, e), where e is the eigenvector corresponding to the zero eigenvalue of the matrix A₀ + ∑ x_i A_i.
- LMI-constrained set of symmetric matrices P, X = {P : AP + PAᵀ + C ⪯ 0}: normal N = eeᵀA + Aᵀeeᵀ, where e is the eigenvector corresponding to the zero eigenvalue of AP + PAᵀ + C.
20 Shake-and-Bake: the algorithm
1. i = 0, x₀ ∈ ∂X, n — the inner normal at x₀.
2. Choose a random direction s_i = (1 − ξ^{2/(n−1)})^{1/2} n + ξ^{1/(n−1)} r, where ξ is uniform on (0, 1) and r is a random unit direction with (n, r) = 0.
3. x_{i+1} = x_i + t s_i, where t is given by the boundary oracle for the direction s_i.
4. The normal is updated at x_{i+1}; go to Step 2.
21 Shake-and-Bake for nonconvex sets
[Figure: Shake-and-Bake points on the boundary of a nonconvex set]
22 Numerical simulation
Standard SDP of the form
min cᵀx  s.t.  A₀ + ∑_{i=1}ⁿ x_i A_i ⪯ 0,
A_i, i = 0, 1, ..., n, symmetric real m×m matrices; c = [0, ..., 0, 1].
We applied a modified HR where the minimum over the points x_i was replaced by their average, plus various heuristic acceleration methods (scaling, projecting, accelerating step).
Open problem: the number of HR points at every step.
23 Disadvantages of HR
- Jams in a corner.
- Jams for long or thin bodies. (Lovász, Vempala, Hit-and-Run from a corner, 2007)
- Number of iterations to achieve accuracy ε: N > C n² (R/r)² ln(M/ε), with a very large absolute constant C.
How to accelerate? Smith (1998): take d not uniform on a sphere but from another distribution H.
24 Barrier functions in convex optimization
Yu. Nesterov, A. Nemirovski, S. Boyd
A convex function F(x) is a barrier for a convex set X if F(x) is defined on the interior of X and F(x) → ∞ as x → ∂X.
Self-concordant barriers, Newton method.
Interior-point methods are highly effective for convex optimization!
25 Modified Hit-and-Run by use of barriers
F(x) — a self-concordant barrier for X.
F(x₀ + y) = F(x₀) + (∇F(x₀), y) + ½(∇²F(x₀)y, y) + o(|y|²);
the first three terms form the quadratic approximation F̃(y) of F(x) at x₀.
Dikin's ellipsoid: E = {x : (∇²F(x₀)(x − x₀), x − x₀) ≤ 1} ⊂ X.
26 Modified Hit-and-Run by use of barriers (contd.)
Random direction: d = (∇²F(x₀))^{−1/2} ξ, with ξ uniformly distributed on the unit sphere.
Next point: x = x₀ + λd, λ ∈ [λ_min, λ_max], as in Hit-and-Run.
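The direction step can be sketched for a polyhedron with the standard log-barrier; the set representation X = {x : Ax ≤ b} and all names are assumptions for illustration:

```python
import numpy as np

def dikin_direction(A, b, x, rng):
    """Direction step of barrier-modified Hit-and-Run for
    X = {x : A x <= b} with log-barrier F(x) = -sum_i ln(b_i - a_i^T x):
    returns d = (∇²F(x))^{-1/2} ξ with ξ uniform on the unit sphere.
    A minimal sketch; assumes x is strictly interior."""
    s = b - A @ x                               # slacks b_i - a_i^T x > 0
    As = A / s[:, None]                         # rows a_i / s_i
    H = As.T @ As                               # ∇²F(x) = Σ a_i a_i^T / s_i²
    xi = rng.standard_normal(A.shape[1])
    xi /= np.linalg.norm(xi)                    # uniform on the sphere
    # (∇²F)^{-1/2} via eigendecomposition (H is symmetric pos. definite)
    w, V = np.linalg.eigh(H)
    return V @ ((V.T @ xi) / np.sqrt(w))
```

On the strip example that follows (|x₁| ≤ 10⁴, |x₂| ≤ 1) the Hessian at the origin is strongly anisotropic, so d is stretched by a factor of roughly 10⁴ along the long axis — precisely the correction plain Hit-and-Run lacks.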
27 Modified Hit-and-Run by use of barriers: example
Q = {x ∈ R² : |x_i| ≤ a_i}, a = [10⁴, 1] — a strip.
[Figure: sample points in the strip; magenta = Hit-and-Run, blue = Modified Hit-and-Run]
28 Barrier MC: Phase 1
F(x) — a self-concordant barrier for X,
F(x₀ + y) = F(x₀) + (∇F(x₀), y) + ½(∇²F(x₀)y, y) + o(|y|²) = F̃(y) + o(|y|²).
E = {y : F̃(y) ≤ 0}, i.e. E = {y : (∇²F(x₀)(y − x*), y − x*) ≤ δ}, where
x* = −(∇²F(x₀))⁻¹ ∇F(x₀),  δ = ((∇²F(x₀))⁻¹ ∇F(x₀), ∇F(x₀)).
Random direction: d = x* + (∇²F(x₀)/δ)^{−1/2} ξ, with ξ uniformly distributed on the unit sphere.
29 Advantages
- Escapes from corners.
- Walks well in long and thin sets.
Conjecture: x_k is asymptotically distributed with a density ρ(x) > 0 on X.
Phase 2. The ellipsoid is defined by the matrix H_i = (∇²F/δ)^{−1/2}. After N iterations take H = (1/N) ∑ H_i and generate the random direction
d = Hξ, ξ uniform on the unit sphere.
30 Barrier MC can be applied to
- Polyhedral sets in Rⁿ: X = {x ∈ Rⁿ : (a_i, x) ≤ b_i, i = 1, ..., m},
F(x) = −∑_{i=1}ᵐ ln(b_i − (a_i, x)),
∇F(x₀) = ∑ a_i / (b_i − (a_i, x₀)),
∇²F(x₀) = ∑ a_i a_iᵀ / (b_i − (a_i, x₀))².
- LMI in standard format: X = {x ∈ Rˡ : A(x) = A₀ + ∑_{i=1}ˡ x_i A_i ⪰ 0, A_i ∈ S^{n×n}},
F(x) = −ln det(A(x)).
- LMI with matrix variables: Q = {X : tr CX ≤ 1, X ⪰ 0}, C ∈ S^{n×n},
F(X) = −ln det X − ln(1 − tr CX).
31 Example: diamond
X = {x ∈ Rⁿ : x_i ≥ 0, (a, x) ≤ 1}, a = [1, ..., 1, 10⁻⁴],
F(x) = −∑ ln x_i − ln(1 − (a, x)).
[Figure: sample points in the diamond]
32 Comparison with pure MC
X = {x ∈ Rⁿ : x_i ≥ 0, (a, x) ≤ 1}, a = [1, ..., 1, 10⁻⁴],
f_i = (a, x_i) ≤ 1 for x ∈ Q.
If the x_i are uniform on X, then the cdf Φ(t) = ∫₀ᵗ p(f) df ∼ tⁿ.
[Figure: Φ(t) for several values of n and N]
33 Empirical density (R²)
[Figure: empirical densities for New Method Phase 1, Hit-and-Run, New Method Phase 2, and Ideal Monte Carlo]
34 Example: stick and sheet in R²
Q = {x ∈ Rⁿ : |x_i| ≤ a_i}; stick: a = [10⁻⁴, ..., 10⁻⁴, 1]; sheet: a = [1, ..., 1, 10⁻⁴].
[Figure: sample points for the stick and the sheet]
35 Example: stick and sheet in R² (contd.)
Q = {x ∈ Rⁿ : |x_i| ≤ a_i}; stick: a = [10⁻⁴, ..., 10⁻⁴, 1]; sheet: a = [1, ..., 1, 10⁻⁴].
[Figure: sample points for a larger number of steps]
36 Example: stick and sheet in R², Phase 2
Q = {x ∈ Rⁿ : |x_i| ≤ a_i}; stick: a = [10⁻⁴, ..., 10⁻⁴, 1]; sheet: a = [1, ..., 1, 10⁻⁴].
[Figure: Phase 2 sample points for the stick and the sheet]
37 Standard LMI
X = {x ∈ Rˡ : A(x) = A₀ + ∑_{i=1}ˡ x_i A_i ⪰ 0, A_i ∈ S^{n×n}},
F(x) = −ln det(A(x)),
∇F(x₀)_i = −tr(A_i A(x₀)⁻¹),
∇²F(x₀)_{ij} = tr(A(x₀)⁻¹ A_i A(x₀)⁻¹ A_j).
[Figure: sample points for random 4×4 matrices A₀, A₁, A₂]
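The derivative formulas on this slide can be verified against finite differences; the test matrices below are random assumptions, not from the talk:

```python
import numpy as np

def lmi_barrier(A0, As, x):
    """F(x) = -ln det A(x) with A(x) = A0 + sum_i x_i A_i (A(x) ≻ 0),
    plus its gradient and Hessian from the slide's formulas:
    ∇F_i = -tr(A_i A(x)^{-1}),  ∇²F_ij = tr(A(x)^{-1} A_i A(x)^{-1} A_j).
    A minimal sketch for symmetric A_i."""
    A = A0 + sum(xi * Ai for xi, Ai in zip(x, As))
    Ainv = np.linalg.inv(A)
    F = -np.log(np.linalg.det(A))
    g = np.array([-np.trace(Ai @ Ainv) for Ai in As])
    H = np.array([[np.trace(Ainv @ Ai @ Ainv @ Aj) for Aj in As]
                  for Ai in As])
    return F, g, H
```

A central finite difference of F reproduces the gradient to high accuracy, which is a cheap sanity check before plugging the Hessian into the Dikin-ellipsoid machinery.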
38 LMI with matrix variables
Q = {X : tr CX ≤ 1, X ⪰ 0}, C ∈ S^{n×n}.
F(X) = −ln det X − ln(1 − tr CX),  ε = 1 − tr CX.
The ellipsoid E = {Y : F̃(Y) ≤ 0} is a quadratic matrix inequality in Y with coefficients built from X₀, C and ε; its center is the solution of a Lyapunov-type equation with matrices composed of X₀, C and ε.
Choice of direction in the ellipsoid {X : (AXA + BXB, X) ≤ δ}: take Z uniform on {Z : ‖Z‖_F = 1} and λ uniform on [−α, α], with α determined by α²(AZA + BZB, Z) = δ; then λZ ∈ E.
39 Example: X ∈ S^{2×2}, C = I
Q = {P = [p₁₁ p₁₂; p₁₂ p₂₂] : p₁₁ + p₂₂ ≤ 1, p₁₁ > 0, p₂₂ > 0, p₁₁p₂₂ ≥ p₁₂²}.
[Figure: sample points in the coordinates (p₁₁, p₂₂, p₁₂)]
40 Comparison with pure MC
Theorem. If x_i, i = 1, ..., N, are uniform i.i.d. on X ⊂ Rⁿ, f = (c, x), f* = min_X (c, x), h = max_X (c, x) − f*, then
E[min_i f_i] − f* ≤ h (1/n) B(N + 1, 1/n).
[Table: empirical E[min f_i] and 1 − E[max f_i] for several n, N and matrices C]
41 Comparison with pure MC (contd.)
Φ(t) = ∫₀ᵗ p(f) df ∼ tᵐ,  m = n(n + 1)/2 (the dimension of S^{n×n}).
42 Applications to control
Sets with an available boundary oracle:
- Stability set for polynomials: K = {k ∈ Rⁿ : p(s, k) = p₀(s) + ∑_{i=1}ⁿ k_i p_i(s) is stable}
- Stability set for matrices: A ∈ R^{n×n}, B ∈ R^{n×m}, C ∈ R^{l×n}, K = {K ∈ R^{m×l} : A + BKC is stable}
- Robust stability set for polynomials: K = {k : P₀(s, q) + ∑_{i=1}ⁿ k_i P_i(s, q) is stable ∀q ∈ Q}, Q ⊂ Rᵐ
- Quadratic stability set: ẋ = Ax, K = {P ≻ 0 : AP + PAᵀ ≺ 0}
43 Stability set for polynomials
K = {k ∈ Rⁿ : p(s, k) = p₀(s) + ∑_{i=1}ⁿ k_i p_i(s) is stable}.
Take k₀ ∈ K, i.e. p(s, k₀) is stable, and a random direction d = s/‖s‖, s = randn(n,1).
Boundary oracle: L = {t ∈ R : k₀ + td ∈ K}, i.e. {t ∈ R : p(s, k₀) + t ∑ d_i p_i(s) is stable} — a D-decomposition problem for the real scalar parameter t!
Gryazina E. N., Polyak B. T. Stability regions in the parameter space: D-decomposition revisited. Automatica, 2006, Vol. 42, No. 1.
44 Example: generating points in a disconnected set
K = {k ∈ Rⁿ : p(s, k) = p₀(s) + ∑_{i=1}ⁿ k_i p_i(s) is stable},
p(s, k) = p₀(s) + k₁(s³ + s² − s − 1) + k₂(s³ − 3s² + 3s − 1), p₀(s) = 2.2s³ + ...
[Figure: generated points in the disconnected stability region]
45 Stability set for matrices
ẋ = Ax + Bu, y = Cx, u = Ky; A ∈ R^{n×n}, B ∈ R^{n×m}, C ∈ R^{l×n};
K = {K ∈ R^{m×l} : A + BKC is stable}.
Take K₀ ∈ K, i.e. A + BK₀C is stable, and a random direction in the matrix space D = Y/‖Y‖, Y = randn(m, l). Then
A + B(K₀ + tD)C = F + tG, where F = A + BK₀C, G = BDC.
Boundary oracle: L = {t ∈ R : F + tG is stable}.
A total description of L is hard: with f(t) = max Re eig(F + tG), solve f(t) = 0 numerically for t > 0 (MATLAB command fsolve).
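The one-dimensional search the slide describes can be sketched with a bisection in place of fsolve; the toy matrices and the single-crossing assumption are illustrative, not from the talk:

```python
import numpy as np

def spectral_abscissa(M):
    """max Re eig(M) — negative iff M is Hurwitz stable."""
    return max(np.linalg.eigvals(M).real)

def boundary_oracle_stability(F, G, t_max=100.0, tol=1e-8):
    """One-sided boundary oracle along {F + tG, t > 0}: the smallest t
    at which stability is lost, found by bisection on
    f(t) = max Re eig(F + tG).  Assumes F is stable and f has a single
    sign change on (0, t_max); returns None if F + tG stays stable
    up to t_max.  A minimal stand-in for a numerical root solver."""
    assert spectral_abscissa(F) < 0, "F must be stable"
    if spectral_abscissa(F + t_max * G) < 0:
        return None                     # no loss of stability detected
    lo, hi = 0.0, t_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if spectral_abscissa(F + mid * G) < 0:
            lo = mid                    # still stable: move right
        else:
            hi = mid                    # unstable: move left
    return 0.5 * (lo + hi)
```

For F = diag(−1, −2) and G = diag(1, 0), the spectral abscissa is max(−1 + t, −2), so stability is lost exactly at t = 1.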
46 Quadratic stability
ẋ = Ax + Bu, u = Kx; K = {K : ∃P ≻ 0, A_cᵀP + PA_c ≺ 0}, A_c = A + BK; K is convex and bounded.
Equivalently: Q = P⁻¹ ≻ 0, QAᵀ + AQ + BY + YᵀBᵀ ≺ 0, Y = KQ.
Starting points: k₀ ∈ K, Q₀ = P₀⁻¹, Y₀ = K₀Q₀.
Q = Q₀ + tJ, Y = Y₀ + tG, where J and G are random directions in the matrix space; the initial inequality becomes F₀ + tR ≺ 0.
Boundary oracle: L = (−t⁻, t⁺), where t⁺ = min λ_i and t⁻ = min μ_i; the λ_i are the real positive eigenvalues for the pair of matrices F₀ = Q₀Aᵀ + AQ₀ + BY₀ + Y₀ᵀBᵀ and R = JAᵀ + AJ + BG + GᵀBᵀ, and the μ_i correspondingly for F₀, −R.
47 Conclusions
- New versions of MCMC are effective.
- Randomized approaches to optimization are promising.
- The proposed methods are simple to implement and make it possible to solve large-dimensional problems.
A Quick Tour of Linear Algebra and Optimization for Machine Learning Masoud Farivar January 8, 2015 1 / 28 Outline of Part I: Review of Basic Linear Algebra Matrices and Vectors Matrix Multiplication Operators
More informationPENNON A Generalized Augmented Lagrangian Method for Convex NLP and SDP p.1/39
PENNON A Generalized Augmented Lagrangian Method for Convex NLP and SDP Michal Kočvara Institute of Information Theory and Automation Academy of Sciences of the Czech Republic and Czech Technical University
More informationRank-one LMIs and Lyapunov's Inequality. Gjerrit Meinsma 4. Abstract. We describe a new proof of the well-known Lyapunov's matrix inequality about
Rank-one LMIs and Lyapunov's Inequality Didier Henrion 1;; Gjerrit Meinsma Abstract We describe a new proof of the well-known Lyapunov's matrix inequality about the location of the eigenvalues of a matrix
More informationProximal and First-Order Methods for Convex Optimization
Proximal and First-Order Methods for Convex Optimization John C Duchi Yoram Singer January, 03 Abstract We describe the proximal method for minimization of convex functions We review classical results,
More information1 Introduction Semidenite programming (SDP) has been an active research area following the seminal work of Nesterov and Nemirovski [9] see also Alizad
Quadratic Maximization and Semidenite Relaxation Shuzhong Zhang Econometric Institute Erasmus University P.O. Box 1738 3000 DR Rotterdam The Netherlands email: zhang@few.eur.nl fax: +31-10-408916 August,
More informationOptimisation and Operations Research
Optimisation and Operations Research Lecture 22: Linear Programming Revisited Matthew Roughan http://www.maths.adelaide.edu.au/matthew.roughan/ Lecture_notes/OORII/ School
More informationthe uncertain system with probability one in a finite number of steps. Subsequently, at a fixed step k, we also compute a lower bound of the probabili
Probabilistic Robust Design with Linear Quadratic Regulators B. T. Polyak Λ and R. Tempo ΛΛ Λ Institute for Control Science Russian Academy of Sciences Profsojuznaja 65, Moscow 117806 Russia boris@ipu.rssi.ru
More informationLecture notes on the ellipsoid algorithm
Massachusetts Institute of Technology Handout 1 18.433: Combinatorial Optimization May 14th, 007 Michel X. Goemans Lecture notes on the ellipsoid algorithm The simplex algorithm was the first algorithm
More informationELE539A: Optimization of Communication Systems Lecture 15: Semidefinite Programming, Detection and Estimation Applications
ELE539A: Optimization of Communication Systems Lecture 15: Semidefinite Programming, Detection and Estimation Applications Professor M. Chiang Electrical Engineering Department, Princeton University March
More informationAdvances in Convex Optimization: Theory, Algorithms, and Applications
Advances in Convex Optimization: Theory, Algorithms, and Applications Stephen Boyd Electrical Engineering Department Stanford University (joint work with Lieven Vandenberghe, UCLA) ISIT 02 ISIT 02 Lausanne
More information1 Lyapunov theory of stability
M.Kawski, APM 581 Diff Equns Intro to Lyapunov theory. November 15, 29 1 1 Lyapunov theory of stability Introduction. Lyapunov s second (or direct) method provides tools for studying (asymptotic) stability
More informationORF 523 Lecture 9 Spring 2016, Princeton University Instructor: A.A. Ahmadi Scribe: G. Hall Thursday, March 10, 2016
ORF 523 Lecture 9 Spring 2016, Princeton University Instructor: A.A. Ahmadi Scribe: G. Hall Thursday, March 10, 2016 When in doubt on the accuracy of these notes, please cross check with the instructor
More informationAccelerating Nesterov s Method for Strongly Convex Functions
Accelerating Nesterov s Method for Strongly Convex Functions Hao Chen Xiangrui Meng MATH301, 2011 Outline The Gap 1 The Gap 2 3 Outline The Gap 1 The Gap 2 3 Our talk begins with a tiny gap For any x 0
More informationFeedback control of transient energy growth in subcritical plane Poiseuille flow
Feedback control of transient energy growth in subcritical plane Poiseuille flow F. Martinelli, M. Quadrio, J. McKernan, J.F. Whidborne LadHyX, École Polytechnique; Politecnico di Milano; King s College,
More informationFast Algorithms for SDPs derived from the Kalman-Yakubovich-Popov Lemma
Fast Algorithms for SDPs derived from the Kalman-Yakubovich-Popov Lemma Venkataramanan (Ragu) Balakrishnan School of ECE, Purdue University 8 September 2003 European Union RTN Summer School on Multi-Agent
More informationNOTES ON LINEAR ALGEBRA CLASS HANDOUT
NOTES ON LINEAR ALGEBRA CLASS HANDOUT ANTHONY S. MAIDA CONTENTS 1. Introduction 2 2. Basis Vectors 2 3. Linear Transformations 2 3.1. Example: Rotation Transformation 3 4. Matrix Multiplication and Function
More informationCopositive Programming and Combinatorial Optimization
Copositive Programming and Combinatorial Optimization Franz Rendl http://www.math.uni-klu.ac.at Alpen-Adria-Universität Klagenfurt Austria joint work with I.M. Bomze (Wien) and F. Jarre (Düsseldorf) IMA
More informationCanonical Problem Forms. Ryan Tibshirani Convex Optimization
Canonical Problem Forms Ryan Tibshirani Convex Optimization 10-725 Last time: optimization basics Optimization terology (e.g., criterion, constraints, feasible points, solutions) Properties and first-order
More informationExercise Sheet 1.
Exercise Sheet 1 You can download my lecture and exercise sheets at the address http://sami.hust.edu.vn/giang-vien/?name=huynt 1) Let A, B be sets. What does the statement "A is not a subset of B " mean?
More informationThe moment-lp and moment-sos approaches
The moment-lp and moment-sos approaches LAAS-CNRS and Institute of Mathematics, Toulouse, France CIRM, November 2013 Semidefinite Programming Why polynomial optimization? LP- and SDP- CERTIFICATES of POSITIVITY
More informationLinear Algebra review Powers of a diagonalizable matrix Spectral decomposition
Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition Prof. Tesler Math 283 Fall 2018 Also see the separate version of this with Matlab and R commands. Prof. Tesler Diagonalizing
More informationAppendix A Solving Linear Matrix Inequality (LMI) Problems
Appendix A Solving Linear Matrix Inequality (LMI) Problems In this section, we present a brief introduction about linear matrix inequalities which have been used extensively to solve the FDI problems described
More informationChapter 1. Matrix Algebra
ST4233, Linear Models, Semester 1 2008-2009 Chapter 1. Matrix Algebra 1 Matrix and vector notation Definition 1.1 A matrix is a rectangular or square array of numbers of variables. We use uppercase boldface
More informationQuadratic Stability of Dynamical Systems. Raktim Bhattacharya Aerospace Engineering, Texas A&M University
.. Quadratic Stability of Dynamical Systems Raktim Bhattacharya Aerospace Engineering, Texas A&M University Quadratic Lyapunov Functions Quadratic Stability Dynamical system is quadratically stable if
More informationKnowledge Discovery and Data Mining 1 (VO) ( )
Knowledge Discovery and Data Mining 1 (VO) (707.003) Review of Linear Algebra Denis Helic KTI, TU Graz Oct 9, 2014 Denis Helic (KTI, TU Graz) KDDM1 Oct 9, 2014 1 / 74 Big picture: KDDM Probability Theory
More informationEE Applications of Convex Optimization in Signal Processing and Communications Dr. Andre Tkacenko, JPL Third Term
EE 150 - Applications of Convex Optimization in Signal Processing and Communications Dr. Andre Tkacenko JPL Third Term 2011-2012 Due on Thursday May 3 in class. Homework Set #4 1. (10 points) (Adapted
More informationA sensitivity result for quadratic semidefinite programs with an application to a sequential quadratic semidefinite programming algorithm
Volume 31, N. 1, pp. 205 218, 2012 Copyright 2012 SBMAC ISSN 0101-8205 / ISSN 1807-0302 (Online) www.scielo.br/cam A sensitivity result for quadratic semidefinite programs with an application to a sequential
More informationContinuous methods for numerical linear algebra problems
Continuous methods for numerical linear algebra problems Li-Zhi Liao (http://www.math.hkbu.edu.hk/ liliao) Department of Mathematics Hong Kong Baptist University The First International Summer School on
More informationLecture 15 Newton Method and Self-Concordance. October 23, 2008
Newton Method and Self-Concordance October 23, 2008 Outline Lecture 15 Self-concordance Notion Self-concordant Functions Operations Preserving Self-concordance Properties of Self-concordant Functions Implications
More informationTopics in Theoretical Computer Science: An Algorithmist's Toolkit Fall 2007
MIT OpenCourseWare http://ocw.mit.edu 18.409 Topics in Theoretical Computer Science: An Algorithmist's Toolkit Fall 2007 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
More informationLecture: Algorithms for LP, SOCP and SDP
1/53 Lecture: Algorithms for LP, SOCP and SDP Zaiwen Wen Beijing International Center For Mathematical Research Peking University http://bicmr.pku.edu.cn/~wenzw/bigdata2018.html wenzw@pku.edu.cn Acknowledgement:
More informationSparse Covariance Selection using Semidefinite Programming
Sparse Covariance Selection using Semidefinite Programming A. d Aspremont ORFE, Princeton University Joint work with O. Banerjee, L. El Ghaoui & G. Natsoulis, U.C. Berkeley & Iconix Pharmaceuticals Support
More informationPreliminaries Overview OPF and Extensions. Convex Optimization. Lecture 8 - Applications in Smart Grids. Instructor: Yuanzhang Xiao
Convex Optimization Lecture 8 - Applications in Smart Grids Instructor: Yuanzhang Xiao University of Hawaii at Manoa Fall 2017 1 / 32 Today s Lecture 1 Generalized Inequalities and Semidefinite Programming
More informationSimulated Annealing for Constrained Global Optimization
Monte Carlo Methods for Computation and Optimization Final Presentation Simulated Annealing for Constrained Global Optimization H. Edwin Romeijn & Robert L.Smith (1994) Presented by Ariel Schwartz Objective
More informationBasic Linear Algebra in MATLAB
Basic Linear Algebra in MATLAB 9.29 Optional Lecture 2 In the last optional lecture we learned the the basic type in MATLAB is a matrix of double precision floating point numbers. You learned a number
More informationLinear Algebra review Powers of a diagonalizable matrix Spectral decomposition
Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition Prof. Tesler Math 283 Fall 2016 Also see the separate version of this with Matlab and R commands. Prof. Tesler Diagonalizing
More informationEE363 homework 8 solutions
EE363 Prof. S. Boyd EE363 homework 8 solutions 1. Lyapunov condition for passivity. The system described by ẋ = f(x, u), y = g(x), x() =, with u(t), y(t) R m, is said to be passive if t u(τ) T y(τ) dτ
More information