Invariant Subspace Computation - A Geometric Approach
1 Invariant Subspace Computation - A Geometric Approach. Pierre-Antoine Absil. PhD Defense, Université de Liège, Thursday, 13 February.
2 Acknowledgements. Advisor: Prof. R. Sepulchre (Université de Liège). Collaborations with: Prof. P. Van Dooren (Université Catholique de Louvain), Prof. R. Mahony (Australian National University).
3 Outline. Eigenspace computation: definitions. Eigenspace computation: an application. Geometric approach to eigenspace computation. New numerical algorithms: Newton methods. Shifted inverse iterations. Global behaviour and improvements.
4 Topic: refining estimates of eigenspaces. Data: an n-by-n matrix A and a p-dimensional subspace 𝒴, an approximation of an eigenspace 𝒱 of A. Task: compute a better estimate 𝒴+ of 𝒱. Definition of eigenspace: 𝒱 is an eigenspace of A if A𝒱 ⊆ 𝒱.
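The defining property A𝒱 ⊆ 𝒱 can be checked numerically: a basis Y spans an eigenspace exactly when the residual (I - P_Y) A Y vanishes, where P_Y is the orthogonal projector onto span(Y). A minimal sketch (the matrices and the helper name are illustrative, not from the slides):

```python
import numpy as np

def eigenspace_residual(A, Y):
    """Norm of (I - P_Y) A Y, where P_Y is the orthogonal projector onto span(Y)."""
    Q, _ = np.linalg.qr(Y)              # orthonormal basis of span(Y)
    AY = A @ Q
    return np.linalg.norm(AY - Q @ (Q.T @ AY))

A = np.diag([1.0, 2.0, 3.0, 4.0])
Y_good = np.eye(4)[:, :2]               # span{e1, e2}: invariant under A
Y_bad = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, 0.0]])

print(eigenspace_residual(A, Y_good))   # ~0: an eigenspace of A
print(eigenspace_residual(A, Y_bad))    # > 0: not invariant
```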
5 Application: visual positioning. Relative positions have to be recovered from raw images. The raw images are high-dimensional; a projection reduces them to dim-20 processed images.
14 Application: visual positioning. Learning: compute the covariance matrix of the images and its dominant eigenspace.
15 Application: visual positioning. A raw image is represented by its projection onto the eigenspace.
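The learning and projection steps described above can be sketched on synthetic data (the dimensions and variable names are assumptions chosen for the example; this is not the thesis experiment):

```python
import numpy as np

# Illustrative sketch: dominant eigenspace of the image covariance matrix,
# then projection of a new image onto it (synthetic data).
rng = np.random.default_rng(0)
n, m, p = 50, 200, 5                 # pixels per image, training images, subspace dim

X = rng.standard_normal((n, m))      # columns = centered training images
C = (X @ X.T) / m                    # sample covariance matrix
w, V = np.linalg.eigh(C)             # eigenvalues in ascending order
Y = V[:, -p:]                        # orthonormal basis of the dominant eigenspace

image = rng.standard_normal(n)       # a new raw image
coords = Y.T @ image                 # its low-dimensional representation
reconstruction = Y @ coords          # projection onto the eigenspace

print(coords.shape)                  # (5,)
```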
17 Outline. Eigenspace computation: definitions. Eigenspace computation: an application. Geometric approach to eigenspace computation. New numerical algorithms: Newton methods. Shifted inverse iterations. Global behaviour and improvements.
18 Refining estimates of eigenspaces. 𝒴, 𝒱 and 𝒴+ belong to Grass(p, n), with p = dim(𝒴) = dim(𝒱) = dim(𝒴+). Grass(p, n) (Grassmann manifold): the set of p-planes of R^n.
19 Matrix representations. A basis Y = [y_1 ... y_p] ∈ R^{n×p} represents the subspace 𝒴 = span(Y) ∈ Grass(p, n). The representation is not unique: Y and YM span the same subspace for any invertible p×p matrix M.
20 Iterations on the Grassmann manifold. An iteration on Grass(p, n) is implemented on basis matrices Y ∈ R^{n×p}; since Y and YM (M p×p invertible) represent the same subspace, the iteration must be well defined on the equivalence classes {YM}.
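The non-uniqueness Y ↦ YM is easy to check numerically: both bases give the same orthogonal projector, which is one convenient way to compare iterates as points of Grass(p, n). A small sketch (illustrative, not from the slides; M is generically invertible here):

```python
import numpy as np

def projector(Y):
    """Orthogonal projector onto span(Y)."""
    Q, _ = np.linalg.qr(Y)
    return Q @ Q.T

rng = np.random.default_rng(1)
Y = rng.standard_normal((6, 2))
M = rng.standard_normal((2, 2)) + 2 * np.eye(2)     # generically invertible

# Y and YM represent the same point of Grass(2, 6):
print(np.allclose(projector(Y), projector(Y @ M)))  # True
```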
21 Outline. Eigenspace computation: definitions. Eigenspace computation: an application. Geometric approach. New numerical algorithms: Newton methods. Shifted inverse iterations. Global behaviour and improvements.
22 Newton method for eigenspace computation. Define F(Y) := Π A Y, where Π := I - Y(Y^T Y)^{-1} Y^T. Plain Newton method in R^{n×p}: solve F(Y) + DF(Y)[Z] = 0 and update Y+ = Y + Z. Result: Z = -Y, hence Y+ = 0. The unconstrained Newton method fails.
24 Newton method - Horizontal update. Constrain the update to the horizontal space: solve F(Y) + DF(Y)[Z] = 0 subject to Y^T Z = 0. Overdetermined system!
25 Newton method - Relaxations. Two remedies for the overdetermined system F(Y) + DF(Y)[Z] = 0, Y^T Z = 0: (i) least squares, Z = arg min_{Y^T Z = 0} ||F(Y) + DF(Y)[Z]||^2; (ii) projection, Π(F(Y) + DF(Y)[Z]) = 0 with Y^T Z = 0.
26 Newton method - Riemannian interpretation. With F(Y) := Π A Y, the projected relaxation Π(F(Y) + DF(Y)[Z]) = 0, Y^T Z = 0, with update Y+ = Y + Z, is the matrix expression of a Riemannian Newton method on Grass(p, n): solve F(𝒴) + ∇F(𝒴)[Z] = 0 for a tangent vector Z and update 𝒴+ = Exp_𝒴(Z).
27 Newton method for eigenspace computation. Iteration mapping Grass(p, n) → Grass(p, n), 𝒴 ↦ 𝒴+: 1. Pick a basis Y ∈ R^{n×p} that spans 𝒴 and solve the equation Π A Π Z - Z (Y^T Y)^{-1} Y^T A Y = -Π A Y (1) for Z under the constraint Y^T Z = 0, where Π := I - Y(Y^T Y)^{-1} Y^T. 2. Perform the update 𝒴+ = span(Y + Z). (2)
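One way to realize this iteration mapping, assuming Y has orthonormal columns so that Π = I - YY^T, is to write Z = Y_⊥K with Y^T Z = 0, which reduces (1) to the Sylvester equation A22 K - K A11 = -A21 in the complement coordinates. A minimal sketch (the function name is mine, and the dense Kronecker solve is for illustration only, not an efficient implementation):

```python
import numpy as np

def newton_step(A, Y):
    """One Grassmann-Newton step: solve (1) with Z = Y_perp K, update span(Y + Z)."""
    n, p = Y.shape
    Q, _ = np.linalg.qr(Y, mode="complete")
    Yo, Yp = Q[:, :p], Q[:, p:]        # orthonormal basis of span(Y) and its complement
    A11 = Yo.T @ A @ Yo
    A21 = Yp.T @ A @ Yo
    A22 = Yp.T @ A @ Yp
    # Sylvester equation A22 K - K A11 = -A21, vectorized:
    S = np.kron(np.eye(p), A22) - np.kron(A11.T, np.eye(n - p))
    K = np.linalg.solve(S, -A21.reshape(-1, order="F")).reshape((n - p, p), order="F")
    Z = Yp @ K
    Ynew, _ = np.linalg.qr(Yo + Z)     # orthonormal basis of span(Y + Z)
    return Ynew

A = np.diag([1.0, 2.0, 5.0, 6.0, 9.0])
rng = np.random.default_rng(2)
Y = np.linalg.qr(np.eye(5)[:, :2] + 0.05 * rng.standard_normal((5, 2)))[0]
for _ in range(4):
    Y = newton_step(A, Y)
res = np.linalg.norm(A @ Y - Y @ (Y.T @ A @ Y))
print(res)  # near machine precision after a few steps
```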
28 Newton method - Related work. Previous work: Stewart [Ste73]; Dongarra, Moler and Wilkinson [DMW83]; Chatelin [Cha84]; Demmel [Dem87]; Lösche, Schwetlick and Timmerman [LST98]; Edelman, Arias and Smith [EAS98]; Lundström and Eldén [LE02]. New results (PAA, Mahony, Sepulchre [AMS02]): the iteration converges locally quadratically (even when A ≠ A^T). Convergence is locally cubic if A = A^T.
29 Newton method - Cross sections. Fix a cross section S through the current iterate and parameterize it as Y = W + W_⊥ K. Then Π A Y = 0 becomes the quadratic matrix equation A21 + A22 K - K A11 - K A12 K = 0, where A11 = W^T A W, A12 = W^T A W_⊥, A21 = W_⊥^T A W and A22 = W_⊥^T A W_⊥.
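For a known eigenspace this quadratic equation can be verified directly: if V spans an eigenspace of A and W^T V is invertible, then K = (W_⊥^T V)(W^T V)^{-1} is the cross-section coordinate of that eigenspace, and the residual of the equation vanishes. A synthetic sketch (matrices chosen for illustration, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 5, 2
A = np.diag([1.0, 2.0, 5.0, 6.0, 9.0])

Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
W, Wp = Q[:, :p], Q[:, p:]                   # reference basis of the cross section

A11, A12 = W.T @ A @ W, W.T @ A @ Wp
A21, A22 = Wp.T @ A @ W, Wp.T @ A @ Wp

V = np.eye(n)[:, :p]                         # an exact eigenspace: span{e1, e2}
K = (Wp.T @ V) @ np.linalg.inv(W.T @ V)      # its coordinate in the cross section

residual = A21 + A22 @ K - K @ A11 - K @ A12 @ K
print(np.linalg.norm(residual))              # ~0 up to roundoff
```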
30 Newton method - Adaptive cross section. Advantage: if A = A^T, convergence is cubic, instead of quadratic as in the fixed cross section case.
31 Outline. Eigenspace computation: definitions. Eigenspace computation: an application. Geometric approach. New numerical algorithms: Newton methods. Shifted inverse iterations. Global behaviour and improvements.
32 Rayleigh Quotient Iteration - Case p = 1. Iterate (A - ρ(y)I) z = y, where ρ(y) := (y^T A y)/(y^T y) is the Rayleigh quotient, and set span(y+) = span(z). Local cubic convergence to the eigendirections of A = A^T.
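The classical iteration stated above can be sketched in a few lines (an illustrative example, not from the slides; the shifted matrix becomes nearly singular as the iteration converges, which is a well-known feature of RQI, so only a few steps are taken here):

```python
import numpy as np

def rqi(A, y, steps=3):
    """Classical Rayleigh Quotient Iteration (p = 1)."""
    for _ in range(steps):
        rho = (y @ A @ y) / (y @ y)                       # Rayleigh quotient shift
        z = np.linalg.solve(A - rho * np.eye(len(y)), y)  # shifted inverse step
        y = z / np.linalg.norm(z)                         # span(y+) = span(z)
    return y, rho

A = np.diag([1.0, 3.0, 7.0])
y0 = np.array([1.0, 0.2, 0.1])
y, rho = rqi(A, y0 / np.linalg.norm(y0))
print(rho)  # ≈ 1.0: the eigenvalue whose eigendirection attracts this start
```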
33 Rayleigh Quotient Iteration - From p = 1 to p ≥ 1. Wanted: a generalization of (A - ρ(y)I) z = y, ρ(y) := (y^T A y)/(y^T y), that is well defined on Grass(p, n), reduces to RQI when p = 1, and converges locally cubically.
34 Rayleigh Quotient Iteration - Case p ≥ 1. Classical Rayleigh Quotient Iteration (p = 1): (A - ρ(y)I) z = y, ρ(y) := (y^T A y)/(y^T y). Generalized iteration (p ≥ 1): A Z - Z (Y^T Y)^{-1} Y^T A Y = Y. If Y is replaced by YM (M invertible), the solution Z is replaced by ZM, so the map 𝒴 ↦ span(Z) is well defined on Grass(p, n), and it reduces to RQI when p = 1.
40 Rayleigh Quotient Iteration - Local convergence. Solve A Z - Z (Y^T Y)^{-1} Y^T A Y = Y, and let K measure the distance from the iterate to the target eigenspace. Result: ||K+|| ≤ c ||K||^3 (local cubic convergence).
42 Rayleigh Quotient Iteration - The matrix algorithm. Iteration mapping Grass(p, n) → Grass(p, n), 𝒴 ↦ 𝒴+: 1. Pick a basis Y ∈ R^{n×p} that spans 𝒴. 2. Solve A Z - Z (Y^T Y)^{-1} Y^T A Y = Y (3) for Z ∈ R^{n×p}. 3. Define 𝒴+ as the span of Z.
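One step of this matrix algorithm can be sketched by solving (3) in vectorized (Kronecker) form; the dense solve is for illustration only, and the QR orthonormalization is one convenient way to represent span(Z) (the function name and example matrices are assumptions, not from the slides):

```python
import numpy as np

def grqi_step(A, Y):
    """One step of the generalized RQI: solve (3), return an orthonormal basis of span(Z)."""
    n, p = Y.shape
    R = np.linalg.solve(Y.T @ Y, Y.T @ A @ Y)   # block shift (Y^T Y)^{-1} Y^T A Y
    # Sylvester equation A Z - Z R = Y, vectorized:
    S = np.kron(np.eye(p), A) - np.kron(R.T, np.eye(n))
    Z = np.linalg.solve(S, Y.reshape(-1, order="F")).reshape((n, p), order="F")
    Q, _ = np.linalg.qr(Z)                      # orthonormal basis of span(Z)
    return Q

A = np.diag([1.0, 2.0, 8.0, 9.0])
Y = np.linalg.qr(np.eye(4)[:, :2] + 0.02 * np.arange(8.0).reshape(4, 2))[0]
for _ in range(2):
    Y = grqi_step(A, Y)
res = np.linalg.norm(A @ Y - Y @ (Y.T @ A @ Y))
print(res)  # small after two steps (cubic convergence toward an eigenspace)
```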
43 Outline. Eigenspace computation: definitions. Eigenspace computation: an application. Geometric approach. New numerical algorithms: Newton methods. Shifted inverse iterations. Global behaviour and improvements.
44 Global behaviour of iterative methods. Study the basins of attraction of the eigenspaces in Grass(p, n).
45 Global behaviour for p = 1. When p = 1, the Riemannian Newton method coincides with the Rayleigh Quotient Iteration. Basin-of-attraction plot on Grass(1, 3). Reference: Batterson and Smillie [BS89].
46 Newton method - Global behaviour. Basin-of-attraction plots (p = 1; gamma = 0.2 and gamma = 0.01; eigendirections v1, v2, v3). Basins may deteriorate when eigenvalues are clustered.
47 Newton method - Improved global behaviour. Basin-of-attraction plots (p = 2, NG with tau parameter). Remedy: modified Newton methods with large basins of attraction.
48 Rayleigh Quotient Iteration - Global behaviour (p > 1). Theoretical results: open question. Experimental results: basin-of-attraction plots for A Z - Z (Y^T Y)^{-1} Y^T A Y = Y, 𝒴+ = span(Z) (p = 2; gamma = 0.2 and gamma = 0.01).
49 Rayleigh Quotient Iteration - Improved global behaviour (p > 1). Basin-of-attraction plots (p = 2, GRQI with maxvar = .5*pi/5). Remedy: modified Rayleigh Quotient Iteration with limited steps. Reference: PAA, Sepulchre, Van Dooren, Mahony [ASM03].
50 Refining eigenspace estimates - Main results. Newton methods: practical formulation of the Riemannian Newton method on the Grassmann manifold; Newton method for refining eigenspace estimates; quadratic convergence in general, cubic convergence if A = A^T. Shifted inverse iterations: Grassmannian generalization of the Rayleigh Quotient Iteration, using block shifts; local cubic convergence to the eigenspaces of A = A^T; two-sided version for the nonsymmetric case. Study of the global behaviour of the iterations; modified methods with large basins of attraction and high rate of convergence.
51 Refining eigenspace estimates - Conclusion. Geometry: iteration on the Grassmann manifold of p-planes in R^n. Dynamics: 𝒴_{k+1} = F(𝒴_k), a discrete dynamical system. Numerical analysis: compute F efficiently.
52 References
[AMS02] P.-A. Absil, R. Mahony, and R. Sepulchre, Riemannian geometry of Grassmann manifolds with a view on algorithmic computation, submitted to Acta Appl. Math.
[ASM03] P.-A. Absil, R. Sepulchre, P. Van Dooren, and R. Mahony, Cubically convergent iterations for invariant subspace computation, submitted to SIAM J. Matrix Anal. Appl.
[BS89] S. Batterson and J. Smillie, The dynamics of Rayleigh quotient iteration, SIAM J. Numer. Anal. 26 (1989), no. 3.
[Cha84] F. Chatelin, Simultaneous Newton's iteration for the eigenproblem, Computing, Suppl. 5 (1984).
[Dem87] J. W. Demmel, Three methods for refining estimates of invariant subspaces, Computing 38 (1987).
[DMW83] J. J. Dongarra, C. B. Moler, and J. H. Wilkinson, Improving the accuracy of computed eigenvalues and eigenvectors, SIAM J. Numer. Anal. 20 (1983), no. 1.
[EAS98] A. Edelman, T. A. Arias, and S. T. Smith, The geometry of algorithms with orthogonality constraints, SIAM J. Matrix Anal. Appl. 20 (1998), no. 2.
[LE02] E. Lundström and L. Eldén, Adaptive eigenvalue computations using Newton's method on the Grassmann manifold, SIAM J. Matrix Anal. Appl. 23 (2002), no. 3.
[LST98] R. Lösche, H. Schwetlick, and G. Timmerman, A modified block Newton iteration for approximating an invariant subspace of a symmetric matrix, Linear Algebra Appl. (1998).
[Ste73] G. W. Stewart, Error and perturbation bounds for subspaces associated with certain eigenvalue problems, SIAM Review 15 (1973), no. 4.
55 Numerical experiment: solve A Z - Z Y^T A Y = Y with Y^T Y = I and update Y+ = qf(Z), the Q factor of the QR decomposition of Z. (The slide lists A and successive iterates Y numerically.)
56 Variant: solve A Z - Z (Y^T Y)^{-1} Y^T A Y = Y and orthonormalize Z to obtain the next basis.
57 Variant: solve A Z - Z (Y^T Y)^{-1} Y^T A Y = Y, choosing the basis Y such that Y^T A Y is diagonal and Y^T Y = I.
More informationThe Nullspace free eigenvalue problem and the inexact Shift and invert Lanczos method. V. Simoncini. Dipartimento di Matematica, Università di Bologna
The Nullspace free eigenvalue problem and the inexact Shift and invert Lanczos method V. Simoncini Dipartimento di Matematica, Università di Bologna and CIRSA, Ravenna, Italy valeria@dm.unibo.it 1 The
More informationComputation of eigenvalues and singular values Recall that your solutions to these questions will not be collected or evaluated.
Math 504, Homework 5 Computation of eigenvalues and singular values Recall that your solutions to these questions will not be collected or evaluated 1 Find the eigenvalues and the associated eigenspaces
More informationECS231 Handout Subspace projection methods for Solving Large-Scale Eigenvalue Problems. Part I: Review of basic theory of eigenvalue problems
ECS231 Handout Subspace projection methods for Solving Large-Scale Eigenvalue Problems Part I: Review of basic theory of eigenvalue problems 1. Let A C n n. (a) A scalar λ is an eigenvalue of an n n A
More informationDavidson Method CHAPTER 3 : JACOBI DAVIDSON METHOD
Davidson Method CHAPTER 3 : JACOBI DAVIDSON METHOD Heinrich Voss voss@tu-harburg.de Hamburg University of Technology The Davidson method is a popular technique to compute a few of the smallest (or largest)
More informationEigenvalues and Eigenvectors A =
Eigenvalues and Eigenvectors Definition 0 Let A R n n be an n n real matrix A number λ R is a real eigenvalue of A if there exists a nonzero vector v R n such that A v = λ v The vector v is called an eigenvector
More informationFamily Feud Review. Linear Algebra. October 22, 2013
Review Linear Algebra October 22, 2013 Question 1 Let A and B be matrices. If AB is a 4 7 matrix, then determine the dimensions of A and B if A has 19 columns. Answer 1 Answer A is a 4 19 matrix, while
More informationLinear Systems. Class 27. c 2008 Ron Buckmire. TITLE Projection Matrices and Orthogonal Diagonalization CURRENT READING Poole 5.4
Linear Systems Math Spring 8 c 8 Ron Buckmire Fowler 9 MWF 9: am - :5 am http://faculty.oxy.edu/ron/math//8/ Class 7 TITLE Projection Matrices and Orthogonal Diagonalization CURRENT READING Poole 5. Summary
More information7. Symmetric Matrices and Quadratic Forms
Linear Algebra 7. Symmetric Matrices and Quadratic Forms CSIE NCU 1 7. Symmetric Matrices and Quadratic Forms 7.1 Diagonalization of symmetric matrices 2 7.2 Quadratic forms.. 9 7.4 The singular value
More informationOptimization On Manifolds: Methods and Applications
1 Optimization On Manifolds: Methods and Applications Pierre-Antoine Absil (UCLouvain) BFG 09, Leuven 18 Sep 2009 2 Joint work with: Chris Baker (Sandia) Kyle Gallivan (Florida State University) Eric Klassen
More informationOrthogonal iteration to QR
Notes for 2016-03-09 Orthogonal iteration to QR The QR iteration is the workhorse for solving the nonsymmetric eigenvalue problem. Unfortunately, while the iteration itself is simple to write, the derivation
More informationAn Asynchronous Algorithm on NetSolve Global Computing System
An Asynchronous Algorithm on NetSolve Global Computing System Nahid Emad S. A. Shahzadeh Fazeli Jack Dongarra March 30, 2004 Abstract The Explicitly Restarted Arnoldi Method (ERAM) allows to find a few
More informationThe University of Texas at Austin Department of Electrical and Computer Engineering. EE381V: Large Scale Learning Spring 2013.
The University of Texas at Austin Department of Electrical and Computer Engineering EE381V: Large Scale Learning Spring 2013 Assignment Two Caramanis/Sanghavi Due: Tuesday, Feb. 19, 2013. Computational
More informationMATH Linear Algebra
MATH 304 - Linear Algebra In the previous note we learned an important algorithm to produce orthogonal sequences of vectors called the Gramm-Schmidt orthogonalization process. Gramm-Schmidt orthogonalization
More informationMathematical foundations - linear algebra
Mathematical foundations - linear algebra Andrea Passerini passerini@disi.unitn.it Machine Learning Vector space Definition (over reals) A set X is called a vector space over IR if addition and scalar
More information1 Invariant subspaces
MATH 2040 Linear Algebra II Lecture Notes by Martin Li Lecture 8 Eigenvalues, eigenvectors and invariant subspaces 1 In previous lectures we have studied linear maps T : V W from a vector space V to another
More informationContents. Preface for the Instructor. Preface for the Student. xvii. Acknowledgments. 1 Vector Spaces 1 1.A R n and C n 2
Contents Preface for the Instructor xi Preface for the Student xv Acknowledgments xvii 1 Vector Spaces 1 1.A R n and C n 2 Complex Numbers 2 Lists 5 F n 6 Digression on Fields 10 Exercises 1.A 11 1.B Definition
More informationMATH 304 Linear Algebra Lecture 23: Diagonalization. Review for Test 2.
MATH 304 Linear Algebra Lecture 23: Diagonalization. Review for Test 2. Diagonalization Let L be a linear operator on a finite-dimensional vector space V. Then the following conditions are equivalent:
More informationSOLVING MESH EIGENPROBLEMS WITH MULTIGRID EFFICIENCY
SOLVING MESH EIGENPROBLEMS WITH MULTIGRID EFFICIENCY KLAUS NEYMEYR ABSTRACT. Multigrid techniques can successfully be applied to mesh eigenvalue problems for elliptic differential operators. They allow
More informationMaximization of the sum of the trace ratio on the Stiefel manifold, I: Theory
SCIENCE CHINA Mathematics. ARTICLES. doi: 10.1007/s11425-014-4824-0 Maximization of the sum of the trace ratio on the Stiefel manifold, I: Theory ZHANG LeiHong 1 & LI RenCang 2, 1 Department of Applied
More informationMatrix Algorithms. Volume II: Eigensystems. G. W. Stewart H1HJ1L. University of Maryland College Park, Maryland
Matrix Algorithms Volume II: Eigensystems G. W. Stewart University of Maryland College Park, Maryland H1HJ1L Society for Industrial and Applied Mathematics Philadelphia CONTENTS Algorithms Preface xv xvii
More informationRanking from Crowdsourced Pairwise Comparisons via Matrix Manifold Optimization
Ranking from Crowdsourced Pairwise Comparisons via Matrix Manifold Optimization Jialin Dong ShanghaiTech University 1 Outline Introduction FourVignettes: System Model and Problem Formulation Problem Analysis
More information