Solution of the Inverse Eigenvalue Problem for Certain (Anti-) Hermitian Matrices Using Newton's Method


Journal of Mathematics Research; Vol. 6, No. 2; 2014
ISSN 1916-9795  E-ISSN 1916-9809
Published by Canadian Center of Science and Education

Solution of the Inverse Eigenvalue Problem for Certain (Anti-) Hermitian Matrices Using Newton's Method

Francis T. Oduro (1)
(1) Department of Mathematics, Kwame Nkrumah University of Science and Technology, Kumasi, Ghana
Correspondence: Francis T. Oduro, Department of Mathematics, Kwame Nkrumah University of Science and Technology, Kumasi, Ghana. E-mail: francistoduro@yahoo.co.uk

Received: May 9, 2013  Accepted: February 19, 2014  Online Published: April 18, 2014
doi:10.5539/jmr.v6n2p64  URL: http://dx.doi.org/10.5539/jmr.v6n2p64

Abstract
Using distinct nonzero diagonal elements of a Hermitian (or anti-Hermitian) matrix as independent variables of the characteristic polynomial function, a Newton's algorithm is developed for the solution of the inverse eigenproblem given distinct nonzero eigenvalues. It is found that if a singular Hermitian (or singular anti-Hermitian) matrix of rank one is used as the initial matrix, convergence to an exact solution is achieved in only one step. This result can be extended to n × n matrices provided the target eigenvalues are respectively of multiplicities p and q with p + q = n and 1 ≤ p, q < n. Moreover, the initial matrix would be of rank one and would have only two distinct corresponding nonzero diagonal elements, the rest being repeated. To illustrate the result, numerical examples are given for the cases n = 2, 3 and 4.

Keywords: Hermitian matrix, anti-Hermitian matrix, inverse eigenproblem, Newton's method

1 Introduction
The matrix inverse eigenvalue problem entails the reconstruction of a matrix from its eigenvalues. It has been of interest not only to algebraists but also to numerical analysts, control theorists, statisticians and engineers. Most research efforts have been directed at solving the inverse eigenvalue problem for nonsingular symmetric matrices (Chu & Golub, 2005; Gladwell, 2004; Deakin & Luke, 1992; Chu, 1995).
Recently, however, the case of singular symmetric matrices of arbitrary order and rank has been virtually solved, provided linear dependency relations are specified (Gyamfi, Oduro, & Aidoo, 2013; Aidoo, Gyamfi, Ackora-Prah, & Oduro, 2013). It is plausible to expect solutions of the inverse eigenvalue problem in a sufficiently small neighborhood of any such Hermitian matrix. In this paper, therefore, we develop a Newton's method for the solution of the inverse eigenvalue problem for an (anti-) Hermitian matrix using a singular (anti-) Hermitian matrix as the initial approximation. In the case of an (anti-) Hermitian matrix with nonzero diagonal elements, we show that an exact solution is obtained in only one step if the initial matrix is of rank one. This result is subsequently extended to n × n matrices provided the target eigenvalues are respectively of multiplicities p and q with p + q = n and 1 ≤ p, q < n.

2 Preliminaries
In this section we review previous results obtained by Gyamfi et al. (2013) and Aidoo et al. (2013) in respect of the inverse eigenvalue problem for singular symmetric matrices, which are here extended to include singular (anti-) Hermitian matrices as well.

Lemma 1. There exists a one-to-one correspondence between the elements of a Hermitian or anti-Hermitian matrix and its distinct nonzero eigenvalues if and only if the matrix is of rank 1 (Aidoo et al., 2013).

Proof. Let the given Hermitian or anti-Hermitian matrix be of rank r; then, clearly, the number of independent elements is r(r+1)/2. Thus a one-to-one onto correspondence will exist between the elements of the matrix and its distinct nonzero eigenvalues if and only if r(r+1)/2 = r, i.e., if and only if r = 1.

Proposition 1. If the row dependence relations for a Hermitian or anti-Hermitian matrix of rank 1 are specified as follows:

R_i = k_{i-1} R_1,   i = 2, ..., n   (1)

where R_i is the ith row and each k_i is a nonzero scalar, then the matrix can be generated from its nonzero eigenvalue λ:

A = a_11 [ 1        k_1          k_2          ...  k_{n-1}
           k_1      k_1^2        k_1 k_2      ...  k_1 k_{n-1}
           ...
           k_{n-1}  k_1 k_{n-1}  k_2 k_{n-1}  ...  k_{n-1}^2 ]   (2)

where

a_11 = λ / (1 + k_1^2 + ... + k_{n-1}^2)

Proof. By induction on n. A similar result has also been obtained for singular matrices of rank greater than 1.

3 Main Results
3.1 Characteristic Function of Matrix Elements
Let X = (x_1, x_2, ..., x_n) ∈ R^n, where x_1, x_2, ..., x_n are n elements of a Hermitian matrix A of order n × n. For given distinct (target) eigenvalues λ_1, λ_2, ..., λ_n of another Hermitian matrix A_t, called the target matrix, we now define a characteristic function as a function of n elements of A given by

f_i(x_1, x_2, ..., x_n) = Σ_{k=0}^{n} (-1)^k I_k λ_i^{n-k}   (3)

Here I_k, the coefficient of λ_i^{n-k}, is the kth principal invariant of the second order Hermitian tensor represented by A. As is well known, I_k can be written in terms of the eigenvalues of A, which we denote here by μ_1, μ_2, ..., μ_n:

I_k = (1/(k!(n-k)!)) Σ_{π ∈ S_n} Π_{j=1}^{k} μ_{π(j)} = Σ_{1 ≤ j_1 < ... < j_k ≤ n} μ_{j_1} μ_{j_2} ... μ_{j_k}   (4)

where π ∈ S_n is a permutation of the natural numbers 1, 2, ..., n and S_n is the symmetric group on n symbols. Moreover, the I_k may also be expressed in terms of the trace of A and its powers. For our purposes it suffices to write down the first few terms:

I_0 = 1
I_1 = tr A = Σ_{j=1}^{n} μ_j
I_2 = (1/2)[(tr A)^2 - tr(A^2)] = Σ_{i<j} μ_i μ_j
I_n = det A = Π_{j=1}^{n} μ_j

Note that when A coincides with A_t, each characteristic function reduces to a characteristic equation in λ_i:

f_i(x_1, x_2, ..., x_n) = 0,   i = 1, 2, ..., n   (5)

For our purposes it is convenient to choose x_1 = a_11, x_2 = a_22, ..., x_n = a_nn, the n diagonal elements of the matrix A, which are also used as the n independent variables of each f_i:

f_i(a_11, a_22, ..., a_nn) = 0,   i = 1, 2, ..., n   (6)

Definition 1 (An Inverse Eigenvalue Problem). An inverse eigenvalue problem can be formulated as follows: to find the diagonal elements a_ii of the target (anti-) Hermitian matrix A_t, given its non-diagonal elements, as specified in Equation (2), and its spectrum {λ_1, ..., λ_n}.
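The rank-one construction of Proposition 1 is easy to check numerically. The following minimal sketch (assuming NumPy; the name `build_rank_one` and the test values are illustrative, not from the paper) builds A = a_11 v v^H with v = (1, k_1, ..., k_{n-1})^T and confirms that the single nonzero eigenvalue is exactly λ, and that the invariant I_2 vanishes, as rank one requires:

```python
import numpy as np

# Sketch of Proposition 1 (illustrative names, not from the paper):
# A = a_11 * v v^H with v = (1, k_1, ..., k_{n-1})^T and
# a_11 = lam / (1 + |k_1|^2 + ... + |k_{n-1}|^2),
# so the single nonzero eigenvalue of the rank-one Hermitian A is lam.
def build_rank_one(k, lam):
    v = np.concatenate(([1.0], np.asarray(k, dtype=complex)))
    a11 = lam / np.vdot(v, v).real        # 1 + sum of |k_i|^2
    return a11 * np.outer(v, v.conj())

A = build_rank_one([2.0, 2.0], lam=3.0)   # n = 3, k_1 = k_2 = 2
mu = np.linalg.eigvalsh(A)                # eigenvalues, ascending

# Rank one forces every principal invariant beyond the trace to vanish;
# here we check I_2 = (1/2)[(tr A)^2 - tr(A^2)].
I2 = 0.5 * (np.trace(A) ** 2 - np.trace(A @ A))
print(mu, I2)
```

Passing, e.g., k = [2.0, 2.0j] produces a genuinely complex Hermitian initial matrix of the kind used in the examples below; the same checks go through.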

Remark 1. The solution of the IEP stated above is then simply the solution of Equation (6). Now this equation is clearly solvable by Newton's method. This will require the Jacobian matrix J of the smooth vector function with components f_i, which is given by

J = [ ∂f_i / ∂a_jj ]   (7)

The entries of J may be computed as follows.

Theorem 1.

∂f_i / ∂a_jj = -λ_i^{n-1} + tr(M_j) λ_i^{n-2} - ... + (-1)^n det(M_j)   (8)

where M_j is the (n-1) × (n-1) matrix obtained by deleting the jth row and the jth column of A.

Theorem 2.

det J = Π_{i<j} (a_ii - a_jj)(λ_i - λ_j)   (9)

Proof. From the expression

∂f_i / ∂a_jj = -λ_i^{n-1} + tr(M_j) λ_i^{n-2} - ... + (-1)^n det(M_j)

it is clear that the Jacobian matrix will have the ith and the kth rows equal if λ_i is the same as λ_k. Also, the Jacobian matrix will have the jth and lth columns equal if the diagonal elements a_jj and a_ll are equal. The result follows from the vanishing of a determinant with repeated rows or repeated columns, as well as (up to multiplication by a constant) from the factor theorem of elementary algebra. It is easily checked that, by the definition of the characteristic polynomial adopted, the constant multiple is 1.

Theorem 3. If the (anti-) Hermitian matrix A is of rank r, the characteristic polynomial functions reduce to the forms

f_i(a_11, a_22, ..., a_nn) = Σ_{k=0}^{r} (-1)^k I_k λ_i^{n-k},   i = 1, 2, ..., n   (10)

Proof. If A is (anti-) Hermitian of rank r, then all the possible invariants of A for which k > r vanish (from Equation (4)).

Corollary 1. If the (anti-) Hermitian matrix A is of rank one, then the characteristic polynomial functions reduce to the forms

f_i(a_11, a_22, ..., a_nn) = λ_i^n - (tr A) λ_i^{n-1}   (11)

With A being (anti-) Hermitian and of rank 1, all the invariants of A vanish except the trace, and the f_i become linear functions of the diagonal elements of A.

Theorem 4 (Newton's Algorithm). Let A^(0) be (anti-) Hermitian and X^(0) = (a_11^(0), a_22^(0), ..., a_nn^(0)). Then, provided the diagonal elements a_ii^(0) of A^(0) are distinct, the inverse eigenvalue problem stated above can be solved in a sufficiently
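Theorems 1 and 2 can be exercised numerically. In the sketch below (assuming NumPy; `jacobian` is an illustrative name, not from the paper), each Jacobian entry is computed as -det(λ_i I - M_j), which is Equation (8) in closed form; for n = 2 the determinant of J matches (a_11 - a_22)(λ_1 - λ_2) up to sign, and for a rank-one 3 × 3 matrix, where every det(M_j) = 0, the Jacobian is singular, as Theorem 5 below asserts:

```python
import numpy as np

# Entry (i, j) of J per Theorem 1: -det(lam_i * I - M_j), where M_j is A
# with row j and column j deleted. Illustrative name, not from the paper.
def jacobian(A, lams):
    n = A.shape[0]
    J = np.empty((n, n), dtype=complex)
    for j in range(n):
        keep = [r for r in range(n) if r != j]
        Mj = A[np.ix_(keep, keep)]
        for i, lam in enumerate(lams):
            J[i, j] = -np.linalg.det(lam * np.eye(n - 1) - Mj)
    return J

# n = 2: det J agrees with (a_11 - a_22)(lam_1 - lam_2) up to sign (Theorem 2).
A2 = np.array([[1.0, 2.0], [2.0, 4.0]])
dJ = np.linalg.det(jacobian(A2, [-1.0, 3.0]))
prod = (A2[0, 0] - A2[1, 1]) * (-1.0 - 3.0)
print(dJ, prod)                     # same magnitude

# n = 3, rank one: every det(M_j) = 0, so the entries reduce to
# -lam_i^2 + tr(M_j) * lam_i and J is singular.
v = np.array([1.0, 2.0, 2.0])
A3 = np.outer(v, v) / 3.0
print(np.linalg.det(jacobian(A3, [-1.0, 1.0, 2.0])))   # ~ 0
```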
small neighbourhood of X^(0) by Newton's method, given by the scheme

X^(N+1) = X^(N) - J^{-1} f(X^(N)),   N = 0, 1, 2, ...   (12)

Here the Jacobian matrix J must be invertible.

Theorem 5 (Singularity of J for n > 2 and distinct eigenvalues). The Jacobian matrix corresponding to distinct target eigenvalues λ_1, λ_2, ..., λ_n, n > 2, is singular.

Proof. Clearly, it is sufficient to prove this result for the case n = 3. For the rank-one initial matrices considered here det(M_j) = 0, so that

∂f_i / ∂a_jj = -λ_i^2 + tr(M_j) λ_i   (13)

Therefore

J = [ -λ_1^2 + (a_22 + a_33)λ_1   -λ_1^2 + (a_11 + a_33)λ_1   -λ_1^2 + (a_11 + a_22)λ_1
      -λ_2^2 + (a_22 + a_33)λ_2   -λ_2^2 + (a_11 + a_33)λ_2   -λ_2^2 + (a_11 + a_22)λ_2
      -λ_3^2 + (a_22 + a_33)λ_3   -λ_3^2 + (a_11 + a_33)λ_3   -λ_3^2 + (a_11 + a_22)λ_3 ]

Subtracting the first column from the second and from the third leaves the columns (a_11 - a_22)(λ_1, λ_2, λ_3)^T and (a_11 - a_33)(λ_1, λ_2, λ_3)^T, which are proportional. A determinant with proportional columns vanishes, so

det J = 0   (14)

Corollary 2. If A^(0) is (anti-) Hermitian, singular, of rank 1 and with distinct nonzero diagonal elements, then f at X^(0) is linear. Therefore, the solution of the IEP by Newton's method is exact and is obtained in one step only. This is only true for n = 2.

Example 1 (The case n = 2). From Equation (11), for n = 2, the characteristic functions of an (anti-) Hermitian matrix A of rank one are given as

f_1(a_11, a_22) = λ_1^2 - λ_1(a_11 + a_22)
f_2(a_11, a_22) = λ_2^2 - λ_2(a_11 + a_22)

From Equation (8), the corresponding Jacobian matrix is given as

J = [ ∂f_i / ∂a_jj ] = [ a_22 - λ_1   a_11 - λ_1
                         a_22 - λ_2   a_11 - λ_2 ]   (15)

The inverse of the Jacobian matrix is then computed as follows:

J^{-1} = 1 / ((λ_2 - λ_1)(a_11 - a_22)) [ a_11 - λ_2   λ_1 - a_11
                                          λ_2 - a_22   a_22 - λ_1 ]   (16)

Note that for J to be invertible, the target eigenvalues λ_1 and λ_2 must be distinct. So also must be the diagonal elements of A.

Numerical Example 1. Let λ_1 = -1 and λ_2 = 3. Now suppose the initial matrix is given by the singular symmetric matrix

A^(0) = [ 1  2
          2  4 ]

so that a_11^(0) = 1, a_22^(0) = 4, f_1(a_11^(0), a_22^(0)) = 1 + 5 = 6 and f_2(a_11^(0), a_22^(0)) = 9 - 15 = -6. Thus

[ f_1(a_11^(0), a_22^(0)) ; f_2(a_11^(0), a_22^(0)) ] = [ 6 ; -6 ]

and Newton's method then gives

[ a_11^(1) ; a_22^(1) ] = [ 1 ; 4 ] + (1/12) [ -2  -2 ; -1  5 ] [ 6 ; -6 ] = [ 1 ; 4 ] + [ 0 ; -3 ] = [ 1 ; 1 ]

Hence

A^(1) = [ 1  2
          2  1 ]

It is easily checked that the eigenvalues of A^(1) are -1 and 3.

Note also that the Hermitian matrices

[ 1  -2 ; -2  4 ],   [ 1  2i ; -2i  4 ],   [ 1  -2i ; 2i  4 ]

also each yield the same solution of the IEP as [ 1  2 ; 2  4 ]. Similarly, the anti-Hermitian matrices

[ i  2i ; 2i  4i ],   [ i  2 ; -2  4i ],   [ i  -2 ; 2  4i ]

also each yield the same solution of the IEP as [ i  2i ; 2i  i ], where -i and 3i are the target eigenvalues in this case.

Theorem 6. Let the target eigenvalues λ_1, λ_2, ..., λ_n have multiplicities p and q, where p + q = n and 1 ≤ p, q < n, and let the initial matrix A possess corresponding diagonal elements repeated p times and q times respectively, with non-diagonal elements as prescribed by Equation (2). Then the n characteristic polynomial functions may be reduced to only two:

f_i(a_11, a_22, ..., a_nn) = f_i(a_kk, ..., a_kk [p times], a_mm, ..., a_mm [q times]) = φ_i(a_kk, a_mm),   i = 1, 2

where

φ_1(a_kk, a_mm) = (p λ_k)^2 - p λ_k (p a_kk + q a_mm)
φ_2(a_kk, a_mm) = (q λ_m)^2 - q λ_m (p a_kk + q a_mm)

with corresponding Jacobian matrix

J_φ = [ ∂φ_i / ∂a_jj ] = [ q a_mm - p λ_k   p a_kk - p λ_k
                           q a_mm - q λ_m   p a_kk - q λ_m ]   (17)

which may be used to solve for the diagonal elements a_11^(1), a_22^(1) of the IEP by Newton's method in only one step. Subsequently, at least n - 1 non-diagonal elements may also be computed using Equation (4) as a constraint.

Proof. The diagonal elements of the initial matrix a_11^(0), a_22^(0) are, without loss of generality, replaced respectively by p a_kk, q a_mm in the Jacobian matrix in Equation (15), as well as in the corresponding characteristic functions. The target eigenvalues λ_k, λ_m are also respectively replaced by p λ_k, q λ_m in the same expressions. Thus, the solution of the IEP by Newton's method is here also exact in only one step. Subsequently, at least n - 1 non-diagonal elements may also be computed using Equation (4) as a constraint.

Example 2 (The case n = 3). Consider a 3 × 3 (anti-) Hermitian initial matrix of rank 1 with non-diagonal elements as prescribed by
Equation (2):

A = a_11 [ 1    k_1      k_2
           k_1  k_1^2    k_1 k_2
           k_2  k_1 k_2  k_2^2 ]

In order to have repeated diagonal elements of the initial matrix (say a_22^(0) = a_33^(0)), we can put k_1 = k_2 or k_1 = ±i k_2, where A^(0) is an (anti-) Hermitian matrix of rank 1.
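Before turning to the numerical case, the reduced two-variable Newton step of Theorem 6 can be sketched in code (assuming NumPy; `one_step` is an illustrative name, not from the paper). With p = q = 1 it reduces to the n = 2 scheme and reproduces Numerical Example 1 exactly in one step:

```python
import numpy as np

# One Newton step on the pair (u, v) = (p*a_kk, q*a_mm) with targets
# (lu, lv) = (p*lam_k, q*lam_m), using the characteristic functions of
# Theorem 6 and the Eq. (15)-style Jacobian. Illustrative sketch.
def one_step(u0, v0, lu, lv):
    f = np.array([lu ** 2 - lu * (u0 + v0),
                  lv ** 2 - lv * (u0 + v0)])
    J = np.array([[v0 - lu, u0 - lu],
                  [v0 - lv, u0 - lv]])
    return np.array([u0, v0]) - np.linalg.solve(J, f)

# p = q = 1 reproduces Numerical Example 1: start from the diagonal (1, 4)
# of the rank-one initial matrix, with targets lam_1 = -1, lam_2 = 3.
u1, v1 = one_step(1.0, 4.0, -1.0, 3.0)
A1 = np.array([[u1, 2.0], [2.0, v1]])
print(u1, v1, np.linalg.eigvalsh(A1))   # diagonal becomes (1, 1); spectrum {-1, 3}
```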

Numerical Example 2. In particular, for the matrix above with a_11 = 1, k_1 = 2, and k_2 = 2 or ±2i, we have the following possibilities:

A^(0) = [ 1  2  2
          2  4  4
          2  4  4 ]

or

A^(0) = [ 1    2    -2i
          2    4    -4i
          2i   4i    4 ]

or

A^(0) = [ 1    2    2i
          2    4    4i
         -2i  -4i   4 ]

All these initial Hermitian matrices, together with their anti-Hermitian counterparts, should lead to the same solution of the IEP given any target spectrum of multiplicity 2: λ_1, λ_2 = λ_3. The characteristic functions of the problem are given as (p = 1 and q = 2):

φ_1(a_11, a_22) = λ_1^2 - λ_1(a_11 + 2 a_22)
φ_2(a_11, a_22) = (2 λ_2)^2 - 2 λ_2 (a_11 + 2 a_22)

The corresponding Jacobian matrix also is given as

J_φ = [ ∂φ_i / ∂a_jj ] = [ 2 a_22 - λ_1     a_11 - λ_1
                           2 a_22 - 2 λ_2   a_11 - 2 λ_2 ]

The inverse of the Jacobian matrix is then computed as follows:

J_φ^{-1} = 1 / ((2 λ_2 - λ_1)(a_11 - 2 a_22)) [ a_11 - 2 λ_2     λ_1 - a_11
                                                2 λ_2 - 2 a_22   2 a_22 - λ_1 ]

Thus

[ a_11^(1) ; a_22^(1) ] = [ a_11^(0) ; a_22^(0) ] - J_φ^{-1} [ φ_1(a_11^(0), a_22^(0)) ; φ_2(a_11^(0), a_22^(0)) ]

As a numerical example we now solve the inverse eigenvalue problem for λ_1 = -1, λ_2 = λ_3 = 3/2. With a_11^(0) = 1 and a_22^(0) = 4,

φ_1(a_11^(0), a_22^(0)) = 1 + (1 + 2(4)) = 10
φ_2(a_11^(0), a_22^(0)) = 9 - 3(1 + 2(4)) = -18

so that

[ φ_1(a_11^(0), a_22^(0)) ; φ_2(a_11^(0), a_22^(0)) ] = [ 10 ; -18 ]

Newton's method then gives the diagonal elements:

[ a_11^(1) ; 2 a_22^(1) ] = [ 1 ; 8 ] + (1/60) [ -2  -2 ; -13  17 ] [ 10 ; -18 ] = [ 1 ; 8 ] + (1/60) [ 16 ; -436 ] = [ 1.2667 ; 0.7333 ]

i.e., a_11^(1) = 1.2667 and a_22^(1) = a_33^(1) = 0.3667. Two further non-diagonal elements a_12 and a_23 (putting a_13 = a_31 = 0) may be obtained by solving the constraints:

(1/2)[(tr A)^2 - tr(A^2)] = λ_1 λ_2 + λ_1 λ_3 + λ_2 λ_3
det A = λ_1 λ_2 λ_3

Thus, a symmetric solution of the IEP is obtained as follows:

A^(1) = [ 1.2667   0.3700i  0
          0.3700i  0.3667   1.3966
          0        1.3966   0.3667 ]

It is easily checked that the eigenvalues of A^(1) are (-1, 3/2, 3/2).
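The printed solution can be checked directly (assuming NumPy). Because the entries above are rounded to four decimals, the double eigenvalue 3/2 is only recovered to about two decimal places; the sketch below verifies the spectrum to that accuracy:

```python
import numpy as np

# The complex symmetric (not Hermitian) solution printed above, with the
# rounded entries as given in the text; eigvals (not eigvalsh) is required.
A1 = np.array([[1.2667, 0.3700j, 0.0],
               [0.3700j, 0.3667, 1.3966],
               [0.0, 1.3966, 0.3667]])
ev = np.sort_complex(np.linalg.eigvals(A1))
print(np.round(ev, 2))   # close to [-1, 1.5, 1.5]
```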

Note that, as in the previous example, certain 3 × 3 initial Hermitian matrices with their anti-Hermitian counterparts should lead to similar complex symmetric or (anti-) Hermitian solutions to the IEP, provided the non-diagonal elements are appropriately prescribed using Equations (2) and (4).

Example 3 (The case n = 4). For the case n = 4, when the target eigenvalues λ_1 and λ_2 each have multiplicity 2, the characteristic functions of the (anti-) Hermitian matrix of rank one reduce to the following forms:

φ_1(a_11, a_22) = (2 λ_1)^2 - 2 λ_1 (2 a_11 + 2 a_22)
φ_2(a_11, a_22) = (2 λ_2)^2 - 2 λ_2 (2 a_11 + 2 a_22)

Then, the corresponding Jacobian matrix is also given as

J_φ = [ ∂φ_i / ∂a_jj ] = [ 2 a_22 - 2 λ_1   2 a_11 - 2 λ_1
                           2 a_22 - 2 λ_2   2 a_11 - 2 λ_2 ]

The inverse of the Jacobian matrix is then computed as follows:

J_φ^{-1} = 1 / ((2 λ_2 - 2 λ_1)(2 a_11 - 2 a_22)) [ 2 a_11 - 2 λ_2   2 λ_1 - 2 a_11
                                                    2 λ_2 - 2 a_22   2 a_22 - 2 λ_1 ]

Numerical Example 3. As a numerical example we now solve the inverse eigenvalue problem for λ_1 = λ_3 = -1/2 and λ_2 = λ_4 = 3/2. With a_11^(0) = 1 and a_22^(0) = 4,

φ_1(a_11^(0), a_22^(0)) = 1 + (2 + 2(4)) = 11
φ_2(a_11^(0), a_22^(0)) = 9 - 3(2 + 2(4)) = -21

so that

[ φ_1(a_11^(0), a_22^(0)) ; φ_2(a_11^(0), a_22^(0)) ] = [ 11 ; -21 ]

and Newton's method then gives the diagonal elements:

[ 2 a_11^(1) ; 2 a_22^(1) ] = [ 2 ; 8 ] + (1/24) [ -1  -3 ; -5  9 ] [ 11 ; -21 ] = [ 2 ; 8 ] + (1/24) [ 52 ; -244 ] = [ 4.1667 ; -2.1667 ]

i.e., a_11^(1) = a_33^(1) = 2.0833 and a_22^(1) = a_44^(1) = -1.0833. Two further distinct non-diagonal elements a_12 and a_23 (putting a_12 = a_34, and a_14 = a_41 = a_13 = a_31 = a_24 = a_42 = 0) may be obtained by solving the constraints:

(1/2)[(tr A)^2 - tr(A^2)] = λ_1 λ_2 + λ_1 λ_3 + λ_1 λ_4 + λ_2 λ_3 + λ_2 λ_4 + λ_3 λ_4
det A = λ_1 λ_2 λ_3 λ_4

Thus, a symmetric solution of the IEP is obtained as follows:

A^(1) = [ 2.0833   1.3851   0        0
          1.3851  -1.0833   0.7701i  0
          0        0.7701i  2.0833   1.3851
          0        0        1.3851  -1.0833 ]

It is easily checked that the eigenvalues of A^(1) are (-1/2, 3/2, -1/2, 3/2).

Note that, as in the previous examples, certain 4 × 4 initial Hermitian matrices with their anti-Hermitian counterparts should lead to similar complex symmetric or (anti-) Hermitian solutions to the IEP, provided the non-diagonal elements are appropriately prescribed using Equations (2) and (4).

4 Conclusion
We have obtained, in this study, a Newton's
algorithm for solving the inverse eigenvalue problem for certain (anti-) Hermitian matrices and demonstrated that, in the neighbourhood of certain singular (anti-) Hermitian matrices of rank 1, the solution is obtained in one step.

References

Aidoo, A. Y., Gyamfi, K. B., Ackora-Prah, J., & Oduro, F. T. (2013). Solvability of the inverse eigenvalue problem for dense symmetric matrices. Advances in Pure Mathematics, 3, 14-19.

Chu, M. T. (1995). Constructing a Hermitian matrix from its diagonal elements and eigenvalues. SIAM Journal on Matrix Analysis and Applications, 16, 207-217.

Chu, M. T., & Golub, G. H. (2005). Inverse Eigenvalue Problems: Theory, Algorithms and Applications. New York: Oxford University Press. http://dx.doi.org/10.1093/acprof:oso/9780198566649.001.0001

Deakin, A. S., & Luke, T. M. (1992). On the inverse eigenvalue problem for matrices. Journal of Physics A: Mathematical and General, 25, 635-648. http://dx.doi.org/10.1088/0305-4470/25/3/020

Gladwell, G. M. L. (2004). Inverse Problems in Vibration (2nd ed.). Dordrecht: Kluwer Academic Publishers.

Gyamfi, K. B., Oduro, F. T., & Aidoo, A. Y. (2013). Solution of an inverse eigenvalue problem by Newton's method on a fibre bundle with structure group SO(n). Journal of Mathematics and System Science, 3(3).

Copyrights
Copyright for this article is retained by the author(s), with first publication rights granted to the journal. This is an open-access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).