FORMS ON INNER PRODUCT SPACES


MARIA INFUSINO

PROSEMINAR ON LINEAR ALGEBRA WS 2016/2017, UNIVERSITY OF KONSTANZ

Abstract. This note aims to give an introduction to forms on inner product spaces and their relation to linear operators. After briefly recalling some basic concepts from the theory of linear operators on inner product spaces, we focus on the space of forms on a real or complex finite-dimensional vector space $V$ and show that it is isomorphic to the space of linear operators on $V$. We also describe the matrix representation of a form with respect to an ordered basis of the space on which it is defined, giving special attention to the case of forms on finite-dimensional complex inner product spaces and in particular to Hermitian forms.

Contents
Introduction
1. Preliminaries
2. Sesquilinear forms on inner product spaces
3. Matrix representation of a form
4. Hermitian forms
Notes
References

Introduction

In this note we introduce the concept of a form on an inner product space and describe some of its main properties (cf. [2, Section 9.2]). The notion of inner product is a basic notion in linear algebra which allows us to introduce rigorously, on any vector space, intuitive geometric notions such as the length of an element and the orthogonality between two elements. We recall this notion in Section 1 together with some basic examples (for more details on this structure see e.g. [1, Chapter 3], [2, Chapter 8], [3]). In Section 1 we also recall the famous Riesz representation theorem for finite-dimensional inner product spaces, which describes a direct correspondence between inner products and linear functionals. This will be particularly useful in what follows, because we are going to analyze in detail the relation between forms and linear operators on inner product spaces. Indeed, after introducing the general definition of a form in Section 2, we prove the main result of this note (see Theorem 2.4), which establishes an isomorphism between the space of all forms on a real or complex finite-dimensional vector space $V$ and the space of linear operators on $V$. In Section 3 we define the matrix associated to a form on a finite-dimensional vector space w.r.t. an ordered basis of its domain and study some properties arising in the complex case in the presence of an inner product. In particular, we will see how the isomorphism mentioned above allows us to transfer fundamental properties of representing matrices of linear operators to forms. The same direction is pursued in Section 4, where we focus on the class of Hermitian forms and derive a beautiful correspondence between Hermitian forms and self-adjoint linear operators. On the one hand, we derive from a basic property of Hermitian forms a characterization of self-adjoint linear operators on finite-dimensional complex inner product spaces.
On the other hand, thanks to a fundamental property of self-adjoint linear operators, we give a quite direct proof of the so-called principal axis theorem.

1. Preliminaries

Throughout this note we denote by $K$ the field of real numbers or the field of complex numbers and consider only vector spaces over $K$. Let us start by recalling the definition of an inner product and giving some examples of inner product spaces.

Definition 1.1 (Inner product). Let $V$ be a vector space over $K$. A function $\langle\cdot,\cdot\rangle : V \times V \to K$ is an inner product on $V$ if for all $\alpha, \beta, \gamma \in V$ and for all $c \in K$ the following properties hold:
(a) $\langle c\alpha + \beta, \gamma\rangle = c\langle\alpha,\gamma\rangle + \langle\beta,\gamma\rangle$ (linearity in the first variable);
(b) $\langle\beta,\alpha\rangle = \overline{\langle\alpha,\beta\rangle}$ (conjugate symmetry);
(c) $\langle\alpha,\alpha\rangle \geq 0$, and $\langle\alpha,\alpha\rangle = 0$ if and only if $\alpha = 0$ (positive definiteness).

Remark 1.2. Note that conditions (a) and (b) imply the conjugate linearity of the inner product in the second variable, i.e. for all $\alpha, \beta, \gamma \in V$ and $c \in K$:
$$\langle\alpha, c\beta + \gamma\rangle = \overline{c}\,\langle\alpha,\beta\rangle + \langle\alpha,\gamma\rangle.$$

Definition 1.3 (Inner product space). A vector space $V$ over $K$ together with an inner product $\langle\cdot,\cdot\rangle$ is said to be an inner product space, usually denoted by $(V, \langle\cdot,\cdot\rangle)$.
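As a quick numerical sanity check of Definition 1.1 and Remark 1.2 (a minimal sketch assuming NumPy; it uses the standard inner product on $\mathbb{C}^3$, which is introduced as Examples 1.4 below):

```python
import numpy as np

rng = np.random.default_rng(0)

def ip(a, b):
    # Standard inner product on C^n: <a, b> = sum_i a_i * conj(b_i).
    return np.sum(a * np.conj(b))

# Random test vectors in C^3 and a complex scalar.
alpha, beta, gamma = (rng.standard_normal((3, 2)) @ [1, 1j] for _ in range(3))
c = 2.0 - 1.5j

# (a) linearity in the first variable
assert np.isclose(ip(c * alpha + beta, gamma), c * ip(alpha, gamma) + ip(beta, gamma))
# (b) conjugate symmetry
assert np.isclose(ip(beta, alpha), np.conj(ip(alpha, beta)))
# (c) positive definiteness: <alpha, alpha> is real and nonnegative
assert np.isclose(ip(alpha, alpha).imag, 0.0)
assert ip(alpha, alpha).real >= 0
# Remark 1.2: conjugate linearity in the second variable
assert np.isclose(ip(alpha, c * beta + gamma),
                  np.conj(c) * ip(alpha, beta) + ip(alpha, gamma))
```

Of course, floating-point checks on random vectors do not replace the proofs; they only illustrate the axioms.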

A finite-dimensional inner product space over $K$ is called a Euclidean space when $K = \mathbb{R}$ and a unitary space when $K = \mathbb{C}$.

Examples 1.4. Let $n \in \mathbb{N}$.
1. Consider the function $\langle\cdot,\cdot\rangle : K^n \times K^n \to K$ defined by:
(1.1) $\langle\alpha,\beta\rangle := \sum_{i=1}^{n} x_i \overline{y_i}$, for $\alpha = (x_1, \ldots, x_n)^t$, $\beta = (y_1, \ldots, y_n)^t \in K^n$,
where $t$ denotes transposition. It is easy to check that $(K^n, \langle\cdot,\cdot\rangle)$ is an inner product space. The inner product defined in (1.1) is usually called the standard inner product.
2. Let $M(n \times n; K)$ be the space of all square matrices of order $n$ with entries in $K$, and for any $B = (B_{jk})_{j,k=1}^n$ define the conjugate transpose $B^*$ of $B$ to be the matrix $B^* := (B^*_{jk})_{j,k=1}^n$ with $B^*_{jk} = \overline{B_{kj}}$. Consider the function $\langle\cdot,\cdot\rangle : M(n \times n; K) \times M(n \times n; K) \to K$ defined by:
$$\langle A, B\rangle := \operatorname{tr}(AB^*), \quad A, B \in M(n \times n; K),$$
where $\operatorname{tr}$ denotes the trace on $M(n \times n; K)$. Then for any $A = (A_{jk})_{j,k=1}^n$, $B = (B_{jk})_{j,k=1}^n \in M(n \times n; K)$ we have
$$\langle A, B\rangle = \operatorname{tr}(AB^*) = \sum_{j=1}^{n} (AB^*)_{jj} = \sum_{j=1}^{n}\sum_{k=1}^{n} A_{jk} B^*_{kj} = \sum_{j=1}^{n}\sum_{k=1}^{n} A_{jk} \overline{B_{jk}}.$$
It is now clear that the latter corresponds to the standard inner product on $K^{n^2}$ through the well-known isomorphism between $M(n \times n; K)$ and $K^{n^2}$. Hence, $(M(n \times n; K), \langle\cdot,\cdot\rangle)$ is an inner product space. (Of course, one can also directly check that $\langle\cdot,\cdot\rangle$ satisfies all the properties (a), (b), (c) of Definition 1.1.)

Exercise 1.5. Consider the space $C([0,1])$ of all continuous functions from $[0,1]$ to $\mathbb{C}$ and the function $\langle\cdot,\cdot\rangle : C([0,1]) \times C([0,1]) \to \mathbb{C}$ defined by:
(1.2) $\langle f, g\rangle := \int_0^1 f(x)\overline{g(x)}\,dx$, for $f, g \in C([0,1])$.
Show that $\langle\cdot,\cdot\rangle$ is a complex inner product on $C([0,1])$ (see Note 1).

Before introducing the concept of a form on a generic inner product space, let us recall an important result about linear functionals on finite-dimensional inner product spaces.

Theorem 1.6 (Riesz representation theorem). Let $(V, \langle\cdot,\cdot\rangle)$ be a finite-dimensional inner product space over $K$ and $T : V \to K$ linear. Then there exists a unique $\beta \in V$ such that $T(\alpha) = \langle\alpha,\beta\rangle$ for all $\alpha \in V$.

Proof. See [2, Section 8.3, Theorem 6] or [3, Skript 20, Satz 3].

The Riesz representation theorem guarantees that every linear functional on a finite-dimensional inner product space is given by the inner product with a fixed vector of its domain. This result can be used to prove the existence of the adjoint $T^*$ of a linear operator $T$ on a finite-dimensional inner product space $V$ over $K$ (see [2, Section 8.3, Theorem 7]), i.e. the linear operator $T^* : V \to V$ such that $\langle T\alpha, \beta\rangle = \langle\alpha, T^*\beta\rangle$ for all $\alpha, \beta \in V$. Recall also that a linear operator $T$ on $V$ is said to be self-adjoint if $T = T^*$. As a last reminder, we recall the notion of the matrix associated to a linear operator on a finite-dimensional vector space $V$ over $K$. Let us denote by $L(V;V)$ the space of all linear operators from $V$ to $V$.

Definition 1.7. Let $n \in \mathbb{N}$. Given an ordered basis $B := \{\alpha_1, \ldots, \alpha_n\}$ of an $n$-dimensional vector space $V$ over $K$ and $T \in L(V;V)$, the matrix $[T]_B$ whose $j$-th column consists of the coordinates of $T\alpha_j$ w.r.t. $B$ is called the matrix associated to $T$ w.r.t. $B$. In symbols, $[T]_B := (A_{jk})_{j,k=1}^n$ such that $T\alpha_j = \sum_{k=1}^{n} A_{kj}\alpha_k$ for all $j \in \{1, \ldots, n\}$.

2. Sesquilinear forms on inner product spaces

Definition 2.1. A (sesquilinear) form on a vector space $V$ over $K$ is a function $f : V \times V \to K$ such that for all $\alpha, \beta, \gamma \in V$ and for all $c \in K$ the following hold:
(a) $f(c\alpha + \beta, \gamma) = cf(\alpha,\gamma) + f(\beta,\gamma)$;
(b) $f(\alpha, c\beta + \gamma) = \overline{c}f(\alpha,\beta) + f(\alpha,\gamma)$.

Thus, a sesquilinear form $f$ on a vector space $V$ over $K$ is a function on $V \times V$ which is linear as a function of its first argument but conjugate linear as a function of its second argument. Note that if $K = \mathbb{R}$ then any sesquilinear form $f$ is linear as a function of each of its arguments, i.e. $f$ is a bilinear form. However, in the case $K = \mathbb{C}$, a sesquilinear form $f$ is not bilinear unless $f \equiv 0$ on $V$. In the following, for notational convenience, we will omit the adjective "sesquilinear".

Example 2.2. Any inner product on a vector space $V$ over $K$ is a form on $V$.

Exercise 2.3. For all $\alpha = (x_1, x_2)^t$, $\beta = (y_1, y_2)^t \in \mathbb{C}^2$, let
1. $f(\alpha,\beta) := 1$
2. $g(\alpha,\beta) := (x_1\overline{y_1})^2 + x_2\overline{y_2}$
3. $h(\alpha,\beta) := (x_1 + \overline{y_1})^2 - (x_1 - \overline{y_1})^2$.
Establish which of the above functions are forms on $\mathbb{C}^2$, motivating your answers (see Note 2).

Let us denote by $F(V,V;K)$ the space of all forms on $V$. This is a linear subspace of the vector space of all scalar-valued functions on $V \times V$.
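The two conditions of Definition 2.1 can be probed numerically on the candidates of Exercise 2.3 (a sketch assuming NumPy; it anticipates the solution worked out in Note 2):

```python
import numpy as np

rng = np.random.default_rng(1)

# The three candidate forms of Exercise 2.3 on C^2.
f = lambda a, b: 1.0 + 0.0j
g = lambda a, b: (a[0] * np.conj(b[0]))**2 + a[1] * np.conj(b[1])
h = lambda a, b: (a[0] + np.conj(b[0]))**2 - (a[0] - np.conj(b[0]))**2

# Random test vectors in C^2 and a complex scalar.
alpha, beta, gamma = (rng.standard_normal((2, 2)) @ [1, 1j] for _ in range(3))
c = 2.0 - 1.0j

def is_linear_first(F):
    return np.isclose(F(c*alpha + beta, gamma), c*F(alpha, gamma) + F(beta, gamma))

def is_conjlinear_second(F):
    return np.isclose(F(alpha, c*beta + gamma), np.conj(c)*F(alpha, beta) + F(alpha, gamma))

# h expands to 4*x1*conj(y1), hence is sesquilinear; f and g are not.
assert np.isclose(h(alpha, beta), 4 * alpha[0] * np.conj(beta[0]))
assert is_linear_first(h) and is_conjlinear_second(h)
assert not is_linear_first(f)
assert not is_linear_first(g)
```

A failed check on random vectors disproves sesquilinearity; the passing checks for $h$ only illustrate the algebraic identity $h(\alpha,\beta) = 4x_1\overline{y_1}$.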
The following result summarizes the beautiful relation existing between forms and linear operators on finite-dimensional inner product spaces.

Theorem 2.4. Let $(V, \langle\cdot,\cdot\rangle)$ be a finite-dimensional inner product space over $K$.
(a) If $f \in F(V,V;K)$, then there exists a unique linear operator $T_f : V \to V$ such that $f(\alpha,\beta) = \langle T_f\alpha, \beta\rangle$ for all $\alpha, \beta \in V$.
(b) The map $\phi : F(V,V;K) \to L(V;V)$ defined by $\phi(f) := T_f$ for all $f \in F(V,V;K)$ is an isomorphism of vector spaces.

Proof.
(a) Existence: Let us fix a vector $\beta \in V$. Then
$$f_\beta : V \to K, \quad \alpha \mapsto f_\beta(\alpha) := f(\alpha,\beta)$$
is a linear functional on $V$ and so, by Theorem 1.6, there exists a unique $\beta' \in V$ s.t. $f_\beta(\alpha) = \langle\alpha,\beta'\rangle$ for all $\alpha \in V$.

Define $U : V \to V$ by $U(\beta) := \beta'$ for all $\beta \in V$. Then we have
(2.1) $f(\alpha,\beta) = \langle\alpha, U(\beta)\rangle$, for all $\alpha, \beta \in V$.
Since $f \in F(V,V;K)$, we know that for all $\alpha, \beta, \gamma \in V$ and for all $c \in K$: $f(\alpha, c\beta + \gamma) = \overline{c}f(\alpha,\beta) + f(\alpha,\gamma)$, which, by using (2.1), can be rewritten as:
$$\langle\alpha, U(c\beta + \gamma)\rangle = \overline{c}\langle\alpha, U(\beta)\rangle + \langle\alpha, U(\gamma)\rangle = \langle\alpha, cU(\beta) + U(\gamma)\rangle.$$
The latter implies that $U(c\beta + \gamma) = cU(\beta) + U(\gamma)$ for all $\beta, \gamma \in V$ and $c \in K$, i.e. $U$ is linear. Since $V$ is finite-dimensional, the adjoint $U^*$ of the linear operator $U$ on $V$ exists. Taking $T_f := U^*$, we have that for all $\alpha, \beta \in V$
$$f(\alpha,\beta) \overset{(2.1)}{=} \langle\alpha, U(\beta)\rangle = \langle U^*(\alpha), \beta\rangle = \langle T_f(\alpha), \beta\rangle,$$
which is the desired conclusion.
Uniqueness: Suppose there exists $T \in L(V;V)$ with $T \neq T_f$ s.t. $f(\alpha,\beta) = \langle T(\alpha), \beta\rangle$ for all $\alpha, \beta \in V$. Then
$$0 = \langle T_f(\alpha), \beta\rangle - \langle T(\alpha), \beta\rangle = \langle T_f(\alpha) - T(\alpha), \beta\rangle, \quad \forall\, \alpha, \beta \in V,$$
and so $T_f\alpha = T\alpha$ for all $\alpha \in V$, which is a contradiction.
(b) To show that $\phi$ is an isomorphism between $F(V,V;K)$ and $L(V;V)$ we need to prove that $\phi$ is a linear bijective map between these two vector spaces. Part (a) already shows that $\phi$ is bijective, so it remains to verify the linearity of $\phi$. Let $f, g \in F(V,V;K)$ and $c \in K$. Then for any $\alpha, \beta \in V$ we have
$$\langle T_{cf+g}\,\alpha, \beta\rangle \overset{(a)}{=} (cf+g)(\alpha,\beta) = cf(\alpha,\beta) + g(\alpha,\beta) \overset{(a)}{=} c\langle T_f\alpha, \beta\rangle + \langle T_g\alpha, \beta\rangle = \langle (cT_f + T_g)\alpha, \beta\rangle,$$
which implies that $T_{cf+g} = cT_f + T_g$, i.e. $\phi(cf+g) = c\phi(f) + \phi(g)$.

3. Matrix representation of a form

In this section we focus on forms on finite-dimensional inner product spaces and study their matrix representation. Let us start with a general definition.

Definition 3.1. Let $n \in \mathbb{N}$. Given an ordered basis $B := \{\alpha_1, \ldots, \alpha_n\}$ of an $n$-dimensional vector space $V$ over $K$ and $f \in F(V,V;K)$, the matrix $[f]_B := (A_{jk})_{j,k=1}^n$ defined by $A_{jk} := f(\alpha_k, \alpha_j)$ for all $j, k = 1, \ldots, n$ is called the matrix associated to $f$ w.r.t. $B$.

Proposition 3.2. Let $V$ be a finite-dimensional inner product space over $K$ and $f \in F(V,V;K)$. If $B$ is an orthonormal basis of $V$, then $[f]_B = [T_f]_B$ (here $T_f$ is the operator given by Theorem 2.4 and $[T_f]_B$ is defined according to Definition 1.7).

Proof. Let $B := \{\alpha_1, \ldots, \alpha_n\}$ be an orthonormal basis of $V$, $[f]_B := (A_{jk})_{j,k=1}^n$ and $[T_f]_B := (M_{jk})_{j,k=1}^n$. Then for any $j, k \in \{1, \ldots, n\}$ we have:
$$A_{jk} \overset{\text{Def 3.1}}{=} f(\alpha_k, \alpha_j) \overset{\text{Thm 2.4}}{=} \langle T_f\alpha_k, \alpha_j\rangle \overset{\text{Def 1.7}}{=} \Big\langle \sum_{l=1}^{n} M_{lk}\alpha_l, \alpha_j \Big\rangle = \sum_{l=1}^{n} M_{lk}\langle\alpha_l, \alpha_j\rangle = M_{jk}.$$
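Proposition 3.2 admits a simple numerical sketch (assuming NumPy): define the form through an arbitrary matrix $M$ via $f(x,y) := \langle Mx, y\rangle$, so that $T_f = M$ by construction, and compare $[f]_B$ with $M$ in the standard (orthonormal) basis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# A form on C^3 built from an arbitrary matrix M: f(x, y) := <M x, y>.
# With the standard inner product, T_f = M by construction (Theorem 2.4).
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
f = lambda x, y: np.conj(y) @ (M @ x)

# The standard basis is orthonormal for the standard inner product.
E = np.eye(n)

# [f]_B from Definition 3.1: A_jk = f(alpha_k, alpha_j).
A = np.array([[f(E[:, k], E[:, j]) for k in range(n)] for j in range(n)])

# Proposition 3.2: for an orthonormal basis, [f]_B = [T_f]_B (= M here).
assert np.allclose(A, M)
```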

Note that Proposition 3.2 does not hold for a general basis, as is shown by the following example.

Example 3.3. Consider the standard inner product on $\mathbb{R}^2$ and $f : \mathbb{R}^2 \times \mathbb{R}^2 \to \mathbb{R}$ defined by $f(x,y) := x_1y_2 + x_2y_1$ for all $x = (x_1,x_2)^t$, $y = (y_1,y_2)^t \in \mathbb{R}^2$. Let
$$B := \left\{ b_1 := \begin{pmatrix} 1 \\ 0 \end{pmatrix},\ b_2 := \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right\} \quad \text{and} \quad B' := \left\{ b_1' := \begin{pmatrix} 1 \\ 2 \end{pmatrix},\ b_2' := \begin{pmatrix} 3 \\ 4 \end{pmatrix} \right\}.$$
Then $B$ and $B'$ are both bases of $\mathbb{R}^2$, but only $B$ is also orthonormal. By using Definition 3.1, we obtain
$$[f]_B = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \quad \text{and} \quad [f]_{B'} = \begin{pmatrix} 4 & 10 \\ 10 & 24 \end{pmatrix}.$$
By Theorem 2.4-(a) applied to $f$, we have that $T_f : \mathbb{R}^2 \to \mathbb{R}^2$ is given by $T_f\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} x_2 \\ x_1 \end{pmatrix}$ and, by Definition 1.7, we can compute:
$$[T_f]_B = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \quad \text{and} \quad [T_f]_{B'} = \begin{pmatrix} -\tfrac{5}{2} & -\tfrac{7}{2} \\[2pt] \tfrac{3}{2} & \tfrac{5}{2} \end{pmatrix}.$$
Hence, $[T_f]_B = [f]_B$ but $[T_f]_{B'} \neq [f]_{B'}$.

Let $n \in \mathbb{N}$. If $B := \{\alpha_1, \ldots, \alpha_n\}$ is an ordered basis of an $n$-dimensional vector space $V$ over $K$, $f \in F(V,V;K)$ and $[f]_B = (A_{jk})_{j,k=1}^n$, then for any $x, y \in V$ we have $x = \sum_{s=1}^{n} x_s\alpha_s$, $y = \sum_{r=1}^{n} y_r\alpha_r$ and
$$f(x,y) = \sum_{r,s=1}^{n} \overline{y_r}\, f(\alpha_s, \alpha_r)\, x_s = \begin{pmatrix} \overline{y_1} & \cdots & \overline{y_n} \end{pmatrix} \begin{pmatrix} A_{11} & \cdots & A_{1n} \\ \vdots & & \vdots \\ A_{n1} & \cdots & A_{nn} \end{pmatrix} \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix},$$
i.e. $f(x,y) = Y^*[f]_B X$, where $X := [x]_B$ and $Y := [y]_B$ are the coordinate matrices of $x$ and $y$ w.r.t. $B$ and $Y^*$ is the conjugate transpose of $Y$ (see Example 1.4-2).

For the remainder of this note, we want to concentrate on complex finite-dimensional inner product spaces. In particular, in the last part of this section we want to show how the isomorphism described in Theorem 2.4 allows us to transfer a very important property of representing matrices of linear operators to forms. For convenience, let us restate here the so-called orthonormal triangularizability theorem for linear operators on complex finite-dimensional inner product spaces (see [2, Section 8.5, Theorem 21] or [3, Skript 23, Satz I]).

Theorem 3.4. Let $V$ be a finite-dimensional inner product space over $\mathbb{C}$ and let $T \in L(V;V)$. Then there exists an orthonormal basis $B$ for $V$ s.t. $[T]_B$ is an upper triangular matrix.

Then we can easily derive the analogous result for forms:

Theorem 3.5. Let $V$ be a finite-dimensional inner product space over $\mathbb{C}$ and let $f \in F(V,V;\mathbb{C})$. Then there exists an orthonormal basis $B$ for $V$ s.t. $[f]_B$ is an upper triangular matrix.

Proof. By Theorem 2.4, there exists a unique linear operator $T_f$ such that $f(\alpha,\beta) = \langle T_f\alpha, \beta\rangle$ for all $\alpha, \beta \in V$. Therefore, by applying Theorem 3.4 to $T_f$, we get an orthonormal basis $B$ for $V$ s.t. $[T_f]_B$ is an upper triangular matrix. But Proposition 3.2 guarantees that $[f]_B = [T_f]_B$, which completes the proof.
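Example 3.3 can be reproduced numerically (a sketch assuming NumPy; each basis is stored as a matrix whose columns are the basis vectors):

```python
import numpy as np

# Example 3.3: f(x, y) = x1*y2 + x2*y1 on R^2, i.e. f(x, y) = y^t (T x)
# with T = [[0, 1], [1, 0]], which is exactly the operator T_f.
T = np.array([[0., 1.], [1., 0.]])
f = lambda x, y: y @ (T @ x)

B = np.eye(2)                            # standard (orthonormal) basis
Bp = np.array([[1., 3.], [2., 4.]])      # columns b1' = (1,2)^t, b2' = (3,4)^t

def form_matrix(P):
    # Definition 3.1: A_jk = f(alpha_k, alpha_j), basis vectors = columns of P.
    return np.array([[f(P[:, k], P[:, j]) for k in range(2)] for j in range(2)])

def operator_matrix(P):
    # Definition 1.7: [T_f]_B solves P @ [T_f]_B = T @ P (change of basis).
    return np.linalg.solve(P, T @ P)

assert np.allclose(form_matrix(B), operator_matrix(B))        # orthonormal: equal
assert np.allclose(form_matrix(Bp), [[4., 10.], [10., 24.]])
assert not np.allclose(form_matrix(Bp), operator_matrix(Bp))  # general basis: differ
```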

4. Hermitian forms

In this section we continue to study forms on complex finite-dimensional inner product spaces, and in particular we focus on a special class of them, namely the Hermitian forms.

Definition 4.1 (Hermitian form). A form $f$ on a complex vector space $V$ is called Hermitian if $f(\alpha,\beta) = \overline{f(\beta,\alpha)}$ for all $\alpha, \beta \in V$.

The following is a well-known characterization of Hermitian forms.

Theorem 4.2. Let $V$ be a complex vector space and $f \in F(V,V;\mathbb{C})$. Then $f$ is Hermitian if and only if $f(\alpha,\alpha) \in \mathbb{R}$ for all $\alpha \in V$.

Proof. Suppose that $f$ is a Hermitian form on $V$. Then by definition $f(\alpha,\alpha) = \overline{f(\alpha,\alpha)}$, and so $f(\alpha,\alpha) \in \mathbb{R}$.
Conversely, assume that $f(\alpha,\alpha) \in \mathbb{R}$ for all $\alpha \in V$. Let $\alpha, \beta \in V$. Then:
$$f(\alpha+\beta, \alpha+\beta) = f(\alpha,\alpha) + f(\alpha,\beta) + f(\beta,\alpha) + f(\beta,\beta),$$
and so $f(\alpha,\beta) + f(\beta,\alpha) \in \mathbb{R}$, i.e.
(4.1) $f(\alpha,\beta) + f(\beta,\alpha) = \overline{f(\alpha,\beta)} + \overline{f(\beta,\alpha)}$.
We also have that
$$f(\alpha+i\beta, \alpha+i\beta) = f(\alpha,\alpha) - if(\alpha,\beta) + if(\beta,\alpha) + f(\beta,\beta),$$
and so $-if(\alpha,\beta) + if(\beta,\alpha) \in \mathbb{R}$, i.e.
(4.2) $-if(\alpha,\beta) + if(\beta,\alpha) = i\overline{f(\alpha,\beta)} - i\overline{f(\beta,\alpha)}$.
Multiplying (4.2) by $i$ and adding it side by side to (4.1), we get $f(\alpha,\beta) = \overline{f(\beta,\alpha)}$, i.e. $f$ is Hermitian.

Thanks to Theorem 2.4, we get an interesting connection between Hermitian forms and self-adjoint operators on finite-dimensional inner product spaces over $\mathbb{C}$.

Proposition 4.3. Let $(V, \langle\cdot,\cdot\rangle)$ be a complex finite-dimensional inner product space and $f \in F(V,V;\mathbb{C})$. Then $f$ is Hermitian if and only if $T_f$ is self-adjoint (here $T_f$ is the operator given by Theorem 2.4).

Proof. By Theorem 2.4, we know that $f(\alpha,\beta) = \langle T_f\alpha, \beta\rangle$ for all $\alpha, \beta \in V$, and so
$$\overline{f(\beta,\alpha)} = \overline{\langle T_f\beta, \alpha\rangle} = \langle\alpha, T_f\beta\rangle = \langle T_f^*\alpha, \beta\rangle.$$
Hence, we have that $f$ is Hermitian if and only if $T_f = T_f^*$.

Combining the previous two results, we directly obtain the following characterization of self-adjoint operators on a complex finite-dimensional inner product space.

Corollary 4.4. Let $(V, \langle\cdot,\cdot\rangle)$ be a complex finite-dimensional inner product space and $T \in L(V;V)$. Then $T$ is self-adjoint if and only if $\langle T\alpha, \alpha\rangle \in \mathbb{R}$ for all $\alpha \in V$.

Example 4.5. The form $f$ defined in Example 1.4-2 for $K = \mathbb{C}$ is Hermitian. Indeed, for all $A \in M(n \times n; \mathbb{C})$,
$$f(A,A) = \operatorname{tr}(AA^*) = \sum_{j,k=1}^{n} A_{jk}\overline{A_{jk}} = \sum_{j,k=1}^{n} |A_{jk}|^2 \in \mathbb{R},$$
and so $f$ is Hermitian by Theorem 4.2.
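Corollary 4.4 and Example 4.5 can both be checked numerically (a sketch assuming NumPy; `H` is an arbitrary Hermitian matrix and `N` a deliberately non-self-adjoint one):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = (M + M.conj().T) / 2          # a self-adjoint operator on C^4
N = H + 1j * np.eye(n)            # not self-adjoint: N* = H - i*I != N
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Corollary 4.4: T self-adjoint  <=>  <T alpha, alpha> in R for all alpha.
assert np.isclose((a.conj() @ (H @ a)).imag, 0.0)
# For N, <N a, a> = <H a, a> + i*||a||^2 has imaginary part ||a||^2 > 0.
assert (a.conj() @ (N @ a)).imag > 0

# Example 4.5: f(A, A) = tr(A A*) = sum |A_jk|^2 is real and nonnegative.
fAA = np.trace(M @ M.conj().T)
assert np.isclose(fAA.imag, 0.0) and fAA.real >= 0
assert np.isclose(fAA, np.sum(np.abs(M)**2))
```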

Finally, let us point out some properties of the matrix associated to a Hermitian form on a finite-dimensional vector space. First of all, if $B$ is an ordered basis of a finite-dimensional complex vector space $V$ and $f \in F(V,V;\mathbb{C})$, then $f$ is Hermitian if and only if $A := [f]_B$ is Hermitian. Indeed, we have seen in the previous section that for any $x, y \in V$ we have $f(x,y) = Y^*AX$, where $X = [x]_B$ and $Y = [y]_B$, and so
$$\overline{f(y,x)} = \overline{X^*AY} = (X^*AY)^* = Y^*A^*X.$$
Hence, $f$ is Hermitian if and only if $A = A^*$, i.e. $[f]_B$ is Hermitian.

A nice property of self-adjoint operators which is reflected in the theory of representing matrices of Hermitian forms is the following one.

Theorem 4.6. Let $V$ be a finite-dimensional complex inner product space and $T \in L(V;V)$ self-adjoint. Then there exists an orthonormal basis $B$ for $V$ consisting of eigenvectors of $T$.

Proof. See [2, Section 8.5, Theorem 18] or [3, Skript 23, Kor 2].

This provides the following result for Hermitian forms, which is often known as the Principal Axis Theorem.

Theorem 4.7. Let $f$ be a Hermitian form on a finite-dimensional complex inner product space $V$. Then there exists an orthonormal basis $B$ for $V$ such that $[f]_B$ is a diagonal matrix with real entries.

Proof. Let $n \in \mathbb{N}$ be the dimension of $V$ and $\langle\cdot,\cdot\rangle$ the inner product on $V$. By Theorem 2.4, there exists a unique linear operator $T_f$ on $V$ such that $f(\alpha,\beta) = \langle T_f\alpha, \beta\rangle$ for all $\alpha, \beta \in V$. Since $f$ is Hermitian, by Proposition 4.3 we have that $T_f$ is self-adjoint. Hence, by Theorem 4.6 there exists an orthonormal basis $B := \{\alpha_1, \ldots, \alpha_n\}$ for $V$ consisting of eigenvectors of $T_f$, i.e. $T_f\alpha_j = c_j\alpha_j$ for $j = 1, \ldots, n$ with $c_j \in \mathbb{C}$. Now for any $j, k \in \{1, \ldots, n\}$ we have
$$f(\alpha_k, \alpha_j) = \langle T_f\alpha_k, \alpha_j\rangle = c_k\langle\alpha_k, \alpha_j\rangle = c_k\delta_{kj},$$
where $\delta_{kj}$ is the Kronecker delta. Hence, $[f]_B = \operatorname{diag}(c_1, \ldots, c_n)$, and for any $j \in \{1, \ldots, n\}$ we get $c_j = f(\alpha_j, \alpha_j)$, which is a real number by Theorem 4.2, since $f$ is Hermitian.

Notes

1. Solution to Exercise 1.5: For any $f, g, h \in C([0,1])$ and $c \in \mathbb{C}$ we have:
(a) $\langle cf+g, h\rangle = \int_0^1 (cf(x)+g(x))\overline{h(x)}\,dx = c\int_0^1 f(x)\overline{h(x)}\,dx + \int_0^1 g(x)\overline{h(x)}\,dx = c\langle f,h\rangle + \langle g,h\rangle$;
(b) $\langle g,f\rangle = \int_0^1 g(x)\overline{f(x)}\,dx = \overline{\int_0^1 f(x)\overline{g(x)}\,dx} = \overline{\langle f,g\rangle}$;
(c) $\langle f,f\rangle = \int_0^1 f(x)\overline{f(x)}\,dx \geq 0$, since $f(x)\overline{f(x)} = |f(x)|^2 \geq 0$ for all $x \in [0,1]$, and $\langle f,f\rangle = \int_0^1 |f(x)|^2\,dx = 0$ iff $f \equiv 0$ on $[0,1]$.
Hence, $\langle\cdot,\cdot\rangle$ defined in (1.2) is a complex inner product on $C([0,1])$.

2. Solution to Exercise 2.3: $h$ is the only sesquilinear form on $\mathbb{C}^2$ among the given ones. In fact, it is easy to verify that neither $f$ nor $g$ is linear in the first argument, while one can rewrite $h$ as $h(\alpha,\beta) = 4x_1\overline{y_1}$ for all $\alpha = (x_1,x_2)^t$, $\beta = (y_1,y_2)^t \in \mathbb{C}^2$, which clearly fulfills all the properties in Definition 2.1.
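As a closing numerical illustration of the Principal Axis Theorem (Theorem 4.7), a sketch assuming NumPy: `np.linalg.eigh` returns real eigenvalues and an orthonormal eigenvector basis of a Hermitian matrix, and in that basis the form becomes real and diagonal.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

# Matrix of a Hermitian form in the standard (orthonormal) basis of C^4.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2

# eigh is designed for Hermitian matrices: real eigenvalues c_1, ..., c_n
# and an orthonormal basis of eigenvectors (the columns of U).
eigvals, U = np.linalg.eigh(A)

# In the eigenvector basis, [f]_B = U* A U = diag(c_1, ..., c_n),
# a diagonal matrix with real entries, as Theorem 4.7 asserts.
D = U.conj().T @ A @ U
assert np.allclose(D, np.diag(eigvals))
assert np.allclose(eigvals, eigvals.real)
```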

References
[1] P. R. Halmos, Finite-Dimensional Vector Spaces, reprinting of the 1958 second edition, Undergraduate Texts in Mathematics, Springer-Verlag, New York-Heidelberg, 1974, viii+200 pp.
[2] K. Hoffman and R. Kunze, Linear Algebra, second edition, Prentice-Hall, Inc., Englewood Cliffs, N.J., 1971, viii+407 pp.
[3] S. Kuhlmann, Lineare Algebra II, http://www.math.uni-konstanz.de/~kuhlmann/lehre/ss12-LinAlg2/Skripts/Gesamt-LinAlg2-SS2012.pdf