
Statistics for Social and Behavioral Sciences
Advisors: S.E. Fienberg, W.J. van der Linden
For other titles published in this series, go to http://www.springer.com/series/3463

Haruo Yanai · Kei Takeuchi · Yoshio Takane

Projection Matrices, Generalized Inverse Matrices, and Singular Value Decomposition

Haruo Yanai
Department of Statistics
St. Luke's College of Nursing
10-1 Akashi-cho, Chuo-ku
Tokyo 104-0044, Japan
hyanai@slcn.ac.jp

Kei Takeuchi
2-34-4 Terabun
Kamakura-shi, Kanagawa-ken 247-0064, Japan
kei.takeuchi@wind.ocn.ne.jp

Yoshio Takane
Department of Psychology
McGill University
1205 Dr. Penfield Avenue
Montreal, Québec H3A 1B1, Canada
takane@psych.mcgill.ca

ISBN 978-1-4419-9886-6
e-ISBN 978-1-4419-9887-3
DOI 10.1007/978-1-4419-9887-3
Springer New York Dordrecht Heidelberg London

Library of Congress Control Number: 2011925655

© Springer Science+Business Media, LLC 2011
All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, LLC, 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden.
The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)

Preface

All three authors of the present book have long-standing experience in teaching graduate courses in multivariate analysis (MVA). These experiences have taught us that aside from distribution theory, projections and the singular value decomposition (SVD) are the two most important concepts for understanding the basic mechanism of MVA. The former underlies least squares (LS) estimation in regression analysis, which is essentially a projection of one subspace onto another, and the latter underlies principal component analysis (PCA), which seeks to find a subspace that captures the largest variability in the original space. Other techniques may be considered some combination of the two.

This book is about projections and SVD. A thorough discussion of generalized inverse (g-inverse) matrices is also given because it is closely related to the former. The book provides systematic and in-depth accounts of these concepts from a unified viewpoint of linear transformations in finite-dimensional vector spaces. More specifically, it shows that projection matrices (projectors) and g-inverse matrices can be defined in various ways so that a vector space is decomposed into a direct-sum of (disjoint) subspaces. This book gives analogous decompositions of matrices and discusses their possible applications.

This book consists of six chapters. Chapter 1 overviews the basic linear algebra necessary to read this book. Chapter 2 introduces projection matrices. The projection matrices discussed in this book are general oblique projectors, whereas the more commonly used orthogonal projectors are special cases of these. However, many of the properties that hold for orthogonal projectors also hold for oblique projectors under only modest additional conditions. This is shown in Chapter 3.

Chapter 3 first defines, for an n by m matrix A, a linear transformation y = Ax that maps an element x in the m-dimensional Euclidean space E^m onto an element y in the n-dimensional Euclidean space E^n. Let Sp(A) = {y | y = Ax} (the range or column space of A) and Ker(A) = {x | Ax = 0} (the null space of A). Then, there exist an infinite number of subspaces V and W that satisfy

E^n = Sp(A) ⊕ W and E^m = V ⊕ Ker(A),  (1)

where ⊕ indicates a direct-sum of two subspaces. Here, the correspondence between V and Sp(A) is one-to-one (the dimensionalities of the two subspaces coincide), and an inverse linear transformation from Sp(A) to V can be uniquely defined.
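To make the decomposition in (1) concrete, here is a minimal numerical sketch of the orthogonal special case W = Sp(A)^⊥, in which the projector onto Sp(A) splits every vector of E^3 into a component in Sp(A) and a component in W. (The sketch assumes Python with NumPy; the particular A and y are illustrative choices, not examples from the book.)

```python
import numpy as np

# A 3 x 2 matrix of full column rank: Sp(A) is a 2-dimensional subspace of E^3.
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])

# Orthogonal projector onto Sp(A): the special case W = Sp(A)-perp.
P = A @ np.linalg.pinv(A)
assert np.allclose(P @ P, P)   # idempotent: P is a projector
assert np.allclose(P, P.T)     # symmetric: the projection is orthogonal

# Every y in E^3 decomposes uniquely as y = Py + (I - P)y,
# with Py in Sp(A) and (I - P)y in W.
y = np.array([3., 1., 4.])
y_sp = P @ y
y_w = (np.eye(3) - P) @ y
assert np.allclose(y_sp + y_w, y)
assert np.allclose(A.T @ y_w, 0)  # the residual is orthogonal to Sp(A)
```

In the general oblique case treated in Chapter 2, W may be any complement of Sp(A); the corresponding projector is still idempotent but no longer symmetric.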

Generalized inverse matrices are simply matrix representations of the inverse transformation with the domain extended to E^n. However, there are infinitely many ways in which the generalization can be made, and thus there are infinitely many corresponding generalized inverses A⁻ of A. Among them, the inverse transformation in which W = Sp(A)^⊥ (the ortho-complement subspace of Sp(A)) and V = Ker(A)^⊥ = Sp(A') (the ortho-complement subspace of Ker(A)), and which transforms any vector in W to the zero vector in Ker(A), corresponds to the Moore-Penrose inverse. Chapter 3 also shows a variety of g-inverses that can be formed depending on the choice of V and W, and on which portion of Ker(A) the vectors in W are mapped into.

Chapter 4 discusses generalized forms of oblique projectors and g-inverse matrices, and gives their explicit representations when V is expressed in terms of matrices.

Chapter 5 decomposes Sp(A) and Sp(A') = Ker(A)^⊥ into sums of mutually orthogonal subspaces, namely

Sp(A) = E_1 ⊕ E_2 ⊕ ... ⊕ E_r and Sp(A') = F_1 ⊕ F_2 ⊕ ... ⊕ F_r,

where ⊕ now indicates an orthogonal direct-sum. It will be shown that E_j can be mapped into F_j by x = A'y and that F_j can be mapped into E_j by y = Ax. The singular value decomposition (SVD) is simply the matrix representation of these transformations.

Chapter 6 demonstrates that the concepts given in the preceding chapters play important roles in applied fields such as numerical computation and multivariate analysis.

Some of the topics in this book may already have been treated by existing textbooks in linear algebra, but many others have been developed only recently, and we believe that the book will be useful for many researchers, practitioners, and students in applied mathematics, statistics, engineering, behaviormetrics, and other fields.

This book requires some basic knowledge of linear algebra, a summary of which is provided in Chapter 1. This, together with some determination on the part of the reader, should be sufficient to understand the rest of the book. The book should also serve as a useful reference on projectors, generalized inverses, and SVD.
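Before continuing, here is a small numerical illustration of two of the constructions just described. The sketch below (again assuming Python with NumPy; the rank-deficient A is an illustrative choice) builds the Moore-Penrose inverse from the SVD, checks the four Penrose conditions that characterize it, and verifies the property noted above that vectors in W = Sp(A)^⊥ are transformed to the zero vector.

```python
import numpy as np

# A 3 x 2 matrix of rank 1: Sp(A) is spanned by (1, 2, 0)'.
A = np.array([[1., 2.],
              [2., 4.],
              [0., 0.]])

# SVD A = U diag(s) V'; keep only the r positive singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-10))  # numerical rank
A_plus = Vt[:r].T @ np.diag(1.0 / s[:r]) @ U[:, :r].T

# The four Penrose conditions characterizing the Moore-Penrose inverse.
assert np.allclose(A @ A_plus @ A, A)
assert np.allclose(A_plus @ A @ A_plus, A_plus)
assert np.allclose((A @ A_plus).T, A @ A_plus)
assert np.allclose((A_plus @ A).T, A_plus @ A)

# A vector in W = Sp(A)-perp is transformed to the zero vector.
w = np.array([2., -1., 0.])       # orthogonal to (1, 2, 0)', hence to Sp(A)
assert np.allclose(A_plus @ w, 0)

assert np.allclose(A_plus, np.linalg.pinv(A))  # agrees with NumPy's pinv
```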

In writing this book, we have been heavily influenced by Rao and Mitra's (1971) seminal book on generalized inverses. We owe very much to Professor C. R. Rao for his many outstanding contributions to the theory of g-inverses and projectors.

This book is based on the original Japanese version of the book by Yanai and Takeuchi published by Todai-Shuppankai (University of Tokyo Press) in 1983. This new English edition by the three of us expands the original version with new material.

January 2011
Haruo Yanai
Kei Takeuchi
Yoshio Takane

Contents

Preface

1 Fundamentals of Linear Algebra
  1.1 Vectors and Matrices
    1.1.1 Vectors
    1.1.2 Matrices
  1.2 Vector Spaces and Subspaces
  1.3 Linear Transformations
  1.4 Eigenvalues and Eigenvectors
  1.5 Vector and Matrix Derivatives
  1.6 Exercises for Chapter 1

2 Projection Matrices
  2.1 Definition
  2.2 Orthogonal Projection Matrices
  2.3 Subspaces and Projection Matrices
    2.3.1 Decomposition into a direct-sum of disjoint subspaces
    2.3.2 Decomposition into nondisjoint subspaces
    2.3.3 Commutative projectors
    2.3.4 Noncommutative projectors
  2.4 Norm of Projection Vectors
  2.5 Matrix Norm and Projection Matrices
  2.6 General Form of Projection Matrices
  2.7 Exercises for Chapter 2

3 Generalized Inverse Matrices
  3.1 Definition through Linear Transformations
  3.2 General Properties
    3.2.1 Properties of generalized inverse matrices
    3.2.2 Representation of subspaces by generalized inverses
    3.2.3 Generalized inverses and linear equations
    3.2.4 Generalized inverses of partitioned square matrices
  3.3 A Variety of Generalized Inverse Matrices
    3.3.1 Reflexive generalized inverse matrices
    3.3.2 Minimum norm generalized inverse matrices
    3.3.3 Least squares generalized inverse matrices
    3.3.4 The Moore-Penrose generalized inverse matrix
  3.4 Exercises for Chapter 3

4 Explicit Representations
  4.1 Projection Matrices
  4.2 Decompositions of Projection Matrices
  4.3 The Method of Least Squares
  4.4 Extended Definitions
    4.4.1 A generalized form of least squares g-inverse
    4.4.2 A generalized form of minimum norm g-inverse
    4.4.3 A generalized form of the Moore-Penrose inverse
    4.4.4 Optimal g-inverses
  4.5 Exercises for Chapter 4

5 Singular Value Decomposition (SVD)
  5.1 Definition through Linear Transformations
  5.2 SVD and Projectors
  5.3 SVD and Generalized Inverse Matrices
  5.4 Some Properties of Singular Values
  5.5 Exercises for Chapter 5

6 Various Applications
  6.1 Linear Regression Analysis
    6.1.1 The method of least squares and multiple regression analysis
    6.1.2 Multiple correlation coefficients and their partitions
    6.1.3 The Gauss-Markov model
  6.2 Analysis of Variance
    6.2.1 One-way design
    6.2.2 Two-way design
    6.2.3 Three-way design
    6.2.4 Cochran's theorem
  6.3 Multivariate Analysis
    6.3.1 Canonical correlation analysis
    6.3.2 Canonical discriminant analysis
    6.3.3 Principal component analysis
    6.3.4 Distance and projection matrices
  6.4 Linear Simultaneous Equations
    6.4.1 QR decomposition by the Gram-Schmidt orthogonalization method
    6.4.2 QR decomposition by the Householder transformation
    6.4.3 Decomposition by projectors
  6.5 Exercises for Chapter 6

7 Answers to Exercises
  7.1 Chapter 1
  7.2 Chapter 2
  7.3 Chapter 3
  7.4 Chapter 4
  7.5 Chapter 5
  7.6 Chapter 6

8 References

Index