Yanai / Takeuchi / Takane | Projection Matrices, Generalized Inverse Matrices, and Singular Value Decomposition | E-Book | www.sack.de

E-book, English, 236 pages

Series: Statistics for Social and Behavioral Sciences



1st edition, 2011
ISBN: 978-1-4419-9887-3
Publisher: Springer
Format: PDF
Copy protection: PDF watermark




Aside from distribution theory, projections and the singular value decomposition (SVD) are the two most important concepts for understanding the basic mechanism of multivariate analysis. The former underlies least squares estimation in regression analysis, which is essentially the projection of an observation vector onto the subspace spanned by the predictor variables, and the latter underlies principal component analysis, which seeks a subspace that captures the largest variability in the original space.
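The connection between least squares and projection can be sketched numerically; the following NumPy snippet (an illustrative example, not taken from the book) builds the orthogonal projector onto the column space of a design matrix and checks its defining properties:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))   # design matrix (full column rank a.s.)
y = rng.standard_normal(10)        # response vector

# Orthogonal projector onto col(X): P = X (X'X)^{-1} X'
P = X @ np.linalg.inv(X.T @ X) @ X.T
y_hat = P @ y                      # fitted values = projection of y

# P is idempotent and symmetric -- the defining properties
# of an orthogonal projection matrix.
assert np.allclose(P @ P, P)
assert np.allclose(P, P.T)
# The residual y - y_hat is orthogonal to the column space of X.
assert np.allclose(X.T @ (y - y_hat), 0)
```

The fitted values from ordinary least squares are thus exactly the orthogonal projection of y onto the space spanned by the columns of X.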

This book is about projections and SVD. A thorough discussion of generalized inverse (g-inverse) matrices is also given because they are closely related to both. The book provides systematic and in-depth accounts of these concepts from a unified viewpoint of linear transformations on finite-dimensional vector spaces. More specifically, it shows that projection matrices (projectors) and g-inverse matrices can be defined in various ways so that a vector space is decomposed into a direct sum of (disjoint) subspaces. Projection Matrices, Generalized Inverse Matrices, and Singular Value Decomposition will be useful for researchers, practitioners, and students in applied mathematics, statistics, engineering, behaviormetrics, and other fields.
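One standard point of contact between g-inverses and the SVD (covered in depth in Chapters 3 and 5 of the book) is the construction of the Moore-Penrose inverse from the singular value decomposition. A minimal sketch, using NumPy as an assumed tool:

```python
import numpy as np

# A rank-deficient matrix (rank 1), where the ordinary inverse
# does not exist but the Moore-Penrose inverse does.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

# A^+ = V diag(1/s_i) U', inverting only the nonzero singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
tol = max(A.shape) * np.finfo(float).eps * s.max()
s_inv = np.array([1.0 / x if x > tol else 0.0 for x in s])
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

# The four Penrose conditions characterize A^+ uniquely.
assert np.allclose(A @ A_pinv @ A, A)
assert np.allclose(A_pinv @ A @ A_pinv, A_pinv)
assert np.allclose((A @ A_pinv).T, A @ A_pinv)
assert np.allclose((A_pinv @ A).T, A_pinv @ A)
# Agrees with NumPy's built-in implementation.
assert np.allclose(A_pinv, np.linalg.pinv(A))
```

Note also that A @ A_pinv and A_pinv @ A are themselves orthogonal projectors (onto the column space and row space of A), tying the three topics of the title together.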




Table of Contents


Preface
Contents
Chapter 1 Fundamentals of Linear Algebra
  1.1 Vectors and Matrices
    1.1.1 Vectors
    1.1.2 Matrices
  1.2 Vector Spaces and Subspaces
  1.3 Linear Transformations
  1.4 Eigenvalues and Eigenvectors
  1.5 Vector and Matrix Derivatives
  1.6 Exercises for Chapter 1
Chapter 2 Projection Matrices
  2.1 Definition
  2.2 Orthogonal Projection Matrices
  2.3 Subspaces and Projection Matrices
    2.3.1 Decomposition into a direct-sum of disjoint subspaces
    2.3.2 Decomposition into nondisjoint subspaces
    2.3.3 Commutative projectors
    2.3.4 Noncommutative projectors
  2.4 Norm of Projection Vectors
  2.5 Matrix Norm and Projection Matrices
  2.6 General Form of Projection Matrices
  2.7 Exercises for Chapter 2
Chapter 3 Generalized Inverse Matrices
  3.1 Definition through Linear Transformations
  3.2 General Properties
    3.2.1 Properties of generalized inverse matrices
    3.2.2 Representation of subspaces by generalized inverses
    3.2.3 Generalized inverses and linear equations
    3.2.4 Generalized inverses of partitioned square matrices
  3.3 A Variety of Generalized Inverse Matrices
    3.3.1 Reflexive generalized inverse matrices
    3.3.2 Minimum norm generalized inverse matrices
    3.3.3 Least squares generalized inverse matrices
    3.3.4 The Moore-Penrose generalized inverse matrix
  3.4 Exercises for Chapter 3
Chapter 4 Explicit Representations
  4.1 Projection Matrices
  4.2 Decompositions of Projection Matrices
  4.3 The Method of Least Squares
  4.4 Extended Definitions
    4.4.1 A generalized form of least squares g-inverse
    4.4.2 A generalized form of minimum norm g-inverse
    4.4.3 A generalized form of the Moore-Penrose inverse
    4.4.4 Optimal g-inverses
  4.5 Exercises for Chapter 4
Chapter 5 Singular Value Decomposition (SVD)
  5.1 Definition through Linear Transformations
  5.2 SVD and Projectors
  5.3 SVD and Generalized Inverse Matrices
  5.4 Some Properties of Singular Values
  5.5 Exercises for Chapter 5
Chapter 6 Various Applications
  6.1 Linear Regression Analysis
    6.1.1 The method of least squares and multiple regression analysis
    6.1.2 Multiple correlation coefficients and their partitions
    6.1.3 The Gauss-Markov model
  6.2 Analysis of Variance
    6.2.1 One-way design
    6.2.2 Two-way design
    6.2.3 Three-way design
    6.2.4 Cochran's theorem
  6.3 Multivariate Analysis
    6.3.1 Canonical correlation analysis
    6.3.2 Canonical discriminant analysis
    6.3.3 Principal component analysis
    6.3.4 Distance and projection matrices
  6.4 Linear Simultaneous Equations
    6.4.1 QR decomposition by the Gram-Schmidt orthogonalization method
    6.4.2 QR decomposition by the Householder transformation
    6.4.3 Decomposition by projectors
  6.5 Exercises for Chapter 6
Chapter 7 Answers to Exercises
  7.1 Chapter 1
  7.2 Chapter 2
  7.3 Chapter 3
  7.4 Chapter 4
  7.5 Chapter 5
  7.6 Chapter 6
Chapter 8 References
Index


