Book, English, Volume 7, 104 pages, format (W × H): 156 mm x 234 mm
A Guided Tour
Series: Foundations and Trends® in Machine Learning
ISBN: 978-1-60198-378-7
Publisher: Now Publishers
We give a tutorial overview of several foundational methods for dimension reduction. We divide the methods into projective methods and methods that model the manifold on which the data lies. For projective methods, we review projection pursuit, principal component analysis (PCA), kernel PCA, probabilistic PCA, canonical correlation analysis (CCA), kernel CCA, Fisher discriminant analysis, oriented PCA, and several techniques for sufficient dimension reduction. For the manifold methods, we review multidimensional scaling (MDS), landmark MDS, Isomap, locally linear embedding, Laplacian eigenmaps, and spectral clustering. Although the review focuses on foundations, we also provide pointers to some more modern techniques. We also describe the correlation dimension as one method for estimating the intrinsic dimension, and we point out that the intrinsic dimension can be a scale-dependent quantity. The Nyström method, which links several of the manifold algorithms, is also reviewed. We use a publicly available dataset to illustrate some of the methods. The goal is to provide a self-contained overview of key concepts underlying many of these algorithms, and to give pointers for further reading.
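To give a flavor of the projective methods surveyed, the following is a minimal, hypothetical sketch of the best-known one, PCA: center the data, form the sample covariance matrix, and project onto its leading eigenvectors. The random data and the choice of two components are illustrative assumptions, not taken from the monograph.

```python
import numpy as np

# Illustrative PCA sketch (assumed example, not from the book):
# generate correlated 5-D data, then reduce it to 2-D by projecting
# onto the top two eigenvectors of the sample covariance matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated data

Xc = X - X.mean(axis=0)                 # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)         # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
top2 = eigvecs[:, ::-1][:, :2]          # two leading principal directions
Z = Xc @ top2                           # projected (reduced) data

print(Z.shape)  # (200, 2)
```

By construction, the first projected coordinate captures at least as much variance as the second, which is the defining property of the PCA projection.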
Further Information & Material
1: Introduction
2: Estimating the Dimension
3: Projective Methods
4: Manifold Modeling
5: Pointers and Conclusions
Acknowledgements
References
A Appendix: The Nearest Positive Semidefinite Matrix




