
E-book, English, Volume 30, 426 pages

Series: Springer Series in Information Sciences

Kohonen Self-Organizing Maps


2nd edition, 1997
ISBN: 978-3-642-97966-8
Publisher: Springer
Format: PDF
Copy protection: PDF watermark




The second, revised edition of this book was prompted by the impressive sales of the first edition, which fortunately made it possible to incorporate important new results that had just been obtained. The ASSOM (Adaptive-Subspace SOM) is a new architecture in which invariant-feature detectors emerge in an unsupervised learning process. Its basic principle was already introduced in the first edition, but the motivation and theoretical discussion in the second edition are more thorough and consistent. New material has been added to Sect. 5.9, and this section has been completely rewritten. Correspondingly, Sect. 1.4, which deals with adaptive-subspace classifiers in general and constitutes the prerequisite for the ASSOM principle, has also been extended and completely rewritten.

Another new SOM development is the WEBSOM, a two-layer architecture intended for the organization of very large collections of full-text documents such as those found on the Internet and the World Wide Web. This architecture was published after the first edition came out. The idea and results seemed so important that the new Sect. 7.8 has been added to the second edition.

A further addition containing new results is Sect. 3.15, which describes the acceleration of the computation of very large SOMs. It was also felt that Chap. 7, which deals with SOM applications, had to be extended.
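For orientation, the following is a minimal sketch of the basic SOM principle that the book develops in Chapters 3-5: a sequential training loop with a best-matching-unit search and a Gaussian neighborhood update. The grid size, the linearly decaying learning-rate and neighborhood schedules, and all function and parameter names are illustrative assumptions, not the book's reference implementation.

```python
import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal sequential SOM sketch: winner search plus neighborhood update.
    All defaults here are illustrative choices, not values from the book."""
    rng = np.random.default_rng(seed)
    h, w = grid
    dim = data.shape[1]
    # Codebook (model) vectors, one per map unit, randomly initialized.
    weights = rng.random((h, w, dim))
    # Grid coordinates of every unit, used by the neighborhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)

    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)               # linearly decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.5   # shrinking neighborhood radius
            # Winner ("best-matching unit"): the unit whose vector is closest to x.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighborhood around the winner on the map grid.
            g = np.linalg.norm(coords - np.array(bmu), axis=-1)
            h_ci = np.exp(-(g ** 2) / (2.0 * sigma ** 2))
            # Move every unit's vector toward x, weighted by the neighborhood.
            weights += lr * h_ci[..., None] * (x - weights)
            step += 1
    return weights

# Example: organize 500 random 3-D points (e.g. RGB colors) on a 10x10 map.
som = train_som(np.random.default_rng(1).random((500, 3)))
```

After training, neighboring map units hold similar codebook vectors, which is the topology-preserving property that the demonstrations in Chap. 3 illustrate.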


Target group


Research


Authors/Editors


Further information & material


1. Mathematical Preliminaries
  1.1 Mathematical Concepts and Notations
  1.2 Distance Measures for Patterns
  1.3 Statistical Pattern Recognition
  1.4 The Subspace Methods of Classification
  1.5 The Robbins-Monro Stochastic Approximation
  1.6 Dynamically Expanding Context
2. Justification of Neural Modeling
  2.1 Models, Paradigms, and Methods
  2.2 On the Complexity of Biological Nervous Systems
  2.3 Relation Between Biological and Artificial Neural Networks
  2.4 What Functions of the Brain Are Usually Modeled?
  2.5 When Do We Have to Use Neural Computing?
  2.6 Transformation, Relaxation, and Decoder
  2.7 Categories of ANNs
  2.8 Competitive-Learning Networks
  2.9 Three Phases of Development of Neural Models
  2.10 A Simple Nonlinear Dynamic Model of the Neuron
  2.11 Learning Laws
  2.12 Brain Maps
3. The Basic SOM
  3.1 The SOM Algorithm in the Euclidean Space
  3.2 The "Dot-Product SOM"
  3.3 Preliminary Demonstrations of Topology-Preserving Mappings
  3.4 Basic Mathematical Approaches to Self-Organization
  3.5 Initialization of the SOM Algorithms
  3.6 On the "Optimal" Learning-Rate Factor
  3.7 Effect of the Form of the Neighborhood Function
  3.8 Magnification Factor
  3.9 Practical Advice for the Construction of Good Maps
  3.10 Examples of Data Analyses Implemented by the SOM
  3.11 Using Gray Levels to Indicate Clusters in the SOM
  3.12 Derivation of the SOM Algorithm in the General Metric
  3.13 What Kind of SOM Actually Ensues from the Distortion Measure?
  3.14 Batch Computation of the SOM ("Batch Map")
  3.15 Further Speedup of SOM Computation
4. Physiological Interpretation of SOM
  4.1 Two Different Lateral Control Mechanisms
  4.2 Learning Equation
  4.3 System Models of SOM and Their Simulations
  4.4 Recapitulation of the Features of the Physiological SOM Model
5. Variants of SOM
  5.1 Overview of Ideas to Modify the Basic SOM
  5.2 Adaptive Tensorial Weights
  5.3 Tree-Structured SOM in Searching
  5.4 Different Definitions of the Neighborhood
  5.5 Neighborhoods in the Signal Space
  5.6 Dynamical Elements Added to the SOM
  5.7 Operator Maps
  5.8 Supervised SOM
  5.9 The Adaptive-Subspace SOM (ASSOM)
  5.10 Feedback-Controlled Adaptive-Subspace SOM (FASSOM)
6. Learning Vector Quantization
  6.1 Optimal Decision
  6.2 The LVQ1
  6.3 The Optimized-Learning-Rate LVQ1 (OLVQ1)
  6.4 The LVQ2 (LVQ2.1)
  6.5 The LVQ3
  6.6 Differences Between LVQ1, LVQ2 and LVQ3
  6.7 General Considerations
  6.8 The Hypermap-Type LVQ
  6.9 The "LVQ-SOM"
7. Applications
  7.1 Preprocessing of Optic Patterns
  7.2 Acoustic Preprocessing
  7.3 Process and Machine Monitoring
  7.4 Diagnosis of Speech Voicing
  7.5 Transcription of Continuous Speech
  7.6 Texture Analysis
  7.7 Contextual Maps
  7.8 Organization of Large Document Files
  7.9 Robot-Arm Control
  7.10 Telecommunications
  7.11 The SOM as an Estimator
8. Hardware for SOM
  8.1 An Analog Classifier Circuit
  8.2 Fast Digital Classifier Circuits
  8.3 SIMD Implementation of SOM
  8.4 Transputer Implementation of SOM
  8.5 Systolic-Array Implementation of SOM
  8.6 The COKOS Chip
  8.7 The TInMANN Chip
9. An Overview of SOM Literature
  9.1 General
  9.2 Early Works on Competitive Learning
  9.3 Status of the Mathematical Analyses
  9.4 Survey of General Aspects of the SOM
  9.5 Modifications and Analyses of LVQ
  9.6 Survey of Diverse Applications of SOM
  9.7 Applications of LVQ
  9.8 Survey of SOM and LVQ Implementations
  9.9 New References in the Second Edition
10. Glossary of "Neural" Terms
References


