E-Book, English, Volume 30, 362 pages
Kohonen
Self-Organizing Maps
1995
ISBN: 978-3-642-97610-0
Publisher: Springer
Format: PDF
Copy protection: 1 - PDF Watermark
Series: Springer Series in Information Sciences
The book we have at hand is the fourth monograph I wrote for Springer Verlag. The previous one named "Self-Organization and Associative Memory" (Springer Series in Information Sciences, Volume 8) came out in 1984. Since then the self-organizing neural-network algorithms called SOM and LVQ have become very popular, as can be seen from the many works reviewed in Chap. 9. The new results obtained in the past ten years or so have warranted a new monograph. Over these years I have also answered lots of questions; they have influenced the contents of the present book. I hope it would be of some interest and help to the readers if I now first very briefly describe the various phases that led to my present SOM research, and the reasons underlying each new step.

I became interested in neural networks around 1960, but could not interrupt my graduate studies in physics. After I was appointed Professor of Electronics in 1965, it still took some years to organize teaching at the university. In 1968–69 I was on leave at the University of Washington, and D. Gabor had just published his convolution-correlation model of autoassociative memory. I noticed immediately that there was something not quite right about it: the capacity was very poor and the inherent noise and crosstalk were intolerable. In 1970 I therefore suggested the autoassociative correlation matrix memory model, at the same time as J.A. Anderson and K. Nakano.
Target audience
Research
Authors/Editors
Further Information & Material
1. Mathematical Preliminaries
1.1 Mathematical Concepts and Notations
1.1.1 Vector Space Concepts
1.1.2 Matrix Notations
1.1.3 Further Properties of Matrices
1.1.4 On Matrix Differential Calculus
1.2 Distance Measures for Patterns
1.2.1 Measures of Similarity and Distance in Vector Spaces
1.2.2 Measures of Similarity and Distance Between Symbol Strings
1.3 Statistical Pattern Recognition
1.3.1 Supervised Classification
1.3.2 Unsupervised Classification
1.4 The Robbins-Monro Stochastic Approximation
1.4.1 The Adaptive Linear Element
1.4.2 Vector Quantization
1.5 The Subspace Methods of Classification
1.5.1 The Basic Subspace Method
1.5.2 The Learning Subspace Method (LSM)
1.6 Dynamically Expanding Context
1.6.1 Setting Up the Problem
1.6.2 Automatic Determination of Context-Independent Productions
1.6.3 Conflict Bit
1.6.4 Construction of Memory for the Context-Dependent Productions
1.6.5 The Algorithm for the Correction of New Strings
1.6.6 Estimation Procedure for Unsuccessful Searches
1.6.7 Practical Experiments
2. Justification of Neural Modeling
2.1 Models, Paradigms, and Methods
2.2 On the Complexity of Biological Nervous Systems
2.3 Relation Between Biological and Artificial Neural Networks
2.4 What Functions of the Brain Are Usually Modeled?
2.5 When Do We Have to Use Neural Computing?
2.6 Transformation, Relaxation, and Decoder
2.7 Categories of ANNs
2.8 Competitive-Learning Networks
2.9 Three Phases of Development of Neural Models
2.10 A Simple Nonlinear Dynamic Model of the Neuron
2.11 Learning Laws
2.11.1 Hebb's Law
2.11.2 The Riccati-Type Learning Law
2.11.3 The PCA-Type Learning Law
2.12 Brain Maps
3. The Basic SOM
3.1 The SOM Algorithm in the Euclidean Space
3.2 The "Dot-Product SOM"
3.3 Preliminary Demonstrations of Topology-Preserving Mappings
3.3.1 Ordering of Reference Vectors in the Input Space
3.3.2 Demonstrations of Ordering of Responses in the Output Plane
3.4 Basic Mathematical Approaches to Self-Organization
3.4.1 One-Dimensional Case
3.4.2 Constructive Proof of Ordering of Another One-Dimensional SOM
3.4.3 An Attempt to Justify the SOM Algorithm for General Dimensionalities
3.5 Initialization of the SOM Algorithms
3.6 On the "Optimal" Learning-Rate Factor
3.7 Effect of the Form of the Neighborhood Function
3.8 Magnification Factor
3.9 Practical Advice for the Construction of Good Maps
3.10 Examples of Data Analyses Implemented by the SOM
3.10.1 Attribute Maps with Full Data Matrix
3.10.2 Case Example of Attribute Maps Based on Incomplete Data Matrices (Missing Data): "Poverty Map"
3.11 Using Gray Levels to Indicate Clusters in the SOM
3.12 Derivation of the SOM Algorithm in the General Metric
3.13 What Kind of SOM Actually Ensues from the Distortion Measure?
3.14 Batch Computation of the SOM ("Batch Map")
4. Physiological Interpretation of SOM
4.1 Two Different Lateral Control Mechanisms
4.1.1 The WTA Function, Based on Lateral Activity Control
4.1.2 Lateral Control of Plasticity
4.2 Learning Equation
4.3 System Models of SOM and Their Simulations
4.4 Recapitulation of the Features of the Physiological SOM Model
5. Variants of SOM
5.1 Overview of Ideas to Modify the Basic SOM
5.2 Adaptive Tensorial Weights
5.3 Tree-Structured SOM in Searching
5.4 Different Definitions of the Neighborhood
5.5 Neighborhoods in the Signal Space
5.6 Dynamical Elements Added to the SOM
5.7 Operator Maps
5.8 Supervised SOM
5.9 Adaptive-Subspace SOM (ASSOM) for the Implementation of Wavelets and Gabor Filters
5.10 Feedback-Controlled Adaptive-Subspace SOM (FASSOM)
6. Learning Vector Quantization
6.1 Optimal Decision
6.2 The LVQ1
6.3 The Optimized-Learning-Rate LVQ1 (OLVQ1)
6.4 The LVQ2 (LVQ2.1)
6.5 The LVQ3
6.6 Differences Between LVQ1, LVQ2 and LVQ3
6.7 General Considerations
6.8 The Hypermap-Type LVQ
6.9 The "LVQ-SOM"
7. Applications
7.1 Preprocessing
7.2 Process and Machine State Monitoring
7.3 Diagnosis of Speech Voicing
7.4 Transcription of Continuous Speech
7.5 Texture Analysis
7.6 Contextual Maps
7.6.1 Role-Based Semantic Map
7.6.2 Unsupervised Categorization of Phonemic Classes from Text
7.7 Robot-Arm Control I
7.8 Robot-Arm Control II
8. Hardware for SOM
8.1 An Analog Classifier Circuit
8.2 A Fast Digital Classifier Circuit
8.3 SIMD Implementation of SOM
8.4 Transputer Implementation of SOM
8.5 Systolic-Array Implementation of SOM
8.6 The COKOS Chip
8.7 The TInMANN Chip
9. An Overview of SOM Literature
9.1 General
9.2 Early Works on Competitive Learning
9.3 Status of the Mathematical Analyses
9.4 Survey of General Aspects of the SOM
9.4.1 General
9.4.2 Mathematical Derivations, Analyses, and Modifications of the SOM
9.5 Modifications and Analyses of LVQ
9.6 Survey of Diverse Applications of SOM
9.6.1 Machine Vision and Image Analysis
9.6.2 Optical Character and Script Reading
9.6.3 Speech Analysis and Recognition
9.6.4 Acoustic and Musical Studies
9.6.5 Signal Processing and Radar Measurements
9.6.6 Telecommunications
9.6.7 Industrial and Other Real-World Measurements
9.6.8 Process Control
9.6.9 Robotics
9.6.10 Chemistry
9.6.11 Physics
9.6.12 Electronic-Circuit Design
9.6.13 Medical Applications Without Image Processing
9.6.14 Data Processing
9.6.15 Linguistic and AI Problems
9.6.16 Mathematical Problems
9.6.17 Neurophysiological Research
9.7 Applications of LVQ
9.8 Survey of SOM and LVQ Implementations
10. Glossary of "Neural" Terms
References
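
For readers who want a concrete picture of the algorithm developed in Chapter 3 (in particular Sect. 3.1, "The SOM Algorithm in the Euclidean Space"), the following is a minimal NumPy sketch of one online SOM update step. The function name, the linear decay schedules, and the Gaussian neighborhood are illustrative assumptions rather than the book's exact prescriptions; only the core rule m_i <- m_i + h_ci (x - m_i), with c the best-matching unit, follows the standard formulation.

import numpy as np

def som_step(codebook, grid, x, t, n_steps, alpha0=0.5, sigma0=3.0):
    # Illustrative sketch, not the book's reference implementation.
    # codebook: (n_units, dim) array of reference vectors m_i
    # grid:     (n_units, 2) coordinates of the units on the map lattice
    # x:        (dim,) input sample; t and n_steps drive the decay schedules
    # Winner c = argmin_i ||x - m_i||  (best-matching unit)
    c = np.argmin(np.linalg.norm(codebook - x, axis=1))
    # Learning rate and neighborhood radius shrink linearly with time (one common choice)
    frac = t / float(n_steps)
    alpha = alpha0 * (1.0 - frac)
    sigma = sigma0 * (1.0 - frac) + 0.5
    # Gaussian neighborhood function h_ci, centered on the winner in the lattice
    d2 = np.sum((grid - grid[c]) ** 2, axis=1)
    h = alpha * np.exp(-d2 / (2.0 * sigma ** 2))
    # m_i <- m_i + h_ci * (x - m_i)
    codebook += h[:, None] * (x - codebook)
    return codebook

In use, one would initialize the codebook (e.g. at random), lay the units out on a rectangular or hexagonal lattice in grid, and call this step repeatedly on samples drawn from the data.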




