Gelenbe | Neural Networks | E-Book | www.sack.de

E-book, English, 232 pages, web PDF

Gelenbe Neural Networks

Advances and Applications, 2
1st edition, 2014
ISBN: 978-1-4832-9709-5
Publisher: Elsevier Science & Techn.
Format: PDF
Copy protection: PDF watermark




The present volume is a natural follow-up to Neural Networks: Advances and Applications, which appeared one year earlier. As the title indicates, it combines the presentation of recent methodological results on computational models inspired by neural networks with well-documented applications that illustrate the use of such models in the solution of difficult problems. The volume is balanced between these two orientations: it contains six papers on methodological developments and five papers on applications and examples illustrating the theoretical developments. Each paper is largely self-contained and includes a complete bibliography.

The methodological part of the book contains two papers on learning, one paper presenting a computational model of intracortical inhibitory effects, a paper presenting a new development of the random neural network, and two papers on associative memory models. The applications and examples portion contains papers on image compression, associative recall of simple typed images, learning applied to typed images, stereo disparity detection, and combinatorial optimisation.
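To give a flavour of the random neural network mentioned above (Gelenbe's spiked stochastic model), the sketch below solves its stationary "signal flow" equations by fixed-point iteration. This is an illustration, not code from the book; the parameter values and function name are made up for the example.

```python
import numpy as np

# Sketch of the random neural network's stationary equations.
# Neuron i fires at rate r[i]; a spike from j arrives at i as an
# excitatory spike with probability P_plus[j, i] or an inhibitory
# spike with probability P_minus[j, i]. External excitatory and
# inhibitory Poisson arrivals have rates Lam[i] and lam[i].
# The stationary firing probability q[i] satisfies the non-linear system
#   q[i] = lambda_plus[i] / (r[i] + lambda_minus[i]),
# where lambda_plus[i]  = sum_j q[j]*r[j]*P_plus[j, i]  + Lam[i]
# and   lambda_minus[i] = sum_j q[j]*r[j]*P_minus[j, i] + lam[i].

def rnn_steady_state(r, P_plus, P_minus, Lam, lam, iters=200):
    n = len(r)
    q = np.zeros(n)
    for _ in range(iters):
        lp = (q * r) @ P_plus + Lam    # total excitatory arrival rates
        lm = (q * r) @ P_minus + lam   # total inhibitory arrival rates
        q = np.minimum(lp / (r + lm), 1.0)  # q is a probability, so clip at 1
    return q

# Tiny two-neuron feed-forward example with made-up parameters
r = np.array([1.0, 1.0])
P_plus = np.array([[0.0, 0.5], [0.0, 0.0]])
P_minus = np.array([[0.0, 0.3], [0.0, 0.0]])
Lam = np.array([0.4, 0.1])
lam = np.array([0.0, 0.0])
print(rnn_steady_state(r, P_plus, P_minus, Lam, lam))
```

When the resulting q values all lie strictly below 1, the network is stable and q is the product-form stationary solution's per-neuron marginal; the book's Chapters 1, 4, 8, 10 and 11 build on this model.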




Further Information & Material


Front Cover ... 1
Neural Networks: Advances and Applications, 2 ... 4
Copyright Page ... 5
Table of Contents ... 10
Preface ... 6
Chapter 1. Learning in the Recurrent Random Neural Network ... 12
  Abstract ... 12
  1. Introduction ... 12
  2. The random network model ... 14
  3. Learning with the recurrent random network model ... 17
  Appendix: Existence and Uniqueness of Network Solutions ... 20
  Remark ... 22
  Acknowledgements ... 22
  References ... 22
Chapter 2. Generalization Performance of Feed-Forward Neural Networks ... 24
  Abstract ... 24
  1. Introduction ... 24
  2. Generalization Problems, Algorithms and Measures ... 26
  3. GNET: Generalization Neural Network Evaluation Tool ... 34
  4. Performance Evaluation Studies ... 41
  5. Conclusions ... 46
  6. Acknowledgements ... 47
  7. References ... 47
Chapter 3. The Nature of Intracortical Inhibitory Effects ... 50
  Abstract ... 50
  1. Peristimulus Inhibition ... 50
  2. Competitive Distribution Hypothesis ... 55
  3. A Specific Formulation ... 58
  4. Models I and C ... 63
  5. Prediction of Interelement Relationships ... 65
  6. Simulation Results ... 70
  7. Discussion ... 81
  References ... 86
  Appendix ... 90
Chapter 4. Random Neural Networks with Multiple Classes of Signals ... 94
  Abstract ... 94
  1. The model ... 95
  2. Non-linear signal flow equations and product form stationary solution ... 98
  3. Stability conditions ... 101
  4. Conclusions ... 103
  References ... 104
Chapter 5. The Microcircuit Associative Memory Architecture ... 106
  Abstract ... 106
  1. Introduction ... 106
  2. Cerebellar Structure and Function ... 107
  3. The Importance of Basket Interneurons ... 110
  4. Experimental Paradigm ... 111
  5. The Marr Model ... 112
  6. The Microcircuit ... 113
  7. The Microcircuit Associative Memory ... 117
  8. Experimental Verification ... 128
  9. Comparative Analysis ... 131
  10. Observations ... 132
  11. Summary ... 133
  References ... 134
  Acknowledgments ... 138
Chapter 6. Generalised Associative Memory and the Computation of Membership Functions ... 140
  Abstract ... 140
  1. Introduction ... 140
  2. Sparse distributed memory ... 143
  3. Asymptotic analysis of SDM ... 146
  4. Associative memory and the computation of membership functions ... 149
  References ... 150
Chapter 7. Layered Neural Network for Stereo Disparity Detection ... 152
  Abstract ... 152
  1. Introduction ... 152
  2. Network design ... 154
  3. Experiments ... 161
  4. Discussion ... 161
  5. Conclusion ... 162
  Acknowledgement ... 162
  References ... 164
Chapter 8. Storage and Recognition Methods for the Random Neural Network ... 166
  Abstract ... 166
  1. Introduction ... 166
  2. Some definitions ... 168
  3. The random neural network as an auto-associative memory ... 169
  4. Learning Algorithm ... 171
  5. Some recognition methods ... 172
  7. Method comparative analysis ... 181
  8. Learning method with local parameters ... 182
  9. Conclusion ... 185
  Appendix: heuristic to determine . and . ... 186
  Remark ... 186
  References ... 187
Chapter 9. Neural Networks for Image Compression ... 188
  1. Simple NN's for image compression ... 188
  2. Improved structures ... 196
  3. Simulation results ... 202
  4. Conclusions ... 205
  Acknowledgments ... 208
  References ... 208
Chapter 10. Autoassociative Memory with the Random Neural Network using Gelenbe's Learning Algorithm ... 210
  Abstract ... 210
  1. Introduction ... 210
  2. Autoassociative memory operation ... 212
  3. Simulations and performance results ... 217
  5. Conclusion ... 224
  References ... 225
Chapter 11. Minimum Graph Covering with the Random Neural Network Model ... 226
  Abstract ... 226
  1. Introduction ... 226
  2. Random network solution ... 229
  3. Conclusions ... 233
  References ... 233


