Livshin | Artificial Neural Networks with Java | E-Book | www.sack.de

E-book, English, 575 pages

Livshin Artificial Neural Networks with Java

Tools for Building Neural Network Applications
1st edition
ISBN: 978-1-4842-4421-0
Publisher: Apress
Format: PDF
Copy protection: PDF watermark




Use Java to develop neural network applications with this practical book. After learning the rules involved in neural network processing, you will manually work through your first neural network example. This covers the internals of forward and back propagation and builds an understanding of the main principles of neural network processing. Artificial Neural Networks with Java also teaches you how to prepare data for neural network development and suggests data-preparation techniques for many unconventional tasks.
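The forward- and back-propagation mechanics the book works through by hand can be sketched in a few lines of plain Java. This is an illustrative example only, not code from the book: the single sigmoid neuron, its initial weight and bias, and the learning rate are all made up for the sketch.

```java
// Sketch: one forward pass and one backpropagation (gradient-descent)
// step for a single sigmoid neuron. All values are illustrative.
public class OneNeuronStep {

    public static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    /** One forward pass and one update; returns {output, newWeight, newBias}. */
    public static double[] step(double w, double b, double x,
                                double target, double eta) {
        double out = sigmoid(w * x + b);                   // forward pass
        double delta = (out - target) * out * (1.0 - out); // error * sigmoid derivative
        return new double[] { out, w - eta * delta * x, b - eta * delta };
    }

    public static void main(String[] args) {
        // One training record (x=1.0, target=0.8), weight 0.5, bias 0.1, rate 0.3
        double[] r = step(0.5, 0.1, 1.0, 0.8, 0.3);
        System.out.printf("output=%.4f w=%.4f b=%.4f%n", r[0], r[1], r[2]);
    }
}
```

Because the output (about 0.6457) is below the target, the update nudges the weight and bias upward; repeating the two passes over all records is exactly the loop the book's manual example traces.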
The next big topic discussed in the book is using Java for neural network processing. You will use the Encog Java framework and discover how to do rapid development with Encog, allowing you to create large-scale neural network applications.
The book also discusses the inability of neural networks to approximate complex noncontinuous functions, and it introduces the micro-batch method that solves this issue. The step-by-step approach includes plenty of examples, diagrams, and screenshots to help you grasp the concepts quickly and easily.
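The data-preparation side of the micro-batch idea is simply partitioning the training set into very small batches. The following is a minimal, hypothetical sketch of that partitioning step only (the book's actual method, training a small network per batch, is not shown); the class and method names are invented for the example.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: split a training set into "micro-batches".
// With batchSize = 1 each record becomes its own batch.
public class MicroBatchSplit {

    public static List<double[][]> split(double[][] records, int batchSize) {
        List<double[][]> batches = new ArrayList<>();
        for (int i = 0; i < records.length; i += batchSize) {
            int end = Math.min(i + batchSize, records.length);
            double[][] batch = new double[end - i][];
            for (int j = i; j < end; j++) {
                batch[j - i] = records[j];
            }
            batches.add(batch);
        }
        return batches;
    }

    public static void main(String[] args) {
        double[][] data = { {0.1, 0.2}, {0.3, 0.4}, {0.5, 0.6} };
        System.out.println(split(data, 1).size()); // prints 3: one micro-batch per record
    }
}
```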

What You Will Learn
Prepare your data for many different tasks
Carry out some unusual neural network tasks
Create neural networks to process noncontinuous functions
Select and improve the development model

Who This Book Is For
Intermediate machine learning and deep learning developers who are interested in switching to Java.


Igor Livshin is a senior architect with extensive experience in developing large-scale applications. He worked for many years for two large insurance companies: CNA and Blue Cross & Blue Shield of Illinois. He currently works as a senior researcher at DevTechnologies, specializing in AI and neural networks. Igor has a master's degree in computer science from the Institute of Technology in Odessa, Ukraine.




Further Information & Material


Table of Contents ... 5
About the Author ... 11
About the Technical Reviewer ... 12
Acknowledgments ... 13
Introduction ... 14
Chapter 1: Learning About Neural Networks ... 17
  Biological and Artificial Neurons ... 18
  Activation Functions ... 19
  Summary ... 21
Chapter 2: Internal Mechanics of Neural Network Processing ... 22
  Function to Be Approximated ... 22
  Network Architecture ... 24
  Forward-Pass Calculation ... 25
  Input Record 1 ... 26
  Input Record 2 ... 27
  Input Record 3 ... 28
  Input Record 4 ... 29
  Backpropagation-Pass Calculations ... 30
  Function Derivative and Function Divergent ... 31
  Most Commonly Used Function Derivatives ... 32
  Summary ... 34
Chapter 3: Manual Neural Network Processing ... 35
  Example 1: Manual Approximation of a Function at a Single Point ... 35
  Building the Neural Network ... 36
  Forward-Pass Calculation ... 38
    Hidden Layers ... 39
    Output Layer ... 39
  Backward-Pass Calculation ... 41
    Calculating Weight Adjustments for the Output Layer Neurons ... 41
      Calculating Adjustment for ... 41
  Calculating the Adjustment for ... 42
  Calculating the Adjustment for ... 44
  Calculating Weight Adjustments for Hidden-Layer Neurons ... 45
    Calculating the Adjustment for ... 45
    Calculating the Adjustment for ... 46
    Calculating the Adjustment for ... 47
    Calculating the Adjustment for ... 48
    Calculating the Adjustment for ... 49
    Calculating the Adjustment for ... 50
  Updating Network Biases ... 50
  Going Back to the Forward Pass ... 52
    Hidden Layers ... 52
  Output Layer ... 53
  Matrix Form of Network Calculation ... 56
  Digging Deeper ... 56
  Mini-Batches and Stochastic Gradient ... 59
  Summary ... 60
Chapter 4: Configuring Your Development Environment ... 61
  Installing the Java 11 Environment on Your Windows Machine ... 61
  Installing the NetBeans IDE ... 64
  Installing the Encog Java Framework ... 65
  Installing the XChart Package ... 66
  Summary ... 67
Chapter 5: Neural Network Development Using the Java Encog Framework ... 68
  Example 2: Function Approximation Using the Java Environment ... 68
  Network Architecture ... 70
  Normalizing the Input Data Sets ... 71
  Building the Java Program That Normalizes Both Data Sets ... 71
  Building the Neural Network Processing Program ... 82
  Program Code ... 90
  Debugging and Executing the Program ... 113
  Processing Results for the Training Method ... 114
  Testing the Network ... 115
  Testing Results ... 119
  Digging Deeper ... 120
  Summary ... 121
Chapter 6: Neural Network Prediction Outside the Training Range ... 122
  Example 3a: Approximating Periodic Functions Outside of the Training Range ... 123
  Network Architecture for Example 3a ... 127
  Program Code for Example 3a ... 127
  Testing the Network ... 145
  Example 3b: Correct Way of Approximating Periodic Functions Outside the Training Range ... 147
    Preparing the Training Data ... 147
  Network Architecture for Example 3b ... 150
  Program Code for Example 3b ... 151
    Training Results for Example 3b ... 173
    Testing Results for Example 3b ... 175
  Summary ... 176
Chapter 7: Processing Complex Periodic Functions ... 177
  Example 4: Approximation of a Complex Periodic Function ... 177
  Data Preparation ... 180
  Reflecting Function Topology in the Data ... 181
    Network Architecture ... 188
  Program Code ... 188
  Training the Network ... 212
  Testing the Network ... 214
  Digging Deeper ... 217
  Summary ... 218
Chapter 8: Approximating Noncontinuous Functions ... 219
  Example 5: Approximating Noncontinuous Functions ... 219
    Network Architecture ... 223
  Program Code ... 224
    Code Fragments for the Training Process ... 238
  Unsatisfactory Training Results ... 242
  Approximating the Noncontinuous Function Using the Micro-Batch Method ... 244
  Program Code for Micro-Batch Processing ... 245
    Program Code for the getChart() Method ... 269
    Code Fragment 1 of the Training Method ... 274
    Code Fragment 2 of the Training Method ... 275
  Training Results for the Micro-Batch Method ... 281
  Test Processing Logic ... 287
  Testing Results for the Micro-Batch Method ... 291
  Digging Deeper ... 293
  Summary ... 300
Chapter 9: Approximating Continuous Functions with Complex Topology ... 301
  Example 5a: Approximation of a Continuous Function with Complex Topology Using the Conventional Network Process ... 301
    Network Architecture for Example 5a ... 304
    Program Code for Example 5a ... 305
    Training Processing Results for Example 5a ... 319
  Approximation of a Continuous Function with Complex Topology Using the Micro-Batch Method ... 322
    Testing Processing for Example 5a ... 326
  Example 5b: Approximation of Spiral-Like Functions ... 352
    Network Architecture for Example 5b ... 356
    Program Code for Example 5b ... 357
  Approximation of the Same Function Using the Micro-Batch Method ... 374
  Summary ... 404
Chapter 10: Using Neural Networks to Classify Objects ... 405
  Example 6: Classification of Records ... 405
  Training Data Set ... 407
  Network Architecture ... 411
  Testing Data Set ... 411
  Program Code for Data Normalization ... 413
  Program Code for Classification ... 419
  Training Results ... 448
  Testing Results ... 458
  Summary ... 459
Chapter 11: The Importance of Selecting the Correct Model ... 460
  Example 7: Predicting Next Month’s Stock Market Price ... 460
  Including Function Topology in the Data Set ... 468
  Building Micro-Batch Files ... 470
  Network Architecture ... 476
  Program Code ... 477
  Training Process ... 511
  Training Results ... 513
  Testing Data Set ... 520
  Testing Logic ... 525
  Testing Results ... 535
  Analyzing the Testing Results ... 538
  Summary ... 540
Chapter 12: Approximation of Functions in 3D Space ... 541
  Example 8: Approximation of Functions in 3D Space ... 542
    Data Preparation ... 542
    Network Architecture ... 547
  Program Code ... 548
    Processing Results ... 563
  Summary ... 571
Index ... 572


