
Livshin, I: Artificial Neural Networks with Java

Book, English, 566 pages, format (W × H): 178 mm × 254 mm, weight: 1091 g

ISBN: 978-1-4842-4420-3
Publisher: APRESS L.P.


Use Java to develop neural network applications in this practical book. After learning the rules involved in neural network processing, you will manually process your first neural network example. This covers the internals of the forward and backward propagation passes and builds an understanding of the main principles of neural network processing. Artificial Neural Networks with Java also teaches you how to prepare the data to be used in neural network development and suggests various techniques of data preparation for many unconventional tasks.
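To give a feel for what such a manual pass involves, here is a minimal Java sketch of a forward and backward pass through a one-input, one-hidden-neuron network trained on a single point by gradient descent; the network shape, initial weights, and training point are illustrative assumptions, not the book's Example 1.

    // Minimal gradient-descent sketch (illustrative values, not the book's example).
    public class ManualPassSketch {

        static double sigmoid(double x) {
            return 1.0 / (1.0 + Math.exp(-x));
        }

        public static void main(String[] args) {
            double x = 0.5, target = 0.8;   // single training point (assumed)
            double w1 = 0.3, b1 = 0.1;      // input -> hidden weight and bias
            double w2 = 0.6, b2 = 0.2;      // hidden -> output weight and bias
            double rate = 0.5;              // learning rate

            for (int epoch = 0; epoch < 1000; epoch++) {
                // Forward pass
                double h = sigmoid(w1 * x + b1);
                double y = sigmoid(w2 * h + b2);

                // Backward pass: chain rule through the sigmoid activations
                double error = y - target;                    // d(0.5*(y - target)^2)/dy
                double deltaOut = error * y * (1 - y);
                double deltaHid = deltaOut * w2 * h * (1 - h);

                // Gradient-descent weight and bias updates
                w2 -= rate * deltaOut * h;
                b2 -= rate * deltaOut;
                w1 -= rate * deltaHid * x;
                b1 -= rate * deltaHid;
            }

            double h = sigmoid(w1 * x + b1);
            System.out.println("approximation at x = 0.5: " + sigmoid(w2 * h + b2));
        }
    }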
The next big topic discussed in the book is using Java for neural network processing. You will use the Encog Java framework and discover how to do rapid development with Encog, allowing you to create large-scale neural network applications.
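As a point of reference, a bare-bones Encog 3.x program of the kind the book expands on might look like the sketch below; the 2-3-1 network shape and the XOR-style placeholder data are assumptions made purely for illustration and are not taken from the book.

    import org.encog.engine.network.activation.ActivationSigmoid;
    import org.encog.ml.data.MLDataSet;
    import org.encog.ml.data.basic.BasicMLDataSet;
    import org.encog.neural.networks.BasicNetwork;
    import org.encog.neural.networks.layers.BasicLayer;
    import org.encog.neural.networks.training.propagation.resilient.ResilientPropagation;

    public class EncogSketch {
        public static void main(String[] args) {
            // Placeholder training data (XOR); the book's examples approximate functions instead.
            double[][] input = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
            double[][] ideal = {{0}, {1}, {1}, {0}};
            MLDataSet trainingSet = new BasicMLDataSet(input, ideal);

            // 2-3-1 feed-forward network with sigmoid activations.
            BasicNetwork network = new BasicNetwork();
            network.addLayer(new BasicLayer(null, true, 2));
            network.addLayer(new BasicLayer(new ActivationSigmoid(), true, 3));
            network.addLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
            network.getStructure().finalizeStructure();
            network.reset();

            // Train with resilient propagation until the error is small enough.
            ResilientPropagation train = new ResilientPropagation(network, trainingSet);
            do {
                train.iteration();
            } while (train.getError() > 0.01);
            train.finishTraining();

            System.out.println("Final training error: " + train.getError());
        }
    }

The framework hides the forward and backward passes behind its dataset and training classes, which is what makes the rapid, large-scale development mentioned above possible.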
The book also discusses the inability of neural networks to approximate complex non-continuous functions and introduces the micro-batch method that addresses this issue. The step-by-step approach includes plenty of examples, diagrams, and screenshots to help you grasp the concepts quickly and easily.

What You Will Learn

- Prepare your data for many different tasks

- Carry out some unusual neural network tasks

- Create neural networks to process non-continuous functions

- Select and improve the development model

Who This Book Is For
Intermediate machine learning and deep learning developers who are interested in switching to Java.

Target audience


Professional/practitioner


Authors/Editors


Further information & material


Part One. Getting Started with Neural Networks

Chapter 1. Learning Neural Network
  Biological and Artificial Neurons
  Activation Functions
  Summary

Chapter 2. Internal Mechanism of Neural Network Processing
  Function to Be Approximated
  Network Architecture
  Forward-Pass Calculations
  Back-Propagation Pass Calculations
  Function Derivative and Function Divergent
  Table of Most Commonly Used Function Derivatives
  Summary

Chapter 3. Manual Neural Network Processing
  Example 1. Manual Approximation of a Function at a Single Point
  Building the Neural Network
  Forward-Pass Calculation
  Backward-Pass Calculation
  Calculating Weight Adjustments for the Output-Layer Neurons
  Calculating Weight Adjustments for the Hidden-Layer Neurons
  Updating Network Biases
  Back to the Forward Pass
  Matrix Form of Network Calculation
  Digging Deeper
  Mini-Batches and Stochastic Gradient
  Summary

Part Two. Neural Network Java Development Environment

Chapter 4. Configuring Your Development Environment
  Installing the Java 8 Environment on Your Windows Machine
  Installing the NetBeans IDE
  Installing the Encog Java Framework
  Installing the XChart Package
  Summary

Chapter 5. Neural Network Development Using the Java Encog Framework
  Example 2. Function Approximation Using the Java Environment
  Network Architecture
  Normalizing the Input Datasets
  Building the Java Program that Normalizes Both Datasets
  Program Code
  Debugging and Executing the Program
  Processing Results for the Training Method
  Testing the Network
  Testing Results
  Digging Deeper
  Summary

Part Three. Development of Non-Trivial Neural Network Applications

Chapter 6. Neural Network Prediction Outside of the Training Range
  Example 3a. Approximating Periodic Functions Outside of the Training Range
  Network Architecture for Example 3a
  Program Code for Example 3a
  Testing the Network
  Example 3b. Correct Way of Approximating Periodic Functions Outside of the Training Range
  Preparing the Training Data
  Network Architecture for Example 3b
  Program Code for Example 3b
  Training Results for Example 3b
  Testing Results for Example 3b
  Summary

Chapter 7. Processing Complex Periodic Functions
  Example 4. Approximation of a Complex Periodic Function
  Data Preparation
  Reflecting Function Topology in the Data
  Network Architecture
  Program Code
  Testing the Network
  Digging Deeper
  Summary

Chapter 8. Approximating Non-Continuous Functions
  Example 5. Approximating Non-Continuous Functions
  Approximating a Non-Continuous Function Using the Conventional Network Process
  Network Architecture
  Program Code
  Code Fragments for the Training Process
  Unsatisfactory Training Results
  Approximating the Non-Continuous Function Using the Micro-Batch Method
  Program Code for Micro-Batch Processing
  Program Code for the getChart() Method
  Code Fragment 1 of the Training Method
  Code Fragment 2 of the Training Method
  Training Results for the Micro-Batch Method
  Test Processing Logic
  Testing Results for the Micro-Batch Method
  Digging Deeper
  Summary

Chapter 9. Approximating Continuous Functions with Complex Topology
  Example 5a. Approximation of a Continuous Function with Complex Topology
  Network Architecture for Example 5a
  Program Code for Example 5a
  Training Processing Results for Example 5a
  Approximation of a Continuous Function with Complex Topology Using the Micro-Batch Method
  Program Code for Example 5a Using the Micro-Batch Method
  Example 5b. Approximation of Spiral-Like Functions
  Network Architecture for Example 5b
  Program Code for Example 5b
  Approximation of the Same Functions Using the Micro-Batch Method
  Summary

Chapter 10. Using a Neural Network for Classification of Objects
  Example 6. Classification of Records
  Training Dataset
  Network Architecture
  Testing Dataset
  Program Code for Data Normalization
  Program Code for Classification
  Training Results
  Testing Results
  Summary

Chapter 11. Importance of Selecting a Correct Model
  Example 7. Predicting Next Month's Stock Market Price
  Data Preparation
  Including Function Topology in the Dataset
  Building Micro-Batch Files Ne


Igor Livshin is a senior architect with extensive experience in developing large-scale applications. He worked for many years at two large insurance companies, CNA and Blue Cross & Blue Shield of Illinois. He currently works as a senior researcher at DevTechnologies, specializing in AI and neural networks. Igor has a master's degree in computer science from the Institute of Technology in Odessa, Russia/Ukraine.

