E-book, English, 162 pages
Phil Kim: MATLAB Deep Learning, 1st ed.
ISBN: 978-1-4842-2845-6
Publisher: Apress
Format: PDF
Copy protection: PDF watermark
With Machine Learning, Neural Networks and Artificial Intelligence
Get started with MATLAB for deep learning and AI with this in-depth primer. In this book, you start with machine learning fundamentals, then move on to neural networks, deep learning, and finally convolutional neural networks. In a blend of fundamentals and applications, MATLAB Deep Learning uses MATLAB as the underlying programming language and tool for the examples and case studies in this book.
With this book, you'll be able to tackle some of today's real-world big data, smart bot, and other complex data problems. You'll see how deep learning is a more complex and intelligent aspect of machine learning for modern smart data analysis and usage.
What You'll Learn
Use MATLAB for deep learning
Discover neural networks and multi-layer neural networks
Work with convolution and pooling layers
Build an MNIST example with these layers
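The convolution and pooling layers listed above are the core building blocks of the book's MNIST example. The book's own code is in MATLAB; the short NumPy sketch below is only an illustrative analogue, and the helper names (`conv2d_valid`, `max_pool`) are invented for this example, not taken from the book.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution (cross-correlation, as in most deep
    learning texts) of a single-channel image with one kernel."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            # Weighted sum of the kernel-sized patch at (r, c)
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    """Non-overlapping max pooling with a size x size window."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size  # trim edges that don't fit
    return (feature_map[:h, :w]
            .reshape(h // size, size, w // size, size)
            .max(axis=(1, 3)))

# Toy 4x4 "image" and a 2x2 vertical-edge kernel
img = np.arange(16, dtype=float).reshape(4, 4)
k = np.array([[1.0, -1.0],
              [1.0, -1.0]])
fmap = conv2d_valid(img, k)   # 3x3 feature map; every entry is -2.0 here
pooled = max_pool(fmap, 2)    # 1x1 result after 2x2 pooling
```

A real ConvNet stacks several such convolution and pooling stages before a fully connected classifier, which is the architecture the book develops for MNIST.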
Who This Book Is For
Those who want to learn deep learning using MATLAB. Some MATLAB experience may be useful.
Phil Kim, PhD, is an experienced MATLAB programmer and user who works with algorithms for large data sets drawn from AI and machine learning. He worked at the Korea Aerospace Research Institute as a Senior Researcher, where his main task was developing autonomous flight algorithms and onboard software for unmanned aerial vehicles. During his PhD program he developed an on-screen keyboard program named 'Clickey', which served as a bridge to his current position as a Senior Research Officer at the National Rehabilitation Research Institute of Korea.
Authors/Editors
Further Information & Material
Contents at a Glance (p. 4)
Contents (p. 5)
About the Author (p. 8)
About the Technical Reviewer (p. 9)
Acknowledgments (p. 10)
Introduction (p. 11)
Chapter 1: Machine Learning (p. 14)
    What Is Machine Learning? (p. 15)
    Challenges with Machine Learning (p. 17)
        Overfitting (p. 19)
        Confronting Overfitting (p. 23)
    Types of Machine Learning (p. 25)
        Classification and Regression (p. 27)
    Summary (p. 30)
Chapter 2: Neural Network (p. 32)
    Nodes of a Neural Network (p. 33)
    Layers of Neural Network (p. 35)
    Supervised Learning of a Neural Network (p. 40)
    Training of a Single-Layer Neural Network: Delta Rule (p. 42)
    Generalized Delta Rule (p. 45)
    SGD, Batch, and Mini Batch (p. 47)
    Stochastic Gradient Descent (p. 47)
        Batch (p. 48)
        Mini Batch (p. 49)
    Example: Delta Rule (p. 50)
    Implementation of the SGD Method (p. 51)
    Implementation of the Batch Method (p. 54)
    Comparison of the SGD and the Batch (p. 56)
    Limitations of Single-Layer Neural Networks (p. 58)
    Summary (p. 63)
Chapter 3: Training of Multi-Layer Neural Network (p. 65)
    Back-Propagation Algorithm (p. 66)
    Example: Back-Propagation (p. 72)
        XOR Problem (p. 74)
        Momentum (p. 77)
    Cost Function and Learning Rule (p. 80)
    Example: Cross Entropy Function (p. 85)
    Cross Entropy Function (p. 86)
    Comparison of Cost Functions (p. 88)
    Summary (p. 91)
Chapter 4: Neural Network and Classification (p. 93)
    Binary Classification (p. 93)
    Multiclass Classification (p. 98)
    Example: Multiclass Classification (p. 105)
    Summary (p. 114)
Chapter 5: Deep Learning (p. 115)
    Improvement of the Deep Neural Network (p. 117)
        Vanishing Gradient (p. 117)
        Overfitting (p. 119)
        Computational Load (p. 121)
    Example: ReLU and Dropout (p. 121)
        ReLU Function (p. 122)
        Dropout (p. 126)
    Summary (p. 132)
Chapter 6: Convolutional Neural Network (p. 133)
    Architecture of ConvNet (p. 133)
    Convolution Layer (p. 136)
    Pooling Layer (p. 142)
    Example: MNIST (p. 143)
    Summary (p. 159)
Index (p. 160)




