Donald Norris
Machine Learning with the Raspberry Pi
Experiments with Data and Computer Vision
1st edition
E-book, English, 571 pages
ISBN: 978-1-4842-5174-4
Publisher: Apress
Format: PDF
Copy protection: PDF watermark
Using the Pi Camera and a Raspberry Pi board, expand and replicate interesting machine learning (ML) experiments. This book provides a solid overview of ML and a myriad of underlying topics to explore further. Non-technical discussions temper complex technical explanations to make the hottest and most complex topic in the hobbyist world of computing understandable and approachable.

Machine learning, including its subfield commonly referred to as deep learning (DL), is currently being integrated into a multitude of commercial products and is widely used in industrial, medical, and military applications. It is hard to find any modern human activity that has not been 'touched' by artificial intelligence (AI) applications. Building on the concepts first presented in Beginning Artificial Intelligence with the Raspberry Pi, you'll go beyond simply understanding the concepts of AI to working with real machine learning experiments and applying practical deep learning concepts to experiments with the Pi board and computer vision. What you learn with Machine Learning with the Raspberry Pi can then be carried over to other platforms to go even further in the world of AI and ML and improve your hobbyist or commercial projects.

What You'll Learn
- Acquire a working knowledge of current ML
- Use the Raspberry Pi to implement ML techniques and algorithms
- Apply AI and ML tools and techniques to your own work projects and studies

Who This Book Is For
Engineers and scientists, but also experienced makers and hobbyists. Determined high school students who want to learn about ML can also benefit from this material.
Donald Norris is an avid electronics hobbyist and maker. He is also an electronics engineer with an advanced degree in Production Management. Don is retired from civilian government service with the US Navy, where he specialized in acoustics and digital signal processing. He also has more than a dozen years' experience as a professional software developer using C, C#, C++, Python, and Java, as well as five years' experience as a certified IT security consultant.
Authors/Editors
Further Information & Material
1 Table of Contents (p. 4)
2 About the Author (p. 8)
3 About the Technical Reviewer (p. 9)
4 Chapter 1: Introduction to machine learning (ML) with the Raspberry Pi (RasPi) (p. 10)
  4.1 RasPi introduction (p. 10)
    4.1.1 Writing the Raspbian image to a micro SD card (p. 13)
      4.1.1.1 Mandatory configurations (p. 15)
      4.1.1.2 Optional configurations (p. 21)
    4.1.2 Updating and upgrading the Raspbian distribution (p. 24)
      4.1.2.1 Python virtual environment (p. 25)
      4.1.2.2 Installing a Python virtual environment (p. 26)
      4.1.2.3 Installing dependencies (p. 31)
    4.1.3 ML facts (p. 33)
      4.1.3.1 ML basics (p. 34)
      4.1.3.2 Linear prediction and classification (p. 35)
      4.1.3.3 Iris demonstration – Part 1 (p. 38)
      4.1.3.4 Iris demonstration – Part 2 (p. 44)
      4.1.3.5 Iris demonstration – Part 3 (p. 49)
5 Chapter 2: Exploration of ML data models: Part 1 (p. 57)
  5.1 Installing OpenCV 4 (p. 57)
    5.1.1 Download OpenCV 4 source code (p. 59)
    5.1.2 Building the OpenCV software (p. 60)
    5.1.3 Seaborn data visualization library (p. 67)
      5.1.3.1 Scatter plot (p. 70)
      5.1.3.2 Facet grid plot (p. 73)
      5.1.3.3 Box plot (p. 75)
      5.1.3.4 Strip plot (p. 76)
      5.1.3.5 Violin plot (p. 78)
      5.1.3.6 KDE plot (p. 80)
      5.1.3.7 Pair plots (p. 82)
    5.1.4 Underlying big principle (p. 85)
      5.1.4.1 Linear regression (p. 85)
        5.1.4.1.1 LR demonstration (p. 86)
      5.1.4.2 Logistic regression (p. 89)
        5.1.4.2.1 LogR model development (p. 93)
        5.1.4.2.2 LogR demonstration (p. 94)
    5.1.5 Naive Bayes (p. 99)
      5.1.5.1 Brief review of Bayes' theorem (p. 100)
      5.1.5.2 Preparing data for use by the Naive Bayes model (p. 101)
        5.1.5.2.1 Naive Bayes model example (p. 102)
        5.1.5.2.2 Pros and cons (p. 105)
        5.1.5.2.3 Gaussian Naive Bayes (p. 107)
        5.1.5.2.4 Gaussian Naive Bayes (GNB) demonstration (p. 108)
  5.2 k-nearest neighbor (k-NN) model (p. 109)
    5.2.1 KNN demonstration (p. 110)
  5.3 Decision tree classifier (p. 114)
    5.3.1 Decision tree algorithm (p. 115)
      5.3.1.1 Information gain (p. 117)
        5.3.1.1.1 Split criterion (p. 117)
        5.3.1.1.2 Measuring information (p. 118)
        5.3.1.1.3 Properties of entropy (p. 119)
        5.3.1.1.4 Information gain example (p. 120)
        5.3.1.1.5 Gini index (p. 125)
          5.3.1.1.5.1 Simple Gini index example (p. 126)
      5.3.1.2 Gain ratio (p. 129)
        5.3.1.2.1 Intrinsic information (p. 129)
        5.3.1.2.2 Definition of gain ratio (p. 130)
    5.3.2 Decision tree classifier demonstration with scikit-learn (p. 130)
      5.3.2.1 Visualizing the decision tree (p. 135)
      5.3.2.2 Optimizing a decision tree (p. 139)
      5.3.2.3 Pros and cons for decision trees (p. 140)
        5.3.2.3.1 Pros (p. 140)
        5.3.2.3.2 Cons (p. 141)
6 Chapter 3: Exploration of ML data models: Part 2 (p. 142)
  6.1 Principal component analysis (p. 143)
    6.1.1 PCA script discussion (p. 144)
      6.1.1.1 PCA demonstration (p. 153)
      6.1.1.2 When to use PCA (p. 157)
  6.2 Linear discriminant analysis (p. 157)
    6.2.1 LDA script discussion (p. 159)
      6.2.1.1 LDA demonstration (p. 165)
      6.2.1.2 Comparison of PCA and LDA (p. 168)
  6.3 Support vector machines (p. 169)
    6.3.1 SVM demonstration – Part 1 (p. 173)
    6.3.2 SVM demonstration – Part 2 (p. 176)
  6.4 Learning vector quantization (p. 184)
    6.4.1 LVQ basic concepts (p. 185)
      6.4.1.1 Euclidean distance (p. 185)
      6.4.1.2 Best matching unit (p. 186)
      6.4.1.3 Training codebook vectors (p. 186)
    6.4.2 LVQ demonstration (p. 187)
  6.5 Bagging and random forests (p. 197)
    6.5.1 Introduction to bagging and random forest (p. 197)
      6.5.1.1 Bootstrap aggregation (bagging) (p. 198)
      6.5.1.2 Random forest (p. 199)
      6.5.1.3 Performance estimation and variable importance (p. 200)
    6.5.2 Bootstrap resampling demonstration (p. 200)
    6.5.3 Bagging demonstration (p. 202)
    6.5.4 Random forest demonstration (p. 211)
7 Chapter 4: Preparation for deep learning (p. 220)
  7.1 DL basics (p. 220)
    7.1.1 Machine learning from data patterns (p. 221)
      7.1.1.1 Linear classifier (p. 223)
    7.1.2 Loss functions (p. 228)
      7.1.2.1 Different types of loss functions (p. 228)
    7.1.3 Optimizer algorithm (p. 232)
      7.1.3.1 Deep dive into the gradient descent algorithm (p. 235)
  7.2 Artificial neural network (p. 242)
    7.2.1 How ANNs are trained and function (p. 245)
      7.2.1.1 Practical ANN example (p. 249)
      7.2.1.2 Complex ANN example (p. 252)
      7.2.1.3 Modifying weight values (p. 256)
    7.2.2 Practical ANN weight modification example (p. 265)
      7.2.2.1 Some issues with ANN learning (p. 266)
    7.2.3 ANN Python demonstration – Part 1 (p. 269)
    7.2.4 ANN Python demonstration – Part 2 (p. 274)
8 Chapter 5: Practical deep learning ANN demonstrations (p. 285)
  8.1 Parts list (p. 286)
  8.2 Recognizing handwritten number demonstration (p. 286)
    8.2.1 Project history and preparatory details (p. 290)
    8.2.2 Adjusting the input datasets (p. 299)
    8.2.3 Interpreting ANN output data values (p. 301)
    8.2.4 Creating an ANN that does handwritten number recognition (p. 303)
    8.2.5 Initial ANN training script demonstration (p. 305)
    8.2.6 ANN test script demonstration (p. 307)
    8.2.7 ANN test script demonstration using the full training dataset (p. 315)
    8.2.8 Recognizing your own handwritten numbers (p. 319)
      8.2.8.1 Installing the Pi Camera (p. 320)
      8.2.8.2 Installing the Pi Camera software (p. 324)
      8.2.8.3 Handwritten number recognition demonstration (p. 325)
  8.3 Handwritten number recognition using Keras (p. 330)
    8.3.1 Introduction to Keras (p. 330)
    8.3.2 Installing Keras (p. 331)
    8.3.3 Downloading the dataset and creating a model (p. 332)
9 Chapter 6: CNN demonstrations (p. 341)
  9.1 Parts list (p. 341)
  9.2 Introduction to the CNN model (p. 342)
  9.3 History and evolution of the CNN (p. 348)
  9.4 Fashion MNIST demonstration (p. 364)
  9.5 More complex Fashion MNIST demonstration (p. 375)
  9.6 VGG Fashion MNIST demonstration (p. 379)
  9.7 Jason's Fashion MNIST demonstration (p. 384)
10 Chapter 7: Predictions using ANNs and CNNs (p. 392)
  10.1 Pima Indian Diabetes demonstration (p. 393)
    10.1.1 Background for the Pima Indian Diabetes study (p. 393)
    10.1.2 Preparing the data (p. 394)
  10.2 Using the scikit-learn library with Keras (p. 413)
    10.2.1 Grid search with Keras and scikit-learn (p. 416)
  10.3 Housing price regression predictor demonstration (p. 421)
    10.3.1 Preprocessing the data (p. 422)
    10.3.2 The baseline model (p. 426)
    10.3.3 Improved baseline model (p. 430)
    10.3.4 Another improved baseline model (p. 433)
  10.4 Predictions using CNNs (p. 436)
    10.4.1 Univariate time series CNN model (p. 438)
      10.4.1.1 Preprocessing the dataset (p. 438)
      10.4.1.2 Create a CNN model (p. 441)
      10.4.1.3 Multivariate time series CNN model (p. 445)
        10.4.1.3.1 Multiple input series (p. 446)
        10.4.1.3.2 Preprocessing the dataset (p. 447)
11 Chapter 8: Predictions using CNNs and MLPs for medical research (p. 457)
  11.1 Parts list (p. 458)
  11.2 Downloading the breast cancer histology image dataset (p. 459)
    11.2.1 Preparing the project environment (p. 463)
      11.2.1.1 Configuration script (p. 464)
      11.2.1.2 Building the dataset (p. 465)
      11.2.1.3 Running the build dataset script (p. 468)
      11.2.1.4 The CNN model (p. 470)
      11.2.1.5 Training and testing script (p. 474)
      11.2.1.6 Running the training and testing script (p. 482)
      11.2.1.7 Evaluating the results with a discussion of sensitivity, specificity, and AUROC curves (p. 486)
        11.2.1.7.1 What is sensitivity? (p. 487)
        11.2.1.7.2 What is specificity? (p. 488)
        11.2.1.7.3 What are the differences between sensitivity and specificity and how are they used? (p. 488)
    11.2.2 Using an MLP model for breast cancer prediction (p. 495)
      11.2.2.1 Running the MLP script (p. 500)
12 Chapter 9: Reinforcement learning (p. 505)
  12.1 Markov decision process (p. 507)
    12.1.1 Discounted future reward (p. 509)
    12.1.2 Q-learning (p. 510)
      12.1.2.1 Q-learning example (p. 513)
        12.1.2.1.1 Manual Q-learning experiments (p. 521)
        12.1.2.1.2 Q-learning demonstration with a Python script (p. 527)
        12.1.2.1.3 Running the script (p. 533)
        12.1.2.1.4 Q-learning in a hostile environment demonstration (p. 535)
          12.1.2.1.4.1 Running the script and evaluating the results (p. 540)
        12.1.2.1.5 Q-learning in a hostile environment with a priori knowledge demonstration (p. 543)
          12.1.2.1.5.1 Running the script and evaluating the results (p. 548)
  12.2 Q-learning and neural networks (p. 551)
13 Index (p. 558)