E-book, English, 338 pages, Web PDF
Späth / Rheinboldt: Mathematical Algorithms for Linear Regression
1st edition, 2014
ISBN: 978-1-4832-6454-7
Publisher: Elsevier Science & Techn.
Format: PDF
Copy protection: PDF watermark
Mathematical Algorithms for Linear Regression discusses numerous fitting principles for discrete linear approximations, the corresponding numerical methods, and FORTRAN 77 subroutines. The book explains linear Lp regression, the method of least squares, the Gaussian elimination method, the modified Gram-Schmidt method, the method of least absolute deviations, and the method of least maximum absolute deviation. Using these fitting principles, the investigator can determine which observations should be classified as outliers (those with large errors) and which should not. The text describes the elimination of outliers, and the selection of variables when values are given for too many variables or for all of them. Clusterwise linear regression addresses the case where only a few of the relevant variables have been collected or are collectible, assuming that their number is small in relation to the number of observations. The book also examines linear Lp regression with nonnegative parameters, the Kuhn-Tucker conditions, Householder transformations, and the branch-and-bound method. The text points out that the method of least squares is mainly used for models with nonlinear parameters or for orthogonal distances. The book can serve and benefit mathematicians, students, and professors of calculus, statistics, or advanced mathematics.
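For readers who want a concrete picture of the fitting principles named above, here is a minimal Python sketch; it does not reproduce the book's FORTRAN 77 subroutines (A478L1, CL1, and the others). It fits the same straight line under three criteria, L2 (least squares), L1 (least absolute deviations), and L-infinity (least maximum absolute deviation), casting the latter two as small linear programs; the function names and toy data are illustrative assumptions.

    # Minimal sketch of three Lp fitting principles; illustrative only,
    # not the book's FORTRAN 77 implementations.
    import numpy as np
    from scipy.optimize import linprog

    def l2_fit(A, b):
        # p = 2: least squares, minimize ||Ax - b||_2.
        return np.linalg.lstsq(A, b, rcond=None)[0]

    def l1_fit(A, b):
        # p = 1: minimize sum_i |r_i| as a linear program with slack
        # variables t_i >= |(Ax - b)_i|.
        m, n = A.shape
        c = np.concatenate([np.zeros(n), np.ones(m)])
        A_ub = np.block([[A, -np.eye(m)], [-A, -np.eye(m)]])
        b_ub = np.concatenate([b, -b])
        bounds = [(None, None)] * n + [(0, None)] * m
        return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).x[:n]

    def linf_fit(A, b):
        # p = infinity: minimize max_i |r_i| as a linear program with a
        # single slack t bounding every residual.
        m, n = A.shape
        c = np.concatenate([np.zeros(n), [1.0]])
        ones = np.ones((m, 1))
        A_ub = np.block([[A, -ones], [-A, -ones]])
        b_ub = np.concatenate([b, -b])
        bounds = [(None, None)] * n + [(0, None)]
        return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).x[:n]

    # Toy data: y = 1 + 2u with one gross outlier in the last observation.
    u = np.arange(6, dtype=float)
    y = 1.0 + 2.0 * u
    y[5] += 25.0
    A = np.column_stack([np.ones_like(u), u])
    print("L2  :", l2_fit(A, y))    # pulled toward the outlier
    print("L1  :", l1_fit(A, y))    # recovers roughly (1, 2)
    print("Linf:", linf_fit(A, y))  # dominated by the worst residual

The contrast between the three estimates on the contaminated data is the kind of residual comparison the book treats in its sections on the choice of p and the elimination of outliers.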
Further information & material
Front Cover (p. 1)
Mathematical Algorithms for Linear Regression (p. 4)
Copyright Page (p. 5)
Table of Contents (p. 6)
Preface (p. 8)
Notation (p. 10)
Chapter I. Introduction (p. 12)
    References (p. 25)
Chapter II. Linear Lp Regression (p. 28)
    2.1 Fundamentals (p. 28)
    2.2 p = 2 (Method of Least Squares: NGL, MGS, ICMGS, GIVR, HFTI, SVDR) (p. 32)
    2.3 p ≠ 1, 2, ∞ (LPREGR) (p. 59)
    2.4 p = 1 (Method of Least Absolute Deviations: A478L1, AFKL1, BLOD1) (p. 69)
    2.5 p = ∞ (Method of Least Maximum Absolute Deviation: A328LI, A495LI, ABDLI) (p. 96)
    2.6 Comparison of Residuals (RES) and Choice of p (p. 119)
    2.7 The Elimination of Outliers (p. 127)
    2.8 Selection of Variables (SCR, SCRFL1) (p. 136)
    2.9 Clusterwise Linear Regression (CWLL2R, CWLL1R, CWLLIR) (p. 157)
    2.10 Average Linear Regression (AVLLSQ) (p. 196)
    References (p. 203)
Chapter III. Robust Regression (ROBUST) (p. 204)
    References (p. 216)
Chapter IV. Ridge Regression (RRL2, RRL1, RRLI) (p. 218)
    References (p. 227)
Chapter V. Linear Lp Regression with Linear Constraints (p. 228)
    5.1 Introduction (p. 228)
    5.2 p = 2 (CL2) (p. 231)
    5.3 p = 1 (CL1) (p. 240)
    5.4 p = ∞ (CLI) (p. 249)
    References (p. 259)
Chapter VI. Linear Lp Regression with Nonnegative Parameters (p = 2: NNLS; p = 1: NNL1; p = ∞: NNLI) (p. 260)
    References (p. 268)
Chapter VII. Orthogonal Linear Lp Regression (p. 270)
    7.1 Fundamentals (p. 270)
    7.2 p = 2 (L2ORTH) (p. 276)
    7.3 p ≠ 1, 2, ∞ (LPORTH) (p. 282)
    7.4 p = 1 (L1ORTH) (p. 291)
    7.5 p = ∞ (LIORTH) (p. 297)
    7.6 Comparison of Residuals and Choice of p (p. 301)
    7.7 Orthogonal L2 Regression with Linear Manifolds (LMORTH) (p. 305)
    References (p. 309)
Final Remarks (p. 310)
List of Subroutines (p. 313)
Appendix: Examples (p. 314)
Index (p. 336)