E-book, English, 360 pages, Web PDF
Gruber / Lieberman / Olkin Regression Estimators: A Comparative Study
1st edition, 2014
ISBN: 978-1-4832-6097-6
Publisher: Elsevier Science & Techn.
Format: PDF
Copy protection: 1 - PDF Watermark
Regression Estimators: A Comparative Study presents, compares, and contrasts the development and the properties of the ridge type estimators that result from both Bayesian and non-Bayesian (frequentist) methods. The book is divided into four parts. The first part (Chapters I and II) discusses the need for alternatives to least square estimators, gives a historical survey of the literature, and summarizes basic ideas in Matrix Theory and Statistical Decision Theory used throughout the book. The second part (Chapters III and IV) covers the estimators from both the Bayesian and the frequentist points of view and explores the mathematical relationships between them. The third part (Chapters V-VIII) considers the efficiency of the estimators with and without averaging over a prior distribution. Part IV, the final two chapters IX and X, suggests applications of the methods and results of Chapters III-VII to Kalman Filters and Analysis of Variance, two very important areas of application. Statisticians and workers in fields that use statistical methods who would like to know more about the analytical properties of ridge type estimators will find the book invaluable.
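The contrast the description draws, between least square estimators on ill-conditioned data and ridge type estimators (equivalently, Bayes estimators under a normal prior), can be illustrated numerically. The following Python sketch is not from the book; it assumes a small hypothetical data set with two nearly collinear predictors, the standard formulas b_LS = (X'X)^(-1) X'y and b_ridge = (X'X + kI)^(-1) X'y, and an arbitrarily chosen ridge constant k = 1.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-conditioned design: two nearly collinear predictors.
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)
X = np.column_stack([x1, x2])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Least square estimator: solves X'X b = X'y; unstable when X'X is near-singular.
beta_ls = np.linalg.solve(X.T @ X, X.T @ y)

# Ordinary ridge estimator: (X'X + kI)^(-1) X'y with an arbitrary k;
# equivalently, the posterior mean under a zero-mean normal prior on beta.
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)

print("condition number of X'X:", np.linalg.cond(X.T @ X))
print("least squares estimate: ", beta_ls)
print("ridge estimate (k = 1): ", beta_ridge)

With this near-collinear design the least squares coefficients typically come out far from (1, 2) with huge variance, while the ridge estimate is shrunk toward zero and much more stable, which is the trade-off the book analyzes in detail.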
Further Information & Material
1;Front Cover;1
2;Regression Estimators: A Comparative Study;4
3;Copyright Page;5
4;Table of Contents;6
5;Preface;10
6;Part I: Introduction and Mathematical Preliminaries;14
6.1;Chapter I. Introduction;16
6.1.1;1.0. Motivation for Writing This Book;16
6.1.2;1.1. Purpose of This Book;17
6.1.3;1.2. Least Square Estimators and the Need for Alternatives;17
6.1.4;1.3. Historical Survey;24
6.1.5;1.4. The Structure of the Book;30
6.2;Chapter II. Mathematical and Statistical Preliminaries;32
6.2.1;2.0. Introduction;32
6.2.2;2.1. Matrix Theory Results;33
6.2.3;2.2. The Bayes Estimator;51
6.2.4;2.3. The Minimax Estimator;56
6.2.5;2.4. Criterion for Comparing Estimators: Theobald's 1974 Result;59
6.2.6;2.5. Some Useful Inequalities;61
6.2.7;2.6. Some Miscellaneous Useful Matrix Results;65
6.2.8;2.7. Summary;68
7;Part II: The Estimators;70
7.1;Chapter III. The Estimators;72
7.1.1;3.0. Introduction;72
7.1.2;3.1. The Least Square Estimator and Its Properties;73
7.1.3;3.2. The Generalized Ridge Regression Estimator;83
7.1.4;3.3. The Mixed Estimators;87
7.1.5;3.4. The Linear Minimax Estimator;96
7.1.6;3.5. The Bayes Estimator;100
7.1.7;3.6. Summary and Remarks;105
7.2;Chapter IV. How the Different Estimators Are Related;106
7.2.1;4.0. Introduction;106
7.2.2;4.1. Alternative Forms of the Bayes Estimator Full Rank Case;107
7.2.3;4.2. Alternative Forms of the Bayes Estimator Non-Full Rank Case;113
7.2.4;4.3. The Equivalence of the Generalized Ridge Estimator and the Bayes Estimator;125
7.2.5;4.4. The Equivalence of the Mixed Estimator and the Bayes Estimator;129
7.2.6;4.5. Ridge Estimators in the Literature as Special Cases of the BE, Minimax Estimators, or Mixed Estimators;138
7.2.7;4.6. Extension of Results to the Case where U'FU Is Not Positive Definite;146
7.2.8;4.7. An Extension of the Gauss-Markov Theorem;158
7.2.9;4.8. Summary and Remarks;160
8;Part III: The Efficiencies of the Estimators;162
8.1;Chapter V. Measures of Efficiency of the Estimators;164
8.2;Chapter VI. The Average MSE;172
8.2.1;6.0. Introduction;172
8.2.2;6.1. The Forms of the MSE for the Minimax, Bayes and the Mixed Estimator;173
8.2.3;6.2. Relationship Between the Average Variance and the MSE;178
8.2.4;6.3. The Average Variance and the MSE of the BE;179
8.2.5;6.4. Alternative Forms of the MSE of the Mixed Estimator;184
8.2.6;6.5. Comparison of the MSE of Different BE;186
8.2.7;6.6. Comparison of the Ridge and Contraction Estimator's MSE;193
8.2.8;6.7. Summary and Remarks;196
8.3;Chapter VII. The MSE Neglecting the Prior Assumptions;198
8.3.1;7.0. Introduction;198
8.3.2;7.1. The MSE of the BE;199
8.3.3;7.2. The MSE of the Mixed Estimators Neglecting the Prior Assumptions;203
8.3.4;7.3. The Comparison of the Conditional MSE of the Bayes Estimator and the Least Square Estimator and the Comparison of the Conditional and the Average MSE;206
8.3.5;7.4. The Comparison of the MSE of a Mixed Estimator with the LS Estimators;222
8.3.6;7.5. The Comparison of the MSE of Two BE;227
8.3.7;7.6. Summary;238
8.4;Chapter VIII. The MSE for Incorrect Prior Assumptions;240
8.4.1;8.0. Introduction;240
8.4.2;8.1. The BE and Its MSE;241
8.4.3;8.2. The Minimax Estimator;249
8.4.4;8.3. The Mixed Estimator;251
8.4.5;8.4. Contaminated Priors;255
8.4.6;8.5. Contaminated (Mixed) Bayes Estimators;259
8.4.7;8.6. Summary;263
9;Part IV: Applications;264
9.1;Chapter IX. The Kalman Filter;266
9.1.1;9.0. Introduction;266
9.1.2;9.1. The Kalman Filter as a Bayes Estimator;268
9.1.3;9.2. The Kalman Filter as a Recursive Least Square Estimator and the Connection with the Mixed Estimator;274
9.1.4;9.3. The Minimax Estimator;281
9.1.5;9.4. The Generalized Ridge Estimator;283
9.1.6;9.5. The Average MSE;286
9.1.7;9.6. The MSE for Incorrect Initial Prior Assumptions;290
9.1.8;9.7. Applications;292
9.1.9;9.8. Recursive Ridge Regression;297
9.1.10;9.9. Summary;300
9.2;Chapter X. Experimental Design Models;302
9.2.1;10.0. Introduction;302
9.2.2;10.1. The One Way ANOVA Model;304
9.2.3;10.2. The Bayes and Empirical Bayes Estimators;315
9.2.4;10.3. The Two Way Classification;319
9.2.5;10.4. The Bayes and Empirical Bayes Estimators;326
9.2.6;10.5. Summary;333
9.2.7;Appendix to Section 10.2;334
10;Bibliography;340
11;Author Index;348
12;Subject Index;352