E-book, English, 398 pages
Series: Springer Texts in Statistics
Sheather, A Modern Approach to Regression with R
1st edition, 2009
ISBN: 978-0-387-09608-7
Publisher: Springer
Format: PDF
Copy protection: PDF watermark
This book focuses on tools and techniques for building valid regression models using real-world data. A key theme throughout the book is that it only makes sense to base inferences or conclusions on valid models.
Authors/Editors
More information & material
1;Preface;7
2;Contents;11
3;Introduction;15
3.1;1.1 Building Valid Models;15
3.2;1.2 Motivating Examples;15
3.3;1.2.1 Assessing the Ability of NFL Kickers;15
3.4;1.2.2 Newspaper Circulation;18
3.5;1.2.3 Menu Pricing in a New Italian Restaurant in New York City;19
3.6;1.2.4 Effect of Wine Critics’ Ratings on Prices of Bordeaux Wines;22
3.7;1.3 Level of Mathematics;27
4;Simple Linear Regression;29
4.1;2.1 Introduction and Least Squares Estimates;29
4.2;2.1.1 Simple Linear Regression Models;29
4.3;2.2 Inferences About the Slope and the Intercept;34
4.4;2.2.1 Assumptions Necessary in Order to Make Inferences About the Regression Model;35
4.5;2.2.2 Inferences About the Slope of the Regression Line;35
4.6;2.2.3 Inferences About the Intercept of the Regression Line;37
4.7;2.3 Confidence Intervals for the Population Regression Line;38
4.8;2.4 Prediction Intervals for the Actual Value of Y;39
4.9;2.5 Analysis of Variance;41
4.10;2.6 Dummy Variable Regression;44
4.11;2.7 Derivations of Results;47
4.12;2.7.1 Inferences about the Slope of the Regression Line;48
4.13;2.7.2 Inferences about the Intercept of the Regression Line;49
4.14;2.7.3 Confidence Intervals for the Population Regression Line;50
4.15;2.7.4 Prediction Intervals for the Actual Value of Y;51
4.16;2.8 Exercises;52
5;Diagnostics and Transformations for Simple Linear Regression;58
5.1;3.1 Valid and Invalid Regression Models: Anscombe’s Four Data Sets;58
5.2;3.1.1 Residuals;61
5.3;3.1.2 Using Plots of Residuals to Determine Whether the Proposed Regression Model Is a Valid Model;62
5.4;3.1.3 Example of a Quadratic Model;63
5.5;3.2 Regression Diagnostics: Tools for Checking the Validity of a Model;63
5.6;3.2.1 Leverage Points;64
5.7;3.2.2 Standardized Residuals;72
5.8;3.2.3 Recommendations for Handling Outliers and Leverage Points;79
5.9;3.2.4 Assessing the Influence of Certain Cases;80
5.10;3.2.5 Normality of the Errors;82
5.11;3.2.6 Constant Variance;84
5.12;3.3 Transformations;89
5.13;3.3.1 Using Transformations to Stabilize Variance;89
5.14;3.3.2 Using Logarithms to Estimate Percentage Effects;92
5.15;3.3.3 Using Transformations to Overcome Problems due to Nonlinearity;96
5.16;3.4 Exercises;116
6;Weighted Least Squares;127
6.1;4.1 Straight-Line Regression Based on Weighted Least Squares;127
6.2;4.1.1 Prediction Intervals for Weighted Least Squares;130
6.3;4.1.2 Leverage for Weighted Least Squares;130
6.4;4.1.3 Using Least Squares to Calculate Weighted Least Squares;131
6.5;4.1.4 Defining Residuals for Weighted Least Squares;133
6.6;4.1.5 The Use of Weighted Least Squares;133
6.7;4.2 Exercises;134
7;Multiple Linear Regression;136
7.1;5.1 Polynomial Regression;136
7.2;5.2 Estimation and Inference in Multiple Linear Regression;141
7.3;5.3 Analysis of Covariance;151
7.4;5.4 Exercises;157
8;Diagnostics and Transformations for Multiple Linear Regression;161
8.1;6.1 Regression Diagnostics for Multiple Regression;161
8.2;6.1.1 Leverage Points in Multiple Regression;162
8.3;6.1.2 Properties of Residuals in Multiple Regression;164
8.4;6.1.3 Added Variable Plots;172
8.5;6.2 Transformations;177
8.6;6.2.1 Using Transformations to Overcome Nonlinearity;177
8.7;6.2.2 Using Logarithms to Estimate Percentage Effects: Real Valued Predictor Variables;194
8.8;6.3 Graphical Assessment of the Mean Function Using Marginal Model Plots;199
8.9;6.4 Multicollinearity;205
8.10;6.4.1 Multicollinearity and Variance Inflation Factors;213
8.11;6.5 Case Study: Effect of Wine Critics’ Ratings on Prices of Bordeaux Wines;213
8.12;6.6 Pitfalls of Observational Studies Due to Omitted Variables;220
8.13;6.6.1 Spurious Correlation Due to Omitted Variables;220
8.14;6.6.2 The Mathematics of Omitted Variables;223
8.15;6.6.3 Omitted Variables in Observational Studies;224
8.16;6.7 Exercises;225
9;Variable Selection;236
9.1;7.1 Evaluating Potential Subsets of Predictor Variables;237
9.2;7.1.1 Criterion 1: R²-Adjusted;237
9.4;7.1.2 Criterion 2: AIC, Akaike’s Information Criterion;239
9.5;7.1.3 Criterion 3: AICc, Corrected AIC;240
9.7;7.1.4 Criterion 4: BIC, Bayesian Information Criterion;241
9.8;7.1.5 Comparison of AIC, AICc, and BIC;241
9.10;7.2 Deciding on the Collection of Potential Subsets of Predictor Variables;242
9.11;7.2.1 All Possible Subsets;242
9.12;7.2.2 Stepwise Subsets;245
9.13;7.2.3 Inference After Variable Selection;247
9.14;7.3 Assessing the Predictive Ability of Regression Models;248
9.15;7.3.1 Stage 1: Model Building Using the Training Data Set;248
9.16;7.3.2 Stage 2: Model Comparison Using the Test Data Set;256
9.17;7.4 Recent Developments in Variable Selection – LASSO;259
9.18;7.5 Exercises;261
10;Logistic Regression;271
10.1;8.1 Logistic Regression Based on a Single Predictor;271
10.2;8.1.1 The Logistic Function and Odds;273
10.3;8.1.2 Likelihood for Logistic Regression with a Single Predictor;276
10.4;8.1.3 Explanation of Deviance;279
10.5;8.1.4 Using Differences in Deviance Values to Compare Models;280
10.6;8.1.5 R² for Logistic Regression;281
10.8;8.1.6 Residuals for Logistic Regression;282
10.9;8.2 Binary Logistic Regression;285
10.10;8.2.1 Deviance for the Case of Binary Data;288
10.11;8.2.2 Residuals for Binary Data;289
10.12;8.2.3 Transforming Predictors in Logistic Regression for Binary Data;290
10.13;8.2.4 Marginal Model Plots for Binary Data;294
10.14;8.3 Exercises;302
11;Serially Correlated Errors;312
11.1;9.1 Autocorrelation;312
11.2;9.2 Using Generalized Least Squares When the Errors Are AR(1);317
11.3;9.2.1 Generalized Least Squares Estimation;318
11.4;9.2.2 Transforming a Model with AR(1) Errors into a Model with iid Errors;322
11.5;9.2.3 A General Approach to Transforming GLS into LS;323
11.6;9.3 Case Study;326
11.7;9.4 Exercises;332
12;Mixed Models;337
12.1;10.1 Random Effects;337
12.2;10.1.1 Maximum Likelihood and Restricted Maximum Likelihood;340
12.3;10.1.2 Residuals in Mixed Models;351
12.4;10.2 Models with Covariance Structures Which Vary Over Time;359
12.5;10.2.1 Modeling the Conditional Mean;360
12.6;10.3 Exercises;374
13;Appendix: Nonparametric Smoothing;376
13.1;A.1 Kernel Density Estimation;376
13.2;A.2 Nonparametric Regression for a Single Predictor;379
13.3;A.2.1 Local Polynomial Kernel Methods;380
13.4;A.2.2 Penalized Linear Regression Splines;384
14;Index;392