Bingham / Fry
Regression: Linear Models in Statistics
1st edition, 2010
Series: Springer Undergraduate Mathematics Series
E-book, English, 293 pages
ISBN: 978-1-84882-969-5
Publisher: Springer
Format: PDF
Copy protection: PDF watermark
Regression is the branch of Statistics in which a dependent variable of interest is modelled as a linear combination of one or more predictor variables, together with a random error. The subject is inherently two- or higher-dimensional, so an understanding of Statistics in one dimension is essential.
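In the simplest case, with a single predictor, this model reads (in illustrative notation of our own, not necessarily the book's):

    y_i = a + b x_i + e_i,   i = 1, ..., n,

where the errors e_i are independent with mean 0 and common variance sigma^2. The method of least squares, the book's starting point, estimates a and b by minimising the sum of squared deviations sum_i (y_i - a - b x_i)^2.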
Regression: Linear Models in Statistics fills the gap between introductory statistical theory and more specialist sources of information. In doing so, it provides the reader with a number of worked examples, and exercises with full solutions.
The book begins with simple linear regression (one predictor variable) and analysis of variance (ANOVA), and then builds on these with multiple linear regression (several predictor variables) and analysis of covariance (ANCOVA). The book concludes with special topics such as non-parametric regression, mixed models, time series, spatial processes and the design of experiments.
Aimed at second- and third-year undergraduates studying Statistics, Regression: Linear Models in Statistics requires a basic knowledge of (one-dimensional) Statistics, as well as Probability and standard Linear Algebra. Possible companions include John Haigh's Probability Models and T. S. Blyth & E. F. Robertson's Basic Linear Algebra and Further Linear Algebra.
Authors/Editors
Further information & material
Table of contents (PDF page numbers in parentheses):

Preface (8)
Contents (12)
1. Linear Regression (16)
  1.1 Introduction (16)
  1.2 The Method of Least Squares (18)
    1.2.1 Correlation version (22)
    1.2.2 Large-sample limit (23)
  1.3 The origins of regression (24)
  1.4 Applications of regression (26)
  1.5 The Bivariate Normal Distribution (29)
  1.6 Maximum Likelihood and Least Squares (36)
  1.7 Sums of Squares (38)
  1.8 Two regressors (41)
  Exercises (43)
2. The Analysis of Variance (ANOVA) (48)
  2.1 The Chi-Square Distribution (48)
  2.2 Change of variable formula and Jacobians (51)
  2.3 The Fisher F-distribution (52)
  2.4 Orthogonality (53)
  2.5 Normal sample mean and sample variance (54)
  2.6 One-Way Analysis of Variance (57)
  2.7 Two-Way ANOVA: No Replications (64)
  2.8 Two-Way ANOVA: Replications and Interaction (67)
  Exercises (71)
3. Multiple Regression (75)
  3.1 The Normal Equations (75)
  3.2 Solution of the Normal Equations (78)
  3.3 Properties of Least-Squares Estimators (84)
  3.4 Sum-of-Squares Decompositions (87)
    3.4.1 Coefficient of determination (93)
  3.5 Chi-Square Decomposition (94)
    3.5.1 Idempotence, Trace and Rank (95)
    3.5.2 Quadratic forms in normal variates (96)
    3.5.3 Sums of Projections (96)
  3.6 Orthogonal Projections and Pythagoras's Theorem (99)
  3.7 Worked examples (103)
  Exercises (108)
4. Further Multilinear Regression (112)
  4.1 Polynomial Regression (112)
    4.1.1 The Principle of Parsimony (115)
    4.1.2 Orthogonal polynomials (116)
    4.1.3 Packages (116)
  4.2 Analysis of Variance (117)
  4.3 The Multivariate Normal Distribution (118)
  4.4 The Multinormal Density (124)
    4.4.1 Estimation for the multivariate normal (126)
  4.5 Conditioning and Regression (128)
  4.6 Mean-square prediction (134)
  4.7 Generalised least squares and weighted regression (136)
  Exercises (138)
5. Adding additional covariates and the Analysis of Covariance (141)
  5.1 Introducing further explanatory variables (141)
    5.1.1 Orthogonal parameters (145)
  5.2 ANCOVA (147)
    Interactions (148)
    5.2.1 Nested Models (151)
      Update (151)
      Akaike Information Criterion (AIC) (152)
      Step (152)
  5.3 Examples (152)
  Exercises (157)
6. Linear Hypotheses (161)
  6.1 Minimisation Under Constraints (161)
  6.2 Sum-of-Squares Decomposition and F-Test (164)
  6.3 Applications: Sequential Methods (169)
    6.3.1 Forward selection (169)
    6.3.2 Backward selection (170)
    6.3.3 Stepwise regression (171)
  Exercises (172)
7. Model Checking and Transformation of Data (175)
  7.1 Deviations from Standard Assumptions (175)
    Residual Plots (175)
    Scatter Plots (175)
    Non-constant Variance (176)
    Unaccounted-for Structure (176)
    Outliers (176)
    Detecting outliers via residual analysis (177)
    Influential Data Points (178)
    Cook's distance (179)
    Non-additive or non-Gaussian errors (180)
    Correlated Errors (180)
  7.2 Transformation of Data (180)
    Dimensional Analysis (183)
  7.3 Variance-Stabilising Transformations (183)
    Taylor's Power Law (184)
    Delta Method (185)
  7.4 Multicollinearity (186)
    Regression Diagnostics (189)
  Exercises (189)
8. Generalised Linear Models (193)
  8.1 Introduction (193)
  8.2 Definitions and examples (195)
    8.2.1 Statistical testing and model comparisons (197)
    8.2.2 Analysis of residuals (199)
    8.2.3 Athletics times (200)
  8.3 Binary models (202)
  8.4 Count data, contingency tables and log-linear models (205)
  8.5 Over-dispersion and the Negative Binomial Distribution (209)
    8.5.1 Practical applications: Analysis of over-dispersed models in R (211)
  Exercises (212)
9. Other topics (214)
  9.1 Mixed models (214)
    9.1.1 Mixed models and Generalised Least Squares (217)
  9.2 Non-parametric regression (222)
    9.2.1 Kriging (224)
  9.3 Experimental Design (226)
    9.3.1 Optimality criteria (226)
    9.3.2 Incomplete designs (227)
  9.4 Time series (230)
    9.4.1 Cointegration and spurious regression (231)
  9.5 Survival analysis (233)
    9.5.1 Proportional hazards (235)
  9.6 p >> n (236)
Solutions (237)
Dramatis Personae: Who did what when (279)
Bibliography (281)
Index (288)