Stapleton: Models for Probability and Statistical Inference
Theory and Applications
E-book, English, 464 pages
1st edition, 2008
ISBN: 978-0-470-18340-3
Publisher: John Wiley & Sons
Format: PDF
Copy protection: Adobe DRM
Series: Wiley Series in Probability and Statistics
This concise yet thorough book is enhanced with simulations and graphs to build the intuition of readers.
Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping.
Ideal as a textbook for a two-semester sequence on probability and statistical inference, the book's early chapters provide coverage of probability and include discussions of: discrete models and random variables; discrete distributions including the binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses modes of convergence of sequences of random variables, with special attention to convergence in distribution. The second half of the book addresses statistical inference, beginning with a discussion of point estimation and followed by coverage of consistency and confidence intervals. Further areas of exploration include: distributions defined in terms of the multivariate normal, chi-square, t, and F (central and noncentral); the one- and two-sample Wilcoxon tests, together with methods of estimation based on both; linear models with a linear space-projection approach; and logistic regression.
Each section contains a set of problems ranging in difficulty from simple to more complex, and selected answers as well as proofs of almost all statements are provided. An abundance of figures, in addition to helpful simulations and graphs produced by the statistical package S-Plus®, is included to help build the intuition of readers.
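The book's own simulations and graphs are produced in S-Plus. As a rough illustration of the kind of simulation-based intuition described above, the short Python sketch below (not taken from the book; sample sizes and Monte Carlo settings are arbitrary) demonstrates convergence in distribution via the central limit theorem by comparing the empirical distribution of standardized means of Exponential(1) samples with the standard normal CDF.

import math
import numpy as np

rng = np.random.default_rng(0)

def standardized_means(n, reps=10_000):
    """Return standardized sample means of n iid Exponential(1) variables."""
    x = rng.exponential(scale=1.0, size=(reps, n))
    # Exponential(1) has mean 1 and variance 1, so the standardized mean
    # is sqrt(n) * (sample mean - 1).
    return (x.mean(axis=1) - 1.0) * math.sqrt(n)

def empirical_vs_normal_cdf(z, n):
    """Compare P(Z_n <= z) estimated by simulation with the N(0, 1) CDF."""
    emp = np.mean(standardized_means(n) <= z)
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return emp, phi

for n in (5, 30, 200):
    emp, phi = empirical_vs_normal_cdf(1.0, n)
    print(f"n={n:4d}: empirical P(Z_n <= 1) = {emp:.3f}, Phi(1) = {phi:.3f}")

As n grows, the empirical probability approaches Phi(1) ≈ 0.841, which is the kind of behavior the text's treatment of convergence in distribution formalizes.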
Further Information & Material
1. Probability Models.
1.1 Discrete Probability Models.
1.2 Conditional Probability and Independence.
1.3 Random Variables.
1.4 Expectation.
1.5 The Variance.
1.6 Covariance and Correlation.
2. Special Discrete Distributions.
2.1 The Binomial Distribution.
2.2 The Hypergeometric Distribution.
2.3 The Geometric and Negative Binomial Distributions.
2.4 The Poisson Distribution.
3. Continuous Random Variables.
3.1 Continuous RV's and Their Distributions.
3.2 Expected Values and Variances.
3.3 Transformations of Random Variables.
3.4 Joint Densities.
4. Special Continuous Distributions.
4.1 The Normal Distribution.
4.2 The Gamma Distribution.
5. Conditional Distributions.
5.1 The Discrete Case.
5.2 Conditional Expectations for the Discrete Case.
5.3 Conditional Densities and Expectations for Continuous RV's.
6. Limit Laws.
6.1 Moment Generating Functions.
6.2 Convergence in Probability and in Distribution.
6.3 The Central Limit Theorem.
6.4 The Delta-Method.
7. Estimation.
7.1 Point Estimation.
7.2 The Method of Moments.
7.3 Maximum Likelihood.
7.4 Consistency.
7.5 The O-Method.
7.6 Confidence Intervals.
7.7 Fisher Information, the Cramér-Rao Bound, and Asymptotic Normality of MLE's.
7.8 Sufficiency.
8. Testing Hypotheses.
8.1 Introduction.
8.2 The Neyman-Pearson Lemma.
8.3 The Likelihood Ratio Test.
8.4 The p-Value and the Relationship Between Tests of Hypotheses and Confidence Intervals.
9. The Multivariate Normal, Chi-Square, t, and F-Distributions.
9.1 The Multivariate Normal Distribution.
9.2 The Central and Noncentral Chi-Square Distributions.
9.3 Student's t-Distribution.
9.4 The F-Distribution.
10. Nonparametric Statistics.
10.1 The Wilcoxon Test and Estimator.
10.2 One Sample Methods.
10.3 The Kolmogorov-Smirnov Tests.
11. Linear Models.
11.1 The Principle of Least Squares.
11.2 Linear Models.
11.3 F-Tests for H0.
11.4 Two-Way Analysis of Variance.
12. Frequency Data.
12.1 Logistic Regression.
12.2 Two-Way Frequency Tables.
12.3 Chi-Square Goodness of Fit Tests.
13. Miscellaneous Topics.
13.1 Survival Analysis.
13.2 Bootstrapping.
13.3 Bayesian Statistics.
13.4 Sampling.