
Book, English, 240 pages, format (W × H): 168 mm × 244 mm

Series: Wiley Series in Probability and Statistics

Ghosh

Statistical Planning and Inference

Concepts and Applications
1st edition, 2025
ISBN: 978-1-119-96278-6
Publisher: John Wiley & Sons Inc



Explore the foundations of, and cutting-edge developments in, statistics

Statistical Planning and Inference: Concepts and Applications delivers a robust introduction to statistical planning and inference, covering both classical and computer-age developments in statistical science. The book examines the challenges faced in statistical planning and inference, exploring optimum methods while identifying their limitations and commonly encountered pitfalls.

It addresses linear and non-linear statistical inference and discusses noise-effect reduction, error rates, balanced and unbalanced data, model selection, discrimination and classification, truncated and censored data, and experimental designs.

Each chapter offers readers problems with solutions and illustrative examples that introduce the concepts and methods discussed.

The book offers:

- Analysis of both classical theory and modern developments in the field of statistical inference and planning
- Expansive discussions of linear and non-linear statistical inference
- Statistical problems and solutions to test the reader's progress and retention of the material

Aimed at practitioners and researchers in the field of statistics, Statistical Planning and Inference: Concepts and Applications is also a must-read resource for graduate students, professors, and researchers in the life sciences, agriculture, psychology, education and measurement, sociology, computer and engineering sciences, and all other fields that rely on statistical concepts.


Authors/Editors


Further Information & Material


Preface xi

1 Foundation of Experiments 1

1.1 Uncertainties in Evidences 1

1.2 Examples 2

1.2.1 The Louis Pasteur Anthrax Vaccination Experiment 2

1.2.2 The Lanarkshire Milk Experiment: Milk Tests in Lanarkshire Schools 2

1.3 Replication, Randomization, Blocking, and Blinding 4

1.3.1 Replication 4

1.3.2 Randomization 4

1.3.3 Blocking 4

1.3.4 Blinding 4

1.4 Figuring It Out! 4

Questions and Answers 5

Bibliography 6

2 Completely Randomized Design 7

2.1 An Example 7

2.2 Analyses Using R and SAS 9

2.3 Figuring It Out! 12

Bibliography 16

3 Randomized Complete Block Design 17

3.1 Fixed Effects Model 18

3.2 Binomial Model for Signs 20

3.3 Randomization Model 20

3.4 Mixed Effects Model 25

3.5 General Mixed Effects Model 27

3.6 The REML Variance Components Estimates 28

3.7 BLUEs and BLUPs 31

3.7.1 The Conditional Model 32

3.7.2 The Unconditional Model 32

3.7.3 Computation—The Conditional Model 33

3.7.4 Computation—The Unconditional Model 34

3.8 Figuring It Out! 39

Bibliography 40

4 Randomized Incomplete Block Design 41

4.1 Model M1: Fixed-Effects Model 41

4.2 Model M2: Mixed-Effects Model 43

4.3 Research Questions 44

4.4 Figuring It Out! 45

4.5 Definitions 46

Exercises 46

Bibliography 51

5 Error Rates 53

5.1 Definitions of Error Rates 53

5.2 Single-Stage Methods 55

5.3 A Multistage Method 56

5.3.1 Benjamini and Hochberg Method 57

5.4 Figuring It Out! 58

Questions 59

Bibliography 62

6 Nutrition Experiment 63

6.1 Figuring It Out! 63

Bibliography 75

7 The Pearson Dependence 77

7.1 Bivariate Normal Distribution 77

7.2 Estimation of Unknown Parameters 79

7.2.1 The Unconditional Model 79

7.2.2 The Conditional Model 81

7.2.3 Test of Significance 83

7.3 A Bayesian Estimation 84

7.4 Exercises 86

Bibliography 87

8 The Multivariate Dependence 89

8.1 The Multivariate Normal Distribution 90

8.2 Inference 91

8.3 Partial Dependence 96

8.4 Exercises 96

Bibliography 98

9 The Conditional Mean Dependence 99

9.1 LS Estimation 100

9.2 Ridge Estimation 101

9.2.1 A Bayesian Estimation 103

9.3 Dependence of Ridge Estimator on the Tuning Parameter 103

9.4 LASSO Estimation 104

9.5 Dependence of LASSO Estimators on the Tuning Parameter 105

Bibliography 116

10 More Parameters Than Observations 119

10.1 Learning by Doing—Exercises 122

Exercises 123

Bibliography 125

11 Eigenvalues, Eigenvectors, and Applications 127

11.1 Eigenvalues and Eigenvectors 127

11.2 Second-Order Response Surface 129

Exercises 132

Bibliography 133

12 Covariance Estimation 135

12.1 Model 1 135

12.1.1 Characterization of the Covariance Matrix and Its Estimators 135

12.1.2 Likelihood Function 136

12.1.3 Properties 137

12.2 Model 2 137

12.2.1 Characterization of the Covariance Matrix and Its Estimators 138

12.3 Model 3 138

12.4 Model 4 139

12.5 Model 5 140

12.6 Exercises 141

Bibliography 142

13 Discriminant Analysis 145

13.1 Learning from the Univariate Data—Two Normal Populations with Equal Variances 145

13.1.1 Discriminant Analysis for the Univariate Data 147

13.1.2 Example—Univariate Discriminant Analysis 148

13.2 Learning from the Univariate Data—Two Normal Populations with Unequal Variances 151

13.2.1 Classification of 25 Versicolor Iris Flowers 153

13.2.2 Classification of 25 Setosa Iris Flowers 154

13.2.3 Test of Homogeneity of Variances 154

13.3 Learning from the Multivariate Data 155

13.3.1 Classification of Versicolor and Setosa 156

13.3.2 Classification of Versicolor and Virginica 158

13.4 Logistic Regression 159

13.5 Exercises 160

Bibliography 162

14 Optimizing the Variance–Bias Trade-Off 163

14.1 Variance–Bias Trade-Off 163

14.1.1 Example 1 164

14.1.2 Example 2 165

14.1.3 Example 3 166

14.2 Information in Data 167

14.3 Information and Design in Presence of a Covariate 169

14.3.1 Information 169

14.3.2 Optimum Design for a Covariate 170

14.4 Information and Design in Presence of Multiple Covariates 171

14.4.1 Information 171

14.4.2 Exponential Model 175

14.4.3 Exponential Regression Model with Multiple Covariates 176

14.4.4 Poisson Log-Linear Model 177

14.4.5 Non-parametric Regression Model 180

14.5 Exercises 183

Bibliography 187

15 Specification, Discrimination, Robustness, and Sensitivity 189

15.1 The Global and Local Optimal Models 189

15.2 The T-Optimal Design 190

15.3 Convex and Concave Functions 192

15.4 The Kullback–Leibler (KL) Divergence 194

15.5 The KL Design Optimality 197

15.6 The Differential Entropy 198

15.7 Lindley Information Measure 200

15.8 Joint Entropy, Conditional Entropy, and Mutual Information 202

15.9 Maximum Entropy Sampling 204

15.10 Search Linear Models and Search Designs 207

15.10.1 Factorial Experiments 209

15.10.2 Search Probability Matrix 210

15.11 Robustness Against Unavailable Data 210

15.12 Influential Sets of Observations 212

15.13 Exercises 213

Bibliography 214

Data Index 217

Subject Index 219


Subir Ghosh is a Professor of Statistics at the University of California, Riverside, USA. He is known for his research work in Statistical Design and Analysis of Experiments and Modeling. He is an elected fellow of the American Statistical Association and the American Association for the Advancement of Science, and an elected member of the International Statistical Institute. He received the following awards at the University of California, Riverside:

- 2012-2016 Distinguished Teaching Professor and member of the UCR Academy of Distinguished Teachers
- 2003 Graduate Council Dissertation Advisor/Mentoring Award
- 1993 Academic Senate Distinguished Teaching Award

He also served as executive editor of the Journal of Statistical Planning and Inference from 2000 to 2003.


