E-book, English, 624 pages
Roussas An Introduction to Probability and Statistical Inference
2nd edition, 2014
ISBN: 978-0-12-800437-1
Publisher: Elsevier Science & Techn.
Format: EPUB
Copy protection: ePub watermark
An Introduction to Probability and Statistical Inference, Second Edition, guides readers through probability models and statistical methods and helps them think critically about various concepts. Written by award-winning author George Roussas, this book introduces readers with no prior knowledge of probability or statistics to a thinking process that helps them obtain the best solution to a posed question or situation. It provides a plethora of examples for each topic discussed, giving readers more experience in applying statistical methods to different situations. The text contains an enhanced number of exercises and graphical illustrations where appropriate to motivate readers and demonstrate the applicability of probability and statistical inference in a great variety of human activities. Reorganized material in the statistical portion of the book ensures continuity and enhances understanding. Each section includes relevant proofs where appropriate, followed by exercises with useful clues to their solutions. Brief answers to even-numbered exercises appear at the back of the book, and detailed solutions to all exercises are available to instructors in an Answers Manual. The text will appeal to advanced undergraduate and graduate students, as well as researchers and practitioners in engineering, business, the social sciences, and agriculture.
- Content, examples, an enhanced number of exercises, and graphical illustrations where appropriate to motivate the reader and demonstrate the applicability of probability and statistical inference in a great variety of human activities
- Reorganized material in the statistical portion of the book to ensure continuity and enhance understanding
- A relatively rigorous, yet accessible and always within the prescribed prerequisites, mathematical discussion of probability theory and statistical inference important to students in a broad variety of disciplines
- Relevant proofs where appropriate in each section, followed by exercises with useful clues to their solutions
- Brief answers to even-numbered exercises at the back of the book and detailed solutions to all exercises available to instructors in an Answers Manual
George G. Roussas earned a B.S. in Mathematics with honors from the University of Athens, Greece, and a Ph.D. in Statistics from the University of California, Berkeley. As of July 2014, he is a Distinguished Professor Emeritus of Statistics at the University of California, Davis. Roussas is the author of five books, the author or co-author of five special volumes, and the author or co-author of dozens of research articles published in leading journals and special volumes. He is a Fellow of the following professional societies: the American Statistical Association (ASA), the Institute of Mathematical Statistics (IMS), the Royal Statistical Society (RSS), and the American Association for the Advancement of Science (AAAS); he is also an Elected Member of the International Statistical Institute (ISI) and a Corresponding Member of the Academy of Athens. Roussas has served as an associate editor of four journals since their inception and is now a member of the Editorial Board of the journal Statistical Inference for Stochastic Processes. Throughout his career, Roussas served as Dean, Vice President for Academic Affairs, and Chancellor at two universities; he also served as an Associate Dean at UC Davis, helping to transform that institution's statistical unit into one of national and international renown. Roussas has been honored with a Festschrift, and he has given featured interviews for Statistical Science and the Statistical Periscope. He contributed an obituary to the IMS Bulletin for Professor-Academician David Blackwell of UC Berkeley, and he was the coordinating editor of an extensive article of contributions honoring Professor Blackwell, published in the Notices of the American Mathematical Society and Celebratio Mathematica.
Authors/Editors
Further information & material
1;Front Cover;1
2;Inside Front Cover;2
3;An Introduction to Probability and Statistical Inference;6
4;Copyright;7
5;Dedication;8
6;Contents;10
7;Preface;14
7.1;Overview;14
7.2;Chapter Descriptions;14
7.3;Features;14
7.4;Brief Preface of the Revised Version;16
7.5;Acknowledgments and Credits;16
8;Chapter 1: Some motivating examples and some fundamental concepts;18
8.1;1.1 Some motivating examples;18
8.2;1.2 Some fundamental concepts;25
8.3;1.3 Random variables;37
9;Chapter 2: The concept of probability and basic results;40
9.1;2.1 Definition of probability and some basic results;40
9.1.1;2.1.1 Some basic properties of a probability function;44
9.1.2;2.1.2 Justification;45
9.2;2.2 Distribution of a random variable;51
9.3;2.3 Conditional probability and related results;59
9.4;2.4 Independent events and related results;72
9.5;2.5 Basic concepts and results in counting;81
10;Chapter 3: Numerical characteristics of a random variable, some special random variables;94
10.1;3.1 Expectation, variance, and moment generating function of a random variable;94
10.2;3.2 Some probability inequalities;107
10.3;3.3 Some special random variables;110
10.3.1;3.3.1 The discrete case;110
10.3.1.1;Binomial distribution;110
10.3.1.2;Geometric distribution;112
10.3.1.3;Poisson distribution;114
10.3.1.4;Hypergeometric distribution;116
10.3.2;3.3.2 The continuous case;117
10.3.2.1;Gamma distribution;117
10.3.2.2;Negative Exponential distribution;119
10.3.2.3;Chi-square distribution;121
10.3.2.4;Normal distribution;121
10.3.2.5;Uniform (or Rectangular) distribution;126
10.4;3.4 Median and mode of a random variable;142
11;Chapter 4: Joint and conditional p.d.f.'s, conditional expectation and variance, moment generating function, covariance, and correlation coefficient;152
11.1;4.1 Joint d.f. and joint p.d.f. of two random variables;152
11.2;4.2 Marginal and conditional p.d.f.'s, conditional expectation and variance;161
11.3;4.3 Expectation of a function of two r.v.'s, joint and marginal m.g.f.'s, covariance, and correlation coefficient;175
11.5;4.4 Some generalizations to k random variables;188
11.6;4.5 The Multinomial, the Bivariate Normal, and the multivariate Normal;190
11.6.1;4.5.1 Multinomial distribution;190
11.6.2;4.5.2 Bivariate Normal distribution;193
11.6.3;4.5.3 Multivariate Normal distribution;198
12;Chapter 5: Independence of random variables and some applications;204
12.1;5.1 Independence of random variables and criteria of independence;204
12.2;5.2 The reproductive property of certain distributions;215
13;Chapter 6: Transformation of random variables;224
13.1;6.1 Transforming a single random variable;224
13.2;6.2 Transforming two or more random variables;229
13.3;6.3 Linear transformations;243
13.4;6.4 The probability integral transform;249
13.5;6.5 Order statistics;250
14;Chapter 7: Some modes of convergence of random variables, applications;262
14.1;7.1 Convergence in distribution or in probability and their relationship;262
14.2;7.2 Some applications of convergence in distribution: WLLN and CLT;268
14.2.1;7.2.1 Applications of the WLLN;270
14.2.2;7.2.2 Applications of the CLT;274
14.2.3;7.2.3 The continuity correction;276
14.3;7.3 Further limit theorems;284
15;Chapter 8: An overview of statistical inference;290
15.1;8.1 The basics of point estimation;291
15.2;8.2 The basics of interval estimation;293
15.3;8.3 The basics of testing hypotheses;294
15.4;8.4 The basics of regression analysis;298
15.5;8.5 The basics of analysis of variance;299
15.6;8.6 The basics of nonparametric inference;301
16;Chapter 9: Point estimation;304
16.1;9.1 Maximum likelihood estimation: Motivation and examples;304
16.2;9.2 Some properties of MLE's;317
16.3;9.3 Uniformly minimum variance unbiased estimates;325
16.4;9.4 Decision-theoretic approach to estimation;334
16.5;9.5 Other methods of estimation;341
17;Chapter 10: Confidence intervals and confidence regions;346
17.1;10.1 Confidence intervals;346
17.2;10.2 Confidence intervals in the presence of nuisance parameters;354
17.3;10.3 A confidence region for (μ, σ²) in the N(μ, σ²) distribution;357
17.4;10.4 Confidence intervals with approximate confidence coefficient;360
18;Chapter 11: Testing hypotheses;366
18.1;11.1 General concepts, formulation of some testing hypotheses;367
18.2;11.2 Neyman–Pearson fundamental lemma, Exponential type families, UMP tests for some composite hypotheses;369
18.2.1;11.2.1 Exponential type families of p.d.f.'s;376
18.2.2;11.2.2 UMP Tests for some composite hypotheses;377
18.3;11.3 Some applications of Theorems 2 and 3;380
18.3.1;11.3.1 Further uniformly most powerful tests for some composite hypotheses;389
18.3.2;11.3.2 An application of Theorem 3;390
18.4;11.4 Likelihood ratio tests;392
18.4.1;11.4.1 Testing hypotheses for the parameters in a single Normal population;395
18.4.2;11.4.2 Comparing the parameters of two Normal populations;401
19;Chapter 12: More about testing hypotheses;414
19.1;12.1 Likelihood ratio tests in the Multinomial case and contingency tables;414
19.2;12.2 A goodness-of-fit test;420
19.3;12.3 Decision-theoretic approach to testing hypotheses;425
19.4;12.4 Relationship between testing hypotheses and confidence regions;432
20;Chapter 13: A simple linear regression model;436
20.1;13.1 Setting up the model—the principle of least squares;436
20.2;13.2 The least squares estimates of β1 and β2 and some of their properties;439
20.3;13.3 Normally distributed errors: MLE's of β1, β2, and σ², some distributional results;447
20.4;13.4 Confidence intervals and hypotheses testing problems;456
20.5;13.5 Some prediction problems;462
20.6;13.6 Proof of Theorem 5;466
20.7;13.7 Concluding remarks;468
21;Chapter 14: Two models of analysis of variance;470
21.1;14.1 One-way layout with the same number of observations per cell;470
21.1.1;14.1.1 The MLE's of the parameters of the model;471
21.1.2;14.1.2 Testing the hypothesis of equality of means;472
21.1.3;14.1.3 Proof of Lemmas in Section 14.1;478
21.2;14.2 A multicomparison method;480
21.3;14.3 Two-way layout with one observation per cell;486
21.3.1;14.3.1 The MLE's of the parameters of the model;487
21.3.2;14.3.2 Testing the hypothesis of no row or no column effects;488
21.3.3;14.3.3 Proof of Lemmas in Section 14.3;494
22;Chapter 15: Some topics in nonparametric inference;502
22.1;15.1 Some confidence intervals with given approximate confidence coefficient;503
22.2;15.2 Confidence intervals for quantiles of a distribution function;505
22.3;15.3 The two-sample sign test;507
22.4;15.4 The rank sum and the Wilcoxon–Mann–Whitney two-sample tests;512
22.4.1;15.4.1 Proofs of Lemmas 1 and 2;520
22.5;15.5 Nonparametric curve estimation;522
22.5.1;15.5.1 Nonparametric estimation of a probability density function;522
22.5.2;15.5.2 Nonparametric regression estimation;527
23;Tables;534
24;Some notation and abbreviations;568
25;Answers to even-numbered exercises;572
26;Index;616
27;Inside Back Cover;624