Book:Morris H. DeGroot/Probability and Statistics/Fourth Edition


Morris H. DeGroot and Mark J. Schervish: Probability and Statistics (4th Edition)

Published $\text {2011}$


Subject Matter

Probability Theory
Statistics

Contents

Preface
1 Introduction to Probability
1.1 The History of Probability
1.2 Interpretations of Probability
1.3 Experiments and Events
1.4 Set Theory
1.5 The Definition of Probability
1.6 Finite Sample Spaces
1.7 Counting Methods
1.8 Combinatorial Methods
1.9 Multinomial Coefficients
1.10 The Probability of a Union of Events
1.11 Statistical Swindles
1.12 Supplementary Exercises
2 Conditional Probability
2.1 The Definition of Conditional Probability
2.2 Independent Events
2.3 Bayes' Theorem
2.4 The Gambler's Ruin Problem
2.5 Supplementary Exercises
3 Random Variables and Distributions
3.1 Random Variables and Discrete Distributions
3.2 Continuous Distributions
3.3 The Cumulative Distribution Function
3.4 Bivariate Distributions
3.5 Marginal Distributions
3.6 Conditional Distributions
3.7 Multivariate Distributions
3.8 Functions of a Random Variable
3.9 Functions of Two or More Random Variables
3.10 Markov Chains
3.11 Supplementary Exercises
4 Expectation
4.1 The Expectation of a Random Variable
4.2 Properties of Expectations
4.3 Variance
4.4 Moments
4.5 The Mean and the Median
4.6 Covariance and Correlation
4.7 Conditional Expectation
4.8 Utility
4.9 Supplementary Exercises
5 Special Distributions
5.1 Introduction
5.2 The Bernoulli and Binomial Distributions
5.3 The Hypergeometric Distributions
5.4 The Poisson Distributions
5.5 The Negative Binomial Distributions
5.6 The Normal Distributions
5.7 The Gamma Distributions
5.8 The Beta Distributions
5.9 The Multinomial Distributions
5.10 The Bivariate Normal Distributions
5.11 Supplementary Exercises
6 Large Random Samples
6.1 Introduction
6.2 The Law of Large Numbers
6.3 The Central Limit Theorem
6.4 The Correction for Continuity
6.5 Supplementary Exercises
7 Estimation
7.1 Statistical Inference
7.2 Prior and Posterior Distributions
7.3 Conjugate Prior Distributions
7.4 Bayes Estimators
7.5 Maximum Likelihood Estimators
7.6 Properties of Maximum Likelihood Estimators
7.7 Sufficient Statistics
7.8 Jointly Sufficient Statistics
7.9 Improving an Estimator
7.10 Supplementary Exercises
8 Sampling Distributions of Estimators
8.1 The Sampling Distribution of a Statistic
8.2 The Chi-Square Distributions
8.3 Joint Distribution of the Sample Mean and Sample Variance
8.4 The $t$ Distributions
8.5 Confidence Intervals
8.6 Bayesian Analysis of Samples from a Normal Distribution
8.7 Unbiased Estimators
8.8 Fisher Information
8.9 Supplementary Exercises
9 Testing Hypotheses
9.1 Problems of Testing Hypotheses
9.2 Testing Simple Hypotheses
9.3 Uniformly Most Powerful Tests
9.4 Two-Sided Alternatives
9.5 The $t$ Test
9.6 Comparing the Means of Two Normal Distributions
9.7 The $F$ Distributions
9.8 Bayes Test Procedures
9.9 Foundational Issues
9.10 Supplementary Exercises
10 Categorical Data and Nonparametric Methods
10.1 Tests of Goodness-of-Fit
10.2 Goodness-of-Fit for Composite Hypotheses
10.3 Contingency Tables
10.4 Tests of Homogeneity
10.5 Simpson's Paradox
10.6 Kolmogorov-Smirnov Tests
10.7 Robust Estimation
10.8 Sign and Rank Tests
10.9 Supplementary Exercises
11 Linear Statistical Models
11.1 The Method of Least Squares
11.2 Regression
11.3 Statistical Inference in Simple Linear Regression
11.4 Bayesian Inference in Simple Linear Regression
11.5 The General Linear Model and Multiple Regression
11.6 Analysis of Variance
11.7 The Two-Way Layout
11.8 The Two-Way Layout with Replications
11.9 Supplementary Exercises
12 Simulation
12.1 What Is Simulation?
12.2 Why Is Simulation Useful?
12.3 Simulating Specific Distributions
12.4 Importance Sampling
12.5 Markov Chain Monte Carlo
12.6 The Bootstrap
12.7 Supplementary Exercises
Tables
Answers to Odd-Numbered Exercises
References
Index


Further Editions