Statistical Models 1st Edition by Davison – Ebook PDF Instant Download/Delivery: 0521773393, 9780521773393
Product details:
ISBN 10: 0521773393
ISBN 13: 9780521773393
Author: A.C. Davison
Models and likelihood are the backbone of modern statistics. This 2003 book gives an integrated development of these topics that blends theory and practice, intended for advanced undergraduate and graduate students, researchers and practitioners. Its breadth is unrivaled, with sections on survival analysis, missing data, Markov chains, Markov random fields, point processes, graphical models, simulation and Markov chain Monte Carlo, estimating functions, asymptotic approximations, local likelihood and spline regressions as well as on more standard topics such as likelihood and linear and generalized linear models. Each chapter contains a wide range of problems and exercises. Practicals in the S language designed to build computing and data analysis skills, and a library of data sets to accompany the book, are available over the Web.
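As a flavor of the likelihood-based computing the book's practicals develop (the book uses the S language; here is a minimal sketch in Python, not taken from the book), one can simulate a normal random sample, write down its log likelihood, and check that the closed-form maximum likelihood estimates do at least as well as nearby parameter values:

```python
# Minimal sketch (illustrative, not from the book): maximum likelihood
# for a normal random sample Y_1, ..., Y_n ~ N(mu, sigma^2).
import math
import random
import statistics

random.seed(1)
y = [random.gauss(5.0, 2.0) for _ in range(500)]  # simulated data

def log_lik(mu, sigma, data):
    """Normal log likelihood: sum over observations of the log density."""
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu)**2 / (2 * sigma**2) for x in data)

# Closed-form maximum likelihood estimates for the normal model:
mu_hat = statistics.fmean(y)                      # sample mean
sigma_hat = math.sqrt(sum((x - mu_hat)**2 for x in y) / len(y))

# The MLE maximizes the log likelihood, so perturbed values score no higher.
assert log_lik(mu_hat, sigma_hat, y) >= log_lik(mu_hat + 0.1, sigma_hat, y)
assert log_lik(mu_hat, sigma_hat, y) >= log_lik(mu_hat, sigma_hat + 0.1, y)
```

The same pattern — simulate, evaluate the likelihood, compare estimators — recurs throughout the book's chapters on likelihood, simulation, and Bayesian computation.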
Statistical Models 1st Edition Table of Contents:
1 Introduction
Examples
Outline
Notation
2 Variation
2.1 Statistics and Sampling Variation
2.1.1 Data summaries
Location and scale
Bad data
Shape
Graphs
2.1.2 Random sample
2.1.3 Sampling variation
2.1.4 Probability plots
Exercises 2.1
2.2 Convergence
2.2.1 Modes of convergence
Convergence in probability
Convergence in distribution
Slutsky's lemma
2.2.2 Delta method
Several variables
Big and little oh notation: O and o
Exercises 2.2
2.3 Order Statistics
Density function
Several order statistics
Approximate density
Derivation of (2.27)
Exercises 2.3
2.4 Moments and Cumulants
Cumulants
Skewness and kurtosis
Exercises 2.4
2.5 Bibliographic Notes
2.6 Problems
3 Uncertainty
3.1 Confidence Intervals
3.1.1 Standard errors and pivots
Complications
Interpretation
3.1.2 Choice of scale
3.1.3 Tests
3.1.4 Prediction
Exercises 3.1
3.2 Normal Model
3.2.1 Normal and related distributions
Normal distribution
Chi-squared distribution
Student t distribution
F distribution
3.2.2 Normal random sample
3.2.3 Multivariate normal distribution
Multivariate normal distribution
Marginal and conditional distributions
Linear combinations of normal variables
Two samples
Joint distribution of Ȳ and S²
Exercises 3.2
3.3 Simulation
3.3.1 Pseudo-random numbers
Inversion
Rejection
Applications
3.3.2 Variance reduction
Importance sampling
Exercises 3.3
3.4 Bibliographic Notes
3.5 Problems
4 Likelihood
4.1 Likelihood
4.1.1 Definition and examples
Dependent data
4.1.2 Basic properties
Interpretation
Exercises 4.1
4.2 Summaries
4.2.1 Quadratic approximation
4.2.2 Sufficient statistics
Minimal sufficiency
Exercises 4.2
4.3 Information
4.3.1 Expected and observed information
4.3.2 Efficiency
Exercises 4.3
4.4 Maximum Likelihood Estimator
4.4.1 Computation
4.4.2 Large-sample distribution
Scalar parameter
Vector parameter
Consistency of θ̂
Asymptotic normality of θ̂
Exercises 4.4
4.5 Likelihood Ratio Statistic
4.5.1 Basic ideas
4.5.2 Profile log likelihood
4.5.3 Model fit
Chi-squared statistics
Derivations of (4.39) and (4.43)
Exercises 4.5
4.6 Non-Regular Models
Parameter space
Parameter identifiability
Score and information
Wrong model
Exercises 4.6
4.7 Model Selection
Exercises 4.7
4.8 Bibliographic Notes
4.9 Problems
5 Models
5.1 Straight-Line Regression
Linear combinations
Exercises 5.1
5.2 Exponential Family Models
5.2.1 Basic notions
Mean parameter
Variance function
5.2.2 Families of order p
Curved exponential families
5.2.3 Inference
Model adequacy
Likelihood
Derived densities
Exercises 5.2
5.3 Group Transformation Models
Equivariance
Exercises 5.3
5.4 Survival Data
5.4.1 Basic ideas
Hazard and survivor functions
Censoring
5.4.2 Likelihood inference
Discrete data
5.4.3 Product-limit estimator
5.4.4 Other ideas
Competing risks
Frailty
Exercises 5.4
5.5 Missing Data
5.5.1 Types of missingness
Publication bias
5.5.2 EM algorithm
Exponential family models
Exercises 5.5
5.6 Bibliographic Notes
5.7 Problems
6 Stochastic Models
6.1 Markov Chains
6.1.1 Markov chains
Classification of chains
Likelihood inference
Higher-order models
6.1.2 Continuous-time models
Fully observed trajectory
Partially observed trajectory
Inhomogeneous chains
Exercises 6.1
6.2 Markov Random Fields
6.2.1 Basic notions
6.2.2 Directed acyclic graphs
Hammersley–Clifford theorem
Exercises 6.2
6.3 Multivariate Normal Data
6.3.1 Multivariate dependence
Simpson's paradox
6.3.2 Multivariate normal distribution
6.3.3 Graphical Gaussian models
Conditional independence graphs
Calculation of partial correlation
Exercises 6.3
6.4 Time Series
Stationarity and autocorrelation
Trend removal
Volatility models
Exercises 6.4
6.5 Point Processes
6.5.1 Poisson process
Homogeneous Poisson process
6.5.2 Statistics of extremes
Point process approximation
6.5.3 More general models
Exercises 6.5
6.6 Bibliographic Notes
6.7 Problems
7 Estimation and Hypothesis Testing
7.1 Estimation
7.1.1 Mean squared error
Cramér–Rao lower bound
7.1.2 Kernel density estimation
7.1.3 Minimum variance unbiased estimation
Rao–Blackwell theorem
Completeness
7.1.4 Interval estimation
Exercises 7.1
7.2 Estimating Functions
7.2.1 Basic notions
Optimality
7.2.2 Robustness
7.2.3 Dependent data
Exercises 7.2
7.3 Hypothesis Tests
7.3.1 Significance levels
Interpretation
Goodness of fit tests
One- and two-sided tests
Nonparametric tests
7.3.2 Comparison of tests
Neyman–Pearson lemma
Local power
7.3.3 Composite null hypotheses
Conditioning
Invariance
7.3.4 Link with confidence intervals
Exercises 7.3
7.4 Bibliographic Notes
7.5 Problems
8 Linear Regression Models
8.1 Introduction
Exercises 8.1
8.2 Normal Linear Model
8.2.1 Estimation
8.2.2 Geometrical interpretation
8.2.3 Likelihood quantities
Likelihood ratio statistic
8.2.4 Weighted least squares
Exercises 8.2
8.3 Normal Distribution Theory
8.3.1 Distributions of β̂
8.3.2 Confidence and prediction intervals
Exercises 8.3
8.4 Least Squares and Robustness
M-estimation
Misspecified variance
Exercises 8.4
8.5 Analysis of Variance
8.5.1 F statistics
8.5.2 Sums of squares
Analysis of variance
8.5.3 Orthogonality
Exercises 8.5
8.6 Model Checking
8.6.1 Residuals
8.6.2 Nonlinearity
8.6.3 Leverage, influence, and case deletion
Exercises 8.6
8.7 Model Building
8.7.1 General
8.7.2 Collinearity
8.7.3 Automatic variable selection
Stepwise methods
Likelihood criteria
Inference after model selection
Model uncertainty
Exercises 8.7
8.8 Bibliographic Notes
8.9 Problems
9 Designed Experiments
9.1 Randomization
9.1.1 Randomization
Blocking
Randomization inference
9.1.2 Causal inference
Exercises 9.1
9.2 Some Standard Designs
9.2.1 One-way layout
9.2.2 Randomized block design
Balanced incomplete block design
9.2.3 Latin square
9.2.4 Factorial design
Exercises 9.2
9.3 Further Notions
9.3.1 Interaction
Confounding
9.3.2 Contrasts
9.3.3 Analysis of covariance
Exercises 9.3
9.4 Components of Variance
9.4.1 Basic ideas
Nested variation
Split-unit experiments
9.4.2 Linear mixed models
Prediction of random effects
Exercises 9.4
9.5 Bibliographic Notes
9.6 Problems
10 Nonlinear Regression Models
10.1 Introduction
10.2 Inference and Estimation
10.2.1 Likelihood inference
10.2.2 Iterative weighted least squares
10.2.3 Model checking
Exercises 10.2
10.3 Generalized Linear Models
10.3.1 Density and link functions
10.3.2 Estimation and inference
Exercises 10.3
10.4 Proportion Data
10.4.1 Binary data
10.4.2 2 × 2 table
Small sample analysis
Exercises 10.4
10.5 Count Data
10.5.1 Log-linear models
10.5.2 Contingency tables
Marginal models
10.5.3 Ordinal responses
Exercises 10.5
10.6 Overdispersion
Parametric models
Quasi-likelihood
Exercises 10.6
10.7 Semiparametric Regression
10.7.1 Local polynomial models
Choice of polynomial
Choice of smoothing parameter
Inference
Extensions
Computation of bias and variance
10.7.2 Roughness penalty methods
Penalized log likelihood
How much smoothing?
10.7.3 More general models
Exercises 10.7
10.8 Survival Data
10.8.1 Introduction
Accelerated life models
10.8.2 Proportional hazards model
Log rank test
Time-dependent covariates
Model checking
Counting processes and martingale residuals
Exercises 10.8
10.9 Bibliographic Notes
10.10 Problems
11 Bayesian Models
11.1 Introduction
11.1.1 Bayes' theorem
Inference
Prediction
11.1.2 Likelihood principle
Sufficiency and conditionality principles
Likelihood principle
11.1.3 Prior information
Conjugate densities
Ignorance
Jeffreys priors
Exercises 11.1
11.2 Inference
11.2.1 Posterior summaries
Normal approximation
Posterior confidence sets
11.2.2 Bayes factors
11.2.3 Model criticism
Marginal inference
Prediction diagnostics
11.2.4 Prediction and model averaging
Exercises 11.2
11.3 Bayesian Computation
11.3.1 Laplace approximation
Inference
11.3.2 Importance sampling
11.3.3 Markov chain Monte Carlo
Gibbs sampler
Output analysis
Bayesian application
Metropolis–Hastings algorithm
Exercises 11.3
11.4 Bayesian Hierarchical Models
Justification of (11.49)
Exercises 11.4
11.5 Empirical Bayes Inference
11.5.1 Basic ideas
11.5.2 Decision theory
Admissible decision rules
Shrinkage and squared error loss
Derivation of (11.64)
Exercises 11.5
11.6 Bibliographic Notes
11.7 Problems
12 Conditional and Marginal Inference
12.1 Ancillary Statistics
Basu's theorem
Location model
Difficulties with ancillaries
Exercises 12.1
12.2 Marginal Inference
Restricted maximum likelihood
Regression-scale model
Exercises 12.2
12.3 Conditional Inference
12.3.1 Exact conditioning
12.3.2 Saddlepoint approximation
Edgeworth series
Derivation of saddlepoint approximation
12.3.3 Approximate conditional inference
Conditional inference
Curved exponential family
Exercises 12.3
12.4 Modified Profile Likelihood
12.4.1 Likelihood adjustment
Derivation of (12.42)
12.4.2 Parameter orthogonality
Exercises 12.4
12.5 Bibliographic Notes
12.6 Problems
APPENDIX A: Practicals