Parametric Statistical Inference
Basic Theory and Modern Approaches
1st Edition - January 1, 1981
Author: Shelemyahu Zacks
Editors: V. Lakshmikantham, C. P. Tsokos
Language: English
eBook ISBN: 9781483150499
Parametric Statistical Inference: Basic Theory and Modern Approaches presents the developments and modern trends in statistical inference to students who do not have an advanced mathematical and statistical background. The topics discussed in the book are basic and common to many fields of statistical inference and thus serve as a springboard for in-depth study. The book is organized into eight chapters. Chapter 1 provides an overview of how the theory of statistical inference is presented in subsequent chapters. Chapter 2 briefly discusses statistical distributions and their properties. Chapter 3 is devoted to sufficient statistics and the information in samples, and Chapter 4 presents some basic results from the theory of testing statistical hypotheses. In Chapter 5, the classical theory of estimation is developed. Chapter 6 discusses the efficiency of estimators and some of their large-sample properties, while Chapter 7 treats confidence intervals. Finally, Chapter 8 covers the decision-theoretic and Bayesian approaches to testing and estimation. Senior undergraduate and graduate students in statistics and mathematics, and those who have taken an introductory course in probability, will benefit greatly from this book.
List of Illustrations
Chapter 1 General Review
1.1 Introduction
1.2 Statistical Models, Distribution Functions and the Essence of Statistical Inference
1.3 The Information in Samples and Sufficient Statistics
1.4 Testing Statistical Hypotheses
1.5 Estimation Theory
1.6 The Efficiency of Estimators
1.7 Confidence and Tolerance Intervals
1.8 Decision Theoretic and Bayesian Approach in Testing and Estimation
Chapter 2 Basic Theory of Statistical Distributions
2.1 Introductory Remarks
2.2 Elementary Properties of Distribution Functions
2.2.1 Discrete Distributions
2.2.2 Absolutely Continuous Distributions
2.2.3 Inverse Functions
2.2.4 Transformations
2.3 Some Families of Discrete Distributions
2.3.1 Binomial Distributions
2.3.2 Hypergeometric Distributions
2.3.3 Poisson Distributions
2.3.4 Geometric, Pascal and Negative Binomial
2.4 Some Families of Continuous Distributions
2.4.1 Rectangular Distributions
2.4.2 Beta Distributions
2.4.3 Gamma Distributions
2.4.4 Weibull and Extreme Value Distributions
2.4.5 Normal Distributions
2.4.6 Normal Approximations
2.5 Expectations, Moments and Generating Functions
2.6 Joint Distributions, Conditional Distributions and Independence
2.6.1 Joint Distributions
2.6.2 Conditional Distributions
2.6.3 Independence
2.6.4 Transformations
2.7 Moments and Covariances of Linear Functions
2.8 Discrete Multivariate Distributions
2.8.1 Multinomial Distributions
2.8.2 Multivariate Negative Binomial
2.8.3 Multivariate Hypergeometric
2.9 Multinormal Distributions
2.9.1 Basic Theory
2.9.2 Distributions of Subvectors and Distributions of Linear Forms
2.9.3 Independence of Linear Forms
2.9.4 Normal Probability Transformations
2.10 Distributions of Symmetric Quadratic Forms of Normal Variables
2.11 Independence of Linear and Quadratic Forms of Normal Variables
2.12 The Order Statistics
2.13 The T-Distributions
2.14 The F-Distributions
2.15 The Distribution of the Sample Correlation
2.16 Limit Theorems
2.17 Problems
Chapter 3 Sufficient Statistics and the Information in Samples
3.1 Introduction
3.2 Definitions and Characterization of Sufficient Statistics
3.3 Likelihood Functions and Minimal Sufficient Statistics
3.4 Sufficient Statistics and Exponential Type Families
3.5 Sufficiency and Completeness
3.6 Information Functions and Sufficiency
3.6.1 The Fisher Information
3.6.2 The Kullback-Leibler Information
3.7 Problems
Chapter 4 Testing Statistical Hypotheses
4.1 The General Framework
4.2 The Neyman-Pearson Fundamental Lemma
4.3 Testing One-Sided Composite Hypotheses in MLR Models
4.4 Testing Two-Sided Hypotheses in One-Parameter Exponential Families
4.5 Testing Composite Hypotheses with Nuisance Parameters: Unbiased Tests
4.6 Likelihood Ratio Tests
4.6.1 Testing in Normal Regression Theory
4.6.2 Comparison of Normal Means: The Analysis of Variance
4.7 The Analysis of Contingency Tables
4.7.1 The Structure of Multi-Way Contingency Tables and the Statistical Model
4.7.2 Testing the Significance of Association
4.7.3 The Analysis of 2x2 Tables
4.7.4 Likelihood Ratio Tests
4.8 Sequential Testing of Hypotheses
4.8.1 The Wald Sequential Probability Ratio Test
4.8.2 Sequential Tests with Power
4.9 Problems
Chapter 5 Estimation Theory
5.1 General Discussion
5.2 Unbiased Estimators
5.2.1 General Definition and Example
5.2.2 Minimum Variance Unbiased Estimators
5.2.3 Bias Reduction by Jackknifing
5.3 Best Linear Unbiased and Least Squares Estimators
5.3.1 Best Linear Unbiased Estimators of the Mean
5.3.2 Least Squares and Best Linear Unbiased Estimators for Linear Models
5.3.3 Best Linear Combination of Order Statistics
5.4 Stabilizing the Least Squares Estimators: Ridge Regression
5.5 Maximum Likelihood Estimation
5.5.1 Definition and Examples
5.5.2 Maximum Likelihood Estimators in Exponential Type Families
5.5.3 The Invariance Principle
5.5.4 Numerical Problems
5.5.5 Anomalous Cases
5.5.6 MLE of the Parameters of Tolerance Distributions
5.6 Equivariant Estimators
5.6.1 The Structure of Equivariant Estimators
5.6.2 Minimum MSE Equivariant Estimators
5.6.3 The Pitman Estimators
5.7 Moment-Equations Estimators
5.8 Pre-Test Estimators
5.9 Robust Estimators
5.10 Problems
Chapter 6 The Efficiency of Estimators
6.1 General Introduction
6.2 The Cramér-Rao Lower Bound in Regular One-Parameter Cases
6.3 Extension of the Cramér-Rao Inequality to Multiparameter Cases
6.4 General Inequalities of the Cramér-Rao Type
6.5 The Efficiency of Estimators in Small Samples
6.6 Asymptotic Properties of Estimators
6.6.1 The Consistency of MLE
6.6.2 Asymptotic Normality and Efficiency of MLE
6.7 Second-Order Asymptotic Efficiency
6.8 Maximum Probability Estimators
6.9 Problems
Chapter 7 Confidence and Tolerance Intervals
7.1 General Introduction
7.2 The Construction of Confidence Intervals
7.3 Optimal Confidence Intervals
7.4 Large Sample Approximations
7.5 Tolerance Intervals
7.6 Distribution-Free Confidence and Tolerance Intervals
7.7 Simultaneous Confidence Intervals
7.8 Two-Stage and Sequential Sampling for Fixed-Width Confidence Intervals
7.9 Problems
Chapter 8 Decision Theoretic and Bayesian Approach in Testing and Estimation
8.1 The Bayesian Framework
8.1.1 Prior, Posterior and Predictive Distributions
8.1.2 Bayesian Information Functions
8.1.3 Non-Informative and Improper Prior Distributions
8.1.4 Risk Functions, Bayes and Minimax Procedures
8.1.5 Bayes Sequential Decision Procedures
8.2 Bayesian Testing of Hypotheses
8.2.1 Testing Simple Hypotheses
8.2.2 Testing Composite Hypotheses
8.2.3 Bayes Sequential Testing of Hypotheses
8.3 Bayesian Confidence Intervals
8.4 Bayes and Minimax Estimators
8.4.1 General Discussion and Examples
8.4.2 Bayesian Estimates in Linear Models
8.4.3 Minimax Estimators
8.5 Minimax Risk and Bayes Equivariant, Formal Bayes and Structural Estimators
8.5.1 Minimum Risk and Bayes Equivariant Estimators
8.5.2 Formal Bayes Estimators for Invariant Priors
8.5.3 Equivariant Estimators Based on Structural Distributions
8.6 Empirical Bayes Estimators
8.7 The Admissibility of Estimators
8.7.1 Some Basic Results
8.7.2 The Inadmissibility of Some Commonly Used Estimators
8.7.3 Minimax and Admissible Estimators of the Location Parameter
8.8 Problems
References
Indexes
Author Index
Subject Index
No. of pages: 404
Edition: 1
Published: January 1, 1981
Imprint: Pergamon
V. Lakshmikantham
University of Texas at Arlington, USA