Regression Estimators

A Comparative Study

1st Edition - April 28, 1990

  • Author: Marvin H. J. Gruber
  • eBook ISBN: 9781483260976

Purchase options

DRM-free (PDF)

Institutional Subscription

Description

Regression Estimators: A Comparative Study presents, compares, and contrasts the development and the properties of the ridge type estimators that result from both Bayesian and non-Bayesian (frequentist) methods. The book is divided into four parts. The first part (Chapters I and II) discusses the need for alternatives to least square estimators, gives a historical survey of the literature, and summarizes the basic ideas in matrix theory and statistical decision theory used throughout the book. The second part (Chapters III and IV) covers the estimators from both the Bayesian and the frequentist points of view and explores the mathematical relationships between them. The third part (Chapters V-VIII) considers the efficiency of the estimators with and without averaging over a prior distribution. The fourth and final part (Chapters IX and X) suggests applications of the methods and results of Chapters III-VII to Kalman filters and analysis of variance, two very important areas of application. Statisticians and workers in fields that use statistical methods who would like to know more about the analytical properties of ridge type estimators will find the book invaluable.
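
For orientation, the generic textbook forms below illustrate how the least squares, ridge, and Bayes estimators that the book compares are related. This is a minimal sketch in standard notation (design matrix X, coefficient vector \beta, error variance \sigma^2, prior mean \theta, prior dispersion F), not an excerpt from the book:

\hat{\beta}_{LS} = (X'X)^{-1}X'y \quad \text{(least squares)}

\hat{\beta}_{k} = (X'X + kI)^{-1}X'y, \; k > 0 \quad \text{(ordinary ridge)}

\hat{\beta}_{B} = (X'X + \sigma^{2}F^{-1})^{-1}(X'y + \sigma^{2}F^{-1}\theta) \quad \text{(Bayes, prior } \beta \sim N(\theta, F)\text{)}

Setting \theta = 0 and F = (\sigma^{2}/k)I in the Bayes form recovers the ordinary ridge estimator, the kind of equivalence developed in Chapter IV.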

Table of Contents


  • Preface

    Part I Introduction and Mathematical Preliminaries

    Chapter I. Introduction

    1.0. Motivation for Writing This Book

    1.1. Purpose of This Book

    1.2. Least Square Estimators and the Need for Alternatives

    1.3. Historical Survey

    1.4. The Structure of the Book

    Chapter II. Mathematical and Statistical Preliminaries

    2.0. Introduction

    2.1. Matrix Theory Results

    2.2. The Bayes Estimator

    2.3. The Minimax Estimator

    2.4. Criterion for Comparing Estimators: Theobald's 1974 Result

    2.5. Some Useful Inequalities

    2.6. Some Miscellaneous Useful Matrix Results

    2.7. Summary

    Part II The Estimators

    Chapter III. The Estimators

    3.0. Introduction

    3.1. The Least Square Estimator and Its Properties

    3.2. The Generalized Ridge Regression Estimator

    3.3. The Mixed Estimators

    3.4. The Linear Minimax Estimator

    3.5. The Bayes Estimator

    3.6. Summary and Remarks

    Chapter IV. How the Different Estimators Are Related

    4.0. Introduction

    4.1. Alternative Forms of the Bayes Estimator Full Rank Case

    4.2. Alternative Forms of the Bayes Estimator Non-Full Rank Case

    4.3. The Equivalence of the Generalized Ridge Estimator and the Bayes Estimator

    4.4. The Equivalence of the Mixed Estimator and the Bayes Estimator

    4.5. Ridge Estimators in the Literature as Special Cases of the BE, Minimax Estimators, or Mixed Estimators

    4.6. Extension of Results to the Case where U'FU Is Not Positive Definite

    4.7. An Extension of the Gauss-Markov Theorem

    4.8. Summary and Remarks

    Part III The Efficiencies of the Estimators

    Chapter V. Measures of Efficiency of the Estimators

    Chapter VI. The Average MSE

    6.0. Introduction

    6.1. The Forms of the MSE for the Minimax, Bayes and the Mixed Estimator

    6.2. Relationship Between the Average Variance and the MSE

    6.3. The Average Variance and the MSE of the BE

    6.4. Alternative Forms of the MSE of the Mixed Estimator

    6.5. Comparison of the MSE of Different BE

    6.6. Comparison of the Ridge and Contraction Estimator's MSE

    6.7. Summary and Remarks

    Chapter VII. The MSE Neglecting the Prior Assumptions

    7.0. Introduction

    7.1. The MSE of the BE

    7.2. The MSE of the Mixed Estimators Neglecting the Prior Assumptions

    7.3. The Comparison of the Conditional MSE of the Bayes Estimator and the Least Square Estimator and the Comparison of the Conditional and the Average MSE

    7.4. The Comparison of the MSE of a Mixed Estimator with the LS Estimators

    7.5. The Comparison of the MSE of Two BE

    7.6. Summary

    Chapter VIII. The MSE for Incorrect Prior Assumptions

    8.0. Introduction

    8.1. The BE and Its MSE

    8.2. The Minimax Estimator

    8.3. The Mixed Estimator

    8.4. Contaminated Priors

    8.5. Contaminated (Mixed) Bayes Estimators

    8.6. Summary

    Part IV Applications

    Chapter IX. The Kalman Filter

    9.0. Introduction

    9.1. The Kalman Filter as a Bayes Estimator

    9.2. The Kalman Filter as a Recursive Least Square Estimator and the Connection with the Mixed Estimator

    9.3. The Minimax Estimator

    9.4. The Generalized Ridge Estimator

    9.5. The Average MSE

    9.6. The MSE for Incorrect Initial Prior Assumptions

    9.7. Applications

    9.8. Recursive Ridge Regression

    9.9. Summary

    Chapter X. Experimental Design Models

    10.0. Introduction

    10.1. The One Way ANOVA Model

    10.2. The Bayes and Empirical Bayes Estimators

    10.3. The Two Way Classification

    10.4. The Bayes and Empirical Bayes Estimators

    10.5. Summary

    Appendix to Section 10.2

    Bibliography

    Author Index

    Subject Index

Product details

  • No. of pages: 360
  • Language: English
  • Copyright: © Academic Press 1990
  • Published: April 28, 1990
  • Imprint: Academic Press
  • eBook ISBN: 9781483260976

About the Author

Marvin H. J. Gruber

About the Editors

Gerald J. Lieberman

Ingram Olkin

Affiliations and Expertise

Stanford University, California
