- Contents: 1. Introduction. 2. Preliminary Concepts from Algebra, Functional Analysis and Probability Theory. 3. Fundamental Notions from Estimation Theory. 4. Estimators in the Case of Large Samples. 5. Linear and Quadratic Estimators. 6. Normality of Observation Vectors. 7. Some Other Types of Estimators. 8. Conclusion. References. Subject Index.
The application of estimation theory renders the processing of experimental results both rational and effective, and thus helps not only to make our knowledge more precise but also to determine the measure of its reliability. As a consequence, estimation theory is indispensable in the analysis of measuring processes and of experiments in general.
The background needed to study this book is probability and mathematical statistics at the level of a third- or fourth-year university course. For readers interested in applications, comparatively detailed chapters on linear and quadratic estimation and on the normality of observation vectors have been included. Chapter 2 collects selected results from algebra, functional analysis and probability theory, intended to facilitate the reading of the main text and to save the reader from looking up individual theorems in various textbooks and papers; it is mainly devoted to reproducing kernel Hilbert spaces, which are helpful in solving many estimation problems. The main text of the book begins with Chapter 3. This is divided into two parts: the first deals with sufficient statistics, complete sufficient statistics, minimal sufficient statistics and the relations between them; the second contains the most important inequalities of estimation theory for scalar- and vector-valued parameters and presents properties of the exponential family of distributions.
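The best known of these inequalities is the Cramér–Rao bound. As a minimal illustrative sketch (not taken from the book; the model, sample size and seed are chosen here for illustration), the following Python snippet checks by Monte Carlo that the sample mean of a normal sample with known variance attains the bound sigma^2 / n:

```python
import random
import statistics

# Illustrative assumption: X_1, ..., X_n i.i.d. N(mu, sigma^2) with sigma known.
# The sample mean is an unbiased estimator of mu, and the Cramer-Rao inequality
# says its variance cannot fall below sigma**2 / n; in this model the bound is attained.
random.seed(0)
mu, sigma, n, trials = 2.0, 3.0, 50, 2000

estimates = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    estimates.append(sum(sample) / n)          # sample mean of one simulated experiment

empirical_var = statistics.pvariance(estimates)  # Monte Carlo variance of the estimator
cramer_rao = sigma ** 2 / n                      # information bound for estimating mu

print(cramer_rao)        # 0.18
print(empirical_var)     # close to the bound
```

For efficient estimators such as this one, the empirical variance settles on the bound as the number of trials grows; for other models the inequality is strict.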
The fourth chapter is an introduction to asymptotic methods of estimation. The method of statistical moments and the maximum-likelihood method are investigated, and sufficient conditions for the asymptotic normality of the estimators are given for both methods.

Linear and quadratic methods of estimation are dealt with in the fifth chapter, which treats the method of least squares. Five basic regular versions of the regression model and the unified linear model of estimation are described, and unbiased estimators of the unit dispersion (the factor of the covariance matrix) are given for all of these cases. The equivalence of the least-squares method to the method of generalized minimum-norm inversion of the design matrix of the regression model is studied in detail, and the problem of estimating the covariance components in the mixed model is mentioned as well.

Statistical properties of the linear and quadratic estimators developed in the fifth chapter are given in Chapter 6 for the case of normally distributed errors of measurement. Further, the application of tensor products of Hilbert spaces generated by the covariance matrix of the random error vector of the observations is demonstrated.

Chapter 7 reviews some further important methods of estimation theory. In the first part, Wald's method of decision functions is applied to the construction of estimators. The method of contracted estimators and the method of Hoerl and Kennard are presented in the second part. The basic ideas of robustness and Bahadur's approach to estimation theory are presented in the third and fourth parts of this last chapter.
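Two of the estimators named above can be contrasted in a few lines. The sketch below (illustrative only; the regression model, data and shrinkage constant k are invented for this example, and NumPy stands in for the book's matrix notation) compares the ordinary least-squares estimator with the ridge-type estimator of Hoerl and Kennard, which contracts the least-squares solution toward zero:

```python
import numpy as np

# Simulated regression model y = X beta + e (illustrative data, not from the book).
rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + rng.normal(scale=0.1, size=n)

# Ordinary least squares: beta_hat = (X'X)^{-1} X'y.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Hoerl-Kennard ridge estimator: beta(k) = (X'X + kI)^{-1} X'y, with an
# assumed shrinkage constant k; it trades a little bias for reduced variance
# when X'X is ill-conditioned.
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

print(np.round(beta_ols, 2))   # close to the true [1.0, -2.0, 0.5]
print(np.linalg.norm(beta_ridge) < np.linalg.norm(beta_ols))  # True: ridge contracts
```

The contraction is not accidental: in the singular-value basis each component of the ridge solution is the least-squares component scaled by d^2 / (d^2 + k) < 1, so the ridge estimator always has smaller norm.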
- © North Holland 1988
- 1st November 1987
- North Holland