Statistical Decision Theory in Adaptive Control Systems

1st Edition - January 1, 1967

  • Authors: Yoshikazu Sawaragi, Yoshifumi Sunahara, Takayoshi Nakamizo
  • eBook ISBN: 9781483266770

Description

Mathematics in Science and Engineering, Volume 39: Statistical Decision Theory in Adaptive Control Systems focuses on the combination of control theory with statistical decision theory. The volume is divided into nine chapters. Chapter 1 reviews the history of control theory and introduces statistical decision theory. The mathematical description of random processes is covered in Chapter 2. Chapter 3 treats the basic concepts of statistical decision theory, while Chapter 4 describes methods of solving statistical decision problems. The application of statistical decision concepts to control problems is explained in Chapter 5. Chapter 6 elaborates a method of designing an adaptive control system. An application of the sequential decision procedure to the design of decision adaptive control systems is illustrated in Chapter 7. Chapter 8 describes a method for the adaptive adjustment of parameters in nonlinear control systems, and the last chapter discusses future problems in applying statistical decision theory to control processes. This book is recommended for students and researchers concerned with statistical decision theory in adaptive control systems.
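
To give a flavor of the decision rules the book builds on, the sketch below implements a binary Bayes (likelihood-ratio) test of the kind treated in Chapters 4 and 5: deciding which of two known signal levels is present in noisy observations. The Gaussian noise model, priors, costs, and signal values here are illustrative assumptions, not an example taken from the text.

```python
import math

def bayes_decide(samples, s0, s1, sigma, p0=0.5, p1=0.5, c_fa=1.0, c_miss=1.0):
    """Decide H1 (signal s1 present) over H0 (signal s0 present) when the
    log-likelihood ratio exceeds the Bayes threshold log((p0 * c_fa) / (p1 * c_miss))."""
    # Log-likelihood ratio of the samples under i.i.d. additive Gaussian noise.
    llr = sum(((x - s0) ** 2 - (x - s1) ** 2) / (2.0 * sigma ** 2) for x in samples)
    threshold = math.log((p0 * c_fa) / (p1 * c_miss))
    return 1 if llr > threshold else 0

# Noisy observations clustered around s1 = 1.0 should favour hypothesis H1.
observations = [0.9, 1.2, 0.7, 1.1]
print(bayes_decide(observations, s0=0.0, s1=1.0, sigma=0.5))  # prints 1
```

With equal priors and costs the threshold reduces to zero, i.e. a maximum-likelihood choice. The sequential procedures of Chapter 7 differ in that the sample size is not fixed in advance: observation continues until the likelihood ratio crosses an upper or lower boundary, which is what the operating characteristic and average sample number functions characterize.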

Table of Contents


  • Preface

    Chapter 1. Introduction

    1.1 Historical Development of Automatic Control

    1.2 Control Systems and Stochastics

    1.3 Adaptive Control and Decision Theory

    Chapter 2. Mathematical Description of Random Processes

    2.1 Introductory Remarks

    2.2 Probability

    2.3 Joint Probability

    2.4 Conditional Probability

    2.5 Bayes' Theorem

    2.6 Probability Distribution and Probability Density Function

    2.7 Joint Probability Distribution and Joint Probability Density Function

    2.8 Conditional Probability Distribution and Conditional Probability Density Function

    2.9 Statistical Parameters of Random Variables

    2.10 Stochastic Processes

    2.11 Stationary Random Processes

    2.12 Ergodic Hypothesis and Time Averages

    2.13 Stationary Gaussian Random Processes

    Chapter 3. Basic Concept of Statistical Decision Theory

    3.1 Introductory Remarks

    3.2 General Description of the Decision Situation

    3.3 Signal Detection

    3.4 Signal Extraction

    Chapter 4. Evaluation Functions and Solutions in Statistical Decision Theory

    4.1 Introductory Remarks

    4.2 Basic Assumptions

    4.3 General Formulation of Evaluation Functions in Decision Problems

    4.4 Solutions of Decision Problems by the Bayes Criterion

    4.5 Solutions of Binary Detection Problems

    4.6 The Neyman-Pearson Detection Rule

    Chapter 5. Statistical Decision Concept in Control Processes

    5.1 Introductory Remarks

    5.2 Decision Adaptive Control Systems under Preassigned Error Probabilities

    5.3 Binary Decision Adaptive Control Systems Based on the Concept of the Sequential Test

    5.4 Decision Adaptive Control Systems Based on the Neyman-Pearson Test

    5.5 Ideal-Observer Decision-Making

    Chapter 6. Nonsequential Decision Approaches in Adaptive Control Systems

    6.1 Introductory Remarks

    6.2 Extension of the Binary Detection Concept to N-Ary Decision Problems

    6.3 Derivation of the Bayesian System

    6.4 Construction of a Decision System Subjected to Gaussian Random Noise

    6.5 Decision-Making in System Identification

    6.6 Decision-Making in System Identification with Gaussian Random Noise

    6.7 Numerical Examples of Application of Decision Concept to Averaging Devices

    6.8 Application of Decision Concept to Nondata Problems

    Chapter 7. Sequential Decision Approaches in Adaptive Control Systems

    7.1 Introductory Remarks

    7.2 An Average Risk of Sequential Decision Procedure

    7.3 Derivation of Bayes Solution

    7.4 Application of Sequential Decision-Making to Adaptive Control Systems

    7.5 Operating Characteristic Function (OC Function) and Average Sample Number Function (ASN Function)

    7.6 Average Amount of Observation Time

    7.7 Numerical Example

    7.8 Comparison of Sequential and Nonsequential Decision Procedures

    Chapter 8. Adaptive Adjustment of Parameters of Nonlinear Control Systems

    8.1 Introductory Remarks

    8.2 Application of Sequential Decision Rule

    8.3 On-Off Relay Decision Control Systems

    Chapter 9. Some Future Problems in Applications of Statistical Decision Theory to Control Processes

    9.1 Introductory Remarks

    9.2 Filtering Problems with Statistical Decision Theory

    9.3 Present Status and Future Problems

    Author Index

    Subject Index

Product details

  • No. of pages: 230
  • Language: English
  • Copyright: © Academic Press 1967
  • Published: January 1, 1967
  • Imprint: Academic Press
  • eBook ISBN: 9781483266770

About the Authors

Yoshikazu Sawaragi

Affiliations and Expertise

Department of Applied Mathematics and Physics, Kyoto University, Kyoto, Japan

Yoshifumi Sunahara

Affiliations and Expertise

Department of Applied Mathematics and Physics, Kyoto University, Kyoto, Japan

Takayoshi Nakamizo

Affiliations and Expertise

Department of Mechanical Engineering, Defense Academy of Japan, Yokosuka, Japan

About the Editor

Richard Bellman

Affiliations and Expertise

Departments of Mathematics, Electrical Engineering, and Medicine, University of Southern California, Los Angeles, California
