Probability Theory and Mathematical Statistics for Engineers - 1st Edition - ISBN: 9780080291482, 9781483190501

Probability Theory and Mathematical Statistics for Engineers

1st Edition

Authors: V. S. Pugachev
eBook ISBN: 9781483190501
Imprint: Pergamon
Published Date: 1st January 1984
Page Count: 468

Description

Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.

The book underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vectors and their distributions, including conditional distributions of projections of a random vector, conditional numerical characteristics, and information contained in random variables.

The book elaborates on the functions of random variables and estimation of parameters of distributions. Topics include frequency as a probability estimate, estimation of statistical characteristics, estimation of the expectation and covariance matrix of a random vector, and testing hypotheses on the parameters of distributions. The text then turns to estimator theory and the estimation of distributions.

The book is a valuable resource for students, engineers, and postgraduates in applied mathematics at institutes of higher technical education.

Table of Contents


1. Probabilities of Events

1.1. Random Phenomena

1.1.1. Examples of Random Phenomena

1.1.2. Nature of Random Phenomena

1.1.3. Mass Random Phenomena

1.1.4. Object of Probability Theory

1.2. Statistical Approach to the Description of Random Phenomena

1.2.1. Trial, Event, Random Variable

1.2.2. Frequency of an Event

1.2.3. Conditional Frequencies

1.2.4. Properties of Frequencies

1.2.5. Probability of an Event

1.2.6. Sample Mean

1.2.7. Sample Variance and Sample Mean Square Deviation

1.2.8. Least-Squares Method

1.2.9. Sample Covariance and Sample Correlation Coefficient

1.2.10. Histogram

1.2.11. Grouped Sample

1.3. Direct Evaluation of Probabilities

1.3.1. Equiprobable Outcomes of a Trial

1.3.2. Scheme of Chances

1.3.3. Geometrical Probabilities

1.3.4. Calculation of Conditional Probabilities in the Scheme of Chances

1.4. Operations with Events

1.4.1. Union of Two Events

1.4.2. Intersection of Two Events

1.4.3. Union and Intersection of Any Set of Events

1.4.4. Properties of Unions and Intersections

1.4.5. Complementary Events

1.4.6. Properties of Operations with Events

1.4.7. Elementary Events

1.5. Axioms of Probability Theory

1.5.1. Space of Elementary Events

1.5.2. Field of Events

1.5.3. Axioms

1.5.4. Probability as a Function of a Set-Measure

1.5.5. Probability Space

1.5.6. Properties of Probabilities

1.5.7. Complete Set of Events

1.6. Conditional Probabilities

1.6.1. Conditional Probability

1.6.2. Dependent and Independent Events

1.6.3. Multiplication Theorem of Probabilities for Independent Events

1.7. Probabilities of Complex Events

1.7.1. Formula of Total Probability

1.7.2. Bayes Formula

1.8. Repeated Trials

1.8.1. Case of Constant Conditions of Trials

1.8.2. Case of Variable Conditions of Trials

1.8.3. Probability of Appearance of an Event Not Less Than a Given Number of Times

1.8.4. Probability of at Least One Appearance of an Event

1.8.5. Case of Trials with Any Number of Events

1.9. Poisson Distribution

1.9.1. Flows of Events

1.9.2. Equation for Probability of Non-Appearance of Events

1.9.3. Equations for Probabilities of Different Numbers of Events

1.9.4. Solution of the Equations

1.9.5. Random Distribution of Points in a Space

1.9.6. Poisson Approximation to Binomial Distribution

2. Random Variables

2.1. General Definitions. Discrete Random Variables

2.1.1. Definition of a Random Variable

2.1.2. Scalar and Vector Random Variables

2.1.3. Distribution of a Random Variable

2.1.4. Discrete Random Variable

2.1.5. Distribution of a Discrete Random Variable

2.2. Continuous Random Variables. Density of a Random Variable

2.2.1. Density of a Random Variable

2.2.2. Continuous Random Variable

2.2.3. Probability of Occurrence of a Random Variable in a Given Domain

2.2.4. Properties of a Density

2.2.5. Random Variable as a Function of Elementary Event

2.3. Generalization of the Density Concept

2.3.1. Density of a Discrete Random Variable

2.3.2. Discrete-Continuous Random Variables

2.3.3. Discrete-Continuous Random Vectors

2.3.4. Singular Distributions

2.3.5. Probability of Occurrence of a Random Variable in a Domain

2.4. Distribution Function

2.4.1. Distribution Function and its Relation to Density

2.4.2. Properties of the Distribution Function of a Scalar Random Variable

2.4.3. Probability of Occurrence of a Scalar Random Variable in an Interval

2.4.4. Probability of Occurrence of a Random Vector in a Rectangle

2.4.5. Properties of the Distribution Function of a Random Vector

2.4.6. Dependent and Independent Random Variables

2.5. Entropy of a Distribution

2.5.1. Entropy as a Measure of Uncertainty of the Result of a Trial

2.5.2. Entropy of a Continuous Random Variable

2.5.3. Increase of Entropy Caused by Smoothing of a Density

2.5.4. Extremal Properties of Some Distributions

3. Numerical Characteristics of Random Variables

3.1. Expectation

3.1.1. Expectation of a Discrete Random Variable

3.1.2. General Definition of Expectation

3.1.3. Properties of Expectations

3.2. Characteristics of the Scatter

3.2.1. Variance and Mean Square Deviation

3.2.2. Covariance and Correlation Coefficient

3.2.3. Correlated and Uncorrelated Random Variables

3.2.4. First- and Second-Order Moments

3.3. Second-Order Moments of Random Vectors

3.3.1. Second-Order Moment, Covariance Matrix, Correlation Matrix

3.3.2. Mixed Second-Order Moment and Cross-Covariance Matrix

3.3.3. Second-Order Moment Operators

3.3.4. Properties of Second-Order Moments

3.3.5. Linear Functions of Random Vectors

3.4. Canonical Expansions of Random Vectors

3.4.1. Eigenvector Expansion

3.4.2. Calculation of Eigenvalues and Eigenvectors

3.4.3. Canonical Expansion

3.4.4. Various Forms of Canonical Expansion

3.4.5. The Simplest Way to Find a Canonical Expansion

3.4.6. Geometrical Interpretation of a Canonical Expansion

3.4.7. Construction of a Random Vector with a Given Covariance Matrix

3.5. Other Numerical Characteristics of Random Variables

3.5.1. Moments

3.5.2. Moments of Linear Functions of Random Variables

3.5.3. Quantiles

3.6. One-Dimensional Normal Distribution

3.6.1. The Coefficient Before the Exponential Function

3.6.2. Moments

3.6.3. Probability of Occurrence of a Random Variable in an Interval

3.6.4. Case of Symmetrical Interval

3.6.5. Quantiles

3.6.6. Entropy

4. Projections of Random Vectors and Their Distributions

4.1. Distributions of Projections of a Random Vector

4.1.1. Projections of a Vector

4.1.2. Distribution Function of a Projection of a Random Vector

4.1.3. Density of a Projection of a Random Vector

4.2. Conditional Distributions of Projections of a Random Vector

4.2.1. Conditional Density of a Projection of a Random Vector

4.2.2. Multiplication Theorem of Densities

4.2.3. Dependent and Independent Random Variables

4.2.4. Independent Random Variables are Uncorrelated

4.2.5. Independence of Functions of Independent Random Variables

4.2.6. Multiplication Theorem of Expectations

4.3. Conditional Numerical Characteristics

4.3.1. Conditional Expectation

4.3.2. Regression

4.3.3. Conditional Moments

4.3.4. Formula of Total Expectation

4.4. Characteristic Functions of Random Variables

4.4.1. Characteristic Functions as a Characterization of a Distribution

4.4.2. Properties of Characteristic Functions

4.4.3. Relations Between a Characteristic Function and Moments

4.4.4. Semi-Invariants

4.4.5. Order of Residuals in Expansions

4.4.6. Relations Between Semi-Invariants and Moments

4.4.7. Semi-Invariants of Linear Functions of Random Variables

4.5. Multi-Dimensional Normal Distribution

4.5.1. Expectation of a Normally Distributed Random Vector

4.5.2. Covariance Matrix

4.5.3. Coefficient in Front of the Exponential Function

4.5.4. Conditional Distributions of Components

4.5.5. The Case of Uncorrelated Components

4.5.6. Singular Normal Distribution

4.5.7. Characteristic Function

4.5.8. Linear Functions of Normally Distributed Random Variables

4.5.9. Moments

4.5.10. Entropy

4.6. Information Contained in Random Variables

4.6.1. Mean Conditional Entropy

4.6.2. Addition Theorem of Entropies

4.6.3. Information About a Random Variable Contained in Another Random Variable

5. Functions of Random Variables

5.1. Moments of Functions of Random Variables

5.1.1. Exact Formulae for the First and Second Moments

5.1.2. Linearization Method

5.2. Distribution Function of a Function of a Random Variable

5.2.1. General Principle of Finding Distributions of Functions of Random Variables

5.2.2. Finding the Distribution Functions

5.2.3. Transformation of a Random Vector Yielding a Vector with Independent Components

5.3. Density of a Function of a Random Variable

5.3.1. Method of Comparison of Probabilities

5.3.2. Method of Comparison of Probability Elements

5.3.3. Method of Delta-Functions

5.3.4. Method of Characteristic Functions

5.3.5. Method of Moments

5.4. Limit Theorems

5.4.1. The Simplest Limit Theorem

5.4.2. Importance of Limit Theorems

5.5. Information Contained in Transformed Random Variables

5.5.1. Information in Functions of Random Variables

5.5.2. No Transformation of a Random Variable can Increase the Amount of Information

5.5.3. Sufficient Transforms

6. Estimation of Parameters of Distributions

6.1. Main Problems of Mathematical Statistics

6.1.1. Determination of Statistical Characteristics from Trials

6.1.2. Modes of Probabilistic Convergence

6.1.3. Chebyshev Inequality. Relationships Between Various Modes of Convergence

6.2. Estimation of Statistical Characteristics

6.2.1. Estimates and Estimators

6.2.2. Sufficient Statistics

6.2.3. Confidence Intervals and Confidence Regions

6.2.4. Methods for Determining Confidence Regions

6.3. Frequency as a Probability Estimate

6.3.1. Consistency

6.3.2. Confidence Intervals

6.3.3. Approximate Determination of Confidence Intervals

6.4. Estimation of the Expectation and Variance of a Random Variable

6.4.1. Estimation of an Expectation

6.4.2. Estimation of a Variance

6.4.3. Confidence Intervals for an Expectation

6.4.4. Confidence Intervals for a Variance

6.4.5. Confidence Regions for an Expectation and Variance

6.4.6. Estimation of Moments

6.5. Estimation of the Expectation and Covariance Matrix of a Random Vector

6.5.1. Estimation of a Covariance and Correlation Coefficient

6.5.2. Estimation of an Expectation and Covariance Matrix

6.5.3. Confidence Regions for an Expectation

6.5.4. Distribution of a Sample Correlation Coefficient

6.5.5. Confidence Intervals for a Correlation Coefficient

6.5.6. Confidence Regions for a Covariance Matrix

6.6. Testing Hypotheses About Parameters of Distributions

6.6.1. Problems of Testing Hypotheses

6.6.2. Testing Hypotheses About a Parameter Value

6.6.3. Testing Hypotheses About the Coincidence of Parameter Values

6.6.4. Elimination of Anomalous Observations (Outliers)

7. Estimator Theory

7.1. General Properties of Estimators

7.1.1. Some Relations

7.1.2. Lower Dispersion Bound of the Estimate of a Scalar Parameter

7.1.3. Efficient Estimator of a Scalar Parameter

7.1.4. Lower Dispersion Bound of the Estimate of a Vector Parameter

7.1.5. Efficient Estimator of a Vector Parameter

7.1.6. Lower Bounds of Variances of the Components of a Vector Parameter Estimate

7.1.7. Sufficiency of an Efficient Estimator

7.1.8. The Case of Independent Trials

7.1.9. The Case of a Discrete Observed Random Variable

7.2. Main Methods for Finding Estimators

7.2.1. Maximum-Likelihood Method

7.2.2. A Property of the Maximum-Likelihood Method

7.2.3. Moments Method

7.3. Recursive Estimation of the Root of a Regression Equation

7.3.1. Recursive Estimation of an Expectation

7.3.2. Stochastic Approximations Process

7.3.3. Convergence of the Stochastic Approximations Process

7.4. Recursive Estimation of the Extremum Point of a Regression

7.4.1. Stochastic Approximations Process

7.4.2. Convergence of the Stochastic Approximations Process

7.4.3. Random-Search Method

8. Estimation of Distributions

8.1. Estimators of Densities and Distribution Functions

8.1.1. Parametric and Non-Parametric Estimation of Distributions

8.1.2. Estimation of a Density by a Histogram

8.1.3. Confidence Regions for a Density

8.1.4. Estimation of a Distribution Function

8.1.5. Confidence Regions for a Distribution Function

8.1.6. Other Estimators of a Density

8.2. Approximate Representation of Distributions

8.2.1. Pearson Curves System

8.2.2. Orthogonal Expansions of Densities

8.2.3. Hermitian Polynomial Expansion of a Density

8.2.4. Hermitian Polynomial Expansion of Multi-Dimensional Densities

8.2.5. Edgeworth Series

8.2.6. Representation of a Density by a Linear Combination of Given Functions

8.3. Testing Hypotheses About Distributions

8.3.1. Problems of Testing Hypotheses

8.3.2. χ2-Test

8.3.3. Deduction of Limit χ2-Distribution

8.3.4. Estimation of Distribution Parameters by χ2-Minimum Method

8.3.5. Other Methods for Testing Hypotheses About Distributions

8.3.6. Testing Hypotheses About Independence of Random Variables

8.3.7. Testing Hypotheses About Coincidence of Distributions

8.4. Statistical Simulation Methods

8.4.1. Problems of Statistical Simulation

8.4.2. Simulation of Random Variables

8.4.3. Simulation of Events

8.4.4. Practical Applications of the Method

8.4.5. Accuracy of the Method

8.4.6. Solution of Probability Problems

8.4.7. Evaluation of Integrals

9. Statistical Models, I

9.1. Mathematical Models

9.1.1. Theoretical and Statistical Models

9.1.2. Deterministic and Stochastic Models

9.1.3. Role of Mathematical Models

9.2. Regression Models

9.2.1. Regression as an Estimator of the Dependence of a Random Variable on Another Variable

9.2.3. Optimal Estimators

9.2.4. Necessary and Sufficient Condition of Optimality

9.2.5. Linear Regression Models

9.2.6. Solution of Equations Determining a Linear Regression

9.2.7. Deterministic and Stochastic Regression Models

9.3. Estimation of Regressions

9.3.1. Estimation of the Coefficient Matrix of a Linear Regression

9.3.2. Statistical Properties of the Estimator

9.3.3. Estimator of the Covariance Matrix of the Observed Random Variable

9.3.4. Statistical Properties of the Estimators of Regression Values

9.3.5. Estimation of a Non-Linear Regression

9.3.6. Case of a Linear Regression and Normal Distribution

9.3.7. Choice of the Values of the Independent Variable

9.3.8. Confidence Regions for a Regression

9.3.9. Estimation of a Shifted Linear Regression

9.3.10. Estimation of the Regression of a Random Variable on Another Random Variable

9.4. Testing Hypotheses About Regressions

9.4.1. Statistics for Testing Hypotheses About Equality to Zero of Regression Coefficients

9.4.2. An Auxiliary Relation

9.4.3. Testing Hypotheses in the Case of a Scalar Observed Random Variable

9.4.4. Testing Hypotheses in the Case of a Vector Observed Random Variable

9.4.5. Testing Hypotheses About Linearity of a Regression

9.4.6. Choice of the Type of a Regression Model

9.5. Analysis of Variance

9.5.1. Qualitative Variables-Factors

9.5.2. Complete Two-Factor Design of Experiments

9.5.3. Reduction of the Problem to the Estimation of a Linear Regression

9.5.4. Incomplete Designs of Experiments

10. Statistical Models, II

10.1. Models Described by Difference Equations

10.1.1. Autoregression Models

10.1.2. Linear Models

10.1.3. Reduction of a Linear Model to an Autoregression Model of the First Order

10.1.4. Non-Linear Models

10.2. Estimation of Random Variables Determined by Difference Equations

10.2.1. Non-Linear Models in the General Case

10.2.2. Non-Linear Autoregression Models

10.2.3. Linear Autoregression Models

10.2.4. Kalman Filters

10.2.5. Innovation Sequences

10.2.6. Approximate Solution of Non-Linear Estimation Problems

10.2.7. Estimation of Unknown Parameters in Difference Equations

10.3. Factor Models

10.3.1. Problems of Factor Analysis

10.3.2. Method of Main Components

10.3.3. Centroid Method

10.3.4. Rotation of Factors

10.3.5. Using the Method of Section 3.4.5.

10.3.6. Estimation of Factors

10.4. Recognition Models

10.4.1. Mathematical Statement of Recognition Problems

10.4.2. Deterministic Recognition Models

10.4.3. Stochastic Recognition Models

10.4.4. Teaching of Recognition Models

10.4.5. Design of Recognition Models Without Knowledge of Prior Probabilities

10.4.6. Testing Hypotheses

10.4.7. Sequential Recognition Models

10.5. Decision-Making Models

10.5.1. Decision-Making Problems

10.5.2. Risk and Loss Function

10.5.3. Optimal Decisions

10.5.4. Optimal Decisions in a Given Class of Functions

10.5.5. Decision-Making Under Uncertainty

10.5.6. Teaching of Decision-Making Models

Appendices

1. Impulse Delta-Function and its Derivatives

2. Some Definite Integrals

3. Tables

Table 1. Laplace Function Φ(u) = (2π)^{−1/2} ∫_0^u e^{−t²/2} dt
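As a quick numerical sanity check (not part of the book), the Laplace function Φ(u) = (2π)^{−1/2} ∫_0^u e^{−t²/2} dt can be evaluated through the error function, since Φ(u) = erf(u/√2)/2. A minimal illustrative sketch (the name `laplace_phi` is our own):

```python
import math

def laplace_phi(u: float) -> float:
    """Laplace function: integral of the standard normal density from 0 to u.

    Uses the identity Phi(u) = erf(u / sqrt(2)) / 2.
    """
    return math.erf(u / math.sqrt(2.0)) / 2.0

# The familiar two-sided 95% interval: P(0 < Z < 1.96) is about 0.475
print(round(laplace_phi(1.96), 3))  # ≈ 0.475
```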

Table 2. Derivatives of the Laplace Function

Table 3. Two-Sided Boundaries of the T-Distribution: The Values of tα Determined by the Equation ∫_{−tα}^{tα} Sk(t) dt = α

Table 4. Two-Sided Boundaries of the χ²-Distribution: The Values of εα Determined by the Equation ∫_{k/(1+εα)²}^{k/(1−εα)²} pk(z) dz = α

Table 5. Kolmogorov Limit Distribution Function K(u) = Σ_{v=−∞}^{∞} (−1)^v e^{−2v²u²}
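For illustration (not from the book), the Kolmogorov limit distribution K(u) = Σ_{v=−∞}^{∞} (−1)^v e^{−2v²u²} can be evaluated by truncating the rapidly converging series; a minimal sketch with an assumed truncation of 100 terms each side:

```python
import math

def kolmogorov_K(u: float, terms: int = 100) -> float:
    """Truncated series for K(u) = sum over v of (-1)^v * exp(-2 v^2 u^2).

    K(u) -> 0 as u -> 0+ and K(u) -> 1 as u -> infinity.
    """
    if u <= 0.0:
        return 0.0
    return sum((-1) ** v * math.exp(-2.0 * v * v * u * u)
               for v in range(-terms, terms + 1))

# The classical 5% critical value of the Kolmogorov test is near u ≈ 1.358
print(round(kolmogorov_K(1.358), 2))  # ≈ 0.95
```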

Table 6. Upper 100(1−α)-Percentage Points of the χ²-Distribution: The Values of χ²α Determined by the Equation P(χ²α) = ∫_0^{χ²α} pk(z) dz = α

Table 7. Upper 5-Percentage and 1-Percentage Points of the F-Distribution: The Values of fα Determined by the Equation Flk(fα) = ∫_0^{fα} flk(f) df = α

Main Notations

References

Index

Details

No. of pages:
468
Language:
English
Copyright:
© Pergamon 1984
Published:
1st January 1984
Imprint:
Pergamon
eBook ISBN:
9781483190501

About the Author

V. S. Pugachev