CONVERGENCE ANALYSIS OF CENTRAL AND MINIMAX ALGORITHMS IN SCALAR REGRESSOR MODELS
In this paper, estimation of a scalar parameter is considered with given lower and upper bounds on the scalar regressor. We derive non-asymptotic lower and upper bounds on the convergence rates of the parameter estimate variances for noise probability density functions characterized by a thin-tailed distribution.
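The set-based ("central") estimator the abstract alludes to can be illustrated with a minimal sketch. The simplifying assumptions here (a known hard noise bound `delta`, strictly positive regressors, and uniform noise) are ours, not the paper's: each observation y_k = phi_k * theta + n_k with |n_k| <= delta confines theta to an interval, and the central estimate is the midpoint of the intersection of those intervals.

```python
import random

def central_estimate(phis, ys, delta):
    """Midpoint ("central") estimate of a scalar parameter theta.

    Model (illustrative assumption): y_k = phi_k * theta + n_k with
    |n_k| <= delta and phi_k > 0. Each observation confines theta to
    [(y_k - delta) / phi_k, (y_k + delta) / phi_k]; the membership set
    is the intersection of these intervals.
    """
    lo = max((y - delta) / p for p, y in zip(phis, ys))
    hi = min((y + delta) / p for p, y in zip(phis, ys))
    return 0.5 * (lo + hi), (lo, hi)

random.seed(0)
theta = 2.0   # "unknown" parameter, fixed here for the demo
delta = 0.1   # assumed noise bound
# regressors constrained to known lower/upper bounds, echoing the abstract's setup
phis = [random.uniform(0.5, 1.5) for _ in range(200)]
ys = [p * theta + random.uniform(-delta, delta) for p in phis]

est, (lo, hi) = central_estimate(phis, ys, delta)
print(est, lo, hi)
```

With bounded noise the membership set always contains the true parameter, and the central estimate's error is at most half the set's width; the paper's contribution is the statistical (variance) analysis of such estimators when the noise is stochastic with a thin-tailed density.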
| Published in | IFAC Proceedings Volumes Vol. 39; no. 1; pp. 594 - 599 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | 2006 |
| Subjects | |
| ISSN | 1474-6670 |
| DOI | 10.3182/20060329-3-AU-2901.00091 |
| Summary: | In this paper, estimation of a scalar parameter is considered with given lower and upper bounds on the scalar regressor. We derive non-asymptotic lower and upper bounds on the convergence rates of the parameter estimate variances for noise probability density functions characterized by a thin-tailed distribution. This extends previous work for constant scalar regressors to arbitrary scalar regressors with magnitude constraints. We expect our results to stimulate further research interest in the statistical analysis of these set-based estimators when the unknown parameter is multi-dimensional and the probability distribution of the noise is more general than in the present setup. |
|---|---|