CONVERGENCE ANALYSIS OF CENTRAL AND MINIMAX ALGORITHMS IN SCALAR REGRESSOR MODELS

Bibliographic Details
Published in: IFAC Proceedings Volumes, Vol. 39, No. 1, pp. 594-599
Main Authors: Akçay, Hüseyin; At, Nuray
Format: Journal Article
Language: English
Published: 2006
ISSN: 1474-6670
DOI: 10.3182/20060329-3-AU-2901.00091

Summary: In this paper, estimation of a scalar parameter is considered with given lower and upper bounds on the scalar regressor. We derive non-asymptotic lower and upper bounds on the convergence rates of the parameter estimate variances for noise probability density functions characterized by a thin-tailed distribution. This extends previous work on constant scalar regressors to arbitrary scalar regressors with magnitude constraints. We expect our results to stimulate further research interest in the statistical analysis of these set-based estimators when the unknown parameter is multi-dimensional and the probability distribution function of the noise is more general than in the present setup.
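
Since this record contains only the abstract, the following is a minimal Monte Carlo sketch of the setup it describes, not the paper's own algorithm. It assumes the linear measurement model y_k = phi_k * theta + e_k, a scalar regressor drawn from a magnitude band [a, b], noise bounded by eps with a truncated-Gaussian (thin-tailed) density, and a "central" estimate taken as the Chebyshev center (midpoint) of the resulting membership interval. Every name and parameter value here (central_estimate, thin_tailed_noise, a, b, eps) is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(0)

def central_estimate(y, phi, eps):
    # Chebyshev center of the membership set {theta : |y_k - phi_k*theta| <= eps}.
    # ASSUMPTION: phi_k > 0, so each measurement confines theta to the interval
    # [(y_k - eps)/phi_k, (y_k + eps)/phi_k]; the membership set is the
    # intersection of these intervals, and the central estimate is its midpoint.
    lo = np.max((y - eps) / phi)
    hi = np.min((y + eps) / phi)
    return 0.5 * (lo + hi)

def thin_tailed_noise(n, eps):
    # One plausible thin-tailed density on [-eps, eps]: a Gaussian truncated by
    # rejection sampling (clipping would put point masses at the endpoints).
    e = rng.normal(0.0, eps / 3.0, size=2 * n)
    e = e[np.abs(e) <= eps]
    while e.size < n:
        extra = rng.normal(0.0, eps / 3.0, size=n)
        e = np.concatenate([e, extra[np.abs(extra) <= eps]])
    return e[:n]

theta, a, b, eps, trials = 1.0, 0.5, 2.0, 0.1, 2000
for N in (10, 100, 1000):
    est = np.empty(trials)
    for t in range(trials):
        # Regressor obeys the magnitude constraint 0 < a <= phi_k <= b.
        phi = rng.uniform(a, b, size=N)
        y = phi * theta + thin_tailed_noise(N, eps)
        est[t] = central_estimate(y, phi, eps)
    print(f"N={N:5d}  empirical variance of central estimate: {est.var():.3e}")

Under these assumptions the printed variances shrink as the sample size N grows; the paper's contribution is to bound exactly this decay non-asymptotically, from above and below, for thin-tailed noise densities.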