Convergence analysis of central and minimax algorithms in scalar regressor models

Bibliographic Details
Published in: Mathematics of Control, Signals, and Systems, Vol. 18, No. 1, pp. 66-99
Main Authors: Hüseyin Akçay, Nuray At
Format: Journal Article
Language: English
Published: London: Springer Nature B.V., 01.02.2006
ISSN: 0932-4194, 1435-568X
DOI: 10.1007/s00498-005-0162-7

Summary: In this paper, the estimation of a scalar parameter is considered under given lower and upper bounds on the scalar regressor. We derive non-asymptotic lower and upper bounds on the convergence rates of the parameter estimate variances of the central and the minimax algorithms for noise probability density functions characterized by a thin-tailed distribution. This extends previous work on constant scalar regressors to arbitrary scalar regressors with magnitude constraints. We expect our results to stimulate further research interest in the statistical analysis of these set-based estimators when the unknown parameter is multi-dimensional and the probability distribution of the noise is more general than in the present setup.
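The abstract does not reproduce the algorithms themselves, but the standard set-membership setup such estimators build on can be sketched. Assuming the scalar model y_k = φ_k·θ + v_k with bounded noise |v_k| ≤ ε and a positive, magnitude-constrained regressor φ_k, each measurement confines θ to an interval, and the central estimate is the midpoint of the intersection of those intervals. The model form, bound values, and variable names below are illustrative assumptions, not taken from the paper:

```python
import random

def feasible_interval(ys, phis, eps):
    # Each sample y = phi * theta + v with |v| <= eps confines theta to
    # [(y - eps) / phi, (y + eps) / phi] when phi > 0; intersect them all.
    lo = max((y - eps) / p for y, p in zip(ys, phis))
    hi = min((y + eps) / p for y, p in zip(ys, phis))
    return lo, hi

def central_estimate(ys, phis, eps):
    # Central (Chebyshev-center) estimate: midpoint of the feasible interval.
    lo, hi = feasible_interval(ys, phis, eps)
    return 0.5 * (lo + hi)

# Illustrative simulation: true parameter theta, noise bound eps,
# regressor magnitudes constrained to [0.5, 2.0] (assumed values).
random.seed(0)
theta, eps = 1.5, 0.1
phis = [random.uniform(0.5, 2.0) for _ in range(200)]
ys = [p * theta + random.uniform(-eps, eps) for p in phis]
est = central_estimate(ys, phis, eps)
```

Because the noise respects its bound, the true parameter always lies in the feasible interval, so the central estimate's worst-case error is half the interval width, at most ε divided by the smallest regressor magnitude. The paper's contribution concerns how fast the variance of such estimates shrinks with the number of samples.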