A Gauss-Markov Theorem for Infinite Dimensional Regression Models with Possibly Singular Covariance

Bibliographic Details
Published in: SIAM Journal on Applied Mathematics, Vol. 37, No. 2, pp. 257-260
Main Author: Morley, T. D.
Format: Journal Article
Language: English
Published: Philadelphia: Society for Industrial and Applied Mathematics, 01.10.1979
ISSN: 0036-1399; 1095-712X
DOI: 10.1137/0137016

Summary: Consider the linear model $z = Bx + v$, where $x$ is a vector, $B$ a linear operator, and $v$ is a random variable with mean $0$ and covariance $V^2 = \operatorname{cov}(v, v)$. A linear functional $(g, \cdot)$ is called a best linear unbiased estimator for $c$ if (i) $B^* g = c$, and (ii) $|Vg|^2$ is as small as possible, subject to (i). The classical Gauss-Markov theorem gives a formula for the best linear unbiased estimator in finite dimensions, under the condition that $V^2$ is invertible. We extend the Gauss-Markov theorem to Hilbert space without the condition that the covariance is invertible.
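The constrained minimization stated in the summary can be sketched numerically in finite dimensions. The sketch below is illustrative only and is not the paper's Hilbert-space construction: all data ($B$, $V^2$, $c$) are made up, and a least-squares solve of the KKT system stands in for the operator-theoretic argument. The point it shows is that a minimizer of $g^\top V^2 g$ subject to $B^\top g = c$ can still be computed when $V^2$ is singular.

```python
import numpy as np

# Finite-dimensional illustration (hypothetical data, not from the paper):
# model z = B x + v with cov(v) = V2, where V2 is allowed to be singular.
# A best linear unbiased estimator for c is a g minimizing g' V2 g
# subject to the unbiasedness constraint B' g = c.  The KKT conditions
# for this convex quadratic program give the linear system
#   [ V2  B ] [ g      ]   [ 0 ]
#   [ B'  0 ] [ lambda ] = [ c ]
# which we solve with a least-squares solve so a singular V2 is handled.

rng = np.random.default_rng(0)
n, p = 5, 2
B = rng.standard_normal((n, p))
c = np.array([1.0, 0.0])          # estimate the first coordinate of x

# Rank-deficient covariance: V2 = L L' with rank 3 < n.
L = rng.standard_normal((n, 3))
V2 = L @ L.T

# Assemble and solve the KKT system.
K = np.block([[V2, B], [B.T, np.zeros((p, p))]])
rhs = np.concatenate([np.zeros(n), c])
sol, *_ = np.linalg.lstsq(K, rhs, rcond=None)
g = sol[:n]

# The unbiasedness constraint B' g = c holds at the solution.
print(np.allclose(B.T @ g, c))    # True
```

Because the constraints are affine and the objective is convex, the KKT system is consistent whenever the constraint $B^\top g = c$ is feasible, so `lstsq` recovers an exact (minimum-norm) solution even when the KKT matrix itself is singular.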