A Gauss-Markov Theorem for Infinite Dimensional Regression Models with Possibly Singular Covariance
| Published in | SIAM Journal on Applied Mathematics Vol. 37; no. 2; pp. 257–260 | 
|---|---|
| Main Author | |
| Format | Journal Article | 
| Language | English | 
| Published | Philadelphia: Society for Industrial and Applied Mathematics, 01.10.1979 | 
| Subjects | |
| Online Access | Get full text | 
| ISSN | 0036-1399 1095-712X  | 
| DOI | 10.1137/0137016 | 
| Summary: | Consider the linear model $z = Bx + v$, where $x$ is a vector, $B$ a linear operator, and $v$ is a random variable with mean 0 and covariance $V^2 = \operatorname{cov}(\mathbf{v}, \mathbf{v})$. A linear functional $(g, \cdot)$ is called a best linear unbiased estimator for $c$ if (i) $B^* g = c$, and (ii) $|Vg|^2$ is as small as possible, subject to (i). The classical Gauss-Markov theorem gives a formula for the best linear unbiased estimator in finite dimensions, under the condition that $V^2$ is invertible. We extend the Gauss-Markov theorem to Hilbert space without the condition that the covariance is invertible. | 
|---|---|
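The classical finite-dimensional theorem mentioned in the summary can be illustrated numerically. The following NumPy sketch is an assumed illustration (not code or notation from the paper itself): with an invertible covariance $V^2$, the generalized least squares formula $g = V^{-2} B (B^{*} V^{-2} B)^{-1} c$ gives the best linear unbiased estimator, and both defining conditions can be checked directly.

```python
import numpy as np

# Classical Gauss-Markov setting: z = B x + v, cov(v, v) = V^2 invertible.
# The BLUE of (c, x) is (g, z) with g = V^{-2} B (B^T V^{-2} B)^{-1} c.
# (All dimensions and data below are arbitrary, for illustration only.)

rng = np.random.default_rng(0)

n, p = 6, 3                      # observations, parameters
B = rng.standard_normal((n, p))  # design operator (full column rank a.s.)
c = rng.standard_normal(p)       # linear functional of x to estimate

# A positive definite covariance V^2 (hence invertible).
A = rng.standard_normal((n, n))
V2 = A @ A.T + n * np.eye(n)

V2inv = np.linalg.inv(V2)
g = V2inv @ B @ np.linalg.solve(B.T @ V2inv @ B, c)

# Condition (i): unbiasedness, B^* g = c.
assert np.allclose(B.T @ g, c)

# Condition (ii): |Vg|^2 = g^T V^2 g is minimal among all unbiased h.
# Any competitor h = g + d with B^T d = 0 is also unbiased; its variance
# satisfies h^T V^2 h >= g^T V^2 g, since d^T V^2 g = d^T B (...) c = 0.
d = (np.eye(n) - B @ np.linalg.pinv(B)) @ rng.standard_normal(n)
h = g + d
assert np.allclose(B.T @ h, c)
assert g @ V2 @ g <= h @ V2 @ h + 1e-12
```

The optimality check exploits that the cross term $d^{\top} V^2 g$ vanishes whenever $B^{\top} d = 0$, which is exactly why the GLS choice of $g$ minimizes the variance; the paper's contribution is extending this picture to Hilbert space without assuming $V^2$ is invertible.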