Introduction to regression analysis
During the past fifty years, regression analysis has been one of the most widely used statistical methodologies for analyzing relationships among variables. Due to its flexibility, usefulness, applicability, and theoretical and technical succinctness, it has become a basic statistical tool for solving pr...
Main Authors | , |
Format | eBook; Book
Language | English
Published | Southampton : WIT Press ; MIT Press, 2003
Edition | 1
Subjects | |
ISBN | 1853126241; 9781853126246
Table of Contents:
- Introduction to regression analysis -- Contents -- Preface -- Dedication -- 1. Introduction -- 2. Some Basic Results in Probability and Statistics -- 3. Simple Linear Regression -- 4. Random Vectors and Matrix Algebra -- 5. Multiple Regression -- 6. Residuals, Diagnostics and Transformations -- 7. Further Applications of Regression Techniques -- 8. Selection of a Regression Model -- 9. Multicollinearity: Diagnosis and Remedies -- Appendix -- Bibliography -- Index.
- Cover -- Introduction to Regression Analysis -- Copyright Page -- Dedication -- Contents -- Preface -- 1. Introduction -- 1.1 A Brief History of Regression -- 1.2 Typical Applications of Regression Analysis -- 1.3 Computer Usage -- 2. Some Basic Results in Probability and Statistics -- 2.1 Introduction -- 2.2 Probability Spaces -- 2.3 Random Variables -- 2.4 The Probability Distribution of X -- 2.5 Some Random Variables and their Distributions -- 2.6 Joint Probability Distributions -- 2.7 Expectation -- 2.8 The Normal and Related Random Variables -- 2.9 Statistical Estimation -- 2.10 Properties of Estimators -- 2.11 Confidence Intervals -- 2.12 Hypothesis Testing -- 2.13 Hypothesis Testing and Confidence Intervals -- 2.14 Exercises -- 3. Simple Linear Regression -- 3.1 Introduction -- 3.2 The Error Model -- 3.3 Estimating σ2 -- 3.4 Properties of (β0, β1, S2) -- 3.5 The Gauss-Markov Theorem -- 3.6 Confidence Intervals for (β0, β1) -- 3.7 Hypothesis Tests for (β0, β1) -- 3.8 The ANOVA Approach to Testing -- 3.9 Assessing Model Validity -- 3.10 Transformations -- 3.11 Exercises -- 4. Random Vectors and Matrix Algebra -- 4.1 Introduction -- 4.2 Matrices and Vectors -- 4.3 Fundamentals of Matrix Algebra -- 4.4 Matrices and Linear Transformations -- 4.5 The Geometry of Vectors -- 4.6 Orthogonal Matrices -- 4.7 The Multivariate Normal Distribution -- 4.8 Solving Systems of Equations -- 4.9 The Singular Value Decomposition -- 4.10 Exercises -- 5. Multiple Regression -- 5.1 Introduction -- 5.2 The General Linear Model -- 5.3 Least Squares Estimation -- 5.4 Properties of (β, S2, ε) -- 5.5 The Gauss-Markov Theorem -- 5.6 Testing the Fit - the Basic ANOVA Table -- 5.7 Confidence Intervals and t-Tests for the Coefficients -- 5.8 The Extra Sum of Squares Principle -- 5.9 Prediction -- 5.10 Exercises -- 6. Residuals, Diagnostics and Transformations
- 6.1 Introduction -- 6.2 Residuals -- 6.3 Residual Plots -- 6.4 PRESS Residuals -- 6.5 Transformations -- 6.6 Correlated Errors -- 6.7 Generalized Least Squares -- 6.8 Exercises -- 7. Further Applications of Regression Techniques -- 7.1 Introduction -- 7.2 Polynomial Models in One Variable -- 7.3 Radial Basis Functions -- 7.4 Dummy Variables -- 7.5 Interactions -- 7.6 Logistic Regression Revisited -- 7.7 The Generalized Linear Model -- 7.8 Exercises -- 8. Selection of a Regression Model -- 8.1 Introduction -- 8.2 Consequences of Model Misspecification -- 8.3 Criteria Functions -- 8.4 Various Methods for Model Selection -- 8.5 Exercises -- 9. Multicollinearity: Diagnosis and Remedies -- 9.1 Introduction -- 9.2 Detecting Multicollinearity -- 9.3 Other Multicollinearity Diagnostics -- 9.4 Combatting Multicollinearity -- 9.5 Biased Estimation -- 9.6 Other Alternatives to OLS -- 9.7 Exercises -- Appendix -- Bibliography -- Index