Basic Chemometric Techniques in Atomic Spectroscopy (2nd Edition)

This is the first book for atomic spectroscopists to present the basic principles of experimental designs, optimization and multivariate regression. Multivariate regression is a valuable statistical method for handling complex problems (such as spectral and chemical interferences) which arise during...

Bibliographic Details
Main Author: Andrade-Garda, José M.
Format: eBook
Language: English
Published: Cambridge: Royal Society of Chemistry, 2013
Edition: 2nd
Series: RSC Analytical Spectroscopy Monographs
ISBN: 1849737967; 9781849737968

Table of Contents:
  • Title Page -- Structure of the Book -- Preface -- Table of Contents -- 1. An Overview of Atomic Spectrometric Techniques -- 2. Classical Linear Regression by the Least Squares Method -- 3. Implementing a Robust Methodology: Experimental Designs and Optimisation -- 4. Ordinary Multiple Linear Regression and Principal Components Regression -- 5. Partial Least-Squares Regression -- 6. Multivariate Regression Using Artificial Neural Networks and Support Vector Machines -- Subject Index
  • Basic Chemometric Techniques in Atomic Spectroscopy -- Contents -- List of Contributors -- Chapter 1 An Overview of Atomic Spectrometric Techniques -- 1.1 Introduction: Basis of Analytical Atomic Spectrometric Techniques -- 1.2 Atomic Optical Spectrometry -- 1.2.1 Classification of Techniques: Absorption, Emission and Fluorescence -- 1.2.2 A Comparative View of Basic Instrumentation -- 1.2.3 Comparative Analytical Performance Characteristics and Interferences -- 1.3 Atomic Mass Spectrometry -- 1.3.1 Fundamentals and Basic Instrumentation of Inductively Coupled Plasma-Mass Spectrometry -- 1.3.2 Analytical Performance Characteristics and Interferences in ICP-MS -- 1.3.3 Isotope Ratio Measurements and Their Applications -- 1.4 Flow Systems with Atomic Spectrometry Detection -- 1.4.1 Flow Injection Analysis and Atomic Spectrometry -- 1.4.2 Chromatographic Separations Coupled On-line to Atomic Spectrometry -- 1.4.3 Detection of Fast Transient Signals -- 1.5 Direct Analysis of Solids by Spectrometric Techniques -- 1.5.1 Direct Elemental Analysis by Optical Spectrometry -- 1.5.2 Direct Elemental Analysis by Mass Spectrometry -- 1.6 Quality Control and Troubleshooting -- References -- Chapter 2 Classical Linear Regression by the Least Squares Method -- 2.1 Introduction -- 2.1.1 Defining Calibration -- 2.2 The Least Squares Criterion -- 2.2.1 Basic Assumptions of the Least Squares Method -- 2.3 The Least Squares Fit -- 2.3.1 Planning Standardization -- 2.3.2 Setting the Working Range -- 2.4 Validation of the Least Squares Fit -- 2.4.1 The Correlation Coefficient -- 2.4.2 Residuals and Outlying Data -- 2.4.3 Tests for Equal Variances -- 2.4.4 Tests for Linearity -- 2.5 Estimating Unknown Concentrations -- 2.6 The Standard Additions Method (SAM) -- 2.6.1 Description of the Method: Advantages and Disadvantages -- 2.6.2 Interpolation or Extrapolation
  • 2.7 Practical Application -- 2.7.1 Experimental Design -- 2.7.2 Effect on the Predictions and Confidence Intervals -- 2.7.3 Comparison of the Predictions and Confidence Intervals Obtained with Interpolation and Extrapolation -- 2.8 Polynomial Regression -- 2.8.1 Over-fitting -- 2.8.2 Shape of the Polynomial -- 2.9 Appendix 1. Mandel's Test to Check for Linearity -- Comparing the Two Definitions -- 2.10 Appendix 2. Comparison of Two Regression Lines -- Another Possibility -- Acknowledgements -- References -- Chapter 3 Implementing a Robust Methodology: Experimental Designs and Optimisation -- 3.1 Basics of Experimental Design -- 3.1.1 Objectives and Strategies -- 3.1.2 Variables and Responses: Factors, Levels, Effects and Interactions -- 3.2 Analysis of Experimental Designs -- 3.2.1 Factorial Designs -- 3.2.2 2^f Factorial Designs -- 3.2.3 Algorithms: BH2 and Yates -- 3.2.4 Graphical and Statistical Analysis -- 3.2.5 Blocking Experiments -- 3.2.6 Confounding: Fractional Factorial Designs -- 3.2.7 Saturated Designs: Plackett-Burman Designs. Use in Screening and Robustness Studies -- 3.3 Taguchi's Approach to Experimental Design -- 3.3.1 Strategies for Robust Designs -- 3.3.2 Planning Experiments: Orthogonal Arrays -- 3.3.3 Robust Parameter Design: Reducing Variation -- 3.3.4 Worked Example -- 3.4 Optimisation -- 3.4.1 Experimental Optimisation -- 3.4.2 The Simplex Method -- 3.4.3 The Modified Simplex Method -- 3.4.4 Response Surface Designs -- 3.5 Examples of Practical Applications -- References -- Chapter 4 Ordinary Multiple Linear Regression and Principal Components Regression -- 4.1 Introduction -- 4.1.1 Multivariate Calibration in Quantitative Analysis -- 4.1.2 Notation -- 4.2 Basics of Multivariate Regression -- 4.2.1 The Multiple Linear Regression Model -- 4.2.2 Estimation of the Model Coefficients -- 4.2.3 Prediction
  • 4.2.4 The Collinearity Problem in Multivariate Regression -- 4.3 Multivariate Direct Models -- 4.3.1 Classical Least Squares -- 4.4 Multivariate Inverse Models -- 4.4.1 Inverse Least Squares -- 4.4.2 Principal Components Regression -- 4.5 Practical Applications and Comparative Example -- 4.6 Appendix -- References -- Chapter 5 Partial Least-Squares Regression -- 5.1 A Graphical Approach to the Basic PLS Algorithm -- 5.2 Sample Sets -- 5.3 Data Pretreatment -- 5.3.1 Baseline Correction -- 5.3.2 Smoothing -- 5.3.3 Mean Centring and Autoscaling -- 5.3.4 Derivatives -- 5.4 Dimensionality of the Model -- 5.4.1 Crossvalidation -- 5.4.2 Other Approaches -- 5.5 Diagnostics -- 5.5.1 t vs. t Plots -- 5.5.2 t vs. u Plots -- 5.5.3 The T^2, h and Q Statistics -- 5.5.4 Studentized Concentration Residuals -- 5.5.5 Predicted vs. Reference Plot -- 5.6 Validation -- 5.7 Multivariate Figures of Merit -- 5.7.1 Accuracy (Trueness and Precision) -- 5.7.2 Limit of Detection -- 5.7.3 Limit of Quantification -- 5.7.4 Sensitivity -- 5.7.5 Selectivity -- 5.7.6 Sample-specific Standard Error of Prediction -- 5.8 Chemical Interpretation of the Model -- 5.9 Examples of Practical Applications -- 5.9.1 Flame and Electrothermal Atomic Spectrometry (FAAS and ETAAS) -- 5.9.2 Inductively Coupled Plasma Optical Emission Spectrometry (ICP-OES) -- 5.9.3 Inductively Coupled Plasma Mass Spectrometry (ICP-MS) -- 5.9.4 Laser-Induced Breakdown Spectrometry (LIBS) -- 5.9.5 Direct Analysis of Solids -- References -- Chapter 6 Multivariate Regression Using Artificial Neural Networks and Support Vector Machines -- 6.1 Introduction -- 6.2 Neurons and Artificial Neural Networks -- 6.3 Basic Elements of the Neuron -- 6.3.1 Input Function -- 6.3.2 Activation and Transfer Function -- 6.3.3 Output Function -- 6.3.4 Raw Data Preprocessing -- 6.4 Training an Artificial Neural Network
  • 6.4.1 Learning Mechanisms -- 6.4.2 Evolution of the Weights -- 6.5 Error Back-Propagation Artificial Neural Networks -- 6.6 When to Stop Learning -- 6.7 Validating the Artificial Neural Network -- 6.8 Limitations of the Artificial Neural Networks -- 6.9 Relationships with other Regression Methods -- 6.10 Worked Example -- 6.11 Support Vector Machines -- 6.11.1 Support Vector Machines for Classification -- 6.11.2 Support Vector Machines for Regression -- 6.11.3 Worked Example -- 6.12 Examples of Practical Applications -- 6.12.1 Flame Atomic Absorption and Atomic Emission Spectrometry (FAAS and FAES) -- 6.12.2 Electrothermal Atomic Absorption Spectrometry (ETAAS) -- 6.12.3 Inductively Coupled Plasma Optical Emission Spectrometry (ICP-OES) -- 6.12.4 Inductively Coupled Plasma Mass Spectrometry (ICP-MS) -- 6.12.5 X-Ray Fluorescence (XRF) -- 6.12.6 Laser-Induced Breakdown Spectrometry (LIBS) -- 6.12.7 Secondary Ion Mass Spectrometry (SIMS) -- 6.12.8 Applications of SVM -- References -- Subject Index