Pattern Recognition and Classification: An Introduction

This volume, both comprehensive and accessible, introduces the key concepts of pattern recognition and includes many examples and exercises, making it an ideal guide to a methodology widely deployed in today's automated systems.

Bibliographic Details
Main Author: Dougherty, Geoff
Format: eBook
Language: English
Published: New York, NY: Springer, 2012
Edition: 1
ISBN: 1461453224; 9781461453222
DOI: 10.1007/978-1-4614-5323-9


Table of Contents:
  • Intro -- Pattern Recognition and Classification -- Preface -- Acknowledgments -- Contents
  • Chapter 1: Introduction -- 1.1 Overview -- 1.2 Classification -- 1.3 Organization of the Book -- 1.4 Exercises -- References
  • Chapter 2: Classification -- 2.1 The Classification Process -- 2.2 Features -- 2.3 Training and Learning -- 2.4 Supervised Learning and Algorithm Selection -- 2.5 Approaches to Classification -- 2.6 Examples -- 2.6.1 Classification by Shape -- 2.6.2 Classification by Size -- 2.6.3 More Examples -- 2.6.4 Classification of Letters -- 2.7 Exercises -- References
  • Chapter 3: Nonmetric Methods -- 3.1 Introduction -- 3.2 Decision Tree Classifier -- 3.2.1 Information, Entropy, and Impurity -- 3.2.2 Information Gain -- 3.2.3 Decision Tree Issues -- 3.2.4 Strengths and Weaknesses -- 3.3 Rule-Based Classifier -- 3.4 Other Methods -- 3.5 Exercises -- References
  • Chapter 4: Statistical Pattern Recognition -- 4.1 Measured Data and Measurement Errors -- 4.2 Probability Theory -- 4.2.1 Simple Probability Theory -- 4.2.2 Conditional Probability and Bayes' Rule -- 4.2.3 Naïve Bayes Classifier -- 4.3 Continuous Random Variables -- 4.3.1 The Multivariate Gaussian -- 4.3.2 The Covariance Matrix -- 4.3.3 The Mahalanobis Distance -- 4.4 Exercises -- References
  • Chapter 5: Supervised Learning -- 5.1 Parametric and Non-parametric Learning -- 5.2 Parametric Learning -- 5.2.1 Bayesian Decision Theory -- 5.2.1.1 Single Feature (1D) -- 5.2.1.2 Multiple Features -- 5.2.2 Discriminant Functions and Decision Boundaries -- 5.2.3 MAP (Maximum A Posteriori) Estimator -- 5.3 Exercises -- References
  • Chapter 6: Nonparametric Learning -- 6.1 Histogram Estimator and Parzen Windows -- 6.2 k-Nearest Neighbor (k-NN) Classification -- 6.3 Artificial Neural Networks -- 6.4 Kernel Machines -- 6.5 Exercises -- References
  • Chapter 7: Feature Extraction and Selection -- 7.1 Reducing Dimensionality -- 7.1.1 Preprocessing -- 7.2 Feature Selection -- 7.2.1 Inter/Intraclass Distance -- 7.2.2 Subset Selection -- 7.3 Feature Extraction -- 7.3.1 Principal Component Analysis -- 7.3.2 Linear Discriminant Analysis -- 7.4 Exercises -- References
  • Chapter 8: Unsupervised Learning -- 8.1 Clustering -- 8.2 k-Means Clustering -- 8.2.1 Fuzzy c-Means Clustering -- 8.3 (Agglomerative) Hierarchical Clustering -- 8.4 Exercises -- References
  • Chapter 9: Estimating and Comparing Classifiers -- 9.1 Comparing Classifiers and the No Free Lunch Theorem -- 9.1.1 Bias and Variance -- 9.2 Cross-Validation and Resampling Methods -- 9.2.1 The Holdout Method -- 9.2.2 k-Fold Cross-Validation -- 9.2.3 Bootstrap -- 9.3 Measuring Classifier Performance -- 9.4 Comparing Classifiers -- 9.4.1 ROC Curves -- 9.4.2 McNemar's Test -- 9.4.3 Other Statistical Tests -- 9.4.4 The Classification Toolbox -- 9.5 Combining Classifiers -- References
  • Chapter 10: Projects -- 10.1 Retinal Tortuosity as an Indicator of Disease -- 10.2 Segmentation by Texture -- 10.3 Biometric Systems -- 10.3.1 Fingerprint Recognition -- 10.3.2 Face Recognition -- References -- Index