An Optimal Self-Pruning Neural Network and Nonlinear Descriptor Selection in QSAR

Bibliographic Details
Published in: QSAR & Combinatorial Science, Vol. 28, No. 10, pp. 1092-1097
Main Authors: Burden, Frank R.; Winkler, David A.
Format: Journal Article
Language: English
Published: Weinheim: WILEY-VCH Verlag, 01.10.2009
ISSN: 1611-020X; 1611-0218
DOI: 10.1002/qsar.200810202

Summary: Feature selection is an important but still poorly solved problem in QSAR modeling. We employ a Bayesian regularized neural network with a sparse Laplacian prior as an efficient method for supervised feature selection and for robust, parsimonious nonlinear QSAR modeling. The method simultaneously selects the descriptors most relevant to the model and automatically prunes the neural network to the architecture with the best predictive ability. We illustrate the advantages of the method on a suite of diverse data sets and compare the results obtained by the new method with those obtained by alternative contemporary methods.
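To make the idea of simultaneous descriptor selection and network pruning concrete, the sketch below approximates the sparse Laplacian prior with a fixed L1 penalty on the input-layer weights of a small neural network, then discards descriptors whose incoming weights have shrunk to near zero. This is only an illustrative assumption, not the authors' Bayesian regularization scheme: in the paper the regularization strength is inferred from the data rather than set by hand, and all variable names, penalty values, and the synthetic data here are hypothetical.

```python
# Illustrative sketch (not the published BRANN method): L1 penalty on the
# input-layer weights as a stand-in for a sparse Laplacian prior, followed
# by pruning of descriptors whose incoming weights are negligible.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic stand-in for a QSAR table: 200 compounds, 20 candidate descriptors,
# of which only the first three actually drive the (nonlinear) response.
X = torch.randn(200, 20)
y = torch.sin(X[:, 0]) + X[:, 1] * X[:, 2] + 0.05 * torch.randn(200)

model = nn.Sequential(nn.Linear(20, 8), nn.Tanh(), nn.Linear(8, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
lam = 1e-2  # assumed fixed penalty strength; the paper infers this from the data

for epoch in range(2000):
    opt.zero_grad()
    pred = model(X).squeeze(-1)
    mse = nn.functional.mse_loss(pred, y)
    l1 = model[0].weight.abs().sum()  # sparsity pressure on input weights only
    (mse + lam * l1).backward()
    opt.step()

# "Self-pruning": keep a descriptor only if some weight leaving it is non-negligible.
importance = model[0].weight.abs().max(dim=0).values
kept = (importance > 1e-2).nonzero().squeeze(-1).tolist()
print("selected descriptor indices:", kept)
```

Because the penalty is applied per input weight, descriptors that do not improve the fit are driven toward zero and can be dropped, which mimics the parsimonious models described in the summary, albeit without the evidence-based control of model complexity that Bayesian regularization provides.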
Bibliography:
istex: DA26F882D14DBE9DA73F56673ABA5953AFC26A02
ark: ark:/67375/WNG-N9J9G101-2
ArticleID: QSAR200810202