Hyperparameter optimization and neural architecture search algorithms for Graph Neural Networks in cheminformatics

Bibliographic Details
Published in: Computational Materials Science, Vol. 254, p. 113904
Main Authors: Ebadi, Ali; Kaur, Manpreet; Liu, Qian
Format: Journal Article
Language: English
Published: Elsevier B.V., 20.05.2025
ISSN: 0927-0256, 1879-0801
DOI: 10.1016/j.commatsci.2025.113904

Summary:
Highlights:
• Comprehensive review of cheminformatics datasets for molecular property prediction.
• Survey of optimization techniques for Graph Neural Networks in cheminformatics.
• Comparison of optimization methods, highlighting strengths and limitations.
• Identification of gaps and future directions in Graph Neural Networks for cheminformatics.

Cheminformatics, an interdisciplinary field bridging chemistry and information science, leverages computational tools to analyze and interpret chemical data, playing a critical role in drug discovery, materials science, and environmental chemistry. Traditional methods, which rely on rule-based algorithms and expert-curated datasets, face challenges in scalability and adaptability. Recently, machine learning and deep learning have revolutionized cheminformatics by offering data-driven approaches that uncover complex patterns in vast chemical datasets, advancing molecular property prediction, chemical reaction modeling, and de novo molecular design. Among the most promising techniques are Graph Neural Networks (GNNs), which model molecules in a manner that mirrors their underlying chemical structure. Despite their success, GNN performance is highly sensitive to architectural choices and hyperparameters, making optimal configuration selection a non-trivial task. Neural Architecture Search (NAS) and Hyperparameter Optimization (HPO) are crucial for improving GNN performance, but their complexity and computational cost have traditionally hindered progress. This review examines strategies for automating NAS and HPO in GNNs, highlighting their potential to improve model performance, scalability, and efficiency in key cheminformatics applications. As the field evolves, automated optimization techniques are expected to play a pivotal role in advancing GNN-based solutions in cheminformatics.
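To make concrete what hyperparameter optimization for a GNN involves, the sketch below runs a plain random search over a small configuration space (number of message-passing layers, hidden width, learning rate, dropout). This is an illustrative sketch only, not a method taken from the article: the search space and the evaluate_config() stub are assumptions, and in a real study evaluate_config() would train a GNN on a cheminformatics dataset and return a validation metric such as ROC-AUC.

# Illustrative sketch: random-search HPO over a hypothetical GNN configuration space.
# evaluate_config() is a stand-in so the example runs without chemistry dependencies;
# replace it with actual GNN training and validation on a molecular property dataset.

import random

SEARCH_SPACE = {
    "num_layers":    [2, 3, 4, 5],          # depth of message passing
    "hidden_dim":    [64, 128, 256],        # node embedding width
    "learning_rate": [1e-4, 3e-4, 1e-3],    # optimizer step size
    "dropout":       [0.0, 0.2, 0.5],       # regularization strength
}

def sample_config(space):
    """Draw one candidate configuration uniformly at random."""
    return {name: random.choice(values) for name, values in space.items()}

def evaluate_config(config):
    """Hypothetical placeholder: would train a GNN with `config` and return a
    validation score. Here a synthetic random score stands in."""
    return random.random()

def random_search(space, n_trials=20, seed=0):
    """Plain random search: sample n_trials configurations and keep the best."""
    random.seed(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = sample_config(space)
        score = evaluate_config(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

if __name__ == "__main__":
    config, score = random_search(SEARCH_SPACE)
    print(f"Best configuration: {config} (validation score {score:.3f})")

Random search is shown here only because it is a common baseline against which Bayesian optimization and NAS methods, the kinds of strategies the review surveys, are typically compared.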