Automatic hyperparameter tuning of topology optimization algorithms using surrogate optimization


Bibliographic Details
Published in: Structural and Multidisciplinary Optimization, Vol. 67, No. 9, p. 157
Main Authors: Ha, Dat; Carstensen, Josephine
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.09.2024 (Springer Nature B.V.)
ISSN: 1615-147X, 1615-1488
DOI: 10.1007/s00158-024-03850-7


More Information
Summary: This paper presents a new approach that automates the tuning of topology optimization parameters that are traditionally defined by the user. The new method draws inspiration from hyperparameter optimization in machine learning. A new design problem is formulated in which the topology optimization hyperparameters are defined as design variables, and the problem is solved by surrogate optimization. The new design problem is nested, such that a topology optimization problem is solved as an inner problem. To encourage the identification of high-performing solutions while limiting computational resource requirements, the outer objective function is defined as the original objective combined with penalization for intermediate densities and for deviations from the prescribed material consumption. The contribution is demonstrated on density-based topology optimization with various hyperparameters and objectives, including compliance minimization, compliant mechanism design, and buckling load factor maximization. Consistent performance is observed across all tested examples. For a simple two-hyperparameter case, the new framework is shown to reduce the number of times the topology optimization algorithm is executed by 90% without notably sacrificing the objective, compared to a rigorous manual grid search.
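The nested structure described in the summary can be sketched in a few lines. The sketch below is a minimal illustration, not the authors' implementation: the inner topology optimization is replaced by a hypothetical toy function, the outer surrogate optimization is replaced by crude random sampling, and all function names, weights, and bounds are assumptions. What it does show is the paper's key construction: an outer objective that combines the inner performance measure with penalties for intermediate (grey) densities and for deviation from the prescribed material consumption.

```python
import numpy as np

def inner_topopt(radius, penal):
    """Hypothetical stand-in for one density-based topology optimization run
    with filter radius `radius` and SIMP penalization exponent `penal`.
    A real inner solve would run a finite-element-based optimization to
    convergence; here a toy density field and compliance are returned."""
    x = np.linspace(0.0, 1.0, 50)
    # Toy density field: sharper (closer to 0/1) for larger penal,
    # smoother for larger filter radius.
    rho = 1.0 / (1.0 + np.exp(-(penal / radius) * (x - 0.5) * 10.0))
    compliance = 1.0 / (rho.mean() + 0.1) + 0.05 * radius  # toy performance
    return rho, compliance

def outer_objective(radius, penal, vol_frac=0.5, w_grey=1.0, w_vol=10.0):
    """Outer objective: inner performance plus penalties for intermediate
    densities and deviation from the prescribed material consumption,
    mirroring the augmented objective described in the abstract.
    The weights w_grey and w_vol are assumed values for illustration."""
    rho, compliance = inner_topopt(radius, penal)
    grey = np.mean(rho * (1.0 - rho))        # intermediate-density measure
    vol_dev = abs(rho.mean() - vol_frac)     # material-consumption deviation
    return compliance + w_grey * grey + w_vol * vol_dev

# Crude stand-in for surrogate optimization over two hyperparameters:
# sample candidate (radius, penal) pairs and keep the best. A real surrogate
# optimizer would fit a cheap model (e.g., radial basis functions) of the
# outer objective and sample it adaptively to cut inner-solve counts.
rng = np.random.default_rng(0)
candidates = zip(rng.uniform(1.0, 4.0, 40), rng.uniform(1.0, 5.0, 40))
best = min(candidates, key=lambda hp: outer_objective(*hp))
print("best (radius, penal):", best)
print("outer objective:", outer_objective(*best))
```

The point of the nesting is that each evaluation of `outer_objective` hides a full (expensive) topology optimization run, which is why the paper replaces exhaustive grid search with a surrogate-driven outer loop.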