Manifold mapping optimization with or without true gradients


Bibliographic Details
Published in: Mathematics and Computers in Simulation, Vol. 90, pp. 256-265
Main Authors: Delinchant, B., Lahaye, D., Wurtz, F., Coulomb, J.-L.
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.04.2013
ISSN: 0378-4754, 1872-7166
DOI: 10.1016/j.matcom.2012.09.005


More Information
Summary: This paper deals with space mapping optimization algorithms in general and with the manifold mapping technique in particular. The idea of such algorithms is to optimize a model with a minimum number of objective function evaluations by exploiting a less accurate but faster model. In this optimization procedure, the fine and coarse models interact at each iteration, adjusting each other so that the iterates converge to the true optimum. The manifold mapping technique guarantees this convergence mathematically, but it requires the gradients of both the fine and the coarse model. Approximated gradients can be used in some cases but may cause divergence. True gradients can be obtained for many numerical models using adjoint techniques, symbolic differentiation, or automatic differentiation. In this context, we have tested several manifold mapping variants and compared their convergence on the optimization of a real magnetic device.
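The fine/coarse interaction the summary describes can be illustrated with a minimal one-dimensional sketch. This is not the paper's algorithm or its magnetic-device models: `fine`, `coarse`, and both gradient functions below are invented toy objectives, and the correction used is a simple first-order response correction (shift and tilt the coarse model so its value and gradient match the fine model at the current iterate), which captures only the basic principle that the gradient-based manifold mapping variants generalize.

```python
import numpy as np

# Hypothetical toy models (not from the paper): the "fine" model is the
# expensive, accurate objective; the "coarse" model is a cheap
# approximation whose minimum sits in the wrong place.
def fine(x):
    return (x - 2.0) ** 2 + 0.1 * np.sin(3.0 * x)

def fine_grad(x):                      # true fine-model gradient
    return 2.0 * (x - 2.0) + 0.3 * np.cos(3.0 * x)

def coarse(x):
    return (x - 1.6) ** 2

def coarse_grad(x):
    return 2.0 * (x - 1.6)

def first_order_space_mapping(x0, n_iter=40):
    """At each iterate, correct the coarse model so its value and gradient
    agree with the fine model there, then re-minimize the corrected
    surrogate (here by a dense grid search). A fixed point of this loop is
    a stationary point of the fine model: at convergence the surrogate's
    gradient equals the fine gradient, and both vanish."""
    grid = np.linspace(-1.0, 5.0, 20001)
    x = x0
    for _ in range(n_iter):
        # One fine-model (and fine-gradient) evaluation per iteration.
        dv = fine(x) - coarse(x)               # value mismatch
        dg = fine_grad(x) - coarse_grad(x)     # gradient mismatch
        surrogate = coarse(grid) + dv + dg * (grid - x)
        x = grid[np.argmin(surrogate)]         # cheap inner optimization
    return x

x_opt = first_order_space_mapping(0.0)
```

All the expensive work per iteration is the single fine-model evaluation; the inner minimization touches only the cheap corrected surrogate, which is the cost structure the summary attributes to space mapping methods. Replacing `fine_grad` with a finite-difference estimate reproduces the "approximated gradients" setting the paper warns can diverge.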