Improved Concave-Convex procedure and its application to analysis for the stability of Hopfield neural network

Bibliographic Details
Published in: 2010 3rd International Conference on Advanced Computer Theory and Engineering (ICACTE), Vol. 2, pp. V2-173 - V2-177
Main Authors: Shiwei Ye, Wenjie Wang
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2010
ISBN: 1424465397; 9781424465392
ISSN: 2154-7491
DOI: 10.1109/ICACTE.2010.5579263

More Information
Summary: This paper presents an improvement of the concave-convex procedure, in which the objective function of an optimization problem is decomposed into a convex function minus a generalized differentiable function. While preserving the monotonic decrease of the objective function, the improved procedure greatly relaxes the convergence conditions and broadens the scope of problems to which it applies. Properties of the subgradient and of convex functions are used to prove that these procedures are globally convergent descent methods, and the objective functions they handle may be smooth or non-smooth. Moreover, the global convergence of the procedure can be applied to analyze the stability of Hopfield neural networks. The framework also serves both as a new way to understand existing optimization algorithms and as a procedure for generating new ones.
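The decomposition described in the summary, minimizing f(x) = u(x) - v(x) with both parts convex, can be sketched on a toy problem (this is an illustrative example of the classic concave-convex iteration, not code from the paper): each step linearizes v at the current iterate and minimizes the resulting convex surrogate, which yields the monotonic decrease the summary mentions.

```python
import numpy as np

def cccp(grad_v, surrogate_argmin, x0, n_iter=50):
    """Concave-convex procedure sketch for f(x) = u(x) - v(x), u and v convex.

    Each iteration replaces v by its tangent at x_t and minimizes the
    convex surrogate u(x) - grad_v(x_t) * x, so f decreases monotonically.
    """
    x = x0
    for _ in range(n_iter):
        # x_{t+1} = argmin_x  u(x) - <grad_v(x_t), x>
        x = surrogate_argmin(grad_v(x))
    return x

# Toy objective: f(x) = x**4 - 2*x**2, split as u(x) = x**4, v(x) = 2*x**2.
# The surrogate at x_t is x**4 - 4*x_t*x, minimized in closed form at cbrt(x_t).
grad_v = lambda x: 4.0 * x
surrogate_argmin = lambda g: np.cbrt(g / 4.0)

x_star = cccp(grad_v, surrogate_argmin, x0=0.5)
```

Starting from x0 = 0.5, the iterates climb toward the minimizer x = 1 of the toy objective, and the objective value decreases at every step, as the procedure's descent property guarantees.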