Improved Concave-Convex procedure and its application to analysis for the stability of Hopfield neural network
| Published in | 2010 3rd International Conference on Advanced Computer Theory and Engineering (ICACTE), Vol. 2; pp. V2-173 - V2-177 |
|---|---|
| Main Authors | , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.08.2010 |
| ISBN | 1424465397 9781424465392 |
| ISSN | 2154-7491 |
| DOI | 10.1109/ICACTE.2010.5579263 |
| Summary: | This paper discusses an improvement of the Concave-Convex Procedure, in which the objective function of an optimization problem is decomposed into a convex function minus a generalized differentiable function. While preserving the monotonic decrease of the optimization objective, the convergence conditions of the procedure and the range of problems it can be applied to are also improved considerably. The properties of subgradients and of convex functions are used to prove that these procedures are globally convergent descent methods. The optimization problems they solve may have smooth or non-smooth objective functions. Moreover, the global convergence of the procedure can be used to analyze the stability of Hopfield neural networks. It can also serve both as a new way to understand existing optimization algorithms and as a procedure for generating new ones. |
|---|---|
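The paper itself contains no code, but the procedure the summary describes builds on the classical concave-convex (difference-of-convex) iteration: split the objective as f(x) = u(x) - v(x) with u convex, replace v by its linearization at the current iterate using a subgradient (so v may be non-smooth), and minimize the resulting convex surrogate. The sketch below is a minimal illustration of that baseline iteration, not the authors' improved procedure; the function names (`cccp`, `u_solve`, `v_subgrad`) and the quadratic-minus-l1 toy problem are assumptions chosen for this example.

```python
import numpy as np

def cccp(u_solve, v_subgrad, x0, max_iter=100, tol=1e-8):
    """Baseline concave-convex (difference-of-convex) iteration.

    Minimizes f(x) = u(x) - v(x) with u convex and v convex (possibly
    non-smooth).  Each step linearizes v at the current iterate via a
    subgradient g in dv(x_k) and solves the convex surrogate
    argmin_x u(x) - g.x, which guarantees f(x_{k+1}) <= f(x_k).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = v_subgrad(x)        # subgradient of v at the current iterate
        x_next = u_solve(g)     # exact minimizer of the convex surrogate
        if np.linalg.norm(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Toy instance: f(x) = 0.5*x'Ax + b'x - ||x||_1 with A positive definite.
# u is a smooth convex quadratic; v(x) = ||x||_1 is convex but non-smooth,
# so the linearization uses a subgradient (sign(x)).
A = np.array([[3.0, 0.5],
              [0.5, 2.0]])
b = np.array([1.0, -1.0])

def u_solve(g):
    # argmin_x 0.5*x'Ax + b'x - g'x  <=>  solve A x = g - b
    return np.linalg.solve(A, g - b)

def v_subgrad(x):
    return np.sign(x)           # an element of the l1-norm subdifferential

x_star = cccp(u_solve, v_subgrad, x0=np.zeros(2))
f = lambda x: 0.5 * x @ A @ x + b @ x - np.abs(x).sum()
print(x_star, f(x_star))
```

Each iteration provably does not increase f as long as v is convex, which is the monotone-decrease property the summary emphasizes. The paper's contribution, per the abstract, is to relax the conditions on v to a generalized differentiable function while keeping this descent guarantee, and to use the resulting global convergence to argue the stability of Hopfield neural networks, whose quadratic energy function can be written as a difference of convex functions.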