A note on the accelerated proximal gradient method for nonconvex optimization

Wang, Huijuan and Xu, Hong-Kun

Abstract

We improve a recent accelerated proximal gradient (APG) method of Li, Q., Zhou, Y., Liang, Y. and Varshney, P. K. [Convergence analysis of proximal gradient with momentum for nonconvex optimization, in Proceedings of the 34th International Conference on Machine Learning, Sydney, Australia, PMLR 70, 2017] for nonconvex optimization by allowing variable stepsizes. We prove convergence of the APG method for a composite nonconvex optimization problem under the assumption that the composite objective function satisfies the Kurdyka-Łojasiewicz property.
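To illustrate the kind of iteration the abstract refers to, the following is a minimal sketch of an accelerated proximal gradient step with a per-iteration (variable) stepsize, applied to a toy composite problem F(x) = f(x) + g(x) with f smooth and g = λ‖·‖₁. The stepsize schedule, the momentum rule, and the toy problem are assumptions for illustration; they are not the authors' exact scheme or the monitored variant analyzed in the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def apg_variable_step(grad_f, prox_g, x0, step_sizes):
    """Accelerated proximal gradient with a variable stepsize.

    Illustrative sketch: `step_sizes` is any positive schedule (constant
    here), and the Nesterov-style momentum below is an assumption, not the
    specific scheme proved convergent in the paper.
    """
    x_prev = x0.copy()
    x = x0.copy()
    t_prev, t = 1.0, 1.0
    for alpha in step_sizes:
        # Extrapolation (momentum) point.
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)
        # Forward (gradient) step on f, then backward (prox) step on g.
        x_next = prox_g(y - alpha * grad_f(y), alpha)
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        x_prev, x = x, x_next
    return x

# Toy composite problem: F(x) = 0.5 * ||A x - b||^2 + lam * ||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, a: soft_threshold(v, lam * a)

steps = [1.0 / L] * 100                  # variable stepsizes allowed; constant here
x_star = apg_variable_step(grad_f, prox_g, np.zeros(5), steps)
F = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
```

After the iterations, `F(x_star)` is well below `F(0)`, the objective at the starting point; with a variable schedule one would typically replace the constant `1/L` entries by, e.g., stepsizes found through a backtracking line search.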
Additional Information
Author(s) | Wang, H., Xu, H.-K.
--- | ---
DOI | https://doi.org/10.37193/CJM.2018.03.22