TY - GEN
T1 - Quasi-Newton based preconditioning and damped quasi-Newton schemes for nonlinear conjugate gradient methods
AU - Al-Baali, Mehiddin
AU - Caliciotti, Andrea
AU - Fasano, Giovanni
AU - Roma, Massimo
N1 - Funding Information:
The research is partially supported by the Italian Flagship Project RITMARE, coordinated by the Italian National Research Council and funded by the Italian Ministry of Education, University and Research.
Publisher Copyright:
© Springer International Publishing AG, part of Springer Nature 2018.
PY - 2018
Y1 - 2018
N2 - In this paper, we deal with matrix-free preconditioners for nonlinear conjugate gradient (NCG) methods. In particular, we review proposals based on quasi-Newton updates that satisfy either the secant equation or a secant-like equation at some of the previous iterates. Conditions are given which prove that, in some sense, the proposed preconditioners also approximate the inverse of the Hessian matrix. In particular, the structure of the preconditioners depends both on low-rank updates and on some specific parameters. The low-rank updates are obtained as a by-product of NCG iterations. Moreover, we consider the possibility of embedding damped techniques within a class of preconditioners based on quasi-Newton updates. Damped methods have proved effective in enhancing the performance of quasi-Newton updates in those cases where the Wolfe linesearch conditions are hardly fulfilled. The purpose is to extend the idea behind damped methods to the improvement of NCG schemes as well, following a novel line of research in the literature. The results, which summarize an extended numerical experience using large-scale CUTEst problems, are reported, showing that these approaches can considerably improve the performance of NCG methods.
AB - In this paper, we deal with matrix-free preconditioners for nonlinear conjugate gradient (NCG) methods. In particular, we review proposals based on quasi-Newton updates that satisfy either the secant equation or a secant-like equation at some of the previous iterates. Conditions are given which prove that, in some sense, the proposed preconditioners also approximate the inverse of the Hessian matrix. In particular, the structure of the preconditioners depends both on low-rank updates and on some specific parameters. The low-rank updates are obtained as a by-product of NCG iterations. Moreover, we consider the possibility of embedding damped techniques within a class of preconditioners based on quasi-Newton updates. Damped methods have proved effective in enhancing the performance of quasi-Newton updates in those cases where the Wolfe linesearch conditions are hardly fulfilled. The purpose is to extend the idea behind damped methods to the improvement of NCG schemes as well, following a novel line of research in the literature. The results, which summarize an extended numerical experience using large-scale CUTEst problems, are reported, showing that these approaches can considerably improve the performance of NCG methods.
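N1 - For reference, a minimal sketch (in standard quasi-Newton notation, not taken from this record) of the secant equation and the damping idea the abstract refers to: with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$, a quasi-Newton Hessian approximation $B_{k+1}$ is required to satisfy the secant equation $B_{k+1} s_k = y_k$. Damped techniques in the spirit of Powell and Al-Baali replace $y_k$ in the update by $\hat{y}_k = \phi_k y_k + (1 - \phi_k) B_k s_k$, with $\phi_k \in (0,1]$ chosen so that $s_k^{T} \hat{y}_k \geq \sigma\, s_k^{T} B_k s_k$ for some $\sigma \in (0,1)$, which preserves positive definiteness of the update even when the Wolfe linesearch conditions are hardly fulfilled.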
KW - Conjugate gradient method
KW - Damped techniques
KW - Large scale unconstrained optimization
KW - Nonlinear conjugate gradient methods
KW - Preconditioning
KW - Quasi-Newton methods
UR - http://www.scopus.com/inward/record.url?scp=85048243750&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85048243750&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-90026-1_1
DO - 10.1007/978-3-319-90026-1_1
M3 - Conference contribution
AN - SCOPUS:85048243750
SN - 9783319900254
T3 - Springer Proceedings in Mathematics and Statistics
SP - 1
EP - 21
BT - Numerical Analysis and Optimization - NAO-IV, 2017
A2 - Grandinetti, Lucio
A2 - Al-Baali, Mehiddin
A2 - Purnama, Anton
PB - Springer New York LLC
T2 - 4th International Conference on Numerical Analysis and Optimization, NAO-IV 2017
Y2 - 2 January 2017 through 5 January 2017
ER -