Abstract
This paper considers simple modifications of the limited memory BFGS (L-BFGS) method for large scale optimization. It outlines algorithms based on alternating ways of re-using a given set of stored difference vectors. The proposed algorithms resemble the L-BFGS method, except that the initial Hessian approximation is defined implicitly, like the L-BFGS Hessian, in terms of some stored vectors rather than by the usual choice of a multiple of the unit matrix. Numerical experiments show that the new algorithms yield a desirable improvement over the L-BFGS method.
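To make the idea in the abstract concrete, the sketch below shows the standard L-BFGS two-loop recursion with a pluggable initial Hessian. The function and parameter names (`lbfgs_direction`, `h0_mul`) are illustrative, not from the paper; the default branch uses the usual scaled-identity initialization, while the `h0_mul` hook stands in for an initial matrix defined implicitly from stored vectors, which is the kind of modification the paper studies (the paper's specific constructions may differ).

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list, h0_mul=None):
    """Two-loop recursion for the L-BFGS step -H*grad (illustrative sketch).

    s_list, y_list : stored difference vectors s_k = x_{k+1} - x_k and
                     y_k = g_{k+1} - g_k, most recent pair last.
    h0_mul : optional callable q -> H0 @ q giving the action of the initial
             Hessian approximation; None selects the usual multiple of the
             unit matrix.
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    if h0_mul is None:
        # Classical initialization: gamma = s'y / y'y times the identity.
        s, y = s_list[-1], y_list[-1]
        r = (np.dot(s, y) / np.dot(y, y)) * q
    else:
        # Implicitly defined initial Hessian built from stored vectors.
        r = h0_mul(q)
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return -r
```

One hypothetical way to re-use stored vectors for the initial matrix, in the spirit of the abstract, is to let `h0_mul` apply a second two-loop recursion over an older set of pairs, e.g. `h0_mul = lambda q: -lbfgs_direction(q, old_s, old_y)`; whether and how the same pairs are recycled is exactly the design choice the paper's algorithms vary.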
Original language | English |
---|---|
Pages (from-to) | 99-112 |
Number of pages | 14 |
Journal | Numerical Algorithms |
Volume | 22 |
Issue number | 1 |
Publication status | Published - 1999 |
Keywords
- BFGS updating formula
- Large scale optimization
- Limited memory BFGS method
- Quasi-Newton methods
ASJC Scopus subject areas
- Applied Mathematics