Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions

M. Al-Baali

Research output: Contribution to journal › Article › peer-review

22 Citations (Scopus)

Abstract

This paper studies the convergence properties of algorithms belonging to the class of self-scaling (SS) quasi-Newton methods for unconstrained optimization. This class depends on two parameters, say θk and τk, for which the choice τk = 1 gives the Broyden family of unscaled methods, where θk = 1 corresponds to the well-known DFP method. We propose simple conditions on these parameters that give rise to global convergence with inexact line searches, for convex objective functions. The q-superlinear convergence is achieved if further restrictions on the scaling parameter are introduced. These convergence results are an extension of the known results for the unscaled methods. Because the scaling parameter is heavily restricted, we consider a subclass of SS methods which satisfies the required conditions. Although convergence for the unscaled methods with θk ≥ 1 is still an open question, we show that global and superlinear convergence for SS methods is possible and present, in particular, a new SS-DFP method.
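To make the two-parameter structure concrete, the sketch below implements a self-scaled DFP iteration (the τk ≠ 1, θk = 1 member of the family described above) with an inexact Armijo backtracking line search, applied to a convex quadratic. This is an illustrative reconstruction, not the paper's restricted subclass: the scaling choice τk = sᵀy / (yᵀHy) is the classical Oren–Luenberger value, used here only as a plausible example of a scaling parameter, and the function names are my own.

```python
import numpy as np

def ss_dfp_update(H, s, y, tau=None):
    """One self-scaled DFP update of the inverse-Hessian approximation H.

    Scaled DFP form used for illustration:
        H+ = tau * (H - H y y^T H / (y^T H y)) + s s^T / (s^T y)
    tau = 1 recovers the ordinary (unscaled) DFP update; by default we
    use the Oren-Luenberger scaling tau = s^T y / (y^T H y), which is
    only one plausible choice, not the paper's restricted subclass.
    """
    Hy = H @ y
    yHy = y @ Hy
    sy = s @ y
    if tau is None:
        tau = sy / yHy  # Oren-Luenberger scaling (assumption)
    return tau * (H - np.outer(Hy, Hy) / yHy) + np.outer(s, s) / sy

def ss_dfp_minimize(f, grad, x0, iters=200, tol=1e-8):
    """Self-scaled DFP method with an inexact (Armijo backtracking) line search."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                      # quasi-Newton search direction
        alpha, c1 = 1.0, 1e-4
        # Armijo sufficient-decrease condition: an inexact line search
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:               # curvature condition; holds for strictly convex f
            H = ss_dfp_update(H, s, y)
        x, g = x_new, g_new
    return x
```

For example, on the strictly convex quadratic f(x) = ½ xᵀAx − bᵀx the curvature condition sᵀy = sᵀAs > 0 holds at every step, so the update is always well defined and the iterates converge to the solution of Ax = b.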

Original language: English
Pages (from-to): 191-203
Number of pages: 13
Journal: Computational Optimization and Applications
Volume: 9
Issue number: 2
DOIs
Publication status: Published - February 1998

ASJC Scopus subject areas

  • Control and Optimization
  • Computational Mathematics
  • Applied Mathematics
