What is it about?

In [5], a hybridization process was applied to the accelerated gradient SM method for solving unconstrained optimization problems, which defined the HSM scheme. The HSM method was numerically confirmed to be more efficient than the SM method with respect to the analysed performance metrics: number of iterations, CPU time, and number of function evaluations. Herein, we upgrade the HSM method by optimizing the initial step length taken in the backtracking line search procedure. In this way we define a modified version of the HSM method, called MHSM, which has better performance features than its forerunner.
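To illustrate the role of the initial step length, the following is a minimal sketch of a generic Armijo backtracking line search on a toy quadratic. It is not the authors' SM/HSM/MHSM iteration (whose acceleration parameters are not reproduced here); the parameter names `t0` (initial trial step), `beta` (shrink factor), and `sigma` (sufficient-decrease constant) are illustrative. The sketch shows how the choice of `t0` determines where the backtracking loop starts, which is the quantity the MHSM modification tunes.

```python
import numpy as np

def backtracking(f, grad_f, x, direction, t0=1.0, beta=0.8, sigma=1e-4):
    """Armijo backtracking line search: starting from the trial step t0,
    shrink t by the factor beta until the sufficient-decrease condition
    f(x + t*d) <= f(x) + sigma * t * grad_f(x)^T d holds."""
    t = t0
    fx = f(x)
    slope = grad_f(x) @ direction  # directional derivative; negative for a descent direction
    while f(x + t * direction) > fx + sigma * t * slope:
        t *= beta
    return t

# Toy objective: f(x) = ||x||^2, with gradient 2x.
f = lambda x: float(x @ x)
g = lambda x: 2.0 * x

x = np.array([3.0, -4.0])
d = -g(x)                      # steepest-descent direction
t = backtracking(f, g, x, d, t0=1.0)
x_new = x + t * d
print(t, f(x), f(x_new))       # accepted step, objective before and after
```

A well-chosen `t0` means the loop accepts a step after few (or no) shrinkages, reducing the number of function evaluations per iteration, which is one of the performance metrics the summary mentions.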


Why is it important?

We explain how to upgrade the performance characteristics of a given model. This type of efficiency improvement can be applied to other gradient methods as well.

Perspectives

The presented ideas can serve as a basis for further research in this field.

Milena J. Petrović
Faculty of Sciences and Mathematics, University of Priština, Kosovska Mitrovica, Serbia

Read the Original

This page is a summary of: INITIAL IMPROVEMENT OF THE HYBRID ACCELERATED GRADIENT DESCENT PROCESS, Bulletin of the Australian Mathematical Society, August 2018, Cambridge University Press,
DOI: 10.1017/s0004972718000552.
