What is it about?
We present a matrix-free algorithm, satisfying the descent-direction condition, for solving nonlinear least squares problems.
Why is it important?
We propose a structured spectral gradient method for a special case of unconstrained optimization: minimizing a sum of squares of nonlinear functions. Given that several modifications of two-point stepsize methods for general unconstrained optimization have been studied in the literature, we believe this study uncovers an interesting technique for exploiting the special structure of nonlinear least squares problems. The approach is well suited to large-scale problems because it requires neither computing the exact Jacobian nor storing it; it needs only a loop-free routine that applies the Jacobian transpose to a vector. In addition, we propose a simple strategy, with theoretical support and numerical backing, for safeguarding against negative curvature directions.
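To illustrate the matrix-free idea, here is a minimal sketch, not the paper's exact structured method, of a two-point (Barzilai-Borwein-type) stepsize gradient iteration for nonlinear least squares. The gradient of f(x) = (1/2)||r(x)||^2 is J(x)^T r(x), so the loop only ever calls a user-supplied Jacobian-transpose-vector product; the Jacobian is never formed or stored. The function names (`spectral_gradient_nls`, `jac_t_vec`) and the simple fallback stepsize used as a curvature safeguard are assumptions for this sketch.

```python
import numpy as np

def spectral_gradient_nls(residual, jac_t_vec, x0, max_iter=500, tol=1e-8):
    """Minimize f(x) = 0.5 * ||r(x)||^2 using only residual evaluations
    and matrix-free products v -> J(x)^T v.

    residual  : callable, x -> r(x)
    jac_t_vec : callable, (x, v) -> J(x)^T v  (the Jacobian is never formed)
    """
    x = np.asarray(x0, dtype=float)
    g = jac_t_vec(x, residual(x))       # gradient of f: J(x)^T r(x)
    alpha = 1.0                         # initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g           # negative gradient is a descent direction
        g_new = jac_t_vec(x_new, residual(x_new))
        s, y = x_new - x, g_new - g
        sty = s @ y
        # Two-point (BB1) stepsize s^T s / s^T y; when the curvature
        # estimate s^T y is not safely positive, fall back to a fixed
        # stepsize (a simplified stand-in for the paper's safeguard).
        if sty > 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
            alpha = (s @ s) / sty
        else:
            alpha = 1.0
        x, g = x_new, g_new
    return x
```

For a linear residual r(x) = Ax - b the product J^T v is just A^T v, which makes a convenient sanity check; for a genuinely nonlinear model the same callback could be a hand-coded adjoint or reverse-mode automatic differentiation.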
Perspectives
It was a great pleasure, and a rewarding experience, to write this article. It contains new ideas that we developed during my Ph.D. research visit at Unicamp.
Hassan Mohammad
Bayero University
Read the Original
This page is a summary of: Structured Two-Point Stepsize Gradient Methods for Nonlinear Least Squares, Journal of Optimization Theory and Applications, November 2018, Springer Science + Business Media, DOI: 10.1007/s10957-018-1434-y.