A new modified trust region algorithm for solving unconstrained optimization problems
Abstract
Iterative methods for unconstrained optimization are commonly classified into two categories:
line search methods and trust region methods.
In this paper, we propose a modified regularized Newton method, which requires no line search, for minimizing nonconvex
functions whose Hessian matrix may be singular.
The proposed method is proved to converge globally provided that the gradient and Hessian of the objective
function are Lipschitz continuous.
Moreover, we report numerical results showing that the proposed algorithm is competitive with existing methods.
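To illustrate the general idea behind regularized Newton methods of this kind, the following is a minimal Python sketch, not the authors' algorithm: it takes a generic regularized Newton step by solving (H + mu*I) d = -g, with the hypothetical parameter rule mu_k = c*||g_k|| standing in for whatever regularization update the paper actually uses. The regularization keeps the linear system solvable even when the Hessian is singular.

import numpy as np

def regularized_newton_step(grad, hess, x, mu):
    """One generic regularized Newton step: solve (H + mu*I) d = -g.

    The term mu*I keeps the linear system nonsingular even when the
    Hessian H itself is singular. This is a generic illustration, not
    the specific method proposed in the paper.
    """
    g = grad(x)
    H = hess(x)
    d = np.linalg.solve(H + mu * np.eye(x.size), -g)
    return x + d

def minimize(grad, hess, x0, c=1.0, tol=1e-8, max_iter=100):
    """Drive the iteration with the simple (assumed) rule mu_k = c * ||g_k||,
    a common choice in regularized Newton methods; no line search is used."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        mu = c * np.linalg.norm(g)
        x = regularized_newton_step(grad, hess, x, mu)
    return x

if __name__ == "__main__":
    # Example: f(x) = x1**4 + x2**2, whose Hessian is singular at the minimizer (0, 0).
    grad = lambda x: np.array([4 * x[0] ** 3, 2 * x[1]])
    hess = lambda x: np.array([[12 * x[0] ** 2, 0.0], [0.0, 2.0]])
    print(minimize(grad, hess, np.array([1.0, 1.0])))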
Keywords
Regularized Newton method; Unconstrained optimization; Nonconvex; Trust-region method; Convergence analysis.