
On Modification of Preconditioning Conjugate Gradient Method with Self-Scaling Quasi-Newton

Journal: International Journal of Advanced Trends in Computer Science and Engineering (IJATCSE) (Vol.8, No. 5)


Pages: 2395-2398

Keywords: Self-Scaling; PCG; SR1 method; quasi-Newton method; optimization


Abstract

In this paper, a new class of self-scaling quasi-Newton updates for solving unconstrained nonlinear optimization problems is investigated. The general strategy of self-scaling quasi-Newton methods is to scale the Hessian approximation matrix before it is updated at each iteration, in order to avoid large disparities among the eigenvalues of the approximated Hessian of the objective function. The methods are well suited to large-scale problems because the amount of storage required by the algorithms can be controlled by the user. In comparison with standard serial quasi-Newton methods, the suggested parallel self-scaling quasi-Newton algorithms show a noticeable improvement in the total number of iterations and function/gradient evaluations needed to solve a broad range of test problems.
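To illustrate the general idea the abstract describes (not the paper's exact algorithm), the following is a minimal sketch of a self-scaling SR1 quasi-Newton iteration. The function name, the Armijo line search, the safeguard tolerance `r`, and the scaling factor gamma = (yᵀs)/(yᵀy) applied to the inverse-Hessian approximation before each update are all illustrative choices, not taken from the paper:

```python
import numpy as np

def self_scaling_sr1(f, grad, x0, max_iter=200, tol=1e-8, r=1e-8):
    """Illustrative self-scaling SR1 quasi-Newton iteration.

    H approximates the inverse Hessian. Before each SR1 update, H is
    rescaled by gamma = (y^T s)/(y^T y) -- one common self-scaling
    choice -- to keep its eigenvalues close to those of the true
    inverse Hessian, which is the strategy the abstract describes."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        if g @ d >= 0:           # SR1 need not stay positive definite;
            d = -g               # fall back to steepest descent
        # simple backtracking (Armijo) line search
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        ys, yy = y @ s, y @ y
        if ys > 0 and yy > 0:
            H *= ys / yy         # self-scaling: rescale H before the update
        v = s - H @ y
        denom = v @ y
        # standard SR1 safeguard: skip the update when the denominator is tiny
        if abs(denom) > r * np.linalg.norm(v) * np.linalg.norm(y):
            H += np.outer(v, v) / denom
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
xmin = self_scaling_sr1(lambda x: 0.5 * x @ A @ x - b @ x,
                        lambda x: A @ x - b, np.zeros(2))
```

On a convex quadratic the SR1 recursion reconstructs the inverse Hessian from a few curvature pairs (s, y), so the iterate converges to the solution of A x = b; the scaling step only changes how quickly the eigenvalues of H settle.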

Last modified: 2019-11-13 17:48:29