On Modification of Preconditioning Conjugate Gradient Method with Self-Scaling Quasi-Newton
Journal: International Journal of Advanced Trends in Computer Science and Engineering (IJATCSE), Vol. 8, No. 5
Publication Date: 2019-10-15
Authors: Salah Ghaze Shareef; Hasan Hazim Jameel
Pages: 2395-2398
Keywords: self-scaling; PCG; SR1 method; quasi-Newton method; optimization
Abstract
In this paper, a new class of self-scaling quasi-Newton updates for solving unconstrained nonlinear optimization problems is investigated. The general strategy of self-scaling quasi-Newton methods is to scale the Hessian approximation matrix before it is updated at each iteration; this avoids large spreads in the eigenvalues of the approximated Hessian of the objective function. The methods are well suited to large-scale problems because the amount of storage required by the algorithms can be controlled by the user. Compared with standard serial quasi-Newton methods, the proposed parallel self-scaling quasi-Newton algorithms show a noticeable improvement in the total number of iterations and function/gradient evaluations needed to solve a broad range of test problems.
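The idea described in the abstract can be illustrated with a minimal sketch of a self-scaling SR1 quasi-Newton iteration (SR1 being one of the keywords above). This is not the authors' algorithm; it is a generic illustration in which the Hessian approximation B is rescaled by the Oren-Luenberger factor before each SR1 update, and the function names, tolerances, and line-search parameters are all illustrative assumptions.

```python
import numpy as np

def self_scaling_sr1(f, grad, x0, tol=1e-8, max_iter=200):
    """Generic self-scaling SR1 sketch (not the paper's method):
    rescale B before each update, then apply the SR1 correction."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                 # initial Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -np.linalg.solve(B, g)     # quasi-Newton search direction
        if g @ p >= 0:                 # safeguard: SR1 may lose positive
            p = -g                     # definiteness, fall back to steepest descent
        t, fx = 1.0, f(x)
        for _ in range(30):            # backtracking line search (Armijo)
            if f(x + t * p) <= fx + 1e-4 * t * (g @ p):
                break
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # Self-scaling step: shrink/stretch B before updating, to keep
        # the eigenvalues of the approximation from drifting apart.
        sBs = s @ B @ s
        if y @ s > 0 and sBs > 0:
            B = ((y @ s) / sBs) * B
        # SR1 update, skipped when the denominator is nearly zero
        r = y - B @ s
        denom = r @ s
        if abs(denom) > 1e-8 * np.linalg.norm(r) * np.linalg.norm(s):
            B = B + np.outer(r, r) / denom
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = self_scaling_sr1(f, grad, np.zeros(2))
```

The scaling factor τ = (yᵀs)/(sᵀBs) applied before the update is the standard Oren-Luenberger choice; the paper's specific scaling and its PCG preconditioning are not reproduced here.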