Background/Objectives: Conjugate Gradient (CG) methods are well-known iterative methods used for finding solutions to systems of nonlinear equations. There is a need to address the jamming phenomenon affecting the current class of these methods. Methods/Statistical Analysis: To address this shortcoming, we modify the denominator of the Yao et al. CG method, which is known to generate descent directions for objective functions, by proposing an entirely different CG coefficient that can switch easily when jamming occurs; imposing certain parameters thereby guarantees global convergence. Findings: The proposed CG formula performs better than classical methods as well as that of Yao et al. The convergence analysis of the proposed formula was established under the Wolfe line search condition. Some benchmark problems from the CUTE collection are used as the basis for comparing the strength of the proposed formula against other CG formulas. The effectiveness and efficiency of the results obtained with the proposed formula are clearly shown by adopting the performance profile of Dolan and Moré, one of the most widely accepted techniques for comparing the strength of methods. Application: Mathematicians and engineers interested in solving large-scale nonlinear equations can apply the method, leading to global optimization and the best possible solutions for the given problems.
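To make the general scheme concrete, the following is a minimal sketch of a nonlinear CG iteration of the kind the abstract describes. The paper's own coefficient is not given here, so the sketch uses the classical Polak–Ribière-plus coefficient with a steepest-descent restart as an illustrative anti-jamming safeguard, and a simple backtracking (Armijo) line search as a stand-in for the Wolfe conditions used in the paper's analysis; the test function (Rosenbrock) is likewise an assumed example, not one of the paper's benchmark problems.

```python
import numpy as np

def rosenbrock(x):
    """Classical two-dimensional Rosenbrock test function."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    """Analytic gradient of the Rosenbrock function."""
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def nonlinear_cg(f, grad, x0, max_iter=2000, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search; the paper's analysis
        # instead assumes the Wolfe conditions.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g.dot(d)
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere-plus coefficient: an illustrative classical
        # choice; the proposed method uses a different coefficient
        # with its own switching rule when jamming occurs.
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:
            d = -g_new  # restart if the direction is not descent
        x, g = x_new, g_new
    return x

x_star = nonlinear_cg(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
```

The restart step is what prevents the iteration from stalling along nearly orthogonal directions, which is the jamming behaviour the proposed coefficient is designed to avoid by construction.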

Keywords

Conjugate Gradient, Descent Algorithm, Global Convergence, Line Search, Unconstrained Optimization.