
Global Convergence Analysis of a Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems


Affiliations
1 Department of Mathematical Sciences, Universiti Teknologi Malaysia (UTM), 81310 Johor Bahru, Johor, Malaysia
2 Department of Mathematics, Federal University Dutse (FUD), P.M.B. 7156, Dutse, Jigawa State, Nigeria
 

Abstract

Background/Objectives: Conjugate Gradient (CG) methods are well-known iterative methods used for finding solutions to systems of nonlinear equations. There is a need to address the jamming phenomenon affecting the current class of these methods. Methods/Statistical Analysis: To address this shortcoming, we modify the denominator of the CG method of Yao et al., which is known to generate descent directions for objective functions, by proposing an entirely different CG coefficient that, through the imposition of certain parameters, can switch whenever jamming occurs, thereby guaranteeing global convergence. Findings: The proposed CG formula performs better than the classical methods as well as that of Yao et al. The convergence analysis of the proposed formula was established under the Wolfe line search conditions. Benchmark problems from the CUTE collection are used as the basis for comparing the strength of the proposed formula against other CG formulas. The effectiveness and efficiency of the proposed formula are clearly demonstrated using the performance profile of Dolan and Moré, one of the most widely accepted techniques for comparing methods. Application: Mathematicians and engineers interested in solving large-scale nonlinear equations can apply the method for global optimization, obtaining the best possible solutions to given problems.
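The abstract describes the general nonlinear CG framework: iterate a line search along a direction built from the current gradient and a CG coefficient, restarting (switching back toward steepest descent) when jamming occurs. The paper's specific coefficient and its Wolfe line search are not reproduced here; the sketch below is a generic illustration in which the well-known PR+ coefficient and a simple Armijo backtracking line search stand in for those pieces.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Generic nonlinear CG sketch (not the paper's method).

    Uses the PR+ coefficient, whose max(0, .) truncation acts as the
    kind of anti-jamming switch the abstract describes, and an Armijo
    backtracking line search in place of the Wolfe conditions.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0.0:                    # safeguard: restart if d is
            d = -g                          # not a descent direction
        # Armijo backtracking line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, gTd = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * gTd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PR+ coefficient: truncation to 0 restarts the direction,
        # preventing the jamming seen in the plain PR formula
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, minimizing the convex quadratic f(x) = ½xᵀAx − bᵀx with this routine recovers the solution of Ax = b.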

Keywords

Conjugate Gradient, Descent Algorithm, Global Convergence, Line Search, Unconstrained Optimization.

Authors

Ibrahim Abdullahi
Department of Mathematical Sciences, Universiti Teknologi Malaysia (UTM), 81310 Johor Bahru, Johor, Malaysia
Rohanin Ahmad
Department of Mathematics, Federal University Dutse (FUD), P.M.B. 7156, Dutse, Jigawa State, Nigeria




DOI: https://doi.org/10.17485/ijst/2016/v9i41/125336