
A Review on Enhancements to Speed up Training of the Batch Back Propagation Algorithm


Affiliations
1 Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin, Terengganu, Malaysia
 

Objectives: This review examines the efficiency of several parameters for improving training time and accuracy in the batch back propagation (BP) algorithm. Methods: Researchers have applied many methods, including heuristic methods, flat-spot elimination, Fletcher-Powell and Quasi-Newton methods, to speed up training of the BP algorithm. Current heuristic methods cover two techniques. The first focuses on choosing a suitable value for the training rate and the momentum term, either together or individually. The second creates a dynamic training rate with a penalty term to avoid local minima. Findings: Whether training is slow or fast depends on how the weights are adjusted in the BP algorithm. The training rate and momentum term are significant parameters for controlling the weight update, but it is difficult to choose suitable values for them. If the weight adjustment is too small, the BP algorithm trains slowly; if the weights are over-adjusted, the algorithm trains faster but the training error oscillates. Either extreme of weight adjustment is unsuitable for learning in the BP algorithm. Existing studies do not address the relationship between the values of the training rate and momentum term and gross weight, and gross weight leads to training saturation or reduced training accuracy. This study suggests creating a dynamic training rate with a boundary, together with a momentum term, and establishing a relationship between them that keeps the weight adjustment moderate and avoids gross weight updates. Improvements: This study will guide researchers in creating a dynamic training rate and momentum term with an inverse relationship or boundary, so as to escape gross weight updates and maintain high training accuracy.
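The mechanism the abstract describes can be illustrated with a minimal sketch. The standard BP-with-momentum update is delta_w(t) = -eta * dE/dw + alpha * delta_w(t-1), where eta is the training rate and alpha the momentum term. The `bounded_eta` function below is a hypothetical illustration (not the authors' method) of a dynamic training rate clipped to a boundary so the weight change stays moderate, in the spirit of the suggestion above:

```python
import numpy as np

def bp_update(w, grad, prev_delta, eta, alpha):
    """One batch BP weight update with momentum:
    delta_w(t) = -eta * dE/dw + alpha * delta_w(t-1)."""
    delta = -eta * grad + alpha * prev_delta
    return w + delta, delta

def bounded_eta(grad, eta_max=0.5, eta_min=0.01):
    """Hypothetical dynamic training rate: eta shrinks as the gradient
    grows (an inverse relationship) and is clipped to [eta_min, eta_max]
    so the weight adjustment is never too large or too small."""
    return float(np.clip(1.0 / (1.0 + abs(grad)), eta_min, eta_max))

# Toy run on the quadratic error E(w) = 0.5 * w**2, so dE/dw = w:
w, delta = 2.0, 0.0
for _ in range(100):
    eta = bounded_eta(w)
    w, delta = bp_update(w, w, delta, eta, alpha=0.5)
# w moves toward the minimum at w = 0 without oscillating out of control
```

With a fixed eta chosen too large, the same loop produces an oscillating error, which is the failure mode the review attributes to over-adjusted weights.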

Keywords

Batch Back Propagation Algorithm, Local Minimum, Momentum Term, Speed Up Training, Training Rate.
Authors

Mohammed Sarhan Al_Duais
Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin, Terengganu, Malaysia
F. S. Mohamad
Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin, Terengganu, Malaysia





DOI: https://doi.org/10.17485/ijst/2016/v9i46/130243