A Novel Multi-Objective Moth-Flame Optimization Algorithm for Feature Selection
Objectives: A novel hybrid multi-objective swarm system is proposed to find the optimal feature subset: one that describes the data with minimal redundancy while preserving classification performance. Methods/Statistical Analysis: The complementary strengths of filter and wrapper approaches are merged across different phases of the optimization. The proposed system, based on a modified Moth-Flame Optimization (MFO) algorithm, is assessed against the original algorithm in experiments with both a single-objective MFO and a multi-objective MFO. The system was tested on 21 data sets from the UCI repository under a set of statistical assessment indicators. Findings: The experimental results demonstrate the capability of the hybrid multi-objective MFO-2 to adaptively search the feature space for the optimal feature subset with the highest mutual information and maximum classification accuracy, while tolerating the problems common to both wrapper-based and filter-based feature selection. In addition, with the Random Forests ensemble classifier, the hybrid system based on the modified MFO-2 was superior to the hybrid system based on the original MFO algorithm on 62% of the data sets. With the K-Nearest Neighbor classifier, the proposed hybrid multi-objective MFO-2 outperformed both the single-objective MFO-2 and the hybrid multi-objective MFO in classification performance while selecting a comparable number of features, confirming that it can find the optimal feature combination at a comparable subset size. Application/Improvements: The novel hybrid multi-objective MFO-2 adaptively searches the feature space and avoids the premature convergence caused by falling into local minima.
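To make the wrapper side of the approach concrete, the sketch below shows a heavily simplified, moth-flame-inspired binary search over feature subsets: candidate subsets ("moths") are perturbed toward the best subsets found so far ("flames"), and each subset is scored by leave-one-out 1-NN accuracy minus a small size penalty, a scalarized stand-in for the paper's multi-objective formulation. The data set, parameters, and update rule are illustrative assumptions, not the authors' actual MFO-2 implementation or UCI experiments.

```python
import random

# Toy data set (hypothetical): 2 informative features + 2 noise features.
random.seed(0)
X, y = [], []
for i in range(40):
    label = i % 2
    X.append([label + random.gauss(0, 0.3),    # informative
              -label + random.gauss(0, 0.3),   # informative
              random.gauss(0, 1.0),            # noise
              random.gauss(0, 1.0)])           # noise
    y.append(label)

def loo_1nn_accuracy(mask):
    """Leave-one-out 1-nearest-neighbour accuracy on the selected features."""
    feats = [j for j, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best_d, best_lab = float("inf"), None
        for k in range(len(X)):
            if k == i:
                continue
            d = sum((X[i][j] - X[k][j]) ** 2 for j in feats)
            if d < best_d:
                best_d, best_lab = d, y[k]
        correct += (best_lab == y[i])
    return correct / len(X)

def fitness(mask):
    # Wrapper objective: accuracy minus a small penalty on subset size.
    return loo_1nn_accuracy(mask) - 0.01 * sum(mask)

def mfo_style_search(n_feats=4, n_moths=6, iters=30):
    # Moths = candidate subsets; flames = best subsets found so far.
    moths = [[random.randint(0, 1) for _ in range(n_feats)]
             for _ in range(n_moths)]
    moths[0] = [1] * n_feats  # seed with the full feature set as a baseline
    flames = sorted(moths, key=fitness, reverse=True)
    for _ in range(iters):
        for i in range(n_moths):
            flame = flames[i % len(flames)]
            # The spiral move reduced to a binary rule: copy each flame bit
            # with high probability, otherwise flip it (exploration).
            moths[i] = [b if random.random() < 0.8 else 1 - b for b in flame]
        # Keep the best n_moths subsets seen so far as the new flames.
        flames = sorted(flames + moths, key=fitness, reverse=True)[:n_moths]
    return flames[0], fitness(flames[0])

best_mask, best_fit = mfo_style_search()
```

Because the full feature set is seeded into the population and flames are kept monotonically, the returned subset never scores worse than using all features, which mirrors the paper's point that a good wrapper search should at least match the unreduced baseline.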