In Neural Networks: The Official Journal of the International Neural Network Society
Feature selection is an important issue in machine learning and data mining. Most existing feature selection methods are greedy in nature and are therefore prone to suboptimality. Although some global feature selection methods based on unsupervised redundancy minimization can improve clustering performance, their efficacy for classification may be limited. In this paper, a neurodynamics-based holistic feature selection approach is proposed via feature redundancy minimization and relevance maximization. An information-theoretic similarity coefficient matrix is defined based on multi-information and entropy to measure feature redundancy with respect to class labels. Supervised feature selection is then formulated as a fractional programming problem based on the similarity coefficients. A neurodynamic approach based on two one-layer recurrent neural networks is developed to solve the formulated feature selection problem. Experimental results on eight benchmark datasets demonstrate the global convergence of the neural networks and the superiority of the proposed neurodynamic approach over several existing feature selection methods in terms of classification accuracy, precision, recall, and F-measure.
Wang Yadi, Li Xiaoping, Wang Jun
Feature selection, Fractional programming, Information-theoretic measures, Neurodynamic optimization
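The abstract's core ingredients, redundancy between features and relevance to class labels measured information-theoretically, can be illustrated with a minimal sketch. This is not the paper's exact multi-information similarity coefficient or its fractional-programming formulation; it is a simpler proxy in which both quantities are computed as pairwise mutual information over discrete data, and the function names (`entropy`, `mutual_information`, `redundancy_relevance`) are hypothetical:

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy (in nats) of a discrete sequence."""
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for two discrete sequences."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def redundancy_relevance(features, labels):
    """Return (R, r): a pairwise feature redundancy matrix R and a
    per-feature relevance vector r with respect to the class labels.
    Both use plain mutual information as an illustrative proxy for
    the paper's entropy/multi-information similarity coefficients."""
    d = len(features)
    R = [[mutual_information(features[i], features[j]) for j in range(d)]
         for i in range(d)]
    r = [mutual_information(f, labels) for f in features]
    return R, r
```

A holistic (non-greedy) selector would then seek a feature subset that jointly maximizes total relevance while minimizing total redundancy, rather than adding features one at a time; in the paper this trade-off is posed as a fractional program solved by recurrent neural networks.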