In Neural Networks: The Official Journal of the International Neural Network Society

Feature selection is a crucial part of machine learning and pattern recognition; it aims to select a subset of informative features from the original dataset. Because it exploits label information, supervised feature selection generally outperforms unsupervised feature selection. However, when only a small amount of labeled data is available together with a large amount of unlabeled data, it is challenging for supervised feature selection methods to select relevant features. In this paper, we propose three neurodynamics-driven holistic approaches to semi-supervised feature selection via semi-supervised feature redundancy minimization and semi-supervised feature relevancy maximization. We first define an information-theoretic semi-supervised similarity coefficient matrix and a semi-supervised feature relevancy vector based on multi-information, unsupervised symmetric uncertainty, and entropy to measure feature redundancy and relevancy. We then formulate a fractional programming problem and an iteratively weighted quadratic programming problem, based on the semi-supervised similarity coefficient matrix and the semi-supervised feature relevancy vector, for semi-supervised feature selection. To solve the formulated problems, we delineate three neurodynamic optimization approaches based on two projection neural networks. Experimental results on six benchmark datasets demonstrate the superior classification performance of the proposed neurodynamic approaches against six existing supervised and semi-supervised feature selection methods.
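
To make the information-theoretic ingredients concrete, the following is a minimal Python sketch of how a redundancy (similarity coefficient) matrix and a semi-supervised relevancy vector of the kind described above can be estimated from binned data, with a greedy mRMR-style selection step standing in for the fractional and iteratively weighted quadratic programs that the paper solves with projection neural networks. The equal-width binning, the blending weight `alpha`, and the function names are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def entropy(x, bins=10):
    """Shannon entropy (in nats) of a variable discretized into equal-width bins."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def mutual_information(x, y, bins=10):
    """Mutual information I(x; y) estimated from a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

def symmetric_uncertainty(x, y, bins=10):
    """SU(x, y) = 2 I(x; y) / (H(x) + H(y)), a normalized measure in [0, 1]."""
    hx, hy = entropy(x, bins), entropy(y, bins)
    return 0.0 if hx + hy == 0 else 2.0 * mutual_information(x, y, bins) / (hx + hy)

def redundancy_matrix(X, bins=10):
    """Pairwise-SU matrix playing the role of a similarity coefficient matrix."""
    d = X.shape[1]
    S = np.eye(d)
    for i in range(d):
        for j in range(i + 1, d):
            S[i, j] = S[j, i] = symmetric_uncertainty(X[:, i], X[:, j], bins)
    return S

def semi_supervised_relevancy(X, y_labeled, labeled_idx, alpha=0.5, bins=10):
    """Blend supervised relevancy (SU with labels, on labeled rows only) with an
    unsupervised proxy (mean SU with the other features, on all rows).
    `alpha` is a hypothetical trade-off weight, not a quantity from the paper."""
    d = X.shape[1]
    rel = np.zeros(d)
    for j in range(d):
        sup = symmetric_uncertainty(X[labeled_idx, j], y_labeled, bins)
        unsup = np.mean([symmetric_uncertainty(X[:, j], X[:, k], bins)
                         for k in range(d) if k != j])
        rel[j] = alpha * sup + (1.0 - alpha) * unsup
    return rel

def greedy_select(rel, S, k):
    """Greedy trade-off between relevancy maximization and redundancy minimization."""
    selected = [int(np.argmax(rel))]
    while len(selected) < k:
        rest = [j for j in range(len(rel)) if j not in selected]
        scores = [rel[j] - np.mean(S[j, selected]) for j in rest]
        selected.append(rest[int(np.argmax(scores))])
    return selected
```

For example, `greedy_select(semi_supervised_relevancy(X, y_labeled, labeled_idx), redundancy_matrix(X), k=20)` would return the indices of 20 features that score high in (semi-supervised) relevancy while staying mutually non-redundant; the neurodynamic approaches in the paper replace this greedy step with optimization dynamics that solve the formulated fractional and quadratic programs.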

Wang Yadi, Wang Jun

2022-Nov-03

Fractional programming, Information-theoretic measures, Neurodynamic optimization, Semi-supervised feature selection