In IEEE transactions on cybernetics

Feature selection (FS) is an important step in machine learning since it has been shown to improve prediction accuracy while mitigating the curse of dimensionality in high-dimensional data. Neural networks have experienced tremendous success in solving many nonlinear learning problems. Here, we propose a new neural-network-based FS approach that introduces two constraints, the satisfaction of which leads to a sparse FS layer. We performed extensive experiments on synthetic and real-world data to evaluate the performance of the proposed FS method. In the experiments, we focused on high-dimensional, low-sample-size data, since they represent the main challenge for FS. The results confirm that the proposed FS method based on a sparse neural-network layer with normalizing constraints (SNeL-FS) is able to select the important features and yields superior performance compared to other conventional FS methods.
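The abstract does not specify the two constraints, but the general idea of a sparse, normalized FS layer can be illustrated with a minimal sketch: a per-feature gating vector whose weights are kept non-negative and normalized to sum to one (one plausible normalizing constraint), with a low softmax temperature driving most gates toward zero. The function name, the temperature parameter, and the softmax choice are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def feature_gate(X, scores, temperature=0.1):
    """Hypothetical sparse feature-selection gate (illustration only).

    `scores` plays the role of learnable per-feature logits. A
    low-temperature softmax normalizes the gates to sum to one and
    concentrates mass on a few features, approximating a sparse
    FS layer in the spirit described by the abstract.
    """
    w = np.exp(scores / temperature)
    w = w / w.sum()          # gates are non-negative and sum to 1
    return X * w, w          # elementwise gating of the input features

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))              # 5 samples, 4 candidate features
scores = np.array([2.0, 0.1, 0.0, -1.0]) # one feature clearly dominates
Xg, w = feature_gate(X, scores)
print(np.round(w, 3))                    # nearly all mass on feature 0
```

In a real network these gates would sit as a trainable first layer, with the constraints enforced during optimization rather than by a fixed softmax.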

Peter Bugata, Peter Drotar

2021-Jul-08