
In Neural Networks: The Official Journal of the International Neural Network Society

Rectified activation units make an important contribution to the success of deep neural networks in many computer vision tasks. In this paper, we propose a Parametric Deformable Exponential Linear Unit (PDELU) and theoretically verify its effectiveness in improving the convergence speed of the learning procedure. By means of its flexible shape, the proposed PDELU pushes the mean value of the activation responses closer to zero, which ensures the steepest descent when training a deep neural network. We verify the effectiveness of the proposed method on the image classification task. Extensive experiments on three classical datasets (i.e., CIFAR-10, CIFAR-100, and ImageNet-2015) indicate that the proposed method achieves faster convergence and higher accuracy when embedded into different CNN architectures (i.e., NIN, ResNet, WRN, and DenseNet). Meanwhile, the proposed PDELU outperforms many existing shape-specific activation functions (i.e., Maxout, ReLU, LeakyReLU, ELU, SELU, SoftPlus, Swish) and shape-adaptive activation functions (i.e., APL, PReLU, MPELU, FReLU).
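The abstract does not spell out the functional form, but the name suggests an ELU-style unit built on the deformed (t-)exponential, exp_t(x) = [1 + (1 - t)x]_+^{1/(1-t)}, with a learnable per-channel scale alpha. The PyTorch sketch below is an illustration under that assumption; the class name PDELU, the deformation parameter t, and the per-channel alpha are choices made here for the sketch, not confirmed details from the paper.

```python
import torch
import torch.nn as nn

class PDELU(nn.Module):
    """Minimal sketch of a Parametric Deformable Exponential Linear Unit.

    Identity for x > 0; for x <= 0, a deformed exponential
    exp_t(x) = [1 + (1 - t) * x]_+ ** (1 / (1 - t)), shifted by -1 and
    scaled by a learnable per-channel alpha. Assumed form; the exact
    parameterization in the paper may differ.
    """

    def __init__(self, num_channels: int, t: float = 1.1, alpha_init: float = 1.0):
        super().__init__()
        assert t != 1.0, "t = 1 degenerates to the ordinary exponential"
        self.t = t
        # One learnable scale per channel, PReLU-style.
        self.alpha = nn.Parameter(torch.full((num_channels,), alpha_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast alpha over (N, C, ...) inputs.
        shape = (1, -1) + (1,) * (x.dim() - 2)
        alpha = self.alpha.view(shape)
        # Apply the deformed exponential only to the negative part, so the
        # base 1 + (1 - t) * x stays >= 1 for t > 1 (clamp guards t < 1).
        x_neg = torch.clamp(x, max=0.0)
        base = torch.clamp(1.0 + (1.0 - self.t) * x_neg, min=1e-6)
        neg = alpha * (base ** (1.0 / (1.0 - self.t)) - 1.0)
        return torch.where(x > 0.0, x, neg)

# Example: drop-in replacement for ReLU after a conv layer.
layer = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), PDELU(num_channels=16))
out = layer(torch.randn(4, 3, 32, 32))  # shape: (4, 16, 32, 32)
```

As x goes to negative infinity the output saturates at -alpha, so learning alpha per channel lets the network control how far the negative response reaches below zero, which is the mechanism the abstract credits with pushing mean activations toward zero.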

Cheng Qishang, Li Hongliang, Wu Qingbo, Ma Lei, Ngan King Ngi

2020-Feb-26

Deep learning, Deformable exponential, Image classification, Rectified activation