
In Neural Networks: The Official Journal of the International Neural Network Society

This paper investigates the approximation properties of deep neural networks with piecewise-polynomial activation functions. We derive the depth, width, and sparsity a deep neural network requires to approximate any Hölder smooth function up to a given error in Hölder norms, while keeping all weights of the network bounded by 1. The latter property is essential for controlling generalization errors in many statistical and machine learning applications.
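As a minimal illustrative sketch (not the authors' construction): the ReLU(k) family of piecewise-polynomial activations named in the keywords is x ↦ max(0, x)^k, with k = 2 giving ReQU, and the bounded-weight condition can be mimicked by clipping layer weights to [-1, 1]. All names below are hypothetical.

```python
import numpy as np

def relu_k(x, k=2):
    """ReLU(k) activation: max(0, x)**k; k=2 is ReQU (rectified quadratic unit)."""
    return np.maximum(0.0, x) ** k

# One hypothetical fully-connected layer whose weights are clipped to [-1, 1],
# mirroring the bounded-weight constraint described in the abstract.
rng = np.random.default_rng(0)
W = np.clip(rng.normal(size=(4, 3)), -1.0, 1.0)
b = np.clip(rng.normal(size=3), -1.0, 1.0)

x = np.array([0.5, -1.0, 2.0, 0.1])
h = relu_k(x @ W + b, k=2)  # hidden representation under a ReQU activation
```

Clipping is only a stand-in here; the paper constructs networks whose weights satisfy the bound by design rather than by post-hoc truncation.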

Denis Belomestny, Alexey Naumov, Nikita Puchkin, Sergey Samsonov

2023-Feb-02

Approximation complexity, Deep neural networks, Hölder class, ReLU(k) activations, ReQU activations