
In IEEE Transactions on Neural Networks and Learning Systems

Training deep neural networks (DNNs) typically requires massive computational power. Existing DNNs exhibit low time and storage efficiency due to their high degree of redundancy. In contrast, biological and social networks with vast numbers of connections are highly efficient and exhibit scale-free properties indicative of a power law distribution, which can originate from preferential attachment in growing networks. In this work, we ask whether the topology of the best-performing DNNs follows a power law similar to biological and social networks, and how power law topology can be used to construct well-performing and compact DNNs. We first find that the connectivity of sparse DNNs can be modeled by a truncated power law distribution, one of the variations of the power law. Comparing different DNNs reveals that the best-performing networks correlate strongly with the power law distribution. We further model preferential attachment in DNN evolution and find that continual learning in networks that grow with tasks correlates with the process of preferential attachment. These identified power law dynamics in DNNs enable the construction of highly accurate and compact DNNs based on preferential attachment. Inspired by these findings, we propose two novel applications: evolving optimal DNNs in sparse network generation, and continual learning with efficient network growth driven by power law dynamics. Experimental results indicate that the proposed applications speed up training, save storage, and learn with fewer samples than other well-established baselines. Our demonstration of preferential attachment and power laws in well-performing DNNs offers insight into designing and constructing more efficient deep learning models.
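The preferential-attachment mechanism the abstract invokes can be illustrated with a minimal sketch (not the paper's method): a Barabási–Albert-style growth process in which each new node links to existing nodes with probability proportional to their degree, yielding the heavy-tailed, hub-dominated connectivity characteristic of power law networks. All function and parameter names below are illustrative assumptions.

```python
import random

def grow_preferential(n_nodes, m=2, seed=0):
    """Grow a network by preferential attachment (illustrative sketch):
    each new node links to m existing nodes chosen with probability
    proportional to their current degree."""
    rng = random.Random(seed)
    # Start from a small fully connected seed of m+1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i + 1, m + 1)]
    # "targets" lists each node once per incident edge, so uniform
    # sampling from it is degree-proportional sampling.
    targets = [v for e in edges for v in e]
    for new in range(m + 1, n_nodes):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets.extend((new, t))
    return edges

edges = grow_preferential(2000, m=2)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
# Heavy tail: a few early hubs accumulate far more links than a typical node,
# while most nodes keep a degree close to m.
print(max(degree.values()), sorted(degree.values())[len(degree) // 2])
```

The paper's applications analogize DNN connections to such edges: sparse networks whose connectivity follows a (truncated) power law, grown rather than pruned from a dense network.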

Fan Feng, Lu Hou, Qi She, Rosa H. M. Chan, James T. Kwok

2022-Nov-07