
In IEEE Transactions on Neural Networks and Learning Systems

Composite function optimization problems often arise in machine learning, where they are known as regularized empirical risk minimization. We introduce an acceleration technique into the Newton-type proximal method and propose a novel algorithm called the accelerated proximal subsampled Newton method (APSSN). APSSN subsamples only a small subset of samples to construct an approximate Hessian, which keeps the computation efficient while still retaining a fast convergence rate. Furthermore, we obtain the scaled proximal mapping by solving its dual problem with the semismooth Newton method instead of resorting to first-order methods. Owing to our sampling strategy and the fast convergence rate of the semismooth Newton method, the scaled proximal mapping can be computed efficiently. Both our theoretical analysis and empirical study show that APSSN is an effective and computationally efficient algorithm for composite function optimization problems.
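To make the setting concrete, the composite problem has the form min_w f(w) + g(w), where f is a smooth data-fitting term (an average over samples) and g is a possibly nonsmooth regularizer. Below is a minimal NumPy sketch of a plain, non-accelerated subsampled proximal Newton loop on an l1-regularized logistic regression instance. All function names, the batch size, and the inner proximal-gradient solver for the scaled proximal mapping are illustrative assumptions; the sketch does not reproduce APSSN's acceleration scheme or its semismooth Newton dual solver.

```python
import numpy as np

def logistic_loss_grad(X, y, w):
    """Smooth part f(w): average logistic loss with labels y in {-1, +1}."""
    z = X @ w
    grad = -(X.T @ (y / (1.0 + np.exp(y * z)))) / len(y)
    return grad

def subsampled_hessian(X, w, batch_idx, ridge=1e-8):
    """Approximate Hessian of f built from a small subsample of rows."""
    Xs = X[batch_idx]
    p = 1.0 / (1.0 + np.exp(-Xs @ w))
    D = p * (1.0 - p)                       # logistic curvature weights
    H = (Xs.T * D) @ Xs / len(batch_idx)
    return H + ridge * np.eye(X.shape[1])   # small ridge keeps H invertible

def scaled_prox_l1(z, H, lam, n_inner=200):
    """Scaled proximal mapping argmin_u lam*||u||_1 + 0.5*(u-z)^T H (u-z).
    Solved here by a simple proximal-gradient inner loop as a stand-in;
    the paper instead solves the dual with a semismooth Newton method."""
    u = z.copy()
    step = 1.0 / np.linalg.norm(H, 2)       # 1 / Lipschitz constant of the quadratic
    for _ in range(n_inner):
        v = u - step * (H @ (u - z))
        u = np.sign(v) * np.maximum(np.abs(v) - step * lam, 0.0)  # soft-threshold
    return u

def proximal_subsampled_newton(X, y, lam, n_iter=20, batch_size=256, seed=0):
    """Generic subsampled proximal Newton iteration (no acceleration, no line search)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = logistic_loss_grad(X, y, w)
        idx = rng.choice(n, size=min(batch_size, n), replace=False)
        H = subsampled_hessian(X, w, idx)
        # Newton-type step: scaled prox of g at w - H^{-1} grad, in the metric induced by H
        z = w - np.linalg.solve(H, grad)
        w = scaled_prox_l1(z, H, lam)
    return w

# Example usage on synthetic data (assumed setup, for illustration only)
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 50))
y = np.sign(X @ rng.standard_normal(50) + 0.1 * rng.standard_normal(2000))
w_hat = proximal_subsampled_newton(X, y, lam=0.01)
```

The key cost saving the abstract points to is visible in `subsampled_hessian`: the curvature matrix is formed from a small batch rather than all n samples, while the gradient still uses the full data, so each outer step stays cheap without giving up the Newton-type metric used in the scaled proximal mapping.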

Ye Haishan, Luo Luo, Zhang Zhihua

2020-Sep-09