In Optics Express

Spatial-frequency shift (SFS) imaging microscopy can break the diffraction limit for both fluorescently labeled and label-free samples by transferring high spatial-frequency information into the passband of the microscope. However, the resolution improvement comes at the cost of reduced temporal resolution, since dozens of raw SFS images are needed to expand the frequency spectrum. Although some deep learning methods have been proposed to address this problem, no neural network compatible with both labeled and label-free SFS imaging has been reported. Here, we propose the joint spatial-Fourier channel attention network (JSFCAN), which learns the general connection between the spatial domain and the Fourier frequency domain from complex samples. We demonstrate that JSFCAN can achieve a resolution similar to the traditional algorithm using nearly 1/4 of the raw images and increase the reconstruction speed by two orders of magnitude. We then show that JSFCAN can be applied to both fluorescently labeled and label-free samples without architecture changes. We also demonstrate that, compared with the typical spatial-domain optimization network U-net, JSFCAN is more robust when handling deep-SFS images and noisy images. The proposed JSFCAN provides an alternative route for fast SFS imaging reconstruction, enabling future applications in real-time living-cell research.
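The abstract does not spell out the network internals, but the name suggests parallel spatial and Fourier-domain feature branches modulated by channel attention. The sketch below is a minimal, hypothetical PyTorch block in that spirit: the layer sizes, the magnitude-spectrum Fourier branch, the squeeze-and-excitation style gating, and the residual fusion are all illustrative assumptions, not the published JSFCAN architecture.

```python
import torch
import torch.nn as nn


class SpatialFourierChannelAttention(nn.Module):
    """Hypothetical joint spatial/Fourier block with channel attention.
    All design choices here are guesses for illustration only."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.spatial_conv = nn.Conv2d(channels, channels, 3, padding=1)
        # Fourier branch: operate on the magnitude spectrum so the features stay real-valued.
        self.fourier_conv = nn.Conv2d(channels, channels, 1)
        # Squeeze-and-excitation style channel attention over the concatenated branches.
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, 2 * channels, 1),
            nn.Sigmoid(),
        )
        self.fuse = nn.Conv2d(2 * channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        spatial = self.spatial_conv(x)
        # 2-D FFT of each feature map; the real network may handle real/imaginary parts differently.
        spectrum = torch.fft.fft2(x, norm="ortho")
        fourier = self.fourier_conv(torch.abs(spectrum))
        joint = torch.cat([spatial, fourier], dim=1)
        weights = self.attention(joint)        # per-channel gates for both branches
        joint = joint * weights
        return x + self.fuse(joint)            # residual connection back to the input


if __name__ == "__main__":
    block = SpatialFourierChannelAttention(channels=16)
    raw_stack = torch.randn(1, 16, 64, 64)     # e.g. features extracted from a few raw SFS frames
    print(block(raw_stack).shape)              # torch.Size([1, 16, 64, 64])
```

In this reading, the Fourier branch gives the network direct access to the frequency spectrum that SFS reconstruction manipulates, while the channel attention weights how much each spatial and spectral feature contributes, which is one plausible way a single architecture could serve both labeled and label-free data.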

Zhang Qianwei, Liang Chenhui, Tang Mingwei, Yang Xiaoyu, Lin Muchun, Han Yubing, Liu Xu, Yang Qing

2023-Jan-30