In Cerebral cortex (New York, N.Y. : 1991)

Deep learning has become an effective tool for classifying biological sex from functional magnetic resonance imaging (fMRI). However, research on which features within the brain are most relevant to this classification is still lacking. Model interpretability has become a powerful way to understand "black box" deep-learning models and to select the features within the input data that are most relevant to a correct classification. However, very little work has employed these methods to understand the relationship between the temporal dimension of functional imaging signals and the classification of biological sex. Consequently, little attention has been paid to rectifying problems and limitations associated with feature-explanation models, e.g., underspecification and instability. In this work, we first provide a methodology that limits the impact of underspecification on the stability of the measured feature importance. Then, using intrinsic connectivity networks from fMRI data, we provide a deep exploration of sex differences among functional brain networks. We report numerous findings, including activity differences in the visual and cognitive domains and major connectivity differences.
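The abstract does not spell out the authors' procedure, but a common way to probe how underspecification affects explanation stability is to retrain the same architecture under several random seeds (yielding equally plausible models), compute a feature-importance map from each, and quantify how much the maps agree. The sketch below illustrates that idea with plain input-gradient saliency in PyTorch; the network, the input dimensions (treated here as intrinsic connectivity networks x time points), and the mean pairwise-correlation stability score are illustrative assumptions, not the paper's method.

# A minimal sketch (not the authors' code): estimating the stability of
# gradient-based feature importance across models that differ only in
# random seed. Architecture, input shape, and stability score are assumptions.
import torch
import torch.nn as nn

N_ICN, N_TIME = 53, 140  # assumed: intrinsic connectivity networks x time points

def make_model(seed: int) -> nn.Module:
    torch.manual_seed(seed)  # a different seed gives an equally plausible model
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(N_ICN * N_TIME, 64),
        nn.ReLU(),
        nn.Linear(64, 2),    # two output classes: biological sex
    )

def saliency(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Input-gradient importance map for the model's predicted class."""
    x = x.detach().clone().requires_grad_(True)
    logits = model(x)
    logits[0, logits.argmax()].backward()
    return x.grad.abs().squeeze(0)

# In practice each model would first be trained to comparable accuracy
# (training loop omitted); only the stability computation is shown here.
x = torch.randn(1, N_ICN, N_TIME)            # placeholder for one fMRI-derived sample
maps = torch.stack([saliency(make_model(s), x) for s in range(5)])
maps = maps / maps.sum(dim=(1, 2), keepdim=True)  # normalize each importance map

# Stability score: mean pairwise correlation between the flattened maps.
corr = torch.corrcoef(maps.flatten(1))
off_diag = corr[~torch.eye(len(maps), dtype=torch.bool)]
print(f"mean pairwise importance-map correlation: {off_diag.mean():.3f}")

A low mean correlation under this kind of check would indicate that the importance maps are underspecified by the training setup, which is the instability the abstract's methodology aims to limit.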

Noah Lewis, Robyn Miller, Harshvardhan Gazula, Vince Calhoun

2023-Feb-24

brain connectivity, deep learning, model interpretability, neuroimaging, sex differences