
In Scientific Reports; h5-index 158.0

Machine learning has been applied in recent years to categorize sleep stages (NREM, REM, and wake) from electroencephalogram (EEG) recordings; however, a well-validated automatic sleep scoring pipeline for rodent research is still not publicly available. Here, we present IntelliSleepScorer, a software package with a graphical user interface that scores sleep stages automatically in mice. IntelliSleepScorer uses the light gradient boosting machine (LightGBM) to score the sleep stage of each epoch of a recording. We developed the LightGBM models on a large cohort of data consisting of 5776 h of sleep EEG and electromyogram (EMG) signals across 519 unique recordings from 124 mice. The LightGBM model achieved an overall accuracy of 95.2% and a Cohen's kappa of 0.91, outperforming baseline models such as logistic regression (accuracy = 93.3%, kappa = 0.88) and random forest (accuracy = 94.3%, kappa = 0.89). The overall performance of the LightGBM model, as well as its performance on each sleep stage, is on par with that of human experts. Most importantly, we validated the generalizability of the LightGBM models: (1) the LightGBM model performed well on two publicly available, independent datasets (kappa >= 0.80) that use different sampling frequencies and epoch lengths; (2) the LightGBM model performed well on data recorded at a lower sampling frequency (kappa = 0.90); (3) the performance of the LightGBM model is not affected by the light/dark cycle; and (4) a modified LightGBM model performed well on data containing only one EEG and one EMG electrode (kappa >= 0.89). Taken together, the LightGBM models offer state-of-the-art performance for automatic sleep stage scoring in mice. Finally, we implemented the IntelliSleepScorer software package based on the validated model to provide an out-of-the-box solution for sleep researchers (available for download at https://sites.broadinstitute.org/pan-lab/resources).
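The abstract reports Cohen's kappa alongside raw accuracy because kappa corrects epoch-level agreement for chance: kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e the agreement expected if both scorers labeled epochs independently according to their marginal stage frequencies. A minimal NumPy sketch of that computation (the function name is ours; in practice one would use an established implementation such as scikit-learn's `cohen_kappa_score`):

```python
import numpy as np

def cohens_kappa(y_true, y_pred):
    """Cohen's kappa between two label sequences (e.g. per-epoch sleep stages)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    labels = np.union1d(y_true, y_pred)
    po = np.mean(y_true == y_pred)                        # observed agreement
    pe = sum(np.mean(y_true == c) * np.mean(y_pred == c)  # chance agreement
             for c in labels)
    return (po - pe) / (1 - pe)
```

For example, two scorers who agree on 3 of 4 epochs can still have kappa well below 0.75 if one stage dominates, which is why kappa is the more informative summary for imbalanced sleep-stage data.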

Lei A. Wang, Ryan Kern, Eunah Yu, Soonwook Choi, Jen Q. Pan

2023-Mar-15