
General

Accurate multistage prediction of protein crystallization propensity using deep-cascade forest with sequence-based features.

In Briefings in bioinformatics

X-ray crystallography is the major approach for determining atomic-level protein structures. Because not all proteins can be easily crystallized, accurate prediction of protein crystallization propensity provides critical help in guiding experimental design and improving the success rate of X-ray crystallography experiments. This study developed a new machine-learning-based pipeline that uses a newly developed deep-cascade forest (DCF) model with multiple types of sequence-based features to predict protein crystallization propensity. Based on the developed pipeline, two new protein crystallization propensity predictors, denoted DCFCrystal and MDCFCrystal, have been implemented. DCFCrystal is a multistage predictor that can estimate the success propensities of the three individual steps (production of protein material, purification and production of crystals) in the protein crystallization process. MDCFCrystal is a single-stage predictor that aims to estimate the probability that a protein will pass through the entire crystallization process. Moreover, DCFCrystal is designed for general proteins, whereas MDCFCrystal is specially designed for membrane proteins, which are notoriously difficult to crystallize. DCFCrystal and MDCFCrystal were separately tested on two benchmark datasets consisting of 12,289 and 950 proteins, respectively, with known crystallization results from various experimental records. The experimental results demonstrated that DCFCrystal and MDCFCrystal increased the value of the Matthews correlation coefficient by 199.7% and 77.8%, respectively, compared to the best of the other state-of-the-art protein crystallization propensity predictors.
Detailed analyses show that the major advantages of DCFCrystal and MDCFCrystal lie in the efficiency of the DCF model and the sensitivity of the sequence-based features used, especially the newly designed pseudo-predicted hybrid solvent accessibility (PsePHSA) feature, which improves crystallization recognition by incorporating sequence-order information with solvent accessibility of residues. Meanwhile, the new crystal-dataset constructions help to train the models with more comprehensive crystallization knowledge.
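To make the DCF idea concrete, here is a minimal sketch of a cascade-forest-style classifier in the spirit of deep forests, not the authors' DCF implementation: each level trains an ensemble of forests, and the class-probability vectors they emit are concatenated to the original features before the next level is trained. All data and hyperparameters below are illustrative.

```python
# Illustrative cascade-forest sketch (deep-forest style), assuming scikit-learn.
# Each level augments the original features with the previous level's
# class-probability outputs, mimicking the layer-by-layer cascade.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier


def train_cascade(X, y, n_levels=3, seed=0):
    levels, aug = [], X
    for level in range(n_levels):
        forests = [
            RandomForestClassifier(n_estimators=50, random_state=seed + level),
            ExtraTreesClassifier(n_estimators=50, random_state=seed + level),
        ]
        for f in forests:
            f.fit(aug, y)
        probs = np.hstack([f.predict_proba(aug) for f in forests])
        aug = np.hstack([X, probs])  # augmented input for the next level
        levels.append(forests)
    return levels


def predict_cascade(levels, X):
    aug = X
    for forests in levels:
        probs = np.hstack([f.predict_proba(aug) for f in forests])
        aug = np.hstack([X, probs])
    # average the last level's per-forest class probabilities, then argmax
    return probs.reshape(len(X), len(levels[-1]), -1).mean(axis=1).argmax(axis=1)


# toy, well-separated two-class data standing in for sequence-based features
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(3, 1, (50, 8))])
y = np.array([0] * 50 + [1] * 50)
model = train_cascade(X, y)
pred = predict_cascade(model, X)
```

In the real pipeline the input features would be the sequence-derived descriptors (e.g. PsePHSA) rather than random vectors, and the cascade depth would be chosen by validation performance.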

Zhu Yi-Heng, Hu Jun, Ge Fang, Li Fuyi, Song Jiangning, Zhang Yang, Yu Dong-Jun


bioinformatics, deep-cascade forest, predictor, protein crystallization propensity, sequence-based feature

Radiology

A review on the use of artificial intelligence for medical imaging of the lungs of patients with coronavirus disease 2019.

In Diagnostic and interventional radiology (Ankara, Turkey)

The results of research on the use of artificial intelligence (AI) for medical imaging of the lungs of patients with coronavirus disease 2019 (COVID-19) have been published in various forms. In this study, we reviewed the use of AI for diagnostic imaging of COVID-19 pneumonia. PubMed, arXiv, medRxiv, and Google Scholar were used to search for AI studies. There were 15 studies of COVID-19 that used AI for medical imaging. Of these, 11 studies used AI for computed tomography (CT) and 4 used AI for chest radiography. Eight studies presented independent test data, 5 used disclosed data, and 4 disclosed the AI source code. Dataset sizes ranged from 106 to 5941, with sensitivities ranging from 0.67 to 1.00 and specificities ranging from 0.81 to 1.00 for prediction of COVID-19 pneumonia. Four studies with independent test datasets showed a breakdown of the data ratio and reported prediction of COVID-19 pneumonia with sensitivity, specificity, and area under the curve (AUC). These 4 studies showed very high sensitivity, specificity, and AUC, in the ranges of 0.9-0.98, 0.91-0.96, and 0.96-0.99, respectively.
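As a reminder of what the reported numbers mean, sensitivity and specificity follow directly from the confusion-matrix counts. The counts below are hypothetical, purely to illustrate the arithmetic behind the ranges quoted in the review:

```python
# Sensitivity and specificity from confusion-matrix counts.
# tp/fn/tn/fp values here are made up for illustration.
def sensitivity(tp, fn):
    """Fraction of true COVID-19 pneumonia cases the model flags."""
    return tp / (tp + fn)


def specificity(tn, fp):
    """Fraction of non-COVID cases the model correctly clears."""
    return tn / (tn + fp)


tp, fn, tn, fp = 90, 10, 95, 5  # hypothetical test-set counts
print(f"sensitivity = {sensitivity(tp, fn):.2f}")  # 0.90
print(f"specificity = {specificity(tn, fp):.2f}")  # 0.95
```

A study reporting sensitivity 0.90 and specificity 0.95 on a 200-case test set would correspond to roughly these counts; AUC additionally summarizes performance across all decision thresholds.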

Ito Rintaro, Iwano Shingo, Naganawa Shinji


General

Global burden of sleep-disordered breathing and its implications.

In Respirology (Carlton, Vic.)

One-seventh of the world's adult population, or approximately one billion people, is estimated to have OSA. Over the past four decades, obesity, the main risk factor for OSA, has risen in striking proportions worldwide. In the past 5 years, the WHO estimates global obesity to affect almost two billion adults. A second major risk factor for OSA is advanced age. As the ageing population and the prevalence of obesity increase, so does vulnerability to OSA. In addition to these traditional OSA risk factors, studies of the global population reveal select contributing features and phenotypes, including extreme phenotypes and symptom clusters, that deserve further examination. Untreated OSA is associated with significant comorbidities and mortality, which represent a tremendous threat to the individual and to global health. Beyond the personal toll, the economic costs of OSA are far-reaching, affecting the individual, family and society directly and indirectly, in terms of productivity and public safety. A better understanding of the pathophysiology and of individual and ethnic similarities and differences is needed to better facilitate management of this chronic disease. In some countries, measures of the OSA disease burden are sparse. As the global burden of OSA and its associated comorbidities is projected to increase further, the infrastructure to diagnose and manage OSA will need to adapt. Novel approaches (electronic health records and artificial intelligence) to stratify risk, diagnose and guide treatment are necessary. Together, a unified multidisciplinary, multi-organizational, global approach will be needed to manage this disease.

Lyons M Melanie, Bhatt Nitin Y, Pack Allan I, Magalang Ulysses J


economics, global burden, obesity, obstructive sleep apnoea, risk factors

Pathology

Deep learning-guided joint attenuation and scatter correction in multitracer neuroimaging studies.

In Human brain mapping

PET attenuation correction (AC) on systems lacking CT/transmission scanning, such as dedicated brain PET scanners and hybrid PET/MRI, is challenging. Direct AC in image space, wherein PET images corrected for attenuation and scatter are synthesized from non-attenuation-corrected PET (PET-nonAC) images in an end-to-end fashion using deep learning approaches (DLAC), was evaluated for various radiotracers used in molecular neuroimaging studies. One hundred eighty brain PET scans acquired using 18F-FDG, 18F-DOPA, 18F-Flortaucipir (targeting tau pathology), and 18F-Flutemetamol (targeting amyloid pathology) radiotracers (40 training/validation + 5 external test subjects for each radiotracer) were included. The PET data were reconstructed using CT-based AC (CTAC) to generate reference PET-CTAC images and without AC to produce PET-nonAC images. A deep convolutional neural network was trained to generate attenuation-corrected images (PET-DLAC) from PET-nonAC. The quantitative accuracy of this approach was investigated separately for each radiotracer, taking the values obtained from PET-CTAC images as reference. A segmented AC map (PET-SegAC) containing soft tissue and background air was also included in the evaluation. Quantitative analysis of PET images demonstrated superior performance of the DLAC approach compared to the SegAC technique for all tracers. Despite the relatively low quantitative bias observed when using the DLAC approach, it appears vulnerable to outliers, resulting in noticeable local pseudo uptake and false cold regions. Direct AC in image space using deep learning demonstrated quantitatively acceptable performance, with less than 9% absolute SUV bias for the four investigated neuroimaging radiotracers. However, the approach is vulnerable to outliers, which result in large local quantitative bias.
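The headline metric, absolute SUV bias of a corrected image against the CT-based reference, can be sketched as below. The arrays are synthetic stand-ins for PET-DLAC and PET-CTAC volumes, and the function name and masking convention are assumptions for illustration, not the authors' evaluation code:

```python
# Sketch of region-wise quantitative evaluation: mean absolute SUV bias (%)
# of a test image (e.g. PET-DLAC) against a reference (e.g. PET-CTAC).
import numpy as np


def absolute_suv_bias_percent(pet_test, pet_ref, mask=None):
    """Mean absolute relative difference (%) inside an optional ROI mask."""
    if mask is None:
        mask = pet_ref > 0  # ignore background (zero-activity) voxels
    rel = (pet_test[mask] - pet_ref[mask]) / pet_ref[mask]
    return 100.0 * np.mean(np.abs(rel))


rng = np.random.default_rng(1)
ref = rng.uniform(1.0, 4.0, size=(16, 16, 16))      # synthetic reference SUVs
test = ref * (1 + rng.normal(0, 0.05, ref.shape))   # corrected image, ~5% error
bias = absolute_suv_bias_percent(test, ref)          # well under the 9% bound
```

In practice the mask would be an anatomical ROI (or the whole brain), and the same metric computed per tracer yields the sub-9% figures reported above.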

Arabi Hossein, Bortolin Karin, Ginovart Nathalie, Garibotto Valentina, Zaidi Habib


PET, attenuation correction, deep learning, neuroimaging tracers, quantification

Public Health

Expansion of the dimensions in the current management of acute ischemic stroke.

In Journal of neurology

Stroke is the fifth leading cause of death in the United States, with a huge burden on health care. Acute ischemic stroke (AIS) accounts for 87% of all strokes. Thrombolytic agents for AIS treatment have been known since the 1950s, but FDA approval did not come until 1996, owing to a lack of strong evidence that the benefits outweigh the risk of intracranial hemorrhage. The NINDS trial led to the approval of intravenous tissue plasminogen activator (IV recombinant tPA) treatment within 3 h of stroke. Because of this limited 3-4.5 h window, development of effective endovascular therapy (EVT) began. Multiple trials were unsuccessful in establishing strong evidence for the effectiveness of EVT. In 2015, the MR CLEAN trial made progress and showed improved outcomes with EVT in AIS patients with large vessel occlusion (LVO) within a 6-h window. In 2018, two major trials, DAWN and DEFUSE 3, along with a few other trials, showed improved outcomes with EVT and stretched the window from 6 to 24 h. The AHA Stroke Council has been working to provide focused guidelines and recommendations in AIS management since 2013. SVIN started the initiative "Mission Thrombectomy-2020" to increase the global EVT utilization rate to 202,000 procedures by 2020. Physicians are using safer and easier approaches for EVT, such as brachial and radial access. TeleNeurology and artificial intelligence have also played a significant role in increasing the availability of IV recombinant tPA for AIS treatment in remote hospitals, and in screening, triaging and identifying LVO patients for EVT. In this review article, we aim to describe the history of stroke management along with the new technological advancements in AIS treatment.

Malik Preeti, Anwar Arsalan, Patel Ruti, Patel Urvish


Acute ischemic stroke, Artificial intelligence and stem cell therapy, DAWN, DEFUSE 3, Endovascular therapy, Large vessel occlusion, Telestroke

General

Viewpoint on Time Series and Interrupted Time Series Optimum Modeling for Predicting Arthritic Disease Outcomes.

In Current rheumatology reports

PURPOSE OF REVIEW : The purpose of this viewpoint is to improve or facilitate clinical decision-making in the management/treatment strategies of arthritis patients through knowing, understanding, and having access to an interactive process that allows assessment of the patient's future disease outcome.

RECENT FINDINGS : In recent years, the time series (TS) concept has become the center of attention as a predictive model for forecasting unseen data values. TS and one of its technologies, interrupted TS (ITS) analysis (TS with one or more interventions), predict the next period(s)' value(s) for a given patient based on their past and current information. Traditional TS/ITS methods involve segmented regression-based technologies (linear and nonlinear), while stochastic (linear modeling) and artificial intelligence approaches, including machine learning (complex nonlinear relationships between variables), are also used; however, each has limitations. We briefly describe TS/ITS; provide examples of their application in arthritic diseases; describe their methods, challenges, and limitations; and propose a combined (stochastic and artificial intelligence) post-intervention procedure that will optimize ITS modeling. This combined method will increase the accuracy of ITS modeling by profiting from the advantages of both stochastic and nonlinear models to capture all deterministic and stochastic ITS components. In addition, it will allow ITS outcomes to be predicted as continuous variables without having to consider the time lag produced between the pre- and post-intervention periods, thus minimizing the prediction error not only for the given data but also for all possible future patterns in the ITS. The use of reliable prediction methodologies for arthritis patients will permit treating not only the disease, but also the patient with the disease, ensuring the best outcome prediction for the patient.
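The segmented-regression baseline the authors contrast with their combined approach can be sketched in a few lines: an interrupted time series is modeled with a baseline level and slope plus a level change and slope change at the intervention point. The series below is synthetic and noise-free purely to show the model; it is not the authors' stochastic/AI method:

```python
# Minimal segmented-regression sketch of an interrupted time series (ITS):
# y_t = b0 + b1*t + b2*post_t + b3*(t - t0)*post_t, fit by least squares,
# where post_t = 1 from the intervention time t0 onward.
import numpy as np


def fit_its(y, t0):
    """Return [baseline level, baseline slope, level change, slope change]."""
    t = np.arange(len(y), dtype=float)
    post = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta


# synthetic outcome: slope 0.5 before the intervention at t=20, then a
# level drop of 4 and a slope change of -0.3 afterwards
t = np.arange(40, dtype=float)
y = 10 + 0.5 * t - 4 * (t >= 20) - 0.3 * (t - 20) * (t >= 20)
beta = fit_its(y, t0=20)  # recovers [10, 0.5, -4, -0.3]
```

The combined procedure proposed above would replace the purely linear post-intervention segment with a model that also captures the stochastic and nonlinear components of the series.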

Bonakdari Hossein, Pelletier Jean-Pierre, Martel-Pelletier Johanne


Arthritis, Clinical decision-making, Data-driven, Interrupted time series, Management/treatment strategies, Time series