
In Journal of Medical Imaging (Bellingham, Wash.)

PURPOSE : Contouring Collaborative for Consensus in Radiation Oncology (C3RO) is a crowdsourced challenge engaging radiation oncologists across various expertise levels in segmentation. An obstacle to artificial intelligence (AI) development is the paucity of multiexpert datasets; consequently, we sought to characterize whether aggregate segmentations generated from multiple nonexperts could meet or exceed recognized expert agreement.

APPROACH : Participants who contoured 1 region of interest (ROI) for the breast, sarcoma, head and neck (H&N), gynecologic (GYN), or gastrointestinal (GI) cases were identified as a nonexpert or recognized expert. Cohort-specific ROIs were combined into single simultaneous truth and performance level estimation (STAPLE) consensus segmentations. STAPLE_nonexpert ROIs were evaluated against STAPLE_expert contours using the Dice similarity coefficient (DSC). The expert interobserver DSC (IODSC_expert) was calculated as an acceptability threshold between STAPLE_nonexpert and STAPLE_expert. To determine the number of nonexperts required to match the IODSC_expert for each ROI, a single consensus contour was generated using variable numbers of nonexperts and then compared to the IODSC_expert.
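The two quantitative building blocks of the approach are the Dice similarity coefficient and a consensus segmentation over multiple raters. The sketch below, using hypothetical toy masks, shows a DSC computation and a simple majority-vote consensus; note that majority voting is only a stand-in here for STAPLE, which additionally estimates per-rater sensitivity and specificity via expectation-maximization.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks:
    DSC = 2 * |A intersect B| / (|A| + |B|)."""
    a = a.astype(bool)
    b = b.astype(bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0

def majority_consensus(masks):
    """Majority-vote consensus over a stack of binary masks.
    This is NOT the STAPLE algorithm used in the study -- STAPLE
    weights each rater by EM-estimated performance -- but it
    illustrates how per-rater errors can cancel in a consensus."""
    stack = np.stack([m.astype(bool) for m in masks])
    return stack.mean(axis=0) >= 0.5

# Toy 2D masks standing in for contours (hypothetical data).
expert = np.zeros((10, 10), dtype=bool)
expert[2:8, 2:8] = True

# Three "nonexperts" whose contours are vertically shifted copies.
raters = []
for shift in (-1, 0, 1):
    m = np.zeros((10, 10), dtype=bool)
    m[2 + shift:8 + shift, 2:8] = True
    raters.append(m)

consensus = majority_consensus(raters)
print(round(dice(consensus, expert), 3))  # the shifts cancel under voting
```

In this toy case the individual raters each disagree with the expert mask, but their majority vote recovers it exactly, which is the intuition behind aggregating multiple nonexpert segmentations.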

RESULTS : For all cases, the DSC values for STAPLE_nonexpert versus STAPLE_expert were higher than the comparator IODSC_expert for most ROIs. The minimum number of nonexpert segmentations needed for a consensus ROI to achieve IODSC_expert acceptability criteria ranged between 2 and 4 for breast, 3 and 5 for sarcoma, 3 and 5 for H&N, 3 and 5 for GYN, and 3 for GI.

CONCLUSIONS : Multiple nonexpert-generated consensus ROIs met or exceeded expert-derived acceptability thresholds. Five nonexperts could potentially generate consensus segmentations for most ROIs with performance approximating that of experts, suggesting that nonexpert segmentations are a feasible, cost-effective input for AI development.

Diana Lin, Kareem A. Wahid, Benjamin E. Nelms, Renjie He, Mohammed A. Naser, Simon Duke, Michael V. Sherer, John P. Christodouleas, Abdallah S. R. Mohamed, Michael Cislo, James D. Murphy, Clifton D. Fuller, Erin F. Gillespie

February 2023

artificial intelligence, autosegmentation, contouring, crowdsourcing, radiation oncology, segmentation