Machine-assisted interpolation algorithm for semi-automated segmentation of highly deformable organs

Med Phys. 2022 Jan;49(1):41-51. doi: 10.1002/mp.15351. Epub 2021 Nov 27.

Abstract

Purpose: Accurate and robust auto-segmentation of highly deformable organs (HDOs), for example, the stomach or bowel, remains an outstanding problem due to these organs' frequent and large anatomical variations. Yet, time-consuming manual segmentation of these organs presents a particular challenge to time-limited modern radiotherapy techniques such as on-line adaptive radiotherapy and high-dose-rate brachytherapy. We propose a machine-assisted interpolation (MAI) algorithm that uses prior information in the form of sparse manual delineations to facilitate rapid, accurate segmentation of the stomach from low-field magnetic resonance images (MRI) and the bowel from computed tomography (CT) images.

Methods: Stomach MR images from 116 patients undergoing 0.35 T MRI-guided abdominal radiotherapy and bowel CT images from 120 patients undergoing high-dose-rate pelvic brachytherapy were collected. For each patient volume, the manual delineation of the HDO was extracted from every 8th slice. These manually drawn contours were first interpolated to obtain an initial estimate of the HDO contour. A two-channel, 64 × 64 pixel, patch-based convolutional neural network (CNN) was trained to localize the organ's boundary on each slice within a five-pixel-wide band, using the image and the interpolated contour estimate as input. This boundary prediction was then input, together with the image, to an organ-closing CNN that output the final organ segmentation. A Dense-UNet architecture was used for both networks. The MAI algorithm was trained separately for stomach segmentation and bowel segmentation. Algorithm performance was compared against linear interpolation (LI) alone and against fully automated segmentation (FAS) using a Dense-UNet trained on the same datasets. The Dice Similarity Coefficient (DSC) and mean surface distance (MSD) metrics were used to compare the predictions from the three methods. Statistical significance was tested using Student's t-test.
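
The abstract specifies linear interpolation of contours drawn on every 8th slice and evaluation with the DSC metric, but gives no implementation details. The sketch below is a minimal illustration, assuming NumPy/SciPy and boolean mask volumes; the helper names (signed_distance, interpolate_sparse_masks, dice) are hypothetical, and blending signed distance maps is one common way to realize slice-wise linear interpolation of contours, not necessarily the authors' exact method.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance(mask_2d):
    # Positive inside the organ, negative outside (zero near the boundary).
    inside = distance_transform_edt(mask_2d)
    outside = distance_transform_edt(~mask_2d)
    return inside - outside

def interpolate_sparse_masks(sparse_masks, step=8):
    # sparse_masks: (Z, H, W) boolean array with manual contours on every
    # `step`-th slice; intermediate slices are filled by linearly blending
    # the signed distance maps of the two nearest annotated slices.
    out = sparse_masks.copy()
    annotated = list(range(0, sparse_masks.shape[0], step))
    for z0, z1 in zip(annotated[:-1], annotated[1:]):
        d0 = signed_distance(sparse_masks[z0])
        d1 = signed_distance(sparse_masks[z1])
        for z in range(z0 + 1, z1):
            w = (z - z0) / (z1 - z0)           # linear blending weight
            out[z] = ((1 - w) * d0 + w * d1) > 0
    return out

def dice(pred, gt):
    # Dice Similarity Coefficient between two boolean masks.
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())
```

In the MAI pipeline described above, such an interpolated estimate would serve as the second input channel (alongside the image) to the boundary-localization CNN, and DSC would be one of the metrics used to compare MAI, LI, and FAS predictions against the manual ground truth.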

Results: For stomach segmentation, the mean DSC from MAI (0.91 ± 0.02) was 5.0% and 10.0% higher than that of LI and FAS, respectively. The average MSD from MAI (0.77 ± 0.25 mm) was 0.54 mm and 3.19 mm lower than that of the two other methods. Only 7% of MAI stomach predictions resulted in a DSC < 0.8, compared to 30% and 28% for LI and FAS, respectively. For bowel segmentation, the mean DSC of MAI (0.90 ± 0.04) was 6% and 18% higher, and the average MSD of MAI (0.93 ± 0.48 mm) was 0.42 mm and 4.9 mm lower, than those of LI and FAS, respectively. Sixteen percent of the predicted contours from MAI resulted in a DSC < 0.8, compared to 46% and 60% for FAS and LI, respectively. All comparisons between MAI and the baseline methods were statistically significant (p < 0.001).

Conclusions: The proposed MAI algorithm significantly outperformed LI in terms of accuracy and robustness for both stomach segmentation from low-field MRIs and bowel segmentation from CT images. At this time, FAS methods for HDOs still require significant manual editing. Therefore, we believe that the MAI algorithm has the potential to expedite the process of HDO delineation within the radiation therapy workflow.

Keywords: deep learning; radiation therapy; segmentation.

MeSH terms

  • Humans
  • Image Processing, Computer-Assisted*
  • Magnetic Resonance Imaging
  • Neural Networks, Computer
  • Radiotherapy, Image-Guided*
  • Tomography, X-Ray Computed