Deep Learning Automated Segmentation for Muscle and Adipose Tissue from Abdominal Computed Tomography in Polytrauma Patients

L.L.G.C. Ackermans*, L. Volmer*, L. Wee, R. Brecheisen, P. Sanchez-Gonzalez, A.P. Seiffert, E.J. Gomez, A. Dekker, J.A. Ten Bosch, S.M.W.O. Damink, T.J. Blokhuis

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

7 Citations (Web of Science)

Abstract

Manual segmentation of muscle and adipose compartments from computed tomography (CT) axial images is a potential bottleneck in the early, rapid detection and quantification of sarcopenia. A prototype deep learning neural network was trained on a multi-center collection of 3413 abdominal cancer surgery subjects to automatically segment truncal muscle, subcutaneous adipose tissue and visceral adipose tissue at the L3 lumbar vertebral level. Segmentations were externally tested on 233 polytrauma subjects. Although abdominal CT scans after severe trauma must be acquired quickly and robustly, and often contain motion or scatter artefacts, incomplete vertebral bodies, or arms in the field of view that degrade image quality, concordance was generally very good for the body composition indices of Skeletal Muscle Radiation Attenuation (SMRA; Concordance Correlation Coefficient (CCC) = 0.92), Visceral Adipose Tissue Index (VATI; CCC = 0.99) and Subcutaneous Adipose Tissue Index (SATI; CCC = 0.99). In conclusion, this article presented an automated and accurate system to segment the cross-sectional muscle and adipose tissue areas at the L3 lumbar vertebral level on abdominal CT. Future work will include fine-tuning the algorithm and minimizing outliers.
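The agreement metric reported above is Lin's Concordance Correlation Coefficient, which penalizes both poor correlation and systematic bias between automated and manual measurements. The sketch below is a minimal, illustrative implementation (the function name and example arrays are assumptions, not from the paper):

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between two 1-D arrays,
    e.g. manual vs. automated measurements of the same subjects."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()            # population (biased) variances
    cov = ((x - mx) * (y - my)).mean()   # population covariance
    # CCC = 2*cov / (var_x + var_y + (mean_x - mean_y)^2)
    return 2 * cov / (vx + vy + (mx - my) ** 2)

a = np.array([1.0, 2.0, 3.0, 4.0])
print(concordance_ccc(a, a))        # → 1.0 (perfect agreement)
print(concordance_ccc(a, a + 1.0))  # < 1.0: constant bias lowers CCC
```

Unlike Pearson's r, CCC drops below 1 even for perfectly correlated measurements if they differ by a constant offset or scale, which is why it is a stricter check of automated-versus-manual agreement.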
Original language: English
Article number: 2083
Number of pages: 13
Journal: Sensors
Volume: 21
Issue number: 6
Publication status: Published - 1 Mar 2021

Keywords

  • sarcopenia
  • deep learning neural network
  • automated segmentation
  • computed tomography
  • skeletal muscle
  • body composition
  • cancer
  • chemotherapy
  • survival
  • obesity
  • CT
