TAI-GAN: A Temporally and Anatomically Informed Generative Adversarial Network for early-to-late frame conversion in dynamic cardiac PET inter-frame motion correction

Xueqi Guo*, Luyao Shi, Xiongchao Chen, Qiong Liu, Bo Zhou, Huidong Xie, Yi Hwa Liu, Richard Palyo, Edward J. Miller, Albert J. Sinusas, Lawrence Staib, Bruce Spottiswoode, Chi Liu, Nicha C. Dvornek

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Inter-frame motion in dynamic cardiac positron emission tomography (PET) using rubidium-82 (82Rb) myocardial perfusion imaging impacts myocardial blood flow (MBF) quantification and the diagnostic accuracy of coronary artery disease. However, the high cross-frame distribution variation due to rapid tracer kinetics poses a considerable challenge for inter-frame motion correction, especially for early frames, where intensity-based image registration techniques often fail. To address this issue, we propose a novel method called Temporally and Anatomically Informed Generative Adversarial Network (TAI-GAN), which uses an all-to-one mapping to convert early frames into frames whose tracer distribution resembles that of the last reference frame. TAI-GAN incorporates a feature-wise linear modulation layer that encodes channel-wise parameters generated from temporal information, together with rough cardiac segmentation masks with local shifts that serve as anatomical information. Our proposed method was evaluated on a clinical 82Rb PET dataset, and the results show that TAI-GAN can produce converted early frames with high image quality, comparable to the real reference frames. After TAI-GAN conversion, motion estimation accuracy and subsequent MBF quantification with both conventional and deep learning-based motion correction methods were improved compared to using the original frames. The code is available at https://github.com/gxq1998/TAI-GAN.
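The feature-wise linear modulation (FiLM) conditioning described in the abstract predicts a per-channel scale and shift from a conditioning vector (here, temporal frame information) and applies them to an intermediate feature map. The following is a minimal NumPy sketch of that general mechanism, not the paper's implementation; the function name, shapes, and the single linear predictor are illustrative assumptions.

```python
import numpy as np

def film_modulate(x, cond, W, b):
    """Feature-wise linear modulation (FiLM).

    A linear layer maps the conditioning vector to per-channel
    scale (gamma) and shift (beta), which modulate the feature map.
    x:    (C, H, W) feature map
    cond: (D,)      conditioning vector (e.g., a temporal encoding)
    W:    (2C, D)   weights of the FiLM parameter generator
    b:    (2C,)     biases of the FiLM parameter generator
    """
    params = W @ cond + b                 # (2C,): [gamma; beta]
    C = x.shape[0]
    gamma, beta = params[:C], params[C:]
    # Broadcast gamma/beta across the spatial dimensions of each channel.
    return gamma[:, None, None] * x + beta[:, None, None]

# Illustrative usage: 4 feature channels, 3-dimensional condition.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 5, 5))
cond = rng.standard_normal(3)
W = rng.standard_normal((8, 3))
b = np.zeros(8)
y = film_modulate(x, cond, W, b)
print(y.shape)  # (4, 5, 5)
```

With zero weights and biases set so that gamma = 1 and beta = 0, the layer reduces to the identity, which is a common sanity check for FiLM-style conditioning.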

Original language: English (US)
Article number: 103190
Journal: Medical Image Analysis
Volume: 96
State: Published - Aug 2024

Funding

This work was supported by the National Institutes of Health (NIH), United States, under grant R01CA224140. An early version of this work was accepted for publication by Springer Nature at SASHIMI: Simulation and Synthesis in Medical Imaging, a workshop of the International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2023 (Guo et al. 2023b). Here, we include a more comprehensive analysis of the complete evaluation using simulated motion and ablation studies, a comparison of conventional non-rigid and deep learning-based motion correction methods, an evaluation on an independent cohort of real-patient motion cases, additional background and implementation details, and an in-depth discussion of the viability of clinical use and prospective future directions.

Keywords

  • Dynamic cardiac PET
  • Early-to-late frame conversion
  • Inter-frame motion correction

ASJC Scopus subject areas

  • Radiological and Ultrasound Technology
  • Radiology, Nuclear Medicine and Imaging
  • Computer Vision and Pattern Recognition
  • Health Informatics
  • Computer Graphics and Computer-Aided Design
