Context region discovery for automatic motion compensation in fluoroscopy

Yin Xia*, Sarfaraz Hussein, Vivek Singh, Matthias John, Ying Wu, Terrence Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Purpose: Image-based tracking for motion compensation is an important topic in image-guided interventions, as it enables physicians to operate in a less complex space. In this paper, we propose an automatic motion compensation scheme to boost image guidance power in transcatheter aortic valve implantation (TAVI). Methods: The proposed tracking algorithm automatically discovers reliable regions that correlate strongly with the target. These discovered regions can assist in estimating target motion under severe occlusion, even when the target tracker fails. Results: We evaluate the proposed method for pigtail catheter tracking during TAVI. We obtain a significant improvement (12%) over the baseline on a clinical dataset. Calcification regions are automatically discovered during tracking, which can further aid the TAVI procedure. Conclusion: This work opens a new paradigm for providing dynamic real-time guidance for TAVI without user intervention, especially in cases of severe occlusion where conventional tracking methods are challenged.
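As a rough illustration of the idea described in the abstract (not the authors' implementation), context regions can be selected by how strongly their motion histories correlate with the target's, and then used to predict the target's displacement when the target tracker fails under occlusion. All function names, the correlation threshold, and the toy data below are assumptions for the sketch.

```python
# Minimal sketch of correlation-based context-region discovery.
# Illustrative only; not the method from the paper.

def pearson(xs, ys):
    """Pearson correlation of two equal-length motion histories."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / ((vx * vy) ** 0.5) if vx and vy else 0.0

def discover_context_regions(target_motion, region_motions, threshold=0.8):
    """Keep candidate regions whose frame-to-frame motion correlates
    strongly with the target's motion history (threshold is assumed)."""
    return [rid for rid, motion in region_motions.items()
            if pearson(target_motion, motion) >= threshold]

def estimate_occluded_motion(region_motions, context_ids):
    """When the target is occluded, predict its displacement as the
    mean of the latest displacements of the discovered context regions."""
    latest = [region_motions[rid][-1] for rid in context_ids]
    return sum(latest) / len(latest)

# Toy data: region "a" moves with the target; "b" is static clutter.
target = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0]
regions = {"a": [0.1, 1.1, 2.0, 0.9, 0.0, -1.0],
           "b": [0.0, 0.0, 0.1, 0.0, 0.0, 0.1]}
ctx = discover_context_regions(target, regions)
print(ctx)                                     # ['a']
print(estimate_occluded_motion(regions, ctx))  # -1.0
```

The key design point is that correlated context survives occlusion of the target itself: the background regions remain visible, so their motion stands in for the target's.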

Original language: English (US)
Pages (from-to): 977-985
Number of pages: 9
Journal: International Journal of Computer Assisted Radiology and Surgery
Volume: 11
Issue number: 6
DOIs
State: Published - Jun 1 2016

Keywords

  • Image assisted intervention
  • Instrument and patient localization and tracking
  • Tracking systems

ASJC Scopus subject areas

  • Surgery
  • Biomedical Engineering
  • Radiology Nuclear Medicine and imaging
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
  • Health Informatics
  • Computer Graphics and Computer-Aided Design
