Spatial-Spectral Representation for X-Ray Fluorescence Image Super-Resolution

Qiqin Dai, Emeline Pouyet, Oliver Cossairt, Marc Walton, Aggelos K. Katsaggelos

Research output: Contribution to journal › Article

Abstract

X-ray fluorescence (XRF) scanning of works of art is becoming an increasingly popular nondestructive analytical method. High-quality XRF spectra are necessary to obtain significant information on both major and minor elements used for characterization and provenance analysis. However, due to the limited scanning time, there is a tradeoff between the spatial resolution of an XRF scan and the signal-to-noise ratio (SNR) of each pixel's spectrum. In this project, we propose an XRF image super-resolution method to address this tradeoff, thus obtaining a high spatial resolution XRF scan with high SNR. We fuse a low-resolution XRF image and a conventional high-resolution RGB image into an XRF image of both high spatial and high spectral resolution. There is no guarantee of a one-to-one mapping between XRF spectrum and RGB color since, for instance, hidden layers of a painting cannot be detected at visible wavelengths but can be at X-ray wavelengths. We therefore separate the XRF image into visible and non-visible components. The spatial resolution of the visible component is increased utilizing the high-resolution RGB image, whereas the spatial resolution of the non-visible component is increased using a total variation super-resolution method. Finally, the visible and non-visible components are combined to obtain the final result.
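The total variation (TV) super-resolution step applied to the non-visible component can be illustrated with a minimal sketch. This is not the authors' implementation: the block-average forward model, the smoothed TV penalty, and the plain gradient-descent solver below are all illustrative assumptions.

```python
import numpy as np

def downsample(x, s):
    # Assumed forward model D(.): block-average downsampling by factor s
    h, w = x.shape
    return x.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def upsample(y, s):
    # Adjoint of block averaging: spread each low-res value over its s*s block
    return np.repeat(np.repeat(y, s, axis=0), s, axis=1) / (s * s)

def tv_grad(x, eps=1e-3):
    # Gradient of a smoothed (isotropic) total variation penalty
    dx = np.diff(x, axis=1, append=x[:, -1:])
    dy = np.diff(x, axis=0, append=x[-1:, :])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    px, py = dx / mag, dy / mag
    # Negative divergence of the normalized gradient field
    # (boundaries handled approximately via np.roll)
    div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
    return -div

def tv_super_resolve(y, s, lam=0.05, n_iter=200, step=0.5):
    # Minimize ||D(x) - y||^2 + lam * TV(x) by gradient descent,
    # starting from a nearest-neighbor upsampling of the low-res input
    x = np.repeat(np.repeat(y, s, axis=0), s, axis=1)
    for _ in range(n_iter):
        resid = downsample(x, s) - y
        x -= step * (upsample(resid, s) + lam * tv_grad(x))
    return x
```

In such a scheme the data term keeps the estimate consistent with the measured low-resolution XRF channel, while the TV penalty favors piecewise-smooth images with sharp edges; the same idea would be applied channel-by-channel to the non-visible spectral component.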
Original language: English (US)
Article number: 7927468
Pages (from-to): 432-444
Number of pages: 13
Journal: IEEE Transactions on Computational Imaging
Volume: 3
Issue number: 3
DOIs
State: Published - Sep 1 2017

Keywords

  • Spatial resolution
  • Signal resolution
  • Hyperspectral imaging
  • Signal to noise ratio
  • X-ray imaging
