Abstract
We present a systematic approach for training and testing structural texture similarity metrics (STSIMs) so that they can be used to exploit texture redundancy for structurally lossless image compression. The training and testing are based on a set of image distortions that reflect the characteristics of the perturbations present in natural texture images. We conduct empirical studies to determine the perceived similarity scale across all pairs of original and distorted textures. We then introduce a data-driven approach for training the Mahalanobis formulation of STSIM based on the resulting annotated texture pairs. Experimental results demonstrate that training yields significant improvements in metric performance. We also show that the performance of the trained STSIM metrics is competitive with state-of-the-art metrics based on convolutional neural networks, at substantially lower computational cost.
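To illustrate the Mahalanobis formulation referenced in the abstract, here is a minimal sketch, assuming hypothetical inputs: `f_x` and `f_y` stand in for STSIM feature vectors of subband statistics, and `M` for a learned positive semi-definite weight matrix. The function name and the way `M` is constructed below are illustrative only; the paper's actual feature extraction and training procedure are not reproduced here.

```python
import numpy as np

def mahalanobis_distance(f_x, f_y, M):
    """Mahalanobis-form distance between two texture feature vectors.

    f_x, f_y : 1-D arrays of texture statistics (hypothetical stand-ins
               for STSIM subband features).
    M        : positive semi-definite weight matrix; in the data-driven
               approach it would be fit so that distances reproduce the
               empirically determined perceived-similarity scale.
    """
    d = f_x - f_y
    return float(np.sqrt(d @ M @ d))

# Illustrative usage with random placeholder data.
rng = np.random.default_rng(0)
f_x, f_y = rng.normal(size=64), rng.normal(size=64)
L = 0.1 * rng.normal(size=(64, 64))
M = L @ L.T  # PSD by construction
print(mahalanobis_distance(f_x, f_y, M))
```

Note that when `M` is the identity, this reduces to plain Euclidean distance between feature vectors; the training described in the abstract amounts to learning a non-trivial `M` from the annotated texture pairs.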
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1614-1626 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Image Processing |
| Volume | 33 |
| DOIs | |
| State | Published - 2024 |
Keywords
- Perceptual image quality
- matched-texture coding (MTC)
- structural texture similarity metrics (STSIMs)
- visual texture analysis
ASJC Scopus subject areas
- Software
- Computer Graphics and Computer-Aided Design