An adaptive lighting correction method for matched-texture coding

Guoxin Jin, Thrasyvoulos N. Pappas, David L. Neuhoff

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Scopus citations

Abstract

Matched-texture coding is a novel image coder that exploits the self-similarity of natural images containing textures in order to achieve structurally lossless compression. The key to a high compression ratio is replacing large image blocks with previously encoded blocks of similar structure. Adjusting the lighting of the replaced block is critical for eliminating illumination artifacts and increasing the number of matches. We propose a new adaptive lighting correction method based on the Poisson equation with incomplete boundary conditions. To fully exploit the benefits of the adaptive Poisson lighting correction, we also propose modifications of the side-matching (SM) algorithm and the structural texture similarity metric. We show that the resulting matched-texture algorithm achieves better coding performance.
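The core idea of the abstract, propagating a boundary lighting mismatch smoothly into a block's interior by solving a Poisson (here, Laplace) problem with Dirichlet conditions on the decoded sides and Neumann conditions on the rest, can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name, the Jacobi solver, and the two-sided (top/left) boundary layout are all hypothetical.

```python
import numpy as np

def poisson_lighting_correction(candidate, top_mismatch=None, left_mismatch=None,
                                n_iter=2000):
    """Minimal sketch: smooth lighting correction for a 2-D candidate block.

    The correction field u is harmonic inside the block (a Poisson equation
    with zero right-hand side). Sides whose outer neighbors are already
    decoded supply Dirichlet data (the intensity mismatch measured there);
    the remaining sides get zero-flux Neumann conditions, i.e. the
    "incomplete" boundary conditions.
    """
    h, w = candidate.shape
    u = np.zeros((h, w))
    for _ in range(n_iter):
        # Edge padding makes each ghost cell equal its interior neighbor,
        # which enforces a zero normal derivative (Neumann) on every side.
        p = np.pad(u, 1, mode="edge")
        # Overwrite ghost cells on the decoded sides with Dirichlet data.
        if top_mismatch is not None:
            p[0, 1:-1] = top_mismatch    # mismatch along the row above
        if left_mismatch is not None:
            p[1:-1, 0] = left_mismatch   # mismatch along the column to the left
        # One Jacobi sweep of the 5-point Laplacian stencil.
        u = 0.25 * (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:])
    return candidate + u

# Hypothetical usage: a 32x32 candidate whose decoded top row and left
# column suggest the target region is about 12 gray levels brighter.
rng = np.random.default_rng(0)
block = rng.uniform(0.0, 255.0, (32, 32))
corrected = poisson_lighting_correction(block,
                                        top_mismatch=np.full(32, 12.0),
                                        left_mismatch=np.full(32, 12.0))
```

In raster-scan coding order only the top and left neighbors of a target block have been decoded, so the bottom and right sides carry the zero-gradient conditions; this asymmetry appears to be what the abstract means by incomplete boundary conditions.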

Original language: English (US)
Title of host publication: 2014 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2006-2010
Number of pages: 5
ISBN (Print): 9781479928927
DOIs
State: Published - Jan 1 2014
Event: 2014 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2014 - Florence, Italy
Duration: May 4, 2014 - May 9, 2014

Other

Other: 2014 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2014
Country/Territory: Italy
City: Florence
Period: 5/4/14 - 5/9/14

Keywords

  • Dirichlet problem
  • Neumann problem
  • Poisson equation
  • structural texture similarity metric

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering
