Video coding algorithm based on recovery techniques using mean field annealing

Taner Ozcelik*, James C. Brailean, Aggelos K. Katsaggelos

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Most existing video coding algorithms produce highly visible artifacts in the reconstructed images as the bit-rate is lowered. These artifacts are due to the information loss caused by the quantization process. Since these algorithms treat decoding simply as the inverse of encoding, such artifacts are inevitable. In this paper, we propose an encoder/decoder paradigm in which both the encoder and decoder solve an estimation problem based on the available bitstream and prior knowledge about the source image and video. The proposed technique makes use of a priori information about the original image through a nonstationary Gauss-Markov model. Utilizing this model, a maximum a posteriori (MAP) estimate is obtained iteratively using mean field annealing. Fidelity to the data is preserved by projecting the image, at each iteration, onto a constraint set defined by the quantizer. The performance of the proposed algorithm is demonstrated on an H.261-type video codec. It is shown to improve the reconstructed image quality considerably while reducing the bit-rate.
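The decoding loop described in the abstract alternates two steps: a prior-driven update toward a smoother image, and a projection back onto the set of images consistent with the received quantized data. The following is a minimal illustrative sketch of that iterate-and-project structure, not the paper's algorithm: the smoothing step below is a simple stationary 4-neighbour average standing in for a mean-field-annealing update (the paper uses a nonstationary, edge-preserving Gauss-Markov model), and the constraint set is taken to be the quantization cells of an 8x8 DCT block, as in an H.261-style codec. All function names are hypothetical.

```python
import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix, so that C @ block @ C.T is the 2-D DCT.
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] /= np.sqrt(2)
    return C * np.sqrt(2.0 / n)

def project_onto_quantizer(block, q_indices, step, C):
    # Projection onto the quantizer constraint set: clip each DCT
    # coefficient to the cell [q*step - step/2, q*step + step/2]
    # implied by the decoded quantization indices (uniform quantizer
    # assumed for illustration).
    coeffs = C @ block @ C.T
    coeffs = np.clip(coeffs, q_indices * step - step / 2,
                             q_indices * step + step / 2)
    return C.T @ coeffs @ C

def smooth_step(img, lam=0.25):
    # Crude surrogate for one prior-driven update: pull each pixel
    # toward its 4-neighbour average (periodic boundaries via roll).
    avg = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
    return (1 - lam) * img + lam * avg

def decode_block(q_indices, step, n_iter=20):
    # Start from the standard dequantized reconstruction, then iterate:
    # smooth under the prior, re-project onto the quantizer constraint set.
    C = dct_matrix(q_indices.shape[0])
    img = C.T @ (q_indices * step) @ C
    for _ in range(n_iter):
        img = smooth_step(img)
        img = project_onto_quantizer(img, q_indices, step, C)
    return img
```

Because the projection is applied last in every iteration, the returned block is always an exact member of the constraint set: its DCT coefficients quantize back to the transmitted indices, so the decoder never contradicts the bitstream no matter how aggressively the prior smooths.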

Original language: English (US)
Title of host publication: Proceedings of SPIE - The International Society for Optical Engineering
Publisher: Society of Photo-Optical Instrumentation Engineers
Number of pages: 12
ISBN (Print): 0891418587
State: Published - Jan 1 1995
Event: Visual Communications and Image Processing '95 - Taipei, Taiwan
Duration: May 24 1995 - May 26 1995


Other: Visual Communications and Image Processing '95
City: Taipei, Taiwan

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Condensed Matter Physics


