A composite discriminator for generative adversarial network based video super-resolution

Xijun Wang, Alice Lucas, Santiago Lopez-Tapia, Xinyi Wu, Rafael Molina, Aggelos K. Katsaggelos

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Generative Adversarial Networks (GANs) have been used to solve the video super-resolution problem. So far, GAN-based video super-resolution methods use the traditional GAN framework, which consists of a single generator and a single discriminator trained against each other. In this work we propose a new framework that incorporates two collaborative discriminators whose aim is to jointly improve the quality of the reconstructed video sequence. While one discriminator concentrates on general properties of the images, the second specializes in obtaining realistically reconstructed features, such as edges. Experimental results demonstrate that the learned model outperforms current state-of-the-art models and obtains super-resolved frames with fine details, sharp edges, and fewer artifacts.
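The composite framework described above trains the generator against two discriminators at once, one judging general image properties and one judging reconstructed features such as edges. A minimal sketch of how such a combined generator loss could be assembled is shown below; the weight values, function name, and the use of a simple non-saturating adversarial term are illustrative assumptions, not details taken from the paper.

```python
import math

def composite_generator_loss(d_general_score, d_edge_score, pixel_mse,
                             w_general=1e-3, w_edge=1e-3):
    """Sketch of a generator loss driven by two collaborative discriminators.

    d_general_score, d_edge_score: outputs in (0, 1] of the two
    discriminators for the super-resolved frame (probability of 'real').
    pixel_mse: pixel-wise content loss between output and ground truth.
    Weights are hypothetical; the paper does not specify these values.
    """
    # Non-saturating adversarial terms, -log D(G(x)), one per discriminator.
    adv_general = -math.log(d_general_score)
    adv_edge = -math.log(d_edge_score)
    # The generator minimizes content error while fooling both discriminators.
    return pixel_mse + w_general * adv_general + w_edge * adv_edge
```

In this sketch, a frame that both discriminators already accept as real (scores of 1.0) contributes only its pixel loss, while lower discriminator scores increase the total loss and push the generator toward more realistic frames and edges.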

Original language: English (US)
Title of host publication: EUSIPCO 2019 - 27th European Signal Processing Conference
Publisher: European Signal Processing Conference, EUSIPCO
ISBN (Electronic): 9789082797039
DOIs
State: Published - Sep 2019
Event: 27th European Signal Processing Conference, EUSIPCO 2019 - A Coruna, Spain
Duration: Sep 2 2019 - Sep 6 2019

Publication series

Name: European Signal Processing Conference
Volume: 2019-September
ISSN (Print): 2219-5491

Conference

Conference: 27th European Signal Processing Conference, EUSIPCO 2019
Country: Spain
City: A Coruna
Period: 9/2/19 - 9/6/19

Keywords

  • Generative Adversarial Networks
  • Spatially Adaptive
  • The Composite Discriminator
  • Video Super-Resolution

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

