Gated recurrent networks for video super resolution

Santiago López-Tapia, Alice Lucas, Rafael Molina, Aggelos K. Katsaggelos

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations

Abstract

Despite the success of Recurrent Neural Networks in tasks involving temporal video processing, few works in Video Super-Resolution (VSR) have employed them. In this work, we propose a new Gated Recurrent Convolutional Neural Network for VSR that adapts some of the key components of a Gated Recurrent Unit. Our model employs a deformable attention module to align the features calculated at the previous time step with those of the current step, and then uses a gated operation to combine them. This allows our model to effectively reuse previously calculated features and to exploit longer temporal relationships between frames without the need for explicit motion compensation. Experimental validation shows that our approach outperforms current learning-based VSR models in terms of perceptual quality and temporal consistency.
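The abstract describes the recurrence only at a high level; the sketch below shows one minimal way such a GRU-style feature fusion could look. It is not the authors' implementation: the class name, layer sizes, and in particular the `align` module (a plain convolution standing in for the paper's deformable attention) are all hypothetical, PyTorch-flavored assumptions.

```python
# A minimal PyTorch sketch of the gated recurrent fusion described in the
# abstract. All names and hyperparameters are hypothetical; the paper's
# deformable attention alignment is replaced here by a plain convolution
# over concatenated features, purely for illustration.
import torch
import torch.nn as nn


class GatedRecurrentFusion(nn.Module):
    """GRU-style gating that merges aligned previous features with current ones."""

    def __init__(self, channels: int):
        super().__init__()
        # Stand-in for the paper's deformable attention alignment module.
        self.align = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)
        # Update gate: decides how much of the past features to reuse.
        self.update_gate = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)
        # Candidate features from the current frame and the aligned state.
        self.candidate = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, current: torch.Tensor, previous: torch.Tensor) -> torch.Tensor:
        # Align the previous time step's features to the current frame.
        aligned = self.align(torch.cat([previous, current], dim=1))
        z = torch.sigmoid(self.update_gate(torch.cat([aligned, current], dim=1)))
        h_tilde = torch.tanh(self.candidate(torch.cat([aligned, current], dim=1)))
        # Gated combination: where z is small, aligned past features pass through.
        return (1 - z) * aligned + z * h_tilde


if __name__ == "__main__":
    cell = GatedRecurrentFusion(channels=32)
    cur = torch.randn(1, 32, 64, 64)   # features from the current frame
    prev = torch.randn(1, 32, 64, 64)  # recurrent features from the last step
    print(cell(cur, prev).shape)       # torch.Size([1, 32, 64, 64])
```

The convex combination `(1 - z) * aligned + z * h_tilde` mirrors the update gate of a standard GRU: it lets the network reuse aligned features from earlier frames instead of recomputing them, which is what allows the model to exploit longer temporal context without explicit motion compensation.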

Original language: English (US)
Title of host publication: 28th European Signal Processing Conference, EUSIPCO 2020 - Proceedings
Publisher: European Signal Processing Conference, EUSIPCO
Pages: 700-704
Number of pages: 5
ISBN (Electronic): 9789082797053
DOIs
State: Published - Jan 24, 2021
Event: 28th European Signal Processing Conference, EUSIPCO 2020 - Amsterdam, Netherlands
Duration: Aug 24, 2020 - Aug 28, 2020

Publication series

Name: European Signal Processing Conference
Volume: 2021-January
ISSN (Print): 2219-5491

Conference

Conference: 28th European Signal Processing Conference, EUSIPCO 2020
Country/Territory: Netherlands
City: Amsterdam
Period: 8/24/20 - 8/28/20

Keywords

  • Convolutional Neural Networks
  • Recurrent Neural Networks
  • Super-resolution
  • Video

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
