On Tighter Generalization Bounds for Deep Neural Networks: CNNs, ResNets, and beyond

Xingguo Li, Junwei Lu, Zhaoran Wang, Jarvis Haupt, Tuo Zhao*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We establish a margin-based, data-dependent generalization error bound for a general family of deep neural networks in terms of the depth and width of the networks, as well as the spectral norms of the weight matrices. By introducing a new characterization of the Lipschitz properties of the neural network family, we achieve a tighter generalization error bound. Moreover, we show that the generalization bound can be further improved for bounded losses. In addition, we demonstrate that the margin scales with the product of norms, which eliminates the concern about the vacuity of norm-based bounds. Aside from general feedforward deep neural networks, our results can be applied to derive new bounds for several popular architectures, including convolutional neural networks (CNNs), residual networks (ResNets), and hyperspherical networks (SphereNets). When achieving the same generalization error as previous results, our bounds allow for the choice of larger parameter spaces for the weight matrices, potentially inducing stronger expressive ability in the networks. Moreover, we discuss the limitations of existing generalization bounds for understanding deep neural networks with ReLU activations in classification.
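As a rough illustration (not the paper's exact bound), norm-based capacity terms of the kind described above can be computed directly from a network's weight matrices. The minimal Python sketch below computes the product of per-layer spectral norms (largest singular values) for a hypothetical three-layer fully connected network; the function name, layer dimensions, and random weights are illustrative assumptions only.

import numpy as np

def spectral_norm_product(weights):
    # Product of per-layer spectral norms (largest singular values).
    # Norm-based generalization bounds of the kind discussed in the
    # abstract typically scale with a quantity like this, combined
    # with depth, width, and margin terms (illustrative sketch only).
    return float(np.prod([np.linalg.norm(W, ord=2) for W in weights]))

# Hypothetical 3-layer network with random Gaussian weights.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((64, 32)),
           rng.standard_normal((32, 32)),
           rng.standard_normal((32, 10))]
print(spectral_norm_product(weights))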

Original language: English (US)
Journal: Unknown Journal
State: Published - Jun 13 2018

ASJC Scopus subject areas

  • General
