Learning with noise and regularizers in multilayer neural networks

David Saad, Sara A. Solla

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Scopus citations

Abstract

We study the effect of noise and regularization in an on-line gradient-descent learning scenario for a general two-layer student network with an arbitrary number of hidden units. Training examples are randomly drawn input vectors labeled by a two-layer teacher network with an arbitrary number of hidden units; the examples are corrupted by Gaussian noise affecting either the output or the model itself. We examine the effect of both types of noise and that of weight-decay regularization on the dynamical evolution of the order parameters and the generalization error in various phases of the learning process.
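The scenario described in the abstract can be illustrated with a small simulation: a fixed two-layer "teacher" network labels random inputs, the labels are corrupted by additive Gaussian output noise, and a two-layer "student" of the same architecture is trained by on-line gradient descent with weight decay. This is a minimal sketch only; the paper analyzes soft-committee machines with erf activations and derives order-parameter dynamics analytically, whereas the code below uses tanh units and direct simulation, and all numerical values (dimensions, learning rate, decay strength, noise level) are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100       # input dimension (illustrative)
M = 2         # teacher hidden units
K = 2         # student hidden units
eta = 0.1     # learning rate
lam = 1e-4    # weight-decay strength (hypothetical value)
sigma = 0.1   # std of additive Gaussian output noise

g = np.tanh   # hidden-unit activation (the paper uses erf; tanh for simplicity)


def g_prime(u):
    return 1.0 - np.tanh(u) ** 2


# Fixed teacher weights and randomly initialized student weights.
B = rng.standard_normal((M, N)) / np.sqrt(N)
J = rng.standard_normal((K, N)) / np.sqrt(N)


def forward(W, x):
    """Output of a two-layer committee: sum of hidden-unit activations."""
    return g(W @ x).sum()


def gen_error(J, n_test=2000):
    """Generalization error estimated on fresh noiseless examples."""
    xs = rng.standard_normal((n_test, N))
    return np.mean([(forward(J, x) - forward(B, x)) ** 2 for x in xs]) / 2


eg_initial = gen_error(J)

# On-line learning: each example is drawn fresh and used once.
for _ in range(5000):
    x = rng.standard_normal(N)
    y = forward(B, x) + sigma * rng.standard_normal()  # output-corrupted label
    h = J @ x
    err = forward(J, x) - y
    # Gradient step on the squared error, plus a weight-decay term.
    J -= (eta / N) * (err * np.outer(g_prime(h), x) + lam * J)

eg_final = gen_error(J)
```

Model noise (corrupting the teacher's weights or hidden activations rather than its output) would be simulated analogously by perturbing `B` or `h` per example; the paper studies both variants.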

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 9 - Proceedings of the 1996 Conference, NIPS 1996
Publisher: Neural Information Processing Systems Foundation
Pages: 260-266
Number of pages: 7
ISBN (Print): 0262100657, 9780262100656
State: Published - 1997
Event: 10th Annual Conference on Neural Information Processing Systems, NIPS 1996 - Denver, CO, United States
Duration: Dec 2, 1996 - Dec 5, 1996

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Other

Other: 10th Annual Conference on Neural Information Processing Systems, NIPS 1996
Country/Territory: United States
City: Denver, CO
Period: 12/2/96 - 12/5/96

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

