The learning time of a simple neural-network model is obtained through an analytic computation of the eigenvalue spectrum of the Hessian matrix, which describes the second-order properties of the objective function in the space of coupling coefficients. The results are generic for symmetric matrices obtained by summing outer products of random vectors. The form of the eigenvalue distribution suggests new techniques for accelerating the learning process, and provides a theoretical justification for the choice of centered versus biased state variables.
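The spectral claims in the abstract can be illustrated numerically. The sketch below (not from the paper; dimensions, the {0, 1} versus {-1/2, +1/2} encoding, and the `spectrum` helper are illustrative assumptions) builds a symmetric matrix as a sum of outer products of random vectors and compares its eigenvalue spectrum for biased versus centered state variables:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 50, 500  # input dimension and number of random vectors (hypothetical sizes)

# Biased states take values in {0, 1}; centering shifts them to {-1/2, +1/2}.
biased = rng.integers(0, 2, size=(P, N)).astype(float)
centered = biased - 0.5

def spectrum(X):
    # Symmetric matrix formed by summing outer products of the row vectors:
    # H = (1/P) * sum_p x_p x_p^T, computed here as X^T X / P.
    H = X.T @ X / len(X)
    return np.linalg.eigvalsh(H)

ev_biased = spectrum(biased)
ev_centered = spectrum(centered)

# The nonzero mean of the biased states creates a single large outlier
# eigenvalue, stretching the spectrum and slowing gradient-descent learning;
# centering removes it.
print(ev_biased.max(), ev_centered.max())
```

The large ratio between the two maximal eigenvalues is the numerical counterpart of the paper's argument for centered state variables: gradient descent converges at a rate limited by the spread of the Hessian spectrum, and the mean-induced outlier dominates that spread.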
Original language: English (US)
Number of pages: 4
Journal: Physical Review Letters
State: Published - 1991