Abstract
Shallow feed-forward networks are incapable of addressing complex tasks, such as natural language processing, that require learning from temporal signals. To address these requirements, we need deep neuromorphic architectures with recurrent connections such as deep recurrent neural networks. However, the training of such networks demands very high weight precision, excellent conductance linearity, and low write noise, requirements not satisfied by current memristive implementations. Inspired by optogenetics, here we report a neuromorphic computing platform composed of photo-excitable neuristors capable of in-memory computations across 980 addressable states with a high signal-to-noise ratio of 77. The large linear dynamic range, low write noise and selective excitability allow high-fidelity opto-electronic transfer of weights with a two-shot write scheme, while electrical in-memory inference provides energy efficiency. This method enables implementing a memristive deep recurrent neural network with twelve trainable layers and more than a million parameters to recognize spoken commands with >90% accuracy.
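As a rough illustration of the weight-transfer step described above, the sketch below maps trained floating-point weights onto a finite set of evenly spaced conductance levels, using the 980 addressable states reported in the abstract. This is a minimal sketch under simplifying assumptions (uniform levels, no write noise); the function and variable names are hypothetical and do not reflect the authors' actual implementation.

```python
import numpy as np

N_STATES = 980  # addressable conductance states reported in the abstract


def quantize_weights(w, n_states=N_STATES):
    """Snap real-valued weights onto n_states evenly spaced levels (illustrative only)."""
    w_min, w_max = w.min(), w.max()
    # Normalize to [0, 1], round to the nearest of n_states levels, then rescale back.
    levels = np.round((w - w_min) / (w_max - w_min) * (n_states - 1))
    return levels / (n_states - 1) * (w_max - w_min) + w_min


# Example: quantize a random 256x256 recurrent weight matrix and check the error.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(256, 256))
w_q = quantize_weights(w)
print("max quantization error:", np.abs(w - w_q).max())
```

With 980 levels the worst-case quantization error is half a level spacing, which is consistent with the abstract's claim that a large linear dynamic range supports high-fidelity weight transfer.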
| Original language | English (US) |
| --- | --- |
| Article number | 3211 |
| Journal | Nature Communications |
| Volume | 11 |
| Issue number | 1 |
| DOIs | |
| State | Published - Dec 1 2020 |
Funding
We would like to acknowledge funding from MOE Tier 1 grants RG87/16 and RG166/16, and MOE Tier 2 grants MOE2015-T2-2-007, MOE2015-T2-2-043, MOE2017-T2-2-136, and MOE2016-T2-1-100.
ASJC Scopus subject areas
- General Chemistry
- General Biochemistry, Genetics and Molecular Biology
- General Physics and Astronomy