Using Floating-Gate Memory to Train Ideal Accuracy Neural Networks
Agarwal, Sapan; Garland, Diana; Niroula, John; Jacobs-Gedrim, Robin B.; Hsia, Alex; Van Heukelom, Michael S.; Fuller, Elliot; Draper, Bruce; Marinella, Matthew J.
Floating-gate silicon-oxide-nitride-oxide-silicon (SONOS) transistors can train neural networks to ideal accuracies that match those of floating-point digital weights on the MNIST handwritten-digit data set when multiple devices represent each weight, and to within 1% of ideal accuracy when a single device represents each weight. This is enabled by operating the devices in the subthreshold regime, where they exhibit symmetric write nonlinearities. A neural training accelerator core based on SONOS with a single device per weight would be 120× more energy efficient, operate 2.1× faster, and require 5× less area than an optimized SRAM-based ASIC.
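To illustrate why a *symmetric* write nonlinearity is benign for training, the sketch below contrasts it with an asymmetric one in a toy weight-update model. Everything here is an assumption for illustration: the saturation profile g(w) = 1 - |w|, the pulse size, and the weight range [-1, 1] are hypothetical stand-ins, not the paper's measured SONOS characteristics. The point is that when potentiation and depression share the same state-dependent scale factor, balanced up/down pulses cancel and a stored weight holds its value; when they saturate differently, the same pulse train drags the weight toward the device's balance point.

```python
def clip(x):
    """Keep the normalized weight inside its assumed range [-1, 1]."""
    return max(-1.0, min(1.0, x))

def symmetric_write(w, dw):
    """One update with a symmetric nonlinearity: the same saturation
    factor g(w) scales both potentiation (dw > 0) and depression (dw < 0),
    so it rescales the effective learning rate without biasing the weight."""
    g = 1.0 - abs(w)                 # assumed saturation toward the rails
    return clip(w + g * dw)

def asymmetric_write(w, dw):
    """Contrast case: potentiation and depression saturate differently,
    which biases the weight toward the point where the two curves balance."""
    g = (1.0 - w) if dw > 0 else (1.0 + w)
    return clip(w + g * dw)

# Alternating +/- pulses that should cancel: an unbiased device
# holds a programmed weight of 0.6, a biased one does not.
w_sym = w_asym = 0.6
for i in range(500):
    dw = 0.01 if i % 2 == 0 else -0.01
    w_sym = symmetric_write(w_sym, dw)
    w_asym = asymmetric_write(w_asym, dw)

print(f"symmetric:  {w_sym:+.3f}")   # stays near 0.6
print(f"asymmetric: {w_asym:+.3f}")  # decays toward the balance point at 0
```

Under this toy model, the asymmetric device loses the stored value even though the applied updates sum to zero, which is the kind of systematic error that degrades training accuracy; a symmetric nonlinearity avoids it, consistent with the abstract's claim that subthreshold operation enables near-ideal training with a single device per weight.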