The Long Short-Term Memory (LSTM) architecture is a well-known approach for building recurrent neural networks (RNNs) for sequential data processing, with applications in natural language processing. Near-sensor hardware implementation of LSTM is challenging due to its high parallelism and complexity. We propose a 0.18 μm CMOS, GST memristor LSTM hardware architecture for near-sensor processing. The proposed system is validated on a forecasting problem using a Keras model.
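For readers unfamiliar with the computation the hardware must realize, the standard LSTM cell update can be sketched as a minimal NumPy forward pass. This is an illustrative software reference only, not the paper's memristor implementation; all function and variable names here (`lstm_step`, `W`, `U`, `b`) are assumptions, and the gate ordering (input, forget, candidate, output) follows the common textbook convention.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    W (4n x m), U (4n x n), b (4n,) stack the parameters of the
    input, forget, candidate, and output gates, where n is the
    hidden size and m the input size. Illustrative sketch only.
    """
    n = h_prev.size
    z = W @ x + U @ h_prev + b           # all four gate pre-activations at once
    i = sigmoid(z[0:n])                  # input gate
    f = sigmoid(z[n:2 * n])              # forget gate
    g = np.tanh(z[2 * n:3 * n])          # candidate cell state
    o = sigmoid(z[3 * n:4 * n])          # output gate
    c = f * c_prev + i * g               # new cell state
    h = o * np.tanh(c)                   # new hidden state
    return h, c

# Example: run a random sequence through the cell
rng = np.random.default_rng(0)
m, n, T = 3, 4, 5                        # input size, hidden size, sequence length
W = rng.standard_normal((4 * n, m)) * 0.1
U = rng.standard_normal((4 * n, n)) * 0.1
b = np.zeros(4 * n)
h = np.zeros(n)
c = np.zeros(n)
for t in range(T):
    h, c = lstm_step(rng.standard_normal(m), h, c, W, U, b)
```

The single stacked matrix multiply per time step is the operation a memristor crossbar can compute in the analog domain, which is what makes LSTM a natural target for such near-sensor hardware.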
Original language: English (US)
Title of host publication: 2018 IEEE International Conference on Electron Devices and Solid State Circuits, EDSSC 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
State: Published - Oct 9 2018