TY - CPAPER
T1 - Fast and Low-Cost Mitigation of ReRAM Variability for Deep Learning Applications
AU - Lee, Sugil
AU - Fouda, Mohammed
AU - Lee, Jongeun
AU - Eltawil, Ahmed
AU - Kurdahi, Fadi
N1 - Acknowledgements: This work was supported by NRF grant (No. 2020R1A2C2015066) and IITP grant (No. 2020-0-01336, Artificial Intelligence Graduate School Program) funded by MSIT of Korea, and by the Free Innovative Research Fund of UNIST (1.170067.01). The EDA tool was supported by the IC Design Education Center (IDEC), Korea.
PY - 2021/10
AB - To overcome the programming variability (PV) of ReRAM crossbar arrays (RCAs), the most common method is program-verify, which, however, incurs high energy and latency overhead. In this paper, we propose a very fast and low-cost method to mitigate the effect of PV and other variability for RCA-based deep neural network (DNN) accelerators. Leveraging the statistical properties of DNN outputs, our method, called Online Batch-Norm Correction (OBNC), compensates for the effect of programming and other variability on RCA output without on-chip training or any iterative procedure, and is thus very fast. Moreover, our method requires neither a nonideality model nor a training dataset, making it very easy to apply. Our experimental results using ternary neural networks with binary and 4-bit activations demonstrate that OBNC can recover the baseline performance in many variability settings and that it outperforms a previously known method (VCAM) by large margins when the input distribution is asymmetric or the activation is multi-bit.
UR - http://hdl.handle.net/10754/675897
UR - https://ieeexplore.ieee.org/document/9643837/
UR - http://www.scopus.com/inward/record.url?scp=85123953511&partnerID=8YFLogxK
DO - 10.1109/ICCD53106.2021.00051
M3 - Conference contribution
SN - 9781665432191
SP - 269
EP - 276
BT - 2021 IEEE 39th International Conference on Computer Design (ICCD)
PB - IEEE
ER -