Phase-Change Memory in Neural Network Layers with Measurements-based Device Models

Carmine Paolino, Alessio Antolini, Fabio Pareschi, Mauro Mangia, Riccardo Rovatti, Eleonora Franchi Scarselli, Gianluca Setti, Roberto Canegallo, Marcella Carissimi, Marco Pasotti

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



The search for energy-efficient circuit implementations of neural networks has led to the exploration of phase-change memory (PCM) devices as their synaptic elements, which offer compact size and compatibility with CMOS fabrication technologies. In this work, we describe a methodology that, starting from measurements performed on a set of real PCM devices, enables the training of a neural network. The core of the procedure is the creation of a computational model general enough to include the effects of unwanted non-idealities, such as the voltage dependence of the conductances and the presence of the surrounding circuitry. Results show that, depending on the task at hand, a different level of accuracy is required in the PCM model applied at train time to match the performance of a traditional reference network. Moreover, the trained networks are robust to perturbations of the weight values of up to 10% standard deviation, with performance losses within 3.5% accuracy in the classification task considered and an increase of 0.014 in the regression RMS error in a second task. The considered perturbation level is compatible with the performance of state-of-the-art PCM programming techniques.
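The robustness check described in the abstract can be illustrated with a minimal sketch: perturb trained weights with multiplicative Gaussian noise (standard deviation at 10% of each nominal value, mimicking PCM programming spread) and compare accuracy before and after. Everything below — the toy linear classifier, the data, and the function names — is a hypothetical stand-in for illustration, not the paper's actual model or code.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_weights(w, rel_sigma=0.10, rng=rng):
    """Multiplicative Gaussian noise: each weight scaled by (1 + N(0, rel_sigma)).
    This emulates the relative conductance spread of PCM programming."""
    return w * (1.0 + rng.normal(0.0, rel_sigma, size=w.shape))

# Toy, linearly separable 2-class problem (illustrative only).
X = rng.normal(size=(200, 4))
true_w = np.array([1.0, -2.0, 0.5, 1.5])
y = (X @ true_w > 0).astype(int)

def accuracy(w):
    return np.mean((X @ w > 0).astype(int) == y)

w = true_w.copy()  # stand-in for a trained weight vector
acc_nominal = accuracy(w)
# Average over many perturbation draws, as a single draw is noisy.
acc_perturbed = np.mean([accuracy(perturb_weights(w)) for _ in range(100)])
print(f"nominal: {acc_nominal:.3f}, perturbed (mean of 100): {acc_perturbed:.3f}")
```

In this toy setting the accuracy drop stays small, consistent with the abstract's observation that losses remain within a few percent at 10% weight perturbation; the actual figures in the paper come from the full measurement-based PCM model, not from this sketch.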
Original language: English (US)
Title of host publication: Proceedings - IEEE International Symposium on Circuits and Systems
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
ISBN (Print): 9781665484855
State: Published - Jan 1 2022
Externally published: Yes


