Efficient physics-informed neural networks using hash encoding

Xinquan Huang*, Tariq Alkhalifah

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Physics-informed neural networks (PINNs) have attracted much attention in scientific computing because their functional representation of partial differential equation (PDE) solutions offers flexibility and accuracy. However, their training cost has limited their practical use as a real alternative to classic numerical methods. Thus, we propose to incorporate multi-resolution hash encoding into PINNs to improve training efficiency, as such encoding offers a locally aware (at multiple resolutions) coordinate input to the neural network. Since hash encoding is borrowed from the neural radiance fields (NeRF) community, we investigate the robustness of calculating the derivatives of such hash-encoded neural networks with respect to the input coordinates, which the PINN loss terms often require. We propose to replace automatic differentiation with finite-difference calculations of the derivatives to address the discontinuous nature of such derivatives. We also share the appropriate ranges for the hash encoding hyperparameters that yield robust derivatives. We test the proposed method on several benchmark problems and find that it achieves about a 10-fold improvement in efficiency over the vanilla PINN implementation.
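The core idea above, replacing automatic differentiation with finite differences when forming the PDE residual, can be illustrated with a minimal 1D sketch. This is not the authors' implementation: the hash-encoded network is stood in for by a smooth test function, and the step size `h` is an assumed illustrative value.

```python
import numpy as np

def fd_derivatives(u, x, h=1e-3):
    """Central finite-difference first and second derivatives of u at x.

    The paper motivates finite differences by the discontinuous nature of
    the derivatives of hash-encoded networks under automatic
    differentiation; this sketch only shows the stencil itself.
    """
    u_plus, u_minus, u_0 = u(x + h), u(x - h), u(x)
    du = (u_plus - u_minus) / (2.0 * h)
    d2u = (u_plus - 2.0 * u_0 + u_minus) / h**2
    return du, d2u

# Stand-in for a hash-encoded network (hypothetical: a smooth function
# whose derivatives we know, so the stencil can be sanity-checked).
u = np.sin
x = np.linspace(0.0, np.pi, 64)
du, d2u = fd_derivatives(u, x)

# PINN-style residual loss for the PDE u'' + u = 0, which sin satisfies,
# so the loss should be near machine precision.
residual = d2u + u(x)
loss = np.mean(residual**2)
```

In an actual PINN training loop, `u` would be the hash-encoded network evaluated at collocation points, and `loss` would be minimized together with boundary/initial-condition terms.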

Original language: English (US)
Article number: 112760
Journal: Journal of Computational Physics
State: Published - Mar 15, 2024


Keywords

  • Hash encoding
  • Partial differential equation
  • Physics-informed neural networks

ASJC Scopus subject areas

  • Numerical Analysis
  • Modeling and Simulation
  • Physics and Astronomy (miscellaneous)
  • General Physics and Astronomy
  • Computer Science Applications
  • Computational Mathematics
  • Applied Mathematics


