Abstract
We review neural network architectures motivated by Fourier series and integrals, referred to as Fourier neural networks. These networks are evaluated empirically on synthetic and real-world tasks. None of them outperforms a standard neural network with sigmoid activation on the real-world tasks. All of the networks, both the Fourier networks and the standard one, empirically achieve lower approximation error than the truncated Fourier series when approximating a known function of multiple variables.
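The comparison in the abstract can be illustrated with a minimal sketch. The code below is not any of the architectures reviewed in the paper; it is a hypothetical one-hidden-layer network of cosine units with random frequencies and phases (only the output weights are fit by least squares), compared against a truncated Fourier series with the same number of basis functions, both fitted to a known periodic target function of one variable. All names, the target function, and the initialization scales are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative smooth periodic target on [0, 2*pi] (not from the paper).
def f(x):
    return np.exp(np.sin(x))

x = np.linspace(0.0, 2.0 * np.pi, 400)
y = f(x)

# Fourier-network sketch: K cosine hidden units with random frequencies
# and phases; only the linear output layer is fitted (least squares).
K = 32
w = rng.normal(scale=3.0, size=K)        # hidden frequencies (assumed init)
b = rng.uniform(0.0, 2.0 * np.pi, K)     # hidden phases (assumed init)
H = np.cos(np.outer(x, w) + b)           # hidden activations, shape (400, K)
coef, *_ = np.linalg.lstsq(H, y, rcond=None)
y_net = H @ coef

# Truncated Fourier series with the same budget of K basis functions:
# a constant plus K//2 harmonics, coefficients fitted by least squares
# (on a uniform grid this approximates the classical Fourier projection).
n_harm = K // 2
basis = [np.ones_like(x)]
for k in range(1, n_harm + 1):
    basis.append(np.cos(k * x))
    basis.append(np.sin(k * x))
F = np.stack(basis, axis=1)
c, *_ = np.linalg.lstsq(F, y, rcond=None)
y_ser = F @ c

# Maximum absolute approximation errors of the two fits.
err_net = float(np.max(np.abs(y_net - y)))
err_ser = float(np.max(np.abs(y_ser - y)))
print(err_net, err_ser)
```

For a smooth periodic target like this one, both fits achieve small error; the paper's empirical finding concerns multivariate functions and trained networks, which this one-dimensional sketch does not reproduce.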
Original language | English (US) |
---|---|
Pages (from-to) | 1107-1120 |
Number of pages | 14 |
Journal | Intelligent Data Analysis |
Volume | 24 |
Issue number | 5 |
DOIs | |
State | Published - 2020 |
Keywords
- convergence
- Fourier series
- function approximation
- neural networks
ASJC Scopus subject areas
- Theoretical Computer Science
- Computer Vision and Pattern Recognition
- Artificial Intelligence