Abstract
A better understanding of electroencephalography (EEG) and electrocorticography (ECoG) signals would bring us closer to comprehending brain function, opening new avenues for treating brain abnormalities and for developing novel Brain-Computer Interface (BCI) applications. Deep Neural Networks (DNNs) have lately been employed with remarkable success to decode EEG/ECoG signals for BCI. However, the optimal architectural/training parameter values of these DNN architectures have yet to receive much attention. In addition, new data-driven optimization methodologies that leverage significant advances in machine learning, such as the Transformer model, have recently been proposed. Because an exhaustive search over all possible architectural/training parameter values of the state-of-the-art DNN model (our baseline model) for decoding the motor imagery EEG and finger flexion ECoG signals of the BCI Competition IV 2a and 4 datasets, respectively, would require a prohibitive amount of time, this paper proposes an offline model-based optimization technique, built on the Transformer model, for discovering the optimal architectural/training parameter values for that model. Our findings indicate that we could pick better values for the baseline model's architectural/training parameters, improving its performance by up to 14.7% on the BCI IV 2a dataset and by up to 61.0% on the BCI IV 4 dataset.
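To illustrate the idea of offline model-based optimization described above, the sketch below fits a surrogate model on a logged dataset of (hyperparameter configuration, observed score) pairs and then ranks unseen configurations by predicted score instead of training each one. This is a deliberately minimal stand-in: the paper uses a Transformer-based surrogate on the BCI IV datasets, whereas here a simple linear least-squares surrogate and entirely made-up data are used purely to show the workflow; all hyperparameter names and values are hypothetical.

```python
import numpy as np

# Hypothetical offline log: each row is an architectural/training configuration
# [log10(learning rate), dropout, number of conv layers], all illustrative.
X = np.array([
    [-3.0, 0.1, 2],
    [-2.0, 0.3, 3],
    [-4.0, 0.5, 4],
    [-3.5, 0.2, 3],
], dtype=float)
# Hypothetical observed validation accuracies for those configurations.
y = np.array([0.62, 0.70, 0.55, 0.68])

# Fit a linear surrogate: accuracy ~ w . x + b, via least squares.
# (The paper's surrogate is a Transformer; this is only a workflow sketch.)
A = np.hstack([X, np.ones((len(X), 1))])  # append bias column
w, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(cfg):
    """Surrogate prediction of accuracy for a candidate configuration."""
    return float(np.append(np.asarray(cfg, dtype=float), 1.0) @ w)

# Rank candidate configurations without actually training any of them,
# then pick the one the surrogate expects to perform best.
candidates = [[-2.5, 0.2, 3], [-3.0, 0.4, 2], [-2.0, 0.25, 4]]
best = max(candidates, key=predict)
```

In practice the chosen configuration(s) would then be trained and evaluated on the real task (here, the BCI IV 2a and 4 decoding problems), and the new results could be fed back into the offline dataset to refine the surrogate.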
| Original language | English (US) |
|---|---|
| Pages (from-to) | 35888-35900 |
| Number of pages | 13 |
| Journal | IEEE Access |
| Volume | 11 |
| State | Published - 2023 |
Keywords
- BCI
- CNN
- data-driven optimization
- deep learning
- ECoG/EEG signals
- self-attention model
ASJC Scopus subject areas
- General Computer Science
- General Materials Science
- General Engineering