GRAPH AUTOENCODER-BASED EMBEDDED LEARNING IN DYNAMIC BRAIN NETWORKS FOR AUTISM SPECTRUM DISORDER IDENTIFICATION

Fuad Noman, Sin Yee Yap, Raphaël C.W. Phan, Hernando Ombao, Chee Ming Ting

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Recent applications of pattern recognition techniques to brain connectome-based classification focus on static functional connectivity (FC), neglecting the dynamics of FC over time, and use input connectivity matrices on a regular Euclidean grid. We exploit graph convolutional networks (GCNs) to learn irregular structural patterns in brain FC networks and propose extensions to capture dynamic changes in network topology. We develop a dynamic graph autoencoder (DyGAE)-based framework that leverages the time-varying topological structures of dynamic brain networks for identification of autism spectrum disorder (ASD). The framework combines a GCN-based DyGAE, which encodes individual-level dynamic networks into time-varying low-dimensional network embeddings, with classifiers based on a weighted fully-connected neural network (FCNN) and long short-term memory (LSTM) that perform dynamic graph classification on the learned spatial-temporal information. Evaluation on a large ABIDE resting-state functional magnetic resonance imaging (rs-fMRI) dataset shows that our method outperforms state-of-the-art methods in detecting altered FC in ASD. Dynamic FC analyses with the DyGAE-learned embeddings also reveal apparent group differences between ASD and healthy controls in network profiles and in the switching dynamics of brain states.
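To make the described pipeline concrete, below is a minimal, illustrative sketch of the DyGAE-style architecture the abstract outlines: a GCN autoencoder embeds each sliding-window FC graph into low-dimensional node embeddings, and an LSTM classifies the resulting embedding sequence. This is not the authors' released code; the choice of PyTorch, the layer sizes, the identity node features, the mean-pooling step, and all class/function names here are assumptions for illustration.

```python
# Illustrative sketch (assumed PyTorch implementation, not the paper's code).
import torch
import torch.nn as nn

def normalize_adj(a):
    """Symmetric normalization D^{-1/2}(A + I)D^{-1/2} of a dense adjacency."""
    a = a + torch.eye(a.size(-1))
    d_inv_sqrt = torch.diag(a.sum(-1).pow(-0.5))
    return d_inv_sqrt @ a @ d_inv_sqrt

class GCNLayer(nn.Module):
    """One graph convolution: H' = relu(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        return torch.relu(self.lin(a_hat @ h))

class GraphAutoencoder(nn.Module):
    """Encode nodes to low-dim embeddings; decode adjacency as sigmoid(Z Z^T),
    the standard inner-product decoder used by graph autoencoders."""
    def __init__(self, n_nodes, emb_dim):
        super().__init__()
        self.enc1 = GCNLayer(n_nodes, 64)
        self.enc2 = GCNLayer(64, emb_dim)

    def forward(self, a_hat, x):
        z = self.enc2(a_hat, self.enc1(a_hat, x))   # node embeddings
        a_rec = torch.sigmoid(z @ z.t())            # reconstructed adjacency
        return z, a_rec

class DynamicGraphClassifier(nn.Module):
    """Pool per-window node embeddings, run an LSTM over the window sequence,
    and classify the final hidden state (ASD vs. healthy control)."""
    def __init__(self, n_nodes, emb_dim=16, hidden=32):
        super().__init__()
        self.gae = GraphAutoencoder(n_nodes, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, adj_seq):
        # adj_seq: (T, N, N) dynamic FC matrices for one subject.
        feats, recs = [], []
        x = torch.eye(adj_seq.size(-1))             # identity node features
        for a in adj_seq:
            z, a_rec = self.gae(normalize_adj(a), x)
            feats.append(z.mean(0))                 # mean-pool over nodes
            recs.append(a_rec)
        seq = torch.stack(feats).unsqueeze(0)       # (1, T, emb_dim)
        _, (h, _) = self.lstm(seq)
        return self.head(h[-1]), torch.stack(recs)

# Toy usage: 10 sliding windows over a hypothetical 116-ROI parcellation.
model = DynamicGraphClassifier(n_nodes=116)
dyn_fc = torch.rand(10, 116, 116)
logits, reconstructions = model(dyn_fc)
print(logits.shape)  # torch.Size([1, 2])
```

In practice the autoencoder's reconstruction loss on the returned adjacencies would be combined with the classification loss; the abstract also mentions a weighted FCNN classifier, which could replace or complement the LSTM head shown here.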
Original language: English (US)
Title of host publication: 2022 IEEE International Conference on Image Processing (ICIP)
Publisher: IEEE
Pages: 2891-2895
Number of pages: 5
ISBN (Print): 9781665496209
DOIs
State: Published - Oct 18 2022
