TY - JOUR
T1 - Cross-parametric generative adversarial network-based magnetic resonance image feature synthesis for breast lesion classification
AU - Fan, Ming
AU - Huang, Guangyao
AU - Lou, Junhong
AU - Gao, Xin
AU - Zeng, Tieyong
AU - Li, Lihua
N1 - KAUST Repository Item: Exported on 2023-09-04
Acknowledged KAUST grant number(s): FCC/1/1976-44-01
Acknowledgements: This work was supported in part by the National Key R&D Program of China (2021YFE0203700), the National Natural Science Foundation of China (62271178, U21A20521), the Natural Science Foundation of Zhejiang Province of China (LR23F010002), and the King Abdullah University of Science and Technology (KAUST) Office of Sponsored Research (OSR) under award no. FCC/1/1976-44-01.
PY - 2023/9/1
Y1 - 2023/9/1
N2 - Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) contains information on tumor morphology and physiology for breast cancer diagnosis and treatment. However, this technique requires contrast agent injection and a longer acquisition time than other parametric imaging, such as T2-weighted imaging (T2WI). Current image synthesis methods attempt to map image data from one domain to another, whereas it is challenging or even infeasible to map images acquired with a single sequence to images acquired with multiple sequences. Here, we propose a new approach, cross-parametric generative adversarial network (GAN)-based feature synthesis (CPGANFS), to generate discriminative DCE-MRI features from T2WI, with applications in breast cancer diagnosis. The proposed approach decodes the T2W images into latent cross-parameter features and reconstructs the DCE-MRI and T2WI features by balancing the information shared between the two. A Wasserstein GAN with a gradient penalty is employed to differentiate the T2WI-generated features from ground-truth features extracted from DCE-MRI. The model based on the synthesized DCE-MRI features achieved significantly (p = 0.036) higher prediction performance (AUC = 0.866) in breast cancer diagnosis than the model based on T2WI (AUC = 0.815). Visualization of the model shows that our CPGANFS method enhances predictive power by elevating attention to the lesion and the surrounding parenchyma, driven by the interparametric information learned from T2WI and DCE-MRI. Our proposed CPGANFS provides a framework for cross-parametric MR image feature generation from a single-sequence image, guided by an information-rich, time-series image with kinetic information. Extensive experimental results demonstrate its effectiveness, with high interpretability and improved performance in breast cancer diagnosis.
UR - http://hdl.handle.net/10754/694036
UR - https://ieeexplore.ieee.org/document/10237240/
U2 - 10.1109/jbhi.2023.3311021
DO - 10.1109/jbhi.2023.3311021
M3 - Article
SN - 2168-2194
SP - 1
EP - 11
JO - IEEE Journal of Biomedical and Health Informatics
JF - IEEE Journal of Biomedical and Health Informatics
ER -