Abstract
The right to revoke one's personal private data is a basic human right, yet it is often overlooked or infringed upon as patient data are increasingly collected and used for model training. To secure patients' right to be forgotten, we propose a solution that uses auditing to guide the forgetting process: auditing determines whether a dataset has been used to train a model, and forgetting removes the information of a query dataset from the target model. We unify these two tasks through an approach we call knowledge purification. To implement our solution, we developed an audit to forget software (AFS), which can evaluate and revoke patients' private data from pre-trained deep learning models. Here, we show the usability of AFS and its application potential in real-world intelligent healthcare to enhance privacy protection and data-revocation rights.
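The audit-then-forget workflow described above can be illustrated with a minimal toy sketch. The code below is an assumption-laden simplification, not the paper's method: it uses a 1-nearest-neighbor regressor (which memorizes its training set) as the "model", a loss gap between the query set and held-out calibration data as the audit signal, and exact deletion plus refitting as the forgetting step, whereas AFS is designed to avoid retraining from scratch. All names (`audit_gap`, `predict_1nn`, etc.) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_1nn(X_train, y_train, X):
    # Predict each point's label from its nearest training point.
    # A 1-NN model memorizes its training set, making membership visible.
    d = ((X[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return y_train[d.argmin(axis=1)]

def mse(X_train, y_train, X, y):
    return float(np.mean((predict_1nn(X_train, y_train, X) - y) ** 2))

def audit_gap(X_train, y_train, Xq, yq, Xc, yc):
    # Audit heuristic (an assumption, not the paper's exact metric):
    # a positive gap means the model fits the query set better than
    # unseen calibration data -- evidence the query set was trained on.
    return mse(X_train, y_train, Xc, yc) - mse(X_train, y_train, Xq, yq)

def make(n):  # toy regression data
    X = rng.normal(size=(n, 3))
    return X, X.sum(1) + 0.1 * rng.normal(size=n)

X_keep, y_keep = make(200)    # data that may be retained
X_query, y_query = make(40)   # patient data to be audited / forgotten
X_cal, y_cal = make(300)      # held-out calibration data

# "Trained" model: 1-NN over retained + query data.
X_full = np.vstack([X_keep, X_query])
y_full = np.concatenate([y_keep, y_query])

gap_before = audit_gap(X_full, y_full, X_query, y_query, X_cal, y_cal)
# Forgetting here is exact deletion and refitting on retained data only,
# the simplest baseline; the audit is then re-run to verify forgetting.
gap_after = audit_gap(X_keep, y_keep, X_query, y_query, X_cal, y_cal)
assert gap_before > gap_after  # audit signal drops once data is forgotten
```

The point of the sketch is the loop, not the model: the same audit statistic is used both to detect that the query dataset influenced the model and, after forgetting, to verify that its influence has been removed.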
| Original language | English (US) |
|---|---|
| Article number | 6255 |
| Journal | Nature Communications |
| Volume | 14 |
| Issue number | 1 |
| DOIs | |
| State | Published - Dec 2023 |
ASJC Scopus subject areas
- General Chemistry
- General Biochemistry, Genetics and Molecular Biology
- General Physics and Astronomy