TY - JOUR
T1 - Preventing Sensitive Information Leakage from Mobile Sensor Signals via Integrative Transformation
AU - Zhang, Dalin
AU - Yao, Lina
AU - Chen, Kaixuan
AU - Yang, Zheng
AU - Gao, Xin
AU - Liu, Yunhao
N1 - KAUST Repository Item: Exported on 2021-05-12
PY - 2021
Y1 - 2021
AB - Ubiquitous mobile sensors used for human activity recognition pose the threat of leaking personal information that is explicitly contained within the time-series sensor signals and can be extracted by attackers. Existing protective methods only support specific sensitive attributes and require massive amounts of relevant sensitive ground truth for training, which is unfavourable to users. To fill this gap, we propose a novel data transformation framework for prohibiting the leakage of sensitive information from sensor data. The proposed framework transforms raw sensor data into a new format, where the sensitive information is hidden and the desired information (e.g., human activities) is retained. Training can be conducted without using any personal information as ground truth. Meanwhile, all attributes of sensitive information (e.g., age, gender) can be hidden collectively through a one-time transformation. The experimental results on two multimodal sensor-based human activity datasets demonstrate the feasibility of the presented framework in hiding users' sensitive information (MAE increases 2 times and accuracy degrades 50%) without degrading the usability of the data for activity recognition (2% accuracy degradation).
UR - http://hdl.handle.net/10754/669165
UR - https://ieeexplore.ieee.org/document/9424974/
U2 - 10.1109/TMC.2021.3078086
DO - 10.1109/TMC.2021.3078086
M3 - Article
SN - 2161-9875
SP - 1
EP - 1
JO - IEEE Transactions on Mobile Computing
JF - IEEE Transactions on Mobile Computing
ER -