TY - GEN
T1 - Federated Learning is Better with Non-Homomorphic Encryption
AU - Burlachenko, Konstantin
AU - Alrowithi, Abdulmajeed
AU - Albalawi, Fahad Ali
AU - Richtárik, Peter
N1 - Publisher Copyright:
© 2023 Owner/Author.
PY - 2023/12/8
Y1 - 2023/12/8
N2 - Traditional AI methodologies necessitate centralized data collection, which becomes impractical when facing problems with network communication, data privacy, or storage capacity. Federated Learning (FL) offers a paradigm that empowers distributed AI model training without collecting raw data. There are different choices for providing privacy during FL training. One popular methodology is employing Homomorphic Encryption (HE), a breakthrough in privacy-preserving computation from Cryptography. However, these methods come at the price of extra computation and memory footprint. To resolve these issues, we propose an innovative framework that synergizes permutation-based compressors with Classical Cryptography, even though employing Classical Cryptography was previously assumed to be impossible in the context of FL. Our framework offers a way to replace HE with cheaper Classical Cryptography primitives, which provide security for the training process. It fosters asynchronous communication and provides flexible deployment options in various communication topologies.
AB - Traditional AI methodologies necessitate centralized data collection, which becomes impractical when facing problems with network communication, data privacy, or storage capacity. Federated Learning (FL) offers a paradigm that empowers distributed AI model training without collecting raw data. There are different choices for providing privacy during FL training. One popular methodology is employing Homomorphic Encryption (HE), a breakthrough in privacy-preserving computation from Cryptography. However, these methods come at the price of extra computation and memory footprint. To resolve these issues, we propose an innovative framework that synergizes permutation-based compressors with Classical Cryptography, even though employing Classical Cryptography was previously assumed to be impossible in the context of FL. Our framework offers a way to replace HE with cheaper Classical Cryptography primitives, which provide security for the training process. It fosters asynchronous communication and provides flexible deployment options in various communication topologies.
KW - AES
KW - asynchronous training
KW - CKKS
KW - federated learning
KW - optimization
KW - privacy preserving machine learning
KW - security
UR - http://www.scopus.com/inward/record.url?scp=85183586482&partnerID=8YFLogxK
U2 - 10.1145/3630048.3630182
DO - 10.1145/3630048.3630182
M3 - Conference contribution
AN - SCOPUS:85183586482
T3 - DistributedML 2023 - Proceedings of the 4th International Workshop on Distributed Machine Learning
SP - 49
EP - 84
BT - DistributedML 2023 - Proceedings of the 4th International Workshop on Distributed Machine Learning
PB - Association for Computing Machinery, Inc
T2 - 4th International Workshop on Distributed Machine Learning, DistributedML 2023
Y2 - 8 December 2023
ER -