TY - JOUR
T1 - Privacy-Preserving Federated Learning With Resource Adaptive Compression for Edge Devices
AU - Hidayat, Muhammad Ayat
AU - Nakamura, Yugo
AU - Arakawa, Yutaka
N1 - Publisher Copyright: IEEE
PY - 2023
Y1 - 2023
AB - Federated learning (FL) has gained widespread attention as a distributed machine learning technique that protects data by training on local devices. Unlike conventional centralized training, FL does not share raw data between clients and the server, thereby safeguarding potentially sensitive information. However, FL remains vulnerable to privacy and security threats, and commonly used countermeasures, such as encryption and blockchain technologies, often incur significant computational and communication costs, making them impractical for devices with restricted resources. To tackle this challenge, we present a privacy-preserving federated learning system designed specifically for resource-constrained devices that leverages compressive sensing and differential privacy. We implement a weight-pruning-based compressive sensing method whose compression ratio adapts to resource availability. In addition, we employ differential privacy to add noise to the gradients before they are sent to a central server for aggregation, thereby protecting their privacy. Evaluation results demonstrate that our proposed method achieves slightly higher accuracy than state-of-the-art methods such as DP-FedAvg, DP-FedOpt, and AGC-DP on the MNIST, Fashion-MNIST, and Human Activity Recognition datasets, while also incurring lower total communication cost and training time. Moreover, we comprehensively evaluate our method's resilience to poisoning attacks and show that it resists them better than existing state-of-the-art approaches.
UR - http://www.scopus.com/inward/record.url?scp=85181565659&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85181565659&partnerID=8YFLogxK
U2 - 10.1109/JIOT.2023.3347552
DO - 10.1109/JIOT.2023.3347552
M3 - Article
AN - SCOPUS:85181565659
SN - 2327-4662
VL - 11
SP - 1
JO - IEEE Internet of Things Journal
JF - IEEE Internet of Things Journal
IS - 8
ER -