TY - GEN
T1 - AGC-DP
T2 - 24th IEEE International Conference on Mobile Data Management, MDM 2023
AU - Hidayat, Muhammad Ayat
AU - Nakamura, Yugo
AU - Dawton, Billy
AU - Arakawa, Yutaka
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Federated learning provides techniques for training algorithms on mobile or decentralized devices, in contrast to traditional machine learning, in which training is performed on centralized devices. In addition, federated learning provides privacy and security benefits, as the client and server do not share raw data, which may contain confidential information. A number of studies have shown, however, that federated learning alone is not enough to protect data privacy in certain situations. To overcome this problem, differential privacy has been proposed, a technique in which artificial noise is added to the raw data. This method provides a high level of privacy protection; however, the added noise also reduces model accuracy. To address this issue, this paper proposes a new approach to implementing differential privacy in federated learning using adaptive Gaussian clipping. We implement the method by tightening the privacy budget and introducing dynamic sampling probability, hyperparameter-based adaptive clipping, and a new privacy loss calculation. The main objective of our method is to adaptively change the amount of noise added to the model, thereby maximizing model accuracy while maintaining the level of privacy protection. Evaluation results show that our proposed method achieves slightly better accuracy than existing differential privacy variants such as RDP, DP-SGD, and zCDP, on both balanced (i.i.d.) and unbalanced (non-i.i.d.) datasets, at a lower total communication cost than some of these variants.
AB - Federated learning provides techniques for training algorithms on mobile or decentralized devices, in contrast to traditional machine learning, in which training is performed on centralized devices. In addition, federated learning provides privacy and security benefits, as the client and server do not share raw data, which may contain confidential information. A number of studies have shown, however, that federated learning alone is not enough to protect data privacy in certain situations. To overcome this problem, differential privacy has been proposed, a technique in which artificial noise is added to the raw data. This method provides a high level of privacy protection; however, the added noise also reduces model accuracy. To address this issue, this paper proposes a new approach to implementing differential privacy in federated learning using adaptive Gaussian clipping. We implement the method by tightening the privacy budget and introducing dynamic sampling probability, hyperparameter-based adaptive clipping, and a new privacy loss calculation. The main objective of our method is to adaptively change the amount of noise added to the model, thereby maximizing model accuracy while maintaining the level of privacy protection. Evaluation results show that our proposed method achieves slightly better accuracy than existing differential privacy variants such as RDP, DP-SGD, and zCDP, on both balanced (i.i.d.) and unbalanced (non-i.i.d.) datasets, at a lower total communication cost than some of these variants.
UR - http://www.scopus.com/inward/record.url?scp=85171127876&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85171127876&partnerID=8YFLogxK
U2 - 10.1109/MDM58254.2023.00042
DO - 10.1109/MDM58254.2023.00042
M3 - Conference contribution
AN - SCOPUS:85171127876
T3 - Proceedings - IEEE International Conference on Mobile Data Management
SP - 199
EP - 208
BT - Proceedings - 2023 24th IEEE International Conference on Mobile Data Management, MDM 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 3 July 2023 through 6 July 2023
ER -