Innovative Privacy-Preserving Strategies in Federated Learning

Authors: Deny P. Francis, R. Sharmila

Journal: International Journal of Information Engineering and Electronic Business @ijieeb

Issue: No. 6, Vol. 17, 2025.

Free access

Federated Learning (FL) enables collaborative model training across distributed clients without sharing raw data, but it remains vulnerable to privacy risks. This study introduces FL-ODP-DFT, a novel framework that integrates Optimal Differential Privacy (ODP) with Discrete Fourier Transform (DFT) to enhance both model performance and privacy. By transforming local gradients into the frequency domain, the method reduces data size and adds a layer of encryption before transmission. Adaptive Gaussian Clipping (AGC) is employed to dynamically adjust clipping thresholds based on gradient distribution, further improving gradient handling. ODP then calibrates noise addition based on data sensitivity and privacy budgets, ensuring a balance between privacy and accuracy. Extensive experiments demonstrate that FL-ODP-DFT outperforms existing techniques in terms of accuracy, computational efficiency, convergence speed, and privacy protection, making it a robust and scalable solution for privacy-preserving FL.
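The local-client pipeline described above (adaptive clipping, frequency-domain transform, calibrated noise) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, the clipping rule (mean plus one standard deviation of the gradient magnitudes), and the Gaussian-mechanism noise scale are all assumptions made for the example.

```python
import numpy as np

def fl_odp_dft_local_update(grad, epsilon=1.0, delta=1e-5):
    """Illustrative local-client step: adaptive clipping, DFT, noise.

    The clipping rule and noise calibration here are assumptions for
    illustration; the paper's exact formulas may differ.
    """
    # Adaptive Gaussian Clipping (AGC): derive the clipping threshold
    # from the current gradient distribution rather than a fixed constant.
    threshold = np.abs(grad).mean() + np.abs(grad).std()
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, threshold / (norm + 1e-12))

    # Transform the clipped gradient into the frequency domain (DFT);
    # low-magnitude coefficients could then be dropped to reduce size.
    freq = np.fft.rfft(clipped)

    # Differentially private perturbation: Gaussian noise calibrated to
    # the sensitivity (clipping threshold) and privacy budget epsilon.
    sigma = threshold * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    noise = (np.random.normal(0.0, sigma, freq.shape)
             + 1j * np.random.normal(0.0, sigma, freq.shape))
    return freq + noise

# Server side: the inverse DFT recovers a perturbed gradient
# suitable for aggregation across clients.
grad = np.random.randn(128)
noisy_freq = fl_odp_dft_local_update(grad)
recovered = np.fft.irfft(noisy_freq, n=grad.size)
```

Transmitting frequency-domain coefficients rather than raw gradients is what provides both the size reduction and the additional obfuscation layer the abstract mentions.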


Keywords: Federated Learning, Privacy-Preservation, Optimal Differential Privacy, DFT, AGC, Gradient Management

Short URL: https://sciup.org/15020075

IDR: 15020075   |   DOI: 10.5815/ijieeb.2025.06.09