Defending against targeted poisoning attacks in federated learning

Publication Date

2022

Institution Author

Gürsoy, Mehmet Emre
Erbil, Pınar

Publisher

IEEE Computer Soc

Type

Conference proceeding

Abstract

Federated learning (FL) enables multiple participants to collaboratively train a deep neural network (DNN) model. To combat malicious participants in FL, Byzantine-resilient aggregation rules (AGRs) have been developed. However, although Byzantine-resilient AGRs are effective against untargeted attacks, they become suboptimal when attacks are stealthy and targeted. In this paper, we study the problem of defending against targeted data poisoning attacks in FL and make three main contributions. First, we propose a method for selectively extracting, from FL participants' update vectors, the DNN parameters that are indicative of an attack, and embedding them into a low-dimensional latent space. We show that the effectiveness of Byzantine-resilient AGRs such as Trimmed Mean and Krum can be improved if they are used in combination with our proposed method. Second, we develop a clustering-based defense using X-Means for separating items into malicious versus benign clusters in latent space. Such separation allows identification of malicious versus benign updates. Third, using the separation from the previous step, we show that a "clean" model (i.e., a model that is not negatively impacted by the attack) can be trained using only the benign updates. We experimentally evaluate our defense methods on Fashion-MNIST and CIFAR-10 datasets. Results show that our methods can achieve up to 95% true positive rate and 99% accuracy in malicious update identification across various settings. In addition, the clean models trained using our approach achieve accuracy similar to a baseline scenario without poisoning.
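The three-step defense described in the abstract (selective parameter extraction, low-dimensional embedding, cluster-based separation of benign from malicious updates) can be illustrated with a toy sketch. This is not the paper's implementation: variance-based coordinate selection, an SVD-based PCA, and a deterministic 2-means split are assumptions standing in for the paper's extraction method and X-Means clustering, which may choose the number of clusters adaptively.

```python
# Hypothetical sketch (not the paper's code) of the abstract's defense pipeline:
# 1) select attack-indicative parameters, 2) embed into a low-dimensional latent
# space, 3) separate benign from malicious updates by clustering.
import numpy as np

def select_indicative_params(updates, k=16):
    """Keep the k coordinates with the highest variance across participants
    (an assumed stand-in for the paper's selective parameter extraction)."""
    idx = np.argsort(updates.var(axis=0))[-k:]
    return updates[:, idx]

def embed(selected, dim=2):
    """Project into a low-dimensional latent space via PCA (SVD-based)."""
    centered = selected - selected.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:dim].T

def split_benign(latent):
    """Deterministic 2-means split; the larger cluster is assumed benign.
    (A fixed-k stand-in for the paper's X-Means clustering.)"""
    mean = latent.mean(axis=0)
    far = np.linalg.norm(latent - mean, axis=1).argmax()
    centers = np.stack([mean, latent[far]]).astype(float)
    for _ in range(50):  # Lloyd iterations
        d = np.linalg.norm(latent[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for c in range(2):
            members = labels == c
            if members.any():
                centers[c] = latent[members].mean(axis=0)
    benign = np.bincount(labels, minlength=2).argmax()
    return labels == benign

# Toy round: 18 benign updates (small noise) and 2 poisoned updates whose
# first five coordinates are shifted, mimicking a targeted poisoning attack.
rng = np.random.default_rng(0)
updates = rng.normal(0.0, 0.1, size=(20, 100))
updates[:2, :5] += 3.0  # participants 0 and 1 are malicious

latent = embed(select_indicative_params(updates))
benign_mask = split_benign(latent)
clean_update = updates[benign_mask].mean(axis=0)  # aggregate benign updates only
```

In this toy setting the poisoned coordinates dominate both the variance ranking and the leading PCA directions, so the two malicious updates fall into the smaller cluster and are excluded from the final aggregate, mirroring the "clean model" training step.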

Subject

Computer science, Information systems
