Mazzocca, C., Mora, A., Romandini, N., Montanari, R., Bellavista, P. (2024). Membership Proof in Federated Learning via Cryptographic Accumulators. IEEE. DOI: 10.1109/bcca62388.2024.10844447.
Membership Proof in Federated Learning via Cryptographic Accumulators
Mazzocca, Carlo; Mora, Alessio; Romandini, Nicolò; Montanari, Rebecca; Bellavista, Paolo
2024
Abstract
Federated Learning (FL) is a decentralized training paradigm where clients collaboratively train a Machine Learning (ML) model without outsourcing their raw data. Each participant locally trains a model on its data, and these models are periodically aggregated to build a global model. In this scenario, clients may want to verify the inclusion of their contributions to ensure accurate and fair computation of the global model. This verification also serves to confirm that their participation has been adequately rewarded. Therefore, FL frameworks should allow clients to efficiently verify their proof of inclusion (i.e., membership proof) in the training without affecting privacy. This paper presents a protocol for efficient membership proof in FL (MPFL). Our protocol leverages cryptographic accumulators, which enable clients to verify membership proofs with minimal overhead, and a smart contract deployed on a blockchain to ensure the correct generation of the global model and membership proofs. We implemented MPFL and conducted evaluations across various datasets, ML models, and numbers of clients. The experimental findings reveal that a client requires only 96 bytes to maintain the information needed to verify its inclusion, and completes this verification in approximately 20 ms.
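To illustrate the general membership-proof idea sketched in the abstract, the snippet below shows how a client could verify an inclusion proof against a Merkle-tree accumulator whose root has been published (e.g., on-chain). This is a minimal, hypothetical sketch of one possible accumulator construction; it is not the paper's actual MPFL protocol, accumulator scheme, or smart-contract logic, and all function names are illustrative.

```python
# Illustrative sketch only: a Merkle-tree accumulator is ONE possible way to
# obtain compact membership proofs; MPFL's actual accumulator and on-chain
# interaction are specified in the paper.
import hashlib


def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()


def build_tree(leaves):
    """Build a Merkle tree bottom-up; returns the list of levels (leaves first)."""
    level = [h(x) for x in leaves]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2 == 1:          # duplicate the last node on odd levels
            level = level + [level[-1]]
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels


def membership_proof(levels, index):
    """Collect sibling hashes from leaf to root for the leaf at `index`."""
    proof = []
    for level in levels[:-1]:
        if len(level) % 2 == 1:
            level = level + [level[-1]]
        sibling = index ^ 1              # the adjacent node at this level
        proof.append((level[sibling], sibling % 2 == 0))
        index //= 2
    return proof


def verify(root, leaf, proof):
    """Client-side check: recompute the root from the leaf and the proof."""
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root


# Example: an aggregator accumulates hashed client contributions, publishes the
# root, and each client later verifies its own inclusion with a short proof.
updates = [f"client-{i}-model-update".encode() for i in range(8)]
levels = build_tree(updates)
root = levels[-1][0]
proof = membership_proof(levels, 3)
assert verify(root, updates[3], proof)             # honest inclusion verifies
assert not verify(root, b"forged-update", proof)   # a forged contribution does not
```

In such a construction, the per-client verification material is just the accumulator value plus a logarithmic-size proof, which is consistent with the constant, few-byte client state and millisecond-scale verification reported in the abstract.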
File | Size | Format
---|---|---
BCCA2024.pdf (open access; Type: Postprint; License: free open-access license) | 454.7 kB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.