Khowaja, S.A., Dev, K., Khowaja, P., Bellavista, P. (2021). Toward Energy-Efficient Distributed Federated Learning for 6G Networks. IEEE Wireless Communications, 28(6), 34-40 [10.1109/MWC.012.2100153].
Toward Energy-Efficient Distributed Federated Learning for 6G Networks
Bellavista, Paolo
2021
Abstract
The provision of communication services via portable and mobile devices, such as aerial base stations, is a crucial concept to be realized in 5G/6G networks. Conventionally, IoT/edge devices need to transmit data directly to the base station for training the model using machine learning techniques. This data transmission introduces privacy issues that might lead to security concerns and monetary losses. Recently, federated learning was proposed to partially solve privacy issues via model sharing with the base station. However, the centralized nature of federated learning only allows devices within the vicinity of base stations to share trained models. Furthermore, the long-range communication compels the devices to increase transmission power, which raises energy efficiency concerns. In this work, we propose the distributed federated learning (DBFL) framework, which overcomes the connectivity and energy efficiency issues for distant devices. The DBFL framework is compatible with mobile edge computing architecture and connects the devices in a distributed manner using clustering protocols. Experimental results show that the framework improves classification performance by 7.4 percent in comparison to conventional federated learning while reducing energy consumption.
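The cluster-based aggregation idea in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes each cluster head first averages its members' model parameters over short-range links (FedAvg-style weighting by sample count), and only the per-cluster models are then combined at the edge server. All function names and data shapes here are assumptions for illustration.

```python
# Hypothetical sketch of clustered federated aggregation, loosely inspired by
# the DBFL description in the abstract. Not the paper's actual algorithm.
import numpy as np

def weighted_average(models, sample_counts):
    """FedAvg-style weighted average of model parameter vectors."""
    weights = np.asarray(sample_counts, dtype=float)
    weights /= weights.sum()
    return sum(w * m for w, m in zip(weights, models))

def dbfl_round(clusters):
    """One aggregation round.

    clusters: list of clusters; each cluster is a list of
              (local_model_params, num_local_samples) tuples.
    Cluster heads aggregate locally over short-range links, so only one
    model per cluster travels the long-range link to the edge server.
    """
    cluster_models, cluster_sizes = [], []
    for members in clusters:
        models = [m for m, _ in members]
        counts = [n for _, n in members]
        cluster_models.append(weighted_average(models, counts))
        cluster_sizes.append(sum(counts))
    # Edge server combines the cluster models into the global model.
    return weighted_average(cluster_models, cluster_sizes)
```

Compared with conventional federated learning, where every device uploads its model directly to the base station, only cluster heads use the long-range link here, which is the intuition behind the energy savings claimed in the abstract.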
File | Type | License | Size | Format
---|---|---|---|---
2201.08270.pdf (open access) | Postprint | Free open-access license | 2.58 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.