@INPROCEEDINGS{Alfa2106:Distributed,
  AUTHOR="Haneen Alfauri and Flavio Esposito",
  TITLE="A Distributed Consensus Protocol for Sustainable Federated Learning",
  BOOKTITLE="2021 IEEE 7th International Conference on Network Softwarization (NetSoft)",
  ADDRESS="Tokyo, Japan",
  DAYS=27,
  MONTH=jun,
  YEAR=2021,
  KEYWORDS="Efficient",
  ABSTRACT="Global warming is the most significant challenge of our time, and it impacts virtually every area of our lives. This study was motivated by the observation that training Artificial Intelligence and Machine Learning (AI/ML) algorithms results in staggering carbon footprints. Moreover, centralized implementations are becoming a bottleneck for several AI/ML applications that need frequent retraining and low-latency responses. To overcome the limitations of centralized ML, the research community has proposed Federated Learning, a technique for training AI/ML algorithms in a distributed fashion. Significant previous work has reduced power consumption by adopting efficient hardware techniques; while such techniques yield large savings, they do not focus on distributed learning. We propose an energy-efficient consensus protocol for sustainable Federated Learning. Our protocol iterates over a bidding phase and an agreement (or consensus) phase, exchanging only bids and a small amount of other policy-driven information with neighboring workers. The consensus is a mapping of AI/ML jobs to the underlying infrastructure. The proposed mechanism is proven to converge and has a worst-case efficiency of 63%. Our simulations show significant energy savings of 22.7% with respect to our benchmark."
}