Decentralized Incremental Federated Learning with Echo State Networks

Abstract

Federated Echo State Networks have proven their efficiency in low-resource collaborative learning settings where data is subject to privacy regulations. In this work, we broaden the applicability of this machine learning approach to a decentralized setting, where peers are connected through a logical communication topology and cannot rely on a centralized aggregation entity. In particular, we propose Decentralized Incremental Federated Learning (DIncFed), where multiple agents collaborate to learn a readout by leveraging exact consensus strategies. Such strategies include mechanisms for collaboratively aggregating knowledge towards consensus, as well as policies for dynamically updating the communication topology. Experiments demonstrate the efficacy and efficiency of the proposed learning methodology against a state-of-the-art iterative competitor on multiple benchmarks characterized by different levels of statistical heterogeneity.
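The sketch below is a minimal, hedged illustration (not the paper's implementation) of the general idea the abstract describes: each peer summarizes its local reservoir states into the sufficient statistics of a ridge-regression readout, and the peers average those statistics over a fixed logical topology via consensus mixing, with no central aggregator. The topology, mixing weights, toy data, and helper `metropolis_weights` are all assumptions for illustration only.

```python
import numpy as np

def metropolis_weights(adj):
    """Doubly stochastic mixing matrix from an undirected adjacency matrix."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if adj[i, j]:
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()
    return W

rng = np.random.default_rng(0)
n_peers, reservoir_dim, out_dim, lam = 5, 50, 3, 1e-3

# Ring topology among the peers (a stand-in for the logical communication topology).
adj = np.zeros((n_peers, n_peers))
for i in range(n_peers):
    adj[i, (i + 1) % n_peers] = adj[(i + 1) % n_peers, i] = 1
W_mix = metropolis_weights(adj)

# Each peer holds local reservoir states H_i and targets Y_i (random here for
# illustration) and summarizes them into ridge-regression statistics.
A = np.zeros((n_peers, reservoir_dim, reservoir_dim))
B = np.zeros((n_peers, reservoir_dim, out_dim))
for i in range(n_peers):
    H = rng.standard_normal((200, reservoir_dim))
    Y = rng.standard_normal((200, out_dim))
    A[i], B[i] = H.T @ H, H.T @ Y

# Average consensus: repeated mixing drives every peer's local statistics
# towards the network-wide average without a centralized aggregation entity.
for _ in range(100):
    A = np.einsum('ij,jkl->ikl', W_mix, A)
    B = np.einsum('ij,jkl->ikl', W_mix, B)

# Each peer closes its readout locally from the (approximately) agreed statistics;
# rescaling by n_peers recovers the network-wide sums from the averages.
readouts = [np.linalg.solve(n_peers * A[i] + lam * np.eye(reservoir_dim),
                            n_peers * B[i]) for i in range(n_peers)]
```

In this toy version agreement is only asymptotic; the exact consensus strategies and topology-update policies proposed in the paper are not reproduced here.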

Publication
In 2024 International Joint Conference on Neural Networks (IJCNN)
Valerio De Caro

My research interests include Federated Learning, Continual Learning and Reservoir Computing Systems.