The recent interest in federated learning has spurred the investigation of efficient models that can be deployed under strict communication and computational constraints. Moreover, the inherent privacy concerns of decentralized and federated learning call for an efficient distribution of information across a network of interconnected agents. Therefore, we propose a novel distributed classification solution based on shallow randomized networks equipped with a compression mechanism for sharing the local models in the federated context. We make extensive use of hyperdimensional computing, both in the local network model and in the compressed communication protocol, the latter enabled by the binding and superposition operations. The accuracy, precision, and stability of the proposed approach are demonstrated on a collection of datasets, across several network topologies, and under different data-partitioning schemes.
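For illustration, the following is a minimal sketch of the two hyperdimensional-computing primitives the abstract names, binding and superposition, assuming bipolar hypervectors with element-wise multiplication as the binding operator; the dimensionality, the sign-threshold bundling, and the key-value packing shown here are common HDC conventions used as assumptions, not the paper's exact compression protocol.

```python
import numpy as np

# Sketch of HDC binding and superposition on bipolar {-1, +1} hypervectors.
# D and the choice of multiplication-binding are illustrative assumptions.

D = 10_000  # hypervector dimensionality (a typical HDC choice)
rng = np.random.default_rng(0)

def random_hv(d: int = D) -> np.ndarray:
    """Draw a random bipolar hypervector."""
    return rng.choice([-1, 1], size=d)

def bind(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Binding: element-wise multiplication. The result is dissimilar to
    both inputs, which allows forming key-value associations."""
    return a * b

def superpose(*hvs: np.ndarray) -> np.ndarray:
    """Superposition (bundling): element-wise sum followed by a sign
    threshold. The result remains similar to each of its summands."""
    return np.sign(np.sum(hvs, axis=0))

# Packing two key-value pairs into a single hypervector, in the spirit of
# compressing a local model into one vector for communication:
k1, v1, k2, v2 = (random_hv() for _ in range(4))
packed = superpose(bind(k1, v1), bind(k2, v2))

# Unbinding with a key recovers a noisy copy of its value; the cosine
# similarity to the original value is well above chance (~0).
recovered = bind(packed, k1)
cos = recovered @ v1 / (np.linalg.norm(recovered) * np.linalg.norm(v1))
print(f"similarity to v1: {cos:.2f}")
```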