Decentralized federated learning meets Physics-Informed Neural Networks
- Authors: Alfano G.; Greco S.; Mandaglio D.; Parisi F.; Shahbazian R.; Trubitsyna I.
- Publication year: 2025
- Type: Journal article
- OA Link: http://hdl.handle.net/10447/683123
Abstract
The integration of domain knowledge into the learning process of artificial intelligence (AI) has received significant attention in the last few years. Most of the approaches proposed so far have focused on centralized machine learning scenarios, with less emphasis on how domain knowledge can be effectively integrated into decentralized settings. In this paper, we address this gap by evaluating the effectiveness of domain knowledge integration in distributed settings, specifically in the context of Decentralized Federated Learning (DFL). We propose the Physics-Informed DFL (PIDFL) architecture, which integrates domain knowledge expressed as differential equations. We introduce a serverless data aggregation algorithm for PIDFL, prove its convergence, and discuss its computational complexity. We perform comprehensive experiments across various datasets and demonstrate that PIDFL significantly reduces average loss across diverse applications. The proposed PIDFL framework achieves on average over 40% lower test loss compared with the baseline DFLA, and outperforms benchmark approaches (FedAvg, SegGos, and Scaffold) across a variety of datasets. This highlights the potential of PIDFL and offers a promising avenue for improving decentralized learning through domain knowledge integration.
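The core ideas named in the abstract (a PINN-style loss that adds a differential-equation residual to the data loss, trained across nodes that aggregate without a central server) can be illustrated with a toy sketch. This is not the paper's PIDFL algorithm: the linear model, the assumed governing equation u'(x) = -1, the ring-neighbor averaging, and all parameter values are hypothetical choices made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not from the paper): each node fits a linear
# model u(x) = w0 + w1*x to noisy local data generated from u = 2 - x.
# The assumed governing equation u'(x) = -1 contributes a physics
# residual (w1 + 1)^2 to the loss, in the spirit of PINN training.

def local_step(w, x, y, lam=0.5, lr=0.05):
    """One gradient step on data MSE + lam * physics residual."""
    pred = w[0] + w[1] * x
    err = pred - y
    g0 = 2 * err.mean()                                # d/dw0 of MSE
    g1 = 2 * (err * x).mean() + lam * 2 * (w[1] + 1)   # MSE + physics term
    return w - lr * np.array([g0, g1])

# Three nodes with disjoint local data; no central server.
xs = [rng.uniform(0, 1, 20) for _ in range(3)]
ys = [2 - x + 0.05 * rng.normal(size=x.size) for x in xs]
ws = [np.zeros(2) for _ in range(3)]

for _ in range(200):
    # Local physics-informed updates on each node's own data.
    ws = [local_step(w, x, y) for w, x, y in zip(ws, xs, ys)]
    # Serverless aggregation: each node averages with its ring neighbors.
    ws = [(ws[i - 1] + ws[i] + ws[(i + 1) % 3]) / 3 for i in range(3)]

print(ws[0])  # parameters approach the true [2, -1]
```

In this sketch the physics residual and the data agree on the true slope, so the penalty accelerates agreement among nodes; the actual PIDFL aggregation rule, convergence proof, and complexity analysis are given in the paper itself.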