Federated Learning (FL) enables training over sensitive, distributed data without centralizing it, but the canonical server-based architecture introduces a single point of failure, severe communication bottlenecks, and strong trust assumptions. Decentralized FL (DFL) addresses these issues by removing the central aggregator and relying on peer-to-peer exchanges, yet most existing designs either ignore realistic bandwidth and latency constraints or treat model parameters as opaque bytes. FLTorrent combines DFL with a BitTorrent-style P2P substrate and makes the transport layer explicitly model-aware. It repurposes the FL server into a lightweight orchestrator, uses a managed BitTorrent-like overlay for chunk-based model exchange, and introduces importance-aware chunk selection together with robust aggregation mechanisms tailored to partial models, which are the norm under tight per-round time and bandwidth budgets. This design aims to reconcile scalability, bandwidth efficiency, and learning accuracy in heterogeneous, bandwidth-constrained decentralized environments.
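To make the chunk-based exchange concrete, the following is a minimal illustrative sketch, not FLTorrent's actual implementation: a model is split into fixed-size chunks, each chunk is scored by the magnitude of its update since the last round, and a peer greedily transmits only the highest-scoring chunks that fit its per-round budget. The function names, the L2-norm importance score, and the greedy policy are all assumptions made for illustration.

```python
# Illustrative sketch (NOT FLTorrent's actual code): importance-aware
# chunk selection under a per-round bandwidth budget. The scoring rule
# (L2 norm of the update) and the greedy policy are assumptions.

def chunk(params, size):
    """Split a flat parameter list into fixed-size chunks."""
    return [params[i:i + size] for i in range(0, len(params), size)]

def importance(old_chunk, new_chunk):
    """Score a chunk by the magnitude of its update since last round."""
    return sum((n - o) ** 2 for o, n in zip(old_chunk, new_chunk)) ** 0.5

def select_chunks(old, new, chunk_size, budget):
    """Greedily pick the highest-importance chunks that fit the budget
    (budget counted in number of chunks a peer can send this round)."""
    olds, news = chunk(old, chunk_size), chunk(new, chunk_size)
    scored = sorted(
        ((importance(o, n), idx) for idx, (o, n) in enumerate(zip(olds, news))),
        reverse=True,
    )
    picked = sorted(idx for _, idx in scored[:budget])
    return picked  # indices of chunks to transmit; the rest stay stale

old = [0.0] * 8
new = [0.0, 0.0, 5.0, 5.0, 0.1, 0.1, 3.0, 3.0]
print(select_chunks(old, new, chunk_size=2, budget=2))  # → [1, 3]
```

Under such a policy, peers that cannot afford to ship the full model each round still propagate the parameters that changed most, which is why the receiving side needs aggregation rules that tolerate partial models.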
Naicheng Li is a second-year PhD researcher at IMDEA Networks Institute (+UC3M), working under the supervision of Prof. Nikolaos Laoutaris. His research lies at the intersection of distributed systems, networking, and machine learning, with a particular focus on decentralized federated learning and privacy-preserving protocols. Before joining IMDEA Networks, he obtained his master’s degree from Chalmers University of Technology in Sweden.
This event will be held in English.