PersA-FL: Personalized Asynchronous Federated Learning

We study the personalized federated learning problem under asynchronous
updates. In this problem, each client seeks to obtain a personalized model that
simultaneously outperforms local and global models. We consider two
optimization-based frameworks for personalization: (i) Model-Agnostic
Meta-Learning (MAML) and (ii) Moreau Envelope (ME). MAML learns a joint model
that is adapted to each client through fine-tuning, whereas ME requires solving
a bi-level optimization problem with implicit gradients to enforce
personalization via regularized losses. We focus on improving the scalability
of personalized federated learning by removing the synchronous communication
assumption. Moreover, we extend the studied function class by removing
boundedness assumptions on the gradient norm. Our main technical contribution
is a unified proof for asynchronous federated learning with bounded staleness
that we apply to the MAML and ME personalization frameworks. For the class of
smooth non-convex functions, we show convergence of our method to a
first-order stationary point. We illustrate the performance of our method and
its tolerance to staleness through experiments for classification tasks over
heterogeneous datasets.
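For concreteness, the two personalization objectives can be sketched in their standard forms from the literature (the exact notation of the paper may differ); here f_i denotes the local loss of client i, N the number of clients, alpha a fine-tuning step size, and lambda a regularization weight:

```latex
% MAML-based personalization: minimize the average loss obtained
% after one local fine-tuning (gradient) step at each client.
\min_{w \in \mathbb{R}^d} \; \frac{1}{N} \sum_{i=1}^{N}
  f_i\bigl(w - \alpha \nabla f_i(w)\bigr)

% Moreau-envelope-based personalization: each client's objective is
% the Moreau envelope of its local loss around the global model w;
% the inner minimizer \theta is the client's personalized model.
\min_{w \in \mathbb{R}^d} \; \frac{1}{N} \sum_{i=1}^{N} F_i(w),
\qquad
F_i(w) = \min_{\theta \in \mathbb{R}^d}
  \Bigl\{ f_i(\theta) + \tfrac{\lambda}{2} \lVert \theta - w \rVert^2 \Bigr\}
```

The bi-level structure of the ME objective is what necessitates implicit gradients: differentiating F_i with respect to w involves the inner minimizer theta.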