Federated Learning Based on Dynamic Regularization to Debias Model Updates
Join us at our Arm AI Virtual Tech Talk!
This talk is the first of our Arm ML Quarterly Research Specials, where we bring you the latest in the world of ML research from our incredible Arm team. Check out the abstract below and sign up today!
We propose a novel federated learning method for training neural network models in a decentralized fashion, where the server orchestrates cooperation between a subset of randomly chosen devices in each round. We view federated learning problems primarily from a communication perspective and allow more device-level computation to save on transmission costs.
We point out a fundamental dilemma: the minima of the local, device-level empirical losses are inconsistent with those of the global empirical loss. Unlike recent prior works that either attempt inexact minimization or use devices to parallelize gradient computation, we propose a dynamic regularizer for each device at each round, so that in the limit the global and device solutions are aligned.
We demonstrate, through both analytical results and empirical results on real and synthetic data, that our scheme leads to efficient training in both convex and non-convex settings, while being fully agnostic to device heterogeneity and robust to a large number of devices, partial participation, and unbalanced data.
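To illustrate the idea behind the dynamic regularizer, here is a minimal sketch on a toy problem with full device participation. It assumes hypothetical 1-D quadratic device losses L_k(θ) = ½·a_k·(θ − c_k)², so each regularized local problem has a closed-form minimizer; the variable names (`g`, `h`, `alpha`) and the exact update order are illustrative, not the talk's definitive implementation.

```python
# Sketch of dynamic-regularization-style federated training on toy
# 1-D quadratic device losses L_k(theta) = 0.5 * a_k * (theta - c_k)^2.
# Each device k keeps a correction state g_k and locally minimizes
#   L_k(theta) - g_k * theta + (alpha / 2) * (theta - theta_server)^2,
# so that in the limit the device and server solutions coincide.

def feddyn_rounds(a, c, alpha=1.0, rounds=100):
    m = len(a)
    theta_s = 0.0        # server model
    g = [0.0] * m        # per-device dynamic-regularizer state
    h = 0.0              # server-side correction term
    for _ in range(rounds):
        # Exact local minimization (stationarity of the regularized loss):
        # a_k*(theta - c_k) - g_k + alpha*(theta - theta_s) = 0
        theta = [(a[k] * c[k] + g[k] + alpha * theta_s) / (a[k] + alpha)
                 for k in range(m)]
        # Each device updates its state toward its local gradient.
        g = [g[k] - alpha * (theta[k] - theta_s) for k in range(m)]
        # Server aggregates and applies the correction term.
        avg = sum(theta) / m
        h -= alpha * (avg - theta_s)
        theta_s = avg - h / alpha
    return theta_s

# Heterogeneous devices: the global minimum is sum(a*c)/sum(a) = 3.0,
# whereas naively averaging the per-device minima c_k would give 2.0.
print(feddyn_rounds([1.0, 3.0], [0.0, 4.0]))  # converges to 3.0
```

The correction state makes the fixed point of the iteration a stationary point of the *global* loss even though the two devices disagree about where the minimum lies, which is the inconsistency the abstract describes.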
This talk is part of the bi-weekly AI Virtual Tech Talk Series: https://developer.arm.com/solutions/machine-learning-on-arm/ai-virtual-tech-talks
Feb 22, 2022 04:00 PM in London