A Resource-Aware Federated Learning Simulation Platform

  • Fellipe Leandro, King Abdullah University of Science and Technology (KAUST) (Creator)

Dataset

Description

Increasing concerns regarding users' data privacy are rendering distributed Machine Learning applications, which are usually data-hungry, increasingly infeasible. Federated Learning has emerged as a privacy-preserving distributed machine learning paradigm in which each client's dataset is kept locally and only the local model parameters are transmitted to a central server. However, adopting the Federated Learning paradigm introduces new edge computing challenges, since it assumes that computationally intensive training tasks can be executed locally by each device. The diverse hardware resources in a population of edge devices (e.g., different smartphone models) can negatively impact Federated Learning performance at both the global and local levels. This thesis contributes to this context with the implementation of a hardware-aware Federated Learning platform, which provides comprehensive support for assessing the impact of hardware heterogeneity on Federated Learning performance metrics by modeling the computation and communication costs associated with training tasks.
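
To illustrate the kind of cost modeling the description refers to, the sketch below simulates one synchronous Federated Learning round in which each client's time is the sum of a compute term (local training on its hardware) and a communication term (uploading model parameters). This is a minimal, hypothetical example and not the platform's actual API; all device profiles, throughput figures, and parameter names are assumptions chosen for illustration.

```python
# Minimal sketch (assumptions only, not the platform's implementation):
# per-client cost = local compute time + model upload time; a synchronous
# round finishes when the slowest (straggler) client finishes.
from dataclasses import dataclass
import random

@dataclass
class DeviceProfile:
    name: str
    flops_per_sec: float   # assumed effective local compute throughput
    uplink_bps: float      # assumed uplink bandwidth to the server

@dataclass
class Client:
    device: DeviceProfile
    num_samples: int

def local_training_time(client: Client, flops_per_sample: float, epochs: int) -> float:
    """Seconds spent computing local updates on this client's hardware."""
    total_flops = client.num_samples * flops_per_sample * epochs
    return total_flops / client.device.flops_per_sec

def upload_time(model_size_bits: float, client: Client) -> float:
    """Seconds spent transmitting local model parameters to the server."""
    return model_size_bits / client.device.uplink_bps

def simulate_round(clients, flops_per_sample=5e7, epochs=1, model_size_bits=8e6 * 32):
    """One synchronous FL round; returns per-client times and the round time."""
    per_client = [
        (c.device.name,
         local_training_time(c, flops_per_sample, epochs) + upload_time(model_size_bits, c))
        for c in clients
    ]
    return per_client, max(t for _, t in per_client)

if __name__ == "__main__":
    # Hypothetical heterogeneous device population (e.g., smartphone tiers).
    profiles = [
        DeviceProfile("high-end phone", flops_per_sec=2e11, uplink_bps=50e6),
        DeviceProfile("mid-range phone", flops_per_sec=5e10, uplink_bps=20e6),
        DeviceProfile("low-end phone", flops_per_sec=1e10, uplink_bps=5e6),
    ]
    random.seed(0)
    clients = [Client(random.choice(profiles), random.randint(200, 2000)) for _ in range(5)]
    per_client, round_time = simulate_round(clients)
    for name, t in per_client:
        print(f"{name}: {t:.2f} s")
    print(f"round time (straggler-bound): {round_time:.2f} s")
```

Under this simple model, a low-end device with a slow uplink dominates the round time, which is one way hardware heterogeneity degrades global Federated Learning performance.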
Date made available: 2021
Publisher: KAUST Research Repository
