Communication-Efficient Vertical Federated Learning

A. Khan*, M. ten Thij, A. Wilbik

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Federated learning (FL) is a privacy-preserving distributed learning approach that allows multiple parties to jointly build machine learning models without disclosing sensitive data. Although FL enables collaboration without compromising privacy, it incurs significant communication overhead due to the repeated exchange of model updates during training. Several studies have proposed communication-efficient FL approaches to address this issue, but adequate solutions are still lacking in settings where parties hold different features of the same samples, also referred to as vertical federated learning (VFL). In this paper, we propose a communication-efficient approach for VFL that compresses the local data of each client and then aggregates the compressed data from all clients to build an ML model. Since local data are shared only in compressed form, their privacy is preserved. Experiments on publicly available benchmark datasets show that the final model obtained by aggregating the clients' compressed data outperforms the clients' local models.
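To make the compress-then-aggregate workflow in the abstract concrete, the sketch below uses PCA as a stand-in for each client's local compression step and logistic regression as the aggregated model; these choices, along with the synthetic vertically partitioned data and the `compress` helper, are illustrative assumptions, not the paper's actual method.

```python
# A minimal sketch of compress-then-aggregate VFL, assuming PCA for the
# clients' local compression and logistic regression for the server-side
# model. Both are illustrative stand-ins; the paper's concrete choices
# are not specified in this abstract.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic vertically partitioned data: the same 1,000 samples, with
# each client holding a disjoint subset of the features.
n_samples = 1000
X_client_a = rng.normal(size=(n_samples, 20))  # client A's features
X_client_b = rng.normal(size=(n_samples, 30))  # client B's features
y = (X_client_a[:, 0] + X_client_b[:, 0] > 0).astype(int)  # shared labels

def compress(X, n_components):
    """Each client compresses its raw features locally; only the
    low-dimensional representation ever leaves the client."""
    return PCA(n_components=n_components).fit_transform(X)

Z_a = compress(X_client_a, n_components=5)
Z_b = compress(X_client_b, n_components=5)

# Aggregation: concatenate the compressed representations (aligned by
# sample ID) and train a single model on the joint compressed view.
Z = np.hstack([Z_a, Z_b])
Z_train, Z_test, y_train, y_test = train_test_split(Z, y, random_state=0)
model = LogisticRegression().fit(Z_train, y_train)
print(f"aggregated-model accuracy: {model.score(Z_test, y_test):.3f}")
```

In this sketch, only the 5-dimensional compressed representations cross the client boundary, which is what yields both the communication savings and the privacy benefit the abstract describes.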
Original language: English
Article number: 273
Number of pages: 14
Journal: Algorithms
Volume: 15
Issue number: 8
DOIs
Publication status: Published - 1 Aug 2022

Keywords

  • federated machine learning
  • heterogeneous federated learning
  • communication efficient
  • data privacy
