Distill2Vec: Dynamic Graph Representation Learning with Knowledge Distillation

Stefanos Antaris*, Dimitrios Rafailidis

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review

Abstract

Dynamic graph representation learning strategies rely on different neural architectures to capture the evolution of a graph over time. However, these architectures require a large number of trainable parameters and suffer from high online inference latency, as several model parameters must be updated whenever new data arrive online. In this study, we propose Distill2Vec, a knowledge distillation strategy that trains a compact model with a low number of trainable parameters, so as to reduce online inference latency while maintaining high model accuracy. We design a distillation loss function based on the Kullback-Leibler divergence to transfer the knowledge acquired by a teacher model, trained on offline data, to a small student model for online data. Our experiments on publicly available datasets show the superiority of the proposed model over several state-of-the-art approaches, with relative gains of up to 5% on the link prediction task. In addition, we demonstrate the effectiveness of our knowledge distillation strategy in terms of the number of required parameters, where Distill2Vec achieves a compression ratio of up to 7:100 compared with the baseline approaches. For reproduction purposes, our implementation is publicly available at https://stefanosantaris.github.io/Distill2Vec.
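The abstract's core mechanism is a Kullback-Leibler distillation loss between the teacher's and student's predictions. Below is a minimal sketch of a temperature-scaled KL distillation loss in PyTorch; the function name, temperature value, and tensor names are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """Generic KL-divergence distillation loss (a sketch, not Distill2Vec's exact loss).

    Both sets of logits are softened by a temperature so the student learns
    from the teacher's full output distribution rather than hard labels.
    """
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # 'batchmean' averages the per-example KL terms, matching the mathematical
    # definition of KL divergence; the temperature**2 factor keeps gradient
    # magnitudes comparable across temperature settings.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2
```

In a teacher-student setup such as the one described above, this term would typically be combined with the student's own task loss (e.g., a link prediction loss on the online data), with the teacher's logits computed under `torch.no_grad()` since the teacher is already trained on the offline data.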
Original language: English
Title of host publication: 2020 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM)
Editors: M. Atzmuller, M. Coscia, R. Missaoui
Publisher: IEEE Xplore
Pages: 60-64
Number of pages: 5
ISBN (Print): 978-1-7281-1057-8
DOIs
Publication status: Published - 10 Dec 2020
Event: 2020 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM) - The Hague, Netherlands
Duration: 7 Dec 2020 - 10 Dec 2020

Conference

Conference: 2020 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM)
Period: 7/12/20 - 10/12/20

Keywords

  • Analytical models
  • Data models
  • Predictive models
  • Social networking (online)
  • Task analysis
  • model compression
  • Dynamic graph representation learning
  • knowledge distillation
