
v1.1.0

@baoleai baoleai released this 09 Oct 08:26
· 29 commits to master since this release

We are glad to announce several new features and improvements to GraphLearn, including heterogeneous graph support in SubGraph-based GNNs, KNN support, HDFS support, new models, recommendation datasets, and evaluation metrics. We also introduce an online sampling module, named Dynamic Graph Service (DGS), for online inference services. The codebase has been restructured to make it easier to follow: the training part now lives in graphlearn, and DGS lives in dynamic_graph_service.

New Features

  • [DGS] Add DGS to GraphLearn. DGS is an online GNN inference service that supports real-time sampling on dynamic graphs with streaming graph updates. doc
  • Add heterogeneous graph support for SubGraph-based GNNs: add HeteroSubGraph, HeteroConv, and a bipartite GraphSAGE example.
  • Add nn.dataset support for sparse data.
  • Add edge feature support in both EgoGraph and SubGraph.
  • Add HDFS graph source support.
  • Add KNN operator based on faiss.
  • Add the AmazonBooks dataset and an ego_bipartite_sage example.
  • Add recommendation metrics: Recall, NDCG, and HitRate.
  • Add UltraGCN (CIKM 2021).
  • Add hiactor-based graph engine implementation.
  • Enable dumping the TensorFlow timeline in the trainer for profiling.
  • Add a setting for negative sampling strictness.
  • [pytorch] Add a get_stats interface to get the number of nodes and edges on each graphlearn server, and refine the PyG GCN example.
  • [pytorch] Add support for HeteroData in PyG 2.x.
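The new recommendation metrics follow their standard top-K definitions. As a rough illustration only (not graphlearn's actual implementation), Recall@K, HitRate@K, and NDCG@K for a single user's ranked recommendation list can be sketched in plain Python:

```python
import math

def recall_at_k(recommended, relevant, k):
    """Fraction of relevant items that appear in the top-k recommendations."""
    hits = len(set(recommended[:k]) & set(relevant))
    return hits / len(relevant) if relevant else 0.0

def hitrate_at_k(recommended, relevant, k):
    """1.0 if at least one relevant item appears in the top-k, else 0.0."""
    return 1.0 if set(recommended[:k]) & set(relevant) else 0.0

def ndcg_at_k(recommended, relevant, k):
    """Normalized discounted cumulative gain with binary relevance."""
    rel = set(relevant)
    dcg = sum(1.0 / math.log2(i + 2)
              for i, item in enumerate(recommended[:k]) if item in rel)
    # Ideal DCG: all relevant items ranked first.
    idcg = sum(1.0 / math.log2(i + 2) for i in range(min(len(relevant), k)))
    return dcg / idcg if idcg > 0 else 0.0
```

For example, with recommendations `[1, 2, 3, 4]` and relevant items `[2, 5]`, Recall@3 is 0.5 and HitRate@3 is 1.0. In practice these per-user values are averaged over the evaluation set.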
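The KNN operator is backed by faiss, which at its simplest performs exact nearest-neighbor search over embedding vectors. A minimal NumPy sketch of the same brute-force squared-L2 search (illustrating the semantics, not graphlearn's or faiss's API):

```python
import numpy as np

def knn_search(queries, database, k):
    """Return (distances, indices) of the k nearest database rows per query row,
    using exact squared L2 distance."""
    # Pairwise squared L2: ||q - d||^2 = ||q||^2 - 2 q.d + ||d||^2
    d2 = (
        (queries ** 2).sum(axis=1, keepdims=True)
        - 2.0 * queries @ database.T
        + (database ** 2).sum(axis=1)
    )
    idx = np.argsort(d2, axis=1)[:, :k]          # indices of the k smallest distances
    dist = np.take_along_axis(d2, idx, axis=1)   # gather the matching distances
    return dist, idx
```

A faiss-based operator accelerates this same computation with optimized (and optionally approximate) index structures, which matters when the database holds millions of node embeddings.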

Breaking changes

  • Refactor SubGraph inducer to support any sampling and subgraph generation method.

Bugfixes

  • Fix the trainer log output when an epoch contains no data.
  • Fix the session not being closed as expected in DistTrainer.
  • Fix transform in EgoGraph and change window size.
  • [pytorch] Get the real data length for PyTorch DDP distributed training.