graph2gauss in PyTorch

This is a PyTorch implementation of Deep Gaussian Embedding of Graphs: Unsupervised Inductive Learning via Ranking. Run python g2g.py -h to see the available training options. Below is an example run on the Citeseer dataset as provided in the original authors' implementation.

$ python g2g.py --seed 0 --samples 3 --epochs 120 --workers 5 -k 1 citeseer.npz
LR F1 score 0.4491554535256343
Epoch 10 - Loss 35145104.000
Epoch 20 - Loss 28642094.000
Epoch 30 - Loss 21908854.000
Epoch 40 - Loss 17939046.000
Epoch 50 - Loss 14932645.000
LR F1 score 0.7872897465883691
Epoch 60 - Loss 12318501.000
Epoch 70 - Loss 11219567.000
Epoch 80 - Loss 9815305.000
Epoch 90 - Loss 8618428.000
Epoch 100 - Loss 7848496.500
LR F1 score 0.8252173651998809
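
The LR F1 score lines appear every 50 epochs and presumably report node classification performance of a logistic regression classifier fit on the current embeddings, as in the evaluation protocol of the original paper. As a minimal sketch of how such a score could be computed from the embedding means (the function and variable names are illustrative assumptions, not the actual code in g2g.py):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

def lr_f1(mu, y, train_size=0.1, seed=0):
    # mu: (num_nodes, dim) array of embedding means, y: (num_nodes,) class labels.
    # Fit logistic regression on a small labelled split and score the held-out nodes.
    X_train, X_test, y_train, y_test = train_test_split(
        mu, y, train_size=train_size, stratify=y, random_state=seed)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return f1_score(y_test, clf.predict(X_test), average="micro")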

A significant difference from the reference implementation is the Monte Carlo approximation of the loss function: the original authors used node-based sampling, whereas this implementation uses edge sampling, as described on my website.
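
To make the distinction concrete, here is a minimal sketch of edge-based triplet sampling combined with the square-exponential ranking loss on diagonal Gaussian embeddings from the paper. The helper names and the crude uniform negative sampling are illustrative assumptions, not the code in this repository; the actual edge sampling described on my website may select and weight triplets differently.

import torch

def kl_diag_gaussians(mu_i, sigma_i, mu_j, sigma_j):
    # KL(N(mu_i, diag(sigma_i)) || N(mu_j, diag(sigma_j))) for batched inputs,
    # where sigma holds per-dimension variances.
    d = mu_i.shape[-1]
    return 0.5 * ((sigma_i / sigma_j).sum(-1)
                  + ((mu_j - mu_i).pow(2) / sigma_j).sum(-1)
                  - d
                  + torch.log(sigma_j).sum(-1)
                  - torch.log(sigma_i).sum(-1))

def sample_triplets_by_edge(edge_index, num_nodes, num_samples):
    # Draw edges (i, j) uniformly as positives and pair each with a uniformly
    # drawn node k as a (likely) negative; a stand-in for proper edge sampling.
    idx = torch.randint(edge_index.shape[1], (num_samples,))
    i, j = edge_index[0, idx], edge_index[1, idx]
    k = torch.randint(num_nodes, (num_samples,))
    return i, j, k

def ranking_loss(mu, sigma, i, j, k):
    # Square-exponential loss E_ij^2 + exp(-E_ik), where E is the KL divergence
    # between the anchor's Gaussian embedding and the positive/negative node's.
    e_ij = kl_diag_gaussians(mu[i], sigma[i], mu[j], sigma[j])
    e_ik = kl_diag_gaussians(mu[i], sigma[i], mu[k], sigma[k])
    return (e_ij.pow(2) + torch.exp(-e_ik)).mean()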
