LearnDynamicMaxIS

A graph neural network (GNN)-based update mechanism for finding maximum independent sets in dynamic graphs. This README contains instructions for running the model yourself; steps to reproduce our experiments can be found in REPRODUCE.md.

Repository Structure

All of our model's code is housed in the src/ folder. The top-level Python scripts in this folder (pretrain.py, train.py, test.py, and generalization_test.py) handle the primary model functions. The model is housed in model/, with the constituent memory and local aggregator modules defined in modules/. Scripts to perform evaluation and validation are in evaluation/. Scripts for data preprocessing and loading, along with other common functions, are in utils/.

Other top-level folders: datasets/ contains miscellaneous scripts for synthetic dataset creation, and baselines/ contains the baseline comparison experiments. Finally, data/ is an empty directory where the actual dynamic graph datasets must be placed; instructions for reproducing these are in REPRODUCE.md.

Installation

We use uv to manage Python environments. After cloning this repository or downloading the latest release, install uv and then run the following to get all dependencies:

uv sync

Activate the newly created virtual environment with:

source .venv/bin/activate

If you prefer to use some tool other than uv to manage your Python environment, the pyproject.toml file lists the necessary Python version and dependencies.

Model Pipeline

  1. Begin by preparing your data. The scripts in datasets/ can be used to create synthetic dynamic graphs (e.g., Erdős–Rényi or power law dynamics) or to produce degree-distribution-preserving expansions of an input static topology.
  2. Run baselines/gurobi_estimates.py to use Gurobi (a mixed integer programming solver) to generate approximate maximum independent set sizes for each of the dynamic graph's snapshots. These will be used during evaluation to calculate loss.
  3. Run src/utils/preprocess_data.py to prepare your dynamic graph for model training.
  4. Run model pretraining using src/pretrain.py.
  5. Run model training using src/train.py. This also runs a final test of the best model once the training concludes.
  6. Optionally, if the training routine is cut short due to time constraints and the final testing for the model does not run, use src/test.py to run model testing for a specific model checkpoint of your choosing.
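To make Step 2 concrete: the per-snapshot targets are (approximate) maximum independent set sizes, i.e., the largest set of nodes in a snapshot with no edge between any two members. The sketch below is not the repository's Gurobi formulation; it is a hypothetical pure-Python brute force over a toy dynamic graph, included only to illustrate what those per-snapshot numbers mean. The function name and the example snapshots are our own, not part of the codebase.

```python
from itertools import combinations

def max_independent_set_size(nodes, edges):
    """Brute-force the maximum independent set size of a small graph.

    A node set is independent if no edge joins two of its members.
    Exponential in the node count, so suitable only for tiny examples;
    the repository uses Gurobi for real snapshots.
    """
    edge_set = {frozenset(e) for e in edges}
    for size in range(len(list(nodes)), 0, -1):
        for subset in combinations(nodes, size):
            if all(frozenset(pair) not in edge_set
                   for pair in combinations(subset, 2)):
                return size  # first hit at this size is maximum
    return 0

# Two snapshots of a toy dynamic graph: one edge insertion shrinks the MIS.
path_p5 = [(0, 1), (1, 2), (2, 3), (3, 4)]   # path on 5 nodes
cycle_c5 = path_p5 + [(0, 4)]                # closing the path into a cycle

print(max_independent_set_size(range(5), path_p5))   # 3, e.g. {0, 2, 4}
print(max_independent_set_size(range(5), cycle_c5))  # 2
```

A sequence of such per-snapshot sizes is what the Gurobi estimates provide for the loss computation during evaluation.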

License and Copyright

Our license and copyright statement can be found in LICENSE.

Our model and modules architecture closely follows that of Temporal Graph Networks (TGNs) developed by Twitter Research (twitter-research/tgn, Rossi et al., ICML 2020), though our implementation details differ significantly. TGN source code is licensed under Apache-2.0, requiring that any downstream works preserve the original copyright and license notices. To this end, we also license our work under Apache-2.0 and, where appropriate, denote in our source code where TGN code is present and how we modified it.
