gflownet

GFlowNet-related training and environment code on graphs.

Primer

GFlowNet, short for Generative Flow Network, is a generative modeling framework particularly suited to discrete, combinatorial objects. Here it is implemented for graph generation.

The idea behind GFN is to estimate flows in a (graph-theoretic) directed acyclic network*. The network represents all possible ways of constructing an object, and so knowing the flow gives us a policy which we can follow to sequentially construct objects. Such a sequence of partially constructed objects is a trajectory. *Perhaps confusingly, the network in GFN refers to the state space, not a neural network architecture.

Here the objects we construct are themselves graphs (e.g. graphs of atoms), which are constructed node by node. To make policy predictions, we use a graph neural network. This GNN outputs per-node logits (e.g. add an atom to this atom, or add a bond between these two atoms), as well as per-graph logits (e.g. stop/"done constructing this object").
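
As a rough, self-contained sketch (not the actual model in models/), a policy GNN with a per-node head and a per-graph head could look like the following; the layer choice, feature sizes, and action counts are placeholders:

import torch
from torch import nn
from torch_geometric.nn import GINConv, global_mean_pool

class TinyPolicyGNN(nn.Module):
    # Illustrative only: one message-passing layer, then two output heads.
    def __init__(self, node_feats=8, hidden=64, node_actions=4, graph_actions=1):
        super().__init__()
        mlp = nn.Sequential(nn.Linear(node_feats, hidden), nn.ReLU(), nn.Linear(hidden, hidden))
        self.conv = GINConv(mlp)
        self.node_head = nn.Linear(hidden, node_actions)    # e.g. "add an atom to this atom" logits
        self.graph_head = nn.Linear(hidden, graph_actions)  # e.g. the "stop" logit

    def forward(self, x, edge_index, batch):
        h = self.conv(x, edge_index)
        # One row of logits per node, and one row per graph in the batch.
        return self.node_head(h), self.graph_head(global_mean_pool(h, batch))

For a single graph with N nodes, batch is simply torch.zeros(N, dtype=torch.long).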

The GNN model can be trained on a mix of existing data (offline) and self-generated data (online), the latter being obtained by querying the model sequentially to obtain trajectories. For offline data, we can easily generate trajectories since we know the end state.
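
For intuition, here is a toy, self-contained sketch of the online case, where a stand-in policy is queried step by step until it decides to stop; the real sampling code operates on the environments in envs and the models in models:

import random

def snapshot(state):
    return {"nodes": list(state["nodes"]), "edges": list(state["edges"])}

def toy_policy(state):
    # Stand-in for the GNN policy: once the graph is non-empty, stop with some probability.
    return "stop" if state["nodes"] and random.random() < 0.3 else "add_node"

def sample_trajectory(policy, max_steps=10):
    state = {"nodes": [], "edges": []}               # start from the empty graph
    trajectory = [snapshot(state)]
    for _ in range(max_steps):
        action = policy(state)
        if action == "stop":                         # "done constructing this object"
            break
        state["nodes"].append(len(state["nodes"]))   # apply the action (here: add a node)
        trajectory.append(snapshot(state))
    return trajectory

print(sample_trajectory(toy_policy))                 # a sequence of partially constructed graphs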

Repo overview

  • algo, contains GFlowNet algorithm implementations (Trajectory Balance, SubTB, Flow Matching), as well as some baselines. These implement how to sample trajectories from a model and how to compute the loss from trajectories (see the Trajectory Balance sketch below).
  • data, contains dataset definitions, data loading and data sampling utilities.
  • envs, contains environment classes; a graph-building environment base, and a molecular graph context class. The base environment is agnostic to what kind of graph is being made, and the context class specifies mappings from graphs to objects (e.g. molecules) and torch geometric Data.
  • examples, contains simple example implementations of GFlowNet.
  • models, contains model definitions.
  • tasks, contains training code.
    • qm9, a temperature-conditional molecule sampler that uses QM9's HOMO-LUMO gap data as a reward.
    • seh_frag, reproducing Bengio et al. 2021: fragment-based molecule design targeting the sEH protein.
    • seh_frag_moo, same as the above, but with multi-objective optimization (incl. QED, SA, and molecular weight objectives).
  • utils, contains utilities (multiprocessing, metrics, conditioning).
  • trainer.py, defines a general harness for training GFlowNet models.
  • online_trainer.py, defines a typical online-GFN training loop.

See implementation notes for more.
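
As an example of what the algo implementations compute, here is a minimal, illustrative sketch of the Trajectory Balance objective for a single trajectory, with made-up numbers; the real implementation is batched and handles action masking, conditioning, and more:

import torch

log_Z = torch.tensor(0.0, requires_grad=True)  # learned estimate of the log partition function
log_pf = torch.tensor([-0.5, -1.2, -0.3])      # log P_F of each forward action taken
log_pb = torch.tensor([-0.7, -0.9, -0.4])      # log P_B of the corresponding backward actions
log_reward = torch.tensor(1.5)                 # log R(x) of the final object

# Trajectory Balance: (log Z + sum log P_F - log R(x) - sum log P_B)^2
loss = (log_Z + log_pf.sum() - log_reward - log_pb.sum()) ** 2
loss.backward()                                # in practice, gradients also flow into the policy parameters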

Getting started

A good place to get started is with the sEH fragment-based MOO task. The file seh_frag_moo.py is runnable as-is (although you may want to change the default configuration in main()).
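
Once the package is installed (see Installation below), one way to launch the task from Python is sketched here; the import path is an assumption based on the repository layout (tasks/seh_frag_moo.py), so adjust it if your checkout differs:

# Assumed module path; main() is the entry point defined in seh_frag_moo.py.
from gflownet.tasks.seh_frag_moo import main

main()  # trains with the default configuration set in main()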

Installation

PIP

This package is installable as a PIP package, but since it depends on some torch-geometric package wheels, the --find-links argument must be specified as well:

pip install -e . --find-links https://data.pyg.org/whl/torch-2.1.2+cu121.html

Or for CPU use:

pip install -e . --find-links https://data.pyg.org/whl/torch-2.1.2+cpu.html

To install or depend on a specific tag, for example v0.0.10, use the following scheme:

pip install git+https://github.com/recursionpharma/gflownet.git@v0.0.10 --find-links ...

If package dependencies seem not to work, you may need to install the exact frozen versions listed in requirements/, i.e. pip install -r requirements/main-3.10.txt.

Developing & Contributing

External contributions are welcome.

To install the developer dependencies:

pip install -e '.[dev]' --find-links https://data.pyg.org/whl/torch-2.1.2+cu121.html

We use tox to run tests and linting, and pre-commit to run checks before committing. To ensure that these checks pass, run tox -e style for the linters and tox run for the tests.