Generative Tensorial Reinforcement Learning (GENTRL)

Supporting Information for the paper "Deep learning enables rapid identification of potent DDR1 kinase inhibitors".

The GENTRL model is a variational autoencoder with a rich prior distribution over the latent space. We use tensor decompositions to encode the relations between molecular structures and their properties and to train on data with missing values. The model is trained in two steps. First, we learn a mapping of the chemical space onto the latent manifold by maximizing the evidence lower bound. We then freeze all parameters except the learnable prior and explore the chemical space to find molecules with a high reward.
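As a rough sketch of what this looks like in code, the snippet below builds the model roughly the way the bundled example notebooks do. The class names (gentrl.RNNEncoder, gentrl.DilConvDecoder, gentrl.GENTRL) and the latent/feature descriptors are taken from those notebooks and should be treated as assumptions rather than a documented API.

```python
import gentrl

# Encoder maps SMILES strings to a latent code; the decoder generates
# SMILES back from latent samples (names follow the examples/ notebooks;
# treat them as assumptions).
enc = gentrl.RNNEncoder(latent_size=50)
dec = gentrl.DilConvDecoder(latent_input_size=50)

# The learnable tensor-train prior couples 50 latent dimensions with one
# property dimension; ('c', 20) describes a continuous variable modelled
# with a 20-component mixture.
model = gentrl.GENTRL(enc, dec, 50 * [('c', 20)], [('c', 20)], beta=0.001)
model.cuda()  # optional: move the model to GPU
```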

[Figure: overview of the GENTRL model]

Repository

In this repository, we provide an implementation of the GENTRL model with an example trained on the MOSES dataset.

To run the training procedure:

  1. Install RDKit to process molecules
  2. Install the GENTRL model: python setup.py install
  3. Install MOSES from its repository
  4. Run pretrain.ipynb to train the autoencoder
  5. Run train_rl.ipynb to optimize the reward function (a minimal sketch of both training steps follows this list)
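Roughly, the two notebooks reduce to the calls sketched below. This is only an illustration based on the bundled examples: MolecularDataset, train_as_vaelp, and train_as_rl are the names used there and may change between versions, and the CSV path, column names, and the Crippen logP reward are placeholders, not the penalized-logP or DDR1 rewards used in the paper.

```python
import gentrl
from torch.utils.data import DataLoader
from rdkit import Chem
from rdkit.Chem import Crippen

# Build the model as in the sketch above.
enc = gentrl.RNNEncoder(latent_size=50)
dec = gentrl.DilConvDecoder(latent_input_size=50)
model = gentrl.GENTRL(enc, dec, 50 * [('c', 20)], [('c', 20)], beta=0.001)

# Step 1 (pretrain.ipynb): maximize the ELBO on MOSES SMILES.
# The CSV path and column names below are illustrative placeholders.
md = gentrl.MolecularDataset(
    sources=[{'path': 'train_plogp_plogpm.csv', 'smiles': 'SMILES',
              'prob': 1, 'plogP': 'plogP'}],
    props=['plogP'])
train_loader = DataLoader(md, batch_size=50, shuffle=True, drop_last=True)
model.train_as_vaelp(train_loader, lr=1e-4)

# Step 2 (train_rl.ipynb): freeze everything except the learnable prior
# and optimize the expected reward over the latent space.
def reward_fn(mol_or_smiles, default=-5.0):
    # Placeholder reward (plain Crippen logP); the example notebook uses a
    # penalized-logP reward built from MOSES metrics instead.
    mol = (Chem.MolFromSmiles(mol_or_smiles)
           if isinstance(mol_or_smiles, str) else mol_or_smiles)
    return Crippen.MolLogP(mol) if mol is not None else default

model.train_as_rl(reward_fn)
```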