
# MAT

The official implementation of the Molecule Attention Transformer (paper on arXiv).

*(Figure: MAT architecture)*

## Code

- `EXAMPLE.ipynb` — a Jupyter notebook with an example of loading pretrained weights into MAT,
- `transformer.py` — the MAT class implementation,
- `utils.py` — utility functions.

More functionality will be available soon!
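For a rough feel of what the model in `transformer.py` computes, the paper augments standard scaled dot-product self-attention with two molecule-specific terms: an inter-atomic distance matrix and the bond adjacency matrix, blended with scalar weights. Below is a minimal NumPy sketch of that idea; the λ values and the distance transform `g(D) = exp(-D)` are illustrative assumptions, not the repository's tuned settings.

```python
import numpy as np

def softmax(x):
    # Row-wise softmax, numerically stabilized.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def molecule_attention(Q, K, V, D, A, lambdas=(0.3, 0.3, 0.4)):
    """Molecule self-attention: blend dot-product attention with
    inter-atomic distances (D) and the bond adjacency matrix (A).
    The lambda weights here are illustrative, not the paper's values."""
    la, ld, lg = lambdas
    d_k = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d_k))   # standard scaled dot-product term
    g_D = np.exp(-D)                         # one possible distance transform
    weights = la * attn + ld * g_D + lg * A  # weighted combination of the three views
    return weights @ V

# Toy molecule: 3 atoms with 4-dimensional features.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
D = np.array([[0., 1., 2.], [1., 0., 1.], [2., 1., 0.]])  # pairwise distances
A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])  # bond adjacency
out = molecule_attention(Q, K, V, D, A)
print(out.shape)  # (3, 4)
```

The real implementation applies this per attention head inside a full Transformer encoder; see `transformer.py` for the details.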

## Pretrained weights

Pretrained weights are available here.
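`EXAMPLE.ipynb` walks through loading these weights into MAT. As a hedged sketch of the general PyTorch pattern it relies on: the `TinyModel` class and the in-memory buffer below are stand-ins, not the repository's API; in practice you would pass the path of the downloaded weights file to `torch.load`.

```python
import io
import torch
import torch.nn as nn

# Hypothetical stand-in for the MAT model; the real class lives in transformer.py.
class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(8, 8)

    def forward(self, x):
        return self.proj(x)

model = TinyModel()

# With the downloaded file you would do something like:
#   state_dict = torch.load("path/to/pretrained_weights.pt", map_location="cpu")
# Here we round-trip through an in-memory buffer just to show the pattern.
buffer = io.BytesIO()
torch.save(model.state_dict(), buffer)
buffer.seek(0)
state_dict = torch.load(buffer, map_location="cpu")

model.load_state_dict(state_dict)
model.eval()  # switch to inference mode before using pretrained weights
```

`map_location="cpu"` makes the load work on machines without a GPU; move the model to a device afterwards if needed.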

## Results

In this section we present the average rank across the 7 datasets from our benchmark.

- Results for a hyperparameter search budget of 500 combinations.
- Results for a hyperparameter search budget of 150 combinations.
- Results for the pretrained model.

## Requirements

- PyTorch 1.4

## Acknowledgments

The Transformer implementation is inspired by The Annotated Transformer.
