Extract molecular SMILES embeddings from language models pre-trained with various objectives and architectures.
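A minimal sketch of what such embedding extraction looks like with the Hugging Face transformers API, assuming mean pooling over the last hidden states. The ChemBERTa checkpoint named below is one public SMILES model chosen for illustration, not necessarily the model used by this project.

```python
# Sketch: pull a fixed-size embedding for one SMILES string from a
# pre-trained transformer. Checkpoint is an example public model.
import torch
from transformers import AutoModel, AutoTokenizer

checkpoint = "seyonec/ChemBERTa-zinc-base-v1"  # assumption: any SMILES LM fits here
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)
model.eval()

smiles = "CC(=O)Oc1ccccc1C(=O)O"  # aspirin
inputs = tokenizer(smiles, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token representations into one vector per molecule.
embedding = outputs.last_hidden_state.mean(dim=1).squeeze(0)
print(embedding.shape)  # e.g. torch.Size([768])
```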
Training a pre-trained BERT language model on molecular SMILES from the MoleculeNet benchmark, leveraging mixup and enumeration augmentations (sketched below).
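A hedged sketch of the two named augmentations, assuming RDKit for SMILES enumeration and mixup applied to pooled embedding vectors rather than raw tokens (mixup is not well defined on discrete SMILES strings). The `enumerate_smiles` and `mixup` helpers are illustrative, not this repository's API.

```python
# Sketch of SMILES enumeration (random atom ordering via RDKit) and
# mixup interpolation on embedding/label pairs (Zhang et al., 2018).
import numpy as np
from rdkit import Chem

def enumerate_smiles(smiles: str, n: int = 5) -> list[str]:
    """Generate n randomized SMILES strings for the same molecule."""
    mol = Chem.MolFromSmiles(smiles)
    return [Chem.MolToSmiles(mol, doRandom=True) for _ in range(n)]

def mixup(x1, y1, x2, y2, alpha: float = 0.2):
    """Convexly combine two embedding/label pairs with lam ~ Beta(alpha, alpha)."""
    lam = np.random.beta(alpha, alpha)
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

print(enumerate_smiles("CC(=O)Oc1ccccc1C(=O)O"))
```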
Molecule Transformers is a collection of recipes for pre-training and fine-tuning molecular transformer language models, including BART, BERT, etc. Full thesis available at https://moleculetransformers.github.io/thesis_cs_msc_Khan_Shahrukh.pdf.
Semi-supervised learning techniques (pseudo-labeling, MixMatch, and co-training) for a pre-trained BERT language model in a low-data regime, based on molecular SMILES from the MoleculeNet benchmark.
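A simplified pseudo-labeling sketch operating on precomputed molecule embeddings, with scikit-learn's logistic regression standing in for BERT fine-tuning. The `pseudo_label` helper and the 0.95 confidence threshold are illustrative assumptions, not this repository's exact pipeline.

```python
# Sketch of pseudo-labeling in a low-data regime: fit on the small
# labeled set, then keep high-confidence predictions on unlabeled
# molecules as extra training targets and refit.
import numpy as np
from sklearn.linear_model import LogisticRegression

def pseudo_label(X_labeled, y_labeled, X_unlabeled, threshold: float = 0.95):
    clf = LogisticRegression(max_iter=1000).fit(X_labeled, y_labeled)
    probs = clf.predict_proba(X_unlabeled)
    confident = probs.max(axis=1) >= threshold
    X_aug = np.vstack([X_labeled, X_unlabeled[confident]])
    y_aug = np.concatenate([y_labeled, probs[confident].argmax(axis=1)])
    # Retrain on labeled + pseudo-labeled molecules.
    return LogisticRegression(max_iter=1000).fit(X_aug, y_aug)
```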