Molecule Transformers' repositories
smiles-featurizers
Extract molecular SMILES embeddings from language models pre-trained with various objectives and architectures.
moleculenet-smiles-bert-mixup
Fine-tuning a pre-trained BERT language model on molecular SMILES from the MoleculeNet benchmark, leveraging mixup and enumeration augmentations.
moleculetransformers.github.io
Documentation for Molecule Transformers.
moleculenet-bert-ssl
Semi-supervised learning techniques (pseudo-labeling, MixMatch, and co-training) for a pre-trained BERT language model in low-data regimes, using molecular SMILES from the MoleculeNet benchmark.
smiles-augment
Augment molecular SMILES with methods including enumeration and mixup for low-data regimes in downstream supervised drug discovery tasks.
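The mixup augmentation mentioned above interpolates pairs of training examples (Zhang et al., 2018). A minimal sketch in plain Python; the function name and signature are illustrative, not the repository's actual API:

```python
import random

def mixup(x1, y1, x2, y2, alpha=0.2):
    """Convex combination of two examples and their label vectors.

    x1, x2: feature vectors (lists of floats), e.g. SMILES embeddings.
    y1, y2: label vectors of equal length.
    alpha: Beta-distribution shape parameter controlling interpolation.
    """
    # Sample the mixing coefficient lambda ~ Beta(alpha, alpha)
    lam = random.betavariate(alpha, alpha)
    # Interpolate features and labels element-wise
    x = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]
    y = [lam * a + (1 - lam) * b for a, b in zip(y1, y2)]
    return x, y
```

In a SMILES setting, mixup is typically applied to continuous representations (embeddings or model outputs) rather than to the discrete SMILES strings themselves.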
rdkit-benchmarking-platform-transformers
Port of the RDKit Benchmarking Platform to pre-trained transformer-based language models for the virtual-screening drug discovery task.