Molecule Transformers (MoleculeTransformers)

Molecule Transformers is a collection of recipes for pre-training and fine-tuning molecular transformer language models, including BART and BERT.

Home Page: https://moleculetransformers.github.io/

Molecule Transformers' repositories

smiles-featurizers

Extract molecular SMILES embeddings from language models pre-trained with various objectives and architectures (see the sketch below).

Language: Python · License: Apache-2.0 · Stargazers: 14 · Issues: 2
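A minimal sketch of the idea behind such a featurizer, written against Hugging Face `transformers` rather than this repository's own API; the checkpoint name and mean-pooling strategy are illustrative assumptions:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative checkpoint, not necessarily one shipped by smiles-featurizers.
MODEL_NAME = "seyonec/ChemBERTa-zinc-base-v1"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def embed_smiles(smiles_list):
    """Mean-pool the last hidden states into one vector per molecule."""
    batch = tokenizer(smiles_list, padding=True, truncation=True,
                      return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state   # (batch, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1)    # zero out padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

vectors = embed_smiles(["CCO", "c1ccccc1"])         # ethanol, benzene
print(vectors.shape)                                # e.g. torch.Size([2, 768])
```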

moleculenet-smiles-bert-mixup

Fine-tune a pre-trained BERT language model on molecular SMILES from the MoleculeNet benchmark, leveraging mixup and enumeration augmentations (see the sketch below).

Language: Python · License: Apache-2.0 · Stargazers: 3 · Issues: 0
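How the repository wires mixup into BERT training is not spelled out here; the sketch below shows one common variant, mixing pooled embeddings and one-hot labels with a Beta-sampled coefficient (the `alpha` value is an assumed hyperparameter):

```python
import torch

def mixup(embeddings, labels, alpha=0.2):
    """Classic mixup (Zhang et al., 2018) applied at the embedding level:
    each example is interpolated with a randomly permuted partner, and the
    labels are mixed with the same coefficient."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(embeddings.size(0))
    mixed_x = lam * embeddings + (1.0 - lam) * embeddings[perm]
    mixed_y = lam * labels + (1.0 - lam) * labels[perm]
    return mixed_x, mixed_y

# Usage: mix pooled BERT embeddings and one-hot labels before the classifier head.
x = torch.randn(8, 768)                                   # stand-in embeddings
y = torch.nn.functional.one_hot(torch.randint(0, 2, (8,)), 2).float()
mixed_x, mixed_y = mixup(x, y)
```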

moleculetransformers.github.io

Documentation for the Molecule Transformers project.

License: Apache-2.0 · Stargazers: 2 · Issues: 0

moleculenet-bert-ssl

Semi-supervised learning techniques (pseudo-labeling, MixMatch, and co-training) for a pre-trained BERT language model in the low-data regime, using molecular SMILES from the MoleculeNet benchmark (see the pseudo-labeling sketch below).

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 0
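Of the three techniques, pseudo-labeling is the simplest to sketch. Below is a minimal single-round version, assuming `model` maps a batch of encoded SMILES tensors to class logits; the confidence threshold is an illustrative choice, not a setting taken from the repository:

```python
import torch

def pseudo_label(model, unlabeled_batches, threshold=0.95):
    """Keep unlabeled examples whose top predicted probability exceeds the
    threshold, and adopt the prediction as a training label."""
    model.eval()
    kept_inputs, kept_labels = [], []
    with torch.no_grad():
        for batch in unlabeled_batches:           # each batch: encoded SMILES
            probs = torch.softmax(model(batch), dim=-1)
            confidence, predictions = probs.max(dim=-1)
            mask = confidence >= threshold
            if mask.any():
                kept_inputs.append(batch[mask])
                kept_labels.append(predictions[mask])
    # Merge these with the labeled set and fine-tune the model again.
    return kept_inputs, kept_labels
```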

smiles-augment

Augment molecular SMILES with methods including enumeration and mixup, for low-data settings in downstream supervised drug-discovery tasks (see the enumeration sketch below).

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 0
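Enumeration exploits the fact that one molecule has many valid SMILES strings. A minimal sketch using RDKit's randomized SMILES writer (the helper name and the oversampling factor are illustrative, not taken from this repository):

```python
from rdkit import Chem

def enumerate_smiles(smiles, n=5):
    """Generate alternative valid SMILES for the same molecule by
    randomizing the atom traversal order (SMILES enumeration)."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"Invalid SMILES: {smiles}")
    # Oversample, then deduplicate: random traversals can repeat.
    variants = {Chem.MolToSmiles(mol, canonical=False, doRandom=True)
                for _ in range(n * 3)}
    return sorted(variants)[:n]

print(enumerate_smiles("c1ccccc1O"))   # phenol, written several different ways
```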

rdkit-benchmarking-platform-transformers

Port of the RDKit benchmarking platform to pre-trained transformer-based language models for the virtual-screening drug-discovery task (see the sketch below).

Language: Python · Stargazers: 0 · Issues: 0
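The sketch below shows the core ranking step of embedding-based virtual screening: score a compound library by similarity to a known active. The embeddings would come from a pre-trained SMILES language model (as sketched under smiles-featurizers above); the cosine metric and `top_k` default are illustrative assumptions, not this repository's protocol:

```python
import torch
import torch.nn.functional as F

def screen_library(active_vec, library_vecs, top_k=10):
    """Rank library compounds by cosine similarity to one known active
    and return the scores and indices of the best hits."""
    scores = F.cosine_similarity(active_vec.unsqueeze(0), library_vecs, dim=-1)
    k = min(top_k, library_vecs.size(0))
    return torch.topk(scores, k=k)

# Usage with the embed_smiles helper sketched under smiles-featurizers:
#   hits = screen_library(embed_smiles(["CCO"])[0], embed_smiles(library))
```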