Simon Levine's repositories
dl-for-genomics
Selections from CS185-Deep Learning for Genomics, Brown University
membership-query-synthesis
A look at how active learning methods could use membership query synthesis (MQS) via the modAL and SDV libraries.
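The core active-learning loop that frameworks like modAL implement can be sketched in plain Python (illustrative only, not the repo's code; `least_confident` is a hypothetical name): score each unlabeled example by model confidence, then query the label of the least confident one.

```python
# Minimal uncertainty-sampling sketch: given predicted class probabilities
# for an unlabeled pool, select the example whose top-class probability is
# lowest (i.e., the model is least confident about it).

def least_confident(pool_probs):
    """pool_probs: list of per-example class-probability lists.
    Returns the index of the least confident example."""
    return min(range(len(pool_probs)), key=lambda i: max(pool_probs[i]))

# Toy pool: three examples with binary class probabilities.
probs = [[0.95, 0.05], [0.55, 0.45], [0.80, 0.20]]
query_idx = least_confident(probs)  # example 1 sits closest to the boundary
```

In a full loop, the queried example would be labeled by an oracle, added to the training set, and the model retrained before the next query.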
molecular-GNN
Predicting logD lipophilicity values from SMILES strings using graph-convolutional networks
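The building block such models rely on is neighborhood aggregation over the molecular graph. A minimal sketch (hypothetical toy example; real layers add learned weights and nonlinearities, and features are vectors rather than scalars):

```python
# One round of neighborhood aggregation, the core of a graph-convolution
# layer: each atom's new feature is the mean of its own feature and its
# bonded neighbors' features.

def aggregate(features, adjacency):
    """features: {node: float}, adjacency: {node: [neighbor nodes]}."""
    out = {}
    for node, neighbors in adjacency.items():
        vals = [features[node]] + [features[n] for n in neighbors]
        out[node] = sum(vals) / len(vals)
    return out

# Toy 3-atom chain A-B-C with scalar atom features.
adj = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
feat = aggregate({"A": 1.0, "B": 2.0, "C": 3.0}, adj)
# A: (1+2)/2 = 1.5, B: (2+1+3)/3 = 2.0, C: (3+2)/2 = 2.5
```

Stacking several such rounds lets information propagate along bonds, after which a readout over all atoms yields a molecule-level prediction such as logD.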
baal
Library to enable Bayesian active learning in your research or labeling work.
basenji
Sequential regulatory activity predictions with deep convolutional neural networks.
singlecell-sim-by-VAE
A repository for VAE-based simulation of latent cell characteristics.
Compact-Transformers
[Preprint] Escaping the Big Data Paradigm with Compact Transformers, 2021
deepmind-research
This repository contains implementations and illustrative code to accompany DeepMind publications
DeepSilencer
A deep convolutional neural network for the accurate prediction of silencers
DNABERT
DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome
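DNABERT represents DNA as overlapping k-mers before feeding it to the transformer; that preprocessing step can be sketched as:

```python
# Split a DNA sequence into overlapping k-mers, the tokenization scheme
# DNABERT uses (k is typically 3 to 6).

def kmer_tokenize(seq, k=3):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

tokens = kmer_tokenize("ATCGA", k=3)  # ['ATC', 'TCG', 'CGA']
```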
EmbeddingRecycling
Embedding Recycling for Language models
enformer-pytorch
Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
esm
Evolutionary Scale Modeling (esm): Pretrained language models for proteins
FastSK
Bioinformatics 2020: FastSK: Fast and Accurate Sequence Classification by making gkm-svm faster and scalable. https://fastsk.readthedocs.io/en/master/
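The gapped k-mer features behind gkm-SVM-style kernels can be sketched as follows (a toy illustration under small fixed parameters, not FastSK's optimized algorithm): every length-l window of the sequence contributes one pattern per choice of g masked positions, and sequences are compared by counts of shared patterns.

```python
from itertools import combinations

# Enumerate gapped k-mer features: for each length-l window, mask every
# choice of g positions with '*'.

def gapped_kmers(seq, l=3, g=1):
    feats = []
    for i in range(len(seq) - l + 1):
        window = seq[i:i + l]
        for gaps in combinations(range(l), g):
            chars = ["*" if j in gaps else c for j, c in enumerate(window)]
            feats.append("".join(chars))
    return feats

feats = gapped_kmers("ACGT", l=3, g=1)
# windows 'ACG' and 'CGT' each yield three masked patterns
```

The combinatorial blow-up of these patterns is exactly what makes naive gkm-SVM slow and what FastSK's kernel approximation targets.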
genesis
Deep Exploration Networks - Diverse Deep Generative Models for DNA, RNA and Protein Sequences
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
improved_CcGAN
Continuous Conditional Generative Adversarial Networks (CcGAN)
lambo
Code to reproduce experiments in "Accelerating Bayesian Optimization for Protein Design with Denoising Autoencoders" (Stanton et al., 2022)
lipophilicity-prediction
Code for "Lipophilicity Prediction with Multitask Learning and Molecular Substructures Representation" paper. Machine Learning for Molecules Workshop @ NeurIPS 2020
modAL
A modular active learning framework for Python
MPRA-DragoNN
Code accompanying the paper "Deciphering regulatory DNA sequences and noncoding genetic variants using neural network models of massively parallel reporter assays"
peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
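The central idea behind PEFT methods such as LoRA is to freeze a base weight matrix W and train only a low-rank update BA, so the effective weight is W + BA. A minimal pure-Python sketch of that arithmetic (real usage goes through the library's `LoraConfig` and `get_peft_model`; the helper names here are hypothetical):

```python
# LoRA in miniature: the frozen base matrix W (d_in x d_out) is augmented
# by a low-rank product B @ A, so only r * (d_in + d_out) parameters are
# trained instead of d_in * d_out.

def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def lora_forward(x, W, A, B):
    """x: 1 x d_in row vector; W: d_in x d_out frozen weights;
    B: d_in x r and A: r x d_out are the trainable low-rank factors."""
    delta = matmul(B, A)  # d_in x d_out low-rank update
    W_eff = [[w + d for w, d in zip(wr, dr)] for wr, dr in zip(W, delta)]
    return matmul(x, W_eff)

x = [[1.0, 2.0]]
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen identity base
B = [[1.0], [0.0]]             # d_in x r, with rank r = 1
A = [[0.5, 0.5]]               # r x d_out
y = lora_forward(x, W, A, B)   # [[1.5, 2.5]]
```

Because only A and B receive gradients, fine-tuning touches a tiny fraction of the model's parameters while the base weights stay shared across tasks.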
peptimizer
Peptide optimization with Machine Learning
Swin-Transformer
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
Swin-Transformer-V2
PyTorch reimplementation of the paper "Swin Transformer V2: Scaling Up Capacity and Resolution" [arXiv 2021].
trl
Train transformer language models with reinforcement learning.