Jeffrey Hsu's repositories
basenji
Sequential regulatory activity predictions with deep convolutional neural networks.
cellSNP
Pileup biallelic SNPs from single-cell and bulk RNA-seq data
centermask2
CenterMask: Real-Time Anchor-Free Instance Segmentation (CVPR 2020)
ClynMut
Aims to be a next-generation deep-learning-based phenotype predictor from genome mutations.
cut_and_run
Snakemake CUT&RUN pipeline
deep-review
A collaboratively written review paper on deep learning, genomics, and precision medicine
fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
FewShotCellSegmentation
Code for "Few-Shot Microscopy Image Cell Segmentation" (https://arxiv.org/abs/2007.01671)
gpt-neo
An implementation of model-parallel GPT-2- and GPT-3-like models, with the ability to scale up to full GPT-3 sizes (and possibly beyond!), using the mesh-tensorflow library.
KBoost
KBoost: A package to infer gene regulatory networks from gene expression data
lie-transformer-pytorch
Implementation of the Lie Transformer, Equivariant Self-Attention, in PyTorch
liftover
Liftover for Python, made fast with Cython
longformer
Longformer: The Long-Document Transformer
opentrons
Software for writing protocols and running them on the Opentrons OT-2
opentrons-integration-tools
Guides and code snippets for integrating external devices with Opentrons
pyfra
Python Research Framework
satori
A new and revamped implementation of SATORI.
se3-transformer-pytorch
Implementation of SE(3)-Transformers for Equivariant Self-Attention, in PyTorch. This repository is geared toward integration with an eventual AlphaFold2 replication.
SegNeXt
Official PyTorch implementation of "SegNeXt: Rethinking Convolutional Attention Design for Semantic Segmentation" (NeurIPS 2022)
stylegan2-pytorch
Simplest working implementation of StyleGAN2 in PyTorch
tpudiepie
Babysit your preemptible TPUs
umato
Uniform Manifold Approximation with Two-phase Optimization (IEEE VIS 2022 short)
vpolo
To remember what has been lost!
x-transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers
x-unet
Implementation of a U-net complete with efficient attention as well as the latest research findings