Jeffrey Hsu's repositories

selene

a framework for training sequence-level deep learning networks

Language: Jupyter Notebook · License: BSD-3-Clause-Clear · Stars: 1 · Issues: 0

basenji

Sequential regulatory activity predictions with deep convolutional neural networks.

Language: Jupyter Notebook · License: Apache-2.0 · Stars: 0 · Issues: 0

cellSNP

Pileup biallelic SNPs from single-cell and bulk RNA-seq data

License: Apache-2.0 · Stars: 0 · Issues: 0

centermask2

CenterMask: Real-time Anchor-Free Instance Segmentation (CVPR 2020)

License: NOASSERTION · Stars: 0 · Issues: 0

ClynMut

Aims to be a next-generation DL-based phenotype predictor for genome mutations.

License: MIT · Stars: 0 · Issues: 0

cut_and_run

Snakemake CUT&RUN pipeline

License: GPL-3.0 · Stars: 0 · Issues: 0

deep-review

A collaboratively written review paper on deep learning, genomics, and precision medicine

License: NOASSERTION · Stars: 0 · Issues: 0

fairseq

Facebook AI Research Sequence-to-Sequence Toolkit written in Python.

Language: Python · License: MIT · Stars: 0 · Issues: 0

FewShotCellSegmentation

Code for "Few-shot Microscopy Image Cell Segmentation" (https://arxiv.org/abs/2007.01671)

Language: Python · License: MIT · Stars: 0 · Issues: 0

gpt-neo

An implementation of model-parallel GPT-2- and GPT-3-like models, with the ability to scale up to full GPT-3 sizes (and possibly beyond), using the mesh-tensorflow library.

License: MIT · Stars: 0 · Issues: 0

KBoost

KBoost: A package to infer gene regulatory networks from gene expression data

Stars: 0 · Issues: 0

lie-transformer-pytorch

Implementation of Lie Transformer, Equivariant Self-Attention, in Pytorch

License: MIT · Stars: 0 · Issues: 0

liftover

Liftover for Python, made fast with Cython

Language: C++ · License: MIT · Stars: 0 · Issues: 0

longformer

Longformer: The Long-Document Transformer

Language: Python · License: Apache-2.0 · Stars: 0 · Issues: 0

opentrons

Software for writing protocols and running them on the Opentrons OT-2

License: Apache-2.0 · Stars: 0 · Issues: 0

opentrons-integration-tools

Guides and code snippets for integrating external devices with Opentrons

Language: Python · License: Apache-2.0 · Stars: 0 · Issues: 0

pyfra

Python Research Framework

License: MIT · Stars: 0 · Issues: 0

satori

A new and revamped implementation of SATORI.

Language: HTML · License: MIT · Stars: 0 · Issues: 0

se3-transformer-pytorch

Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared toward integration with an eventual Alphafold2 replication.

License: MIT · Stars: 0 · Issues: 0

SegNeXt

Official Pytorch implementations for "SegNeXt: Rethinking Convolutional Attention Design for Semantic Segmentation" (NeurIPS 2022)

Language: Python · Stars: 0 · Issues: 0

stylegan2-pytorch

Simplest working implementation of Stylegan2 in Pytorch

Language: Python · License: GPL-3.0 · Stars: 0 · Issues: 0

tpudiepie

Babysit your preemptible TPUs

License: NOASSERTION · Stars: 0 · Issues: 0

umato

Uniform Manifold Approximation with Two-phase Optimization (IEEE VIS 2022 short)

Language: Jupyter Notebook · License: MIT · Stars: 0 · Issues: 0

vpolo

To remember what has been lost!

License: GPL-3.0 · Stars: 0 · Issues: 0

x-transformers

A simple but complete full-attention transformer with a set of promising experimental features from various papers

License: MIT · Stars: 0 · Issues: 0

x-unet

Implementation of a U-net complete with efficient attention as well as the latest research findings

Language: Python · License: MIT · Stars: 0 · Issues: 0