Yuhui Ding (skeletondyh)

Company: @dalab

Location: Zurich

Home Page: https://skeletondyh.github.io/

Twitter: @yuhui_ding

Organizations
dalab

Yuhui Ding's starred repositories

Reflected-Diffusion

[ICML 2023] Reflected Diffusion Models (https://arxiv.org/abs/2304.04740)

Language: Python · License: MIT · Stars: 153

GRED

[ICML 2024] Recurrent Distance Filtering for Graph Representation Learning

Language: Python · Stars: 6

FoldFlow

FoldFlow: SE(3)-Stochastic Flow Matching for Protein Backbone Generation

Language: Jupyter Notebook · License: NOASSERTION · Stars: 148

riemannian-fm

Code for "Riemannian Flow Matching on General Geometries".

Language: Python · License: NOASSERTION · Stars: 161

al-folio

A beautiful, simple, clean, and responsive Jekyll theme for academics

Language: HTML · License: MIT · Stars: 10229

geometric_ml

This repository contains code for applying Riemannian geometry in machine learning.

Language: Python · Stars: 74

alphaflow

AlphaFold Meets Flow Matching for Generating Protein Ensembles

Language: Python · License: MIT · Stars: 321

chroma

A generative model for programmable protein design

Language: Python · License: Apache-2.0 · Stars: 648

e3nn

A modular framework for neural networks with Euclidean symmetry

Language: Python · License: NOASSERTION · Stars: 922

open-interpreter

A natural language interface for computers

Language: Python · License: AGPL-3.0 · Stars: 51470

consistency_models

Official repo for consistency models.

Language: Python · License: MIT · Stars: 6048

attention-rank-collapse

[ICML 2021 Oral] We show that pure attention suffers from rank collapse, and how different mechanisms combat it.

Language: Jupyter Notebook · License: Apache-2.0 · Stars: 155

llama

Inference code for Llama models

Language: Python · License: NOASSERTION · Stars: 55043

tuning_playbook

A playbook for systematically maximizing the performance of deep learning models.

License: NOASSERTION · Stars: 26139

Awesome-Diffusion-Models

A collection of resources and papers on Diffusion Models

Language: HTML · License: MIT · Stars: 10602

X-Decoder

[CVPR 2023] Official Implementation of X-Decoder for generalized decoding for pixel, image and language

Language: Python · License: Apache-2.0 · Stars: 1277

awesome-jax

A curated list of JAX resources (https://github.com/google/jax)

License: CC0-1.0 · Stars: 1424

lrgb

Long Range Graph Benchmark (NeurIPS 2022 Datasets and Benchmarks Track)

Language: Jupyter Notebook · License: MIT · Stars: 149

cords

Reduce end-to-end training time from days to hours (or hours to minutes), and energy requirements/costs by an order of magnitude, using coresets and data selection.

Language: Jupyter Notebook · License: MIT · Stars: 315

jaxopt

Hardware accelerated, batchable and differentiable optimizers in JAX.

Language: Python · License: Apache-2.0 · Stars: 912

stable-diffusion

A latent text-to-image diffusion model

Language: Jupyter Notebook · License: NOASSERTION · Stars: 67148

TransformersCanDoBayesianInference

Official Implementation of "Transformers Can Do Bayesian Inference", the PFN paper

Language: Python · Stars: 179

how-do-vits-work

(ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"

Language: Python · License: Apache-2.0 · Stars: 802

SAT

Official PyTorch code for Structure-Aware Transformer.

Language: Python · License: BSD-3-Clause · Stars: 233

DatasetCondensation

Dataset Condensation (ICLR 2021 and ICML 2021)

Language: Python · License: MIT · Stars: 462

syne-tune

Large-scale and asynchronous hyperparameter and architecture optimization at your fingertips.

Language: Python · License: Apache-2.0 · Stars: 380

unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities

Language: Python · License: MIT · Stars: 19365

s4

Structured state space sequence models

Language: Jupyter Notebook · License: Apache-2.0 · Stars: 2311

diffstride

TF/Keras code for DiffStride, a pooling layer with learnable strides.

Language: Python · License: Apache-2.0 · Stars: 123