Phil Wang (lucidrains)

Location: San Francisco

Home Page: lucidrains.github.io

Twitter: @lucidrains

Phil Wang's repositories

lightweight-gan

Implementation of the 'lightweight' GAN proposed at ICLR 2021, in Pytorch. High-resolution image generation from a model that can be trained within a day or two

Language: Python · License: MIT · Stargazers: 1605 · Issues: 34 · Issues: 111

alphafold2

To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released

Language: Python · License: MIT · Stargazers: 1501 · Issues: 66 · Issues: 48

flamingo-pytorch

Implementation of 🦩 Flamingo, state-of-the-art few-shot visual question answering attention net out of Deepmind, in Pytorch

Language: Python · License: MIT · Stargazers: 1140 · Issues: 21 · Issues: 13

PaLM-pytorch

Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways

Language: Python · License: MIT · Stargazers: 807 · Issues: 16 · Issues: 11
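
The defining architectural detail in the PaLM paper is its 'parallel' Transformer block, where the attention and feedforward branches both read the same pre-normalized input and their outputs are summed into the residual. A minimal sketch of that layout is below, using torch.nn.MultiheadAttention; multi-query attention, rotary embeddings, SwiGLU, and causal masking (all present in the paper and in this repository) are omitted, and the names here are illustrative rather than the repository's API.

```python
# Hedged sketch of a PaLM-style "parallel" Transformer block (illustrative, not this repo's code)
import torch
from torch import nn

class ParallelBlock(nn.Module):
    def __init__(self, dim, heads=8, ff_mult=4):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(dim, dim * ff_mult), nn.GELU(), nn.Linear(dim * ff_mult, dim))

    def forward(self, x):                                    # x: (batch, seq_len, dim)
        y = self.norm(x)                                     # one pre-norm shared by both branches
        attn_out, _ = self.attn(y, y, y, need_weights=False)
        return x + attn_out + self.ff(y)                     # branches run in parallel, summed into the residual

print(ParallelBlock(128)(torch.randn(2, 16, 128)).shape)     # torch.Size([2, 16, 128])
```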

nuwa-pytorch

Implementation of NÜWA, state-of-the-art attention network for text-to-video synthesis, in Pytorch

Language: Python · License: MIT · Stargazers: 533 · Issues: 23 · Issues: 9

slot-attention

Implementation of Slot Attention from GoogleAI

Language: Python · License: MIT · Stargazers: 349 · Issues: 11 · Issues: 6
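
For context, this is a minimal sketch of the Slot Attention update from the paper (Locatello et al., 2020): slots drawn from a learned Gaussian repeatedly compete for input features through a softmax over the slot dimension, then get refined by a GRU and a residual MLP. Hyperparameters, layer names, and the interface below are illustrative assumptions, not this repository's exact API.

```python
# Hedged sketch of the Slot Attention iteration (illustrative, not this repo's module)
import torch
from torch import nn

class SlotAttentionSketch(nn.Module):
    def __init__(self, num_slots, dim, iters=3, eps=1e-8):
        super().__init__()
        self.num_slots, self.iters, self.eps, self.scale = num_slots, iters, eps, dim ** -0.5
        self.slots_mu = nn.Parameter(torch.randn(1, 1, dim))            # learned Gaussian for slot init
        self.slots_logsigma = nn.Parameter(torch.zeros(1, 1, dim))
        self.to_q, self.to_k, self.to_v = nn.Linear(dim, dim), nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.gru = nn.GRUCell(dim, dim)
        self.mlp = nn.Sequential(nn.Linear(dim, dim * 2), nn.ReLU(), nn.Linear(dim * 2, dim))
        self.norm_input, self.norm_slots, self.norm_pre_mlp = nn.LayerNorm(dim), nn.LayerNorm(dim), nn.LayerNorm(dim)

    def forward(self, inputs):                                          # inputs: (batch, num_inputs, dim)
        b, n, d = inputs.shape
        inputs = self.norm_input(inputs)
        k, v = self.to_k(inputs), self.to_v(inputs)
        slots = self.slots_mu + self.slots_logsigma.exp() * torch.randn(b, self.num_slots, d, device=inputs.device)
        for _ in range(self.iters):
            slots_prev = slots
            q = self.to_q(self.norm_slots(slots))
            attn = torch.einsum('bid,bjd->bij', q, k) * self.scale      # (batch, slots, inputs)
            attn = attn.softmax(dim=1) + self.eps                       # softmax over slots: slots compete per input
            attn = attn / attn.sum(dim=-1, keepdim=True)                # weighted mean over the inputs
            updates = torch.einsum('bij,bjd->bid', attn, v)
            slots = self.gru(updates.reshape(-1, d), slots_prev.reshape(-1, d)).reshape(b, -1, d)
            slots = slots + self.mlp(self.norm_pre_mlp(slots))          # residual MLP refinement
        return slots                                                    # (batch, num_slots, dim)
```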

segformer-pytorch

Implementation of Segformer, Attention + MLP neural network for segmentation, in Pytorch

Language: Python · License: MIT · Stargazers: 305 · Issues: 9 · Issues: 12

Adan-pytorch

Implementation of the Adan (ADAptive Nesterov momentum algorithm) Optimizer in Pytorch

Language: Python · License: MIT · Stargazers: 242 · Issues: 11 · Issues: 1
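
As a rough reference for what the optimizer computes, here is a hedged sketch of a single Adan update on one parameter tensor, following the paper's three moving averages (gradients, gradient differences, and squared 'corrected' gradients) with decoupled weight decay. Bias correction, parameter groups, and other details handled by the repository are omitted, and the default betas and names below are assumptions.

```python
# Hedged sketch of one Adan update step (illustrative; bias correction omitted)
import torch

def adan_step(param, grad, prev_grad, m, v, n,
              lr=1e-3, betas=(0.02, 0.08, 0.01), eps=1e-8, weight_decay=0.0):
    b1, b2, b3 = betas
    diff = grad - prev_grad                                       # gradient difference (Nesterov-style correction)
    m.mul_(1 - b1).add_(grad, alpha=b1)                           # EMA of gradients
    v.mul_(1 - b2).add_(diff, alpha=b2)                           # EMA of gradient differences
    n.mul_(1 - b3).add_((grad + (1 - b2) * diff) ** 2, alpha=b3)  # EMA of squared corrected gradients
    param.sub_((m + (1 - b2) * v) / (n.sqrt() + eps), alpha=lr)   # adaptive step
    param.div_(1 + lr * weight_decay)                             # decoupled weight decay
    return param, grad.clone()                                    # grad becomes prev_grad for the next step

p, g = torch.zeros(3), torch.ones(3)
p, prev = adan_step(p, g, torch.zeros(3), torch.zeros(3), torch.zeros(3), torch.zeros(3))
```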

se3-transformer-pytorch

Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with eventual Alphafold2 replication.

Language: Python · License: MIT · Stargazers: 236 · Issues: 11 · Issues: 15

flash-cosine-sim-attention

Implementation of fused cosine similarity attention in the same style as Flash Attention

Language: Cuda · License: MIT · Stargazers: 192 · Issues: 12 · Issues: 10
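
The math being fused is plain cosine-similarity attention: queries and keys are l2-normalized so the logits are bounded, then scaled by a temperature before the softmax. The reference (non-fused) computation looks roughly like the sketch below; the fixed scale value is an assumption, and the repository's point is doing this in a single fused CUDA kernel rather than in eager Pytorch.

```python
# Reference (non-fused) cosine-similarity attention, for illustration only
import torch
import torch.nn.functional as F

def cosine_sim_attention(q, k, v, scale=10.0):
    # q, k, v: (batch, heads, seq_len, dim_head)
    q, k = F.normalize(q, dim=-1), F.normalize(k, dim=-1)        # unit-norm queries and keys
    sim = torch.einsum('bhid,bhjd->bhij', q, k) * scale          # bounded logits, scaled by a temperature
    return torch.einsum('bhij,bhjd->bhid', sim.softmax(dim=-1), v)

q = k = v = torch.randn(1, 8, 64, 32)
print(cosine_sim_attention(q, k, v).shape)                       # torch.Size([1, 8, 64, 32])
```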

res-mlp-pytorch

Implementation of ResMLP, an all-MLP solution to image classification, in Pytorch

Language: Python · License: MIT · Stargazers: 190 · Issues: 4 · Issues: 3
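
For reference, a minimal sketch of one ResMLP block as described in the paper: an affine (statistics-free) 'normalization', a linear layer mixing across patches, a channel MLP, and small learned residual gains. Layer names and default values below are illustrative, not this repository's exact interface.

```python
# Hedged sketch of a ResMLP block (illustrative)
import torch
from torch import nn

class ResMLPBlockSketch(nn.Module):
    def __init__(self, dim, num_patches, expansion=4, init_scale=1e-4):
        super().__init__()
        # affine "norm": per-channel scale and shift, no statistics
        self.aff1_g, self.aff1_b = nn.Parameter(torch.ones(dim)), nn.Parameter(torch.zeros(dim))
        self.aff2_g, self.aff2_b = nn.Parameter(torch.ones(dim)), nn.Parameter(torch.zeros(dim))
        self.token_mix = nn.Linear(num_patches, num_patches)         # linear mixing across patches
        self.channel_mlp = nn.Sequential(nn.Linear(dim, dim * expansion), nn.GELU(), nn.Linear(dim * expansion, dim))
        self.gain1 = nn.Parameter(init_scale * torch.ones(dim))      # small residual gains (LayerScale-style)
        self.gain2 = nn.Parameter(init_scale * torch.ones(dim))

    def forward(self, x):                                            # x: (batch, num_patches, dim)
        y = self.aff1_g * x + self.aff1_b
        y = self.token_mix(y.transpose(1, 2)).transpose(1, 2)        # mix information across patches
        x = x + self.gain1 * y
        y = self.channel_mlp(self.aff2_g * x + self.aff2_b)          # mix information across channels
        return x + self.gain2 * y

print(ResMLPBlockSketch(128, 64)(torch.randn(2, 64, 128)).shape)     # torch.Size([2, 64, 128])
```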

chroma-pytorch

Implementation of Chroma, a generative model of proteins using DDPM and GNNs, in Pytorch

Language: Python · License: MIT · Stargazers: 158 · Issues: 20 · Issues: 3

invariant-point-attention

Implementation of Invariant Point Attention, used for coordinate refinement in the structure module of Alphafold2, as a standalone Pytorch module

Language: Python · License: MIT · Stargazers: 138 · Issues: 5 · Issues: 7

gated-state-spaces-pytorch

Implementation of Gated State Spaces, from the paper "Long Range Language Modeling via Gated State Spaces", in Pytorch

Language: Python · License: MIT · Stargazers: 94 · Issues: 6 · Issues: 3

perceiver-ar-pytorch

Implementation of Perceiver AR, Deepmind's new long-context attention network based on the Perceiver architecture, in Pytorch

Language: Python · License: MIT · Stargazers: 85 · Issues: 4 · Issues: 8

n-grammer-pytorch

Implementation of N-Grammer, augmenting Transformers with latent n-grams, in Pytorch

Language: Python · License: MIT · Stargazers: 72 · Issues: 7 · Issues: 3

memory-compressed-attention

Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia by Summarizing Long Sequences"

Language: Python · License: MIT · Stargazers: 71 · Issues: 2 · Issues: 4
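
The core trick from that paper is shortening the keys and values with a strided convolution before attending, so memory scales with the compressed length. A minimal sketch of that step is below (single head, no masking, compression factor of 3 as in the paper); the names and simplifications are mine, not the repository's.

```python
# Hedged sketch of memory-compressed attention's key/value compression (illustrative)
import torch
from torch import nn

class CompressedAttentionSketch(nn.Module):
    def __init__(self, dim, compress_factor=3):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_q, self.to_k, self.to_v = nn.Linear(dim, dim), nn.Linear(dim, dim), nn.Linear(dim, dim)
        # strided convolution merges every `compress_factor` positions into one
        self.compress = nn.Conv1d(dim, dim, compress_factor, stride=compress_factor)

    def forward(self, x):                                            # x: (batch, seq_len, dim)
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)
        k = self.compress(k.transpose(1, 2)).transpose(1, 2)         # (batch, seq_len // 3, dim)
        v = self.compress(v.transpose(1, 2)).transpose(1, 2)
        sim = torch.einsum('bid,bjd->bij', q, k) * self.scale        # attend over the shortened sequence
        return torch.einsum('bij,bjd->bid', sim.softmax(dim=-1), v)

print(CompressedAttentionSketch(64)(torch.randn(1, 96, 64)).shape)   # torch.Size([1, 96, 64])
```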

transframer-pytorch

Implementation of Transframer, Deepmind's U-net + Transformer architecture for generating videos up to 30 seconds long, in Pytorch

Language: Python · License: MIT · Stargazers: 65 · Issues: 4 · Issues: 3

adjacent-attention-network

Graph neural network message passing reframed as a Transformer with local attention

Language: Python · License: MIT · Stargazers: 59 · Issues: 6 · Issues: 0
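
The reframing amounts to ordinary dot-product attention in which each node may only attend to its graph neighbours, which can be expressed with an adjacency-matrix mask. A simplified sketch is below (single head, no edge features); it assumes the adjacency includes self-loops so every row has something to attend to, and the names are illustrative.

```python
# Hedged sketch of attention restricted to graph neighbours (illustrative)
import torch
from torch import nn

class AdjacentAttentionSketch(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_q, self.to_k, self.to_v = nn.Linear(dim, dim), nn.Linear(dim, dim), nn.Linear(dim, dim)

    def forward(self, nodes, adjacency):                             # nodes: (b, n, d), adjacency: (b, n, n) bool
        q, k, v = self.to_q(nodes), self.to_k(nodes), self.to_v(nodes)
        sim = torch.einsum('bid,bjd->bij', q, k) * self.scale
        sim = sim.masked_fill(~adjacency, float('-inf'))             # non-neighbours receive zero attention
        return torch.einsum('bij,bjd->bid', sim.softmax(dim=-1), v)

nodes = torch.randn(1, 5, 32)
adj = torch.eye(5, dtype=torch.bool).unsqueeze(0)                    # self-loops only, just for the smoke test
print(AdjacentAttentionSketch(32)(nodes, adj).shape)                 # torch.Size([1, 5, 32])
```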

equiformer-diffusion

Implementation of Denoising Diffusion for protein design, but using the new Equiformer (successor to SE3 Transformers) with some additional improvements

License: MIT · Stargazers: 55 · Issues: 13 · Issues: 0

isab-pytorch

An implementation of (Induced) Set Attention Block, from the Set Transformers paper

Language: Python · License: MIT · Stargazers: 53 · Issues: 6 · Issues: 1
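
For context, an Induced Set Attention Block replaces quadratic self-attention over the set with two cheaper steps: a small set of learned inducing points attends to the inputs, then the inputs attend back to that induced summary. The sketch below uses torch.nn.MultiheadAttention for brevity; the repository's own module differs in its details, and the names here are illustrative.

```python
# Hedged sketch of an Induced Set Attention Block (illustrative)
import torch
from torch import nn

class ISABSketch(nn.Module):
    def __init__(self, dim, num_induced=16, heads=4):
        super().__init__()
        self.inducing_points = nn.Parameter(torch.randn(num_induced, dim))
        self.attn_in = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn_out = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                                            # x: (batch, set_size, dim)
        i = self.inducing_points.unsqueeze(0).expand(x.shape[0], -1, -1)
        h, _ = self.attn_in(i, x, x)                                 # inducing points summarize the set
        out, _ = self.attn_out(x, h, h)                              # set elements read from the summary
        return out

print(ISABSketch(64)(torch.randn(2, 100, 64)).shape)                 # torch.Size([2, 100, 64])
```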

einops-exts

Implementation of some personal helper functions for Einops, my favorite tensor manipulation library ❤️

Language: Python · License: MIT · Stargazers: 51 · Issues: 4 · Issues: 1
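
As an example of the kind of helper this covers: applying one einops pattern to several tensors in a single call. The sketch below re-implements such a helper with plain einops; the name rearrange_many mirrors what the package exposes, but treat the exact signature as an assumption and check the repository for the real API.

```python
# Hedged sketch of a "rearrange over many tensors" helper, built on plain einops
import torch
from einops import rearrange

def rearrange_many(tensors, pattern, **kwargs):
    # map a single einops pattern over a tuple of tensors
    return tuple(rearrange(t, pattern, **kwargs) for t in tensors)

q = torch.randn(2, 16, 8, 64)
k = torch.randn(2, 16, 8, 64)
# fold the head dimension into the batch dimension for both tensors at once
q, k = rearrange_many((q, k), 'b n h d -> (b h) n d')
print(q.shape, k.shape)                                              # torch.Size([16, 16, 64]) each
```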

memory-editable-transformer

My explorations into editing the knowledge and memories of an attention network

License: MIT · Stargazers: 35 · Issues: 5 · Issues: 0

open_clip

An open source implementation of CLIP.

Language: Python · License: NOASSERTION · Stargazers: 9 · Issues: 1 · Issues: 0

bitsandbytes

8-bit CUDA functions for PyTorch

Language: Python · License: MIT · Stargazers: 4 · Issues: 1 · Issues: 0

Nim

Nim is a statically typed compiled systems programming language. It combines successful concepts from mature languages like Python, Ada and Modula. Its design focuses on efficiency, expressiveness, and elegance (in that order of priority).

Language: Nim · License: NOASSERTION · Stargazers: 3 · Issues: 1 · Issues: 0

nucleotide-transformer

🧬 Nucleotide Transformer: Building and Evaluating Robust Foundation Models for Human Genomics

Language: Python · License: NOASSERTION · Stargazers: 3 · Issues: 1 · Issues: 0

CLIP

Contrastive Language-Image Pretraining

Language: Jupyter Notebook · License: MIT · Stargazers: 2 · Issues: 1 · Issues: 0

pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration

Language: C++ · License: NOASSERTION · Stargazers: 2 · Issues: 1 · Issues: 0

RFdiffusion

Code for running RFdiffusion

Language: Python · License: NOASSERTION · Stargazers: 1 · Issues: 1 · Issues: 0