Piji Li's repositories
TranSummar
Transformer for abstractive summarization on the CNN/Daily Mail and Gigaword datasets
wikiextractor
A tool for extracting plain text from Wikipedia dumps
ray
Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a toolkit of libraries (Ray AIR) for accelerating ML workloads.
RL4LMs
A modular RL library to fine-tune language models to human preferences
NAS_transformer
Evolutionary Neural Architecture Search on Transformers for RUL Prediction
TwiBot-22
Official repository of TwiBot-22 @ NeurIPS 2022, Datasets and Benchmarks Track.
stablediffusion
High-Resolution Image Synthesis with Latent Diffusion Models
OFA
Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework
torchscale
Transformers at any scale
AIStartups
Startups working on artificial intelligence (DM, ML, NLP, CV, ...)
CodeGeeX
CodeGeeX: An Open Multilingual Code Generation Model
Poisson_flow
Code for NeurIPS 2022 Paper, "Poisson Flow Generative Models"
minimal-text-diffusion
A minimal implementation of diffusion models for text generation
unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
L2I
The baseline method for CCIR 22 https://www.datafountain.cn/competitions/573
Diffusion-LM
Diffusion-LM
N-CMAPSS_DL
N-CMAPSS data preparation for Machine Learning and Deep Learning models. (Python source code for new CMAPSS dataset)
TAT-QA
TAT-QA (Tabular And Textual dataset for Question Answering) contains 16,552 questions associated with 2,757 hybrid contexts from real-world financial reports.
vit-pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
mathQ
Data and code for the paper "A Neural Network Solves, Explains, and Generates University Math Problems by Program Synthesis and Few-Shot Learning at Human Level" by Drori et al., 2022.
QA-CivilAviationKG
An automatic question-answering system based on a knowledge graph for the civil aviation industry
Megatron-LM
Ongoing research training transformer language models at scale, including: BERT & GPT-2