Charlie Mou's repositories

adapter-transformers

Hugging Face Transformers + Adapters = ❤️

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0
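
A minimal sketch of the adapter workflow this fork enables, assuming the adapter-transformers fork is installed in place of stock `transformers` (the model checkpoint and adapter name here are illustrative):

```python
# Attach a small bottleneck adapter to a frozen pre-trained backbone
# and train only the adapter weights.
from transformers import BertModel, BertTokenizer  # provided by the adapter-transformers fork

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

model.add_adapter("my_task")          # insert new adapter modules ("my_task" is illustrative)
model.train_adapter("my_task")        # freeze the backbone, unfreeze only the adapter
model.set_active_adapters("my_task")  # route the forward pass through the adapter

inputs = tokenizer("Adapters are parameter-efficient.", return_tensors="pt")
outputs = model(**inputs)
```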

CodeT5

Code for CodeT5: a new code-aware pre-trained encoder-decoder model.

Language: Python · License: BSD-3-Clause · Stargazers: 0 · Issues: 0
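
The released checkpoints are also usable through Hugging Face Transformers; a minimal sketch of masked-span prediction with the `Salesforce/codet5-base` checkpoint, following the project's README:

```python
from transformers import RobertaTokenizer, T5ForConditionalGeneration

tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-base")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")

# Ask the model to fill in the masked span <extra_id_0>.
text = "def greet(user): print(f'hello <extra_id_0>!')"
input_ids = tokenizer(text, return_tensors="pt").input_ids
generated_ids = model.generate(input_ids, max_length=8)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))  # e.g. "{user.username}"
```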

flax

Flax is a neural network library for JAX that is designed for flexibility.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0
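
A minimal Flax sketch (the model here is illustrative, not from this repo): parameters are an explicit pytree created by `init` and consumed by the pure `apply`, which is what makes Flax modules jit- and grad-friendly:

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

class MLP(nn.Module):
    features: int

    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(self.features)(x))
        return nn.Dense(1)(x)

model = MLP(features=16)
x = jnp.ones((4, 8))                           # dummy batch of 4 inputs
params = model.init(jax.random.PRNGKey(0), x)  # parameters are an explicit pytree
y = model.apply(params, x)                     # pure function of (params, x)
```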

github-slideshow

A robot-powered training repository :robot:

Language: Ruby · License: MIT · Stargazers: 0 · Issues: 0

jax2torch

Use JAX functions in PyTorch

Language: Python · License: MIT · Stargazers: 0 · Issues: 0
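
A sketch of the intended usage, assuming the `jax2torch` wrapper from the project's README: a JAX function is wrapped once and then called on PyTorch tensors, with gradients flowing back through `torch.autograd`:

```python
import jax
import torch
from jax2torch import jax2torch  # assumed import per the project's README

@jax.jit
def jax_pow(x, y=2):
    return x ** y

torch_pow = jax2torch(jax_pow)   # wrap once; call with torch tensors afterwards

x = torch.tensor([2.0, 3.0], requires_grad=True)
out = torch_pow(x, y=3).sum()
out.backward()                   # gradients flow back into PyTorch
print(x.grad)                    # tensor([12., 27.])  (d/dx x^3 = 3x^2)
```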

mup

Maximal Update Parametrization (µP)

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 0
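
A compressed sketch of the µP recipe, with the API names (`MuReadout`, `set_base_shapes`, `MuAdam`) assumed from the mup package's README and an illustrative MLP: replace the output layer with `MuReadout`, register base shapes from a narrow reference model, and train with a µP-aware optimizer so hyperparameters tuned at small width transfer to large width:

```python
import torch.nn as nn
from mup import MuReadout, set_base_shapes, MuAdam

def make_model(width):
    return nn.Sequential(
        nn.Linear(32, width),
        nn.ReLU(),
        MuReadout(width, 10),  # µP-scaled replacement for the output nn.Linear
    )

base_model = make_model(width=8)    # narrow reference model
delta_model = make_model(width=16)  # tells mup which dimensions scale with width
model = make_model(width=512)       # the model actually trained
set_base_shapes(model, base_model, delta=delta_model)

optimizer = MuAdam(model.parameters(), lr=1e-3)  # lr tuned at small width transfers
```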

PPT

Official code for "PPT: Pre-trained Prompt Tuning for Few-shot Learning" (ACL 2022).

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

reinforcement-learning

Implementations of reinforcement learning algorithms in Python, using OpenAI Gym and TensorFlow. Exercises and solutions to accompany Sutton's book and David Silver's course.

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 0
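
For flavor, an illustrative tabular Q-learning loop in the style of these exercises (not code from the repo), written against the classic pre-0.26 Gym API:

```python
import numpy as np
import gym

env = gym.make("FrozenLake-v1")
Q = np.zeros((env.observation_space.n, env.action_space.n))
alpha, gamma, eps = 0.1, 0.99, 0.1  # step size, discount, exploration rate

for episode in range(5000):
    state = env.reset()
    done = False
    while not done:
        # epsilon-greedy action selection
        if np.random.rand() < eps:
            action = env.action_space.sample()
        else:
            action = int(np.argmax(Q[state]))
        next_state, reward, done, _ = env.step(action)
        # one-step TD update toward the greedy bootstrap target
        Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
        state = next_state
```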

reinforcement-learning-an-introduction

Python Implementation of Reinforcement Learning: An Introduction

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

SCoRE

ICLR 2021: Pre-Training for Context Representation in Conversational Semantic Parsing

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

TurboTransformers

A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.

Language: C++ · License: NOASSERTION · Stargazers: 0 · Issues: 0
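
A rough sketch of the advertised drop-in workflow; the `from_torch` converter and the call signature are assumed from the project's README and may differ across versions:

```python
import torch
import transformers
import turbo_transformers  # API names below assumed from the project's README

torch_model = transformers.BertModel.from_pretrained("bert-base-uncased")
torch_model.eval()

# One-time conversion of the PyTorch weights into the TurboTransformers runtime.
turbo_model = turbo_transformers.BertModel.from_torch(torch_model)

input_ids = torch.tensor([[101, 7592, 2088, 102]])  # "[CLS] hello world [SEP]"
with torch.no_grad():
    outputs = turbo_model(input_ids)  # same call style as the source model, per README
```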