mbabby's repositories
x-crawl
x-crawl is a flexible, multi-purpose Node.js crawler library.
TensorLayer
Deep Learning and Reinforcement Learning Library for Scientists and Engineers
draw.io.file
my_draw_io_file
AISO
Authors' implementation of the paper "Adaptive Information Seeking for Open-Domain Question Answering" (EMNLP 2021).
reinforcement-learning-an-introduction
Python Implementation of Reinforcement Learning: An Introduction
lizard
A simple code complexity analyzer that ignores C/C++ header files and Java imports; supports most popular languages.
7days-golang
7 days of Go programs written from scratch (web framework Gee, distributed cache GeeCache, object-relational mapping (ORM) framework GeeORM, RPC framework GeeRPC, etc.); a 7-day, hands-on, implement-from-scratch series in Go.
trueskill
An implementation of the TrueSkill rating system for Python
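The core of the TrueSkill system is a Gaussian skill update after each match. As a rough illustration (not the repository's code), here is a minimal sketch of the standard win/loss update for one-vs-one games, assuming the usual defaults μ=25, σ=25/3, β=25/6 and ignoring draws and the dynamics term τ:

```python
import math

def pdf(x):
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def cdf(x):
    """Standard normal cumulative distribution."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def rate_1vs1(mu_w, sigma_w, mu_l, sigma_l, beta=25 / 6):
    """TrueSkill-style update after a single match; winner's rating first.
    Simplified sketch: no draw probability, no dynamics factor tau."""
    c = math.sqrt(2 * beta ** 2 + sigma_w ** 2 + sigma_l ** 2)
    t = (mu_w - mu_l) / c
    v = pdf(t) / cdf(t)        # mean-shift factor
    w = v * (v + t)            # variance-shrink factor
    new_mu_w = mu_w + sigma_w ** 2 / c * v
    new_mu_l = mu_l - sigma_l ** 2 / c * v
    new_sigma_w = sigma_w * math.sqrt(max(1 - sigma_w ** 2 / c ** 2 * w, 0.0))
    new_sigma_l = sigma_l * math.sqrt(max(1 - sigma_l ** 2 / c ** 2 * w, 0.0))
    return (new_mu_w, new_sigma_w), (new_mu_l, new_sigma_l)

# Two fresh players; player 1 wins: their mu rises, both sigmas shrink.
winner, loser = rate_1vs1(25.0, 25 / 3, 25.0, 25 / 3)
```

The library itself wraps this math in `Rating` objects and also handles draws and multi-team matches.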
pytorch-book
PyTorch tutorials and fun projects, including neural talk, neural style, poem writing, and anime generation.
Depix
Recovers passwords from pixelized screenshots
tinygrad
You like pytorch? You like micrograd? You love tinygrad! ❤️
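The micrograd lineage referenced here is a tiny reverse-mode autodiff engine over scalars. As a hedged sketch of that idea (illustrative only, not tinygrad's actual implementation), a minimal `Value` class looks like:

```python
class Value:
    """Scalar with reverse-mode autodiff, micrograd-style (illustrative sketch)."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # local gradient rule, set by each op
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x, y = Value(2.0), Value(3.0)
z = x * y + x        # z = 8; dz/dx = y + 1, dz/dy = x
z.backward()
```

tinygrad generalizes the same backpropagation idea from scalars to lazily evaluated tensors with hardware backends.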
ICLR2021-OpenReviewData
Crawl & visualize ICLR papers and reviews.
google-research
Google Research
ganbert
Enhancing the BERT training with Semi-supervised Generative Adversarial Networks
gpt-3
GPT-3: Language Models are Few-Shot Learners
mt-dnn
Multi-Task Deep Neural Networks for Natural Language Understanding
lite-transformer
[ICLR 2020] Lite Transformer with Long-Short Range Attention
MultiObjectiveOptimization
Source code for Neural Information Processing Systems (NeurIPS) 2018 paper "Multi-Task Learning as Multi-Objective Optimization"
adanet
Fast and flexible AutoML with learning guarantees.
uda
Unsupervised Data Augmentation (UDA)
avatarify
Avatars for Zoom and Skype
bert-of-theseus-tf
TensorFlow version of BERT-of-Theseus.
bert-multi-gpu
Feel free to fine-tune large BERT models with multi-GPU and FP16 support.
PLMpapers
Must-read Papers on pre-trained language models.
nlp-architect
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
BiaffineDependencyParsing
BERT + self-attention encoder; biaffine decoder; PyTorch implementation.
radish
A C++ framework for model training and inference.
Pretrained-Language-Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.