AiLearning: Machine Learning (ML), Deep Learning (DL), Natural Language Processing (NLP)
A PyTorch implementation of the Transformer model in "Attention is All You Need".
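As a quick orientation to what such an implementation centers on, here is a minimal PyTorch sketch of the scaled dot-product attention from the paper; the function name, tensor names, and shapes are illustrative and not taken from the repository itself.

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V.

    q, k, v: (batch, heads, seq_len, d_k) tensors (illustrative shapes).
    mask:    optional boolean tensor broadcastable to (batch, heads, q_len, k_len),
             True where attention is NOT allowed.
    """
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)     # (batch, heads, q_len, k_len)
    if mask is not None:
        scores = scores.masked_fill(mask, float("-inf"))  # block masked positions
    weights = F.softmax(scores, dim=-1)                   # attention distribution
    return weights @ v                                    # weighted sum of the values
```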
LeetCode, HackerRank, 剑指offer (Coding Interviews): classic algorithm implementations
TensorFlow code and pre-trained models for BERT
Keras implementation of BERT (Bidirectional Encoder Representations from Transformers)
PyTorch implementation of Google AI's 2018 BERT
Pre-training of Deep Bidirectional Transformers for Language Understanding
The Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention-flow mechanism to obtain a query-aware context representation without early summarization.
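To make that mechanism concrete, below is a minimal PyTorch sketch of the bi-directional attention layer (context-to-query and query-to-context attention, followed by the query-aware concatenation); the class name, variable names, and shapes are illustrative and not taken from any particular BiDAF repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiDAFAttention(nn.Module):
    """Sketch of the bi-directional attention flow layer (Seo et al., 2017).

    Illustrative shapes: context H is (batch, T, d), query U is (batch, J, d).
    """
    def __init__(self, d):
        super().__init__()
        # Trainable weights for the similarity score w^T [h; u; h * u]
        self.w = nn.Linear(3 * d, 1, bias=False)

    def forward(self, H, U):
        B, T, d = H.shape
        J = U.size(1)
        # Pairwise similarity S[t, j] between every context / query word pair.
        h = H.unsqueeze(2).expand(B, T, J, d)
        u = U.unsqueeze(1).expand(B, T, J, d)
        S = self.w(torch.cat([h, u, h * u], dim=-1)).squeeze(-1)         # (B, T, J)

        # Context-to-query: each context word attends over the query words.
        a = F.softmax(S, dim=-1)                                          # (B, T, J)
        U_tilde = torch.bmm(a, U)                                         # (B, T, d)

        # Query-to-context: attend to the context words most relevant to any query word.
        b = F.softmax(S.max(dim=-1).values, dim=-1)                       # (B, T)
        H_tilde = torch.bmm(b.unsqueeze(1), H).expand(B, T, d)            # (B, T, d)

        # Query-aware context representation G, with no early summarization.
        return torch.cat([H, U_tilde, H * U_tilde, H * H_tilde], dim=-1)  # (B, T, 4d)
```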
A visualization and analysis tool for the Bitcoin price bubble, including basic price information, 60-day cumulative increase, a hot-keywords index, and a bubble index.
A Chinese information extraction tool.
Complete solutions for Stanford CS224n, Winter 2019
An experimental, demo-level tool for text information extraction (event-triple extraction) based on dependency parsing and semantic role labeling; it can serve as a route to text-understanding applications such as document topic chains, event chains, and topic graphs.
Code and model for the paper "Improving Language Understanding by Generative Pre-Training"
Shared repository for open-sourced projects from the Google AI Language team.
Overview of Modern Deep Learning Techniques Applied to Natural Language Processing
Sample code for NNDL
Neural Network and Deep Learning (《神经网络与深度学习》)
Lisp code for the textbook "Paradigms of Artificial Intelligence Programming"
A PyTorch implementation of OpenAI's finetuned transformer language model with a script to import the weights pre-trained by OpenAI
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Python interface to Google word2vec
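For context on what querying word2vec from Python typically looks like, the snippet below loads and queries Google's pretrained vectors; it uses gensim rather than the package listed above, and the file path is only a placeholder.

```python
from gensim.models import KeyedVectors

# Load the pretrained Google News vectors (binary word2vec format); path is a placeholder.
vectors = KeyedVectors.load_word2vec_format("GoogleNews-vectors-negative300.bin", binary=True)

# Query the embedding space: nearest neighbours and a raw vector lookup.
print(vectors.most_similar("king", topn=5))
print(vectors["computer"][:10])
```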