zhengzx-nlp's repositories
dynamic-nmt
PyTorch implementation of the EMNLP 2019 paper "Dynamic Past and Future for Neural Machine Translation"
DeepLearning-500-questions
Deep Learning 500 Questions: common topics in probability, linear algebra, machine learning, deep learning, computer vision, and other areas, presented in question-and-answer form to help the author and interested readers. The book comprises 18 chapters and nearly 300,000 words. Given the author's limited expertise, readers are kindly asked to point out any errors. To be continued... For collaboration inquiries, contact scutjy2015@163.com. All rights reserved; infringement will be prosecuted. Tan 2018.06
NJUNMT-tf-server
A client-server (C-S) web demo for NJUNMT-tf
academicpages.github.io
GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes
al-folio
A beautiful, simple, clean, and responsive Jekyll theme for academics
distance-parser
Source code for "Straight to the Tree: Constituency Parsing with Neural Syntactic Distance", published at ACL 2018
hardware-aware-transformers
[ACL 2020] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
lightning-hydra-template
PyTorch Lightning + Hydra. A feature-rich template for rapid, scalable and reproducible ML experimentation with best practices. ⚡🔥⚡
longformer
Longformer: The Long-Document Transformer
markdown-cv
A simple template to write your CV in a readable Markdown file and use CSS to publish/print it
neat-vision
Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (Framework-agnostic.)
numerical-linear-algebra
Free online textbook of Jupyter notebooks for the fast.ai Computational Linear Algebra course
pseudo-ref
Implementation of the pseudo-reference generation algorithm proposed in the EMNLP 2018 paper "Multi-Reference Training with Pseudo-References for Neural Translation and Text Generation"
pytorch_nmt
A neural machine translation model in PyTorch
PyTorchDiscreteFlows
Discrete Normalizing Flows implemented in PyTorch
reformer-pytorch
Reformer, the efficient Transformer, in PyTorch