Fandong Meng (fandongmeng)


Company: Tencent WeChat AI

Location: Beijing

Home Page: http://fandongmeng.github.io/


Fandong Meng's repositories

DTMT_InDec

Implementation of DTMT with incremental decoding

Language: xBase · License: BSD-3-Clause · Stargazers: 13 · Issues: 3 · Issues: 0

StackedDTMT

Stacked Encoder Enhanced Deep Transition RNMT

Language: xBase · License: BSD-3-Clause · Stargazers: 3 · Issues: 1 · Issues: 0

RSI-NAT

Source code for "Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation"

Language: Python · License: BSD-3-Clause · Stargazers: 1 · Issues: 2 · Issues: 0

awesome-multimodal-ml

Reading list for research topics in multimodal machine learning

License: MIT · Stargazers: 0 · Issues: 2 · Issues: 0

bert

TensorFlow code and pre-trained models for BERT

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 3 · Issues: 0

CapsNet-Tensorflow

A TensorFlow implementation of CapsNet (Capsule Networks) from Hinton's paper "Dynamic Routing Between Capsules"

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 3 · Issues: 0
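The CapsNet repository above implements the routing-by-agreement procedure from the cited paper. A minimal, dependency-free sketch of that routing step (plain Python lists with illustrative dimensions, an assumption for exposition rather than the repository's actual TensorFlow code):

```python
import math
import random

def squash(v):
    # Squashing non-linearity: preserves direction, maps the norm into [0, 1).
    n2 = sum(x * x for x in v)
    scale = n2 / (1.0 + n2) / math.sqrt(n2 + 1e-9)
    return [scale * x for x in v]

def dynamic_routing(u_hat, iters=3):
    """Routing-by-agreement.

    u_hat[i][j] is the prediction vector from input capsule i
    to output capsule j. Returns one output vector per output capsule.
    """
    n_in, n_out = len(u_hat), len(u_hat[0])
    dim = len(u_hat[0][0])
    b = [[0.0] * n_out for _ in range(n_in)]  # routing logits
    v = []
    for _ in range(iters):
        # Coupling coefficients: softmax of b over output capsules.
        c = []
        for i in range(n_in):
            m = max(b[i])
            e = [math.exp(x - m) for x in b[i]]
            s = sum(e)
            c.append([x / s for x in e])
        # Weighted sum of predictions per output capsule, then squash.
        v = []
        for j in range(n_out):
            s_j = [sum(c[i][j] * u_hat[i][j][d] for i in range(n_in))
                   for d in range(dim)]
            v.append(squash(s_j))
        # Agreement update: reward predictions aligned with the output.
        for i in range(n_in):
            for j in range(n_out):
                b[i][j] += sum(a * t for a, t in zip(u_hat[i][j], v[j]))
    return v
```

Because of the squash non-linearity, every returned vector has norm strictly below 1, so its length can be read as the capsule's activation probability.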

ClidSum

ClidSum: A Benchmark Dataset for Cross-Lingual Dialogue Summarization

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

ColossalAI

Colossal-AI: A Unified Deep Learning System for Large-Scale Parallel Training

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

Cross-Domain_NER

Cross-domain NER using cross-domain language modeling, code for ACL 2019 paper

Language: Python · Stargazers: 0 · Issues: 2 · Issues: 0

fandongmeng.github.com

My online resume

Language: HTML · Stargazers: 0 · Issues: 0 · Issues: 0

GCDT

Code for the paper: GCDT: A Global Context Enhanced Deep Transition Architecture for Sequence Labeling

Language: PLSQL · License: BSD-3-Clause · Stargazers: 0 · Issues: 1 · Issues: 0

MSCTD

Code and Data for the ACL22 main conference paper "MSCTD: A Multimodal Sentiment Chat Translation Dataset"

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

MTBook

《机器翻译:基础与模型》 (Machine Translation: Foundations and Models), by Tong Xiao and Jingbo Zhu

Language: TeX · Stargazers: 0 · Issues: 1 · Issues: 0

nmtpytorch

Sequence-to-Sequence Framework in PyTorch

Language: Jupyter Notebook · License: NOASSERTION · Stargazers: 0 · Issues: 2 · Issues: 0

OR-NMT

Source code for the paper "Bridging the Gap between Training and Inference for Neural Machine Translation"

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

sha-rnn

Single Headed Attention RNN - "Stop thinking with your head"

Language: Python · Stargazers: 0 · Issues: 2 · Issues: 0

sparse-and-robust-PLM

[NeurIPS 2022] "A Win-win Deal: Towards Sparse and Robust Pre-trained Language Models", Yuanxin Liu, Fandong Meng, Zheng Lin, Jiangnan Li, Peng Fu, Yanan Cao, Weiping Wang, Jie Zhou

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

tf_ner

Simple and efficient TensorFlow implementations of NER models with tf.estimator and tf.data

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 3 · Issues: 0

TLAT-NMT

Source code for the EMNLP 2020 long paper "Token-level Adaptive Training for Neural Machine Translation".

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

TurboTransformers

A fast and user-friendly tool for transformer inference on CPU and GPU

Language: C++ · License: NOASSERTION · Stargazers: 0 · Issues: 1 · Issues: 0

wenmt

NMT models

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

WeTS

A benchmark for the task of translation suggestion

Language: Mask · License: Unlicense · Stargazers: 0 · Issues: 1 · Issues: 0

wit

WIT (Wikipedia-based Image Text) Dataset is a large multimodal multilingual dataset comprising 37M+ image-text sets with 11M+ unique images across 100+ languages.

License: NOASSERTION · Stargazers: 0 · Issues: 1 · Issues: 0