Maluuba / nlg-eval

Evaluation code for various unsupervised automated metrics for Natural Language Generation.

Home Page: http://arxiv.org/abs/1706.09799


zzcceeyy opened this issue

Can the evaluation be applied to Chinese?

Sort of; this is mostly answered in #118 and the other issues linked from there.
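In short (this is an assumption based on how these overlap metrics work, not an official statement from the repo): metrics such as BLEU, ROUGE, and METEOR operate on whitespace-tokenized text, so Chinese input needs to be pre-segmented before being passed in. A minimal sketch of per-character segmentation (the helper name `segment_chars` is hypothetical, not part of nlg-eval):

```python
def segment_chars(text: str) -> str:
    """Insert spaces between characters so that whitespace-based
    n-gram metrics treat each Chinese character as a token.
    Word-level segmenters (e.g. jieba) are a common alternative."""
    return " ".join(text.replace(" ", ""))

# Both hypothesis and references would be segmented the same way
# before being handed to the metric code.
print(segment_chars("今天天气很好"))  # → 今 天 天 气 很 好
```

Whether character-level or word-level segmentation is more appropriate depends on the task; the linked issues discuss the trade-offs.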