EricLee8 / GLMP

PyTorch code for ICLR 2019 paper: Global-to-local Memory Pointer Networks for Task-Oriented Dialogue https://arxiv.org/pdf/1901.04713


Train a model for task-oriented dialog datasets

We created myTrain.py to train models. For GLMP on bAbI dialogue tasks 1-5, run:

❱❱❱ python3 myTrain.py -lr=0.001 -l=1 -hdd=128 -dr=0.2 -dec=GLMP -bsz=8 -ds=babi -t=1 

or for GLMP on SMD:

❱❱❱ python3 myTrain.py -lr=0.001 -l=1 -hdd=128 -dr=0.2 -dec=GLMP -bsz=8 -ds=kvr

While training, the model with the best validation performance is saved. If you want to reuse a saved model, add -path=path_name_model to the command. The model is evaluated using per-response accuracy, WER, F1, and BLEU.
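Per-response accuracy counts a generated response as correct only when it exactly matches the gold response. A minimal illustrative sketch (not the repository's evaluation code; the function name and inputs are hypothetical):

```python
def per_response_accuracy(preds, golds):
    """Fraction of predicted responses that exactly match the gold responses.

    Illustrative sketch only: assumes `preds` and `golds` are parallel lists
    of whitespace-normalized response strings.
    """
    correct = sum(p.strip() == g.strip() for p, g in zip(preds, golds))
    return correct / len(golds)

preds = ["what time is my meeting", "it is at 3 pm"]
golds = ["what time is my meeting", "it is at 4 pm"]
print(per_response_accuracy(preds, golds))  # → 0.5
```

WER, F1, and BLEU are computed analogously over the test set; the bAbI tasks are typically reported with per-response accuracy, while SMD uses entity F1 and BLEU.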

Test a model for task-oriented dialog datasets

We created myTest.py to test models. For GLMP on bAbI tasks 1-5, run:

❱❱❱ python myTest.py -ds=babi -path=<path_to_saved_model> 

or for GLMP on SMD:

❱❱❱ python myTest.py -ds=kvr -path=<path_to_saved_model> -rec=1


Languages

Python 94.0%, Perl 6.0%