DRSY / MAML_CP_VAE


MAML-CP-VAE

This repository contains the PyTorch implementation of the paper ST2: Small-data Text Style Transfer via Multi-task Meta-Learning. For now only the CP-VAE version is included; the CrossAlign and VAE versions will be added soon.

Dependencies

  • torch==1.3.1
  • python > 3.6
  • tqdm
  • pandas
  • numpy
  • scipy
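If `requirements.txt` is missing or incomplete, a minimal sketch consistent with the dependency list above would be (only the torch pin comes from this README; everything else is left unpinned as an assumption):

```
torch==1.3.1
tqdm
pandas
numpy
scipy
```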

Usage

Clone the repo and install the required packages:

git clone https://github.com/DRSY/MAML_CP_VAE.git
pip install -r requirements.txt

Enter the code directory:

cd code

Generate the corpus used to build the vocabulary:

bash scripts/get_pretrain_text.sh

Create the required directories:

bash scripts/make_dirs.sh

Set the corpus (s1 or s2):

export corpus=s1
export s=1
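The two variables must agree, since `$s` selects the config file and `$corpus` names the dataset in the training command below. For the s2 split, the analogous setting would be (an assumption extrapolated from the s1 example; this repo's scripts define the actual valid values):

```shell
export corpus=s2
export s=2
```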

Start training and inference:

python main.py --configpath ../config/s$s.json --corpus $corpus --maml-epochs 20 --transfer-epochs 0 --epochs-per-val 5 --maml-batch-size 8 --sub-batch-size --train-batch-size 16 --device-idx 0
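The `--maml-epochs` phase above performs MAML-style meta-training across style-transfer sub-tasks. As a rough illustration of the control flow (not this repo's code), here is a minimal first-order meta-learning sketch in the style of Reptile, using toy 1-D linear-regression tasks in place of text sub-tasks; all names (`make_task`, `inner_sgd`, `w_meta`) are hypothetical:

```python
import random

random.seed(0)

def make_task():
    # Each toy "task" is 1-D linear regression with its own slope,
    # standing in for one style-transfer sub-task (illustrative only).
    w_true = random.uniform(-2.0, 2.0)
    xs = [random.uniform(-1.0, 1.0) for _ in range(16)]
    ys = [w_true * x for x in xs]
    return xs, ys

def mse(w, xs, ys):
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def inner_sgd(w, xs, ys, lr=0.5, steps=10):
    # Inner loop: task-specific adaptation via a few gradient steps on MSE.
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

# Outer loop: first-order meta-update (Reptile-style) that nudges the
# shared initialisation toward each task's adapted parameters.
w_meta = 0.0
for _ in range(100):
    tasks = [make_task() for _ in range(4)]          # a meta-batch of tasks
    adapted = [inner_sgd(w_meta, xs, ys) for xs, ys in tasks]
    w_meta += 0.1 * (sum(adapted) / len(adapted) - w_meta)
```

The `--maml-batch-size` flag in the command above corresponds to the meta-batch of sub-tasks per outer step; the fine-tuning scripts in the next step play the role of the inner adaptation loop, run once more on each target sub-task after meta-training.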

Fine-tune on each sub-task:

cd ../CP-VAE
bash fine_tune.sh
bash fine_tune_copy.sh

Run inference after task-specific fine-tuning:

bash infer.sh

Note that this repo serves only as a fast prototype implementation; a more robust refactor will follow soon.

Acknowledgement

The underlying CP-VAE code is adapted from the original CP-VAE repository.

About

License: GNU General Public License v2.0


Languages

  • Roff 98.5%
  • Python 1.5%
  • Shell 0.0%