This repository contains the PyTorch implementation of the paper ST2: Small-data Text Style Transfer via Multi-task Meta-Learning. Currently only the CP-VAE version is included; the CrossAlign and VAE versions will be added soon.
- torch==1.3.1
- python > 3.6
- tqdm
- pandas
- numpy
- scipy
Clone the repo and install the required packages:

```shell
git clone https://github.com/DRSY/MAML_CP_VAE.git
pip install -r requirements.txt
```
Enter the code directory:

```shell
cd code
```
Generate the corpus used for building the vocabulary:

```shell
bash scripts/get_pretrain_text.sh
```
Create the required directories:

```shell
bash scripts/make_dirs.sh
```
Set the corpus (s1 or s2):

```shell
export corpus=s1
export s=1
```
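The two exported variables must stay in sync (the corpus name is just `s` followed by the split number). A small sketch that derives both from a single split number, assuming s1 and s2 are the only supported corpora:

```shell
# Derive both variables from one split number so they cannot drift apart.
# Assumption: s1 and s2 are the only supported corpora.
s=1                          # change to 2 for the s2 corpus
corpus="s$s"
config="../config/s$s.json"  # the config file main.py is pointed at
echo "corpus=$corpus config=$config"
```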
Start training and inference (note: `--sub-batch-size` takes a value; substitute one for `<size>`):

```shell
python main.py --configpath ../config/s$s.json --corpus $corpus --maml-epochs 20 --transfer-epochs 0 --epochs-per-val 5 --maml-batch-size 8 --sub-batch-size <size> --train-batch-size 16 --device-idx 0
```
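For intuition, `--maml-epochs` and `--maml-batch-size` drive a MAML-style meta-training loop: adapt to each sub-task with an inner gradient step, then update the shared parameters from the adapted gradients. A minimal first-order sketch on a toy linear-regression problem (illustrative only; the repository meta-trains CP-VAE parameters, and all names below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_and_grad(w, X, y):
    """Mean squared error of the linear model X @ w, and its gradient."""
    err = X @ w - y
    return float(err @ err) / len(y), 2 * X.T @ err / len(y)

def sample_task():
    """Each 'task' is a 1-D linear regression with its own slope."""
    slope = rng.uniform(-2.0, 2.0)
    X = rng.normal(size=(16, 1))
    return X, slope * X[:, 0]

w = np.zeros(1)                     # meta-parameters shared across tasks
inner_lr, outer_lr, task_batch = 0.1, 0.05, 8
for _ in range(200):                # analogous to --maml-epochs
    meta_grad = np.zeros_like(w)
    for _ in range(task_batch):     # analogous to --maml-batch-size
        X, y = sample_task()
        _, g = loss_and_grad(w, X, y)
        w_task = w - inner_lr * g                   # inner adaptation step
        _, g_adapted = loss_and_grad(w_task, X, y)  # gradient after adapting
        meta_grad += g_adapted                      # first-order MAML signal
    w -= outer_lr * meta_grad / task_batch          # outer (meta) update
```

After meta-training, a single inner step on a fresh task should already lower that task's loss, which is the property the fine-tuning stage below exploits.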
Fine-tune on each sub-task:

```shell
cd ../CP-VAE
bash fine_tune.sh
bash fine_tune_copy.sh
```
Run inference after task-specific fine-tuning:

```shell
bash infer.sh
```
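The fine-tuning and inference steps are typically run back to back; a sketch of a small wrapper that chains them and aborts on the first failure (the script name `finetune_and_infer.sh` is hypothetical, not part of the repository):

```shell
# Write a wrapper chaining fine-tuning and inference (hypothetical helper).
cat > finetune_and_infer.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail          # abort on the first failing step
cd ../CP-VAE
bash fine_tune.sh
bash fine_tune_copy.sh
bash infer.sh              # inference after task-specific fine-tuning
EOF
chmod +x finetune_and_infer.sh
```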
Note that this repo only serves as a fast prototype implementation; a more robust refactor will follow soon.
The underpinning code for the CP-VAE component is adapted from the original CP-VAE implementation.