Ziems / HomebrewNLP-MTF

HomebrewNLP in Mesh-TensorFlow flavour for distributed TPU training

OBST

Copyright (c) 2020-2021 Yannic Kilcher (yk), Lucas Nestler (clashluke), Shawn Presser (shawwn), Jan (xmaster96)

Quickstart

First, create your VM through the Google Cloud Shell with ctpu up --vm-only. This way it has all the necessary permissions to connect to your Cloud Storage buckets and TPUs.
Next, install the requirements with pip on your VM using git clone https://github.com/tensorfork/obst && cd obst && python3 -m pip install -r requirements.txt.
Finally, start a TPU to kick off a training run using python3 main.py --model configs/big_ctx.json --tpu ${YOUR_TPU_NAME}.
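Put together, the three steps above look roughly like the sketch below (the config path is the example from above and ${YOUR_TPU_NAME} is a placeholder for your own TPU; adjust both for your setup):

```bash
# Create a VM with permissions to reach your Cloud Storage buckets and TPUs
ctpu up --vm-only

# On the VM: fetch the code and install the Python dependencies
git clone https://github.com/tensorfork/obst && cd obst
python3 -m pip install -r requirements.txt

# Kick off a training run on your TPU (replace ${YOUR_TPU_NAME})
python3 main.py --model configs/big_ctx.json --tpu ${YOUR_TPU_NAME}
```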

Acknowledgements

We also want to explicitly thank

About

License: BSD 2-Clause "Simplified" License


Languages

Python: 97.8%
Shell: 1.2%
Cython: 1.0%
Dockerfile: 0.1%