taras-sereda / Trainer

๐Ÿธ - A general purpose model trainer, as flexible as it gets


👟 Trainer

An opinionated, general-purpose model trainer built on PyTorch with a simple code base.

Installation

From GitHub:

git clone https://github.com/coqui-ai/Trainer
cd Trainer
make install

From PyPI:

pip install trainer

Prefer installing from GitHub, as it is more stable.

Implementing a model

Subclass TrainerModel and overload its functions.
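
The sketch below illustrates the pattern only. The hook names and signatures used here (train_step, eval_step, get_criterion, and the returned output/loss dicts) are assumptions about a typical version of the Trainer API and may differ in yours; treat the MNIST test script in the repository as the authoritative reference.

from torch import nn
from trainer import TrainerModel

# Minimal sketch of a TrainerModel subclass. The hook names and signatures
# below (train_step, eval_step, get_criterion) are assumed and may differ
# between Trainer versions.
class MyModel(TrainerModel):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 128),
            nn.ReLU(),
            nn.Linear(128, 10),
        )

    def forward(self, x):
        return self.net(x)

    def train_step(self, batch, criterion):
        x, y = batch
        logits = self.forward(x)
        loss = criterion(logits, y)
        # Return model outputs plus a dict of losses for the trainer to log.
        return {"model_outputs": logits}, {"loss": loss}

    def eval_step(self, batch, criterion):
        return self.train_step(batch, criterion)

    def get_criterion(self):
        return nn.CrossEntropyLoss()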

Training a model

See the test script in the repository that trains a basic MNIST model.
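
In outline, a run wraps the model in a Trainer and calls fit(). The config class, its fields, and the constructor arguments shown below are assumptions about the current API; the MNIST test script shows the exact call for your version.

# Sketch of launching a run with the MyModel subclass from above.
# The TrainerConfig import, its fields, and the Trainer constructor
# arguments are assumed here and may differ between versions.
from trainer import Trainer, TrainerArgs, TrainerConfig

config = TrainerConfig(
    epochs=5,
    batch_size=64,
)
trainer = Trainer(
    TrainerArgs(),
    config,
    output_path="runs/mnist",
    model=MyModel(),
)
trainer.fit()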

Training with DDP

$ python -m trainer.distribute --script path/to/your/train.py --gpus "0,1"

We don't use .spawn() to initiate multi-GPU training since it comes with certain limitations:

  • Everything must be picklable.
  • .spawn() trains the model in subprocesses, and the copy of the model in the main process is not updated.
  • DataLoader with N worker processes gets really slow when N is large.

Supported Experiment Loggers

To add a new logger, you must subclass BaseDashboardLogger and overload its functions.
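
As a rough sketch of that pattern only: the import path and the overridden methods below (add_scalar, flush, finish) are assumptions about the BaseDashboardLogger interface, not its actual contract; check the built-in loggers shipped with the package for the real hooks.

# Sketch of a custom experiment logger. The import path and the method names
# (add_scalar, flush, finish) are assumed; BaseDashboardLogger may define
# additional abstract methods that a real logger must also implement.
from trainer.logging.base_dash_logger import BaseDashboardLogger

class PrintLogger(BaseDashboardLogger):
    def add_scalar(self, title, value, step):
        print(f"step {step} | {title}: {value}")

    def flush(self):
        pass

    def finish(self):
        pass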
