
CLM Model Tuning

Note: This script was adapted from HuggingFace's Transformers/language-modeling code.

Installation & Requirements

bittensor must be installed either locally or in the virtual environment you are working from.

Run pip install -r requirements.txt to install the additional packages for this script.

Language model tuning

This script fine-tunes library models for language modeling on a text dataset, for models such as GPT and GPT-2. Causal language models like these are trained or fine-tuned using a causal language modeling (CLM) loss.

In the following examples, we will run on datasets hosted on Bittensor's IPFS mountain dataset, on HuggingFace's dataset hub, or on your own text files.

On bittensor

By default, the script will fine-tune GPT2 on bittensor's mountain dataset. Running:

python finetune_using_clm.py

will tune gpt2 with bittensor's dataset and save the output to tuned-model.
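Once tuning finishes, the saved checkpoint can be loaded like any other transformers model. A minimal sketch, assuming the default tuned-model output directory and that transformers is installed:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tuned model from the default output directory.
tokenizer = AutoTokenizer.from_pretrained("tuned-model")
model = AutoModelForCausalLM.from_pretrained("tuned-model")

# Generate a short continuation as a quick smoke test.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))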

To change the model you are tuning, e.g. to distilgpt2, run:

python finetune_using_clm.py model.name=distilgpt2

Some sample models to try are available under the server customization section of bittensor's documentation. A full list of models that can be trained by this script is available on huggingface.

On huggingface datasets

Any text dataset on huggingface should work by default by overriding the dataset.name and dataset.config_name parameters:

python finetune_using_clm.py dataset.name=wikitext dataset.config_name=wikitext-103-v1
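Overrides compose, so the model and dataset can be changed in a single run, using only the parameters documented above:

python finetune_using_clm.py model.name=distilgpt2 dataset.name=wikitext dataset.config_name=wikitext-103-v1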

On your own data

If you have a .txt file saved locally, you can override dataset.name as above:

python finetune_using_clm.py dataset.name=./path/to/your/data.txt
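If you want to check how your file will be parsed before training, HuggingFace's datasets library loads plain text the same way the upstream language-modeling examples do. A quick sketch (the file path is the placeholder from above):

from datasets import load_dataset

# Each line of the .txt file becomes one example with a "text" field.
dataset = load_dataset("text", data_files="./path/to/your/data.txt")
print(dataset["train"][0])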

Note that if you use your own data, you may have many short sentences, and the block size may be insufficient for reasonable performance. It's recommended to pass the flag dataset.concatenate_raw=true to give the model more context during training. This will reduce the number of batches.
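For intuition, concatenation corresponds to the standard grouping step in HuggingFace's language-modeling examples (which this script was adapted from): tokenized examples are joined end to end and re-split into fixed-size blocks, so short lines share context and the total batch count drops. A minimal sketch of that idea; block_size and the helper name are illustrative, not this script's exact code:

def group_texts(examples, block_size=128):
    # Concatenate all tokenized examples end to end.
    concatenated = sum(examples["input_ids"], [])
    # Drop the remainder so every block is exactly block_size tokens.
    total_length = (len(concatenated) // block_size) * block_size
    blocks = [concatenated[i : i + block_size] for i in range(0, total_length, block_size)]
    # For CLM, the labels are the inputs themselves (the shift happens inside the model).
    return {"input_ids": blocks, "labels": [b[:] for b in blocks]}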

Configuring training parameters

All configurable parameters are visible and documented in conf/config.yaml. The defaults are chosen for quick training and are not tuned; you will need to experiment with and adjust them.
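The dotted key=value overrides above follow the Hydra/OmegaConf convention, so one way to inspect the effective defaults is to load the file directly. A sketch, assuming omegaconf is available (it ships with Hydra):

from omegaconf import OmegaConf

# Print the full default configuration from conf/config.yaml.
cfg = OmegaConf.load("conf/config.yaml")
print(OmegaConf.to_yaml(cfg))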

Serving custom models on bittensor

To serve your tuned model on bittensor, just override neuron.model_name with the path to your tuned model:

btcli run ..... --neuron.model_name=/home/user/models/my-tuned-gpt2
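Before pointing btcli at the path, it's worth confirming the directory loads as a standard transformers checkpoint. A quick check, using the example path from above:

from transformers import AutoModelForCausalLM

# If this loads without error, the directory is a usable checkpoint.
AutoModelForCausalLM.from_pretrained("/home/user/models/my-tuned-gpt2")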

Limitations & Warnings

Early stopping is not yet supported. Many features are implemented but not thoroughly tested; if you encounter an issue, reach out on Discord or (preferably) open an issue on this GitHub page.

License

MIT License

