Scaleformer

A scalable transformer with linear complexity.

Implementation details and the origins of the proposed approach are described in the reference paper.

Usage

The recommended way to use this project is in a Docker container built with the provided Dockerfile. For integration with VSCode, the vscode-docker extension provides a fully immersive solution. NVidia provides images with PyTorch pre-installed here. If you do not have a local GPU, an alternative is to use Google Colab.

To install the package in a local environment instead, simply run pip install ., or add the -e flag for a development (editable) install.
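Concretely, from the repository root:

pip install .      # standard install
pip install -e .   # editable install for development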

NOTE: to enable GPU in VSCode, add "runArgs": ["--runtime=nvidia"] to .devcontainer/devcontainer.json.
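For illustration, a minimal .devcontainer/devcontainer.json could look as follows (the name and image values are placeholders, not taken from this repository):

{
    // Placeholder image; substitute one of the NVidia PyTorch images.
    "name": "scaleformer",
    "image": "nvcr.io/nvidia/pytorch:xx.xx-py3",
    "runArgs": ["--runtime=nvidia"]
}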

Samples

A general workflow sample is provided here. The workflow produces a model in PyTorch *.pty format, which can later be used for predictions as in the following snippet.

import torch

# map_location avoids device errors if the model was saved on a GPU;
# weights_only=False is needed to unpickle a full module on PyTorch >= 2.6.
model = torch.load("models/model.pty", map_location="cpu", weights_only=False)
model.eval()

model.predict("Tom is gone")
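For reference, the snippet above assumes the workflow persisted the entire module with torch.save. A minimal sketch of that step follows; the nn.Linear stand-in is a hypothetical placeholder for the trained Scaleformer module, which is not constructed here:

import os

import torch
import torch.nn as nn

# Hypothetical stand-in for the trained Scaleformer module.
model = nn.Linear(4, 4)

# Persist the full module so torch.load can restore it as shown above.
os.makedirs("models", exist_ok=True)
torch.save(model, "models/model.pty")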

License

MIT License

