ag027592 / block-recurrent-transformer

PyTorch implementation of "Block-Recurrent Transformers" (Hutchins & Schlag et al., 2022)

Block Recurrent Transformer

A PyTorch implementation of Hutchins & Schlag et al. (2022). It owes a great deal to Phil Wang's x-transformers. Very much a work in progress.
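To make the architecture concrete, here is a minimal, self-contained sketch of the block-recurrent idea the repo implements: each layer processes the sequence one block at a time, tokens cross-attend to a small set of recurrent state vectors, and the state is updated with a learned gate before being carried to the next block. All class and parameter names below are illustrative assumptions for this sketch, not this repository's actual API; the paper's LSTM-style gating and causal masking are simplified away.

```python
# Minimal sketch of a block-recurrent cell (illustrative, not this repo's API).
import torch
from torch import nn


class BlockRecurrentCell(nn.Module):
    def __init__(self, dim: int, num_states: int = 32, heads: int = 8):
        super().__init__()
        self.state_init = nn.Parameter(torch.randn(num_states, dim) * 0.02)

        # Self-attention within the current block of tokens (causal mask omitted for brevity).
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Cross-attention from tokens to the recurrent state ("vertical" direction).
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Cross-attention from the state to the tokens ("horizontal" / recurrent direction).
        self.state_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

        self.token_ff = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        # Simple sigmoid gate standing in for the paper's LSTM-style state gates.
        self.state_gate = nn.Sequential(nn.Linear(2 * dim, dim), nn.Sigmoid())
        self.norm_tokens = nn.LayerNorm(dim)
        self.norm_ff = nn.LayerNorm(dim)

    def init_state(self, batch: int) -> torch.Tensor:
        # Learned initial state, broadcast over the batch.
        return self.state_init.unsqueeze(0).expand(batch, -1, -1)

    def forward(self, block: torch.Tensor, state: torch.Tensor):
        # block: (batch, block_len, dim), state: (batch, num_states, dim)
        x = self.norm_tokens(block)

        # Token path: self-attention over the block plus cross-attention to the state.
        tok_self, _ = self.self_attn(x, x, x)
        tok_cross, _ = self.cross_attn(x, state, state)
        tokens = block + tok_self + tok_cross
        tokens = tokens + self.token_ff(self.norm_ff(tokens))

        # State path: the state attends to the block, then is merged via a gate.
        state_upd, _ = self.state_attn(state, x, x)
        gate = self.state_gate(torch.cat([state, state_upd], dim=-1))
        new_state = gate * state_upd + (1 - gate) * state
        return tokens, new_state


if __name__ == "__main__":
    cell = BlockRecurrentCell(dim=64)
    tokens = torch.randn(2, 16, 64)       # one block of 16 tokens
    state = cell.init_state(batch=2)      # recurrent state carried across blocks
    out, state = cell(tokens, state)
    print(out.shape, state.shape)         # torch.Size([2, 16, 64]) torch.Size([2, 32, 64])
```

In the full model, a cell like this would sit as one layer of a transformer stack, with the returned state fed into the same layer for the next block of a long document.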

A Dockerfile, a requirements.txt, and an environment.yaml are all provided, because I love chaos.

About

License: MIT License


Languages

Language: Python 89.0%
Language: Dockerfile 11.0%