vatsalsaglani / Multi-Headed-Self-Attention-Transformer

A multi-headed self-attention Transformer, wrapped in a Python package, for training on a text corpus and generating text.
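
As a rough illustration of the core idea only (a minimal sketch, not this package's actual API), a multi-headed self-attention layer in PyTorch might look like the following. The class and parameter names here are hypothetical:

```python
# Hypothetical sketch of multi-head self-attention; not the package's real API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadSelfAttention(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One fused projection for queries, keys, and values, plus an output projection.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split the embedding into num_heads independent attention heads.
        q = q.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product attention, computed per head.
        scores = q @ k.transpose(-2, -1) / (self.head_dim ** 0.5)
        weights = F.softmax(scores, dim=-1)
        # Recombine the heads and project back to the embedding dimension.
        out = (weights @ v).transpose(1, 2).reshape(b, t, d)
        return self.out(out)

# Example: attend over a batch of 2 sequences of 10 token embeddings.
attn = MultiHeadSelfAttention(embed_dim=64, num_heads=8)
print(attn(torch.randn(2, 10, 64)).shape)  # torch.Size([2, 10, 64])
```

A full text-generation Transformer stacks blocks like this (with causal masking, feed-forward layers, and positional information) and decodes tokens autoregressively.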



This repository is not active

Languages

Python 100.0%