bio-attention

Simple implementations of attention modules adapted for the biological data domain.

Documentation: https://bio-attention.readthedocs.io/en/latest/index.html

Why use this package?

There are already plenty of excellent implementations out there that let you test the countless variants of transformers [1], [2]. This repository primarily differs from those in that it contains positional encoding schemes adapted to irregularly-spaced positions in sequences. This is useful in, for example: (1) the mass spectral domain (proteomics, metabolomics, ...), where transformers operate on sets of peaks, and (2) any kind of (epi)genomic data that measures sites of interest which are irregularly spaced along the genome (such as WGBS/CpG sites, ATAC-seq/chromatin accessibility, ...). Additionally, the attention definitions in this repository are compatible with multi-dimensional data, such as the MSAs used in some protein language models and in AlphaFold.
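
The core idea is that positional information can be computed from the raw measurement coordinates rather than from integer token indices. As an illustration only (plain PyTorch, not this package's API), a continuous sinusoidal encoding evaluated at arbitrary positions might look like this:

```python
import math
import torch

def sinusoidal_encoding(positions: torch.Tensor, dim: int) -> torch.Tensor:
    """Evaluate a sinusoidal encoding at arbitrary (possibly irregularly-spaced,
    non-integer) positions.

    positions: (batch, seq_len) coordinates, e.g. genomic loci or m/z values
    returns:   (batch, seq_len, dim) encodings
    """
    half = dim // 2
    # Log-spaced frequencies, as in the original transformer encoding.
    freqs = torch.exp(-math.log(10000.0) * torch.arange(half, dtype=torch.float32) / half)
    angles = positions.float().unsqueeze(-1) * freqs   # (batch, seq_len, half)
    return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

# Irregularly-spaced positions, e.g. CpG sites along a chromosome.
pos = torch.tensor([[12.0, 157.0, 158.0, 3021.0]])
enc = sinusoidal_encoding(pos, dim=64)                  # (1, 4, 64)
```

Because the encoding is a function of the coordinates themselves, regular and irregular spacings are handled identically.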

Install

Since PyTorch is a dependency of bio-attention, we recommend installing PyTorch independently first, as your system may require a specific build (e.g. to match your CUDA drivers).

After installing PyTorch, bio-attention can be installed with pip:

pip install bio-attention

Note

This package used to be a 2D sliding window attention package. The current formulation of the package no longer supports this type of attention (instead, I recommend performing axial attention, alternating sliding-window attention across one axis with full self-attention across the other). If you want to use 2D sliding window attention, check out the old version of this repo.
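
As a rough illustration of that recommendation (plain PyTorch with nn.MultiheadAttention, not code from this package), an axial block attends along one axis of a 2D grid and then along the other; in practice the attention along one of the two axes would be swapped for a sliding-window variant:

```python
import torch
from torch import nn

class AxialBlock(nn.Module):
    """Attend along the rows of a 2D grid, then along the columns.
    One of the two attentions could be replaced by a sliding-window
    variant to recover locality along that axis.
    """
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, h, w, d = x.shape
        # Attention across the width axis: every row is an independent sequence.
        rows = x.reshape(b * h, w, d)
        rows = rows + self.row_attn(rows, rows, rows, need_weights=False)[0]
        x = rows.reshape(b, h, w, d)
        # Attention across the height axis: every column is an independent sequence.
        cols = x.permute(0, 2, 1, 3).reshape(b * w, h, d)
        cols = cols + self.col_attn(cols, cols, cols, need_weights=False)[0]
        return cols.reshape(b, w, h, d).permute(0, 2, 1, 3)

x = torch.randn(2, 8, 16, 32)   # (batch, height, width, dim)
y = AxialBlock(32)(x)           # same shape as x
```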

Usage

Package roadmap

  • Embedding layers
    • Continuous
    • Discrete
    • Binary
    • Bin
  • [~] Positional encoding schemes
    • Sinusoidal
    • Embedding
    • Continuous
    • Rotary
    • AliBi
    • DPB
    • XL
    • Test support for multi-dimensional inputs
  • [~] Attention modules
    • Vanilla
    • Windowed
    • Random
    • Performer
    • Encoder
    • Decoder
    • Cross
    • Support for multi-dim inputs
  • Add a warning if non-increasing positional indices are used with a decoder attention
  • Add docs clarifying that clf tokens are automatically accounted for if no pos is provided for them
  • Tests
  • Typing
  • Docs
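
As one more illustration of how the schemes listed above extend to irregular positions: distance-based biases such as ALiBi depend only on pairwise differences between coordinates, so they can be computed directly from raw positions. The sketch below is plain PyTorch, not this package's API:

```python
import torch

def alibi_bias(positions: torch.Tensor, n_heads: int) -> torch.Tensor:
    """ALiBi-style attention bias computed from raw coordinates.

    positions: (batch, seq_len) coordinates, not necessarily evenly spaced
    returns:   (batch, n_heads, seq_len, seq_len) additive bias for attention logits
    """
    # Pairwise absolute distances between positions.
    dist = (positions.unsqueeze(-1) - positions.unsqueeze(-2)).abs()  # (b, n, n)
    # One geometric slope per head, as in the ALiBi paper.
    slopes = 2.0 ** (-8.0 * torch.arange(1, n_heads + 1) / n_heads)   # (h,)
    return -slopes.view(1, -1, 1, 1) * dist.unsqueeze(1)

pos = torch.tensor([[0.0, 3.0, 3.5, 120.0]])
bias = alibi_bias(pos, n_heads=8)   # (1, 8, 4, 4)
```

The resulting tensor would be added to the (scaled) attention scores before the softmax.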

License

bio-attention is licensed under the MIT License.

