Repositories under the ring-attention topic:
- Sequence-parallel attention for long-context LLM training and inference.
- Packaged implementation of Ring Attention with Blockwise Transformers for near-infinite context, written in JAX + Flax.
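
The core idea behind ring attention is to shard the sequence across devices and rotate key/value blocks around a ring, so each device sees the full sequence without ever materializing the full attention matrix. The sketch below is a hypothetical single-process NumPy simulation of that scheme (not the API of either repository above): the ring of hosts is a plain Python loop, and the blockwise softmax is accumulated online, as in blockwise/flash-style attention.

```python
import numpy as np

def full_attention(q, k, v):
    # Reference: standard softmax attention over the whole sequence.
    s = q @ k.T / np.sqrt(q.shape[-1])
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    return (p / p.sum(axis=-1, keepdims=True)) @ v

def ring_attention(q, k, v, num_hosts=4):
    # Shard the sequence: each simulated host owns one Q, K, V block.
    qs = np.split(q, num_hosts)
    ks = np.split(k, num_hosts)
    vs = np.split(v, num_hosts)
    d = q.shape[-1]
    outs = []
    for i in range(num_hosts):
        n_q = qs[i].shape[0]
        # Online-softmax accumulators for this host's queries.
        m = np.full((n_q, 1), -np.inf)   # running row max of scores
        l = np.zeros((n_q, 1))           # running softmax normalizer
        acc = np.zeros_like(qs[i])       # unnormalized output accumulator
        for step in range(num_hosts):
            # KV block arriving at host i on this ring step.
            j = (i + step) % num_hosts
            s = qs[i] @ ks[j].T / np.sqrt(d)
            m_new = np.maximum(m, s.max(axis=-1, keepdims=True))
            p = np.exp(s - m_new)
            scale = np.exp(m - m_new)    # rescale old stats to the new max
            l = l * scale + p.sum(axis=-1, keepdims=True)
            acc = acc * scale + p @ vs[j]
            m = m_new
        outs.append(acc / l)             # finalize the softmax division
    return np.concatenate(outs)
```

Because the online-softmax accumulation is exact, the blockwise ring result matches full attention to floating-point precision; peak memory per host scales with the block size rather than the full sequence length.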