yueshell / layer_norm

This CUDA extension implements fused dropout + residual + LayerNorm, building on Apex's FastLayerNorm. Major changes (a reference sketch of the fused semantics follows this list):

  • Add dropout and residual.
  • Make it work for both pre-norm and post-norm architectures.
  • Support more hidden dimensions (all dimensions divisible by 8, up to 8192).
  • Implement RMSNorm as an option.
  • Support layer norm with parallel residual (e.g., GPT-J, GPT-NeoX, PaLM).
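
For context, here is a minimal unfused PyTorch sketch of what the fused kernel computes. This is illustration only: the function name, argument names, and defaults are hypothetical, not the extension's actual API.

```python
import torch
import torch.nn.functional as F

def dropout_add_norm_reference(x0, residual, weight, bias=None, p=0.1,
                               eps=1e-5, rms_norm=False, prenorm=False):
    """Hypothetical unfused reference for norm(dropout(x0) + residual)."""
    # Dropout on the incoming branch, then add it into the residual stream.
    z = F.dropout(x0, p=p, training=True) + residual
    if rms_norm:
        # RMSNorm: scale by the root mean square; no mean subtraction, no bias.
        out = z * torch.rsqrt(z.pow(2).mean(dim=-1, keepdim=True) + eps) * weight
    else:
        out = F.layer_norm(z, (z.shape[-1],), weight, bias, eps)
    # Pre-norm blocks also need the un-normalized sum `z` to carry the
    # residual stream forward; post-norm blocks only need `out`.
    return (out, z) if prenorm else out
```

Fusing these steps into one kernel avoids round-tripping the dropout output and the residual sum through global memory, which is the point of the extension.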

If you want to use it for dimensions larger than 8k, please file an issue.

This extension has only been tested on A100s.

cd csrc/layer_norm && pip install .
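
After building, a quick import check confirms the extension compiled and loads. The module name `dropout_layer_norm` is an assumption carried over from the upstream flash-attention extension this builds on; adjust it to the name your build actually produces.

```python
# Hypothetical smoke test; `dropout_layer_norm` is the assumed module name.
import torch
import dropout_layer_norm  # noqa: F401

assert torch.cuda.is_available(), "this extension requires a CUDA GPU"
print("dropout_layer_norm extension imported OK")
```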

Languages

CUDA 73.1%, C++ 23.3%, Python 3.6%