Samuel Arcadinho's repositories
CAA
Steering Llama 2 with Contrastive Activation Addition
ecco
Explain, analyze, and visualize NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks explaining the behavior of Transformer-based language models (like GPT2, BERT, RoBERTa, T5, and T0).
marketplace
A centralized package manager for Logseq marketplace plugins.
pymde
Minimum-distortion embedding with PyTorch
RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
rwkv-long-range-arena
RWKV evaluated on the Long Range Arena (LRA) benchmark.
safari
Convolutions for Sequence Modeling
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
trl
Train transformer language models with reinforcement learning.