Princeton Natural Language Processing's repositories
tree-of-thought-llm
[NeurIPS 2023] Tree of Thoughts: Deliberate Problem Solving with Large Language Models
AutoCompressors
[EMNLP 2023] Adapting Language Models to Compress Long Contexts
CoFiPruning
[ACL 2022] Structured Pruning Learns Compact and Accurate Models https://arxiv.org/abs/2204.00408
OptiPrompt
[NAACL 2021] Factual Probing Is [MASK]: Learning vs. Learning to Recall https://arxiv.org/abs/2104.05240
TransformerPrograms
[NeurIPS 2023] Learning Transformer Programs
DinkyTrain
Princeton NLP's pre-training library based on fairseq with DeepSpeed kernel integration 🚃
LM-Kernel-FT
A Kernel-Based View of Language Model Fine-Tuning https://arxiv.org/abs/2210.05643
ShortcutGrammar
[EMNLP 2022] Finding Dataset Shortcuts with Grammar Induction https://arxiv.org/abs/2210.11560
rationale-robustness
[NAACL 2022] Can Rationalization Improve Robustness? https://arxiv.org/abs/2204.11790
WhatICLLearns
[ACL 2023 Findings] What In-Context Learning “Learns” In-Context: Disentangling Task Recognition and Task Learning
datamux-pretraining
MUX-PLMs: Pretraining LMs with Data Multiplexing
attribute-tagging
[LaReL 2022] Towards an Enhanced, Faithful, and Adaptable Web Interaction Environment
semsup_vae
Semantic Supervision: Enabling Generalization over Output Spaces