zdxdsw/inductive_counting_with_LMs

This work provides extensive empirical results on training LMs to count. We find that while traditional RNNs trivially achieve inductive counting, Transformers must rely on positional embeddings to count out-of-domain. Modern RNNs (e.g., RWKV, Mamba) also largely underperform traditional RNNs at generalizing counting inductively. A minimal sketch of such a task appears below.
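
To make the setting concrete, here is a minimal sketch of what an inductive (length-generalization) counting task might look like. The token scheme and the train/test length split are illustrative assumptions, not this repository's actual setup.

```python
# Minimal sketch of an out-of-domain (inductive) counting task.
# NOTE: the token alphabet and length cutoffs below are illustrative
# assumptions, not this repository's actual data configuration.
import random

def make_example(length: int) -> tuple[list[str], int]:
    """Random sequence over {a, b}; the label is the number of 'a's."""
    seq = [random.choice("ab") for _ in range(length)]
    return seq, seq.count("a")

# In-domain training lengths vs. longer out-of-domain test lengths:
train = [make_example(random.randint(1, 50)) for _ in range(10_000)]
test_ood = [make_example(random.randint(51, 200)) for _ in range(1_000)]

# A model "counts inductively" if, after training only on `train`,
# it remains accurate on the longer sequences in `test_ood`.
```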
