kyegomez / Andromeda

An all-new language model that processes ultra-long sequences of 100,000+ tokens, ultra-fast

Home Page: https://discord.gg/qUtxnK2NMf


Maybe parametrize tokenizer input max length to rise to Andromeda's magnificent potential

JacobFV opened this issue · comments

Not sure if you intended to require texts to be tokenized in 8k chunks:

model_max_length=8192
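
If the 8k cap is not intentional, one option is to expose the limit as a parameter instead of hard-coding it. A minimal sketch, assuming the tokenizer is built with Hugging Face's AutoTokenizer; the `build_tokenizer` helper and the `EleutherAI/gpt-neox-20b` checkpoint here are illustrative, not necessarily what the repo uses:

```python
from transformers import AutoTokenizer

def build_tokenizer(model_max_length: int = 8192):
    # Expose model_max_length as a parameter (defaulting to the current 8192)
    # so callers can tokenize sequences up to the model's full context length.
    return AutoTokenizer.from_pretrained(
        "EleutherAI/gpt-neox-20b",  # assumed checkpoint, for illustration
        model_max_length=model_max_length,
    )

# Example: allow 100k-token inputs to match the advertised context length.
tokenizer = build_tokenizer(model_max_length=100_000)
```

With this, the default behavior is unchanged, but long-context users are no longer forced to pre-chunk texts into 8k pieces.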
