Maybe parametrize tokenizer input max length to rise to Andromeda's magnificent potential
JacobFV opened this issue · comments
Jacob Valdez commented
I'm not sure whether you intended to require input texts to be tokenized in 8k chunks.
Line 20 in df7f8d5
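A minimal sketch of what parametrizing the limit could look like. This is illustrative only: the names `build_tokenize_fn`, `DEFAULT_MAX_LENGTH`, and the toy tokenizer are assumptions, not code from the Andromeda repo.

```python
# Hypothetical sketch: expose max_length as a parameter instead of
# hardcoding an 8k context, so callers can match the model's window.
# All names here are illustrative, not from the Andromeda codebase.

DEFAULT_MAX_LENGTH = 8192  # the hardcoded chunk size the issue refers to


def build_tokenize_fn(tokenizer, max_length: int = DEFAULT_MAX_LENGTH):
    """Return a tokenize function that truncates to a configurable length."""
    def tokenize(text: str):
        ids = tokenizer(text)
        return ids[:max_length]
    return tokenize


# Toy stand-in tokenizer: whitespace split mapped to integer ids.
class ToyTokenizer:
    def __init__(self):
        self.vocab = {}

    def __call__(self, text):
        return [self.vocab.setdefault(tok, len(self.vocab))
                for tok in text.split()]


tokenize = build_tokenize_fn(ToyTokenizer(), max_length=4)
print(tokenize("a b c d e f"))  # truncated to 4 ids: [0, 1, 2, 3]
```

With a keyword argument like this, the 8k default is preserved for existing callers while longer or shorter contexts become a one-line change.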
Eternal Reclaimer commented
@JacobFV Yes, I did.