InternLM / InternLM

Official release of the InternLM2 7B and 20B base and chat models, with 200K context support.

Home Page: https://internlm.intern-ai.org.cn/


Is fine-tuning with a 200K context supported, and what kind of configuration is needed? [QA]

Labmem009 opened this issue

Describe the question.

I am currently doing full-parameter fine-tuning of internlm2-chat-20b with XTuner using ZeRO-3, and on 8×A100 I can only manage a 2K context. Does InternEvo support fine-tuning with a 200K context, or at least a fairly long context of several tens of K? How should it be configured?
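For reference, a minimal sketch of the knobs involved in the setup described above, assuming XTuner's Python-style config files and CLI. The variable names (`max_length`, `pack_to_max_length`, `sequence_parallel_size`) and the launch command follow XTuner's documented conventions, but the config file name and all values here are illustrative assumptions, not confirmed by this issue.

```python
# Hypothetical fragment of an XTuner config file (XTuner configs are plain
# Python). Verify the exact names and defaults against your XTuner version.

max_length = 2048             # per-sample context length as used in the issue;
                              # activation memory grows roughly linearly with it
pack_to_max_length = True     # pack short samples into one max_length sequence

# Much longer contexts generally require splitting each sequence across GPUs
# (sequence parallelism) on top of ZeRO-3. Assumption: a recent XTuner
# exposes this as a config knob:
sequence_parallel_size = 8    # with 8 GPUs, each rank holds 1/8 of the tokens

# Example launch (XTuner CLI); the config name is hypothetical:
#   NPROC_PER_NODE=8 xtuner train internlm2_chat_20b_full_finetune.py \
#       --deepspeed deepspeed_zero3
```

Even with sequence parallelism, a 200K context for a 20B model would be constrained by per-GPU activation memory, so whether 8×A100 suffices depends on the parallelism degree and checkpointing settings actually available in the framework.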

This issue was marked as stale because it had been labeled invalid or awaiting response for 7 days without any further reply. It will be closed in another 7 days if the stale label is not removed or no further response is received.

This issue was closed because it had been stale for 7 days. Please open a new issue if you have a similar problem or any new updates.