A large language model, GPT-2, is trained on ~1.6M manuscript abstracts from arXiv. The model is then fine-tuned on abstracts from NASA's Astrophysics Data System (ADS) and used to generate text for a predictive keyboard.
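The actual pipeline relies on GPT-2 weights and a deep-learning stack, but the predictive-keyboard idea can be illustrated with a self-contained toy: a bigram model that counts word-to-word transitions in a corpus of abstracts and suggests the most likely next words. This is a minimal sketch of the concept only, not the repository's GPT-2 implementation; the `abstracts` corpus below is made up for demonstration.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count next-word frequencies for each word (toy stand-in for GPT-2)."""
    model = defaultdict(Counter)
    for text in corpus:
        words = text.lower().split()
        for word, nxt in zip(words, words[1:]):
            model[word][nxt] += 1
    return model

def suggest(model, word, k=3):
    """Return up to k most likely next words, as a predictive keyboard would."""
    return [w for w, _ in model[word.lower()].most_common(k)]

# Hypothetical miniature corpus standing in for the ~1.6M arXiv abstracts.
abstracts = [
    "we train a language model on abstracts",
    "we train a neural network on data",
    "the model generates text for a keyboard",
]

model = train_bigram(abstracts)
print(suggest(model, "we"))  # → ['train']
```

A real predictive keyboard would instead feed the typed prefix to the fine-tuned GPT-2 and rank its next-token probabilities, but the interface is the same: given the text so far, return a short list of likely continuations.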