A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
marxav opened this issue a year ago
In model.py, in the GPT class, the attribute holding the stack of layers, self.transformer.h, should be renamed to self.transformer.l: "h" is reminiscent of "heads", whereas "l" would be reminiscent of "layers".
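For context, a minimal sketch (not the actual minGPT code) of the naming concern: in GPT-2-style implementations the list stored under "h" has one entry per *layer*, even though the name looks like it refers to attention heads. The name "h" likely mirrors the OpenAI/HuggingFace GPT-2 checkpoint keys (e.g. `transformer.h.0.attn...`), so renaming it could complicate loading pretrained weights.

```python
# Hypothetical sketch of the structure in question; attribute and config
# names (h, n_layer, n_head) follow GPT-2 conventions, not verbatim minGPT code.
from types import SimpleNamespace

n_layer, n_head = 12, 12  # example config values

transformer = SimpleNamespace(
    # "h" holds one transformer Block per layer, despite the head-like name
    h=[f"Block_{i}" for i in range(n_layer)],
)

# The length of "h" tracks the number of layers, not the number of heads
assert len(transformer.h) == n_layer
```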