pradeepdev-1995 / BERT-models-finetuning

BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method of learning language representations. It is a bidirectional transformer model pre-trained on a large corpus with two objectives: masked language modeling and next sentence prediction.
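
Fine-tuning adapts such a pre-trained model to a downstream task by training all of its weights on labeled examples. As a minimal sketch, assuming the Hugging Face transformers library and PyTorch, the snippet below fine-tunes `bert-base-uncased` for binary sentence classification on a hypothetical toy batch; the repository's actual scripts may differ in data handling and hyperparameters.

```python
# Minimal BERT fine-tuning sketch (Hugging Face transformers + PyTorch).
# The texts and labels are hypothetical placeholders, not real data.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # binary classification head
)

texts = ["great movie", "terrible plot"]  # hypothetical training examples
labels = torch.tensor([1, 0])

# Tokenize the batch into input IDs and attention masks.
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few gradient steps over the toy batch
    outputs = model(**inputs, labels=labels)  # forward pass computes the loss
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```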
