amiruzzaman1 / Distil-Bert

DistilBERT is a distilled version of BERT (Bidirectional Encoder Representations from Transformers). It is known for its efficiency and reduced computational requirements while retaining most of BERT's language understanding capabilities.

Home Page: https://amiruzzaman-distilbert.hf.space/#question-answering-model-distilbert
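The linked demo runs DistilBERT for extractive question answering. A minimal sketch of the same task, assuming the Hugging Face `transformers` library is installed and the SQuAD-fine-tuned checkpoint `distilbert-base-cased-distilled-squad` can be downloaded (the checkpoint name is an assumption; this repo's demo may use a different fine-tune):

```python
# Minimal question-answering sketch with DistilBERT.
# Assumes `transformers` is installed and network access to download the model.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",  # assumed checkpoint
)

result = qa(
    question="What is DistilBERT?",
    context=(
        "DistilBERT is a distilled version of BERT that retains most of "
        "BERT's language understanding while being smaller and faster."
    ),
)
# `result` is a dict with keys "answer", "score", "start", and "end".
print(result["answer"])
```

The pipeline extracts an answer span from the supplied context rather than generating free-form text, which is what makes the distilled model fast enough for an interactive demo.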

