💫 🤖 spaCy Curated Transformers

This package provides spaCy components and model architectures for using a curated set of transformer models in spaCy via the curated-transformers library.


Features

  • Use pretrained models based on one of the following architectures to power your spaCy pipeline:
    • ALBERT
    • BERT
    • CamemBERT
    • RoBERTa
    • XLM-RoBERTa
  • All the features supported by spacy-transformers, such as Hugging Face Hub support, multi-task learning, the extensible config system, and out-of-the-box serialization (see the registry sketch after this list)
  • Deep integration into spaCy, which lays the groundwork for deployment-focused features such as distillation and quantization
  • Minimal dependencies
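
The architectures listed above are registered with spaCy's config registry when the package is installed, so they can be referenced from a training config or inspected from Python. A minimal sketch of such an inspection; the spacy-curated-transformers. name prefix is an assumption based on the package name:

import spacy

# Sketch: list the model architectures contributed by this package.
# The "spacy-curated-transformers." prefix is an assumption based on
# the package name; print all names if nothing matches.
architectures = spacy.registry.architectures.get_all()
for name in sorted(architectures):
    if name.startswith("spacy-curated-transformers."):
        print(name)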

⏳ Installation

Installing the package with pip automatically installs all of its dependencies.

pip install spacy-curated-transformers
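
After installation, the package's components become visible to spaCy through entry points. A quick, hedged check; the factory name curated_transformer is an assumption, so inspect the printed names if it is not found:

import spacy

# Sketch: verify that spaCy can see the package's pipeline factories.
# The factory name "curated_transformer" is an assumption.
factories = spacy.registry.factories.get_all()
print("curated_transformer" in factories)
print([name for name in factories if "curated" in name])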

🚀 Quickstart

An example project is provided in the project directory.
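
Once a pipeline containing a curated transformer component has been trained (for example with the project above), it can be used like any other spaCy pipeline. A minimal sketch; the path is a placeholder, and the Doc._.trf_data extension is assumed to follow spacy-transformers conventions:

import spacy

# Sketch: load a trained pipeline that contains a curated transformer
# component. "./training/model-best" is a placeholder path for the output
# of the example project; Doc._.trf_data is assumed to hold the
# transformer output, following spacy-transformers conventions.
nlp = spacy.load("./training/model-best")
doc = nlp("spaCy Curated Transformers keeps dependencies minimal.")

print(nlp.pipe_names)        # e.g. ['transformer', 'tagger', ...]
print(type(doc._.trf_data))  # transformer output attached to the Doc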

📖 Documentation

Bug reports and other issues

Please use spaCy's issue tracker to report a bug, or open a new thread on the discussion board for any other issue.

About

spaCy entry points for Curated Transformers

License: MIT

