Repositories under the transformers-library topic:
PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
A minimal implementation of the DIET classifier in PyTorch.
Minimal example of using a traced Hugging Face Transformers model with libtorch.
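A minimal sketch of the Python side of that workflow, assuming an illustrative checkpoint name: the model is exported with `torch.jit.trace` so the resulting file can be loaded from C++ via `torch::jit::load`.

```python
# Sketch: trace a Hugging Face model so it can be loaded from libtorch in C++.
# The checkpoint name and example input are assumptions for illustration.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "distilbert-base-uncased-finetuned-sst-2-english"  # assumed example checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, torchscript=True)
model.eval()

# Build example inputs for tracing; the traced graph is specialized to these shapes.
inputs = tokenizer("Tracing example", return_tensors="pt")
traced = torch.jit.trace(model, (inputs["input_ids"], inputs["attention_mask"]))
traced.save("traced_model.pt")  # load this file from C++ with torch::jit::load
```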
A library for measuring sentential semantic similarity by comparing the spatial distance between BERT embeddings.
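A minimal sketch of that idea, assuming plain `bert-base-uncased` and mean pooling (the repository's exact pooling and distance choices may differ): sentences are embedded by averaging BERT token states and compared with cosine similarity.

```python
# Sketch: sentence similarity from mean-pooled BERT embeddings (model choice is an assumption).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(sentence: str) -> torch.Tensor:
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state      # (1, seq_len, hidden)
    mask = enc["attention_mask"].unsqueeze(-1)       # ignore padding positions
    return (hidden * mask).sum(1) / mask.sum(1)      # mean-pooled sentence vector

a, b = embed("A cat sits on the mat."), embed("A kitten rests on the rug.")
print(torch.nn.functional.cosine_similarity(a, b).item())
```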
An ASR (Automatic Speech Recognition) adversarial attack repository.
AllenNLP integration for Shiba: Japanese CANINE model
Multi-label text classification by fine-tuning BERT and XLNet, with deployment via Flask.
Contextual Emotion Detection in Text (DoubleDistilBert Model)
Training a BERT model from scratch.
A study benchmarking Whisper-based ASRs in Malayalam.
A dedicated repo collecting different Music Transformer implementations (Reformer, XTransformer, Sinkhorn, etc.).
"Open Source Models with Hugging Face" course empowers you with the skills to leverage open-source models from the Hugging Face Hub for various tasks in NLP, audio, image, and multimodal domains.
Resources for the Transformers library.
PyTorch code for cross-modal retrieval on Flickr8k/30k using BERT and EfficientNet.
ML/NLP project for extracting names, Aadhaar IDs, and PAN numbers. It uses Pytesseract OCR to extract text from images, a Hugging Face NER model for name extraction, and regular expressions to extract PAN and Aadhaar ID numbers.
How to fine-tune transformer models for text classification using Hugging Face Transformers and Datasets
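A minimal sketch of that workflow using the `Trainer` API; the dataset, checkpoint, and hyperparameters below are assumptions for illustration, not the repository's own configuration.

```python
# Sketch: fine-tune a Transformer for text classification with Transformers + Datasets.
# Dataset and model choices are illustrative assumptions.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("imdb")                                  # assumed example dataset
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

tokenized = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].select(range(500)),
    tokenizer=tokenizer,   # enables dynamic padding via the default data collator
)
trainer.train()
```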
Re-implementation of the method proposed in the paper "Style Aligned Image Generation via Shared Attention" in PyTorch.
GUI-based YouTube transcriptor developed using Flask.
Exploring the use of Transformer models for various NLP tasks with a pretrained BERT model from Hugging Face.
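One simple way to exercise a pretrained BERT checkpoint on such a task is the `pipeline` API; a minimal sketch, with the task and checkpoint chosen as assumptions:

```python
# Sketch: masked-token prediction with a pretrained BERT via the pipeline API.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for candidate in fill_mask("The capital of France is [MASK]."):
    print(candidate["token_str"], round(candidate["score"], 3))
```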
A Study of Attentions in a Question Answering System
A Python application that uses the Flask framework to build an interactive chatbot powered by Microsoft's DialoGPT model, generating conversational responses to user input.
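A minimal sketch of such a Flask + DialoGPT setup; the route name, generation settings, and single-turn handling are assumptions, not the repository's exact implementation.

```python
# Sketch: a Flask endpoint that generates replies with Microsoft's DialoGPT model.
# Route name and generation parameters are illustrative assumptions.
from flask import Flask, jsonify, request
from transformers import AutoModelForCausalLM, AutoTokenizer

app = Flask(__name__)
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

@app.route("/chat", methods=["POST"])
def chat():
    text = request.json["message"]
    # DialoGPT expects the user turn terminated by the EOS token.
    input_ids = tokenizer.encode(text + tokenizer.eos_token, return_tensors="pt")
    output_ids = model.generate(
        input_ids, max_length=200, pad_token_id=tokenizer.eos_token_id)
    # Decode only the newly generated tokens after the prompt.
    reply = tokenizer.decode(output_ids[0, input_ids.shape[-1]:],
                             skip_special_tokens=True)
    return jsonify({"reply": reply})

if __name__ == "__main__":
    app.run()
```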