There are 14 repositories under the huggingface-transformers topic.
🧠💬 Articles I wrote about machine learning, archived from MachineCurve.com.
Chinese NLP solutions (large models, data, models, training, inference)
Simple UI for LLM Model Finetuning
Social networking platform with automated content moderation and context-based authentication system
Chronos: Pretrained (Language) Models for Probabilistic Time Series Forecasting
A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT2, decoders, etc.) on CPU and GPU.
Pocket-Sized Multimodal AI for content understanding and generation across multilingual texts, images, and 🔜 video, up to 5x faster than OpenAI CLIP and LLaVA 🖼️ & 🖋️
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning, training, and prompt engineering examples. A bonus section covers ChatGPT, GPT-3.5-turbo, GPT-4, and DALL-E, including jump-starting GPT-4, speech-to-text, text-to-speech, text-to-image generation with DALL-E, Google Cloud AI, HuggingGPT, and more
Multimodal model for text and tabular data with HuggingFace transformers as building block for text data
Fast Inference Solutions for BLOOM
AI-first process automation with large models: language (LLMs), action (LAMs), multimodal (LMMs), and visual language (VLMs)
Learn Cloud Applied Generative AI Engineering (GenEng) using OpenAI, Gemini, Streamlit, Containers, Serverless, Postgres, LangChain, Pinecone, and Next.js
[EMNLP 2022] Unifying and multi-tasking structured knowledge grounding with language models
Guide: fine-tune GPT2-XL (1.5 billion parameters) and GPT-NEO (2.7 billion) on a single GPU with Hugging Face Transformers using DeepSpeed
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
Low latency JSON generation using LLMs ⚡️
A tool for generating function arguments and choosing what function to call with local LLMs
Extract knowledge from all your information sources using GPT and other language models. Index them and run Q&A sessions over those sources.
Package to compute Mauve, a similarity score between neural text and human text. Install with `pip install mauve-text`.
Amazon SageMaker Local Mode Examples
Build and train state-of-the-art natural language processing models using BERT
Phoneme recognition using the pre-trained models Wav2vec2, HuBERT, and WavLM. This project compares three self-supervised models, Wav2vec (2019, 2020), HuBERT (2021), and WavLM (2022), all pretrained on a corpus of English speech, and uses them in various ways to perform phoneme recognition for different languages with a network trained with the Connectionist Temporal Classification (CTC) algorithm.
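To illustrate the CTC decoding step mentioned above, here is a minimal, hypothetical sketch of greedy CTC collapse: the network emits one label per frame (including a blank symbol), and decoding collapses consecutive repeats and drops blanks to recover the phoneme sequence. The `BLANK` symbol and example phonemes are illustrative, not taken from the repository.

```python
# Hypothetical greedy CTC decoding sketch (not the repository's code).
BLANK = "_"  # blank token emitted by a CTC-trained network

def ctc_collapse(frame_labels):
    """Collapse repeated frame labels, then remove blank tokens."""
    phonemes = []
    prev = None
    for label in frame_labels:
        if label != prev and label != BLANK:
            phonemes.append(label)
        prev = label
    return phonemes

# Frame-level outputs for a toy utterance:
print(ctc_collapse(["h", "h", "_", "ə", "ə", "_", "_", "l", "oʊ"]))
# ['h', 'ə', 'l', 'oʊ']
```

In practice the frame labels come from the argmax over the network's per-frame label distribution; beam-search decoders refine this greedy collapse.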
Extending Stable Diffusion prompts with suitable style cues using text generation
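The core idea of prompt extension can be sketched in a few lines; this is a hypothetical illustration (the function name and the fixed cue list are assumptions), whereas the repository generates the style cues with a text-generation model rather than using a static list.

```python
# Hypothetical sketch: append comma-separated style cues to a base
# Stable Diffusion prompt. The repository generates cues with a
# language model; here they are supplied as a plain list.
def extend_prompt(prompt, style_cues):
    """Join a base prompt with style cues, comma-separated."""
    return ", ".join([prompt.strip()] + [c.strip() for c in style_cues])

print(extend_prompt("a castle on a hill", ["oil painting", "dramatic lighting"]))
# a castle on a hill, oil painting, dramatic lighting
```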
Easy-Translate is a script for translating large text files with a SINGLE COMMAND. It is designed to be as easy as possible for beginners and as seamless and customizable as possible for advanced users.
Indonesian Language Models and Their Usage
Dreambooth implementation based on Stable Diffusion with minimal code.
A codebase that makes differentially private training of transformers easy.
HuggingFace Transformers tutorial using the KLUE dataset