There are 62 repositories under the bert topic.
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
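As a quick taste of the library, a minimal sketch using its pipeline API with the stock bert-base-uncased checkpoint for masked-word prediction:

```python
# Minimal masked-language-model demo via the transformers pipeline API.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the [MASK] token; each candidate comes with a probability.
for pred in fill_mask("Paris is the [MASK] of France."):
    print(pred["token_str"], round(pred["score"], 3))
```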
Natural Language Processing Tutorial for Deep Learning Researchers
🏄 Embed/reason/rank images and sentences with CLIP models
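For illustration, a sketch of the embed-and-rank idea using the OpenAI reference clip package rather than this repo's own client API (the image path is hypothetical):

```python
# Rank candidate captions against an image in CLIP's joint embedding space.
import clip
import torch
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

image = preprocess(Image.open("photo.jpg")).unsqueeze(0).to(device)  # hypothetical file
texts = clip.tokenize(["a dog", "a cat", "a surfer"]).to(device)

with torch.no_grad():
    logits_per_image, _ = model(image, texts)
    print(logits_per_image.softmax(dim=-1))  # probabilities over the captions
```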
👑 Easy-to-use and powerful NLP library with a 🤗 awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including 🗂 Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis, and 🖼 Diffusion AIGC systems.
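A minimal sketch of PaddleNLP's Taskflow one-liner interface, assuming the stock sentiment-analysis task (requires paddlepaddle and paddlenlp installed):

```python
# Taskflow bundles tokenizer + pretrained model behind a single callable.
from paddlenlp import Taskflow

senta = Taskflow("sentiment_analysis")
print(senta("这个产品用起来真的很流畅"))  # returns the label and a confidence score
```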
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series).
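Whole-word masking differs from BERT's original scheme in that once any WordPiece of a word is selected, all of the word's sub-tokens are masked together. A standalone schematic of the selection step (illustrative only, not code from this repo):

```python
import random

# WordPieces starting with "##" continue the previous word, so a masking
# decision applies to the whole group of pieces, not to one piece in isolation.
def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]"):
    groups, out = [], list(tokens)
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and groups:
            groups[-1].append(i)   # continuation piece joins the current word
        else:
            groups.append([i])     # a new word starts here
    for group in groups:
        if random.random() < mask_prob:
            for i in group:        # mask every piece of the chosen word
                out[i] = mask_token
    return out

print(whole_word_mask(["the", "phil", "##har", "##mon", "##ic", "played"]))
```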
Large-scale Chinese corpus for NLP.
:mag: Haystack is an open source NLP framework to interact with your data using Transformer models and LLMs (GPT-3 and the like). Haystack offers production-ready tools to quickly build ChatGPT-like question answering, semantic search, text generation, and more.
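A minimal extractive-QA sketch following Haystack 1.x tutorial patterns (the reader checkpoint is one of the examples from deepset's docs):

```python
# A retriever narrows the corpus; a BERT-style reader extracts the answer span.
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

store = InMemoryDocumentStore(use_bm25=True)
store.write_documents([{"content": "BERT was released by Google in 2018."}])

pipe = ExtractiveQAPipeline(
    reader=FARMReader(model_name_or_path="deepset/roberta-base-squad2"),
    retriever=BM25Retriever(document_store=store),
)
result = pipe.run(query="Who released BERT?", params={"Retriever": {"top_k": 3}})
print(result["answers"][0].answer)
```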
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
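A quick sketch: load the pretrained BERT WordPiece tokenizer from the Hugging Face hub and inspect the pieces it produces:

```python
from tokenizers import Tokenizer

tok = Tokenizer.from_pretrained("bert-base-uncased")
enc = tok.encode("Tokenization is fast here.")
print(enc.tokens)  # WordPiece tokens, e.g. ['[CLS]', 'token', '##ization', ...]
print(enc.ids)     # vocabulary ids that would be fed to the model
```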
Official implementations of various ERNIE-family pre-training models, covering Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
PyTorch implementation of Google AI's 2018 BERT.
TensorFlow solution for the NER task using a BiLSTM-CRF model with Google BERT fine-tuning, plus private server deployment.
pycorrector is a toolkit for text error correction: implementations of Kenlm, ConvSeq2Seq, BERT, MacBERT, ELECTRA, ERNIE, Transformer, T5, and other models, ready to use out of the box.
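A minimal sketch of pycorrector's long-standing module-level API (newer releases also expose Corrector classes; the example sentence follows its README):

```python
import pycorrector

# Returns the corrected sentence plus a list of (wrong, right, begin, end) edits.
corrected, detail = pycorrector.correct("少先队员因该为老人让坐")
print(corrected)  # expected: 少先队员应该为老人让座
print(detail)
```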
This repository contains demos I made with the Transformers library by HuggingFace.
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations, with large-scale Chinese pre-trained ALBERT models.
State of the Art Natural Language Processing
Chinese Language Understanding Evaluation benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard.
This repository collects reading notes on top-conference papers relevant to NLP algorithm engineers.
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Must-read papers on prompt-based tuning for pre-trained language models.
Transformer-related optimization, including BERT and GPT.
Implementation of BERT that can load the official pre-trained models for feature extraction and prediction.
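A feature-extraction sketch following the keras-bert README; the checkpoint directory is assumed to be an official Google BERT release you have downloaded:

```python
from keras_bert import extract_embeddings

model_path = "uncased_L-12_H-768_A-12"  # path to the unpacked BERT checkpoint
texts = ["all work and no play", "makes jack a dull boy"]

# One (sequence_length, 768) array per input text.
embeddings = extract_embeddings(model_path, texts)
```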
Kashgari is a production-level NLP transfer-learning framework built on top of tf.keras for text labeling and text classification; it includes Word2Vec, BERT, and GPT-2 language embeddings.
Toolkit for Machine Learning, Natural Language Processing, and Text Generation, in TensorFlow. This is part of the CASL project: http://casl-project.ai/
Pre-trained Chinese RoBERTa models: RoBERTa for Chinese.
A quick start for AI theory and hands-on applications: fundamentals, ML, DL, NLP-BERT, and competitions. Includes extensive annotations and datasets, so that everyone can understand and reproduce the work.
A curated list of pretrained sentence and word embedding models
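For a taste of what such models do, a sketch with one family the list covers, sentence-transformers (an assumption: models in the list expose various APIs, this is just one):

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
emb = model.encode(["How old are you?", "What is your age?"])
print(util.cos_sim(emb[0], emb[1]))  # paraphrases land close in embedding space
```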
Introductory, advanced, and specialty deep learning courses, academic case studies, industrial practice case studies, a deep learning knowledge encyclopedia, and an interview question bank.
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.