There are 27 repositories under the bart topic.
Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2, ...)
Self-contained Machine Learning and Natural Language Processing library in Go
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
TextGen: implementations of text generation models, including LLaMA, ChatGLM, BLOOM, GPT2, Seq2Seq, BART, T5, SongNet, and UDA, with out-of-the-box training and inference.
Multilingual/multi-domain question generation datasets, models, and a Python library.
Cybertron: the home planet of the Transformers in Go
Build and train state-of-the-art natural language processing models using BERT
Code for the AAAI 2022 paper "Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification"
Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper (a simplified mask-filling sketch appears after this list)
Pytorch implementation of baseline models of KQA Pro, a large-scale dataset of complex question answering over knowledge base.
BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese (INTERSPEECH 2022)
Abstractive and extractive text summarization using Transformers (a BART summarization sketch appears after this list)
NAACL 2021 - Progressive Generation of Long Text
Official implementation of the paper "IteraTeR: Understanding Iterative Revision from Human-Written Text" (ACL 2022)
Automated Categorization: a neural-network solution that automatically categorizes bank transaction descriptions, reducing manual effort while preserving privacy.
Source code and dataset for the paper "Call for Customized Conversation: Customized Conversation Grounding Persona and Knowledge"
TrAVis: Visualise BERT attention in your browser
An English-to-Cantonese machine translation model
Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.)
Abstractive text summarization by fine-tuning seq2seq models.
Script to pre-train Hugging Face Transformers BART with TensorFlow 2
Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer (ACL 2021)
JAX implementation of the bart-base model
Korean-domain question generation module based on KoBART, built around KorQuAD
A project that improves the quality and accuracy of Vietnamese text.
Point-and-click bartCause analysis and causal inference education
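
Several of the summarization entries above build on Hugging Face Transformers. As a point of reference, here is a minimal sketch of BART-based abstractive summarization, assuming the `transformers` package and the public `facebook/bart-large-cnn` checkpoint; it illustrates the general technique, not the code of any specific repository listed here.

```python
# Minimal BART summarization sketch. Assumes `pip install transformers torch`
# and network access to download the public facebook/bart-large-cnn checkpoint.
from transformers import pipeline

# BART fine-tuned on CNN/DailyMail for abstractive summarization.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "BART is a denoising sequence-to-sequence model: text is corrupted with "
    "a noising function and the model learns to reconstruct the original. "
    "This pre-training objective works well for generation tasks such as "
    "summarization, translation, and question generation."
)

# min_length/max_length bound the summary length in tokens;
# do_sample=False gives deterministic beam-search output.
summary = summarizer(article, min_length=10, max_length=60, do_sample=False)
print(summary[0]["summary_text"])
```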
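
The data-augmentation entry above generates new training examples with pre-trained models. Below is a much-simplified sketch of one related idea, mask-filling with BART to produce variants of a training sentence; it assumes the `transformers` fill-mask pipeline and the public `facebook/bart-base` checkpoint, and is not the paper's actual method.

```python
# Simplified data-augmentation sketch via mask filling. Assumes
# `pip install transformers torch` and the public facebook/bart-base checkpoint.
# This illustrates the general idea, not the method from the paper above.
from transformers import pipeline

fill = pipeline("fill-mask", model="facebook/bart-base")

# Mask one token of a training example and keep the model's top replacements
# as augmented variants.
candidates = fill("The service at the restaurant was <mask>.", top_k=5)
for c in candidates:
    # Each candidate is a dict with the filled-in sentence and its score.
    print(f"{c['score']:.3f}  {c['sequence']}")
```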