codingchild's repositories
MonaCoBERT
Monotonic Attention based ConvBERT for Knowledge Tracing
lm-trainer
forked from easy-lm-trainer
-lecture-pytorch_basic
[lecture]pytorch_basic
BiDKT
BiDKT
coldstart_KT
Estimating cold-start problems in knowledge tracing (KT)
BERT_Media
BERT_Media
cl4kt_clone
cl4kt_clone
codingchild2424
Config files for my GitHub profile.
deepart
deepart
Diffusion-LM
Diffusion-LM
django-blog
Clone-coding a Django blog
easy-lm-trainer
Sample code for training an LM with a minimal setup
FeedbackPrize_BERT
FeedbackPrize_BERT
KoBART-summarization
Summarization module based on KoBART
kobart_summary
kobart_summary
LMMArxivTalk
[Google Meet] LMM Arxiv Casual Talk
Megatron-LM
Ongoing research training transformer models at scale
MF-DAKT
source code for model MF-DAKT
movie_app_2019
good
nlp-datasets
A curated collection of NLP datasets
NLP_BASELINE
This repository contains baseline natural language processing models for learning.
oslo
OSLO: Open Source framework for Large-scale model Optimization
parallelformers
Parallelformers: An Efficient Model Parallelization Toolkit for Deployment
RWKV-LM
RWKV is an RNN with transformer-level performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
title_extraction
title_extraction
toolformer-pytorch
Implementation of Toolformer: Language Models Can Teach Themselves to Use Tools, by Meta AI
underscore
JavaScript's utility _ belt