Seonghwan Kim's repositories
WellnessConversation-LanguageModel
A psychological counseling dialogue model built on a Korean language model
pytorch-transformer
PyTorch Transformer implementation, language models (BERT MLM, ELECTRA), and machine translation tests.
pytorch-gpt-x
Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism.
pytorch-meena
Implementation of Google Meena for open-domain conversation.
reformer-language-model
Reformer Language Model
NamuwikiExtractor
NamuwikiExtractor for extracting clean text from Namuwiki dumps
BERT-QA-APP
BERT QA app using React Native, Node.js, MongoDB, and TensorFlow
pytorch-poly-chatbot
Chatbot using BERT and a Poly-encoder
WikiExtractor
Extracts and cleans text from Wikipedia database dump and stores output in a number of files of similar size in a given directory.
dacon-ko-summarization
Dacon Korean document extraction and summarization AI competition
bert
TensorFlow code and pre-trained models for BERT
data-preprocess
Data preprocessing
example-node-server
Example Node Server w/ Babel
faiss-serving
A lightweight Faiss HTTP Server 🚀
maxtext
A simple, performant and scalable Jax LLM!
Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
metaseq
Repo for external large-scale work
narrativeKoGPT2
Story-generation AI using koGPT-2
pytorch-performer
Simple Performer implementation in PyTorch
RL-Adventure
PyTorch implementation of DQN / DDQN / prioritized replay / noisy networks / distributional values / Rainbow / hierarchical RL
season2
Jiphyeonjeon Season 2
test
Measuring Massive Multitask Language Understanding | ICLR 2021