Junbum Lee's repositories
InfiniTransformer
Unofficial PyTorch/🤗 Transformers (Gemma/Llama 3) implementation of "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention"
BitNet-Transformers
0️⃣1️⃣🤗 BitNet-Transformers: Hugging Face Transformers implementation of "BitNet: Scaling 1-bit Transformers for Large Language Models" in PyTorch with the Llama (2) architecture
easy-lm-trainer
🤗 Sample code for training an LM with minimal setup
ko-lm-evaluation-harness
Forked from https://github.com/EleutherAI/lm-evaluation-harness/commit/1f66adc
Gemma-EasyLM
Train Gemma on TPU/GPU! (Codebase for training the Gemma-Ko series)
data_camp_wcr
Source code for the "Practical Web Crawling with Python" CAMP lectures, cohorts 1-2
data_camp_wcr_3
Source code for the "Practical Web Crawling with Python" CAMP, cohort 3
naverNewsContentsCrawler
megatronlm_dataset_autotokenizer
Megatron-LM/GPT-NeoX compatible Text Encoder with 🤗Transformers AutoTokenizer.
Eigen-MoRA
Eigen-MoRA: MoRA with an eigenvector-based non-parametric projection
alpaca-lora
Instruct-tune LLaMA on consumer hardware
heroku-helloworld
Test DRF Project for Heroku Deploy
SmartSpamFilter-QnA
Smart Spam Filter QnA
20171205-NexonTalk
Crawler examples used in the Nexon Talk held on 2017/12/05
accelerate
🚀 A simple way to train and use PyTorch models with multi-GPU, TPU, and mixed precision
MS-AMP-Examples_OLD
Forked from the MS-AMP package examples.
ordinal-log-loss
Repository of the COLING 2022 paper: "Ordinal Log Loss: A Simple Log-Based Loss Function for Ordinal Text Classification"
SNUE-P-KAKAO-BOT
KakaoTalk bot for SNUE-P, built with AWS Lambda (Zappa)