Daehyeon Kong's repositories
pyllama
LLaMA: Open and Efficient Foundation Language Models
KoSentenceBERT_revised
detail_filter
KoBERT
Korean BERT pre-trained cased (KoBERT)
N2T
Posts from Notion to Tistory.
self-supervised-learning-narratives-1
Self-supervised learning, read in reverse, Part 1
Advances-in-Label-Noise-Learning
A curated (most recent) list of resources for Learning with Noisy Labels
vdsl_meal_choices
vdsl lab restaurant options
ko-sentence-transformers
Sentence embeddings using Korean pre-trained models
DeepClustering
Methods and Implementations of Deep Clustering
korea-covid-19-remaining-vaccine-macro
A macro for checking and reserving leftover COVID-19 vaccines
KoSentenceBERT_SKTBERT
Korean Sentence BERT trained using the code released with Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks (EMNLP 2019), the KorNLUDatasets released by the kakaobrain team, and SKT KoBERT.
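As a minimal illustration of what a Sentence-BERT model is used for: it maps each sentence to a fixed-size vector, and sentence pairs are then scored by cosine similarity. The sketch below uses toy hand-written vectors in place of real model outputs, so the numbers are hypothetical; only the similarity computation itself is shown.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "sentence embeddings" (hypothetical values standing
# in for the output of a model such as KoSentenceBERT_SKTBERT).
emb_a = np.array([0.1, 0.3, -0.2, 0.7])
emb_b = np.array([0.1, 0.25, -0.1, 0.6])

score = cosine_similarity(emb_a, emb_b)  # close to 1.0 for similar sentences
```

In practice the two vectors would come from encoding two sentences with the trained model, and a score near 1.0 indicates semantic similarity.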
tf
🤗Transformers: State-of-the-art Natural Language Processing for PyTorch, TensorFlow, and JAX.
DistilKoBERT
Distillation of KoBERT from SKTBrain (Lightweight KoBERT)
git_test
nsml git clone test