aiyinyuedejustin's repositories
A-Guide-to-Retrieval-Augmented-LLM
An introduction to retrieval-augmented large language models
app-builder
appbuilder-sdk: the Qianfan AppBuilder-SDK helps developers build AI-native applications quickly and flexibly
bert
TensorFlow code and pre-trained models for BERT
chapyter
Chapyter: ChatGPT Code Interpreter in Jupyter Notebooks
DataScienceCrashCourse
鲸析's flagship course: transform how you think about Python data science!
Duplicate-Question-Pair
Given a pair of questions, determine whether they are duplicates of each other.
EmoLLM_1
A large language model for mental health; fine-tuned on base models including InternLM2, Qwen, ChatGLM, Baichuan, DeepSeek, and Mixtral
everyday_leet
A little practice every day; practice makes perfect!
FlagEmbedding
Dense Retrieval and Retrieval-augmented LLMs
gorilla_justin
Gorilla: An API store for LLMs
huanhuan-chat
Chat-甄嬛 is a chat language model that imitates the speaking style of Zhen Huan, obtained by LoRA fine-tuning ChatGLM2 on all of Zhen Huan's lines and dialogue from the script of 《甄嬛传》 (Empresses in the Palace).
leetcode-master
《代码随想录》 LeetCode problem-solving guide: a recommended order for 200 classic problems, with 600k words of detailed illustrated explanations, video analyses of difficult points, 50+ mind maps, and solutions in C++, Java, Python, Go, JavaScript, and other languages. No more getting lost while studying algorithms! 🔥🔥 Take a look; you'll wish you had found it sooner! 🚀
linkedin-skill-assessments-quizzes
Full reference of LinkedIn skill-assessment answers (2022): aws-lambda, rest-api, javascript, react, git, html, jquery, mongodb, java, Go, python, machine-learning, power-point, and more.
LlamaIndex-ScienceChat
LlamaIndex-ScienceChat is a RAG-based scientific multiple-choice QA bot built on LlamaIndex and LLMs such as Llama 2 and Mistral; the external knowledge base is Wikipedia.
llm-action
This project shares the technical principles behind large language models along with hands-on practical experience.
llm-science-exam
Solution Code for Kaggle - LLM Science Exam Competition
MedicalGPT
MedicalGPT: Training Your Own Medical GPT Model with the ChatGPT Training Pipeline. Trains medical large language models, implementing continued pre-training, supervised fine-tuning, RLHF (reward modeling and reinforcement-learning training), and DPO (direct preference optimization).
nlp_demo
bilibili-nlp
Outlier-Detection
Applying outlier-detection algorithms to different data samples with different test sizes
Partial-Paperreview
A partial repository for reviewing GPT-related papers
Read_Bert_Code
Reading and explaining the BERT source code (PyTorch version), using the BERT text-classification code as an example
self-rag_
This includes the original implementation of SELF-RAG: Learning to Retrieve, Generate and Critique through self-reflection by Akari Asai, Zeqiu Wu, Yizhong Wang, Avirup Sil, and Hannaneh Hajishirzi.
TFC-pretraining
Self-supervised contrastive learning for time series via time-frequency consistency
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.