Nealcly's starred repositories
Selective_Context
Compress your input to ChatGPT and other LLMs, letting them process 2x more content while saving 40% of memory and GPU time.
detect-pretrain-code
This repository provides an original implementation of Detecting Pretraining Data from Large Language Models by *Weijia Shi, *Anirudh Ajith, Mengzhou Xia, Yangsibo Huang, Daogao Liu, Terra Blevins, Danqi Chen, Luke Zettlemoyer.
Knowledge-Constrained-Decoding
Official code for the EMNLP 2023 Main Conference paper: "KCTS: Knowledge-Constrained Tree Search Decoding with Token-Level Hallucination Detection"
Instruction-Tuning-Papers
Reading list on instruction tuning, a trend that starts from Natural Instructions (ACL 2022), FLAN (ICLR 2022), and T0 (ICLR 2022).
self-speculative-decoding
Code associated with the paper **Draft & Verify: Lossless Large Language Model Acceleration via Self-Speculative Decoding**
EKD_Impacts_PKG
This is the repository for the paper "Merge Conflicts! Exploring the Impacts of External Distractors to Parametric Knowledge Graphs"
LLMSpeculativeSampling
Fast inference from large language models via speculative decoding
chatgpt-prompts-for-academic-writing
This list of writing prompts covers a range of topics and tasks, including brainstorming research ideas, improving language and style, conducting literature reviews, and developing research plans.
gpt_academic
Provides a practical interactive interface for GPT/GLM and other large language models, with a specially optimized experience for paper reading, polishing, and writing. Modular design with support for custom shortcut buttons & function plugins; supports analysis & self-translation of Python, C++, and other projects; PDF/LaTeX paper translation & summarization; parallel queries to multiple LLMs; and local models such as chatglm3. Integrates Tongyi Qianwen, deepseekcoder, iFlytek Spark, ERNIE Bot, llama2, rwkv, claude2, moss, and more.
tdc2023-starter-kit
This is the starter kit for the Trojan Detection Challenge 2023 (LLM Edition), a NeurIPS 2023 competition.
tdc-starter-kit
Starter kit and data loading code for the Trojan Detection Challenge NeurIPS 2022 competition
alpaca_farm
A simulation framework for RLHF and alternatives. Develop your RLHF method without collecting human data.
direct-preference-optimization
Reference implementation for DPO (Direct Preference Optimization)