Han Zhou (hanzhou032)

Company: University of Cambridge

Location: Cambridge, UK

Home Page: hzhou.top

Twitter: @hanzhou032

Han Zhou's repositories

xqa-dst

XQA-DST: Multi-Domain and Multi-Lingual Dialogue State Tracking (Zhou et al.; EACL 2023 Findings)

Language: Python · License: Apache-2.0 · Stargazers: 4 · Issues: 1 · Issues: 0

Alpaca-CoT

We unify the interfaces of instruction-tuning data (e.g., CoT data, still being expanded), multiple LLMs, and parameter-efficient training methods (e.g., LoRA, p-tuning) at three levels, providing an easy-to-use research platform for LLM instruction fine-tuning. A separate tabular_llm branch builds an LLM for tabular-intelligence tasks.

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0

alpaca-lora

Instruct-tune LLaMA on consumer hardware

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0

awesome-adapter-resources

Collection of Tools and Papers related to Adapters (a.k.a. Parameter-Efficient Transfer Learning / Fine-Tuning)

Language: Python · License: ISC · Stargazers: 0 · Issues: 0 · Issues: 0

Awesome-LLM-Prompt-Optimization

Awesome-LLM-Prompt-Optimization: a curated list of advanced prompt optimization and tuning methods in Large Language Models

Stargazers: 0 · Issues: 0 · Issues: 0

Awesome-Parameter-Efficient-Transfer-Learning

A collection of parameter-efficient transfer learning papers focusing on computer vision and multimodal domains.

License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

Black-Box-Tuning

ICML'2022: Black-Box Tuning for Language-Model-as-a-Service & EMNLP'2022: BBTv2: Towards a Gradient-Free Future with Large Language Models

Language: Python · License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

multi3woz_ltl

The official repository for Multi3WOZ: A Multilingual, Multi-Domain, Multi-Parallel Dataset for Training and Evaluating Culturally Adapted Task-Oriented Dialog Systems (Hu et al., to appear; TACL)

Language: Python · License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

Awesome-LLM-Uncertainty-Reliability-Robustness

Awesome-LLM-Robustness: a curated list of Uncertainty, Reliability and Robustness in Large Language Models

License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

Channel-LM-Prompting

An original implementation of "Noisy Channel Language Model Prompting for Few-Shot Text Classification"

Stargazers: 0 · Issues: 0 · Issues: 0

DoLa

Official implementation for the paper "DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models"

Stargazers: 0 · Issues: 0 · Issues: 0

function_vectors

Function Vectors in Large Language Models [ICLR 2024]

Stargazers: 0 · Issues: 0 · Issues: 0

GrIPS

Code for our paper: "GrIPS: Gradient-free, Edit-based Instruction Search for Prompting Large Language Models"

License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

ICV

Code for In-context Vectors: Making In Context Learning More Effective and Controllable Through Latent Space Steering

Language: Python · License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

LLM-Safeguard

Official repository for ICML 2024 paper "On Prompt-Driven Safeguarding for Large Language Models"

Stargazers: 0 · Issues: 0 · Issues: 0

peft

🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.

License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0
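A minimal sketch (not taken from this fork) of how PEFT is typically used to wrap a Hugging Face model with LoRA adapters; the "gpt2" checkpoint and the hyperparameter values below are illustrative assumptions.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load any causal LM checkpoint; "gpt2" is used here only as a small example.
base = AutoModelForCausalLM.from_pretrained("gpt2")

# LoRA configuration: rank, scaling, and dropout values are illustrative.
config = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05, task_type="CAUSAL_LM")

# Wrap the base model so that only the low-rank adapter weights are trainable.
model = get_peft_model(base, config)
model.print_trainable_parameters()
```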

ProPETL

One Network, Many Masks: Towards More Parameter-Efficient Transfer Learning

License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0

pyreft

ReFT: Representation Finetuning for Language Models

License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0

rl-prompt

Accompanying repo for the RLPrompt paper

Language: Python · License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

S3Delta

Code for the paper "Sparse Structure Search for Delta Tuning"

Stargazers: 0 · Issues: 0 · Issues: 0

transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0
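A minimal sketch of the 🤗 Transformers pipeline API, shown for orientation only; letting pipeline() pick its default sentiment-analysis checkpoint is an assumption, and any text-classification model from the Hub could be passed explicitly instead.

```python
from transformers import pipeline

# Create a sentiment-analysis pipeline; with no model argument, a default
# checkpoint is downloaded from the Hugging Face Hub.
classifier = pipeline("sentiment-analysis")

# Returns a list of {"label": ..., "score": ...} dictionaries.
print(classifier("Parameter-efficient fine-tuning keeps training affordable."))
```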