Wanru Zhao's repositories
multilingual_borders
ICLR 2024
BraTS21
Solution of the RSNA/ASNR/MICCAI Brain Tumor Segmentation (BraTS) Challenge 2021
breaching
Breaching privacy in federated learning scenarios for vision and text
ccf-deadlines
⏰ Collaboratively track deadlines of conferences recommended by CCF (website, Python CLI). Please star this project, thanks!
doremi
PyTorch implementation of DoReMi, a method for optimizing the data mixture weights in language modeling datasets
dsir
Pre-filtered datasets and code for selecting relevant language model training data from The Pile.
knowledge-unlearning
[ACL 2023] Knowledge Unlearning for Mitigating Privacy Risks in Language Models
LAVA
Official repository for "LAVA: Data Valuation without Pre-Specified Learning Algorithms" (ICLR 2023)
llm-attacks
Universal and Transferable Adversarial Attacks on Aligned Language Models
LLMs-Finetuning-Safety
We jailbreak GPT-3.5 Turbo’s safety guardrails by fine-tuning it on only 10 adversarially designed examples, at a cost of less than $0.20 via OpenAI’s APIs.
LoRA_Easy_Training_Scripts
A UI built with PySide6 that makes training LoRA/LoCon and other LoRA-type models in sd-scripts easy
mimic3-time-series
MIMIC-III Time Series Models
NeMo-Guardrails
NeMo Guardrails is an open-source toolkit for easily adding programmable guardrails to LLM-based conversational systems.
NVFlare
NVIDIA Federated Learning Application Runtime Environment
offsite-tuning
Offsite-Tuning: Transfer Learning without Full Model
peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
qlora
QLoRA: Efficient Finetuning of Quantized LLMs
Ryan0v0.github.io
A beautiful, simple, clean, and responsive Jekyll theme for academics
scale-fl
Code for ScaleFL
smoothquant
SmoothQuant: Accurate and Efficient Post-Training Quantization for Large Language Models
ssl-data-curation
PyTorch code for hierarchical k-means, a data curation method for self-supervised learning
task_vectors
Editing Models with Task Arithmetic
thesis
A LaTeX document class that conforms to the Computer Laboratory's PhD thesis formatting guidelines.