AngelAlita's repositories

taker

Transformer Activation Knowledge Extraction Resources

Language: Python · Stargazers: 1 · Issues: 0

MoE-LLaVA

Mixture-of-Experts for Large Vision-Language Models

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0
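The core idea behind mixture-of-experts layers like those in MoE-LLaVA is sparse routing: a gate picks the top-k experts per token and mixes their outputs. A minimal NumPy sketch of that routing step (illustrative only, not MoE-LLaVA's actual implementation; all names are made up):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def moe_forward(x, gate_w, expert_ws, top_k=2):
    # x: [tokens, d]; gate_w: [d, n_experts]; expert_ws: list of [d, d] expert weights
    logits = x @ gate_w
    idx = np.argsort(-logits, axis=-1)[:, :top_k]    # top-k expert ids per token
    sel = np.take_along_axis(logits, idx, axis=-1)
    gates = softmax(sel, axis=-1)                    # renormalize over selected experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                      # mix only the chosen experts
        for s in range(top_k):
            out[t] += gates[t, s] * (x[t] @ expert_ws[idx[t, s]])
    return out
```

Because each token touches only k experts, compute stays roughly constant as the expert count grows, which is the appeal for large vision-language models.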

PLATON

This PyTorch package implements PLATON: Pruning Large Transformer Models with Upper Confidence Bound of Weight Importance (ICML 2022).

Language: Python · Stargazers: 0 · Issues: 0
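The upper-confidence-bound idea in PLATON: instantaneous weight importance (|w · ∂L/∂w|) is noisy across mini-batches, so it is smoothed with moving averages and combined with an uncertainty term before pruning. A hedged sketch of that scoring step (simplified from the paper's description; variable names and EMA constants are illustrative, not PLATON's exact code):

```python
import numpy as np

def platon_score(w, grad, imp_ema, unc_ema, beta1=0.85, beta2=0.95):
    # instantaneous importance: |weight * gradient| (first-order sensitivity)
    ipt = np.abs(w * grad)
    # smooth importance with an exponential moving average
    imp_ema = beta1 * imp_ema + (1 - beta1) * ipt
    # track deviation from the smoothed value as an uncertainty estimate
    unc_ema = beta2 * unc_ema + (1 - beta2) * np.abs(ipt - imp_ema)
    # combine importance with uncertainty; prune the lowest-scoring weights
    return imp_ema * unc_ema, imp_ema, unc_ema
```

The uncertainty term keeps weights whose importance is still fluctuating from being pruned prematurely on one unlucky batch.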

UPop

[ICML 2023] UPop: Unified and Progressive Pruning for Compressing Vision-Language Transformers.

Language: Python · License: BSD-3-Clause · Stargazers: 0 · Issues: 0

vlm_ranndom

A simple and effective LLM pruning approach.

Language: Python · License: MIT · Stargazers: 0 · Issues: 0
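One common "simple and effective" LLM pruning recipe matching this description scores each weight by its magnitude times the norm of its input activation, then drops the lowest-scoring weights without retraining. A minimal NumPy sketch under that assumption (function name and global thresholding are illustrative; the repo's actual method may differ):

```python
import numpy as np

def magnitude_activation_mask(weight, act_norm, sparsity=0.5):
    # weight: [out, in]; act_norm: [in] per-feature input activation norms
    score = np.abs(weight) * act_norm            # |W| * ||X||, broadcast over rows
    k = int(score.size * sparsity)               # number of weights to prune
    thresh = np.partition(score.ravel(), k)[k]   # k-th smallest score
    return score >= thresh                       # True = keep this weight
```

Multiplying by activation norms lets small-magnitude weights survive when they feed highly active inputs, which is what makes the approach effective despite its simplicity.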