intel / neural-compressor

SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime

Home Page: https://intel.github.io/neural-compressor/

how to get layer_mappings for distillation?

Michael-Fuu opened this issue · comments

Hi, I want to write a script that prints the layer_mappings for distillation. My script looks like this:

```python
for name, module in model.named_modules():
    print(name)
```

But the results are far from the default layer_mappings. For example, with resnet50 the default is [['resblock.1.feature.output', 'resblock.deepst.feature.output']], while my script prints names like layer1.0.conv1. How do I define the correct layer_mappings?
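For context, `model.named_modules()` yields dotted paths built from the attribute names under which each submodule is registered, which is why the printed names look like `layer1.0.conv1`. Below is a minimal, stdlib-only sketch of that traversal; the toy `Module` class and the tiny module tree are illustrative stand-ins, not actual torch code:

```python
class Module:
    """Toy stand-in for torch.nn.Module: children are registered by name."""
    def __init__(self, **children):
        self.children = children

    def named_modules(self, prefix=""):
        # Yield this module, then recurse into children, joining names
        # with dots -- the scheme that produces names like 'layer1.0.conv1'.
        yield prefix, self
        for name, child in self.children.items():
            child_prefix = f"{prefix}.{name}" if prefix else name
            yield from child.named_modules(child_prefix)

# A tree shaped loosely like a resnet block: layer1 -> 0 -> conv1
model = Module(layer1=Module(**{"0": Module(conv1=Module())}))
print([name for name, _ in model.named_modules() if name])
# -> ['layer1', 'layer1.0', 'layer1.0.conv1']
```

These dotted names identify modules of the original network, so they will only match a layer_mappings entry if that entry was written against the same module tree.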

Hi @Michael-Fuu, are you looking at this self distillation example? As you can see in the code below, this example does not use the layer names of the original model (such as the layer1.0.conv1 you mentioned for resnet50) for the layer mapping; it creates its own layer names (such as resblock.1.feature.output) that are unrelated to the original model's modules.
[screenshot of the self distillation example code]
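To make the shape of such a mapping concrete, here is a hedged sketch of what a layer_mappings list looks like, using the pair quoted in the question. Each inner list pairs a student-side name with a teacher-side name; in the self distillation example these names refer to outputs the example itself defines, not to resnet50's own submodules:

```python
# Sketch only: the names come from the self distillation example discussed
# above, where 'resblock.*' outputs are created by the example's wrapper
# rather than by the original resnet50 module tree.
layer_mappings = [
    ["resblock.1.feature.output", "resblock.deepst.feature.output"],
]

# Each entry is a [student_layer, teacher_layer] pair.
for student_layer, teacher_layer in layer_mappings:
    print(student_layer, "->", teacher_layer)
```

So to define a correct layer_mappings for your own model, the names must either be real dotted module names from `named_modules()` or names your training code registers explicitly, as this example does.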