adapter-hub / adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

Home Page: https://docs.adapterhub.ml

Total parameters added to the original model

km5ar opened this issue

Is there any way we can calculate the total number of parameters added to the original model?

Hi @km5ar,
to get a summary of the currently added adapters, you can use the adapter_summary() function like this:

print(model.adapter_summary())

For an example script & output, see here: #371
You can then sum up the "#Param" column values to get the total number of parameters added.
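For example, if your installed version supports returning the summary rows as dictionaries (the as_dict=True argument is an assumption here; please check your version), the column sum could be computed roughly like this:

# Sketch, assuming adapter_summary(as_dict=True) returns one dict per row with a
# "#param" key (matching row["#param"] in the excerpt below) and that the summary
# also contains a final "Full model" row that must be excluded from the sum.
rows = model.adapter_summary(as_dict=True)
total_added = sum(r["#param"] for r in rows if r["name"] != "Full model")
print(f"Total parameters added by adapters: {total_added}")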

Alternatively, if you want a function that directly returns the total number of added parameters, you could write a new one based on the implementation of adapter_summary(). This would only require a few changes; the relevant part of the code is here:

for name, config_name in self.adapters_config.adapters.items():
    if config_name in self.adapters_config.config_map:
        config = self.adapters_config.config_map.get(config_name, None)
    else:
        config = ADAPTER_CONFIG_MAP.get(config_name, None)
    if isinstance(config, str):
        config = ADAPTER_CONFIG_MAP[config]
    row = {"name": name, "architecture": config.get("architecture", None) or "bottleneck"}
    weights = self.get_adapter(name)
    row["active"] = self.active_adapters is not None and name in self.active_adapters.flatten()
    # count parameters
    no_params = 0
    train = True
    for _, module_dict in weights.items():
        for _, module in module_dict.items():
            no_params += sum(p.numel() for p in module.parameters())
            train &= all(p.requires_grad for p in module.parameters())
    row["#param"] = no_params