lxuechen / private-transformers

A codebase that makes differentially private training of transformers easy.

Home Page: https://arxiv.org/abs/2110.05679


[DistilBERT] RuntimeError: stack expects each tensor to be equal size

LinkToPast1900 opened this issue · comments

Hi, @lxuechen, thanks for your repo.

I ran into the following problem when I tried to fine-tune DistilBERT.
Both BERT and RoBERTa work fine.
Any idea what might be going on? Thanks!

Traceback (most recent call last):
  ...
  File "/opt/conda/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/opt/conda/lib/python3.8/site-packages/private_transformers/privacy_utils/privacy_engine.py", line 360, in step
    self._ghost_step(loss=kwargs.pop("loss"))
  File "/opt/conda/lib/python3.8/site-packages/private_transformers/privacy_utils/privacy_engine.py", line 261, in _ghost_step
    self._ghost_helper(loss)
  File "/opt/conda/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/opt/conda/lib/python3.8/site-packages/private_transformers/privacy_utils/privacy_engine.py", line 334, in _ghost_helper
    coef_sample = self.get_coef_sample()
  File "/opt/conda/lib/python3.8/site-packages/private_transformers/privacy_utils/privacy_engine.py", line 348, in get_coef_sample
    norm_sample = self.get_norm_sample()
  File "/opt/conda/lib/python3.8/site-packages/private_transformers/privacy_utils/privacy_engine.py", line 343, in get_norm_sample
    norm_sample = torch.stack([param.norm_sample for name, param in self.named_params], dim=0).norm(2, dim=0)
RuntimeError: stack expects each tensor to be equal size, but got [50] at entry 0 and [1] at entry 1

(50 is my batch size.)
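For context, here is a minimal sketch of what trips the `torch.stack` call in `get_norm_sample`: each parameter is expected to carry a per-example norm tensor of shape `[batch_size]`, and `torch.stack` requires every tensor in the list to have the same shape. The tensor shapes below are illustrative, chosen to match the sizes reported in the traceback; the actual cause of the `[1]`-shaped entry lies inside the ghost-clipping bookkeeping for DistilBERT.

```python
import torch

batch_size = 50
good = torch.randn(batch_size)  # per-example norms, shape [50], as expected
bad = torch.randn(1)            # a mismatched entry of shape [1]

# torch.stack requires all tensors to share the same shape, so stacking
# these two reproduces the RuntimeError seen in the traceback.
try:
    torch.stack([good, bad], dim=0)
except RuntimeError as e:
    print(e)
```

This is only a shape-level reproduction, not a fix; it just shows why a single parameter with a `[1]`-shaped `norm_sample` is enough to crash the whole stack.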

Pinning the transformers version with

pip install transformers==4.10.0

solved the problem.