salesforce / CodeGen

CodeGen is a family of open-source models for program synthesis. Trained on TPU-v4. Competitive with OpenAI Codex.

AttributeError: 'AlignConfig' object has no attribute 'encoder', 'PoolFormerConfig' object has no attribute 'encoder'.

PriyaBSavithiri opened this issue · comments

Hi! I am using python==3.10, torch==1.13.0+cpu, and transformers==4.35.0. I am trying to CPU-benchmark Transformer models in the PyTorch framework using the command:

python run_benchmark.py --models kakaobrain/align-base --batch_sizes 1 --sequence_lengths 384

When I execute the command, I get the following AttributeError:


FutureWarning: The class <class 'transformers.benchmark.benchmark.PyTorchBenchmark'> is deprecated. Hugging Face Benchmarking utils are deprecated in general and it is advised to use external Benchmarking libraries to benchmark Transformer models.
  warnings.warn(
1 / 1

'AlignConfig' object has no attribute 'encoder'
'AlignConfig' object has no attribute 'encoder'

Traceback (most recent call last):
  File "/home/priya/priya/transformers/examples/pytorch/benchmarking/run_benchmark.py", line 50, in <module>
    main()
  File "/home/priya/priya/transformers/examples/pytorch/benchmarking/run_benchmark.py", line 46, in main
    benchmark.run()
  File "/home/priya/miniconda3/envs/pyo/lib/python3.10/site-packages/transformers/benchmark/benchmark_utils.py", line 710, in run
    memory, inference_summary = self.inference_memory(model_name, batch_size, sequence_length)
ValueError: too many values to unpack (expected 2)
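
For what it's worth, the benchmark can also be driven without the example script, via the (now deprecated) PyTorchBenchmark / PyTorchBenchmarkArguments classes. This is only a minimal sketch, assuming transformers 4.35.0, and I would expect it to hit the same code path:

from transformers import PyTorchBenchmark, PyTorchBenchmarkArguments

# Same model and settings as the run_benchmark.py invocation above.
args = PyTorchBenchmarkArguments(
    models=["kakaobrain/align-base"],
    batch_sizes=[1],
    sequence_lengths=[384],
)
benchmark = PyTorchBenchmark(args)
results = benchmark.run()  # expected to fail with the same ValueError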


I have encountered the same issue whether I install the transformers package with pip or from source. The same error occurs for many Transformer models; here are a few of them (a quick config check follows the list):

  1. kakaobrain/align-base
  2. BAAI/AltCLIP
  3. SenseTime/deformable-detr
  4. sail/poolformer_s12

etc.
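
My (unverified) reading is that these are all vision or vision-text checkpoints whose configs simply do not carry a top-level 'encoder' attribute, which appears to be what the benchmark utilities look up. A quick way to check this, using only AutoConfig:

from transformers import AutoConfig

# AlignConfig keeps its text and vision settings in nested sub-configs,
# so there is no top-level 'encoder' attribute for the benchmark code to read.
config = AutoConfig.from_pretrained("kakaobrain/align-base")
print(type(config).__name__)        # AlignConfig
print(hasattr(config, "encoder"))   # False, matching the reported AttributeError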

Am I missing something?

Any assistance on this issue is greatly appreciated, thank you!

I saw a similar issue here and posted this one by mistake; moving the issue to the transformers repo, thanks!