OpenNMT / CTranslate2

Fast inference engine for Transformer models

Home Page: https://opennmt.net/CTranslate2


Converter not working for NLLB models

marcelgoya opened this issue · comments

When I run the following converter script:

ct2-transformers-converter --model facebook/nllb-200-distilled-1.3B --quantization float16 --output_dir nllb-200-distilled-1.3B-ct2-float16

I now get the following error:

    config.json: 100%|████████████████████████████████████████████████████████████████████| 808/808 [00:00<00:00, 4.01MB/s]
    pytorch_model.bin: 100%|███████████████████████████████████████████████████████████| 5.48G/5.48G [00:22<00:00, 248MB/s]
    generation_config.json: 100%|█████████████████████████████████████████████████████████| 189/189 [00:00<00:00, 1.27MB/s]
    tokenizer_config.json: 100%|██████████████████████████████████████████████████████████| 564/564 [00:00<00:00, 3.51MB/s]
    sentencepiece.bpe.model: 100%|█████████████████████████████████████████████████████| 4.85M/4.85M [00:00<00:00, 139MB/s]
    tokenizer.json: 100%|██████████████████████████████████████████████████████████████| 17.3M/17.3M [00:00<00:00, 335MB/s]
    special_tokens_map.json: 100%|████████████████████████████████████████████████████| 3.55k/3.55k [00:00<00:00, 20.6MB/s]
    Traceback (most recent call last):
      File "/usr/local/bin/ct2-transformers-converter", line 8, in <module>
        sys.exit(main())
      File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/transformers.py", line 2234, in main
        converter.convert_from_args(args)
      File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/converter.py", line 50, in convert_from_args
        return self.convert(
      File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/converter.py", line 89, in convert
        model_spec = self._load()
      File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/transformers.py", line 142, in _load
        spec = loader(model, tokenizer)
      File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/transformers.py", line 194, in __call__
        spec = self.get_model_spec(model)
      File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/transformers.py", line 429, in get_model_spec
        return super().get_model_spec(model)
      File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/transformers.py", line 257, in get_model_spec
        self.set_encoder(spec.encoder, model.model.encoder)
      File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/transformers.py", line 286, in set_encoder
        self.set_common_layers(spec, encoder)
      File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/transformers.py", line 347, in set_common_layers
        spec.scale_embeddings = module.embed_scale
      File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1709, in __getattr__
        raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
    AttributeError: 'M2M100Encoder' object has no attribute 'embed_scale'

Which CTranslate2 version did you use? I don't have any issue like this.

@minhthuc2502 I am using the latest version, 4.3.0. I am also using the latest Transformers and Torch versions.

Try reinstalling Transformers. M2M100Encoder has the attribute embed_scale.

Can confirm that the conversion script is broken when using Transformers >= 4.41.0. Looking at the commit history of modeling_m2m_100.py in the Transformers repo, it seems that huggingface/transformers#30410 may be the cause.

Having the same problem

Can confirm that the conversion script when using Transformers >=4.41.0 is broken

Using

pip install transformers==4.40.2

fixed the issue for now.
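Until the converter is patched, it may also help to fail fast when an incompatible Transformers version is installed. A minimal sketch, assuming the regression starts at 4.41.0 as reported above; the `is_affected` helper is hypothetical, not part of CTranslate2:

```python
def is_affected(version: str) -> bool:
    """Return True for Transformers releases in the reported broken range (>= 4.41.0).

    Simplified numeric comparison; pre-release suffixes like "4.41.0rc1"
    are not handled in this sketch.
    """
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= (4, 41, 0)

# Example: check the installed package before invoking ct2-transformers-converter.
# import transformers
# if is_affected(transformers.__version__):
#     raise RuntimeError("pin transformers==4.40.2 before converting NLLB models")

print(is_affected("4.40.2"))  # False
print(is_affected("4.41.0"))  # True
```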

For new Transformers versions (v4.41+), embed_scale is no longer stored as an attribute on the encoder module. I think we would need to change the converter script (python/ctranslate2/converters/transformers.py),
e.g. from

    def set_common_layers(self, spec, module):
        spec.scale_embeddings = module.embed_scale

to

    def set_common_layers(self, spec, module):
        import math

        if hasattr(module, "embed_scale"):
            embed_scale = module.embed_scale
        else:
            # Transformers >= 4.41 computes the scale inside forward() instead
            # of storing it on the module, so recompute it from the config.
            embed_scale = (
                math.sqrt(module.config.d_model)
                if module.config.scale_embedding
                else 1.0
            )
        spec.scale_embeddings = embed_scale

For new transformers version (v.4.41+), embed_scale is no longer a class attribute. I think we would need to change converter script (python/ctranslate2/converters/transformers.py) e.g. from

Super helpful. Editing my ctranslate2/converters/transformers.py this way helped me get the ct2 conversion working. Thanks.

Sorry for the late response.

For new transformers version (v.4.41+), embed_scale is no longer a class attribute. I think we would need to change converter script (python/ctranslate2/converters/transformers.py) e.g. from

Thank you for your investigation. I will update the script transformers.py.