huggingface / exporters

Export Hugging Face models to Core ML and TensorFlow Lite

Exporting to Core ML format throws errors

a-maci opened this issue

I am doing this:

python -m exporters.coreml --model=bert-base-uncased exported/

and running into this error:

RuntimeError: Error compiling model: "compiler error: Encountered an error while compiling a neural network model: validator error: Model output 'pooler_output' has a different shape than its corresponding return value to main.".

Did the underlying BERT implementation's API change?

I hit similar errors with some of the other models mentioned in the README (the ready-made configurations).

It's quite possible something has changed; I haven't worked on this in a while. This is very much beta software right now.

One thing you can try is to remove the pooler_output from the outputs. This requires changing the BertCoreMLConfig object.
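For reference, here is a rough sketch of that change. The subclass name is hypothetical, and it assumes BertCoreMLConfig lives in exporters.coreml.models and exposes its outputs as an OrderedDict keyed by output name, as the ready-made configurations appear to do:

from collections import OrderedDict

from exporters.coreml.models import BertCoreMLConfig

class BertNoPoolerCoreMLConfig(BertCoreMLConfig):
    @property
    def outputs(self) -> OrderedDict:
        # Start from the ready-made BERT outputs and drop the pooler entry
        outputs = super().outputs
        outputs.pop("pooler_output", None)
        return outputs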

Hi @a-maci! We have been investigating this and it's looking like it could be some sort of issue with coremltools. While we try to find a solution, you can use the following workaround that @hollance designed:

import coremltools as ct

# Load the exported package, clear the stored shape on the second output
# (pooler_output), then rebuild the model from the patched spec and save it.
mlmodel = ct.models.MLModel("exported/Model.mlpackage")
del mlmodel._spec.description.output[1].type.multiArrayType.shape[:]
mlmodel = ct.models.MLModel(mlmodel._spec, weights_dir=mlmodel.weights_dir)
mlmodel.save("ModelFixed.mlpackage")

This removes the shape information for the pooler output (the shape itself is correct, but it confuses Core ML's validator) and saves a new version of the model as ModelFixed.mlpackage.
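To confirm the patch took effect, you can print the second output description from the same mlmodel object; after clearing the shape, its multiArrayType should no longer list fixed dimensions:

# Inspect the patched pooler output description (reuses mlmodel from above)
print(mlmodel._spec.description.output[1])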

If you don't need the pooler output at all, you can also create a custom configuration to remove it. Let us know if you are interested in this approach and want some assistance.
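For anyone who wants to try that now, the programmatic export would look roughly like the sketch below. It assumes the exporters.coreml.export(preprocessor, model, config) entry point and the "default" task name; check the README for the exact signature. BertNoPoolerCoreMLConfig is the illustrative subclass from the earlier sketch:

from transformers import AutoModel, AutoTokenizer
from exporters.coreml import export

# Illustrative only: export bert-base-uncased with a config that omits
# pooler_output (see the BertNoPoolerCoreMLConfig sketch above).
model = AutoModel.from_pretrained("bert-base-uncased", torchscript=True)
preprocessor = AutoTokenizer.from_pretrained("bert-base-uncased")
coreml_config = BertNoPoolerCoreMLConfig(model.config, task="default")
mlmodel = export(preprocessor, model, coreml_config)
mlmodel.save("exported/ModelNoPooler.mlpackage")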

This has been addressed in #10 following the latest Apple recommendations.