huggingface / evaluate

🤗 Evaluate: A library for easily evaluating machine learning models and datasets.

Home Page: https://huggingface.co/docs/evaluate

[BUG] Loading metrics is extremely slow

NightMachinery opened this issue · comments

In [7]: %time evaluate.load('f1')
Using the latest cached version of the module from /opt/huggingface/modules/evaluate_modules/metrics/evaluate-metric--f1/0ca73f6cf92ef5a268320c697f7b940d1030f8471714bffdb6856c641b818974 (last modified on Mon Mar 27 21:11:41 2023) since it couldn't be found locally at evaluate-metric--f1, or remotely on the Hugging Face Hub.
CPU times: user 44.1 ms, sys: 0 ns, total: 44.1 ms
Wall time: 1.8 s

This happens every time I load any module, and not just the first time.


In [11]: evaluate.__version__
Out[11]: '0.4.0'

Related issue:

This workaround speeds up the loading to around 10 ms. It has to be done before importing evaluate:

import os

# Skip the Hugging Face Hub lookup and resolve metrics from the local cache.
# This must be set before evaluate is imported.
os.environ['HF_EVALUATE_OFFLINE'] = '1'
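
For reference, a minimal sketch of the full sequence with a rough timing check (it assumes the f1 metric is already in the local cache, as in the trace above; loading fails offline otherwise):

import os
import time

os.environ['HF_EVALUATE_OFFLINE'] = '1'  # must come before the import below

import evaluate

start = time.perf_counter()
f1 = evaluate.load('f1')  # resolves from the local cache only
print(f"loaded f1 in {(time.perf_counter() - start) * 1000:.1f} ms")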

Yes, I set os.environ['HF_EVALUATE_OFFLINE'] = '1', but loading meteor is still quite slow.
It seems to take a long time to verify that the nltk data is up to date.
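
One way to check that hypothesis is to time the NLTK downloads directly. A rough sketch (the package list is an assumption about what the meteor module fetches; nltk.download re-checks the remote index even when the data is already cached locally):

import time
import nltk

# Time the NLTK resources the meteor metric is assumed to request at load time.
for pkg in ['wordnet', 'punkt', 'omw-1.4']:
    start = time.perf_counter()
    nltk.download(pkg, quiet=True)  # re-verifies the package against the remote index
    print(f"{pkg}: {(time.perf_counter() - start) * 1000:.0f} ms")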