OCR-D / core

Collection of OCR-related Python tools and wrappers from @OCR-D

Home Page: https://ocr-d.de/core/

GPU processors: (uniformly) log CPU vs GPU use

bertsky opened this issue

Due to the lack of a specification on this aspect, our processors have no way, or at least no uniform way, to inform the user whether or not a GPU device is used (let alone to parameterise which one to prefer). Here's the current state of affairs in this regard:

  • TF based processors:
    • ocrd-eynollah-segment: WARNING no GPU device available
    • ocrd-anybaseocr-block-segmentation: WARNING Tensorflow cannot detect CUDA installation. Running without GPU will be slow
    • ocrd-cor-asv-ann-process: INFO using ... LSTM implementation to compile ...
    • ocrd-keraslm-rate: INFO using ... LSTM implementation to compile ...
    • all others (ocrd-calamari-recognize, ocrd-sbb-binarize, ocrd-anybaseocr-tiseg, ocrd-anybaseocr-layout-analysis): rely on TF's own logging, which (due to TensorFlow's verbosity) we have to mute via the envvar TF_CPP_MIN_LOG_LEVEL=3
  • Torch based processors:
    • ocrd-kraken-segment: nothing
    • ocrd-kraken-recognize: nothing
    • ocrd-typegroups-classifier: nothing
    • ocrd-detectron2-segment: INFO Using compute device ...
    • ocrd-anybaseocr-dewarp: WARNING torch cannot detect CUDA installation
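
For reference, both frameworks already expose the information such a uniform log line would need. Here is a minimal sketch of the checks the messages above boil down to; the helper names are made up, only the tf and torch calls are real (assuming TF 2.x and a recent Torch):

```python
# Hypothetical helpers, not actual processor code: the framework calls
# that a uniform CPU-vs-GPU log line could be based on.
import logging

def log_gpu_status_tf():
    # TF_CPP_MIN_LOG_LEVEL must be set before tensorflow is imported,
    # otherwise its C++-side logging cannot be muted.
    import os
    os.environ.setdefault('TF_CPP_MIN_LOG_LEVEL', '3')
    import tensorflow as tf
    gpus = tf.config.list_physical_devices('GPU')
    if gpus:
        logging.info("using GPU device(s) %s", [gpu.name for gpu in gpus])
    else:
        logging.warning("no GPU device available, running on CPU will be slow")

def log_gpu_status_torch():
    import torch
    if torch.cuda.is_available():
        logging.info("using CUDA device %s", torch.cuda.get_device_name(0))
    else:
        logging.warning("torch cannot detect CUDA installation, running on CPU will be slow")
```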

The good news is that when there is a GPU, we try to utilise it. But the user has no way to verify that this actually works (other than perhaps running make test-cuda in ocrd_all).

Personally, I like the warnings best. Could we agree on standardising and enforcing these?
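
One way to standardise and enforce this would be a small shared helper in core that every processor calls at setup time. A sketch of what that could look like; the function name log_gpu_availability is an assumption, only ocrd_utils.getLogger is real:

```python
from ocrd_utils import getLogger

def log_gpu_availability(name, available, device=''):
    """Emit the agreed-upon INFO/WARNING, identically for all processors."""
    log = getLogger(name)
    if available:
        log.info("using GPU device %s", device)
    else:
        log.warning("no GPU device available, running on CPU will be slow")
```

A TF-based processor would then pass bool(tf.config.list_physical_devices('GPU')), a Torch-based one torch.cuda.is_available().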

@bertsky: Many thanks for opening this issue. The WARNING variant would be very nice.
In my opinion this is needed: e.g. in my environment, I can currently only "believe" that ocrd-kraken-segment uses the GPU.


BTW I just amended OCR-D/ocrd_kraken#38 to include such warnings for Kraken.

Having consistent WARNING about this across processors would be a good first step indeed.