ckiplab / ckip-transformers

CKIP Transformers

Home Page: https://ckip-transformers.readthedocs.io

Pinning memory issue

qqaatw opened this issue · comments

Hi,

I'm currently using ckip-transformers-ws as a preprocessing tool in my project, and I noticed that the DataLoader's pin_memory flag is hard-coded to True in util.py.

Since memory pinning is incompatible with multiprocessing (i.e., multiple workers) [1], calling ckip-transformers inside the collate_fn of a DataLoader with multiple workers raises a CUDA error, as shown in [1], even when inference runs only on the CPU.
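For illustration, a minimal reproduction sketch might look like the following; the dataset, sample sentences, and collate_fn are hypothetical, and the driver is constructed with its default, CPU-only settings:

```python
# Hypothetical reproduction sketch: the dataset, sample sentences, and
# collate_fn below are illustrative, not part of ckip-transformers.
from torch.utils.data import DataLoader, Dataset
from ckip_transformers.nlp import CkipWordSegmenter

ws_driver = CkipWordSegmenter()  # default arguments, CPU inference

class TextDataset(Dataset):
    def __init__(self, texts):
        self.texts = texts

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        return self.texts[idx]

def collate_fn(batch):
    # The driver runs inside a worker process here; its internal
    # DataLoader is built with pin_memory=True, which triggers the
    # CUDA error described in [1] even though no GPU is used.
    return ws_driver(batch)

loader = DataLoader(
    TextDataset(["今天天氣很好。", "我想吃火鍋。"]),
    batch_size=2,
    num_workers=2,
    collate_fn=collate_fn,
)

for ws_batch in loader:
    print(ws_batch)
```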

Therefore, I think it would be better to:

  1. Pin memory only when the device is a GPU.
  2. Add an option that lets the user decide whether to enable memory pinning (a rough sketch follows).
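For example, the internal DataLoader could be built along these lines; this is only a sketch of the suggested behaviour, and build_dataloader and its parameters are illustrative rather than the actual util.py code:

```python
# Hypothetical sketch of the proposed behaviour; build_dataloader and its
# parameters are illustrative and do not mirror the actual util.py code.
from torch.utils.data import DataLoader

def build_dataloader(dataset, batch_size, device, pin_memory=None):
    # Suggestion 2: expose pin_memory as an option; None means "decide automatically".
    if pin_memory is None:
        # Suggestion 1: pin memory only when the target device is a GPU.
        pin_memory = device.type == "cuda"
    return DataLoader(dataset, batch_size=batch_size, pin_memory=pin_memory)

# Example: CPU inference leaves pinning off, GPU inference turns it on.
# loader = build_dataloader(my_dataset, 32, torch.device("cpu"))
# loader = build_dataloader(my_dataset, 32, torch.device("cuda:0"))
```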

Regards.

[1] https://discuss.pytorch.org/t/pin-memory-vs-sending-direct-to-gpu-from-dataset/33891/2

Implemented in 0.2.7.