aws / amazon-s3-plugin-for-pytorch


Thread safety: DataLoaders (with multiple workers) only support multiprocessing_context "spawn"

rehno-lindeque opened this issue · comments

This may be a documentation issue:

PyTorch DataLoaders only appear to work when multiprocessing_context is set to "spawn" (or when num_workers=0).

At least with PyTorch 1.10:

>>> torch.__version__
'1.10.0+cu113'

I obtain the error

ERROR: Unexpected segmentation fault encountered in worker.

unless multiprocessing_context = "spawn" is explicitly set.
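For reference, a minimal sketch of the workaround. DataLoader's multiprocessing_context argument accepts either the string "spawn" or a multiprocessing context object; "spawn" starts fresh worker processes rather than fork()ing the parent, which avoids carrying non-fork-safe state (such as native threads held by the S3 client) into the workers. The DataLoader call itself is shown only in a comment since it needs torch and this plugin installed; the dataset name is illustrative.

```python
import multiprocessing as mp

# "spawn" launches clean interpreter processes for each worker instead of
# forking the parent, so state that is not fork-safe (e.g. background
# threads created by a native S3 client) is never inherited by workers.
ctx = mp.get_context("spawn")
print(ctx.get_start_method())  # -> spawn

# Illustrative only (requires torch and an S3-backed dataset):
#
# from torch.utils.data import DataLoader
# loader = DataLoader(dataset, num_workers=4,
#                     multiprocessing_context="spawn")
```

Passing the string "spawn" and passing the context object `ctx` are equivalent here; the string form is the shorter fix.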

@rehno-lindeque

We're upstreaming the amazon-s3-plugin-for-pytorch into the torchdata package (pytorch/data#318) and dropping support for this plugin.

Thanks for raising this issue. However, we're no longer updating this repository. If the issue still occurs in the new torchdata package, we'll investigate and resolve it there.

Thanks @ydaiming. I've already started transitioning to torchdata myself so that is pretty convenient 👍