Sygil-Dev / stable-diffusion

--gpu parameter not working anymore

sazabidesu opened this issue · comments

Whenever I try to use the --gpu 0 parameter, it gets ignored and the second GPU, cuda:1, is used instead.
If I try --gpu 1, I get this error:
[MemMon] Recording max memory usage... !!Runtime error (txt2img)!! Expected all tensors to be on the same device, but found at least two devices, cuda:1 and cuda:0! (when checking argument for argument index in method wrapper__index_select) exiting...calling os._exit(0).
If I pass no parameters at all, I get the same result.

Somehow, this repo broke the --gpu parameter for me.
Earlier versions work flawlessly.

Just add:

torch.cuda.set_device(device)

between lines 820 and 821 in webui.py, making sure the line has the same indentation as line 822.
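For context, the suggested fix works because torch.cuda.set_device() changes the process-wide default CUDA device, so later bare .cuda() calls land on the requested GPU instead of cuda:0. A minimal sketch of the pattern, assuming the device index comes from a --gpu argument (the argument name and the select_device helper are illustrative, not the actual webui.py code):

```python
import argparse


def select_device(gpu_index: int, cuda_available: bool) -> str:
    """Map the requested GPU index to a torch device string (illustrative helper)."""
    if cuda_available and gpu_index >= 0:
        return f"cuda:{gpu_index}"
    return "cpu"


def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("--gpu", type=int, default=0,
                        help="index of the GPU to run on")
    opt = parser.parse_args()

    try:
        import torch
        cuda = torch.cuda.is_available()
    except ImportError:  # torch not installed; fall back to CPU
        torch, cuda = None, False

    device = select_device(opt.gpu, cuda)
    if device.startswith("cuda"):
        # Pin the default CUDA device. Without this, tensors created with a
        # bare .cuda() land on cuda:0 and collide with tensors explicitly
        # placed on cuda:1, producing the "found at least two devices,
        # cuda:1 and cuda:0" error quoted above.
        torch.cuda.set_device(opt.gpu)
    print(f"running on {device}")


if __name__ == "__main__":
    main()
```

An alternative workaround at the shell level is to hide the other GPUs entirely, e.g. CUDA_VISIBLE_DEVICES=1 before launching, so cuda:0 inside the process maps to the physical GPU you want.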

Thanks, going to try this and report back.