asagi4 / comfyui-prompt-control

ComfyUI nodes for prompt editing and LoRA control


Error when trying to load two LoRAs at different steps

yovsac opened this issue · comments

Hello,

First of all, thank you for your work on these nodes. They have a lot of potential.

I am having issues when trying to load two LoRAs at different steps of generation. The prompt I'm using is "[p1hf car<lora:p1hf car_V1:0.5>:cnptn car<lora:cnptn car_V1:0.5>:0.5]"
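For reference, my understanding of the scheduling syntax (going by the extension's README; A, B, first and second are placeholders for my actual prompts and LoRA names) is:

    [A<lora:first:0.5>:B<lora:second:0.5>:0.5]

i.e. prompt A with the first LoRA at weight 0.5 until 50% of the steps, then prompt B with the second LoRA at weight 0.5 for the rest, with the trailing 0.5 being the switch point as a fraction of total steps.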

Here is a snapshot of the nodes:
[screenshot: Screenshot 2023-11-03 160813]

(I use Anywhere nodes, so all inputs are connected properly even if they're not visible.)

Error occurred when executing KSamplerAdvanced:

Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu! (when checking argument for argument mat2 in method wrapper_CUDA_addmm)

File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1271, in sample
return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)

File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1207, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-prompt-control\prompt_control\hijack.py", line 35, in pc_sample
r = cb(orig_sampler, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-prompt-control\prompt_control\node_lora.py", line 104, in sampler_cb
s = orig_sampler(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 22, in informative_sample
raise e
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
return original_sample(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\sample.py", line 100, in sample
samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-prompt-control\prompt_control\hijack.py", line 79, in sample
return super().sample(
^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 694, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler(), sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 600, in sample
samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 560, in sample
samples = getattr(k_diffusion_sampling, "sample_{}".format(sampler_name))(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **extra_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 137, in sample_euler
denoised = model(x, sigma_hat * s_in, **extra_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 277, in forward
out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1527, in call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 267, in forward
return self.apply_model(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 264, in apply_model
out = sampling_function(self.inner_model.apply_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 252, in sampling_function
cond, uncond = calc_cond_uncond_batch(model_function, cond, uncond, x, timestep, max_total_area, model_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 226, in calc_cond_uncond_batch
output = model_function(input_x, timestep_, **c).chunk(batch_chunks)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 140, in apply_model
model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\custom_nodes\FreeU_Advanced\nodes.py", line 173, in __temp__forward
h = forward_timestep_embed(module, h, emb, context, transformer_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 56, in forward_timestep_embed
x = layer(x, context, transformer_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\attention.py", line 557, in forward
x = self.proj_in(x)
^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\yovan\ComfyUI_windows_portable_03\ComfyUI_windows_portable\ComfyUI\comfy\ops.py", line 18, in forward
return torch.nn.functional.linear(input, self.weight, self.bias)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

That should work just fine. At least it works for me, but I run Comfy solely on Linux and can't test on Windows.

A couple of things to check:

  1. Is your ComfyUI up-to-date?
  2. Is the extension up-to-date?
  3. Do you still get this error if you disable all other extensions?
    The FreeU_Advanced node I see in the stacktrace seems like a likely culprit, since it is on the sampling path.
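For context, that RuntimeError is PyTorch's generic device-mismatch failure: some weight tensor was left on the CPU while the sampling input was on the GPU, which usually means something on the sampling path patched the model without moving its tensors. A minimal sketch of the same failure class (hypothetical, assuming a CUDA build of torch):

    import torch

    linear = torch.nn.Linear(8, 8)        # weights are created on the CPU
    x = torch.randn(2, 8, device="cuda")  # input lives on the GPU
    try:
        linear(x)                         # raises: Expected all tensors to be on the same device...
    except RuntimeError as e:
        print(e)
    linear.to("cuda")                     # moving the module's weights fixes it
    print(linear(x).shape)                # torch.Size([2, 8])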

Thank you for your reply. I will check it out!

I'm assuming you figured it out.

Finally got some time to continue with this. I solved the issue: apparently my ComfyUI was running in normal VRAM mode by default. I added the --highvram argument to the .bat and it worked after that. The only thing is that not all samplers work well; I noticed that the exponential scheduler works well. Any recommendation for a sampler and scheduler from your end?
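For reference, the launch line in my run_nvidia_gpu.bat now looks roughly like this (the standard portable layout; adjust the paths for your install):

    .\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build --highvram
    pause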

Prompt scheduling doesn't really care about which sampler or scheduler you use, since it uses Comfy's own timestep range feature; it only affects some samplers if you use the PCSplitSampling node. Otherwise its behaviour should be identical to the equivalent graph in ComfyUI, though that graph would quickly get very complex if you do a lot of prompt editing, which is why I wrote this extension in the first place.
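To illustrate, a schedule like [A:B:0.5] behaves roughly like this hand-built graph (a sketch using stock ComfyUI nodes, not the extension's actual implementation):

    CLIPTextEncode(A) -> ConditioningSetTimestepRange(start=0.0, end=0.5) --\
                                                                             ConditioningCombine -> KSampler
    CLIPTextEncode(B) -> ConditioningSetTimestepRange(start=0.5, end=1.0) --/

LoRA scheduling is the part that can't be expressed with timestep ranges alone, which is why the extension hooks the sampler (the pc_sample/sampler_cb frames in your traceback) to swap LoRAs at the step boundary.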

I don't really have any recommendations; I tend to use euler_ancestral and karras, but that's mostly because I don't bother switching samplers or schedulers very often.