damian0815 / compel

A prompting enhancement library for transformers-type text embedding systems

Issue with negative prompt when using non truncated long prompts

o5faruk opened this issue · comments

self.txt2img_pipe.load_textual_inversion(
    EMBEDDING_PATHS, token=EMBEDDING_TOKENS, local_files_only=True
)

textual_inversion_manager = DiffusersTextualInversionManager(self.txt2img_pipe)


# truncate_long_prompts=False allows prompts longer than 77 tokens
self.compel_proc = Compel(
    tokenizer=self.txt2img_pipe.tokenizer,
    text_encoder=self.txt2img_pipe.text_encoder,
    textual_inversion_manager=textual_inversion_manager,
    truncate_long_prompts=False,
)
if prompt:
    conditioning = self.compel_proc.build_conditioning_tensor(prompt)
    if not negative_prompt:
        negative_prompt = ""  # an empty negative prompt is still required; it can also be very long if needed
    negative_conditioning = self.compel_proc.build_conditioning_tensor(
        negative_prompt
    )
    [
        prompt_embeds,
        negative_prompt_embeds,
    ] = self.compel_proc.pad_conditioning_tensors_to_same_length(
        [conditioning, negative_conditioning]
    )
    ...
    output = pipe(
        prompt_embeds=prompt_embeds,
        negative_prompt_embeds=negative_prompt_embeds,
        guidance_scale=guidance_scale,
        generator=generator,
        num_inference_steps=num_inference_steps,
        **extra_kwargs,
    )

I'm having weird issues. All the relevant code is shown above; however, the negative prompt corrupts my image results, almost as if the negatives are getting mixed into the positives.
This only happens when the prompt and negative prompt exceed 77 tokens.
extra_kwargs does not contain prompt or negative_prompt, so only the embeds are passed into the pipeline. The pipeline in this case is ControlNet text-to-image.

Is it possible that the negatives get mixed into the positives in the pad_conditioning_tensors_to_same_length function?
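For context, the padding step only has to bring both conditioning tensors to the same token length; a minimal pure-Python sketch of that idea is below. Note this is illustrative only: compel's actual pad_conditioning_tensors_to_same_length operates on torch tensors and pads with an empty-prompt embedding, not an arbitrary row, and the pad_row name here is purely hypothetical.

```python
def pad_to_same_length(a, b, pad_row):
    # a, b: lists of token-embedding rows (each row a list of floats).
    # The shorter list is extended with copies of pad_row so both lists
    # end up with the same number of token positions; existing rows are
    # never modified, so positives and negatives cannot mix here.
    target = max(len(a), len(b))
    def pad(rows):
        return rows + [list(pad_row)] * (target - len(rows))
    return pad(a), pad(b)

# hypothetical shapes: positive spans two 77-token chunks, negative one
positive = [[1.0, 1.0]] * 154
negative = [[0.0, 0.0]] * 77
p, n = pad_to_same_length(positive, negative, [0.5, 0.5])
assert len(p) == len(n) == 154
assert p == positive  # the longer tensor is returned unchanged
```

If the padding itself were correct, a mix-up would have to come from how the padded tensors are consumed downstream rather than from the length adjustment.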

This is my image with a long negative prompt:
[image]

And this is the same seed, same prompt, with no negative:
[image]

Yes, it's likely caused by #59.