damian0815 / compel

A prompting enhancement library for transformers-type text embedding systems

Documentation: `.and()` feature needs `pad_conditioning_tensors_to_same_length()`

hqt98 opened this issue · comments

I was trying out the new `.and()` feature, and it looks like the prompt pieces simply have their embeddings concatenated, so dimension 77 becomes 154 for two pieces, 231 for three pieces, and so on.

This causes `RuntimeError: The size of tensor a (154) must match the size of tensor b (77) at non-singleton dimension 1` during inference.

Is this a bug, or am I using the new feature incorrectly?
Thanks.

I also encountered the same issue.
The error can be avoided by matching the number of negative prompt pieces to the number of positive prompt pieces:

prompt: `("apple", "orange").and()`
negative: `("", "").and()`

Ahh, right: when using `.and()` with negative prompts you also need to do `[cond, negative_cond] = compel.pad_conditioning_tensors_to_same_length([cond, negative_cond])`.

I should probably make the documentation clearer.
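Conceptually, that call right-pads the shorter conditioning tensor up to the longer one's sequence length so both can be fed through the UNet together. A minimal sketch of that idea, using plain nested lists as stand-ins for `[seq_len, dim]` tensors (the padding row and small dimension here are placeholders; the real library's padding behavior may differ):

```python
# Conceptual sketch of padding conditioning tensors to the same length.
# Plain nested lists stand in for [seq_len, dim] embedding tensors.
EMB_DIM = 4                  # real CLIP embeddings are much wider; 4 keeps this small
PAD_ROW = [0.0] * EMB_DIM    # placeholder padding row, not compel's actual pad value

def pad_to_same_length(tensors):
    """Right-pad every [seq_len, dim] list to the longest seq_len."""
    target = max(len(t) for t in tensors)
    return [t + [PAD_ROW] * (target - len(t)) for t in tensors]

cond = [[1.0] * EMB_DIM] * 154   # ("apple", "orange").and() -> 154 rows
neg = [[0.5] * EMB_DIM] * 77     # "" -> 77 rows
cond, neg = pad_to_same_length([cond, neg])
assert len(cond) == len(neg) == 154  # shapes now match for inference
```

After padding, both tensors have the same sequence length, so the shape mismatch at dimension 1 goes away.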