Invalid blend syntax raises an error instead of falling back to treating the prompt as a raw prompt
psychedelicious opened this issue
compel 2.0.2, on InvokeAI main
Example prompt: ("polar bear","james bond").blend(1,1) painting
Traceback (most recent call last):
File "/home/bat/Documents/Code/InvokeAI/invokeai/app/services/processor.py", line 106, in __process
outputs = invocation.invoke_internal(
File "/home/bat/Documents/Code/InvokeAI/invokeai/app/invocations/baseinvocation.py", line 610, in invoke_internal
output = self.invoke(context)
File "/home/bat/invokeai/.venv/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/bat/Documents/Code/InvokeAI/invokeai/app/invocations/compel.py", line 119, in invoke
conjunction = Compel.parse_prompt_string(self.prompt)
File "/home/bat/invokeai/.venv/lib/python3.10/site-packages/compel/compel.py", line 158, in parse_prompt_string
conjunction = pp.parse_conjunction(prompt_string)
File "/home/bat/invokeai/.venv/lib/python3.10/site-packages/compel/prompt_parser.py", line 330, in parse_conjunction
root = self.conjunction.parse_string(prompt)
File "/home/bat/invokeai/.venv/lib/python3.10/site-packages/pyparsing/core.py", line 1141, in parse_string
raise exc.with_traceback(None)
pyparsing.exceptions.ParseException: Expected {explicit_conjunction | {[Group:({lora}...)] {blend | Group:([{cross_attention_substitute | lora | attention | Forward: string enclosed in '"' | parenthesized_fragment | free_word | Suppress:(<SP><TAB><CR><LF>)}]...)} [Group:({lora}...)] StringEnd}}, found 'painting' (at char 41), (line:1, col:42)
I would expect this to fall back to treating the prompt as a raw prompt rather than raising.
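A minimal sketch of the requested fallback behaviour. This is an assumption about how it could work, not compel's actual API: `parse_strict` and its toy grammar are hypothetical stand-ins for `Compel.parse_prompt_string`, which (per the traceback) raises `pyparsing.exceptions.ParseException` on prompts like the one above.

```python
class ParseException(Exception):
    """Stand-in for pyparsing.exceptions.ParseException (hypothetical)."""


def parse_strict(prompt: str):
    # Toy grammar standing in for compel's parser: a .blend(...) call must
    # end the prompt, so trailing text like "painting" is a parse error.
    if ".blend(" in prompt and not prompt.rstrip().endswith(")"):
        raise ParseException(f"unexpected text after blend in {prompt!r}")
    return ("parsed", prompt)


def parse_with_fallback(prompt: str):
    """On a parse error, fall back to treating the prompt as raw text."""
    try:
        return parse_strict(prompt)
    except ParseException:
        # Fallback requested in this issue: swallow the parse error and
        # hand the whole string back as a single raw prompt.
        return ("raw", prompt)
```

Under this sketch, `parse_with_fallback('("polar bear","james bond").blend(1,1) painting')` would return the prompt tagged as raw instead of propagating the exception up through `compel.py`.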