bentoml / OpenLLM

Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.

Home Page: https://bentoml.com

bug: TypeError attribute name must be string, not 'NoneType'

CHesketh76 opened this issue · comments

Describe the bug

This happens when I run openllm. I am using a fresh Ubuntu install with a fresh venv.

To reproduce

1. Create a venv
2. `pip install openllm`
3. Run: `OPENLLMDEVDEBUG=3 openllm start ../text-generation-webui/models/TheBloke_Mistral-7B-Instruct-v0.2-AWQ --quantize awq`
   (or: `OPENLLMDEVDEBUG=3 openllm start microsoft/phi-2`)
4. Error

Logs

Make sure to check out '../text-generation-webui/models/TheBloke_Mistral-7B-Instruct-v0.2-AWQ' repository to see if the weights is in 'safetensors' format if unsure.
Tip: You can always fallback to '--serialisation legacy' when running quantisation.
It is recommended to specify the backend explicitly. Cascading backend might lead to unexpected behaviour.
Traceback (most recent call last):
  File "/home/bear/Jupyter/openllm/bin/openllm", line 8, in <module>
    sys.exit(cli())
             ^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/openllm_cli/entrypoint.py", line 160, in wrapper
    return_value = func(*args, **attrs)
                   ^^^^^^^^^^^^^^^^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/click/decorators.py", line 33, in new_func
    return f(get_current_context(), *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/openllm_cli/entrypoint.py", line 141, in wrapper
    return f(*args, **attrs)
           ^^^^^^^^^^^^^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/openllm_cli/entrypoint.py", line 366, in start_command
    llm = openllm.LLM(
          ^^^^^^^^^^^^
  File "/home/bear/Jupyter/openllm/lib/python3.11/site-packages/openllm/_llm.py", line 205, in __init__
    quantise=getattr(self._Quantise, backend)(self, quantize),
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: attribute name must be string, not 'NoneType'
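The last traceback frame shows `getattr(self._Quantise, backend)` being called while `backend` is still `None`, because no `--backend` was specified or inferred. A minimal sketch of that failure mode (class and method names here are hypothetical stand-ins, not OpenLLM's actual internals):

```python
# Hypothetical stand-in for OpenLLM's per-backend quantisation lookup.
class _Quantise:
    @staticmethod
    def pt(llm, quantize):
        return quantize

backend = None  # never set: no --backend flag, no successful inference

try:
    # getattr() requires a string attribute name, so a None backend
    # fails here before any clearer OpenLLM error can be raised.
    getattr(_Quantise, backend)
except TypeError as e:
    print(e)  # attribute name must be string, not 'NoneType'
```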

Environment

bentoml: 1.1.11
transformers: 4.37.2
python: 3.11
platform: Ubuntu 22

System information (Optional)

No response

Have you solved this? I'm getting the same error.

Nope; my only fix was to drop OpenLLM and switch to vLLM.

You just need to pass a backend explicitly:
`TRUST_REMOTE_CODE=True openllm start microsoft/phi-2 --backend pt`
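Passing `--backend pt` works because it makes `backend` a string before the `getattr` lookup. A defensive guard in the library could turn the opaque `TypeError` into an actionable message; this is a hypothetical sketch (the function and names are illustrative, not OpenLLM's actual patch):

```python
def resolve_quantise(quantise_cls, backend, llm, quantize):
    """Look up the backend-specific quantise helper on quantise_cls.

    Hypothetical guard: raise a clear error when the backend was never
    specified or inferred, instead of letting getattr() fail with
    "attribute name must be string, not 'NoneType'".
    """
    if backend is None:
        raise ValueError(
            "No backend specified; pass --backend explicitly (e.g. 'pt' or 'vllm')."
        )
    return getattr(quantise_cls, backend)(llm, quantize)
```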