protectai / llm-guard

The Security Toolkit for LLM Interactions

Home Page: https://llm-guard.com/


Installation Error for macOS

udm17 opened this issue · comments

Having some trouble installing the package with the updated requirements.txt file, using both pip and a fork of the repo. The error occurs while building the wheel for xformers:

```
clang: error: unsupported option '-fopenmp'
ninja: build stopped: subcommand failed
```

Python 3.9.17
pip 23.2.1
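For context, a note not from the thread itself: pip only falls back to compiling xformers from the source tarball when no prebuilt wheel matches your platform, and Apple's bundled clang does not support `-fopenmp`, which is what kills the compile step. One way to check whether pip has a wheel it could use instead (the version pin mirrors the requirements file discussed later in this thread) is:

```shell
# Ask pip for a prebuilt wheel only; if none exists for this platform and
# Python version, this fails loudly instead of falling back to the sdist.
pip download xformers==0.0.22 --only-binary :all: --dest /tmp/wheels
```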

Hey @udm17,
Thanks for following up. Please try installing it inside a venv. Alternatively, you can try this method: #36 (comment)

Hope it works. Please let me know.
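For reference, a minimal sketch of the venv route suggested above (the environment name is illustrative):

```shell
# Create and activate an isolated environment with the stock venv module,
# then install llm-guard into it.
python -m venv llm-guard-env
source llm-guard-env/bin/activate
pip install --upgrade pip
pip install llm-guard
```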

I am having the same issue. I tried this solution, but with no success.

I'm trying to install with `pip install llm-guard` in a conda environment:

conda 4.10.3
Python 3.9.17
pip 23.2.1

As a follow-up, I updated my Python to version 3.11.5 and now it's working. I also had to install torch.

Yeah, installing torch first is how we solve it in the Docker build of the API.

I am using 3.11.5 with pyenv and installed torch separately, but I'm still getting the same error:

```
Collecting torch==2.0.1 (from -r requirements.txt (line 12))
  Using cached torch-2.0.1-cp311-none-macosx_10_9_x86_64.whl (143.1 MB)
Collecting transformers==4.36.1 (from -r requirements.txt (line 13))
  Using cached transformers-4.36.1-py3-none-any.whl.metadata (126 kB)
Collecting xformers==0.0.22 (from -r requirements.txt (line 14))
  Using cached xformers-0.0.22.tar.gz (3.9 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "/Users/nara/.pyenv/versions/guard/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/Users/nara/.pyenv/versions/guard/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/Users/nara/.pyenv/versions/guard/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "/private/var/folders/68/122823fj3yq0c4kqtsvtpzlh0000gn/T/pip-build-env-bzd4mkf5/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 325, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/private/var/folders/68/122823fj3yq0c4kqtsvtpzlh0000gn/T/pip-build-env-bzd4mkf5/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 295, in _get_build_requires
          self.run_setup()
        File "/private/var/folders/68/122823fj3yq0c4kqtsvtpzlh0000gn/T/pip-build-env-bzd4mkf5/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 480, in run_setup
          super(_BuildMetaLegacyBackend, self).run_setup(setup_script=setup_script)
        File "/private/var/folders/68/122823fj3yq0c4kqtsvtpzlh0000gn/T/pip-build-env-bzd4mkf5/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 311, in run_setup
          exec(code, locals())
        File "<string>", line 23, in <module>
      ModuleNotFoundError: No module named 'torch'
```

Hey @nara, you can try installing torch first by running `pip install torch`.

Additionally, in our Slack there was another way to resolve the xformers issue: `pip install llm-guard --no-build-isolation`.
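The traceback above explains why both suggestions are plausible: xformers' setup.py imports torch at build time, but pip normally builds sdists in a throwaway isolated environment that contains only the declared build requirements, so the torch installed in your environment is invisible to it. A sketch of the two options as stated:

```shell
# Option 1: pre-install torch so it's importable when xformers builds.
pip install torch
pip install llm-guard

# Option 2: disable build isolation so the build sees your environment
# (which must then already provide the build-time dependencies itself).
pip install llm-guard --no-build-isolation
```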

@asofter thank you for the reply. I tried both, but no luck.
With `--no-build-isolation` I get a new error, `error: invalid command 'bdist_wheel'`, so I ran `pip install wheel`;
then it goes back to the same torch error.

For this error, you can try `pip install wheel` (see https://llm-guard.com/get_started/installation/#using-pip).

Tried it. It works when I create a venv with conda (Python 3.11.5), but fails when I create one with pyenv.

Can you please let me know what you are using for the venv?

I use venv for that.

I can confirm that on macOS, if you install torch, then wheel, and finally run `pip install llm-guard --no-build-isolation`, it will work.
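Condensed into one sequence, the recipe confirmed above looks like this (the environment name and env manager are illustrative; conda reportedly works as well):

```shell
python -m venv llm-guard-env && source llm-guard-env/bin/activate
pip install torch    # imported by xformers' setup.py at build time
pip install wheel    # provides the bdist_wheel command the build invokes
pip install llm-guard --no-build-isolation
```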