protectai / llm-guard

The Security Toolkit for LLM Interactions

Home Page: https://llm-guard.com/


Installation Issues

h0n33badg3rc0d3 opened this issue

Describe the bug
Running pip install llm-guard with Python 3.10.9 and pip 23.2.1 on macOS Ventura 13.6 inside a venv keeps erroring out.

The error message is:

Collecting xformers==0.0.21 (from -r llm-guard/requirements.txt (line 13))
Using cached xformers-0.0.21.tar.gz (22.3 MB)
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [17 lines of output]
Traceback (most recent call last):
File "/.../llm-guard/venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
main()
File "/.../Projects/llm-guard/venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/.../llm-guard/venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
File "/private/var/folders/nd/yvg3hfx11r75f5xbv5wfw4hm0000gq/T/pip-build-env-90386ewn/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 355, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=['wheel'])
File "/private/var/folders/nd/yvg3hfx11r75f5xbv5wfw4hm0000gq/T/pip-build-env-90386ewn/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 325, in _get_build_requires
self.run_setup()
File "/private/var/folders/nd/yvg3hfx11r75f5xbv5wfw4hm0000gq/T/pip-build-env-90386ewn/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 507, in run_setup
super(_BuildMetaLegacyBackend, self).run_setup(setup_script=setup_script)
File "/private/var/folders/nd/yvg3hfx11r75f5xbv5wfw4hm0000gq/T/pip-build-env-90386ewn/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 341, in run_setup
exec(code, locals())
File "<string>", line 23, in <module>
ModuleNotFoundError: No module named 'torch'
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

To Reproduce
Steps to reproduce the behavior:

  1. python3 -m venv venv
  2. Run pip install llm-guard using Python 3.10.9 and pip 23.2.1
  3. See error

I get the same issue installing from source following the instructions here: https://llm-guard.com/installation/#install-from-source
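The root cause is the last line of the traceback: ModuleNotFoundError: No module named 'torch'. Pip builds the xformers 0.0.21 sdist in an isolated environment, and xformers' setup.py imports torch at build time, so the build fails whenever torch is not importable there. A minimal pre-flight check along these lines can confirm which situation you are in (the status labels and the pip install torch suggestion are my own, not from llm-guard's docs):

```shell
# Hedged sketch: check whether torch is importable before letting pip
# trigger the xformers source build, whose setup.py imports torch.
if python3 -c 'import torch' 2>/dev/null; then
  status="torch-present"   # source builds that import torch can proceed
else
  status="torch-missing"   # install torch first, e.g.: pip install torch
fi
echo "$status"
```

If the check reports torch-missing, installing torch into the active environment first and then re-running pip install llm-guard should get past the xformers build step.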

After much more tinkering, I resolved the issue by modifying pyvenv.cfg to set include-system-site-packages = true, as described in:
https://stackoverflow.com/questions/57801495/pip-wont-install-packages-in-virtualenv/57802296#57802296
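The pyvenv.cfg change from that Stack Overflow answer can be sketched like this. The demo-venv directory below is a throwaway stand-in for illustration; in practice you would edit the pyvenv.cfg at the root of your actual venv:

```shell
# Illustrative only: fabricate a minimal pyvenv.cfg to show the edit.
mkdir -p demo-venv
printf 'include-system-site-packages = false\n' > demo-venv/pyvenv.cfg

# Flip the flag so the venv can also see system-wide site-packages
# (where torch may already be installed).
sed -i.bak 's/include-system-site-packages = false/include-system-site-packages = true/' \
  demo-venv/pyvenv.cfg
cat demo-venv/pyvenv.cfg
```

Note that sed -i.bak (with an explicit backup suffix) works on both GNU sed and the BSD sed that ships with macOS.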


It didn't work for me (Python 3.10.0, pip 23.3.2).