marella / ctransformers

Python bindings for the Transformer models implemented in C/C++ using the GGML library.

[Question] Running CTransformers on an Oracle Linux server hits an error with libctransformers.so

guanw opened this issue


Hi folks, I'm trying to run a script that does some Q&A with a pretrained model loaded via CTransformers.

I'm able to run the code successfully on macOS, but it breaks when deployed on a Linux server (in this case, Oracle Linux).

The stack trace I get is:

python /home/opc/fine_tuned/fine_tune.py
/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/langchain/__init__.py:34: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain.prompts.PromptTemplate instead.
  warnings.warn(
loaded all libs!
load and prepare sample training data!
Fetching 1 files: 100%|███████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 18001.30it/s]
Fetching 1 files: 100%|████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 3675.99it/s]
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Traceback (most recent call last):
  File "/home/opc/fine_tuned/fine_tune.py", line 46, in <module>
    llm=CTransformers(model="TheBloke/Llama-2-7B-Chat-GGML",
  File "/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/langchain/load/serializable.py", line 97, in __init__
    super().__init__(**kwargs)
  File "/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/pydantic/v1/main.py", line 339, in __init__
    values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
  File "/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/pydantic/v1/main.py", line 1102, in validate_model
    values = validator(cls_, values)
  File "/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/langchain/llms/ctransformers.py", line 72, in validate_environment
    values["client"] = AutoModelForCausalLM.from_pretrained(
  File "/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/ctransformers/hub.py", line 175, in from_pretrained
    llm = LLM(
  File "/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/ctransformers/llm.py", line 246, in __init__
    self._lib = load_library(lib, gpu=config.gpu_layers > 0)
  File "/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/ctransformers/llm.py", line 126, in load_library
    lib = CDLL(path)
  File "/home/opc/miniconda3/envs/myenv/lib/python3.8/ctypes/__init__.py", line 373, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/ctransformers/lib/basic/libctransformers.so: cannot open shared object file: No such file or directory
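
For what it's worth, the failure can be reproduced without langchain at all: the same CDLL call that ctransformers' load_library makes fails on its own.

# Minimal reproduction of the dlopen failure, bypassing langchain and ctransformers.
from ctypes import CDLL

path = "/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/ctransformers/lib/basic/libctransformers.so"
CDLL(path)  # raises the same OSError on this machine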

Here are the things I've tried so far:

  1. I made sure the Python version used in the virtual environment (conda, in this case) is correct:
(myenv) [opc@gpt-server fine_tuned]$ python --version
Python 3.8.18
  2. I made sure the .so file exists and has read permission:
(myenv) [opc@gpt-server fine_tuned]$ ls -l /home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/ctransformers/lib/basic/libctransformers.so
-rw-rw-r--. 1 opc opc 1482416 Nov 19 16:28 /home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/ctransformers/lib/basic/libctransformers.so
  3. I made sure the LD_LIBRARY_PATH used by the dynamic linker/loader includes the path, and I ran sudo ldconfig just in case (a further check on the library's architecture is sketched below the list):
(myenv) [opc@gpt-server fine_tuned]$ echo $LD_LIBRARY_PATH
/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/ctransformers/lib/basic:/usr/local/lib
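
Since llm.py passes the absolute path to CDLL, LD_LIBRARY_PATH shouldn't matter for locating the file itself, only for resolving its dependencies. One thing I haven't been able to rule out is an architecture mismatch, i.e. the wheel shipping an x86-64 binary. Here is a rough diagnostic sketch (machine codes per the ELF spec) to inspect which CPU the .so was built for:

# Diagnostic sketch: read e_machine from the ELF header to see which CPU
# the shared object targets. 0x3e = x86-64, 0xb7 = AArch64.
import struct

path = "/home/opc/miniconda3/envs/myenv/lib/python3.8/site-packages/ctransformers/lib/basic/libctransformers.so"
with open(path, "rb") as f:
    header = f.read(20)

assert header[:4] == b"\x7fELF", "not an ELF file"
(e_machine,) = struct.unpack_from("<H", header, 18)
print(hex(e_machine))  # 0xb7 would match this aarch64 machine; 0x3e would mean x86-64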

I wonder if anyone knows what's causing this, or whether the library simply isn't supported on the aarch64 architecture. Thanks in advance for any help!

Some system attributes:

$ uname -m
aarch64
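
For completeness, the failure also reproduces with ctransformers alone, no langchain involved (model_type="llama" is my assumption for this GGML checkpoint):

# Minimal ctransformers-only reproduction; fails in load_library with the same OSError.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Llama-2-7B-Chat-GGML",  # same model as in the traceback
    model_type="llama",  # assumption on my part for this repo
)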