sugarforever / chat-ollama

ChatOllama is an open-source chatbot based on LLMs. It supports a wide range of language models and knowledge base management.

Error during rerank of retrieved data

xiaolvtongxue-zt opened this issue · comments

Hi, when asking questions against the knowledge base, document retrieval appears to work, but the rerank step fails.
The log from chat_data-chatollama-1 looks like this:

Knowledge base file created with ID:  2
URL: /api/models/chat User: null
Chat with knowledge base with id:  1
Knowledge base 新闻数据 with embedding "nomic-embed-text:latest"
Creating embeddings for Ollama served model: nomic-embed-text:latest
Creating Chroma vector store
Initializing ParentDocumentRetriever with RedisDocstore
Redis client options:  { host: 'redis', port: 6379, username: undefined, password: undefined }
Chat with OpenAI, host: http://192.168.0.74:7869/v1
User query:  嫦娥六号是第几次实现月球轨道交会对接?
Reformulated query:  嫦娥六号是第几次实现月球轨道交会对接?
Relevant documents:  [ Document {
    pageContent:
     '{"data":null,"code":451,"name":"SecurityCompromiseError","status":45102,"message":"Domain www.sohu.com blocked until Fri Jun 07 2024 03:59:17 GMT+0000 (Coordinated Universal Time) due to previous abuse found on http://www.sohu.com/a/784279579_461392: DDoS attack suspected: Too many requests: 2001","readableMessage":"SecurityCompromiseError: Domain www.sohu.com blocked until Fri Jun 07 2024 03:59:17 GMT+0000 (Coordinated Universal Time) due to previous abuse found on http://www.sohu.com/a/784279579_461392: DDoS attack suspected: Too many requests: 2001"}',
    metadata: { source: 'https://www.sohu.com/a/784271789_114988', title: '', loc: [Object] } },
  Document {
    pageContent:
     '{"data":null,"code":451,"name":"SecurityCompromiseError","status":45102,"message":"Domain www.sohu.com blocked until Fri Jun 07 2024 03:59:17 GMT+0000 (Coordinated Universal Time) due to previous abuse found on http://www.sohu.com/a/784279579_461392: DDoS attack suspected: Too many requests: 2001","readableMessage":"SecurityCompromiseError: Domain www.sohu.com blocked until Fri Jun 07 2024 03:59:17 GMT+0000 (Coordinated Universal Time) due to previous abuse found on http://www.sohu.com/a/784279579_461392: DDoS attack suspected: Too many requests: 2001"}',
    metadata: { source: 'https://www.sohu.com/?strategyid=00001 ', title: '', loc: [Object] } },
  Document {
    pageContent:
     '{"data":null,"code":451,"name":"SecurityCompromiseError","status":45102,"message":"Domain www.sohu.com blocked until Fri Jun 07 2024 03:59:17 GMT+0000 (Coordinated Universal Time) due to previous abuse found on http://www.sohu.com/a/784279579_461392: DDoS attack suspected: Too many requests: 2001","readableMessage":"SecurityCompromiseError: Domain www.sohu.com blocked until Fri Jun 07 2024 03:59:17 GMT+0000 (Coordinated Universal Time) due to previous abuse found on http://www.sohu.com/a/784279579_461392: DDoS attack suspected: Too many requests: 2001"}',
    metadata: { source: 'https://www.sohu.com/a/784271789_114988', title: '', loc: [Object] } },
  Document {
    pageContent:
     '{"data":null,"path":"url","code":400,"name":"ParamValidationError","status":40001,"message":"Invalid protocol about:","readableMessage":"ParamValidationError(url): Invalid protocol about:"}',
    metadata: { source: 'about:blank#comment_area', title: '', loc: [Object] } } ]
Cohere Rerank Options:  { apiKey: 'xxxxx',
  baseUrl: 'http://peanutshell:8000/v1',
  model: 'ms-marco-MiniLM-L-6-v2',
  topN: 4 }
 ERROR  [nuxt] [request error] [unhandled] [500] Status code: 500
Body: "Internal Server Error"
  at CohereClient.<anonymous> (./node_modules/.pnpm/cohere-ai@7.9.3/node_modules/cohere-ai/Client.js:481:27)  
  at Generator.next (<anonymous>)  
  at fulfilled (./node_modules/.pnpm/cohere-ai@7.9.3/node_modules/cohere-ai/Client.js:31:58)  
  at process.processTicksAndRejections (node:internal/process/task_queues:95:5)

The log from [chat_data-peanutshell-1] (the peanut-shell rerank service) looks like this:

  File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 507, in send
    raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /cross-encoder/ms-marco-MiniLM-L-6-v2/resolve/main/config.json (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7f5c5f89ac90>, 'Connection to huggingface.co timed out. (connect timeout=10)'))"), '(Request ID: 8539d890-1012-4a77-9ec4-e449865ce6f7)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/transformers/utils/hub.py", line 399, in cached_file
    resolved_file = hf_hub_download(
                    ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1221, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1325, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1826, in _raise_on_head_call_error
    raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 193, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/concurrency.py", line 42, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/endpoints/__init__.py", line 24, in rerank
    service = CrossEncoderRerankService(modelName=model_name)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/services/cross_encoder/cross_encoder_rerank_service.py", line 17, in __init__
    self.cross_encoder = CrossEncoder(model_name=modelName, local_files_only=False)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/sentence_transformers/cross_encoder/CrossEncoder.py", line 72, in __init__
    self.config = AutoConfig.from_pretrained(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 934, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/configuration_utils.py", line 689, in _get_config_dict
    resolved_config_file = cached_file(
                           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/utils/hub.py", line 442, in cached_file
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like cross-encoder/ms-marco-MiniLM-L-6-v2 is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

My question is:

  1. Why does the model need to be downloaded from Hugging Face again? The image we pulled is about 8 GB — doesn't it already include the model? Can this be fixed? For example: if the model is already in the image, how do I change the path it is loaded from? If it isn't, can I download it manually and place it in some designated path?

@xiaolvtongxue-zt How did you end up resolving this?


The 8 GB peanut-shell image does not itself contain the model files. You can download the model manually and place it at the corresponding path inside the mounted Docker volume. The latest peanut-shell image supports loading from an already-downloaded model path, avoiding a repeated download.

As for why the peanut-shell image is as large as 8 GB, I'm curious too; I haven't had time yet to look into shrinking it.
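A sketch of what that mount might look like in a `docker-compose.yml` (the service and image names and both paths here are illustrative, not taken from the project's actual compose file):

```yaml
services:
  peanutshell:
    image: sugarforever/peanut-shell   # illustrative image name
    volumes:
      # Host directory holding the manually downloaded model files,
      # mounted at the path the service loads models from.
      - ./models/ms-marco-MiniLM-L-6-v2:/models/ms-marco-MiniLM-L-6-v2
```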

This approach works: sugarforever/peanut-shell#6 (comment)
Point to a Hugging Face mirror, or go through a proxy.
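In practice that means setting the relevant environment variables before `transformers`/`huggingface_hub` are imported. The mirror and proxy URLs below are placeholders; substitute whichever endpoint is reachable from your network:

```python
import os

# Alternative Hub endpoint: huggingface_hub reads HF_ENDPOINT and sends all
# download requests there instead of https://huggingface.co.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"  # example mirror URL

# ...or keep the default endpoint and route traffic through a proxy instead:
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:7890"  # hypothetical proxy
```

Either variable can equally be set in the container's environment (e.g. under `environment:` in docker-compose) rather than in code.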