jina-ai / clip-as-service

🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP

Home Page: https://clip-as-service.jina.ai

Running Docker image with customized yaml doesn't work

pasa13142 opened this issue · comments

Hi, thanks for this great repo!

If I try to run clip-as-service in Docker with
docker run -p 51009:51000 -v $HOME/.cache:/home/cas/.cache --gpus all jinaai/clip-server:master-onnx
everything is okay: it works on the GPU with 1 replica.

I would like to run this same Docker image with more replicas, so I created a custom yaml and ran:
python -m clip_server onnx-flow-custom.yml
and also tried:
cat onnx-flow-custom.yml | docker run -i -p 51009:51000 -v $HOME/.cache:/home/cas/.cache --gpus all jinaai/clip-server:master-onnx -i

All I want is to use more replicas in it instead of 1.


jtype: Flow
version: '1'
with:
  port: 51000
executors:
  - name: clip_t
    replicas: 4
    uses:
      jtype: CLIPEncoder
      with: 
        device: cuda
      metas:
        py_modules:
          - clip_server.executors.clip_onnx

But whether I use the yaml above or even the original yaml, like:

jtype: Flow
version: '1'
with:
  port: 51000
executors:
  - name: clip_o
    uses:
      jtype: CLIPEncoder
      metas:
        py_modules:
          - clip_server.executors.clip_onnx
    timeout_ready: 3000000
    replicas: 1

neither works: no GPU, not even the replicas. Whatever I do, it prints a full page of the warning below. The default docker image run command doesn't throw this DeprecationWarning message.

DeprecationWarning: tostring() is deprecated. Use tobytes() instead. (raised from /usr/local/lib/python3.8/dist-packages/onnxconverter_common/float16.py:95)
DeprecationWarning: The binary mode of fromstring is deprecated, as it behaves surprisingly on unicode inputs. Use frombuffer instead (raised from /usr/local/lib/python3.8/dist-packages/onnxconverter_common/float16.py:91)
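
(As an aside: these DeprecationWarnings come from onnxconverter_common, not from my yaml. I could presumably hide them with Python's standard warning filter, passed in via docker run -e, but that wouldn't bring back the GPU or the replicas.)

# Hedged aside, not a fix: silence the onnxconverter_common DeprecationWarnings
# via Python's standard PYTHONWARNINGS filter; the real problem (ignored GPU
# and replica settings) remains.
cat onnx-flow-custom.yml | docker run -i -p 51009:51000 \
  -v $HOME/.cache:/home/cas/.cache \
  -e PYTHONWARNINGS="ignore::DeprecationWarning" \
  --gpus all jinaai/clip-server:master-onnx -i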

I just want to use the original master-onnx image with more replicas. Can anyone help?

Could you try this:

docker run -i -p 51009:51000 -v $HOME/.cache:/home/cas/.cache -v $PWD:/cas/ --gpus all jinaai/clip-server:master-onnx onnx-flow-custom.yml

It didn't work either. It throws a full page of the same warnings as the methods I described above. The default docker image run command doesn't throw this DeprecationWarning message.

DeprecationWarning: tostring() is deprecated. Use tobytes() instead. (raised from /usr/local/lib/python3.8/dist-packages/onnxconverter_common/float16.py:95)
DeprecationWarning: The binary mode of fromstring is deprecated, as it behaves surprisingly on unicode inputs. Use frombuffer instead (raised from /usr/local/lib/python3.8/dist-packages/onnxconverter_common/float16.py:91)

@pasa13142 There is an issue with the clip_server ONNX docker image; we are fixing it and will let you know of any progress. Sorry for the inconvenience! In the meantime you can use the clip_server torch image or serve from source code to unblock your work.

Hi @pasa13142

The issue is fixed and the PR is merged. Please pull the latest jinaai/clip-server:master-onnx docker image and try again.

Notice that we also updated the instructions on how to serve the docker container here. The correct way to run an ONNX docker container using the default config is:
docker run -p 51009:51000 -v $HOME/.cache:/home/cas/.cache --gpus all jinaai/clip-server:master-onnx onnx-flow.yml
(If you don't add onnx-flow.yml, it uses the default yaml, which runs in PyTorch mode.)
And using a custom config works the same way as before:
cat my.yml | docker run -i -p 51009:51000 -v $HOME/.cache:/home/cas/.cache --gpus all jinaai/clip-server:master-onnx -i
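
For completeness, here is a minimal sketch of the custom-config route applied to the replica question that started this thread. The file name and replica count are placeholders; the yaml simply mirrors the one posted above.

# Sketch only: write a custom config with several replicas and pipe it into the
# fixed master-onnx image. my-onnx-flow.yml and replicas: 2 are placeholders.
cat > my-onnx-flow.yml <<'EOF'
jtype: Flow
version: '1'
with:
  port: 51000
executors:
  - name: clip_o
    replicas: 2
    uses:
      jtype: CLIPEncoder
      with:
        device: cuda
      metas:
        py_modules:
          - clip_server.executors.clip_onnx
EOF
cat my-onnx-flow.yml | docker run -i -p 51009:51000 -v $HOME/.cache:/home/cas/.cache --gpus all jinaai/clip-server:master-onnx -i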

Thanks for reporting this issue 🍻

Hello, can I have the clip-server:master-onnx Dockerfile?

It's in the repo:
Dockerfiles/server.Dockerfile

After building the image with:

git clone https://github.com/jina-ai/clip-as-service.git
docker build . -f Dockerfiles/server.Dockerfile  --build-arg GROUP_ID=$(id -g ${USER}) --build-arg USER_ID=$(id -u ${USER}) -t jinaai/clip-server

and running it with both:

cat server/clip_server/onnx-flow.yml | docker run -i -p 51009:51000 -v $HOME/.cache:/home/cas/.cache --gpus all clip/clon:latest -i

and

docker run -p 51009:51000 -v $HOME/.cache:/home/cas/.cache --gpus all clip/clon:latest onnx-flow.yml

They throw the same error:

ERROR  clip_o/rep-0@24 fail to load file dependency                              [04/13/23 16:22:41]
ERROR  clip_o/rep-0@24 FileNotFoundError('can not find                           [04/13/23 16:22:41]
       clip_server.executors.clip_onnx') during
       <class 'jina.serve.runtimes.worker.WorkerRuntime'> initialization
       add "--quiet-error" to suppress the exception details
       Traceback (most recent call last):
         File "/usr/local/lib/python3.8/dist-packages/jina/orchest…", line 79, in run
           runtime = runtime_cls(
         File "/usr/local/lib/python3.8/dist-packages/jina/serve/r…", line 41, in __init__
           super().__init__(args, **kwargs)
         File "/usr/local/lib/python3.8/dist-packages/jina/serve/r…", line 78, in __init__
           self._loop.run_until_complete(self.async_setup())
         File "/usr/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
           return future.result()
         File "/usr/local/lib/python3.8/dist-packages/jina/serve/r…", line 106, in async_setup
           self._request_handler = WorkerRequestHandler(
         File "/usr/local/lib/python3.8/dist-packages/jina/serve/r…", line 53, in __init__
           self._load_executor(
         File "/usr/local/lib/python3.8/dist-packages/jina/serve/r…", line 203, in _load_executor
           self._executor: BaseExecutor = BaseExecutor.load_config(
         File "/usr/local/lib/python3.8/dist-packages/jina/jaml/__…", line 742, in load_config
           load_py_modules(
         File "/usr/local/lib/python3.8/dist-packages/jina/jaml/he…", line 285, in load_py_modules
           PathImporter.add_modules(*mod)
         File "/usr/local/lib/python3.8/dist-packages/jina/importe…", line 161, in add_modules
           _path_import(complete_path(m))
         File "/usr/local/lib/python3.8/dist-packages/jina/jaml/he…", line 229, in complete_path
           raise FileNotFoundError(f'can not find {path}')
       FileNotFoundError: can not find clip_server.executors.clip_onnx
ERROR  Flow@ 1 Flow is aborted due to ['clip_o'] can not be started.              [04/13/23 16:22:41]
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.8/dist-packages/clip_server/__main__.py", line 25, in <module>
    with f:
  File "/usr/local/lib/python3.8/dist-packages/jina/orchestrate/orchestrator.py", line 14, in __enter__
    return self.start()
  File "/usr/local/lib/python3.8/dist-packages/jina/orchestrate/flow/builder.py", line 33, in arg_wrapper
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/jina/orchestrate/flow/base.py", line 1788, in start
    self._wait_until_all_ready()
  File "/usr/local/lib/python3.8/dist-packages/jina/orchestrate/flow/base.py", line 1927, in _wait_until_all_ready
    raise RuntimeFailToStart
jina.excepts.RuntimeFailToStart

You should use PIP_TAG=onnx in the Dockerfile to install the extra ONNX dependencies. ref: here and here
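
With that, the build command from earlier in this thread would become something like the following sketch (assuming PIP_TAG is exposed as a build argument in Dockerfiles/server.Dockerfile, as the references above suggest):

# Sketch: rebuild the server image with the extra ONNX dependencies by passing
# PIP_TAG=onnx as a build argument; the other build args are as used above.
git clone https://github.com/jina-ai/clip-as-service.git
cd clip-as-service
docker build . -f Dockerfiles/server.Dockerfile \
  --build-arg PIP_TAG=onnx \
  --build-arg GROUP_ID=$(id -g ${USER}) \
  --build-arg USER_ID=$(id -u ${USER}) \
  -t jinaai/clip-server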

@ZiniuYu Hello again!

Do you know how I can get the Dockerfile for "clip-server:latest-onnx"? Pulling the repo, then building and running it, does not give me the latest version and produces different encoding values.