jina-ai / langchain-serve

⚡ Langchain apps in production using Jina & FastAPI

Home Page: https://cloud.jina.ai


Expecting value: line 4 column 15 (char 79)

thomas-yanxin opened this issue · comments

When I tried to use langchain-serve in this project, I ran into the following problem. Could you please help me look into it?

(screenshot of the error attached)

Hi @thomas-yanxin, thanks for trying lcserve. Could you please give us the steps so we can run the lc-serve command ourselves, and copy-paste the curl commands in code format?

@deepankarm
Thanks!
I wrote the script based on this tutorial and performed the following steps (see the sketch of the served module after the commands below):

  1. pip install langchain-serve
  2. lc-serve deploy local jina_serving
  3. run the curl commands:
    1)
curl -X 'POST' \
  'http://120.237.18.56:8080/reinit_model' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "large_language_model": "ChatGLM-6B-int8",
  "embedding_model": "text2vec-base",
}' 

This command was executed successfully!
2)

curl -X 'POST' \
  'http://localhost:8080/predict' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "input": "ChatGLM-6B的具体局限性?",
  " file_path": "./README.md",
   "use_web": True,
   "top_k": 3,
   "history_len": 1,
   "temperature": 0.01,
   "top_p": 0.1,
   "history": []
}'

but this one failed, and the error shown above appeared.
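
For context, here is a minimal sketch of what the jina_serving module might look like, assuming the two endpoints are plain functions exposed with lcserve's @serving decorator. The function and parameter names are taken from the curl payloads above; the bodies are placeholders, not the project's actual implementation.

# jina_serving.py -- hypothetical sketch, not the project's real module
from lcserve import serving


@serving
def reinit_model(large_language_model: str, embedding_model: str) -> str:
    # Placeholder: reload the requested LLM and embedding model here.
    return f"reloaded {large_language_model} / {embedding_model}"


@serving
def predict(
    input: str,
    file_path: str,
    use_web: bool,
    top_k: int,
    history_len: int,
    temperature: float,
    top_p: float,
    history: list,
) -> str:
    # Placeholder: run the retrieval + ChatGLM pipeline and return the answer.
    return f"answer for: {input}"

With lc-serve deploy local jina_serving, each decorated function should then be exposed as a POST endpoint (e.g. /predict) whose JSON body fields map to the function's keyword arguments, which is why the payload keys mirror the parameter names.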

Can you please change True to true (JSON booleans must be lowercase) and retry?

curl -X 'POST' \
  'http://localhost:8080/predict' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "input": "ChatGLM-6B的具体局限性?",
  " file_path": "./README.md",
   "use_web": true,
   "top_k": 3,
   "history_len": 1,
   "temperature": 0.01,
   "top_p": 0.1,
   "history": []
}'
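
For reference, the "Expecting value" error is the server-side JSON parser rejecting the Python-style literal True, which is not valid JSON; parsing stops exactly where a value was expected. A quick way to reproduce the same message locally with plain Python (no langchain-serve needed):

import json

bad = '{"use_web": True}'   # Python-style boolean -> invalid JSON
good = '{"use_web": true}'  # lowercase boolean -> valid JSON

try:
    json.loads(bad)
except json.JSONDecodeError as e:
    # Prints something like: Expecting value: line 1 column 13 (char 12)
    print(e)

print(json.loads(good))  # {'use_web': True}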

OK, I will try it now and report the result back here! Thanks!

Success! Thanks!