acheong08 / ChatGPTProxy

Simple Cloudflare bypass for ChatGPT

401 Unauthorized/422 Unprocessable Entity

Louvivien opened this issue · comments

Hi,

Thanks for this code.

I deployed it on a server and then added this:
export ACCESS_TOKEN="XXX"
export PUID="XXXX"

I make this request:
curl -X POST https://chatgpt-proxy-v4-2yszjm3tda-od.a.run.app/api/conversation -H "Content-Type: application/json" -d '{"query": "Hello, how are you?", "model": "gpt-3.5-turbo", "logprobs": 10}'

I get this response:

401 Unauthorized
{
    "detail": {
        "message": "Unauthorized - Access token is missing"
    }
}

If I add the access token:

curl --location 'https://chatgpt-proxy-v4-2yszjm3tda-od.a.run.app/api/conversation' \
--header 'Content-Type: application/json' \
--header 'Authorization: access_token' \
--data '{"query": "Hello, how are you?", "model": "gpt-3.5-turbo", "logprobs": 10}'

I get this:

422 Unprocessable Entity
{
    "detail": [
        {
            "loc": [
                "body",
                "action"
            ],
            "msg": "field required",
            "type": "value_error.missing"
        }
    ]
}

What am I doing wrong?

Looks like you're trying to use this like the official API.

You're looking for https://github.com/acheong08/ChatGPT-to-API, I think.

I am trying to connect without getting blocked by Cloudflare; how can I do that?

> I am trying to connect without getting blocked by Cloudflare; how can I do that?

This repo does exactly that. The problem is that your request body is wrong.

I'm not sure where you're getting

{"query": "Hello, how are you?", "model": "gpt-3.5-turbo", "logprobs": 10}

from

This is what a ChatGPT request tends to look like:

{
	"action": "next",
	"messages": [
		{
			"id": "cd465ab7-3ee4-40e7-8e48-ade9926ad68e",
			"role": "user",
			"content": {
				"content_type": "text",
				"parts": [
					"What is 4x^2-4x+2=0"
				]
			}
		}
	],
	"parent_message_id": "572aca1b-59e5-4262-85b6-b258fa5a38b8",
	"model": "gpt-4-plugins",
	"plugin_ids":["plugin-d1d6eb04-3375-40aa-940a-c2fc57ce0f51"]
}
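
For reference, here is a rough Python sketch of sending a body in that shape through the proxy. The URL and the ACCESS_TOKEN placeholder are the ones from earlier in this thread; whether the Authorization header wants a bare token or a "Bearer" prefix isn't confirmed here, so adjust it if you still get a 401.

# Rough sketch (not from the repo): send the request body above through the proxy.
# The URL and ACCESS_TOKEN placeholder come from earlier in this thread; the
# Authorization format (bare token vs. "Bearer <token>") is an assumption.
import uuid

import requests

PROXY_URL = "https://chatgpt-proxy-v4-2yszjm3tda-od.a.run.app/api/conversation"
ACCESS_TOKEN = "XXX"  # your ChatGPT access token

payload = {
    "action": "next",
    "messages": [
        {
            "id": str(uuid.uuid4()),  # any fresh UUID
            "role": "user",
            "content": {"content_type": "text", "parts": ["What is 4x^2-4x+2=0"]},
        }
    ],
    "parent_message_id": str(uuid.uuid4()),
    "model": "gpt-4-plugins",
    "plugin_ids": ["plugin-d1d6eb04-3375-40aa-940a-c2fc57ce0f51"],
}

resp = requests.post(
    PROXY_URL,
    headers={"Authorization": ACCESS_TOKEN},
    json=payload,
    stream=True,  # the endpoint streams its reply
    timeout=120,
)
for raw in resp.iter_lines():
    if raw:
        print(raw.decode("utf-8"))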

Works. Thanks!
Why do I get so many messages in the response? Can I limit that number from the request?
Does that mean I can use it with the plugins model and set the plugins I want in the request?

The multiple messages are due to streaming. If you're looking to parse and use this API in a reasonable manner, check out https://github.com/acheong08/ChatGPT for Python and https://github.com/transitive-bullshit/chatgpt-api for NodeJS.
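
Roughly, "waiting for the last message" looks like this in Python. This sketch assumes the proxy relays SSE-style lines ("data: {...}" ending with "data: [DONE]") like the upstream ChatGPT endpoint, and the "message.content.parts" path is an assumption, so adjust the field names if your responses differ.

# Sketch: keep only the final complete "data:" event from a streamed response.
import json

def last_event(resp):
    """Return the parsed payload of the final non-[DONE] event in a streaming response."""
    last = None
    for raw in resp.iter_lines():
        if not raw:
            continue
        line = raw.decode("utf-8")
        if not line.startswith("data: "):
            continue
        data = line[len("data: "):]
        if data == "[DONE]":
            break
        try:
            last = json.loads(data)
        except json.JSONDecodeError:
            continue  # skip partial or keep-alive chunks
    return last

# Usage with the `resp` from the earlier sketch:
# event = last_event(resp)
# if event:
#     print("".join(event["message"]["content"]["parts"]))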

This repo also gets around Cloudflare and lets you use ChatGPT like OpenAI's official API:
https://github.com/acheong08/ChatGPT-to-API, which allows you to control whether it streams or not.

Example of ChatGPT-to-API: (Hosted instance. No need for auth.)

curl https://free.churchless.tech/v1/chat/completions -d '{"messages":[{"role":"user", "content":"Explain the concept of ChatGPT"}], "model":"gpt-3.5-turbo","stream":false}'
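
The same call from Python, assuming the endpoint mirrors OpenAI's chat-completions response shape (which is what ChatGPT-to-API aims for); the hosted instance above may rate-limit or disappear at any time.

# Sketch: call the ChatGPT-to-API hosted instance from Python.
import requests

resp = requests.post(
    "https://free.churchless.tech/v1/chat/completions",
    json={
        "messages": [{"role": "user", "content": "Explain the concept of ChatGPT"}],
        "model": "gpt-3.5-turbo",
        "stream": False,  # set True to receive "data:" chunks instead
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])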

I see. Yes, to display the text word by word they stream the response, so to get the full reply you need to wait for the last message to be received and use that one.

I don't have access to plugins in general yet. Only revChatGPT supports them.