TooManyRequestsError when calling OpenAI API
Eth3rnit3 opened this issue
Problem Description
When running our application, we encounter a Faraday::TooManyRequestsError
when calling the OpenAI API. This occurs when we try to read the content of a file named 'example.txt'.
Steps to Reproduce
- Run the `setup` method to initialize the application.
- Create a new instance of `Langchain::LLM::OpenAI` with the appropriate API key.
- Create a new instance of `Langchain::Thread`.
- Create a new instance of `Langchain::Assistant` with the appropriate parameters.
- Add a message to the assistant asking to read the content of the 'example.txt' file.
- Run the assistant.
Expected Outcome
The assistant should be able to read the content of the 'example.txt' file without encountering an error.
Actual Outcome
The application encounters a Faraday::TooManyRequestsError
when calling the OpenAI API.
Additional Information
Here is the code:
```ruby
def setup
  require 'bundler/setup'
  Bundler.require
  Dotenv.load
end

setup

llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
thread = Langchain::Thread.new
assistant = Langchain::Assistant.new(
  llm:,
  thread:,
  instructions: 'You are a FileSystem Assistant that is able to read and write files. You can help with various file operations. To do this, you need clear and precise instructions. You will respond to requests in JSON format. How can you help today?',
  tools: [
    Langchain::Tool::FileSystem.new
  ]
)

assistant.add_message content: "I need to read the contents of a file named 'example.txt'."
assistant.run
```
Here is the error trace:
```
I, [2024-05-19T10:07:10.261834 #5338] INFO -- : [Langchain.rb] [Langchain::Assistant]: Sending a call to Langchain::LLM::OpenAI
/Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/faraday-2.9.0/lib/faraday/response/raise_error.rb:34:in `on_complete': the server responded with status 429 (Faraday::TooManyRequestsError)
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/faraday-2.9.0/lib/faraday/middleware.rb:18:in `block in call'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/faraday-2.9.0/lib/faraday/response.rb:42:in `on_complete'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/faraday-2.9.0/lib/faraday/middleware.rb:17:in `call'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/faraday-2.9.0/lib/faraday/rack_builder.rb:152:in `build_response'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/faraday-2.9.0/lib/faraday/connection.rb:444:in `run_request'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/faraday-2.9.0/lib/faraday/connection.rb:280:in `post'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/ruby-openai-7.0.1/lib/openai/http.rb:22:in `json_post'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/ruby-openai-7.0.1/lib/openai/client.rb:30:in `chat'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/langchainrb-0.13.1/lib/langchain/llm/openai.rb:137:in `block in chat'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/langchainrb-0.13.1/lib/langchain/llm/openai.rb:172:in `with_api_error_handling'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/langchainrb-0.13.1/lib/langchain/llm/openai.rb:136:in `chat'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/langchainrb-0.13.1/lib/langchain/assistants/assistant.rb:190:in `chat_with_llm'
	from /Users/david/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/langchainrb-0.13.1/lib/langchain/assistants/assistant.rb:93:in `run'
	from app.rb:22:in `<main>'
```
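A 429 response means the OpenAI API is rate-limiting the account rather than anything being wrong in the code above. If the library does not retry for you, one workaround is to wrap `assistant.run` in a retry with exponential backoff. The `with_backoff` helper below is a hypothetical sketch (not part of langchainrb), shown with a self-contained stub in place of the real API call:

```ruby
# Hypothetical helper: retry a block with exponential backoff when it raises.
# In the real app you would pass error_class: Faraday::TooManyRequestsError.
def with_backoff(retries: 3, base_delay: 1.0, error_class: StandardError)
  attempt = 0
  begin
    yield
  rescue error_class
    attempt += 1
    raise if attempt > retries           # give up after the final retry
    sleep(base_delay * (2**(attempt - 1))) # 1x, 2x, 4x, ... the base delay
    retry
  end
end

# Against the assistant this would look like:
#   with_backoff(error_class: Faraday::TooManyRequestsError) { assistant.run }

# Self-contained demonstration: a stub that fails twice, then succeeds.
calls = 0
result = with_backoff(retries: 3, base_delay: 0.01) do
  calls += 1
  raise 'rate limited' if calls < 3
  'ok'
end
# result == 'ok' after 3 calls
```

Note that backoff only helps with transient rate limits; if the account has no remaining quota (as turned out to be the case here), the request will keep failing until the plan is upgraded or the quota resets.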
Sorry, it turned out to be due to my limited plan (rate limit on my account).
@Eth3rnit3 So you've verified it and it works?
Yes @andreibondarev it works fine 👍