Codium-ai / cover-agent

CodiumAI Cover-Agent: An AI-Powered Tool for Automated Test Generation and Code Coverage Enhancement! 💻🤖🧪🐞

Home Page: https://www.codium.ai/

Unable to run test case for flutter using gpt-3.5-turbo

sandeepgurram opened this issue · comments

Command I am using:

```shell
cover-agent \
  --source-file-path '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/lib/ui/snackbar/gr_snackbar.dart' \
  --test-file-path '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/test/snackbar/gr_snackbar_test.dart' \
  --code-coverage-report-path '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/coverage/coverage.xml' \
  --test-command "flutter test --coverage && lcov_cobertura '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/coverage/lcov.info' --output '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/coverage/coverage.xml'" \
  --model "openai/gpt-3.5-turbo"
```

Error:


```
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2024-06-10 00:10:02,562 - cover_agent.UnitTestGenerator - ERROR - Error during initial test suite analysis: APIError: OpenAIException - Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
  yield
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/httpx/_transports/default.py", line 233, in handle_request
  resp = self._pool.handle_request(req)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
  raise exc from None
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
  response = connection.handle_request(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
  raise exc
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
  stream = self._connect(request)
           ^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 122, in _connect
  stream = self._network_backend.connect_tcp(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/httpcore/_backends/sync.py", line 205, in connect_tcp
  with map_exceptions(exc_map):
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py", line 155, in __exit__
  self.gen.throw(typ, value, traceback)
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
  raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 61] Connection refused
```

Try removing the `openai/` prefix from the `--model` option. Your command would then look like this:

```shell
cover-agent \
  --source-file-path '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/lib/ui/snackbar/gr_snackbar.dart' \
  --test-file-path '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/test/snackbar/gr_snackbar_test.dart' \
  --code-coverage-report-path '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/coverage/coverage.xml' \
  --test-command "flutter test --coverage && lcov_cobertura '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/coverage/lcov.info' --output '/Users/sandeepgurram/Documents/codebase/turing-gorout/gr_ui/common_ui/coverage/coverage.xml'" \
  --model "gpt-3.5-turbo"
```

That should resolve your issue. Please close the issue if it does.
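For intuition, model strings in this style are typically interpreted as an optional `provider/` prefix followed by the model name, which can change how the request is routed. The toy helper below illustrates that parsing; it is an illustrative sketch, not LiteLLM's actual internals, and `split_model_name` is a hypothetical name.

```python
def split_model_name(model: str) -> tuple:
    """Split a 'provider/model' string into (provider, name).

    A string without a '/' has no explicit provider, so the provider
    slot is None. Purely illustrative; not LiteLLM's real parser.
    """
    if "/" in model:
        provider, _, name = model.partition("/")
        return provider, name
    return None, model

# With the prefix, the provider is explicit; without it, routing
# falls back to whatever default the caller configures.
print(split_model_name("openai/gpt-3.5-turbo"))  # ('openai', 'gpt-3.5-turbo')
print(split_model_name("gpt-3.5-turbo"))         # (None, 'gpt-3.5-turbo')
```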

NOTE: Remember to set the `OPENAI_API_KEY` environment variable as well.
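As a minimal sketch (the key value below is a placeholder, not a real key), the variable can be exported in the shell before invoking the tool:

```shell
# Export the OpenAI API key for the current shell session.
# "sk-your-key-here" is a placeholder -- substitute your real key.
export OPENAI_API_KEY="sk-your-key-here"

# Fail fast if the variable is missing before starting a long run.
: "${OPENAI_API_KEY:?OPENAI_API_KEY is not set}"
echo "OPENAI_API_KEY is set"
```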

Thanks, it's working.