google-gemini / generative-ai-python

The official Python library for the Google Gemini API

Home Page: https://pypi.org/project/google-generativeai/

stream=True returns all chunks at the same time [in Colab]

sykp241095 opened this issue · comments

Description of the bug:

I read the docs at https://ai.google.dev/api/python/google/generativeai/GenerativeModel, which say that passing stream=True returns the response in chunks one by one. But when I try the simple code below, it doesn't work as described: it does return chunks, but it appears to return all of them at the same time.

import google.generativeai as genai

# Configure the client with an API key (placeholder here).
genai.configure(api_key='....')
model = genai.GenerativeModel('models/gemini-pro')

# Request a streamed response.
response = model.generate_content('generate a long story, more than 200 words, multiple graphs with breakline', stream=True)

# Iterate over the streamed chunks; these all seem to arrive at once.
for chunk in response:
    print(chunk.text)

Actual vs expected behavior:

No response

Any other information you'd like to share?

No response

@sykp241095 thanks for reporting.

This is a limitation of how Colab handles output (long story). Try running it anywhere other than Colab and you should get the chunks back as they are generated. Let me know if that works for you.
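If you want to confirm that the chunks really do arrive incrementally outside Colab, here is a minimal sketch (the prompt text and timing printout are illustrative additions, not from the original report) that prints the elapsed time at which each chunk is received:

import time
import google.generativeai as genai

genai.configure(api_key='....')  # placeholder API key
model = genai.GenerativeModel('models/gemini-pro')

response = model.generate_content(
    'Write a long story of more than 200 words.',
    stream=True,
)

start = time.monotonic()
for chunk in response:
    # Outside Colab, the elapsed times should increase chunk by chunk,
    # showing that text is delivered as it is generated rather than all at once.
    print(f'[{time.monotonic() - start:.2f}s] {chunk.text!r}', flush=True)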

Marking this issue as stale since it has been open for 14 days with no activity. This issue will be closed if no further activity occurs.

This issue was closed because it has been inactive for 28 days. Please post a new issue if you need further assistance. Thanks!