jxnl / instructor

Structured outputs for LLMs

Home Page: https://python.useinstructor.com/


Doesn't work inside asyncio.gather()

bryanhpchiang opened this issue

Will add a repro later -- but running into

asyncio.exceptions.CancelledError

During handling of the above exception, another exception occurred:

asyncio.exceptions.CancelledError

when calling a completion from inside f via asyncio.gather(*[f ... ]).

If you take the call out of asyncio.gather(), it works fine.
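
For anyone debugging something similar, here is a minimal sketch (not my actual repro; flaky() is a made-up stand-in for the instructor call) of using return_exceptions=True so asyncio.gather() reports every task's outcome instead of raising the first exception:

import asyncio


async def flaky(i: int) -> int:
    # Stand-in for the real completion call; task 1 fails on purpose.
    if i == 1:
        raise RuntimeError("simulated failure")
    return i


async def main():
    # return_exceptions=True returns exceptions as results instead of
    # propagating the first one, so each task's outcome stays visible.
    results = await asyncio.gather(
        *[flaky(i) for i in range(3)], return_exceptions=True
    )
    for result in results:
        print(repr(result))  # 0, RuntimeError('simulated failure'), 2


asyncio.run(main())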

example code

import openai
import instructor
from pydantic import BaseModel
import asyncio


client = instructor.from_openai(openai.AsyncOpenAI())


class User(BaseModel):
    name: str
    age: int


async def extract():
    # Each call returns a validated User instance via response_model.
    return await client.chat.completions.create(
        model="gpt-4-turbo-preview",
        messages=[
            {"role": "user", "content": "Create a user"},
        ],
        response_model=User,
    )


async def create_users(num_users: int):
    # Build the coroutines first, then run them concurrently.
    tasks = [extract() for _ in range(num_users)]
    users = await asyncio.gather(*tasks)
    return users


async def main():
    users = await create_users(3)
    for user in users:
        print(user)


if __name__ == "__main__":
    asyncio.run(main())

output

name='John Doe' age=30
name='John Doe' age=30
name='John Doe' age=25

Maybe your error is because you are running this in a Jupyter notebook?

If that's the case, add this at the beginning of your notebook:


import nest_asyncio

nest_asyncio.apply()
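
For context, a notebook already runs its own event loop, so a plain asyncio.run() call raises a RuntimeError there; nest_asyncio.apply() patches the running loop so nested runs are allowed. A minimal sketch (the hello() coroutine is just a placeholder):

import asyncio

import nest_asyncio

nest_asyncio.apply()


async def hello() -> str:
    await asyncio.sleep(0.1)
    return "hello"


# Inside a notebook this would normally fail because an event loop is
# already running; after nest_asyncio.apply() it works.
print(asyncio.run(hello()))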

Interesting! I was not using notebooks. But it was part of a class, so something like this:

class X:
    async def start(self):
        await asyncio.gather(*self.call(....))

    async def call(self):
        ...  # call instructor from here

I wonder if something like this would work?

Did you try nest_asyncio.apply()?

https://pypi.org/project/nest-asyncio/

@bryanhpchiang I used your class to reimplement some of @kevin-weitgenant's code:

import openai
import instructor
from pydantic import BaseModel
import asyncio


client = instructor.from_openai(openai.AsyncOpenAI())


class User(BaseModel):
    name: str
    age: int


class UserFactory:
    def __init__(self, num_users: int):
        self.num_users = num_users

    async def start(self):
        coros = [self.call() for _ in range(self.num_users)]
        return await asyncio.gather(*coros)

    async def call(self):
        return await client.chat.completions.create(
            model="gpt-4-turbo-preview",
            messages=[
                {"role": "user", "content": "Create a user"},
            ],
            response_model=User,
        )


if __name__ == "__main__":
    user_factory = UserFactory(10)
    print(asyncio.run(user_factory.start()))

This works pretty nicely out of the box if you run it. Closing this issue for now, since there's been no activity for a while.