Can we calculate the Token costs (aka figure out what OpenAI will charge)
lrauhockey opened this issue
First: this is really great, and I can see tremendous value that others and I can get out of it.
This is more of a question than an issue...
Is there any way to build the data model/sample questions to estimate costs? As with any tool we bring in, it's about ROI, so it would help to have some way to evaluate it (I assume the questions count as tokens, but sending the data to OpenAI does too).
Hey @lrauhockey, I like this idea but I don't have time to add it. Feel free to open a PR if you can think of a nice kwarg or path to test this.
When running locally, if you change this line (`sketch/sketch/pandas_extension.py`, line 221 in `85f6536`), you can extract the exact prompt that will be sent to the endpoint with:
```python
prompt_string = howto_prompt.prompt_template.render(dfname=dfname, data_description=description, how=how)
```
From this, you could use the OpenAI tokenizer to get the total number of requested tokens.
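For example, here is a minimal sketch using the `tiktoken` library to turn that rendered prompt into a token count and a rough cost figure. The model name and per-1K-token rate below are placeholder assumptions, not necessarily what sketch uses; check the endpoint sketch actually calls and OpenAI's current pricing, and remember that completion tokens are billed on top of the prompt tokens.

```python
import tiktoken

def estimate_prompt_cost(prompt_string: str,
                         model: str = "text-davinci-003",   # assumption: placeholder model name
                         usd_per_1k_tokens: float = 0.02):  # assumption: placeholder rate
    """Return (token_count, estimated_prompt_cost_usd) for one prompt."""
    enc = tiktoken.encoding_for_model(model)       # pick the tokenizer matching the model
    n_tokens = len(enc.encode(prompt_string))      # count prompt tokens
    return n_tokens, n_tokens / 1000 * usd_per_1k_tokens

# Usage with the prompt_string rendered above:
# tokens, cost = estimate_prompt_cost(prompt_string)
# print(f"{tokens} prompt tokens, ~${cost:.5f} (completion tokens billed separately)")
```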