Support withFunctions in GPT4All
anamariamv opened this issue · comments
Ana Mª Marquez commented
The current integration of GPT4All models doesn't support ChatWithFunctions or prompts with deserialisation, so it's not possible, for example, to deserialise a response into a Kotlin data class.
A previous version included support for instructing the LLM to produce serialisable JSON based on a schema. This was deleted here:
The objective of this ticket is to evaluate whether it makes sense to bring this functionality back for the local models.
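For context, the deleted functionality worked roughly by prepending schema instructions to the prompt so the model's reply could be parsed into a typed value. A minimal sketch of that idea in plain Kotlin (the function name `jsonInstruction` and the prompt wording are illustrative assumptions, not xef's actual API):

```kotlin
// Hypothetical sketch: build a prompt that instructs a local model to
// reply only with a JSON object conforming to a given schema, so the
// reply can later be deserialised into a Kotlin data class.
fun jsonInstruction(schema: String, question: String): String =
    """
    |You must reply with a single JSON object that conforms to the
    |following JSON schema, with no additional text:
    |$schema
    |
    |Question: $question
    """.trimMargin()

fun main() {
    val schema = """{"type":"object","properties":{"name":{"type":"string"},"age":{"type":"integer"}},"required":["name","age"]}"""
    println(jsonInstruction(schema, "Who is Ada Lovelace?"))
}
```

The model's raw reply would then be decoded with a JSON library (e.g. kotlinx.serialization) into a `@Serializable` data class; the open question in this ticket is whether local GPT4All models follow such instructions reliably enough for that decoding to succeed.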
Tomás Cayuelas Ruiz commented
We can close this task.