philschmid / easyllm
Home Page: https://philschmid.github.io/easyllm/
Stargazers: 425
Watchers: 7
Issues: 21
Forks: 33
philschmid/easyllm Issues
- bedrock model support... (Updated 2 months ago)
- Add support for Vertex AI pretrained language models (GCP) (Updated 8 months ago)
- Pydantic problem (Updated 8 months ago, 6 comments)
- Issue setting huggingface.prompt_builder = 'llama2' when using sagemaker as client (Updated 9 months ago, 2 comments)
- Need to pass custom_attributes='accept_eula=true' when invoking SageMaker endpoint (Closed 9 months ago, 1 comment)
- OverloadedError: Model is overloaded (Updated 9 months ago, 1 comment)
- Boto dependency shouldn't be forced (Updated 9 months ago, 5 comments)
- Need to provide additional args to InferenceClient (Updated 9 months ago, 2 comments)
- bedrock.ChatCompletion.create Raises ValidationError for Non-Integer Token Values in Python 3.9 (Closed 9 months ago, 1 comment)
- What is the difference between EasyLLM and Langchain? (Closed 9 months ago, 2 comments)
- (Chat)Completion objects cannot generate diverse outputs (Updated 9 months ago, 6 comments)
- Is there a way to enable structured output? (Updated a year ago, 7 comments)
- [Feature] Add support for logit_bias (Updated a year ago)
- Bad request: Model requires a Pro subscription (Closed a year ago, 5 comments)
- Streaming support in Sagemaker? (Updated a year ago, 1 comment)
- Multiple messages (Closed a year ago, 2 comments)
- Integration with self hosted models through TGI (Closed a year ago, 2 comments)
- Chat completion format for empty system content. (Closed a year ago, 3 comments)
- Move makefile to hatch scripts (Closed a year ago, 3 comments)
- Local inference of TGI (Closed a year ago, 4 comments)
- Bug: `model` must be defined despite docs saying that if not provided, it defaults to base url. (Closed a year ago, 3 comments)