philschmid / easyllm

Home Page:https://philschmid.github.io/easyllm/


bedrock model support...

mmgxa opened this issue · comments

When I use `model="meta.llama3-8b-instruct-v1:0"`, it fails with:

`ValueError: Model meta.llama3-8b-instruct-v1:0 is not supported. Supported models are: ['anthropic.claude-v2']`

Isn't there support for other models like Llama 3 and Mistral? Also, the chat format for these should be specified via `os.environ["BEDROCK_PROMPT"] = "vicuna"`, right?
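For context, the error suggests the client validates the model ID against a hardcoded allow-list before calling Bedrock. A minimal sketch of that kind of check is below; the function and constant names are illustrative assumptions, not easyllm's actual internals:

```python
# Sketch of an allow-list model check that would raise the error above.
# SUPPORTED_MODELS and validate_model are hypothetical names for illustration.
SUPPORTED_MODELS = ["anthropic.claude-v2"]

def validate_model(model: str) -> str:
    """Return the model ID if it is on the allow-list, else raise ValueError."""
    if model not in SUPPORTED_MODELS:
        raise ValueError(
            f"Model {model} is not supported. "
            f"Supported models are: {SUPPORTED_MODELS}"
        )
    return model
```

If the check really is an allow-list like this, supporting Llama 3 or Mistral would mean extending that list (and wiring up the matching prompt template), rather than anything the caller can change from the outside.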