bedrock model support...
mmgxa opened this issue · comments
When I use model="meta.llama3-8b-instruct-v1:0", it fails with:

ValueError: Model meta.llama3-8b-instruct-v1:0 is not supported. Supported models are: ['anthropic.claude-v2']

Isn't there support for other models, such as Llama 3 and Mistral? Also, the chat format for these should be specified via os.environ["BEDROCK_PROMPT"] = "vicuna", right?
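For reference, here is a minimal, self-contained sketch of what I'm hitting. The issue doesn't name the library's internals, so the client call is replaced by a hypothetical `check_model` helper that only mirrors the validation behaviour quoted in the error message:

```python
import os

# Assumed prompt-format override mentioned in the question (not confirmed by docs)
os.environ["BEDROCK_PROMPT"] = "vicuna"

# The supported-model list reported in the error message
SUPPORTED_MODELS = ["anthropic.claude-v2"]

def check_model(model_id: str) -> None:
    # Hypothetical stand-in for the library's validation; reproduces the
    # ValueError text quoted above
    if model_id not in SUPPORTED_MODELS:
        raise ValueError(
            f"Model {model_id} is not supported. "
            f"Supported models are: {SUPPORTED_MODELS}"
        )

try:
    check_model("meta.llama3-8b-instruct-v1:0")
except ValueError as err:
    print(err)
```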