run-llama / LlamaIndexTS

LlamaIndex in TypeScript

Home Page:https://ts.llamaindex.ai


Cannot use an external AI service instead of OpenAI

vanHuong0202 opened this issue

Hi, I am using LlamaIndexTS in my NestJS app server with Anthropic instead of the default OpenAI. Here is my setup code:

```typescript
import { Injectable } from '@nestjs/common';
import { Anthropic, Settings } from 'llamaindex';

@Injectable()
export class ClassService {
  constructor() {
    // Override the default LLM with Claude.
    Settings.llm = new Anthropic({
      apiKey: process.env.CLAUDE_API_KEY || '',
      model: 'claude-3-sonnet',
    });
  }

  // ... other service methods
}
```

but the issue is that it always shows `Error: Error: Set OpenAI Key in OPENAI_API_KEY env variable`. Please help!

There are two models involved, an LLM and an embedding model. You need to change both (Anthropic does not offer an embedding model, sadly).
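For reference, here is a minimal sketch of overriding both models. The `HuggingFaceEmbedding` class and the option/model names are assumptions based on the LlamaIndexTS docs, so check the exports of your `llamaindex` version:

```typescript
import { Anthropic, HuggingFaceEmbedding, Settings } from 'llamaindex';

// Replace the default OpenAI LLM with Claude.
Settings.llm = new Anthropic({
  apiKey: process.env.CLAUDE_API_KEY || '',
  model: 'claude-3-sonnet',
});

// Also replace the default OpenAI embedding model, otherwise any indexing
// step will still ask for OPENAI_API_KEY. Anthropic has no embedding API,
// so a local embedding model is one option.
Settings.embedModel = new HuggingFaceEmbedding({
  modelType: 'BAAI/bge-small-en-v1.5', // illustrative model name
});
```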

@logan-markewich sorry, I'm just a newbie with LlamaIndex so I don't really get you. I am using the LLM model and initializing Anthropic (Claude AI), but I don't know why it still requires an OpenAI key. What more config should I add?

@logan-markewich ya, you are right, Claude does not support embeddings. My code loads a file's data, so that is why it shows the OpenAI key error.

Claude AI does not offer an embedding API.
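To make the cause concrete: the error shows up at indexing time, because building a vector index embeds the loaded documents, and the embedding model still defaults to OpenAI unless it is overridden. A rough sketch of the flow that triggers it (the query-engine call signature may differ slightly between versions):

```typescript
import { Document, VectorStoreIndex } from 'llamaindex';

async function main() {
  // Building the index embeds the document text. With only Settings.llm
  // changed, this step falls back to the default OpenAI embeddings and
  // throws the OPENAI_API_KEY error.
  const index = await VectorStoreIndex.fromDocuments([
    new Document({ text: 'contents of the loaded file' }),
  ]);

  // Querying uses the embedding model for retrieval and the LLM (Claude)
  // for answer synthesis.
  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query({ query: 'What does the file say?' });
  console.log(response.toString());
}

main().catch(console.error);
```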