run-llama / LlamaIndexTS

LlamaIndex in TypeScript

Home Page: https://ts.llamaindex.ai

Error while using the library in Next.js (app router)

rr-jino-jose opened this issue · comments

Question

Hello,

I went through the issues section looking for a solution to the problem I'm facing. I tried some of the suggested fixes, but I'm still getting a wasm fallback error that I can't make sense of. I suspect webpack is involved, but I'd like some clarity.

This is the error I see when running npm run dev:

 ✓ Compiled /api/openai in 1500ms (3656 modules)
TypeError: Cannot read properties of undefined (reading 'create')
    at constructSession (webpack-internal:///(rsc)/./node_modules/@xenova/transformers/src/models.js:436:39)
    at async Promise.all (index 1)
    at async BertModel.from_pretrained (webpack-internal:///(rsc)/./node_modules/@xenova/transformers/src/models.js:1007:20)
    at async AutoModel.from_pretrained (webpack-internal:///(rsc)/./node_modules/@xenova/transformers/src/models.js:5026:20)
    at async Promise.all (index 1)
    at async loadItems (webpack-internal:///(rsc)/./node_modules/@xenova/transformers/src/pipelines.js:2838:5)
    at async pipeline (webpack-internal:///(rsc)/./node_modules/@xenova/transformers/src/pipelines.js:2790:21)
    at async HuggingFaceEmbedding.getExtractor (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/embeddings/HuggingFaceEmbedding.js:37:30)
    at async HuggingFaceEmbedding.getTextEmbedding (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/embeddings/HuggingFaceEmbedding.js:44:27)
    at async HuggingFaceEmbedding.getTextEmbeddings (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/embeddings/types.js:30:31)
    at async batchEmbeddings (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/embeddings/types.js:61:32)
    at async HuggingFaceEmbedding.getTextEmbeddingsBatch (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/embeddings/types.js:40:16)
    at async HuggingFaceEmbedding.transform (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/embeddings/types.js:44:28)
    at async VectorStoreIndex.getNodeEmbeddingResults (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/indices/vectorStore/index.js:474:17)
    at async VectorStoreIndex.insertNodes (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/indices/vectorStore/index.js:571:17)
    at async VectorStoreIndex.buildIndexFromNodes (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/indices/vectorStore/index.js:486:9)
    at async VectorStoreIndex.init (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/indices/vectorStore/index.js:436:13)
    at async VectorStoreIndex.fromDocuments (webpack-internal:///(rsc)/./node_modules/llamaindex/dist/indices/vectorStore/index.js:514:16)
    at async getOpenAIModelRequest (webpack-internal:///(rsc)/./src/actions/openai.ts:62:23)
    at async POST (webpack-internal:///(rsc)/./src/app/api/openai/route.ts:11:21)
    at async /Users/jino.jose/rakuten/git/rr-services-version-dashboard/node_modules/next/dist/compiled/next-server/app-route.runtime.dev.js:6:63809
    at async eU.execute (/Users/jino.jose/rakuten/git/rr-services-version-dashboard/node_modules/next/dist/compiled/next-server/app-route.runtime.dev.js:6:53964)
    at async eU.handle (/Users/jino.jose/rakuten/git/rr-services-version-dashboard/node_modules/next/dist/compiled/next-server/app-route.runtime.dev.js:6:65062)
    at async doRender (/opt/homebrew/lib/node_modules/next/dist/server/base-server.js:1333:42)
    at async cacheEntry.responseCache.get.routeKind (/opt/homebrew/lib/node_modules/next/dist/server/base-server.js:1555:28)
    at async DevServer.renderToResponseWithComponentsImpl (/opt/homebrew/lib/node_modules/next/dist/server/base-server.js:1463:28)
    at async DevServer.renderPageComponent (/opt/homebrew/lib/node_modules/next/dist/server/base-server.js:1856:24)
    at async DevServer.renderToResponseImpl (/opt/homebrew/lib/node_modules/next/dist/server/base-server.js:1894:32)
    at async DevServer.pipeImpl (/opt/homebrew/lib/node_modules/next/dist/server/base-server.js:911:25)
    at async NextNodeServer.handleCatchallRenderRequest (/opt/homebrew/lib/node_modules/next/dist/server/next-server.js:271:17)
    at async DevServer.handleRequestImpl (/opt/homebrew/lib/node_modules/next/dist/server/base-server.js:807:17)
    at async /opt/homebrew/lib/node_modules/next/dist/server/dev/next-dev-server.js:331:20
    at async Span.traceAsyncFn (/opt/homebrew/lib/node_modules/next/dist/trace/trace.js:151:20)
    at async DevServer.handleRequest (/opt/homebrew/lib/node_modules/next/dist/server/dev/next-dev-server.js:328:24)
    at async invokeRender (/opt/homebrew/lib/node_modules/next/dist/server/lib/router-server.js:163:21)
    at async handleRequest (/opt/homebrew/lib/node_modules/next/dist/server/lib/router-server.js:342:24)
    at async requestHandlerImpl (/opt/homebrew/lib/node_modules/next/dist/server/lib/router-server.js:366:13)
    at async Server.requestListener (/opt/homebrew/lib/node_modules/next/dist/server/lib/start-server.js:140:13)
Something went wrong during model construction (most likely a missing operation). Using `wasm` as a fallback.
TypeError: Cannot read properties of undefined (reading 'create')
    at constructSession (webpack-internal:///(rsc)/./node_modules/@xenova/transformers/src/models.js:446:39)
    at async Promise.all (index 1)
    ... (remaining frames identical to the stack trace above)

And my code snippets look like this:

#src/actions/openai.ts

"use server";
import { 
    OpenAI,  
    OpenAIAgent, 
    Settings, 
    VectorStoreIndex,
    QueryEngineTool
  } from "llamaindex";
import { HuggingFaceEmbedding } from "llamaindex/embeddings/HuggingFaceEmbedding";
import { SimpleDirectoryReader } from "llamaindex/readers/SimpleDirectoryReader";
import * as path from "path";
import * as fs from 'fs';

export async function getOpenAIModelRequest(query: string, knowledgeBase: any) {

    const knowledgeBaseJson = JSON.stringify(knowledgeBase, null, 2)

    try {
        // set LLM and the embedding model
    Settings.llm = new OpenAI({
      apiKey: process.env.NEXT_PUBLIC_OPENAI_KEY,
      model: "gpt-4o",
    })
    Settings.embedModel = new HuggingFaceEmbedding({
        modelType: "BAAI/bge-small-en-v1.5",
        quantized: false,
        
    })
    /*
    Set up logging so we can see the work in progress.
    Available events:
    llm-start
    llm-end
    agent-start
    agent-end
    llm-tool-call
    llm-tool-result
    //*/
    Settings.callbackManager.on("llm-tool-call", (event) => {
        console.log(event.detail.payload)
    })
    Settings.callbackManager.on("llm-tool-result", (event) => {
        console.log(event.detail.payload)
    })
    
    const currentDir = __dirname;
    const filePath = path.join(currentDir, 'knowledgeBase.json');

    try {
        fs.writeFileSync(filePath, knowledgeBaseJson);
        console.log(`File written successfully to ${filePath}`);
    } catch (err) {
        console.error('Error writing file:', err);
    }

    // load our data and create a query engine
    const reader = new SimpleDirectoryReader()
    const documents = await reader.loadData(currentDir)
    const index = await VectorStoreIndex.fromDocuments(documents)
    const retriever = await index.asRetriever()
    retriever.similarityTopK = 10
    const queryEngine = await index.asQueryEngine({
        retriever
    })
    console.log(4)
    
    
    // define the query engine as a tool
    const tools = [
        new QueryEngineTool({
            queryEngine: queryEngine,
            metadata: {
            name: "deployment_details_per_env",
            description: `This tool can answer detailed questions about deployments happened in various environments.`,
            },
        }),
    ]
    console.log(5)
    
    // create the agent
    const agent = new OpenAIAgent({tools})
    console.log(6)
    let response = await agent.chat({
        message: query,
    })
    console.log(6)
    return {
      message: response.response.message.content
    };
  }
    catch (err) {
        console.error(err)
        return {
            errors: "Error Calling OpenAI Model"
        }
    }
    
}

#src/app/api/openai/route.ts

import { getOpenAIModelRequest } from "@/actions/openai";
import { NextRequest, NextResponse } from "next/server";

export async function POST(request: NextRequest) {
  const body = await request.json();
  const content = await getOpenAIModelRequest(body.query, body.query1);

  //if ("errors" in content) {
  //  return NextResponse.json({ error: content.errors }, { status: 500 });
  //}
  return NextResponse.json(content, { status: 200 });
}

#src/lib/model.ts

"use client";
export async function getAnswer(query: string, knowledgeBase: Record<string, any>[] | undefined) {
  const resp = await fetch(`/api/openai`, {
    method: "POST",
    body: JSON.stringify({ query: query, query1: knowledgeBase }),
  });
  const openAiResponse = await resp.json();
  return openAiResponse.message;
}

#next.config.mjs

import module from './package.json' with { type: "json" };

/** @type {import('next').NextConfig} */
const nextConfig = {
    experimental: {
        missingSuspenseWithCSRBailout: false,
        //serverComponentsExternalPackages: ['sharp', 'onnxruntime-node'],
    },
    env: {
        version: module.version
    },

    // Override the default webpack configuration
    webpack: (config) => {
        // Ignore node-specific modules when bundling for the browser
        // See https://webpack.js.org/configuration/resolve/#resolvealias
        config.resolve.alias = {
            ...config.resolve.alias,
            "sharp$": false,
            "onnxruntime-node$": false,
        };
        return config;
    },
};

export default nextConfig;
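
One thing I have not ruled out is whether the problem is specific to the local ONNX embedding path. Swapping in a hosted embedding model would bypass @xenova/transformers entirely; here is a minimal sketch of that variant (untested on my side, and assuming OpenAIEmbedding is exported from the main llamaindex entry point):

#src/actions/openai.ts (hypothetical variant)

import { OpenAI, OpenAIEmbedding, Settings } from "llamaindex";

// Hosted embeddings avoid loading any ONNX model in the Next.js server
// bundle, which is where the session construction fails for me.
Settings.llm = new OpenAI({
  apiKey: process.env.NEXT_PUBLIC_OPENAI_KEY,
  model: "gpt-4o",
});
Settings.embedModel = new OpenAIEmbedding({
  apiKey: process.env.NEXT_PUBLIC_OPENAI_KEY,
  model: "text-embedding-3-small",
});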

Any help is appreciated. Thanks.

Hey @marcusschiesser, thanks. Yes, I tried that one too, like below, but it didn't work.

#next.config.mjs

import module from './package.json' with { type: "json" };
import withLlamaIndex from 'llamaindex/next';

/** @type {import('next').NextConfig} */
const nextConfig = {
    experimental: {
        missingSuspenseWithCSRBailout: false,
        //serverComponentsExternalPackages: ['sharp', 'onnxruntime-node'],
    },
    env: {
        version: module.version
    },

    // Override the default webpack configuration
    //webpack: (config) => {
    //  // Ignore node-specific modules when bundling for the browser
    //  // See https://webpack.js.org/configuration/resolve/#resolvealias
    //  config.resolve.alias = {
    //      ...config.resolve.alias,
    //      "sharp$": false,
    //      "onnxruntime-node$": false,
    //  }
    //  return config;
  //},
    
};

//export default nextConfig;
export default withLlamaIndex(nextConfig);

And I still see the same error:

File written successfully to /Users/jino.jose/rakuten/git/rr-services-version-dashboard/.next/server/app/api/openai/knowledgeBase.json
TypeError: Cannot read properties of undefined (reading 'create')
    at constructSession (webpack-internal:///(rsc)/./node_modules/@xenova/transformers/src/models.js:436:39)
    at async Promise.all (index 1)
    ... (remaining frames identical to the stack trace in my first post)
Something went wrong during model construction (most likely a missing operation). Using `wasm` as a fallback.
TypeError: Cannot read properties of undefined (reading 'create')
    at constructSession (webpack-internal:///(rsc)/./node_modules/@xenova/transformers/src/models.js:446:39)
    at async Promise.all (index 1)
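
For reference, the serverComponentsExternalPackages line I left commented out in my configs is the other workaround that comes up in similar issues: it tells Next.js not to bundle the native packages at all and to require them from node_modules at runtime instead. Spelled out, it would look like this (a sketch; I have not verified that it fixes this particular error):

#next.config.mjs (sketch)

import withLlamaIndex from 'llamaindex/next';

/** @type {import('next').NextConfig} */
const nextConfig = {
    experimental: {
        // Keep native/node-only packages out of the webpack server bundle;
        // Next.js will load them with a plain require() at runtime instead.
        serverComponentsExternalPackages: ['sharp', 'onnxruntime-node'],
    },
};

export default withLlamaIndex(nextConfig);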

Could you please provide a repo to reproduce this?

I double-checked the npm source code; it uses ONNX:

https://www.npmjs.com/package/@xenova/transformers?activeTab=code

What's your Next.js version?

I will check this today

Hello Alex, thanks. My Next.js version is 14:

"next": "14.1.0",

The repo I'm using is private, so I can't share it, but I did share the code I'm using in the first comment. Please let me know if you need anything else.
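
If it helps, the smallest inline repro I can offer is a bare app route that only builds an index with the same embedding settings; the route path and the single test document below are just illustrative:

#src/app/api/repro/route.ts (sketch)

import { NextResponse } from "next/server";
import { Document, Settings, VectorStoreIndex } from "llamaindex";
import { HuggingFaceEmbedding } from "llamaindex/embeddings/HuggingFaceEmbedding";

export async function POST() {
  Settings.embedModel = new HuggingFaceEmbedding({
    modelType: "BAAI/bge-small-en-v1.5",
    quantized: false,
  });
  // Embedding a single in-memory document is enough to trigger the
  // @xenova/transformers session construction that throws for me.
  const index = await VectorStoreIndex.fromDocuments([
    new Document({ text: "hello world" }),
  ]);
  return NextResponse.json({ ok: Boolean(index) });
}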

Hello, I wonder which version of llamaindex has the fix?

Releasing very soon... waiting for the CI check.

0.4.8
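
For anyone landing here later: assuming the fix shipped in 0.4.8 as noted above, bumping the dependency and reinstalling should be enough to pick it up:

npm install llamaindex@0.4.8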