run-llama / LlamaIndexTS

LlamaIndex in TypeScript

Home Page: https://ts.llamaindex.ai


`@llamaindex/env` TypeError: Module "..." needs an import attribute of "type: json"

parhammmm opened this issue · comments

When trying to use the latest llamaindex 0.4.1, I get the following compilation error when running tests in Vitest:

This is likely not a LlamaIndexTS issue, but I'm wondering if anyone knows the cause; I can't figure it out!

TypeError: Module "file:///Users/p/LlamaIndexTS/node_modules/.pnpm/tiktoken@1.0.15/node_modules/tiktoken/encoders/cl100k_base.json" needs an import attribute of "type: json"

This seems to be the problematic line, which was added last week:
https://github.com/run-llama/LlamaIndexTS/blob/main/packages/env/src/tokenizers/node.ts#L2

EDIT:
What I mean by problematic is that this import should have `with { type: 'json' }` at the end of it, as specified here:
https://nodejs.org/docs/latest-v20.x/api/esm.html#json-modules
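For illustration, the corrected line would presumably look something like this (a sketch only; the default-export name is assumed, not taken from the repo):

```typescript
// Sketch of the fixed import in packages/env/src/tokenizers/node.ts.
// `with { type: "json" }` is the import-attribute syntax required by
// Node's ESM loader; older Node versions used `assert { type: "json" }`.
import cl100k_base from "tiktoken/encoders/cl100k_base.json" with { type: "json" };
```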

That requirement is also in the Node 18 docs, with Node 18 being the earliest currently-supported Node LTS version.

I had to add resolveJsonModule: true to tsconfig.json of @llamaindex/env for this to work:

{
  "compilerOptions": {
    "resolveJsonModule": true
  }
}

My guess is that, since we're not bundling, every consuming project would have to use this setting (which we don't want).

@travellingprog I tried your idea; it only works with `module` set to `esnext`.
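For reference, a minimal sketch of that combination (assumed values for illustration, not the repo's actual tsconfig; `resolveJsonModule` also requires a non-classic `moduleResolution`):

```json
{
  "compilerOptions": {
    "module": "esnext",
    "moduleResolution": "bundler",
    "resolveJsonModule": true
  }
}
```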

@himself65 any idea?


It's only for TS compilation time, not user-side runtime.

let me fix it

Perfect, thanks everyone!

@himself65 seems like the build failed https://github.com/run-llama/LlamaIndexTS/actions/runs/9618158421/job/26531444358

Maybe the file needs to be included in the bundle?


It seems like a JSR issue.

@parhammmm does #958 fix your issue? If not, how can we reproduce it?

I am running into the same issue with the Express template from create-llama. I tried 0.4.1 (in this PR: run-llama/create-llama#143) and the latest main branch (containing #958) locally (using pnpm add); both ways the result is:

pnpm run start

> my-app@0.1.0 start /Users/marcus/1/my-app
> node dist/index.js

node:internal/modules/esm/assert:89
        throw new ERR_IMPORT_ASSERTION_TYPE_MISSING(url, validType);
              ^

TypeError [ERR_IMPORT_ASSERTION_TYPE_MISSING]: Module "file:///Users/marcus/code/llamaindex/LlamaIndexTS/node_modules/.pnpm/tiktoken@1.0.15/node_modules/tiktoken/encoders/cl100k_base.json" needs an import attribute of type "json"
    at validateAttributes (node:internal/modules/esm/assert:89:15)
    at defaultLoad (node:internal/modules/esm/load:153:3)
    at async ModuleLoader.load (node:internal/modules/esm/loader:403:7)
    at async ModuleLoader.moduleProvider (node:internal/modules/esm/loader:285:45) {
  code: 'ERR_IMPORT_ASSERTION_TYPE_MISSING'
}
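For context, one common way to sidestep the runtime import-attribute requirement (not necessarily the fix used in #967) is to load the JSON through `createRequire`, since CommonJS `require()` never needs import attributes. A minimal self-contained sketch, using a stand-in JSON file rather than the real tiktoken encoder:

```javascript
import { createRequire } from "node:module";
import { writeFileSync } from "node:fs";

// Hypothetical stand-in for tiktoken's cl100k_base.json, written next to
// this module so require() can resolve it with a relative path.
const jsonPath = new URL("./encoder.json", import.meta.url);
writeFileSync(jsonPath, JSON.stringify({ name: "cl100k_base" }));

// require() has no import-attribute requirement, so this loads JSON on
// any supported Node version without `with { type: "json" }`.
const require = createRequire(import.meta.url);
const encoder = require("./encoder.json");
console.log(encoder.name);
```

This works at runtime regardless of the consumer's TypeScript settings, which is why it is a popular escape hatch for libraries that ship JSON data.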

@parhammmm #967 is fixing the issue for the create-llama express template

Could you add an e2e test in packages/llamaindex/e2e/... when you find a way to fix this issue?

#963

@marcusschiesser I was checking, but #978 got in the way.

@parhammmm thanks, just released 0.4.5 containing it

@marcusschiesser looks like this is resolved! thanks again