leonardfactory / babel-plugin-transform-typescript-metadata

Babel plugin to emit decorator metadata like the TypeScript compiler

Reference Error "X is not defined"

whimzyLive opened this issue

@leonardfactory @wtho
When working with NestJS and the AWS SDK, injections of AWS SDK types are not being transformed properly. I think the issue is somewhere in here: when the condition expression `typeof Type === 'undefined' ? Object : Type` fails, it falls back to the original type reference, which is for example `AWS.S3`, and at runtime JavaScript has no idea how to resolve `AWS`.
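To illustrate the failure mode, here is a minimal plain-JavaScript sketch. This is not the plugin's actual output; `_awsSdk` is a hypothetical stand-in for the binding Babel creates when it rewrites the import. Assuming the module transform rewrote the `typeof` operand but left the fallback untouched, the guard passes and the stale `AWS.S3` reference throws:

```javascript
"use strict";
// Hypothetical stand-in for the rewritten import binding
// (names are illustrative, not real Babel output).
const _awsSdk = { S3: class S3 {} };

function getParamType() {
  // The typeof operand was rewritten to `_awsSdk.S3`, so the guard passes,
  // but the fallback still holds the stale, un-rewritten name `AWS`.
  return typeof _awsSdk.S3 === "undefined" ? Object : AWS.S3;
}

let caught;
try {
  getParamType();
} catch (e) {
  caught = e; // ReferenceError: AWS is not defined
}
console.log(caught instanceof ReferenceError); // true
```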

A reproduction of the issue is here. The original issue was filed in the Babel repo as #12150.

Happy to work on this, but I will need more guidance on what actually needs to be fixed.

commented

@whimzyLive I wanted to have a look at your repro, but the build script always throws an error. Can you fix this?

Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: No "exports" main defined in /home/wtho/workspace/repro/babel-ts-imports-removed-repro/node_modules/@babel/helper-compilation-targets/package.json
commented

OK, after googling the error a bit more, I found a solution to the build problem:
I deleted node_modules and package-lock.json and re-installed using npm install. This resolved a newer version of @babel/helper-compilation-targets, which enables compilation with Babel 👍

I will now look into your issue.

commented

So I took a quick look at it, but I am not familiar enough with the Babel TypeScript compilation plugins to tell whether this is a final solution.

Workarounds

I found two workarounds; I like the second one better:

  1. Turn `import AWS from 'aws-sdk'` into the more verbose, but also more generally accepted, non-default import version `import { default as AWS } from 'aws-sdk'`. This somehow keeps the name `AWS` as-is and therefore works.
  2. Turn `import AWS from 'aws-sdk'` into `import { S3 } from 'aws-sdk'` and turn `AWS.S3` into `S3`. With this, the transformation from `S3` into `_awsSdk.S3` seems to work in both places.

Possible Solution

I also made a small modification to this plugin that makes it work, but I am not sure it is the right solution: I turned `t.clone(reference)` into `reference` in the `serializeTypeReferenceNode` function:

  return t.conditionalExpression(
    t.binaryExpression(
      '===',
      t.unaryExpression('typeof', reference),
      t.stringLiteral('undefined')
    ),
    t.identifier('Object'),
-   t.clone(reference)
+   reference
  );

This keeps the reference to the imported module, but the same node instance now appears twice in the AST, which might lead to problems in further transformations. I think it only works because the transformations are applied correctly to the first reference, which is in fact both of them. @leonardfactory Please check whether this solution is valid.


Please add your findings so we can figure out the right solution.

commented

I think I just figured it out:

  return t.conditionalExpression(
    t.binaryExpression(
      '===',
      t.unaryExpression('typeof', reference),
      t.stringLiteral('undefined')
    ),
    t.identifier('Object'),
-   t.clone(reference)
+   t.cloneDeep(reference)
  );

This seems to apply the import transformation correctly to both references.
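For reference, here is a plain-object sketch of the structural difference between the two (these are hypothetical stand-ins, not real `@babel/types` nodes or API calls): a shallow clone copies only the top-level node and shares its children, while a deep clone copies the whole subtree, so each tree position owns distinct nodes — which may matter to passes that track nodes by identity.

```javascript
// Plain objects standing in for AST nodes (illustrative, not @babel/types).
const reference = {
  type: "MemberExpression",
  object: { type: "Identifier", name: "AWS" },
  property: { type: "Identifier", name: "S3" },
};

// Roughly what a shallow vs deep clone do structurally:
const shallow = { ...reference };                   // children are shared
const deep = JSON.parse(JSON.stringify(reference)); // children are copied

console.log(shallow.object === reference.object); // true  — same child node
console.log(deep.object === reference.object);    // false — independent node
```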

Okay, I will give it a try. For the time being I have made AWS a global variable so that I can avoid the reference error of it not being defined. Obviously it is not an ideal solution.

@wtho it seems fixed. Thanks. Can you open a PR so it can be included in the next release?

Thank you guys for the issue and the solution 👍 I'll make a release when the PR is ready and merged.

Hi @whimzyLive! v0.3.1 is out, let me know if it's working now so we can close the issue.

@leonardfactory Yeah, it's working. Thanks.