amplify-education / python-hcl2


Unexpected token after multiplication

dbalucas opened this issue · comments

The current behaviour fails to parse the following HCL code:

resource "confluent_connector" "source" {
  config_nonsensitive = {
    "poll.interval.ms"         = local.poll_interval * 1000
    "salesforce.grant.type"    = "JWT_BEARER"
  }
}
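For reference, a minimal reproduction script along the lines of the `main.py` in the traceback below (the file names are assumptions; the snippet above is assumed saved as `main.tf`):

```python
# main.py - minimal reproduction (assumes the HCL above is in main.tf)
import hcl2

with open("main.tf") as fp:
    obj = hcl2.load(fp)  # raises lark.exceptions.UnexpectedToken
```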

Logs:

Traceback (most recent call last):
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/lark/parsers/lalr_parser_state.py", line 77, in feed_token
    action, arg = states[state][token.type]
                  ~~~~~~~~~~~~~^^^^^^^^^^^^
KeyError: 'STRING_LIT'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/developer/projects/bug_pythonhcl2/main.py", line 4, in <module>
    obj = hcl2.load(fp)
          ^^^^^^^^^^^^^
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/hcl2/api.py", line 14, in load
    return loads(file.read(), with_meta=with_meta)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/hcl2/api.py", line 27, in loads
    tree = hcl2.parse(text + "\n")
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/lark/lark.py", line 658, in parse
    return self.parser.parse(text, start=start, on_error=on_error)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/lark/parser_frontends.py", line 104, in parse
    return self.parser.parse(stream, chosen_start, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/lark/parsers/lalr_parser.py", line 42, in parse
    return self.parser.parse(lexer, start)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/lark/parsers/lalr_parser.py", line 88, in parse
    return self.parse_from_state(parser_state)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/lark/parsers/lalr_parser.py", line 111, in parse_from_state
    raise e
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/lark/parsers/lalr_parser.py", line 102, in parse_from_state
    state.feed_token(token)
  File "/home/developer/projects/bug_pythonhcl2/.venv/lib/python3.11/site-packages/lark/parsers/lalr_parser_state.py", line 80, in feed_token
    raise UnexpectedToken(token, expected, state=self, interactive_parser=None)
lark.exceptions.UnexpectedToken: Unexpected token Token('STRING_LIT', '"salesforce.grant.type"') at line 4, column 5.
Expected one of: 
        * COMMA
        * PERCENT
        * __ANON_8
        * PLUS
        * __ANON_0
        * __ANON_4
        * __ANON_5
        * __ANON_7
        * QMARK
        * __ANON_1
        * __ANON_9
        * MINUS
        * __ANON_2
        * RBRACE
        * LESSTHAN
        * STAR
        * SLASH
        * __ANON_6
        * MORETHAN

Expected behaviour is that parsing succeeds.

Workaround: place a comma after the multiplication.
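Applied to the snippet above, the workaround would look something like this (note the comma after the arithmetic expression):

```hcl
resource "confluent_connector" "source" {
  config_nonsensitive = {
    "poll.interval.ms"      = local.poll_interval * 1000,
    "salesforce.grant.type" = "JWT_BEARER"
  }
}
```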

I have the same issue; it fails to parse code like this:

    service = {
      high_memory_seconds       = 60 * 10
      high_memory_percentage    = 95
    }

Logs:

lark.exceptions.UnexpectedToken: Unexpected token Token('__ANON_3', 'high_memory_percentage') at line 3, column 7.

@alpinweis Thanks for bringing it up again. Have you tried the workaround with the comma mentioned in the last line?

The comma workaround does work, but updating all the files I need to parse at my job is not realistic.
Also, the commas look kind of ugly :)

In my case the alternative workaround was to convert the .tf file to JSON using a tool like hcl2json with its -simplify flag, which evaluates basic expressions (add, multiply, etc.) before writing the JSON file.
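The conversion step might look like this (the invocation shape is an assumption; only the -simplify flag comes from the comment above):

```shell
# Evaluate constant expressions (e.g. 60 * 10) while converting HCL to JSON,
# then parse the resulting JSON instead of the original .tf file
hcl2json -simplify main.tf > main.tf.json
```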