aws / aws-mwaa-local-runner

This repository provides a command line interface (CLI) utility that replicates an Amazon Managed Workflows for Apache Airflow (MWAA) environment locally.

sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table

Puck-Chen-Bose opened this issue · comments

MWAA version: 2.7.2
My local env: macOS (Intel chip)

I used this repo to build a local image, then ran the start command to bring up a container successfully.
The Airflow UI is accessible, and no issues are reported there.

But when I execute some commands inside the aws-mwaa-local-runner container, errors are raised as below:

[airflow@56b35f8126ca ~]$ airflow dags list
/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/configuration.py:755 UserWarning: Config scheduler.max_tis_per_query (value: 512) should NOT be greater than core.parallelism (value: 32). Will now use core.parallelism as the max task instances per query instead of specified value.
/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/configuration.py:861 FutureWarning: The 'log_filename_template' setting in [logging] has the old default value of '{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log'. This value has been changed to 'dag_id={{ ti.dag_id }}/run_id={{ ti.run_id }}/task_id={{ ti.task_id }}/{% if ti.map_index >= 0 %}map_index={{ ti.map_index }}/{% endif %}attempt={{ try_number }}.log' in the running config, but please update your config before Apache Airflow 3.0.
[2023-12-20T03:47:00.549+0000] {{db.py:903}} INFO - Log template table does not exist (added in 2.3.0); skipping log template sync.
Error: Failed to load all files. For details, run `airflow dags list-import-errors`
Traceback (most recent call last):
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1910, in _execute_context
    self.dialect.do_execute(
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
    cursor.execute(statement, parameters)
sqlite3.OperationalError: no such table: dag

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/airflow/.local/bin/airflow", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/__main__.py", line 59, in main
    args.func(args)
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/cli/cli_config.py", line 49, in command
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/utils/cli.py", line 113, in wrapper
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/utils/cli.py", line 376, in _wrapper
    f(*args, **kwargs)
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/utils/providers_configuration_loader.py", line 55, in wrapped_function
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/cli/commands/dag_command.py", line 361, in dag_list_dags
    AirflowConsole().print_as(
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/cli/simple_table.py", line 114, in print_as
    dict_data: Sequence[dict] = [mapper(d) for d in data]
                                ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/cli/simple_table.py", line 114, in <listcomp>
    dict_data: Sequence[dict] = [mapper(d) for d in data]
                                 ^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/cli/commands/dag_command.py", line 368, in <lambda>
    "paused": x.get_is_paused(),
              ^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/utils/session.py", line 77, in wrapper
    return func(*args, session=session, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/airflow/models/dag.py", line 1373, in get_is_paused
    return session.scalar(select(DagModel.is_paused).where(DagModel.dag_id == self.dag_id))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 1747, in scalar
    return self.execute(
           ^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 1717, in execute
    result = conn._execute_20(statement, params or {}, execution_options)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1710, in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/sql/elements.py", line 334, in _execute_on_connection
    return connection._execute_clauseelement(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1577, in _execute_clauseelement
    ret = self._execute_context(
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1953, in _execute_context
    self._handle_dbapi_exception(
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 2134, in _handle_dbapi_exception
    util.raise_(
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
    raise exception
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1910, in _execute_context
    self.dialect.do_execute(
  File "/usr/local/airflow/.local/lib/python3.11/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: dag
[SQL: SELECT dag.is_paused 
FROM dag 
WHERE dag.dag_id = ?]
[parameters: ('dag_with_taskflow_api',)]
(Background on this error at: https://sqlalche.me/e/14/e3q8)

I did not change docker-compose-local.yml, and the image is built from the source of this repo. Other commands, such as airflow connections list, report the same error.

It seems that the mwaa-local container cannot find and fetch data from the backend database.
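For context, "no such table: dag" is what sqlite raises when Airflow is pointed at a database whose schema was never initialized (e.g. a default sqlite file instead of the compose stack's metadata database). A minimal stdlib reproduction of the failing query from the trace:

```python
import sqlite3

# An empty database has no "dag" table, so the same query
# from the traceback fails with the same OperationalError.
conn = sqlite3.connect(":memory:")
try:
    conn.execute(
        "SELECT dag.is_paused FROM dag WHERE dag.dag_id = ?",
        ("dag_with_taskflow_api",),
    )
except sqlite3.OperationalError as exc:
    print(exc)  # no such table: dag
```

So the error itself only says the schema is missing; the interesting question is why the CLI ended up talking to an uninitialized database.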

Please help solve this issue. Thanks a lot!

This seems to be an issue with the Fernet key, as resetting the db raised errors:

raise AirflowException(f"Could not create Fernet object: {value_error}")
airflow.exceptions.AirflowException: Could not create Fernet object: Fernet key must be 32 url-safe base64-encoded bytes.

I rebuilt the image with no-cache, and $FERNET_KEY is empty inside my container.

[airflow@20d73dbac09a ~]$ echo $FERNET_KEY
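
The error message states the requirement exactly: a Fernet key must be 32 url-safe base64-encoded bytes, so an empty `FERNET_KEY` can never construct a Fernet object. A stdlib sketch of that check (the helper name `is_valid_fernet_key` is mine, not Airflow's):

```python
import base64
import os


def is_valid_fernet_key(value: str) -> bool:
    """Return True if value decodes (url-safe base64) to exactly 32 bytes,
    which is what Airflow's Fernet-based encryption requires."""
    try:
        return len(base64.urlsafe_b64decode(value.encode())) == 32
    except Exception:
        return False


# An empty FERNET_KEY fails the check, matching the error above:
print(is_valid_fernet_key(""))  # False

# A freshly generated key (url-safe base64 of 32 random bytes) passes:
key = base64.urlsafe_b64encode(os.urandom(32)).decode()
print(is_valid_fernet_key(key))  # True
```

In practice the usual recipe is to generate a key with the cryptography package (`Fernet.generate_key()`) and make sure it actually reaches the container's environment as `FERNET_KEY`.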


Resolved with: #24