Unknown column 'created' in 'field list'
bruvv opened this issue
Describe the bug
I installed the MariaDB add-on in HASS OS running in my VM, switched Home Assistant's database to MariaDB, and rebooted HASS OS. MariaDB is now being filled.
I moved the old SQLite database to my main PC, downloaded your script, and ran the following command:
sqlite3mysql -f home-assistant_v2.db -d homeassistant -u homeassistant -p -h 192.168.2.3 -S --debug
Expected behaviour
The SQLite database is imported into the current MariaDB database, filling in the missing gaps.
Actual result
The program crashed with the following error:
2022-05-02 10:39:32 ERROR MySQL transfer failed inserting data into table events: 1054 (42S22): Unknown column 'created' in 'field list'
Traceback (most recent call last):
File "/home/niels/.local/bin/sqlite3mysql", line 8, in <module>
sys.exit(cli())
File "/usr/lib/python3/dist-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/usr/lib/python3/dist-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/usr/lib/python3/dist-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/lib/python3/dist-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/home/niels/.local/lib/python3.8/site-packages/sqlite3_to_mysql/cli.py", line 204, in cli
converter.transfer()
File "/home/niels/.local/lib/python3.8/site-packages/sqlite3_to_mysql/transporter.py", line 766, in transfer
self._transfer_table_data(sql=sql, total_records=total_records)
File "/home/niels/.local/lib/python3.8/site-packages/sqlite3_to_mysql/transporter.py", line 651, in _transfer_table_data
self._mysql_cur.executemany(
File "/home/niels/.local/lib/python3.8/site-packages/mysql/connector/cursor.py", line 1253, in executemany
self.execute(operation, params)
File "/home/niels/.local/lib/python3.8/site-packages/mysql/connector/cursor.py", line 1209, in execute
self._prepared = self._connection.cmd_stmt_prepare(operation)
File "/home/niels/.local/lib/python3.8/site-packages/mysql/connector/connection.py", line 1420, in cmd_stmt_prepare
result = self._handle_binary_ok(packet)
File "/home/niels/.local/lib/python3.8/site-packages/mysql/connector/connection.py", line 1360, in _handle_binary_ok
raise errors.get_exception(packet)
mysql.connector.errors.ProgrammingError: 1054 (42S22): Unknown column 'created' in 'field list'
System Information
| software | version |
|------------------------|-------------------------------------------------------------------------------------------|
| sqlite3-to-mysql | 1.4.15 |
| | |
| Operating System | Linux 5.10.102.1-microsoft-standard-WSL2 |
| Python | CPython 3.8.10 |
| MySQL | mysql Ver 15.1 Distrib 10.3.34-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2 |
| SQLite | 3.31.1 |
| | |
| click | 7.0 |
| mysql-connector-python | 8.0.28 |
| pytimeparse | 1.1.8 |
| simplejson | 3.16.0 |
| six | 1.14.0 |
| tabulate | 0.8.9 |
| tqdm | 4.64.0 |
Hi,
Can you send me a snippet of your SQLite database, or at least its DDL (structure without any data)? It looks like it's trying to access an unknown column for some reason.
Thanks @techouse, any idea how to get a small portion? The database itself is around 18 GB (it's a Home Assistant database).
Can you try and just get the DDL. https://stackoverflow.com/questions/38832802/sqlite3-dump-schema-into-sql-file-from-command-line
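For reference, the linked approach boils down to dumping only the DDL, e.g. `sqlite3 home-assistant_v2.db '.schema' > schema.sql` with the sqlite3 command-line shell. The same dump can be produced from Python's built-in sqlite3 module (a minimal sketch using a throwaway in-memory database in place of the real file):

```python
import sqlite3

# Throwaway in-memory database standing in for home-assistant_v2.db
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (event_id INTEGER NOT NULL, created DATETIME, PRIMARY KEY (event_id))"
)

# All table and index DDL is stored in the sqlite_master catalog
ddl = ";\n".join(
    row[0]
    for row in conn.execute("SELECT sql FROM sqlite_master WHERE sql IS NOT NULL")
)
print(ddl)
```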
Awesome, thanks! Here you go:
CREATE TABLE events (
event_id INTEGER NOT NULL,
event_type VARCHAR(64),
event_data TEXT,
origin VARCHAR(32),
time_fired DATETIME,
created DATETIME,
context_id VARCHAR(36),
context_user_id VARCHAR(36),
context_parent_id VARCHAR(36),
PRIMARY KEY (event_id)
);
CREATE INDEX ix_events_event_type_time_fired ON events (event_type, time_fired);
CREATE INDEX ix_events_time_fired ON events (time_fired);
CREATE INDEX ix_events_context_parent_id ON events (context_parent_id);
CREATE INDEX ix_events_context_id ON events (context_id);
CREATE INDEX ix_events_context_user_id ON events (context_user_id);
CREATE TABLE statistics_meta (
id INTEGER NOT NULL,
statistic_id VARCHAR(255),
source VARCHAR(32),
unit_of_measurement VARCHAR(255),
has_mean BOOLEAN,
has_sum BOOLEAN,
name VARCHAR(255),
PRIMARY KEY (id)
);
CREATE INDEX ix_statistics_meta_statistic_id ON statistics_meta (statistic_id);
CREATE TABLE recorder_runs (
run_id INTEGER NOT NULL,
start DATETIME,
"end" DATETIME,
closed_incorrect BOOLEAN,
created DATETIME,
PRIMARY KEY (run_id)
);
CREATE INDEX ix_recorder_runs_start_end ON recorder_runs (start, "end");
CREATE TABLE schema_changes (
change_id INTEGER NOT NULL,
schema_version INTEGER,
changed DATETIME,
PRIMARY KEY (change_id)
);
CREATE TABLE statistics_runs (
run_id INTEGER NOT NULL,
start DATETIME,
PRIMARY KEY (run_id)
);
CREATE TABLE states (
state_id INTEGER NOT NULL,
domain VARCHAR(64),
entity_id VARCHAR(255),
state VARCHAR(255),
attributes TEXT,
event_id INTEGER,
last_changed DATETIME,
last_updated DATETIME,
created DATETIME,
old_state_id INTEGER, attributes_id INTEGER,
PRIMARY KEY (state_id),
FOREIGN KEY(event_id) REFERENCES events (event_id) ON DELETE CASCADE,
FOREIGN KEY(old_state_id) REFERENCES states (state_id)
);
CREATE INDEX ix_states_event_id ON states (event_id);
CREATE INDEX ix_states_entity_id_last_updated ON states (entity_id, last_updated);
CREATE INDEX ix_states_last_updated ON states (last_updated);
CREATE INDEX ix_states_old_state_id ON states (old_state_id);
CREATE TABLE statistics (
id INTEGER NOT NULL,
created DATETIME,
start DATETIME,
mean FLOAT,
min FLOAT,
max FLOAT,
last_reset DATETIME,
state FLOAT,
sum FLOAT,
metadata_id INTEGER,
PRIMARY KEY (id),
FOREIGN KEY(metadata_id) REFERENCES statistics_meta (id) ON DELETE CASCADE
);
CREATE INDEX ix_statistics_metadata_id ON statistics (metadata_id);
CREATE INDEX ix_statistics_start ON statistics (start);
CREATE TABLE statistics_short_term (
id INTEGER NOT NULL,
created DATETIME,
start DATETIME,
mean FLOAT,
min FLOAT,
max FLOAT,
last_reset DATETIME,
state FLOAT,
sum FLOAT,
metadata_id INTEGER,
PRIMARY KEY (id),
FOREIGN KEY(metadata_id) REFERENCES statistics_meta (id) ON DELETE CASCADE
);
CREATE INDEX ix_statistics_short_term_metadata_id ON statistics_short_term (metadata_id);
CREATE INDEX ix_statistics_short_term_start ON statistics_short_term (start);
CREATE UNIQUE INDEX ix_statistics_statistic_id_start ON statistics (metadata_id, start);
CREATE UNIQUE INDEX ix_statistics_short_term_statistic_id_start ON statistics_short_term (metadata_id, start);
CREATE TABLE state_attributes (
attributes_id INTEGER NOT NULL,
hash BIGINT,
shared_attrs TEXT,
PRIMARY KEY (attributes_id)
);
CREATE INDEX ix_state_attributes_hash ON state_attributes (hash);
CREATE INDEX ix_states_attributes_id ON states (attributes_id);
I can't replicate the error on my MacBook using this setup:
| software | version |
|------------------------|------------------------------------------------------|
| sqlite3-to-mysql | 1.4.15 |
| | |
| Operating System | Darwin 21.4.0 |
| Python | CPython 3.10.0 |
| MySQL | mysql Ver 8.0.28 for macos12.2 on x86_64 (Homebrew) |
| SQLite | 3.37.0 |
| | |
| click | 8.0.2 |
| mysql-connector-python | 8.0.23 |
| pytimeparse | 1.1.8 |
| simplejson | 3.17.5 |
| six | 1.16.0 |
| tabulate | 0.8.9 |
| tqdm | 4.62.3 |
It completes successfully.
(sqlite_to_mysql_3.10) ➜ sqlite3-to-mysql git:(master) sqlite3mysql -f ~/Downloads/test_created.db -d homeassistant -u root -p --debug -S
MySQL password:
2022-05-02 12:30:48 INFO Adding index to column "context_user_id" in table events
2022-05-02 12:30:48 INFO Adding index to column "context_id" in table events
2022-05-02 12:30:48 INFO Adding index to column "context_parent_id" in table events
2022-05-02 12:30:48 INFO Adding index to column "time_fired" in table events
2022-05-02 12:30:48 INFO Adding index to column "event_type, time_fired" in table events
2022-05-02 12:30:48 INFO Adding index to column "statistic_id" in table statistics_meta
2022-05-02 12:30:48 INFO Adding index to column "start, end" in table recorder_runs
2022-05-02 12:30:48 INFO Adding index to column "attributes_id" in table states
2022-05-02 12:30:48 INFO Adding index to column "old_state_id" in table states
2022-05-02 12:30:48 INFO Adding index to column "last_updated" in table states
2022-05-02 12:30:48 INFO Adding index to column "entity_id, last_updated" in table states
2022-05-02 12:30:48 INFO Adding index to column "event_id" in table states
2022-05-02 12:30:49 INFO Adding foreign key to states.old_state_id referencing states.state_id
2022-05-02 12:30:49 INFO Adding foreign key to states.event_id referencing events.event_id
2022-05-02 12:30:49 INFO Adding unique index to column "metadata_id, start" in table statistics
2022-05-02 12:30:49 INFO Adding index to column "start" in table statistics
2022-05-02 12:30:49 INFO Adding index to column "metadata_id" in table statistics
2022-05-02 12:30:49 INFO Adding foreign key to statistics.metadata_id referencing statistics_meta.id
2022-05-02 12:30:49 INFO Adding unique index to column "metadata_id, start" in table statistics_short_term
2022-05-02 12:30:49 INFO Adding index to column "start" in table statistics_short_term
2022-05-02 12:30:49 INFO Adding index to column "metadata_id" in table statistics_short_term
2022-05-02 12:30:49 INFO Adding foreign key to statistics_short_term.metadata_id referencing statistics_meta.id
2022-05-02 12:30:49 INFO Adding index to column "hash" in table state_attributes
2022-05-02 12:30:49 INFO Done!
The field events.created gets created successfully.
Can you try transferring just this empty database that you exported? Does it work?
Can you also please add some data to the tables, a few megs or so, so I can investigate further?
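As an aside, a small sample of a large SQLite file can be carved out by attaching a second database and copying a limited number of rows into it. A sketch with in-memory stand-ins (the table contents and row counts are illustrative; point ATTACH at a real file path to produce a shareable sample):

```python
import sqlite3

# Demo source standing in for the 18 GB home-assistant_v2.db
src = sqlite3.connect(":memory:")
src.execute(
    "CREATE TABLE events (event_id INTEGER PRIMARY KEY, event_type VARCHAR(64), created DATETIME)"
)
src.executemany(
    "INSERT INTO events (event_type, created) VALUES (?, ?)",
    [("state_changed", "2022-05-02 10:00:00")] * 100,
)

# Attach a second database and copy only the first few rows into it
src.execute("ATTACH DATABASE ':memory:' AS sample")  # use e.g. 'sample.db' for a real export
src.execute("CREATE TABLE sample.events AS SELECT * FROM events LIMIT 10")

count = src.execute("SELECT COUNT(*) FROM sample.events").fetchone()[0]
print(count)  # prints 10
```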
Did you import this into a database? I am currently already using a MariaDB database that Home Assistant is actively writing to. When trying this on an empty database it works (obviously). Can you show me commands on how to debug this further?
Oh, you are importing into an existing database with pre-existing data?
Hah, that's a tricky one. The tool was not designed to handle this kind of stuff and will NOT update existing table definitions.
I'm guessing that if all the tables match in terms of DDL, then you could try using
-X, --without-foreign-keys    Do not transfer foreign keys.
-W, --ignore-duplicate-keys   Ignore duplicate keys. The default behavior is to
                              create new ones with a numerical suffix, e.g.
                              'existing_key' -> 'existing_key_1'
so add -X -W at the end of your command.
But make sure that the DDL (table structure) matches!!!
P.S. I'm guessing that the error here is that your existing MySQL homeassistant database's events table doesn't have a created column, and when the tool tries to transfer it (because it assumes it's there) it throws an exception. In this case, you'll have to make sure manually that all the tables and keys match in terms of structure.
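One way to catch this kind of drift before transferring is to compare the column lists of the source and target tables. A minimal sketch using PRAGMA table_info against two in-memory stand-ins (against a real MariaDB target you would read the column names from SHOW COLUMNS instead; the trimmed schemas and the hardcoded DATETIME type are illustrative):

```python
import sqlite3

def columns(conn, table):
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk)
    return [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]

# Stand-in for the SQLite source (schema from the DDL above, trimmed)
source = sqlite3.connect(":memory:")
source.execute(
    "CREATE TABLE events (event_id INTEGER PRIMARY KEY, time_fired DATETIME, created DATETIME)"
)

# Stand-in for the pre-existing MariaDB target that lacks 'created'
target = sqlite3.connect(":memory:")
target.execute(
    "CREATE TABLE events (event_id INTEGER PRIMARY KEY, time_fired DATETIME)"
)

missing = set(columns(source, "events")) - set(columns(target, "events"))
print(missing)  # prints {'created'}

# The fix the thread arrives at: add the missing column on the target
for col in sorted(missing):
    target.execute(f"ALTER TABLE events ADD COLUMN {col} DATETIME")
```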
Ah! Now you mention it, indeed it is missing the created column. After adding that column it works just fine :) (without needing -X and -W)
Good catch, and I should have paid more attention. Sorry!