Error converting SQLite db with custom column type name
tslater opened this issue · comments
Describe the bug
I have a db I'm trying to convert that uses "META" as the column type.
Expected behaviour
Script to complete successfully, assuming string type for the meta column
Actual result
Script exited with error:
2021-04-13 12:51:53 ERROR MySQL failed creating table ilis: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'META NULL, PRIMARY KEY (`rowid`), CONSTRAINT `ilis_rowid` UNIQUE (`rowid`) ) ENG' at line 1
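For context on why this happens: SQLite uses type affinity and accepts almost any declared type name, so a column declared as `META` is perfectly valid on the SQLite side. A converter that copies the declared type verbatim into MySQL DDL then produces invalid syntax. A minimal sketch demonstrating this (the table and column names here just mirror the error message; this is not the converter's code):

```python
import sqlite3

# SQLite happily accepts an arbitrary declared type such as "META".
# The declared name is stored as-is, which is what a converter sees
# and may copy verbatim into MySQL DDL, where it is invalid.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ilis (rowid INTEGER PRIMARY KEY, meta META)")
conn.execute("INSERT INTO ilis (meta) VALUES ('anything')")

# PRAGMA table_info returns (cid, name, type, notnull, dflt_value, pk).
declared = {row[1]: row[2] for row in conn.execute("PRAGMA table_info(ilis)")}
print(declared["meta"])  # the declared type survives unchanged: META
```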
System Information
$ sqlite3mysql --version
| software | version |
|------------------------|-----------------------------------------------------|
| sqlite3-to-mysql | 1.3.12 |
| | |
| Operating System | Darwin 20.3.0 |
| Python | CPython 3.9.2 |
| MySQL | mysql Ver 8.0.23 for osx10.16 on x86_64 (Homebrew) |
| SQLite | 3.34.1 |
| | |
| click | 7.1.2 |
| mysql-connector-python | 8.0.23 |
| pytimeparse | 1.1.8 |
| simplejson | 3.17.2 |
| six | 1.15.0 |
| tabulate | 0.8.9 |
| tqdm | 4.59.0 |
So I'm wondering if this would be a safer way of handling unknown column types? I'm probably lacking a lot of context, so I apologize if this is an ignorant proposal. It solves my problem, but it might have other implications.
Heh, non-standard fields. That's a fringe case. The only ones I built support for are these. Kinda the same ones SQLAlchemy supports.
I apologize, I meant to put a link to a proposed change in the above comment:
https://github.com/tslater/sqlite3-to-mysql/pull/1/files
Your PR won't work, since it will default all the fields not covered by the specific conditions to the string default. That's not an option.
Well, just add `print(full_column_type)` and observe it.
It will print any valid column type name that isn't handled by the conditions above. If you just default to string, anything not covered by those conditions becomes a string. Not good. You need to check whether the data type is valid or not before you do that.
I'm currently working on a patch that does just that.
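The approach described above could be sketched roughly like this. This is a hypothetical illustration, not the actual patch: `translate_type` and `KNOWN_TYPES` are invented names, and the mapping is a small sample rather than the library's full type table.

```python
import re

# Hypothetical sample mapping of known SQLite declared types to MySQL types.
KNOWN_TYPES = {
    "INTEGER": "INT",
    "TEXT": "TEXT",
    "REAL": "DOUBLE",
    "BLOB": "BLOB",
    "NUMERIC": "DECIMAL(10,5)",
}

def translate_type(full_column_type: str) -> str:
    """Map a declared SQLite type to MySQL, validating unknown names
    before falling back to a string type."""
    # Strip any length/precision suffix, e.g. "VARCHAR(255)" -> "VARCHAR".
    base = full_column_type.split("(")[0].strip().upper()
    if base in KNOWN_TYPES:
        return KNOWN_TYPES[base]
    # Unknown but syntactically valid identifier (like "META"):
    # assume string data rather than emitting invalid DDL.
    if re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", base):
        return "TEXT"
    raise ValueError(f"Invalid column type: {full_column_type!r}")

print(translate_type("META"))     # unknown custom type falls back to TEXT
print(translate_type("INTEGER"))  # known type maps normally to INT
```

The key difference from defaulting everything to string is the validity check: a recognizable identifier falls back safely, while garbage input raises instead of silently becoming `TEXT`.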
I just pushed a commit handling this. Check out the master branch and report back if the error still persists. I'll release a new version tomorrow probably.