Revolution1 / Flask-WhooshAlchemyPlus

Whoosh indexing capabilities for Flask-SQLAlchemy: Flask-WhooshAlchemy plus some improvements


"Error closing cursor" error when using index_all() in Celery with MySQL

gaara4896 opened this issue · comments

The database I am using is MySQL. I have a few Flask-SQLAlchemy configuration settings, shown below:

SQLALCHEMY_POOL_SIZE = 100
SQLALCHEMY_POOL_RECYCLE = 3600
SQLALCHEMY_POOL_TIMEOUT = 1800

What I am doing is cron-scheduling the manual index_all(app) method using Celery; the task looks something like this:

from sqlalchemy.exc import OperationalError

@celery.task
def index_product():
    try:
        index_all(app)
    except OperationalError:
        # Drop the broken connections so the next run gets fresh ones
        db.session.close()
        db.engine.dispose()

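The recovery pattern in the task above (catch the error, close the session, dispose the pool so the next run reconnects) can be sketched as a generic decorator. This is only an illustration of the same try/except shape; `reset_on_error` and `reset` are hypothetical names, not part of Flask-SQLAlchemy or Celery:

```python
import functools

def reset_on_error(exc_type, reset):
    """Run `reset()` when `exc_type` escapes the wrapped call.

    Mirrors the try/except in index_product above, where reset()
    would close the session and dispose the engine's pool.
    """
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            try:
                return fn(*args, **kwargs)
            except exc_type:
                reset()  # swallow the error after cleaning up
        return wrapper
    return decorator
```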
The traceback:

[2018-02-12 06:30:00,134: WARNING/PoolWorker-5] Indexing ProductList...
[2018-02-12 06:38:14,193: ERROR/PoolWorker-5] Error closing cursor
Traceback (most recent call last):
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/result.py", line 1159, in fetchmany
    l = self.process_rows(self._fetchmany_impl(size))
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/result.py", line 1318, in _fetchmany_impl
    row = self._fetchone_impl()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/result.py", line 1308, in _fetchone_impl
    self.__buffer_rows()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/result.py", line 1295, in __buffer_rows
    self.__rowbuffer = collections.deque(self.cursor.fetchmany(size))
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/cursors.py", line 485, in fetchmany
    row = self.read_next()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/cursors.py", line 446, in read_next
    return self._conv_row(self._result._read_rowdata_packet_unbuffered())
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/connections.py", line 1430, in _read_rowdata_packet_unbuffered
    packet = self.connection._read_packet()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/connections.py", line 1008, in _read_packet
    recv_data = self._read_bytes(bytes_to_read)
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/connections.py", line 1037, in _read_bytes
    CR.CR_SERVER_LOST, "Lost connection to MySQL server during query")
pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 70, in instances
    fetch = cursor.fetchmany(query._yield_per)
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/result.py", line 1166, in fetchmany
    self.cursor, self.context)
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1413, in _handle_dbapi_exception
    exc_info
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 203, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb, cause=cause)
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 186, in reraise
    raise value.with_traceback(tb)
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/result.py", line 1159, in fetchmany
    l = self.process_rows(self._fetchmany_impl(size))
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/result.py", line 1318, in _fetchmany_impl
    row = self._fetchone_impl()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/result.py", line 1308, in _fetchone_impl
    self.__buffer_rows()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/result.py", line 1295, in __buffer_rows
    self.__rowbuffer = collections.deque(self.cursor.fetchmany(size))
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/cursors.py", line 485, in fetchmany
    row = self.read_next()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/cursors.py", line 446, in read_next
    return self._conv_row(self._result._read_rowdata_packet_unbuffered())
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/connections.py", line 1430, in _read_rowdata_packet_unbuffered
    packet = self.connection._read_packet()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/connections.py", line 1008, in _read_packet
    recv_data = self._read_bytes(bytes_to_read)
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/connections.py", line 1037, in _read_bytes
    CR.CR_SERVER_LOST, "Lost connection to MySQL server during query")
sqlalchemy.exc.OperationalError: (pymysql.err.OperationalError) (2013, 'Lost connection to MySQL server during query')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1288, in _safe_close_cursor
    cursor.close()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/cursors.py", line 426, in close
    self._result._finish_unbuffered_query()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/connections.py", line 1447, in _finish_unbuffered_query
    packet = self.connection._read_packet()
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/connections.py", line 991, in _read_packet
    packet_header = self._read_bytes(4)
  File "/home/gaara/.virtualenvs/bst/lib/python3.6/site-packages/pymysql/connections.py", line 1022, in _read_bytes
    self._sock.settimeout(self._read_timeout)
AttributeError: 'NoneType' object has no attribute 'settimeout'

The error happens around 8 minutes after indexing starts, far shorter than the timeout.

Is this a problem specific to MySQL? I am only having problems with index_all().

index_all() queries 100 records every loop and indexes them into Whoosh. If the elapsed time doesn't reach the timeout, maybe the packet size of 100 records reaches the max_allowed_packet limit (1 MB by default)?

https://dev.mysql.com/doc/refman/5.5/en/packet-too-large.html
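The 100-records-per-loop behaviour described above can be sketched as a plain batching generator. This is my assumption of how index_all pages through the table, not its actual implementation; `iter_batches` is a hypothetical name:

```python
BATCH_SIZE = 100  # records per loop, matching what index_all appears to use

def iter_batches(records, size=BATCH_SIZE):
    """Yield fixed-size batches from an iterable of records."""
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # trailing partial batch
        yield batch
```

Each yielded batch would then be written to the Whoosh index in one writer transaction, so a single oversized batch could plausibly be where a packet or connection limit bites.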