json2insert fails if thread concurrency decreases
mikethebeer opened this issue · comments
Michael Beer commented
If the concurrency parameter is decreased to a certain number, the json2insert command throws an exception:
(.venv) cr8 [master +] > cat /Users/mibe/demo_0_.json | cr8 json2insert -c 10 mibe.demo st1.p.fir.io:44200 | jq '.'
Executing requests async bulk_size=1000 concurrency=10
191 requests [00:09, 16.29 requests/s]Traceback (most recent call last):
File "/Users/mibe/sandbox/cr8/.venv/bin/cr8", line 9, in <module>
load_entry_point('cr8==0.4.1.dev10+ng61b1deb', 'console_scripts', 'cr8')()
File "/Users/mibe/sandbox/cr8/cr8/main.py", line 22, in main
p.dispatch()
File "/Users/mibe/sandbox/cr8/.venv/lib/python3.5/site-packages/argh/helpers.py", line 55, in dispatch
return dispatch(self, *args, **kwargs)
File "/Users/mibe/sandbox/cr8/.venv/lib/python3.5/site-packages/argh/dispatching.py", line 174, in dispatch
for line in lines:
File "/Users/mibe/sandbox/cr8/.venv/lib/python3.5/site-packages/argh/dispatching.py", line 277, in _execute_command
for line in result:
File "/Users/mibe/sandbox/cr8/.venv/lib/python3.5/site-packages/argh/dispatching.py", line 265, in _call
for line in result:
File "/Users/mibe/sandbox/cr8/cr8/json2insert.py", line 72, in json2insert
aio.run(f, bulk_queries, concurrency, loop)
File "/Users/mibe/sandbox/cr8/cr8/aio.py", line 58, in run
consume(q)))
File "/Users/mibe/.pyenv/versions/3.5.0/lib/python3.5/asyncio/base_events.py", line 342, in run_until_complete
return future.result()
File "/Users/mibe/.pyenv/versions/3.5.0/lib/python3.5/asyncio/futures.py", line 274, in result
raise self._exception
File "/Users/mibe/.pyenv/versions/3.5.0/lib/python3.5/asyncio/tasks.py", line 239, in _step
result = coro.send(value)
File "/Users/mibe/sandbox/cr8/cr8/aio.py", line 32, in map_async
await q.put(task)
File "/Users/mibe/.pyenv/versions/3.5.0/lib/python3.5/asyncio/queues.py", line 140, in put
'queue non-empty, why are getters waiting?')
AssertionError: queue non-empty, why are getters waiting?
Task was destroyed but it is pending!
task: <Task pending coro=<measure() running at /Users/mibe/sandbox/cr8/cr8/aio.py:24> wait_for=<Future pending cb=[wrap_future.<locals>._check_cancel_other() at /Users/mibe/.pyenv/versions/3.5.0/lib/python3.5/asyncio/futures.py:403, Task._wakeup()]>>
Task was destroyed but it is pending!
task: <Task pending coro=<measure() running at /Users/mibe/sandbox/cr8/cr8/aio.py:24> wait_for=<Future pending cb=[wrap_future.<locals>._check_cancel_other() at /Users/mibe/.pyenv/versions/3.5.0/lib/python3.5/asyncio/futures.py:403, Task._wakeup()]>>
201 requests [00:09, 20.88 requests/s]
If the concurrency parameter is increased, it works fine.
(.venv) cr8 [master +] > python -V
Python 3.5.0
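For context, the crash happens inside a bounded-queue producer/consumer pattern like the one cr8's aio module uses. Below is a simplified, self-contained sketch of that pattern (the `produce`/`consume` names and the doubling step are illustrative, not cr8's actual code); the `AssertionError` in the traceback comes from `asyncio.Queue.put` internals in Python 3.5.0, not from a mistake in this pattern itself:

```python
import asyncio

async def main(items, concurrency):
    # bounded queue: put() blocks once maxsize items are waiting
    q = asyncio.Queue(maxsize=concurrency)
    results = []

    async def produce():
        for item in items:
            await q.put(item)       # this is the call that raised in the traceback
        await q.put(None)           # sentinel: no more work

    async def consume():
        while True:
            item = await q.get()
            if item is None:
                return
            results.append(item * 2)  # stand-in for executing a bulk insert

    await asyncio.gather(produce(), consume())
    return results

print(asyncio.run(main(range(5), concurrency=2)))
# prints [0, 2, 4, 6, 8]
```

On Python 3.5.0 this kind of concurrent put/get on a bounded `asyncio.Queue` could trip the "queue non-empty, why are getters waiting?" assertion seen above; later Python releases do not exhibit it.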
Mathias Fußenegger commented
It seems like you're running into this issue here.
I don't think there is anything I can do about it, but I've added a note to the README.